Jake Hendy: Gamer, Programmer, Gym Lover. All rolled into one package

GSD: Test 123, is this working?

A previous version of this article heavily focussed on Cucumber tests. You can still read where I got up to, if you look at commit df8cad in my website repository. I won’t delete the branch either, so you can still reference that.

Well, it’s sure been a while since we last looked at our DataPoint example, isn’t it? Would you look at that—Spring Boot has updated to 1.4.0. Let’s go ahead and update our own project and watch the Travis build pass!

In this article, we’re going to do some automated testing of our application. For now, we’ll just test configuration and our basic endpoints that we created last time.

We’re going to do the following:

  • Create a configuration class to hold our DataPoint API key
  • Create an automated test to ensure this configuration is picked up appropriately.

Creating (and using) a Configuration class

We’re likely to have differing configurations for our application, depending on where the application is running. We should, most of the time, avoid hardcoding our configuration and instead use differing configuration files to drive our application. We can do this with Spring Boot using *.yml/YAML files, as we did before with application.yml to set the HTTP and management ports of our application. Spring calls this Externalized Configuration.

Let’s create a file called “application-dev.yml” in the root of our project, with the following config:

  datapoint:
    key: _Your API Key here_

Note, this file won’t exist in the repository as it’ll obviously contain my API key. Be careful when committing sensitive information, such as this API key.

Now, change your Run Configuration to include the parameters --spring.profiles.active=dev, and run your app. You see no difference? Good :smile:

That’s probably because you haven’t created the configuration class, so create a new class called DataPointConfiguration in a “configuration” package in your project. Pop in a “key” field with some getters and setters. You might be thinking “is Spring really so smart as to figure out that the ‘DataPoint’ in the class name translates to the object in the YAML file?”. You’d be forgiven for thinking that, with how clever Spring actually is, but alas you’d be wrong. You need to add two annotations to the class: ConfigurationProperties with a value of “datapoint”, and your regular Spring Configuration. To save you looking at the docs (which I encourage you to do anyway), ConfigurationProperties looks at the available properties in the application and binds those that start with the given annotation value to this class. So, “datapoint.key” in our YAML file binds to the key field in our class. Fantastic, right? Right.
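A minimal sketch of that class (the package name matches the test listing later in this post; everything else is plain getters and setters):

```java
package com.jakehendy.demo.datapoint_example.configuration;

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Configuration;

// Binds properties prefixed with "datapoint" (e.g. datapoint.key) to this class.
@Configuration
@ConfigurationProperties("datapoint")
public class DataPointConfiguration {

    private String key;

    public String getKey() {
        return key;
    }

    public void setKey(String key) {
        this.key = key;
    }
}
```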

Create an automated test to make sure the configuration is populated correctly

Add a new dependency to your pom, with a group id of “org.springframework.boot” and an artifact id of “spring-boot-starter-test”, with a scope of “test”—no point including any of this in our deployable. This artifact brings in a number of test frameworks such as JUnit, Hamcrest, Mockito, JSONAssert, and so much more. We’ll use JUnit for our tests and Hamcrest for human-like assertions.
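In pom terms, that dependency looks something like this (no version needed, as it’s managed by the Spring Boot dependency management we set up earlier):

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-test</artifactId>
    <scope>test</scope>
</dependency>
```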

Now create a test class corresponding to the DataPointConfiguration class you just created. Add an autowired field for your DataPointConfiguration, and a test method that checks if DataPointConfiguration#getKey is not null.

Something similar to this:

    @Autowired
    private DataPointConfiguration dataPointConfiguration;

    @Test
    public void dataPointConfigurationHasKey() throws Exception {
        assertThat("DataPoint key is null", dataPointConfiguration.getKey(), is(notNullValue()));
    }

If you were to run such a test, it would fail. JUnit, by default, has no idea of Spring and executes this test class in the default way. To fix this, we add 3 annotations and a config file. That sounds a bit much doesn’t it? I mean, we’re only testing this unit—the configuration unit—what else is there? Those 3 annotations get Spring involved, and set us up for a whole world of flexibility. We need to add: RunWith(SpringRunner.class) (RunWith(SpringJUnit4ClassRunner.class) if you’re not using Spring Boot 1.4.0), SpringBootTest, and ActiveProfiles.

  1. The first annotation tells JUnit to use the Spring Runner for tests. This provides our Spring contexts with dependency injection and all that goodness.
  2. The second annotation tells Spring this is a test and to invoke our standard application context. We can optionally specify a Configuration class if we want more control over how the application interacts with this unit. We’ll come back to this in the next article.
  3. The third annotation tells Spring which profile to use. You may have noticed this is option 10/11 in the Externalized Configuration documentation. We set this to “unittest”.

Your class will look similar to this:

package com.jakehendy.demo.datapoint_example.configuration;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.junit4.SpringRunner;

import static org.hamcrest.core.Is.is;
import static org.hamcrest.core.IsNull.notNullValue;
import static org.junit.Assert.assertThat;

/**
 * Created by me on 13/09/2016.
 */
@RunWith(SpringRunner.class)
@SpringBootTest
@ActiveProfiles(profiles = {"unittest"})
public class DataPointConfigurationTest {

    @Autowired
    private DataPointConfiguration dataPointConfiguration;

    @Test
    public void dataPointConfigurationHasKey() throws Exception {
        assertThat("DataPoint key is null", dataPointConfiguration.getKey(), is(notNullValue()));
    }
}
Those are the annotations we need, yet the test still doesn’t pass, as the field isn’t populated. Notice we created an application-dev.yml file earlier? Now we’re going to create an application-unittest.yml file. Spring loads property files (*.properties or *.yml) for the base application.yml, plus one for each active profile (application-{profile}.yml).

If you have a property declared in multiple profile property files, the one loaded last will take priority.

Code up to this point is available at commit fb8610db6258d84871e76b3953eb790f16694802.

Go ahead and create a file in your “src/test/resources” directory called “application-unittest.yml” with a similar structure to our application-dev.yml file—this time fill the key with some random text.
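Something like this, where the value itself doesn’t matter (our test only checks it’s not null):

```yaml
datapoint:
  key: some-random-text
```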

Now run that test, and watch it pass. That’s your first, fully integrated, Spring-based unit test. It only interacted with code inside the app; there were no web requests involved.

Of course, you could do something similar with your controllers. Injecting the controller classes is a possibility, but there’s a better way…

Testing our controllers

This is integration testing. So create a new package called “integration” under your base package; for me this is “com.jakehendy.demo.datapoint_example.integration”. Add on those 3 annotations we had before, but this time add another Spring Profile of “integrationtest”. We’re also going to slightly tweak the SpringBootTest annotation, by adding the property of webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT.

Yes, we’re going to spin up that server for our integration tests on a random port. There’s no reason why we can’t, as Spring will pick an unused port that it can bind to automatically. This means that our integration tests can be performed at any time, in parallel, on your dev machine or your build server.

How do we get that port? Add a “port” field to your class, of type int, and annotate it with @LocalServerPort. @LocalServerPort is shorthand for @Value("${local.server.port}"), and it reads nicer. This field will contain the port for your app, decided at runtime.
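Put together, the skeleton looks something like this (the class name is my own; pick whatever suits you):

```java
package com.jakehendy.demo.datapoint_example.integration;

import org.junit.runner.RunWith;
import org.springframework.boot.context.embedded.LocalServerPort;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@ActiveProfiles(profiles = {"integrationtest"})
public class HelloWorldIntegrationTest {

    // Populated by Spring once the embedded server has bound to its random port.
    @LocalServerPort
    private int port;
}
```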

Now let’s add a simple method that will test the HelloWorldController. The URL “http://localhost:[ThatRandomPort]/hello-world” should now return the contents of the view “index.ftl”. Let’s find out.

I’m a big fan of OkHttpClient for HTTP requests, and we’re going to use that in the next few articles when we fetch objects from DataPoint, so add that in to your pom. GroupId of “com.squareup.okhttp3”, artifactId of “okhttp”, and a version of “3.4.1”.
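That is, something like:

```xml
<dependency>
    <groupId>com.squareup.okhttp3</groupId>
    <artifactId>okhttp</artifactId>
    <version>3.4.1</version>
</dependency>
```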

We create a new URL for the request. OkHttp’s HttpUrl has a fluent builder, so this is how I constructed mine:

HttpUrl helloWorldUrl = HttpUrl.parse("http://localhost/hello-world").newBuilder().port(port).build();

We need a request object, so let’s create one of them with:

Request helloWorldRequest = new Request.Builder().url(helloWorldUrl).get().build();

And now we need to issue that request. Let’s add a constant OkHttpClient for this class, as we won’t be doing anything special for now:

private static final OkHttpClient CLIENT = new OkHttpClient();

Client’s set, let’s issue the request:

Response helloWorldResponse = CLIENT.newCall(helloWorldRequest).execute();

The call is now executed synchronously! Call#execute() (the Call coming from OkHttpClient#newCall(Request)) can throw an IOException, which my test method declares for me as I generate tests using IntelliJ’s add test method stub feature. Just keep that in mind, and always read the JavaDocs :)

We can immediately start asserting on that response object, as it has a code() method which returns the status code of the HTTP response.

assertThat("Hello World did not return a HTTP2xx", helloWorldResponse.code() / 100, is(2));
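That division is the whole trick: Java integer division truncates, so any 2xx status code divided by 100 gives exactly 2. A tiny, self-contained illustration (the helper name is hypothetical):

```java
// Demonstrates the integer-division trick used in the assertion above:
// 200/100 == 2, 204/100 == 2, 404/100 == 4, and so on.
public class StatusCodes {

    public static boolean isSuccess(int statusCode) {
        return statusCode / 100 == 2;
    }

    public static void main(String[] args) {
        System.out.println(isSuccess(200)); // true
        System.out.println(isSuccess(204)); // true
        System.out.println(isSuccess(404)); // false
    }
}
```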

Go on, run it! See that, your test passed! What we have, up until now, is available at commit 49233db9d360d8b601753c2814d69c7d4f08b39b. However, all we’ve proved is that a response was returned and it was, well, alright. Let’s test the response body to see if it’s what we expect. Create a test fixture (the output we’re expecting) in your “src/test/resources” directory. I’ve put mine at “src/test/resources/fixtures/integration/HelloWorld/expectedindex.html”. I’ve then simply copied and pasted the content of the “index.ftl” view we created before…

Let’s bring Guava into the mix, as we’ll be using that shortly as well! In the pom, a dependency with a groupId of “com.google.guava”, artifactId of “guava”, version of “19.0”. Done.

Guava has a Resources class, which funnily enough handles loading resources in your application. We’ll use the Resources#getResource(String) and Resources#toString(URL, Charset) methods to load in our fixture. I’ve put the fixture path in a constant field, so to load the fixture for me it looks like:

String expectedOutput = Resources.toString(Resources.getResource(FIXTURE_EXPECTED_INDEX), Charsets.UTF_8);

You’re using UTF-8 right? Right? If not (and you have a valid reason), do feel free to change the Charset to that which your fixture is encoded in. You can do this with the Charset#forName(String) method.

Now all we have to do is compare the body and the expected output, which OkHttp makes almost stupidly easy:

assertThat("Hello World output does not match fixture", helloWorldResponse.body().string(), is(expectedOutput));

That, my friends, is it. Available at commit 1816a669dcb585ba3bbca4df711f4d7ff474450b.

And no, it’s not going to be 7 months until my next article. In fact, it’s already being written alongside this one… Have a look at the GitHub repo :wink:

Updating my Surface Pro 4 to Windows 10 Anniversary Update "Redstone 1"

I’ve been having major trouble upgrading my Surface Pro 4 to the Anniversary Update; in the end, Windows Update just didn’t want to give it to me. The Upgrade Assistant didn’t want to help and I was reluctant to nuke it just to install an update. My Surface would restart into the Windows installer and get some of the way there, then I’d blink and it would give me that dreaded text

Restoring your previous version of Windows

While it’s nice that Windows can now restore previous versions extremely cleanly (kudos to MS!) this didn’t help me figure out why the update wasn’t installing. I tried looking in logs, executing Get-WindowsUpdateLog in PowerShell and everything, but got nowhere.

Today I stumbled across this post from Pieter Wigleven at Microsoft. The rundll command gives no feedback so I wasn’t sure if it worked. I ran it again, to be sure, typing it out this time then tried the update.

I’m now on the Anniversary Update. How easy was that :)

Oh my lord, where am I?!

It’s been 6 months since I’ve published a post on my blog, how astonishing. If you’re one of the few people who looked at the backing GitHub repository—where all my posts are stored—then you’ll have seen there’s been one lingering around, untouched, for a while now. However…

I’m here.

GSD: Bootstrapping our application

Let’s start off by bootstrapping our application.

I’ve pushed up a branch called step/1_The-basics in the datapoint-example repository. It’s also been merged into master, as it is so simple it can’t break.

The first commit details the very basics to getting an application working; the pom, an application.yml and the Application class.

If you’re mainly interested in the DataPoint-related code, you can skip this post and look forward to the next one!

The basics of the basics.

Let’s get a web server started, no worries about content.

The pom

We structure our pom a little differently from how Spring Boot (or Spring Initializr) structures theirs by default. We don’t use a parent pom, instead using the dependencyManagement tag (and subsystem) to deal with it. This means we can have our own parent pom, if we need to.

There’s a property to set the Spring Boot version, so in the future you only have to update it in one place, meaning you can specifically override individual versions. This property also controls the spring-boot-maven-plugin. The what?

Maven Compiler Plugin

We use the Maven compiler plugin to set the -source and -target compiler options (javac) to 1.8. This avoids some Maven warnings and errors. Nothing special here…
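The relevant plugin section looks something like this (plugin version omitted here; pin one if you want fully repeatable builds):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <source>1.8</source>
        <target>1.8</target>
    </configuration>
</plugin>
```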

Spring Boot plugin

The Spring Boot maven plugin offers two things:

  1. You can use it as a Maven goal, execute mvn spring-boot:run and it’ll automatically spin your application up as necessary.
  2. You can configure it to run during the repackage phase, where it’ll automatically create a MANIFEST file for you. Unless you create a MANIFEST file yourself, your application won’t start when you run it with java -jar, or you’d have to specify the path to the main class yourself. I personally prefer to have the MANIFEST file.
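The second option, sketched in pom terms (assuming your version property is called spring.boot.version; use whatever name you gave yours):

```xml
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <version>${spring.boot.version}</version>
    <executions>
        <execution>
            <goals>
                <goal>repackage</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```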

The dependencies

In the dependencyManagement tag, we add a dependency with an artifact ID of spring-boot-dependencies (complete coordinate of org.springframework.boot:spring-boot-dependencies:[YOUR_SPRING_BOOT_VERSION]) and version, as of writing, of 1.3.3.RELEASE. You must remember to set the scope to import and the type to pom, else it won’t work. I always forget this, so I’m putting it here to remind you!
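As a reminder of the shape (the bit I always forget being the type and scope):

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-dependencies</artifactId>
            <version>1.3.3.RELEASE</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```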

In the dependencies section, we’ve got three dependencies: spring-boot-starter-web, spring-boot-starter-actuator, and spring-boot-starter-freemarker.

Spring Boot Starter Web

spring-boot-starter-web provides the basics for writing a web service. An embedded tomcat server, the Spring Framework, logback, etcetera. The only thing you’ll probably ever exclude here is the Tomcat server, as you can drop in Undertow or Jetty instead. I’ll do a blog post on that later on in the year as I want to do some benchmarking!

Spring Boot Actuator

spring-boot-starter-actuator provides a lot of production features, features I find so useful that I forget they have to be brought in via Actuator. Spring (obviously) do a much better job of explaining it, so without further ado:

Spring Boot includes a number of additional features to help you monitor and manage your application when it’s pushed to production. You can choose to manage and monitor your application using HTTP endpoints, with JMX or even by remote shell (SSH or Telnet). Auditing, health and metrics gathering can be automatically applied to your application.

Spring Boot Reference Guide

Spring Boot Starter Freemarker

Adds in Freemarker support to our application. Nothing more, nothing less.


The application.yml

You may have seen this called application.properties elsewhere. Spring Boot supports both Java Properties and YAML formats! I started using properties but I prefer YAML, so we’ll be using that.

The server object—and its only descendant in this case—controls our HTTP server. We won’t be running our application directly exposed to the world, so we’ll spin it up on a predefined port: 11000. Pretty much randomly chosen by me; you can of course pick your own.

We also have the management object. In this object, we set the port for the management endpoints. The management endpoints offer those features from Spring Boot Actuator: metrics, health checks, information, etcetera. There’s another property here, add-application-context-header. This property adds a header to the response containing your hostname and port, e.g. localhost:11000. This isn’t strictly a security risk, but it’s data you often won’t need in a response, so there’s no reason to include it for now.

The endpoints object refers to those Actuator endpoints. By default, many actuator endpoints are marked as “sensitive”, meaning you must authenticate with Spring Security before they’ll display any meaningful information. As our management port will not be exposed to the internet, we can mark the endpoints as not sensitive.
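Pulled together, the application.yml described above looks something like this. Note that the management port (11001) is my own pick here, as is explicitly disabling the context header; adjust to taste:

```yaml
server:
  port: 11000

management:
  port: 11001
  add-application-context-header: false

endpoints:
  sensitive: false
```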

The Application class.

Our Application class has an annotation of SpringBootApplication. To quote the reference guide, the annotation is:

equivalent to using @Configuration, @EnableAutoConfiguration and @ComponentScan with their default attributes.

The class sits at the root of our code. The ComponentScan annotation will then search packages at the same level as, or below, this class in the hierarchy. You can of course override this, but for our purposes it’s really all we need!

There is a main method—defined as the entry point of our application in the MANIFEST file—which calls SpringApplication.run(Class source, String... args). source is our root class in this instance; args are the command line arguments that are passed in.

You’re free to add additional code to the main method, but anything before that call of course won’t be able to make use of any Spring features, as Spring, your web server and the like aren’t ready yet!

We’ve defined 2 beans here, FreeMarkerConfigurer and FreeMarkerViewResolver. FreeMarkerConfigurer, as you’ve probably guessed, handles the configuration of FreeMarker, and FreeMarkerViewResolver instructs Spring to resolve views to FreeMarker templates. Why aren’t we using the default configuration? So we learn how to configure FreeMarker! We’ve configured it to resolve to a folder called “/views” (relative to the classpath) instead of “/templates”, nothing crazy.
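Those two beans, as a sketch you’d drop into the Application class (the loader path is an assumption based on the “/views” folder mentioned above):

```java
// In the Application class. Points FreeMarker at /views on the classpath
// instead of the default /templates.
@Bean
public FreeMarkerConfigurer freeMarkerConfigurer() {
    FreeMarkerConfigurer configurer = new FreeMarkerConfigurer();
    configurer.setTemplateLoaderPath("classpath:/views/");
    return configurer;
}

@Bean
public FreeMarkerViewResolver freeMarkerViewResolver() {
    // Default prefix/suffix, so a view named "index.ftl" resolves as-is.
    return new FreeMarkerViewResolver();
}
```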


The logback.xml

This file configures Logback, which is a great logging library you really should be using. For now, we’ll have it as a static configuration. In the future, we’ll expand on this to make it dynamic, depending on where the application is running. It’s not essential, as there is a default configuration. Nevertheless, I wanted to specify one of my own that I find more readable. Feel free to adjust this to your liking.

What next?

Let’s get some content going, shall we? We’ll start off with some “Hello, world?” stuff, just to show off how simple our controllers will look.


What a class name… Two annotations you’ll notice upon opening this class: RestController and RequestMapping.

RestController is some annotation sugar (is that a thing? Yeah, it is now) for Controller and ResponseBody. Any methods inside this class will have the ResponseBody annotation by default. I also like to think it signifies a RESTful resource.

RequestMapping sets up the request path for this class, acting as the prefix for all RequestMapping-annotated methods inside it. This means that for every method in this class with a RequestMapping annotation, the method-level value is appended to the class-level value. RequestMapping has a default value of “”, an empty string, so by default a method maps to the same path as the class-level annotation. RequestMapping is also used by Spring to decide which method handles a specific request. Spring takes into account consumes (incoming body data), produces (the contents of the response), headers (of the incoming request, Content-Type for example), method (GET, PUT, etc.) and params (query parameters).

You can see some example methods in the class. I won’t go over them as they’re self-explanatory. Do note that we return ResponseEntity instead of String or another object. I like to think of ResponseEntity as a wrapper around the object I’m returning. It easily allows you to set Cache-Control headers, ETags, the last modified time, and then let Spring handle that in the response. If you can hand some work off to Spring, do it; that’s what it’s there for!
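For illustration, a controller in this style might look like the following. The class name, paths and response bodies are my own, chosen to line up with the URLs we hit at the end of this post:

```java
package com.jakehendy.demo.datapoint_example.controllers;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Every method's mapping is appended to the class-level "/api/hello-world".
@RestController
@RequestMapping("/api/hello-world")
public class HelloWorldApiController {

    @RequestMapping
    public ResponseEntity<String> helloWorld() {
        return ResponseEntity.ok("Hello, World!");
    }

    @RequestMapping("/{name}")
    public ResponseEntity<String> helloName(@PathVariable String name) {
        return ResponseEntity.ok(String.format("Hello, %s!", name));
    }
}
```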


This controller delivers some beautiful HTML, which is what a generic user will get. The class has a resource mapping of “/hello-world”, and its single method has a default value so it’ll respond to GET requests at “/hello-world”.

This method does some MAJ… no work at all. It returns a ModelAndView object, which currently has no data model and a view set to “index.ftl”. That’s all. This instructs Spring to invoke FreeMarker on a view called “index.ftl”, which is resolved using the configuration to “/views/index.ftl”.
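A sketch of such a controller (class and method names are my own):

```java
package com.jakehendy.demo.datapoint_example.controllers;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.servlet.ModelAndView;

@Controller
@RequestMapping("/hello-world")
public class HelloWorldController {

    @RequestMapping
    public ModelAndView helloWorld() {
        // No model data; just render the index.ftl view.
        return new ModelAndView("index.ftl");
    }
}
```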


Your basic hello world document: an HTML5 document with an h1 element of our favourite text, “Hello, World!”.
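For reference, a view matching that description (the title is my own addition):

```html
<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8">
    <title>Hello, World!</title>
</head>
<body>
    <h1>Hello, World!</h1>
</body>
</html>
```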


I’m using an IDE

Create a maven run configuration in your favourite IDE invoking the “spring-boot:run” goal. That’s all you need.

I’m using the command line

Why?! Execute “mvn spring-boot:run” in the project root.


Now hit http://localhost:11000/hello-world, http://localhost:11000/api/hello-world and http://localhost:11000/api/hello-world/Jake. Watch the magic happen!

Of course, if you’ve changed your port, you should change it in those URLs above.

Let me know on Twitter or GitHub if there’s something you’d like to see or that I’ve missed. Next time, we’ll hit DataPoint up and start doing something useful…

Getting Started With DataPoint, MkII

Round 2, let’s begin. A few colleagues mentioned that I should describe what DataPoint is, and that I should actually write some content :wink:.

DISCLAIMER: The contents of this blog post are strictly my own and nothing to do with my employer. This is all me; every, single, drop!

What is DataPoint?

Straight from the horse’s mouth:

DataPoint is a service to access freely available Met Office data feeds in a format that is suitable for application developers. It is aimed at anyone looking to re-use Met Office data within their own innovative applications, for example professionals, the scientific community, student & amateur developers.

Met Office DataPoint

There are a multitude of products—feeds you can access through DataPoint—like a 3-hourly forecast feed for UK sites, hourly observations for UK sites, national and regional text forecasts, and map layers such as surface pressure charts, forecast temperatures, rainfall radar and much more.

Great, how much?

For light usage, DataPoint is free to use. You are subject to a Fair Use Policy, so do be careful with your data. You can of course implement a caching layer in your application. In fact, you really should cache the data. The data for a model run (e.g. the 1900Z run today) won’t change once it’s been published. The data changes because there is a newer model run, which you have to explicitly request.

Let’s do some quick maths! You can make 5,000 requests to DataPoint a day on the free policy. 3-hourly forecasts are produced every hour, so that’s 24 requests, as you can get all 6,700-odd sites back in one request. Observations too are produced every hour, so that’s another 24 requests. Implemented properly, you’ll use up about 1% of your quota for the absolute basic forecasts.

There is a paid-for plan, if you’d like to make 100,000 data requests a day. This has a per annum cost of £1,500 excluding VAT at the time of writing.

That’s awesome, how can I access this?!

You’ll need a DataPoint key. You can get one simply by going to the DataPoint homepage and clicking the “Register for Met Office Datapoint” link. If you’ve already registered, you can see your API key again by clicking on the “Your DataPoint API Key” link further down the right-hand column.

Technically though

DataPoint is a RESTful API, even though it only offers content with the GET verb. There are generally two types of calls you can make:

  • A capabilities call: A capabilities call gives you some metadata, or capabilities, about the product. Let’s take Forecast for instance. You can query the metadata for timesteps that are available in the three-hourly forecast feed. http://datapoint.metoffice.gov.uk/public/data/val/wxfcs/all/xml/capabilities?res=3hourly&key={key} is the call in question.

    The Regional Forecast capabilities product has two metadata calls, one for the timesteps available (one at the time I’m writing this post) and one to list each available region:

      • http://datapoint.metoffice.gov.uk/public/data/txt/wxfcs/regionalforecast/xml/capabilities?key={key} for timesteps.
      • http://datapoint.metoffice.gov.uk/public/data/txt/wxfcs/regionalforecast/xml/sitelist?key={key} for the list of regions.

  • A product call: A product call is a request for the actual product. It’s what you generally really want.

    Some examples:

      • http://datapoint.metoffice.gov.uk/public/data/txt/wxfcs/regionalforecast/xml/500?key={key} will return the Regional Forecast for the Orkney and Shetland Islands.
      • http://datapoint.metoffice.gov.uk/public/data/val/wxfcs/all/xml/{locationId}?res=3hourly&key={key} will return the 3-hourly forecast for the given location. Hint hint, Exeter is 310069 ;)

You’ll need to use a combination of these feeds to achieve a complete service, depending on what you want.
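If you’d like to assemble these URLs in code, here’s a small, hypothetical helper for the 3-hourly product call; substitute your real key at runtime rather than hardcoding it:

```java
// Hypothetical helper that assembles the 3-hourly forecast product call
// described above. Only string formatting; no HTTP involved.
public class DataPointUrls {

    private static final String BASE = "http://datapoint.metoffice.gov.uk/public/data";

    public static String threeHourlyForecast(String locationId, String key) {
        return String.format("%s/val/wxfcs/all/xml/%s?res=3hourly&key=%s", BASE, locationId, key);
    }

    public static void main(String[] args) {
        // 310069 is Exeter, per the hint above.
        System.out.println(threeHourlyForecast("310069", "YOUR_KEY_HERE"));
    }
}
```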

That’s it for this post. Let me know what you think. The next post, we’ll create an application that allows us to get a forecast for a given location, in a table. We’ll also load in all the locations (which is a separate call) and see if we can get some basic location search going…