🗓️ January 2020

Java Integration Testing

Java integration tests have always been a pain to set up - until the Testcontainers project came along! So in this article, we'll explore how to do it, using this powerful (Docker-based) library.


In this article, we'll explore a neat method for writing Docker-based integration tests, using the powerful Testcontainers Java library.

If you've ever worked with H2 or some other in-memory replacement for your actual database, you might have been frustrated at some point because the replacement was unable to support the exact SQL dialect used by your application.

Testcontainers solves this problem: during your integration tests, your application connects to real versions of the services you're using in production (think: Postgres, Redis, RabbitMQ, or anything else that can be Dockerized).

Let me give a quick overview of how this is going to work:

  1. we use the Testcontainers library to pull real Docker images and start them as containers on random free ports
  2. we override system properties to make sure that our application is able to pick up the correct connection URLs
  3. we start the application itself on a random free port (it should connect to our running containers)
  4. we perform our test
  5. the Testcontainers library automatically shuts down and destroys all containers


The Application

Because it would be difficult to write integration tests without an actual application to test, I've set up a very simple Spring Boot application where customers can be registered.

Diagram where a user registers via HTTP, is then sent a welcome e-mail (via some external e-mail service), and is then persisted in the database (via JDBC)

You can find this application's source code on GitHub. If you would like to follow along, make sure you have a JDK, Maven and Docker installed.

Dependencies

You've got a few options when it comes to choosing which Testcontainers dependencies to include in your pom.xml file. If your application depends on a popular database like Postgres, or a popular message broker like RabbitMQ, you should include one of the carefully crafted Testcontainers modules in your project.

For example, because I'm using Postgres for this project, I'll make sure to add the Postgres Module to my pom.xml file, allowing me to use the PostgreSQLContainer class in my integration tests.

You can find a list of all the special modules on the Testcontainers website.

<dependency>
  <groupId>org.testcontainers</groupId>
  <artifactId>postgresql</artifactId>
  <version>1.12.5</version>
  <scope>test</scope>
</dependency>

If your application integrates with some external service via HTTP (think: Stripe, Twilio, Mailgun), and you would like to include this external service in your integration test, I recommend you install the MockServer Module, which can be used to spin up a generic HTTP interface via the MockServerContainer class.

MockServer is actually a project on its own. You should use the Testcontainers MockServer Module to set up the Docker container, but I recommend you also include the MockServer client library in your project, which will allow you to actually intercept, process and verify network calls to this generic HTTP interface, using the MockServerClient class.

I will be adding both dependencies, because my sample application depends on a (nonexistent) external e-mail service.

<!--Testcontainers MockServer Module-->
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>mockserver</artifactId>
    <version>1.12.5</version>
    <scope>test</scope>
</dependency>
<!--MockServer Java Client Library-->
<dependency>
    <groupId>org.mock-server</groupId>
    <artifactId>mockserver-client-java</artifactId>
    <version>5.5.4</version>
    <scope>test</scope>
</dependency>

If Testcontainers doesn't offer a special module for some specific Docker container required by your application, you should include the generic Testcontainers Maven dependency, giving you access to the GenericContainer class.

<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>testcontainers</artifactId>
    <version>1.12.5</version>
    <scope>test</scope>
</dependency>

Fun fact: PostgreSQLContainer and MockServerContainer are actually subclasses of GenericContainer.

Container 1: Postgres

I will set up a Containers utility class in my src/test/java folder which can be used for interacting with (i.e. starting) the Docker container environment.

Here's how to create a Postgres container (with a single postgres database) which we can connect to using the admin / password credentials.

public class Containers {
 
    public static void ensureRunning() {
        var postgres = new PostgreSQLContainer<>("postgres:12.1")
            .withUsername("admin")
            .withPassword("password")
            .withDatabaseName("postgres");
    }
 
}

In order to reuse the same container environment across multiple integration tests, let's keep the container in a static variable and only create it if it hasn't been created already.

Let's also start the container.

public class Containers {
 
    public static PostgreSQLContainer POSTGRES;
 
    public static void ensureRunning() {
        if (POSTGRES == null) {
            POSTGRES = new PostgreSQLContainer<>("postgres:12.1")
                .withUsername("admin")
                .withPassword("password")
                .withDatabaseName("postgres");
        }
        if (!POSTGRES.isRunning()) {
            POSTGRES.start();
        }
    }
 
}

This is looking pretty good so far. Once we've called this ensureRunning() method from one of our integration tests, we know that an actual Postgres Docker container is running on some random free port.

The next step is to find a way for our application to figure out which port to connect to. Because I'm using Spring Boot, I will define a system property holding the correct JDBC URL, which is picked up automatically when Spring Boot starts.

public class Containers {
 
    public static PostgreSQLContainer POSTGRES;
 
    public static void ensureRunning() {
        if (POSTGRES == null) {
            POSTGRES = new PostgreSQLContainer<>("postgres:12.1")
                .withUsername("admin")
                .withPassword("password")
                .withDatabaseName("postgres");
        }
        if (!POSTGRES.isRunning()) {
            POSTGRES.start();
        }
        System.setProperty("spring.datasource.url", POSTGRES.getJdbcUrl());
        System.setProperty("spring.datasource.username", "admin");
        System.setProperty("spring.datasource.password", "password");
    }
 
}

Note that we're only able to use this convenient getJdbcUrl() method because we've been working with the special PostgreSQLContainer. If we were to set up Postgres using a GenericContainer, the code would look like this:

// Using a GenericContainer for Postgres instead of
// the special PostgreSQLContainer is not recommended
public class Containers {
 
    public static GenericContainer POSTGRES;
 
    public static void ensureRunning() {
        if (POSTGRES == null) {
            POSTGRES = new GenericContainer<>("postgres:12.1")
                    .withEnv("POSTGRES_USER", "admin")
                    .withEnv("POSTGRES_PASSWORD", "password")
                    .withEnv("POSTGRES_DB", "postgres")
                    .withExposedPorts(5432);
        }
        if (!POSTGRES.isRunning()) {
            POSTGRES.start();
            // The start() method waits for port 5432 to start listening
            // Here's a list of other strategies you can use:
            // https://www.testcontainers.org/features/startup_and_waits/
        }
        var ip = POSTGRES.getContainerIpAddress();
        var port = POSTGRES.getMappedPort(5432);
        var database = "postgres";
        var jdbcUrl = String.format("jdbc:postgresql://%s:%d/%s", ip, port, database);
        System.setProperty("spring.datasource.url", jdbcUrl);
        System.setProperty("spring.datasource.username", "admin");
        System.setProperty("spring.datasource.password", "password");
    }
 
}

Database Test

Let's consider a simplified version of the application: one which doesn't send customers a welcome e-mail, but only saves them to the database.

Because I'm using Spring Boot, I can use the following JUnit template for my integration test, which will start the application on a random port.

@RunWith(SpringRunner.class)
@SpringBootTest(
        classes = Application.class,
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT
)
public class IntegrationTest {
 
    @Test
    public void integrationTest() {
 
    }
 
}

This won't work for me, however, because my application is using Flyway for schema migrations, which will try to run all SQL migration files (in the db/migration resources folder) against the database during application startup.

Because I've got a simple V1__create_customer_table.sql migration file, I'm greeted with the following error:

Connection to localhost:5432 refused

As expected, the application is trying to connect to a Postgres database, using the connection details specified in my application.yaml resources file.

spring:
  datasource:
    url: 'jdbc:postgresql://localhost:5432/postgres'
    username: 'admin'
    password: 'password'
 
services:
  email:
    base-url: 'https://api.supermail123.com'

The good news is that all of this can be fixed by calling the Containers.ensureRunning() method before our integration test is executed.

The test below should pass without any errors.

@RunWith(SpringRunner.class)
@SpringBootTest(
        classes = Application.class,
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT
)
public class IntegrationTest {
 
    @BeforeClass
    public static void beforeAll() {
        Containers.ensureRunning();
    }
 
    @Test
    public void integrationTest() {
 
    }
 
}

Even though it looks like we aren't doing much here, it's actually a very useful test, because it answers an important question: will my application start?

Because Flyway runs all migration scripts during startup, this test also verifies that all of my migration scripts are free of errors. It's a technique I personally use all the time whenever I add a new migration script and want to quickly check that it's error-free (without accidentally dirtying my local development database).

I won't go too much into the specifics of testing with Spring, so below you'll find a complete database integration test for the simplified version of the application (the version which only saves customers to the database and doesn't send them an e-mail).

@RunWith(SpringRunner.class)
@SpringBootTest(
        classes = Application.class,
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT
)
public class IntegrationTest {
 
    @LocalServerPort
    private int localServerPort;
    private RestTemplate restTemplate;
 
    @Autowired
    private CustomerRepository repository;
 
    @BeforeClass
    public static void beforeAll() {
        Containers.ensureRunning();
    }
 
    @Before
    public void before() {
        restTemplate = new RestTemplateBuilder()
            .rootUri("http://localhost:" + localServerPort)
            .build();
    }
 
    @Test
    public void integrationTest() {
        // Given
        var email = "jessy@example.com";
        var name = "Jessy";
        // When
        var request = Map.ofEntries(
                Map.entry("email", email),
                Map.entry("name", name)
        );
        restTemplate.postForEntity("/customers", request, Void.class);
        // Then
        var customer = repository.findByEmail(email).orElseThrow();
        assertThat(customer.getEmail()).isEqualTo(email);
        assertThat(customer.getName()).isEqualTo(name);
    }
 
    @After
    public void after() {
        repository.deleteAll();
    }
 
}

I will end this section with two important observations about the integration test above.

  1. I'm relying on the fact that restTemplate.postForEntity() is a blocking method which only returns after the HTTP request has been completely processed; if your application processes requests asynchronously, I can recommend the Awaitility library, which allows you to wait until certain conditions are met (e.g. repository.count() > 0) before proceeding (see the sketch right after this list)
  2. when the test finishes, I'm deleting all customer rows from the database; cleaning up after each test is pretty important if you want to share the container environment among different integration tests (more on this in Notes on Performance)
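
Here's a minimal sketch of what such a wait could look like with Awaitility; the 5-second timeout and the repository lookup are illustrative assumptions, not part of the sample project.

// Hypothetical fragment for an asynchronous variant of the test:
// poll the database until the customer appears (or give up after 5 seconds),
// then continue with the usual assertions.
// Requires: import static org.awaitility.Awaitility.await;
//           import java.util.concurrent.TimeUnit;
await()
    .atMost(5, TimeUnit.SECONDS)
    .until(() -> repository.findByEmail(email).isPresent());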

Container 2: Email Service

For the remainder of this article, let's consider the full version of the application we've discussed earlier: the version which also sends a welcome e-mail during registration.

Because my application depends on this external e-mail service, I will add a MockServerContainer to the test environment, which will allow me to verify whether my application is actually making the right HTTP requests.

public class Containers {
 
    public static PostgreSQLContainer POSTGRES;
    public static MockServerContainer EMAIL_SERVICE;
 
    public static void ensureRunning() {
        postgresEnsureRunning();
        emailServiceEnsureRunning();
    }
 
    private static void postgresEnsureRunning() {
        // ...
    }
 
    private static void emailServiceEnsureRunning() {
        if (EMAIL_SERVICE == null) {
            EMAIL_SERVICE = new MockServerContainer();
        }
        if (!EMAIL_SERVICE.isRunning()) {
            EMAIL_SERVICE.start();
        }
        System.setProperty("services.email.base-url", EMAIL_SERVICE.getEndpoint());
    }
 
}

As you can see, I'm using the same trick as before to override YAML configuration values via system properties.

This makes it possible for my application's EmailService component (responsible for sending HTTP requests to this external provider) to use one of the following two endpoints, depending on the environment:

  1. in production: the base URL configured in application.yaml (https://api.supermail123.com)
  2. during integration tests: the MockServer container's endpoint, injected via the services.email.base-url system property
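
To make this concrete, here's a minimal sketch of what such a configurable EmailService component could look like. The constructor injection, the @Value binding and the /send path are assumptions made for illustration; the real implementation lives in the sample repository.

// Hypothetical sketch, not the sample project's actual code: an EmailService
// that reads its base URL from configuration, so that integration tests can
// point it at the MockServer container instead of the real provider.
@Service
public class EmailService {

    private final RestTemplate restTemplate;
    private final String baseUrl;

    public EmailService(RestTemplateBuilder builder,
                        @Value("${services.email.base-url}") String baseUrl) {
        this.restTemplate = builder.build();
        this.baseUrl = baseUrl;
    }

    public void sendWelcomeEmail(String email) {
        // POST a small JSON payload to <base-url>/send
        var request = new SendEmailRequest("WELCOME", email);
        restTemplate.postForEntity(baseUrl + "/send", request, Void.class);
    }

    public static class SendEmailRequest {

        private final String template;
        private final String email;

        public SendEmailRequest(String template, String email) {
            this.template = template;
            this.email = email;
        }

        public String getTemplate() { return template; }

        public String getEmail() { return email; }
    }
}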

All that's left to do now is to set up a MockServerClient for intercepting, processing and verifying incoming requests to this generic HTTP interface.

End-to-End Test

Let's switch back to our actual integration test now. I will keep all infrastructural logic in the Containers class, so I can add all business logic for mocking this external e-mail service to the integration test itself.

You can set up a simple MockServer expectation as follows.

public class IntegrationTest {
 
    // ...
    private MockServerClient emailService;
    // ...
 
    @Before
    public void before() {
        // ...
        emailService = new MockServerClient(
            Containers.EMAIL_SERVICE.getContainerIpAddress(),
            Containers.EMAIL_SERVICE.getServerPort()
        );
        emailService.when(HttpRequest.request("/send"))
            .respond(HttpResponse.response().withStatusCode(204));
    }
 
    // ...
 
}

Here, I'm always returning the same static 204 No Content response on every HTTP request to /send. If you would like to inject some dynamic behaviour here, you can also pass a function to the respond() method.

public class IntegrationTest {
 
    // ...
    private MockServerClient emailService;
    // ...
 
    @Before
    public void before() {
        // ...
        emailService = new MockServerClient(
            Containers.EMAIL_SERVICE.getContainerIpAddress(),
            Containers.EMAIL_SERVICE.getServerPort()
        );
        emailService.when(HttpRequest.request("/send"))
            .respond(this::handle);
    }
 
    private HttpResponse handle(HttpRequest request) {
        if (request.getBodyAsString().contains("jessy@example.com")) {
            return HttpResponse.response()
                .withStatusCode(400)
                .withBody("E-mail address blacklisted!");
        }
        return HttpResponse.response().withStatusCode(204);
    }
 
 
    // ...
 
}

Every MockServerClient also has a built-in verify() method for verifying whether a specific HTTP request was made to the MockServer. We can even use a number of MockServer utilities like JsonBody.json() (used for converting POJOs to JSON) when verifying the payload of an incoming request.

With this knowledge, let's put everything together and finish up our integration test.

@RunWith(SpringRunner.class)
@SpringBootTest(
        classes = Application.class,
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT
)
public class IntegrationTest {
 
    @LocalServerPort
    private int localServerPort;
    private RestTemplate restTemplate;
    private MockServerClient emailService;
 
    @Autowired
    private CustomerRepository repository;
 
    @BeforeClass
    public static void beforeAll() {
        Containers.ensureRunning();
    }
 
    @Before
    public void before() {
        restTemplate = new RestTemplateBuilder()
                .rootUri("http://localhost:" + localServerPort)
                .build();
        emailService = new MockServerClient(
            Containers.EMAIL_SERVICE.getContainerIpAddress(),
            Containers.EMAIL_SERVICE.getServerPort()
        );
        emailService.when(HttpRequest.request("/send"))
                .respond(HttpResponse.response().withStatusCode(204));
    }
 
    @Test
    public void integrationTest() {
        // Given
        var email = "jessy@example.com";
        var name = "Jessy";
        // When
        var request = Map.ofEntries(
                Map.entry("email", email),
                Map.entry("name", name)
        );
        restTemplate.postForEntity("/customers", request, Void.class);
        // Then
        emailService.verify(HttpRequest.request()
                .withPath("/send")
                .withMethod("POST")
                .withHeader(Header.header("Content-Type", "application/json"))
                .withBody(JsonBody.json(new EmailService.SendEmailRequest("WELCOME", email)))
        );
        var customer = repository.findByEmail(email).orElseThrow();
        assertThat(customer.getEmail()).isEqualTo(email);
        assertThat(customer.getName()).isEqualTo(name);
    }
 
    @After
    public void after() {
        repository.deleteAll();
        // reset() clears this test's expectations without shutting down the
        // shared MockServer container, so that other tests can keep reusing it
        emailService.reset();
    }
 
}

And this pretty much concludes our end-to-end test, making sure that:

  1. the application starts up against a real Postgres container (running all Flyway migrations along the way)
  2. registering a customer triggers a correctly formatted POST /send request to the (mocked) external e-mail service
  3. the new customer ends up in the database with the right e-mail address and name

Conclusion

Integration tests and end-to-end tests can be incredibly useful for making sure that the different components of your application are compatible with each other, or for simulating real-world scenarios.

Please keep in mind, though, that integration tests and end-to-end tests are no substitute for unit tests. Even though they can do a great job at simulating actual user behaviour, they're no match for unit tests when it comes to speed or failure isolation.

As a general guideline, you might consider keeping Google's 70/20/10 recommendation in mind, stating that unit tests should make up the bulk of your test suite (70%), followed by integration tests (20%) and end-to-end tests (10%).

A pyramid, consisting of 70% unit tests (at the base), 20% integration tests (in the middle) and 10% end-to-end tests (at the top)

Notes on Performance

When you introduce Testcontainers into your integration tests, it's easy to understand why they will take longer to run:

  1. Docker images must be downloaded
  2. Docker containers must be started, and we must wait until they're ready to accept connections

To minimize the performance hit of #1, I recommend you don't throw away your Docker images once they've been downloaded.

If your integration tests are run automatically as part of a CI pipeline, then there should be a way for your build agent to cache the Docker images, so they can be shared among different test runs.

To minimize the performance hit of #2, I recommend that you keep all of your integration tests in the same module, so they all have direct access to this static Containers utility class we've discussed in this article.

If each integration test

  1. starts the container environment via Containers.ensureRunning(), and
  2. cleans up any data it creates (like we did with repository.deleteAll()),

then it should be possible for your entire test suite to share the same Docker environment. JUnit 4 makes no guarantees when it comes to the order of your tests, but as long as every integration test calls the Containers.ensureRunning() method as part of its static @BeforeClass method, each container in the environment should, in theory, only have to be started once.
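
For example, a second (hypothetical) test class can reuse the exact same environment simply by calling that method from its own @BeforeClass hook:

// Hypothetical second test class: because Containers.ensureRunning() only
// starts containers that aren't running yet, this class simply reuses the
// containers started by whichever integration test happened to run first.
@RunWith(SpringRunner.class)
@SpringBootTest(
        classes = Application.class,
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT
)
public class AnotherIntegrationTest {

    @BeforeClass
    public static void beforeAll() {
        Containers.ensureRunning(); // no-op if the containers are already up
    }

    @Test
    public void anotherIntegrationTest() {
        // ...
    }
}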