Java Integration Testing
In this article, we'll explore a neat method for writing Docker-based integration tests, using the powerful Testcontainers Java library.
If you've ever worked with H2 or some other in-memory replacement for your actual database, you might have been frustrated at some point because the replacement was unable to support the exact SQL dialect used by your application.
Testcontainers solves this problem: during the integration tests, your application will be connected to real versions of the services you're using in production (think: `postgres`, `redis`, `rabbitmq`, or anything else that can be Dockerized).
Let me give a quick overview of how this is going to work:
- we use the Testcontainers library to pull real Docker images and start them as containers on random free ports
- we override system properties to make sure that our application is able to pick up the correct connection URLs
- we start the application itself on a random free port (it should connect to our running containers)
- we perform our test
- the Testcontainers library automatically shuts down and destroys all containers
Table of Contents
- The Application
- Dependencies
- Container 1: Postgres
- Database Test
- Container 2: Email Service
- End-to-End Test
- Conclusion
- Notes on Performance
The Application
Because it will be difficult to write integration tests without an actual application to test, I've set up a very simple Spring Boot application where customers can be registered.
You can find this application's source code on GitHub. If you would like to follow along, make sure to have the following tools installed:
- Java 13
- Maven
- Docker
Dependencies
You've got a few options when it comes to choosing which Testcontainers dependencies to include in your `pom.xml` file.
If your application depends on a popular database like Postgres, or a popular message broker like RabbitMQ, you should include one of the carefully crafted Testcontainers modules in your project. For example, because I'm using Postgres for this project, I'll make sure to add the Postgres Module to my `pom.xml` file, allowing me to use the `PostgreSQLContainer` class in my integration tests.
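For reference, that module can be pulled in with a dependency along these lines (the version shown is just an example, so check for the latest release):

```xml
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>postgresql</artifactId>
    <version>1.15.3</version> <!-- example version -->
    <scope>test</scope>
</dependency>
```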
You can find a list of all the special modules on the Testcontainers website.
If your application integrates with some external service via HTTP (think: Stripe, Twilio, Mailgun), and you would like to include this external service in your integration test, I recommend you install the MockServer Module, which can be used to spin up a generic HTTP interface via the `MockServerContainer` class.
MockServer is actually a project in its own right. You should use the Testcontainers MockServer Module to set up the Docker container, but I recommend you also include the MockServer client library in your project, which will allow you to actually intercept, process and verify network calls to this generic HTTP interface, using the `MockServerClient` class.
I will be adding both dependencies, because my sample application depends on a (nonexistent) external e-mail service.
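That means two test-scoped dependencies along these lines (again, the versions are just examples):

```xml
<!-- the Testcontainers MockServer module -->
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>mockserver</artifactId>
    <version>1.15.3</version> <!-- example version -->
    <scope>test</scope>
</dependency>
<!-- the MockServer client library itself -->
<dependency>
    <groupId>org.mock-server</groupId>
    <artifactId>mockserver-client-java</artifactId>
    <version>5.11.2</version> <!-- example version -->
    <scope>test</scope>
</dependency>
```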
If Testcontainers doesn't offer a special module for some specific Docker container required by your application, you should include the generic Testcontainers Maven dependency, giving you access to the `GenericContainer` class. Fun fact: `PostgreSQLContainer` and `MockServerContainer` are actually subclasses of `GenericContainer`.
Container 1: Postgres
I will set up a `Containers` utility class in my `src/test/java` folder, which can be used for interacting with (i.e. starting) the Docker container environment.
Here's how to create a Postgres container (with a single `postgres` schema) which we can connect to using the `admin` / `password` credentials.
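Something along these lines (the `postgres:12` tag is just an example image I picked):

```java
// a Postgres container with a single "postgres" schema and admin/password credentials
PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:12")
        .withDatabaseName("postgres")
        .withUsername("admin")
        .withPassword("password");
```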
In order to reuse the same container environment across multiple integration tests, let's keep the container as a static variable and only create it if it wasn't already created. Let's also start the container.
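A sketch of what the `Containers` class might look like at this point:

```java
import org.testcontainers.containers.PostgreSQLContainer;

public class Containers {

    // static, so every integration test shares the same container instance
    private static PostgreSQLContainer<?> postgres;

    public static void ensureRunning() {
        if (postgres == null) {
            postgres = new PostgreSQLContainer<>("postgres:12")
                    .withDatabaseName("postgres")
                    .withUsername("admin")
                    .withPassword("password");
            postgres.start();
        }
    }
}
```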
This is looking pretty good so far.
Whenever we've finished calling this `ensureRunning()` method from one of our integration tests, we'll know that an actual Postgres Docker container is running on some random port.
The next step is to find a way for our application to figure out which port to connect to. Because I'm using Spring Boot, I will define a system property holding the correct JDBC URL, which should be picked up automatically when Spring Boot starts.
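Since system properties take precedence over `application.yaml` values in Spring Boot, setting the standard `spring.datasource.*` keys at the end of `ensureRunning()` should do the trick (a sketch):

```java
// still inside Containers.ensureRunning(), after postgres.start()
System.setProperty("spring.datasource.url", postgres.getJdbcUrl());
System.setProperty("spring.datasource.username", postgres.getUsername());
System.setProperty("spring.datasource.password", postgres.getPassword());
```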
Note that we're only able to use this convenient `getJdbcUrl()` method because we've been working with the special `PostgreSQLContainer`.
If we were to set up Postgres using a `GenericContainer`, the code would have looked like this.
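Roughly something like the following, relying on the official `postgres` image and its standard environment variables (a sketch, not taken from the original repository):

```java
GenericContainer<?> postgres = new GenericContainer<>("postgres:12")
        .withEnv("POSTGRES_DB", "postgres")
        .withEnv("POSTGRES_USER", "admin")
        .withEnv("POSTGRES_PASSWORD", "password")
        .withExposedPorts(5432);
// the special module also knows how to wait until Postgres is ready;
// here we rely on the default "port is listening" check
postgres.start();

// no getJdbcUrl() here - the JDBC URL has to be assembled by hand
String jdbcUrl = "jdbc:postgresql://" + postgres.getHost() + ":"
        + postgres.getMappedPort(5432) + "/postgres";
System.setProperty("spring.datasource.url", jdbcUrl);
```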
Database Test
Let's consider a simplified version of the application. A version which doesn't send customers a welcome e-mail, but only saves them to the database.
Because I'm using Spring Boot, I can use the following JUnit template for my integration test, which will start the application on a random port.
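A sketch of that template (the class name is just a placeholder):

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
public class CustomerApplicationIT {

    @Test
    public void contextLoads() {
        // Spring Boot starts the full application on a random free port
        // before this (empty) test method runs
    }
}
```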
This won't work for me, however, because my application is using Flyway for schema migrations, which will try to run all SQL migration files (in the `db/migration` resources folder) against the database during application startup.
Because I've got a simple `V1__create_customer_table.sql` migration file, application startup fails with a database connection error: as expected, the application is trying to connect to a Postgres database, using the connection details specified in my `application.yaml` resources file.
The good news is that all of this can be fixed by calling the `Containers.ensureRunning()` method before our integration test is executed:
- A real Postgres container will be started
- We will override the `application.yaml` configuration by providing the correct values via system properties (this is very specific to the way Spring resolves configuration values)
The test below should pass without any errors.
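Here's a minimal sketch of such a test, using the `Containers` class from earlier:

```java
import org.junit.BeforeClass;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
public class ApplicationStartupIT {

    @BeforeClass
    public static void startContainers() {
        // starts Postgres (if necessary) and overrides the spring.datasource.* properties
        Containers.ensureRunning();
    }

    @Test
    public void applicationStarts() {
        // if the Spring context (including the Flyway migrations) fails to load,
        // this test fails before the empty method body is ever reached
    }
}
```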
Even though it looks like we aren't doing much here, it's actually a very useful test, because it answers an important question: will my application start?
Because Flyway runs all migration scripts during startup, this test also verifies that all of my migration scripts are free of errors. This is a pretty useful technique I personally use all the time when I add a new migration script and I quickly want to verify if it's error-free (without accidentally dirtying my local development database).
I won't go too much into the specifics of testing with Spring, so below you'll find a complete database integration test for the simplified version of the application (the version which only saves customers to the database and doesn't send them an e-mail).
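The code below is a sketch of what that test could look like; the `/customers` endpoint, the `Customer` class and the `CustomerRepository` interface are my assumptions about the sample application, so the real project may differ:

```java
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import org.junit.After;
import org.junit.BeforeClass;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.http.ResponseEntity;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
public class CustomerDatabaseIT {

    @Autowired
    private TestRestTemplate restTemplate;

    @Autowired
    private CustomerRepository repository;

    @BeforeClass
    public static void startContainers() {
        Containers.ensureRunning();
    }

    @After
    public void cleanUp() {
        // keep the shared database clean for the next test
        repository.deleteAll();
    }

    @Test
    public void registeredCustomerIsSavedToTheDatabase() {
        ResponseEntity<Void> response = restTemplate.postForEntity(
                "/customers", new Customer("Jessy", "jessy@example.com"), Void.class);

        // postForEntity() blocks until the request has been fully processed
        assertTrue(response.getStatusCode().is2xxSuccessful());
        assertEquals(1, repository.count());
    }
}
```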
I will end this section with two important observations about the integration test above.
- I'm relying on the fact that `restTemplate.postForEntity()` is a blocking method which only returns after the HTTP request has been completely processed; if your application processes requests asynchronously, I can recommend the Awaitility library, which allows you to wait until certain conditions are met (e.g. `repository.count() > 0`) before proceeding
- When the test finishes, I'm deleting all customer rows from the database; cleaning up after each test is pretty important if you want to share the container environment among different integration tests (more on this in Notes on Performance)
Container 2: Email Service
For the remainder of this article, let's consider the full version of the application we've discussed earlier: the version which also sends a welcome e-mail during registration.
Because my application depends on this external e-mail service, I will add a `MockServerContainer` to the test environment, which will allow me to verify whether my application is actually making the right HTTP requests.
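A sketch of the additions to the `Containers` class; the `email.service.url` key is a made-up property name, so substitute whichever key your application reads its endpoint from:

```java
// additional field in the Containers class
private static MockServerContainer mockServer;

public static void ensureRunning() {
    // ... the Postgres setup shown earlier ...

    if (mockServer == null) {
        // recent Testcontainers versions require an explicit DockerImageName here
        mockServer = new MockServerContainer();
        mockServer.start();
    }
    // point the application's EmailService at the container instead of the real provider
    System.setProperty("email.service.url", mockServer.getEndpoint() + "/send");
}
```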
As you can see, I'm using the same trick as before to override YAML configuration values via system properties.
This makes it possible for my application's `EmailService` component (responsible for sending HTTP requests to this external provider) to use one of the following two endpoints, depending on the environment:
- Production: `https://api.supermail123.com/send`
- Integration tests: `http://localhost:31592/send`, where `31592` denotes some random free port chosen by Testcontainers
All that's left to do now is to set up a `MockServerClient` for intercepting, processing and verifying incoming requests to this generic HTTP interface.
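One option is to expose a lazily created client from the same `Containers` class (again, just a sketch):

```java
// also inside the Containers class
private static MockServerClient mockServerClient; // org.mockserver.client.MockServerClient

public static MockServerClient mockServerClient() {
    if (mockServerClient == null) {
        mockServerClient = new MockServerClient(
                mockServer.getHost(), mockServer.getServerPort());
    }
    return mockServerClient;
}
```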
End-to-End Test
Let's switch back to our actual integration test now.
I will keep all infrastructural logic in the `Containers` class, so I can add all business logic for mocking this external e-mail service to the integration test itself.
You can set up a simple MockServer expectation as follows.
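For example, a sketch using MockServer's `request()` and `response()` builders (statically imported from `org.mockserver.model.HttpRequest` and `org.mockserver.model.HttpResponse`):

```java
// inside the integration test, after Containers.ensureRunning()
Containers.mockServerClient()
        .when(request()
                .withMethod("POST")
                .withPath("/send"))
        .respond(response()
                .withStatusCode(204));
```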
Here, I'm always returning the same static `204 No Content` response on every HTTP request to `/send`.
If you would like to inject some dynamic behaviour here, you can also pass a function to the `respond()` method.
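For instance, something along these lines, where the response header is purely illustrative:

```java
// compute a response per request instead of returning a canned one
Containers.mockServerClient()
        .when(request()
                .withMethod("POST")
                .withPath("/send"))
        .respond(httpRequest -> response()
                .withStatusCode(204)
                .withHeader("X-Received-Bytes",
                        String.valueOf(httpRequest.getBodyAsString().length())));
```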
Every `MockServerClient` also has a built-in `verify()` method for verifying whether a specific HTTP request was made to the MockServer.
We can even use a number of MockServer utilities like `JsonBody.json()` (used for converting POJOs to JSON) when verifying the payload of an incoming request.
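A sketch of such a verification; `EmailRequest` is a made-up class standing in for whatever JSON payload the application sends to the provider (`json()` comes from `org.mockserver.model.JsonBody`, `VerificationTimes` from `org.mockserver.verify`):

```java
// verify that exactly one e-mail request was sent for this customer
Containers.mockServerClient().verify(
        request()
                .withMethod("POST")
                .withPath("/send")
                .withBody(json(new EmailRequest("jessy@example.com", "Welcome!"))),
        VerificationTimes.once());
```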
With this knowledge, let's put everything together and finish up our integration test.
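Below is a sketch of the complete test; as before, the `/customers` endpoint, `Customer`, `CustomerRepository` and `EmailRequest` are assumptions about the sample application, and the imports match the database test above plus the MockServer ones already mentioned:

```java
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
public class CustomerRegistrationEndToEndIT {

    @Autowired
    private TestRestTemplate restTemplate;

    @Autowired
    private CustomerRepository repository;

    @BeforeClass
    public static void startContainers() {
        Containers.ensureRunning();
    }

    @After
    public void cleanUp() {
        repository.deleteAll();
        // remove expectations and recorded requests so tests don't influence each other
        Containers.mockServerClient().reset();
    }

    @Test
    public void registeredCustomerReceivesWelcomeEmail() {
        // the fake e-mail provider happily accepts every request
        Containers.mockServerClient()
                .when(request().withMethod("POST").withPath("/send"))
                .respond(response().withStatusCode(204));

        restTemplate.postForEntity(
                "/customers", new Customer("Jessy", "jessy@example.com"), Void.class);

        // the customer ended up in the database...
        assertEquals(1, repository.count());

        // ...and exactly one welcome e-mail was sent
        Containers.mockServerClient().verify(
                request()
                        .withMethod("POST")
                        .withPath("/send")
                        .withBody(json(new EmailRequest("jessy@example.com", "Welcome!"))),
                VerificationTimes.once());
    }
}
```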
And this pretty much concludes our end-to-end test, making sure that:
- our application can start
- we don't have any errors in our schema migration files
- a customer with the name of `Jessy` and an e-mail address of `jessy@example.com` can be registered, and that we're sending this customer a welcome e-mail (assuming our external e-mail provider is functioning without any errors)
Conclusion
Integration tests and end-to-end tests can be incredibly useful for making sure that the different components of your application are compatible with each other, or for simulating real-world scenarios.
Please keep in mind, though, that integration tests and end-to-end tests are no substitute for unit tests. Even though they can do a great job at simulating actual user behaviour, they're no match for unit tests when it comes to speed or failure isolation.
As a general guideline, you might consider keeping Google's 70/20/10 recommendation in mind, stating that unit tests should make up the bulk of your test suite (70%), followed by integration tests (20%) and end-to-end tests (10%).
Notes on Performance
When you introduce Testcontainers for your integration tests, it's easy to understand why they will take longer to run:
1. Docker images must be downloaded
2. Docker containers must be started, and we must wait until they're ready to accept connections
To minimize the performance hit of #1, I recommend you don't throw away your Docker images once they've been downloaded.
If your integration tests are run automatically as part of a CI pipeline, then there should be a way for your build agent to cache the Docker images, so they can be shared among different test runs.
To minimize the performance hit of #2, I recommend that you keep all of your integration tests in the same module, so they all have direct access to the static `Containers` utility class we've discussed in this article.
If each integration test
- is independent of other integration tests
- cleans up after itself (purges all rows in the database, etc.)
then it should be possible for your entire test suite to share the same Docker environment.
JUnit 4 makes no guarantees when it comes to the order of your tests, but as long as every integration test calls the `Containers.ensureRunning()` method as part of its static `@BeforeClass` method, each container in the environment should, in theory, only have to be started once.