In JUnit 3, Do Tests Share a Spring Container?

In JUnit 3, are tests run within the same Spring container, or does each test get its own container?
What can be done to prevent tests from 'bleeding' session data into other tests?

I do not know the details of JUnit 3, but I recommend that you upgrade to JUnit 4.
Now, if @ContextConfiguration (or @WebAppConfiguration) is used to specify which application context is used and the SpringJUnit4ClassRunner is used to execute the tests, then the context is cached between tests. From the reference docs:
Once the TestContext framework loads an ApplicationContext (or WebApplicationContext) for a test, that context will be cached and reused for all subsequent tests that declare the same unique context configuration within the same test suite. To understand how caching works, it is important to understand what is meant by unique and test suite.
An ApplicationContext can be uniquely identified by the combination of configuration parameters that are used to load it. Consequently, the unique combination of configuration parameters are used to generate a key under which the context is cached.
The reference docs go on to list the configuration parameters that are used to create the context cache key.
What kind of session data is "bleeding"? Use the @Transactional annotation to roll back any database changes for tests that write to the database. The @DirtiesContext annotation can be used to re-create the current application context if it gets polluted, but typically that can be avoided by specifying different @ActiveProfiles. Read more about Spring's integration testing annotations in the reference docs.
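For illustration, a JUnit 4 test class wiring these annotations together might look like the sketch below. The context file name and the OrderService bean are hypothetical; @Transactional rolls each test's database changes back, and the context declared via @ContextConfiguration is cached across all test classes that declare the same configuration:

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

// Every test class declaring the same configuration reuses the cached context.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:app-context.xml")
@Transactional // changes made by each test are rolled back when it finishes
public class OrderServiceIntegrationTest {

    @Autowired
    private OrderService orderService; // hypothetical bean under test

    @Test
    public void writesDoNotBleedIntoOtherTests() {
        // rows written here are rolled back automatically after the test
        orderService.placeOrder("test-order");
    }
}
```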

I believe the answer to the first question depends on the system running the tests. Eclipse reuses the context. Maven loads a fresh one per test.
Part 2: use @DirtiesContext


Edit and re-run spring boot unit test without reloading context to speed up tests

I have a Spring Boot app and have written unit tests using a Postgres test container (https://www.testcontainers.org/) and JUnit. The tests have the @SpringBootTest annotation, which loads the context and starts up a test container before running the test.
Loading the context and starting the container takes around 15 sec on my relatively old MacBook, but the tests themselves are pretty fast (< 100 ms each). So in a full build with hundreds of tests, this does not really matter; it is a one-time cost of 15 sec.
But developing/debugging the tests individually in an IDE becomes very slow. Every single test incurs a 15 sec startup cost.
I know IntelliJ and Spring Boot support hot reload of classes while the app is running. Are there similar solutions/suggestions for doing the same for unit tests? I.e., keep the context loaded and the test container (DB) running, but recompile just the modified test class and run the selected test again.
There is a simple solution for your issue, I believe. You haven't specified how exactly you run the test container in the test; however, I have had success with the following approach:
For tests running locally - start a Postgres server on your laptop once (say, at the beginning of your working day). It can be a dockerized process or even a regular PostgreSQL installation.
During the test, the Spring Boot application doesn't really know that it interacts with a test container - it gets host/port/credentials and that's it; it creates a DataSource out of these parameters.
So for your local development, you can modify the integration with the test container so that the actual test container is launched only if there is no "LOCAL.TEST.MODE" env variable defined (you can pick any name - it's not something that already exists).
Then, define the env variable on your laptop (or use a system property for that - whatever works better for you) and configure Spring Boot's DataSource to use the properties of your local installation if that property is defined:
In a nutshell, it can be something like:
@Configuration
@ConditionalOnProperty(name = "test.local.mode", havingValue = "true", matchIfMissing = false)
public class MyDbConfig {

    @Bean
    public DataSource dataSource() {
        // create a data source initialized with local credentials
        // (placeholder values - point these at your local PostgreSQL)
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setUrl("jdbc:postgresql://localhost:5432/mydb");
        dataSource.setUsername("postgres");
        dataSource.setPassword("postgres");
        return dataSource;
    }
}
Of course, a more "clever" solution with configuration properties can be implemented; it all depends on how you integrate with test containers and where the actual properties for the data source initialization come from, but the idea remains the same:
In your local env you'll actually work with a locally installed PostgreSQL server and won't even start the test container.
Since all operations in PostgreSQL, including DDL, are transactional, you can put a @Transactional annotation on the test and Spring will roll back all the changes done by the test, so the DB won't fill up with garbage data.
As opposed to Testcontainers, this method has one significant advantage:
If your test fails and some data remains in the database, you can inspect it locally because the server stays alive. So you'll be able to connect to the DB with pgAdmin or something similar and examine the state...
Update 1
Based on the OP's comment
I see what you're saying. Basically, you've mentioned two different issues, which I'll address separately.
Issue 1: the application context takes about 10-12 seconds to start.
Ok, this is something that requires investigation. Chances are that some bean gets initialized slowly, so you should find out why exactly the application starts so slowly:
The code of Spring itself (scanning, bean definition population, etc.) takes fractions of a second and is usually not a bottleneck by itself - the slowdown must be somewhere in your application.
Checking bean startup times is somewhat out of scope for this question, although there are certainly methods to do so; for example:
see this thread, and for newer Spring versions, if you use Actuator, this one. So I'll assume you'll figure out one way or another why it starts slowly.
Anyway, what can you do with this kind of information, and how can you make the application context load faster?
Well, obviously you can exclude the slow bean/set of beans from the configuration; maybe you don't need it at all in the tests, or at least you can use @MockBean instead - this varies greatly depending on the actual use case.
In some cases it's also possible to provide configuration that still loads that slow bean but alters its behavior so that it isn't slow.
I can also point out some "generally applicable ideas" that can help regardless of your actual code base.
First of all, if you're running different test cases (multi-select tests in the IDE and run them all at once) that share exactly the same configuration, then Spring Boot is smart enough not to re-initialize the application context. This is called application context caching. Here is one of the numerous tutorials on the topic.
Another approach is lazy bean initialization. In Spring Boot 2.2+ there is a property for that:
spring:
  main:
    lazy-initialization: true
Of course, if you're not planning to use it in production, define it in a src/test/resources configuration file of your choice; Spring Boot will read it during tests as well, as long as it adheres to the naming convention. If you have technical issues with this (again, out of scope of the question), consider reading this tutorial.
If your Spring Boot version is older than 2.2, you can try to do that "manually": here is how.
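One way to sketch that "manual" approach (for test configuration only; the class name here is made up) is a BeanFactoryPostProcessor that flags every bean definition as lazy, so beans are only created on first use:

```java
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Register this configuration in test code only (e.g. via @Import on a test
// class), so production startup behavior stays unchanged.
@Configuration
public class LazyInitTestConfig {

    @Bean
    public static BeanFactoryPostProcessor lazyInitPostProcessor() {
        return (ConfigurableListableBeanFactory beanFactory) -> {
            for (String name : beanFactory.getBeanDefinitionNames()) {
                // mark every bean definition lazy so it is created on demand
                beanFactory.getBeanDefinition(name).setLazyInit(true);
            }
        };
    }
}
```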
The last direction I would like to mention is reconsidering your test implementation. This is especially relevant if you have a big project to test. Usually an application is separated into layers, like services, DAOs, controllers, you know. My point is that tests involving the DB should be used only for the DAO layer - this is where you test your SQL queries.
The business logic code usually doesn't require a DB connection and, in general, can be covered by unit tests that do not use Spring at all. So instead of using the @SpringBootTest annotation, which starts the whole application context, you can load only the configuration of the DAO(s); chances are this will start much faster, and the "slow beans" belong to other parts of the application. Spring Boot even has a special annotation for it (they have annotations for everything ;) ): @DataJpaTest.
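A DAO-layer slice test along those lines might look like the sketch below (OrderRepository is a hypothetical Spring Data repository; @DataJpaTest starts only JPA-related beans instead of the whole application context):

```java
import static org.junit.Assert.assertTrue;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.test.context.junit4.SpringRunner;

// Starts only entities, repositories, and a (typically embedded) DataSource,
// not services, controllers, or other beans of the application.
@RunWith(SpringRunner.class)
@DataJpaTest
public class OrderRepositoryTest {

    @Autowired
    private OrderRepository orderRepository; // hypothetical repository

    @Test
    public void findsNothingInEmptyDatabase() {
        assertTrue(orderRepository.findAll().isEmpty());
    }
}
```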
This is based on the idea that the whole Spring testing package is intended for integration tests; in general, a test that starts Spring is an integration test, and you'll probably prefer to work with unit tests wherever possible because they're much faster and do not use external dependencies: databases, remote services, etc.
The second issue: the schema often goes out of sync
In my current approach, the test container starts up, Liquibase applies my schema, and then the test is executed. Everything gets done from within the IDE, which is a bit more convenient.
I admit I haven't worked with Liquibase; we've used Flyway instead, but I believe the answer will be the same.
In a nutshell - this will keep working like that and you don't need to change anything.
I'll explain.
Liquibase is supposed to start along with the Spring application context and apply the migrations, that's true. But before actually applying the migrations, it checks whether they are already applied, and if the DB is in sync it does nothing. Flyway maintains a table in the DB for that purpose, and I'm sure Liquibase uses a similar mechanism.
So as long as you're not creating tables or something similar in the test, you should be good to go:
Assuming you're starting the Postgres server for the first time, the first test you run "at the beginning of your working day", following the aforementioned use case, will create the schema and deploy all the tables, indices, etc. with the help of the Liquibase migrations, and then run the test.
However, when you run the second test, the migrations will already be applied. It's equivalent to restarting the application itself in a non-test scenario (staging, production, whatever) - the restart itself won't re-apply all the migrations to the DB. The same goes here...
Ok, that's the easy case, but you probably populate data inside the tests (well, you should be ;) ). That's why I mentioned in the original answer that it's necessary to put the @Transactional annotation on the test itself.
This annotation opens a transaction before running the code in the test and artificially rolls it back afterwards - read: removes all the data populated in the test, despite the fact that the test has passed.
Now, to make it more complicated: what if you create tables or alter columns on existing tables inside the test? Well, this alone would drive Liquibase crazy even in production scenarios, so you probably shouldn't do that; but again, putting @Transactional on the test itself helps here, because PostgreSQL's DDL commands (just to clarify, DDL = Data Definition Language, so I mean commands like ALTER TABLE - basically anything that changes an existing schema) are also transactional. I know that Oracle, for example, didn't run DDL commands in a transaction, but things might have changed since then.
I don't think you can keep the context loaded.
What you can do is activate the reusable containers feature of Testcontainers. It prevents the container's destruction after a test has run.
You'll have to make sure that your tests are idempotent, or that they remove all the changes made to the container after completion.
In short, you should add .withReuse(true) to your container definition and add testcontainers.reuse.enable=true to ~/.testcontainers.properties (this is a file in your home dir).
Here's how I define my testcontainer to test my code with Oracle.
import org.testcontainers.containers.BindMode;
import org.testcontainers.containers.OracleContainer;

public class StaticOracleContainer {
    public static OracleContainer getContainer() {
        return LazyOracleContainer.ORACLE_CONTAINER;
    }

    private static class LazyOracleContainer {
        private static final OracleContainer ORACLE_CONTAINER = makeContainer();

        private static OracleContainer makeContainer() {
            final OracleContainer container = new OracleContainer()
                    // Username which Testcontainers is going to use
                    // to find out if the container is up and running
                    .withUsername("SYSTEM")
                    // Password which Testcontainers is going to use
                    // to find out if the container is up and running
                    .withPassword("123")
                    // Tell Testcontainers that these ports should
                    // be mapped to external ports
                    .withExposedPorts(1521, 5500)
                    // The Oracle database is not going to start if less than
                    // 1 GB of shared memory is available, so this is necessary
                    .withSharedMemorySize(2147483648L)
                    // This is the same as giving the container
                    // -v /path/to/init_db.sql:/u01/app/oracle/scripts/startup/init_db.sql
                    // Oracle will execute init_db.sql after the container is started
                    .withClasspathResourceMapping("init_db.sql",
                            "/u01/app/oracle/scripts/startup/init_db.sql",
                            BindMode.READ_ONLY)
                    // Do not destroy the container
                    .withReuse(true);
            container.start();
            return container;
        }
    }
}
As you can see, this is a singleton. I need it to control the Testcontainers lifecycle manually, so that I can use reusable containers.
If you want to know how to use this singleton to add Oracle to Spring test context, you can look at my example of using testcontainers. https://github.com/poxu/testcontainers-spring-demo
There's one problem with this approach, though: Testcontainers is never going to stop a reusable container. You have to stop and destroy the container manually.
I can't imagine some hot-reload magic flag for testing - there is just so much stuff that can dirty the Spring context, dirty the database, etc.
In my opinion, the easiest thing to do here is to locally replace the test container initializer with a manual container start and change the database properties to point to this container. If you want some automation for this, you could add a before-launch script (if you are using IntelliJ...) that does something like docker start postgres || docker run postgres (Linux), which starts the container if it's not running and does nothing if it is.
Usually the IDE recompiles just the changed classes anyway, and the Spring context probably won't take 15 secs to start without a container starting, unless you have a lot of beans to configure as well...
I'm trying to learn testing with Spring Boot, so sorry if this answer is not relevant.
I came across this video that suggests a combination of (in order of most to least used):
Mockito unit tests with the @Mock annotation, with no Spring context, when possible
Slice tests using the @WebMvcTest annotation, when you want to involve some of the Spring context
Integration tests with the @SpringBootTest annotation, when you want to involve the entire Spring context

Use one spring boot context through all SpringBootTests

I want to be able to cache the application context across different test classes using JUnit.
Test classes are declared this way:
@SpringBootTest
@RunWith(SpringRunner.class)
public class SomeIntegrationTest {
}
I saw this question, Reuse spring application context across junit test classes, but in this case I don't use any XML, and I want to start the context completely, not just a few beans from it, so @SpringBootTest is more suitable than @ContextConfiguration, if I got it right.
Ruslan, so your question is how to reuse the Spring Boot context for a JUnit suite, right?
Then it's almost provided out of the box; you just need to annotate each test with the @SpringBootTest annotation.
Also make sure that your main @SpringBootApplication class loads all the necessary @Configuration classes. This happens automatically if the @SpringBootApplication class is in the root package above all configuration classes, since the inherited @ComponentScan will pick them all up.
From the Spring Boot Testing documentation:
Spring Boot provides a @SpringBootTest annotation which can be used as an alternative to the standard spring-test @ContextConfiguration annotation when you need Spring Boot features. The annotation works by creating the ApplicationContext used in your tests via SpringApplication.
The Spring TestContext framework stores application contexts in a static cache. This means that the context is literally stored in a static variable. In other words, if tests execute in separate processes the static cache will be cleared between each test execution, and this will effectively disable the caching mechanism.
To benefit from the caching mechanism, all tests must run within the same process or test suite. This can be achieved by executing all tests as a group within an IDE
From the Spring Testing documentation:
By default, once loaded, the configured ApplicationContext is reused for each test. Thus the setup cost is incurred only once per test suite, and subsequent test execution is much faster. In this context, the term test suite means all tests run in the same JVM
Check these URLs:
http://docs.spring.io/spring/docs/current/spring-framework-reference/html/integration-testing.html#testcontext-ctx-management-caching
http://docs.spring.io/spring/docs/current/spring-framework-reference/html/integration-testing.html#testing-ctx-management
Main Takeaways:
Annotate each test with @SpringBootTest
Load all beans and necessary configuration classes in your main @SpringBootApplication class
IMPORTANT: Run a JUnit suite, not a single JUnit test. Execute all tests as a group within your IDE.
@SpringBootTest provides ApplicationContext caching and sharing in JUnit test cases naturally.
But some cases are exceptions. For example, using @MockBean or @SpyBean in a test class will invalidate the ApplicationContext cache.
While Spring's test framework caches application contexts between tests and reuses a context for tests sharing the same configuration, the use of @MockBean or @SpyBean influences the cache key, which will most likely increase the number of contexts.
https://docs.spring.io/spring-boot/docs/2.1.5.RELEASE/reference/html/boot-features-testing.html
And this kind of issue is still not solved in the latest Spring Boot release.
https://github.com/spring-projects/spring-boot/issues/21099
https://github.com/spring-projects/spring-boot/issues/7174
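To illustrate the cache-key point, consider two otherwise identical test classes, shown together here for brevity (they would live in separate files; PaymentClient is a hypothetical bean). The second class declares a @MockBean, which becomes part of the context cache key, so it gets its own, separate application context:

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.test.context.junit4.SpringRunner;

// Shares the cached context with other plain @SpringBootTest classes.
@RunWith(SpringRunner.class)
@SpringBootTest
public class PlainContextTest {
    @Test
    public void contextLoads() { }
}

// The @MockBean declaration changes the context cache key, so this class
// forces a second, separate application context to be created.
@RunWith(SpringRunner.class)
@SpringBootTest
public class MockedContextTest {
    @MockBean
    private PaymentClient paymentClient; // hypothetical bean being mocked

    @Test
    public void contextLoadsWithMock() { }
}
```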

Is there a common way to validate if a (xml based) spring configuration is valid?

Question: Is there a common way to validate if a (xml based) spring configuration is valid?
Further explanation:
By "valid" I don't mean whether the XML itself is valid (I'm not talking about XSD validation); I mean more "logically valid", e.g. whether all referenced classes are available, or whether a specific reference is available / can be resolved.
The background of this question is a QA-process within a CI-environment for a spring-mvc application:
Assume a developer has a typo in a class name, or a reference is not unique in a web context configuration file, and he commits this change. Now an automated build is triggered:
The application compiles successfully
unit tests are "green" (since they don't need any Spring configuration)
integration tests are "green" (since the integration tests do not rely on web context configurations)
functional / regression testing starts
In the current setup we would notice this simple typo only in step 4 - but it is quite time-consuming to reach that point.
It would be great to have a mechanism / tool which can validate whether a Spring context can be loaded, in order to save some time.
integration tests does not rely on webcontext configurations
My integration tests run against the deployed application. If you want to test the web context, then... test it. The specifics depend on what you want to test.
You can use Maven or whatever build tool you have to deploy your app in a test environment and run the integration tests afterwards.
You can use a lightweight server like Grizzly, reuse it across test classes etc.
Since those tests are costly, I usually have one that checks that the app can be deployed and the context starts. I use Grizzly and run this test along with the rest of the unit tests so issues are detected asap. Other similar test cases can be added depending on the situation.
A simple way to "pre-validate" the XML configuration before integration testing would be to use a JUnit test using SpringJUnit4ClassRunner. This is available in the spring-test JAR.
e.g.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "/foo.xml" })
public class XmlValidate {

    @Test
    public void loadsWithoutError() {
        // nothing to do, SpringJUnit4ClassRunner will throw an exception
        // if there are any class-not-found exceptions etc.
    }
}

Spring context dirty after each integration test

I recently started as a freelancer on my current project. One of the things I threw myself at was the failing Jenkins build (it had been failing since April 8th, a week before I started here).
Generally speaking, you could see a buttload of DI issues in the log. The first thing I did was get all tests to work in the same way, starting from the same application context.
They had also implemented their own "mocking" thing, which seemed to fail to work correctly. After a discussion with the lead dev, I suggested starting to use Springockito. (For a certain module, they needed mocking for their integration testing - legacy reasons, which can't be changed.)
Anyway, stuff started failing badly after that. A lot of beans which were mocked in the tests simply weren't mocked, or weren't found, or whatever. Typically, it would fail on loading the application context, stating that one or another bean was missing.
I tried different approaches, but in the end, only the thing I most feared would work: adding @DirtiesContext to every single test. Now the Maven build is starting to turn green again, and tests are doing what they are supposed to do. But I am reloading the Spring context each and every time, which takes time - which is all relative, since the context loads in about 1-2 seconds.
A side note to this story: they've upgraded to Hibernate 4, and thus to Spring 3.2. Previously, they were using an older version of Spring 3. All tests were working back then, and the @DirtiesContext thing was not necessary.
Now, what worries me the most is that I can't immediately think of an explanation for this weird behaviour. It almost seems that Spring's context is dirtied simply by launching a test which uses @Autowired beans. Not all tests use mocks, so it can't be that.
Does this sound familiar to anyone? Has anyone had the same experiences with integration testing with (the latest version of) Spring?
On Stack Overflow, I've found this ticket: How can a test 'dirty' a spring application context?
It seems to pretty much sum up the behaviour I'm seeing, but the point is that we're autowiring services/repositories/..., and we don't have any setters on those classes whatsoever.
Any thoughts?
Thanks!
To answer my own question: the secret was in the Spring version. We were using Spring 3.1.3, whereas I had presumed they were using Spring 3.2 (they were constantly talking about a recent upgrade of the Spring version).
The explanation was here, a blog post I stumbled over in my hunt to get it fixed: Spring Framework 3.2 RC1: New Testing Features
And a copy paste of the relevant piece:
The use of generic factory methods in Spring configuration is by no means specific to testing, but generic factory methods such as EasyMock.createMock(MyService.class) or Mockito.mock(MyService.class) are often used to create dynamic mocks for Spring beans in a test application context. For example, prior to Spring Framework 3.2 the following configuration could fail to autowire the OrderRepository into the OrderService. The reason is that, depending on the order in which beans are initialized in the application context, Spring would potentially infer the type of the orderRepository bean to be java.lang.Object instead of com.example.repository.OrderRepository.
So, how did I solve this problem? Well, I took the following steps:
create a new Maven module
filter out the tests which needed mocking. All the non-mocked tests would run normally in a separate Failsafe run (I created a base package "clean" and sorted them out like that)
put all the mocked tests in a base package called "mocked", and make an additional Failsafe run for the mocked tests
Each mocked test uses Springockito to create the mocks. I'm also using the Springockito annotations, to easily do a @ReplaceWithMock in place. Every mocked test is then annotated with @DirtiesContext, so the context is dirtied after each test, and the Spring context is re-created for each test.
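A mocked test in that setup looked roughly like the sketch below (bean and context file names are made up; Springockito's context loader is what wires @ReplaceWithMock into the context, and the class-level @DirtiesContext marks the context dirty after every test method):

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.kubek2k.springockito.annotations.ReplaceWithMock;
import org.kubek2k.springockito.annotations.SpringockitoContextLoader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.annotation.DirtiesContext.ClassMode;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(loader = SpringockitoContextLoader.class,
        locations = "classpath:test-context.xml")
// re-create the context after every test method, since the mock dirties it
@DirtiesContext(classMode = ClassMode.AFTER_EACH_TEST_METHOD)
public class MockedOrderServiceIT {

    @ReplaceWithMock // swaps the real bean in the context for a Mockito mock
    @Autowired
    private OrderRepository orderRepository; // hypothetical mocked bean

    @Test
    public void usesTheMockedRepository() {
        // stub behaviour with Mockito as usual, e.g.
        // Mockito.when(orderRepository.count()).thenReturn(0L);
    }
}
```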
The only reasonable explanation I could give is that the context is effectively being dirtied because there is a framework (Springockito) taking over the management of the Spring beans from the Spring framework. I don't know if that's correct, but it's the best explanation I could come up with. That, in fact, is the definition of a dirty context, which is why we need to flag it as dirty.
Using this strategy, I got the build up and running again, and all tests run OK. It's not perfect, but it's working, and it's consistent.

Does SpringJUnit4ClassRunner Load the Context for Every Test or per Class?

My problem is that some tests are failing. I think that a function destroys the context, and because of that the tests fail.
Does Spring load the context anew for every test, for every test class, or just once for the test run?
Out of the box, with no configuration changes, Spring should load the context only once per test suite.
By default, once loaded, the configured ApplicationContext is reused for each test. Thus the setup cost is incurred only once per test suite, and subsequent test execution is much faster. In this context, the term test suite means all tests run in the same JVM — for example, all tests run from an Ant, Maven, or Gradle build for a given project or module. In the unlikely case that a test corrupts the application context and requires reloading — for example, by modifying a bean definition or the state of an application object — the TestContext framework can be configured to reload the configuration and rebuild the application context before executing the next test.
Source: http://static.springsource.org/spring/docs/3.2.x/spring-framework-reference/html/testing.html#testcontext-ctx-management
