We have a microservice based on Spring Boot 2.7.6. For testing we use DbUnit/JUnit 4 and an H2 database. We have some integration tests which establish a database baseline via a schema.sql and some DbUnit XML fixtures. Those tests are marked with @DirtiesContext.
My understanding of @DirtiesContext is that after the test, both the database and the Spring context are discarded and rebuilt. Until now this has worked well.
(Almost) out of the blue, those tests now fail. The error message is java.lang.IllegalStateException: Failed to load ApplicationContext, caused by a sequence that cannot be created because it already exists. This sequence is created by the schema.sql. My understanding is that the error message implies the databases are not being discarded between the tests.
The current changes on the branch are purely business-driven. While some classes have moved to different packages (which seems to affect the initialization order in Spring), no changes were made to the test framework itself or the like.
I can't isolate the change that triggered the behavior, but some experimentation showed that downgrading the H2 database from 2.1.212 to 1.4.199 reduces the problem (fewer tests fail), and downgrading to 1.4.190 resolves it entirely.
The question now is: is this some form of bug? If so, is it a bug in H2 2.1 or in 1.4? Or is my understanding incorrect that @DirtiesContext should clean out the database?
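For context, the failing tests follow this general shape (class, method, and sequence names here are illustrative, not taken from the actual project):

```java
// Illustrative sketch of the setup described above (names are hypothetical).
// @DirtiesContext with AFTER_EACH_TEST_METHOD is expected to discard the
// Spring context -- and with it the in-memory H2 database -- after every
// test method, so schema.sql runs against a fresh database each time.
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_EACH_TEST_METHOD)
public class CustomerRepositoryIT {

    @Test
    public void baselineIsRecreatedForEachTest() {
        // at this point the sequence created by schema.sql should
        // exist exactly once, in a freshly initialized database
    }
}
```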
Related
I have a Spring + Hibernate application that uses a Postgres database. I need to write unit tests for the controllers. For the tests I wanted to use an H2 database, but unfortunately the test crashes during create-drop, leaving me with the information that the bpchar data type is invalid. I wonder how to solve this issue so I can run the tests.
I can't change my bpchar columns to varchar; they need to stay as they are. I also tried setting PostgreSQL compatibility mode, but it didn't help.
Am I right that the only solution I have is to use embedded postgres database in order to perform tests or is there any other approach that I could use?
You are trying to use a Postgres-specific data type with H2, which does not have it. Of course, it does not work.
If you cannot change the type of this field, use an embedded Postgres in your tests.
Actually, you can declare the type in your application.properties to let H2 know about it:
spring.datasource.url=jdbc:h2:mem:testdb;INIT=CREATE TYPE BPCHAR AS CHARACTER NOT NULL
Also make sure auto-configuration of the test database is turned off for your test. You can do this by adding:
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
public class MyTestForTableWithBpcharColumn {
    ...
}
One interesting approach to this issue is Testcontainers.
Postgres doesn't have an embedded mode, but you can use the aforementioned framework to start a Docker container before the test, create a schema, and apply migrations if you're using something like Flyway or Liquibase, or integrate your custom solution.
The idea is that the container will be prepared and available to the test when it runs.
After the test finishes (regardless of the actual result, success or failure) you can stop the container.
Firing up the container can be quite expensive (a matter of seconds), but you can take advantage of Spring's context caching during tests: the container is started when the first test in the module runs, and it is then reused between tests and test cases, since the application context doesn't get restarted.
Keeping the database clean between tests also becomes a trivial task thanks to Spring's @Transactional annotation on a test case: Spring artificially rolls back the transaction after each test. Since in Postgres even DDL commands can be transactional, this should be good enough.
The only limitation of this approach is that Docker must be available on the build machine, or on the local development machine if you're planning to run these tests locally (on Linux and macOS this is not a problem; on Windows you need at least Windows 10 Professional to be able to install the Docker environment).
I've used this approach in real projects and found it very effective for integration testing.
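A minimal sketch of this setup with the Testcontainers JUnit 4 rule (the image tag and the way you wire the coordinates into your datasource are illustrative):

```java
// Sketch: one PostgreSQL container for the whole test class.
// @ClassRule starts the container before the first test and stops it
// after the last one; Spring's context caching keeps it shared across
// test classes that use the same configuration.
@ClassRule
public static PostgreSQLContainer<?> postgres =
        new PostgreSQLContainer<>("postgres:13");

@Test
public void usesContainerDatabase() {
    // point your datasource at the running container, e.g.:
    String url  = postgres.getJdbcUrl();
    String user = postgres.getUsername();
    String pass = postgres.getPassword();
    // ... configure the Spring datasource with these values ...
}
```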
We're using a Spring/Hibernate application and JUnit to execute tests (using IntelliJ). When I have to develop or execute a test, the application startup time is about 10-30 seconds, depending on the number of Hibernate entities to be initialized.
I'm wondering if there is any way to minimize the initialization time. My idea is to have an application server instance where the initialized application keeps running, so that I can execute JUnit tests against it without rebuilding the Spring/Hibernate context after each test finishes, and instead re-execute them immediately after adapting the code.
So my question is whether there is a feasible approach; any thoughts/feedback is greatly appreciated. For simplification, let's assume that only test code is going to change, not production code, as that would probably require hot-swapping or a similar mechanism.
Best regards,
Niko
I'm working on a Java/Spring app that requires audit logs to be written to a database. All services are currently marked as @Transactional, so if there is a failure, the changes are rolled back.
But audit logging is the exception to this: it should always succeed, so I have been considering marking the AuditLogService as either Propagation.NOT_SUPPORTED or Propagation.REQUIRES_NEW.
The question is how to craft the integration tests. Ideally these should not leave log entries in the database, and I would prefer not to have to delete them manually at the end of the test. Is there perhaps a way of marking a test as transactional that would include all transactions, even ones started via Propagation.REQUIRES_NEW?
I ended up doing exactly what I said I didn't want to do and deleting all the operational data at the end of each test. (This actually worked out better, as the tests no longer ran in an overarching transaction, which had masked some bugs, e.g. relating to Hibernate lazy loading.)
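For reference, Spring ships a small helper for exactly this kind of teardown; a sketch of deleting the operational data after each test (the table names are illustrative, not from the actual project):

```java
// Sketch: wipe the tables a test may have written to, after every test.
// JdbcTestUtils.deleteFromTables issues a DELETE per table and returns
// the total number of rows removed.
@Autowired
private JdbcTemplate jdbcTemplate;

@After
public void cleanUp() {
    JdbcTestUtils.deleteFromTables(jdbcTemplate, "audit_log", "orders");
}
```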
We run JUnit tests against our Java codebase. Each test reads/writes some data to a MySQL database (possibly multiple tables). It seems that the tests leave behind data that interferes with the tests that run after them. Is it possible to abort/roll back all changes made by a test at the end of each unit test?
We are using the Cactus framework to test EJBs in the GlassFish application server. The EJBs can call code in the AS that can read/write to the DB.
We are using hibernate and jdbc to talk to the DB.
A possible solution would be resetting the database after each test with a simple database script, but this would consume a LOT of time.
If you are running integration tests, then you are using real EJBs, and there is not much you can do about this, because it would be complicated to make them understand when the test begins and ends. For a simple operation you could force an exception to cause a rollback, but if you use more than one transaction in a single test this won't work.
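Outside of container-managed transactions, the rollback idea can be sketched with plain JDBC (the connection details are placeholders, and this only covers the single-transaction case noted above):

```java
// Sketch: run test logic inside a manually managed transaction and
// always roll back, so no data survives the test. URL and credentials
// are placeholders for your actual MySQL test database.
Connection con = DriverManager.getConnection(
        "jdbc:mysql://localhost/testdb", "user", "password");
try {
    con.setAutoCommit(false);   // open a transaction
    // ... exercise the code under test using this connection ...
} finally {
    con.rollback();             // discard everything the test wrote
    con.close();
}
```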
I (after a while without touching JPA) started a project with Hibernate 4.1, JPA 2 and Spring. It is a Maven project with a common lifecycle.
Upon running mvn test I get the expected result that the required database objects are not found (the connection succeeds as expected). All my research and experiments, however, proved insufficient to set up what seems to be a common situation.
I expect to be able to have Maven drop/create the schema on the local development database while executing unit tests; I imagined that hibernate3-maven-plugin (version 3.0 ideally) should handle this, but I didn't manage to get it working. I don't expect any automatic insertion of data (for this I could use DBUnit, or even better have each test generate its own test data, but that plays no role here), but I do expect the schema to be refreshed on the test database, reflecting the current state of my annotated model classes. I suppose this would be bound to the process-test-resources phase.
I expect to generate a file (or set of files) with the (current) schema definition, but the best result I got reflects the issue described here: Maven + Spring + Hibernate: hibernate3-maven-plugin hbm2ddl fails with "Caused by: java.lang.NullPointerException" (no solution so far).
Am I missing something silly, or is it really not possible at this time?
I would be very happy if someone could provide me any of
proper documentation of how this is supposed to be achieved
a working example using Hibernate 4
guidelines on practical ways to achieve my goals with some other strategy.
If it's of any relevance, the database is Postgres 9.1.
Thanks in advance.
One way of doing it is to use Arquillian. You can create a separate deployment package for each test or for a suite of tests, with its own persistence.xml and datasources. Use the hbm2ddl setting in persistence.xml to either create-drop or update the schema:
<property name="hibernate.hbm2ddl.auto" value="create-drop" />
If you want to prepopulate the database, you can either add an import.sql to your deployment, which will be executed by Hibernate on application startup, or use the Arquillian Persistence extension.
Here is a complete Arquillian test setup as an example:
@RunWith(Arquillian.class)
public class MyTest {

    @Deployment
    public static Archive<?> createTestArchive() {
        return ShrinkWrap.create(WebArchive.class, "myTest.war")
                .addPackages(true, "com.example")
                .addAsResource("testSeeds/seed.sql", "import.sql")
                .addAsResource("in-mem-persistence.xml", "META-INF/persistence.xml")
                .addAsWebInfResource("in-mem-datasource.xml");
    }
}
One downside is that the in-container tests will be slower than simple unit tests.
I am not sure how well Arquillian plays with Spring; I have only used it for Java EE applications, so please tell me if it is of any help.
Again, an answer to my own question.
To achieve the result I wanted, I had to chain in Maven an execution of hibernate3-maven-plugin's hbm2ddl goal in the process-classes phase with an execution of sql-maven-plugin in the process-test-resources phase. The first generates the SQL for Hibernate's model and the second applies it to the database.
With hindsight, this actually seems a decent and clean solution.
I'm still interested in knowing the best practices, if they differ from my solution, for setting up the database for testing with Maven from a Hibernate model.
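The plugin chain described above can be sketched in the pom.xml roughly as follows (versions, output paths and connection settings are placeholders, not the actual project's configuration):

```xml
<!-- Sketch of the chained executions: hbm2ddl generates the schema SQL
     in process-classes, then sql-maven-plugin applies it to the test
     database in process-test-resources. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>hibernate3-maven-plugin</artifactId>
  <version>2.2</version>
  <executions>
    <execution>
      <phase>process-classes</phase>
      <goals><goal>hbm2ddl</goal></goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>sql-maven-plugin</artifactId>
  <version>1.5</version>
  <executions>
    <execution>
      <phase>process-test-resources</phase>
      <goals><goal>execute</goal></goals>
      <configuration>
        <!-- path of the DDL generated by the previous plugin;
             driver, url and credentials go here as well -->
        <srcFiles>
          <srcFile>target/hibernate3/sql/schema.ddl</srcFile>
        </srcFiles>
      </configuration>
    </execution>
  </executions>
</plugin>
```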