I'm working on a Java/Spring app that requires audit logs to be written to a database. All services are currently marked as @Transactional, so if there is a failure, the changes are rolled back.
But audit logging is the exception to this: it should always succeed. So I have been considering marking the AuditLogService as either Propagation.NOT_SUPPORTED or Propagation.REQUIRES_NEW.
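For illustration, the REQUIRES_NEW variant would look roughly like this (a sketch; AuditLogRepository and AuditLogEntry stand in for my actual classes):

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

@Service
public class AuditLogService {

    private final AuditLogRepository repository; // placeholder repository

    public AuditLogService(AuditLogRepository repository) {
        this.repository = repository;
    }

    // REQUIRES_NEW suspends the caller's transaction and commits the audit
    // entry independently, so a rollback in the caller does not undo it.
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void log(String message) {
        repository.save(new AuditLogEntry(message)); // placeholder entity
    }
}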
The question is how to craft the integration tests. Ideally these should not leave log entries in the database, and I would prefer not to have to manually delete them at the end of the test. Is there perhaps a way of marking a test as transactional that would include all transactions, including ones that have started via Propagation.REQUIRES_NEW?
I ended up doing exactly what I said I didn't want to do and deleting all the operational data at the end of each test. (This actually worked better, as the tests were no longer run in an overarching transaction, which had masked some bugs, e.g. relating to Hibernate lazy loading.)
We have a microservice based on Spring Boot 2.7.6. For testing we employ DbUnit/JUnit 4 and an H2 database. We have some integration tests which establish a database baseline via a schema.sql and then some DbUnit XML magic. Those tests are marked with @DirtiesContext.
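Roughly, such a test looks like this (a sketch assuming the spring-test-dbunit integration; class and file names are simplified):

import com.github.springtestdbunit.DbUnitTestExecutionListener;
import com.github.springtestdbunit.annotation.DatabaseSetup;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.TestExecutionListeners;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext // should discard the Spring context (and, we assumed, the H2 database)
@TestExecutionListeners(value = DbUnitTestExecutionListener.class,
        mergeMode = TestExecutionListeners.MergeMode.MERGE_WITH_DEFAULTS)
@DatabaseSetup("/baseline-dataset.xml") // the DbUnit XML baseline on top of schema.sql
public class SomeIntegrationTest {

    @Test
    public void someBusinessCase() {
        // exercises the service against the baseline data
    }
}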
My understanding of @DirtiesContext is that after the test both the database and the Spring context should be discarded and rebuilt. Up to now this has worked well.
(Almost) out of the blue, those tests now fail. The error message is java.lang.IllegalStateException: Failed to load ApplicationContext, with the cause that a sequence cannot be created because it already exists. This sequence is created by the schema.sql. To my understanding, the error message implies the databases are not discarded between the tests.
The current changes on the branch are purely business-driven. While some classes have changed their packages (which seems to have some effect on the ordering during initialization by Spring), no changes to the test framework itself or the like were made.
I can't isolate the change that triggered the behavior, but some experimentation showed that downgrading the H2 database from 2.1.212 to 1.4.199 relaxes the problem (fewer tests fail) and downgrading to 1.4.190 resolves the issue.
The question now is: is this some form of bug? If so, a bug in H2 2.1 or in 1.4? Is my understanding incorrect that @DirtiesContext should clean out the database?
In general, I write integration tests from my service/remoting layer down to the database so that I can check that the server-side layers are integrated and tested. I would like to keep rollback set to false; otherwise we miss out on database constraint-level validation. It is a personal preference.
We can follow different approaches:
- Create data for each test case and delete it once the test has executed
- Run with a certain amount of existing common data, such as a shared User
Some entities depend on several other entities, and testing such flows requires a lot of effort if every entity has to be created for each test case or class. Alternatively, for a business flow we may decide to create a certain amount of data up front, execute the flow across a number of tests, and then clear the data. Either way, running such test cases can consume a lot of time.
Is there an effective approach or best practice followed in the industry for writing integration tests in continuous integration environments? I normally use TestNG as it provides Spring support. Are there any Java-based frameworks?
I think it really depends on the project, and there is no silver-bullet solution here.
There are indeed many approaches, as you state; I'll mention a few:
- Take advantage of Spring's @Transactional annotation on the test. In this case, Spring will execute a rollback after each test, so that the data changed by the test won't actually be saved in the database even if the test passes (see the sketch after this list).
- Do not use @Transactional, but organize tests so that they won't interfere (each test uses its own set of data that can co-exist with the other tests' data). If a test fails and doesn't clean up its stuff, the other tests should still run. In addition, if the tests are run in parallel, they still should not interfere.
- Use a new schema for each test (obviously expensive, but it can still be a viable option for some projects).
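A minimal sketch of the first approach (UserService, its method, and the config file are hypothetical):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml") // hypothetical config file
@Transactional // Spring rolls the transaction back after each test method
public class UserServiceIT {

    @Autowired
    private UserService userService; // hypothetical service under test

    @Test
    public void createdUserIsVisibleInsideTheTest() {
        userService.createUser("alice"); // hypothetical method
        // assertions see the uncommitted data; nothing survives the rollback
    }
}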
Now, the real question is what you test.
If you test Java code, for example that your SQL statements are created correctly, then the first approach is probably the way to go.
Of course, it also depends on which commands are executed during the tests; not all databases allow every command inside a transaction (for example, in Postgres you can use DDL inside a transaction, in Oracle you can't, and so forth).
Another concern to think about during continuous testing is the performance of the tests.
Integration tests are slow, and if you have a monolith application that runs hundreds of them, the build will be really slow. Managing a build that runs for hours is a big pain.
I would like to mention 2 ideas that can help here:
- Moving to microservices helps a lot in this case (each microservice runs only its own bunch of tests, so the build of each microservice on its own is much faster by nature).
- Another interesting option to consider is running the tests against a Docker container of the database that starts right in the test case (it can also be cached so that not every test raises a new container). A big benefit of this approach is that everything runs locally (on the build server), so there is no interaction with a remote database (performance), and the clean-up of resources is done automatically even if some tests fail: the Docker container dies and all the data put in by the tests gets cleaned up with it. Take a look at the Testcontainers project; maybe you'll find it helpful (a sketch follows below).
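A sketch of the Testcontainers approach, assuming JUnit 5 and the Postgres module (the class name and image tag are illustrative):

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

@SpringBootTest
@Testcontainers
class OrderServiceIT {

    // One container per test class; it dies (with all its data) when the tests end
    @Container
    static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:15");

    // Point Spring's datasource at the container
    @DynamicPropertySource
    static void datasourceProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", postgres::getJdbcUrl);
        registry.add("spring.datasource.username", postgres::getUsername);
        registry.add("spring.datasource.password", postgres::getPassword);
    }

    @Test
    void placesAnOrder() {
        // runs against the real Postgres inside the Docker container
    }
}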
I created a job which splits my file into small chunks, and all these chunks are read in separate steps. For example, 3 steps finish without any errors and the records are committed to the database, but if the 4th step fails I need to roll back all records from the previous steps. Is it possible to roll them back somehow?
Or perhaps there is a possibility to commit all records only when the last step has finished correctly? (But there is a problem with large files here.)
Don't play with transactions while using Spring Batch; due to its transactional nature, it is a really bad idea to manually manage transactions.
See Transaction Management in Spring batch or Spring batch - One transaction over whole Job for further explanation.
Not just Spring: for any framework, if you need to perform an atomic operation across multiple reads/writes to a data source, generally all those calls need to be wrapped in a transaction and then either committed or rolled back at the end. Understanding how JTA works goes a long way toward identifying how to use frameworks that handle transactions; more information on JTA can be found here.
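A minimal sketch of the pattern JTA formalizes, assuming a container-managed environment that exposes a UserTransaction via JNDI:

import javax.naming.InitialContext;
import javax.transaction.UserTransaction;

public class AtomicMultiWrite {

    // The manual begin/commit/rollback cycle that frameworks like Spring
    // and Spring Batch automate for you.
    public void execute() throws Exception {
        UserTransaction tx = (UserTransaction) new InitialContext()
                .lookup("java:comp/UserTransaction");
        tx.begin();
        try {
            // multiple reads/writes against one or more data sources go here
            tx.commit();   // all writes become visible atomically
        } catch (Exception e) {
            tx.rollback(); // none of the writes survive
            throw e;
        }
    }
}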
Maybe this has been answered already, but I did not find any suggestions out there...
My project is a Spring Utility Project, the database I use is MySQL, and for persistence I'm using Hibernate in combination with c3p0 connection pooling. I'm on Spring 3.2 and Hibernate 3.5.
So here is what I want to do:
I want to debug a JUnit test, step over some persistence functions (save, update, etc.) and then check the entries manually in the database via SQL. Because the JUnit tests always run in a transaction, I cannot check the entries in the database: a rollback happens every time a test finishes, so a commit never occurs.
Is there a way to fake transaction existence, or to bypass transactions during JUnit tests?
Perhaps you can flush the session in Hibernate during your debugging session and force Spring/Hibernate to write to the database.
Or you can turn off transactions for your debugging session.
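For the second option, Spring's test framework lets you keep the transaction but commit it instead of rolling it back (a sketch; the context configuration is hypothetical):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:app-context.xml") // hypothetical config file
@Transactional
public class PersistenceDebugTest {

    @Test
    @Rollback(false) // commit at the end of the test instead of rolling back
    public void saveEntitiesForManualInspection() {
        // call save/update here, let the test finish, then inspect via SQL
    }
}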
Rather than faking transaction existence, the best approach to looking at the database while the transaction is in progress is to query with an isolation level that allows dirty reads. The mechanism for doing this varies from database to database; in MySQL you can use
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
prior to querying.
Clearly you will also need to force Hibernate to flush writes to the database during your test, and set your breakpoint after the flush.
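For example, inside the test being debugged (a sketch; sessionFactory is the injected Hibernate SessionFactory, and the entity is whatever you are saving):

import org.hibernate.Session;
import org.hibernate.SessionFactory;

public class DirtyReadDebugging {

    private SessionFactory sessionFactory; // injected by Spring in the real test

    public void saveAndPause(Object entity) {
        Session session = sessionFactory.getCurrentSession();
        session.save(entity);
        // Push the pending INSERT to MySQL without committing the transaction.
        session.flush();
        // Set the breakpoint on the next line, then run the READ UNCOMMITTED
        // query above from a separate SQL client to see the uncommitted row.
    }
}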
I've been assigned the task of investigating how we should perform our testing in an upcoming project.
I'm currently trying to decide whether or not we should have structured tests for our SQL statements. I've come to the conclusion that it might be a good idea to have test suites for the "get" statements to make sure that they return the correct data, but not to test insert, delete or update, since these can easily be verified with a select in the DB and foreign-key exceptions will be thrown if some dependency is missing.
Now, I've been checking out DbUnit to perform these tests, but I have a couple of concerns:
1. Is it feasible to perform tests as described above? Or is the time put into creating these tests and inserting test data not worth the effort? Perhaps it's enough to let the developers test this ad hoc?
2. It seems it might be time-consuming to decide on proper test data for each test. Test data has to be inserted manually into the flat XML file that DbUnit requires (if you let the expected data be generated by DbUnit, you become dependent on the SQL that fetches that data). Is this the case?
3. Is there a better, simpler way to perform database tests to verify SQL statements?
The project will be using Hibernate, Java and MS SQL Server.
You can use DbUnit, or you can build the DB content programmatically (in each test). The important part is to let the testing framework (e.g. Spring Test) do the rollback after each test. Once you set up the environment, it's easy to test the DML done in a single transaction (get, insert, delete).
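A sketch of the programmatic variant with Spring Test's automatic rollback (the config file, table and column names are made up):

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml") // hypothetical config file
@Transactional // Spring Test rolls the whole test back afterwards
public class UserQueryTest {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Test
    public void getReturnsActiveUsers() {
        // build the DB content programmatically instead of a DbUnit XML file
        jdbcTemplate.update(
                "INSERT INTO users (id, name, active) VALUES (?, ?, ?)", 1, "alice", true);

        int count = jdbcTemplate.queryForObject(
                "SELECT COUNT(*) FROM users WHERE active = ?", Integer.class, true);

        // verify the "get" statement returns the inserted row
        assertEquals(1, count);
    }
}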
If you want to test DDL (some databases execute that outside of a transaction), then you have to do a manual rollback or create the database from scratch before each test (e.g. an in-memory DB). DDL testing is usually not needed, as Hibernate does the validation part.
And yes: you should test your queries - it's worth spending some time on preparing the environment.