We have an application built using Spring/Hibernate/MySQL, and now we want to test the DAO layer, but here are a few shortcomings we face.
Consider the use case of multiple objects connected to one another, e.g. a Book has Pages.
The Page object cannot exist without the Book, as book_id is a mandatory FK in Page.
So for testing a Page I have to create a Book.
This simple use case is easy to manage, but once you start building a Library, you cannot test a Page until you have created the whole universe surrounding the Book and Page!
So to test Page:
Create Library
Create Section
Create Genre
Create Author
Create Book
Create Page
Now test Page.
Is there an easy way to bypass this "universe creation" and just test the Page object in isolation? I also want to be able to test HQL related to Page, e.g.:
SELECT new com.test.BookPage(book.id, page.name) FROM Book book, Page page
JUnit is supposed to run in isolation, so I have to write the code to build all the supporting objects in the test case just to create the Page. Any tips on how to accelerate the process?
Edit: Spring follows the philosophy of rolling back the transaction after the tests have run, thereby reverting all changes. Schema changes are expected as we develop further; I want to be able to test against (a backup of) the production DB on a regular basis.
I just finished a project with this exact configuration. We had great success using a stand-in HSQLDB database for the unit tests and then turning off referential integrity on the schema for those tests.
Since you're using Spring, these are the steps:
Create a new context configuration file for testing. Set up Hibernate to do create-drop for the schema in this configuration.
Create your JUnit test. Inherit from AbstractTransactionalJUnit4SpringContextTests, the greatest abstract class in the history of the universe, and annotate the class with your new @ContextConfiguration. Also use the @TransactionConfiguration annotation to run each test in a transaction with an automatic rollback.
Run the command "SET REFERENTIAL_INTEGRITY FALSE;" through the inherited simpleJdbcTemplate property in your @Before method.
Dedicate the rest of the @Before to simpleJdbcTemplate calls that set up the database. Note that you no longer need to specify every referenced column, just what you're testing!
Finally, write your unit tests against your DAOs (a sketch follows these steps).
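Here is a rough sketch of steps 2-5, using the Spring 2.5/3.x-era API the answer refers to; the DAO, the page table, and the test-context.xml file name are invented for illustration:

import static org.junit.Assert.assertNotNull;

import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.AbstractTransactionalJUnit4SpringContextTests;
import org.springframework.test.context.transaction.TransactionConfiguration;

@ContextConfiguration("classpath:test-context.xml")   // the testing context from step 1
@TransactionConfiguration(defaultRollback = true)      // each test rolls back automatically
public class PageDaoTest extends AbstractTransactionalJUnit4SpringContextTests {

    @Autowired
    private PageDao pageDao;                            // hypothetical DAO under test

    @Before
    public void setUpData() {
        // HSQLDB-specific: skip FK checks so only the rows under test are needed
        simpleJdbcTemplate.getJdbcOperations().execute("SET REFERENTIAL_INTEGRITY FALSE");
        simpleJdbcTemplate.update(
                "INSERT INTO page (id, book_id, name) VALUES (?, ?, ?)", 1L, 42L, "Intro");
    }

    @Test
    public void findsPageByName() {
        assertNotNull(pageDao.findByName("Intro"));
    }
}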
Here are a few references that will get you moving in this direction:
http://static.springsource.org/spring/docs/2.5.x/reference/testing.html
http://www.theserverside.com/news/1365222/Unit-Testing-Hibernate-With-HSQLDB
As usual with this stuff, getting the config just right is the hard part. But once it's all working, you'll be a styling unit tester!
The Unitils extensions to JUnit or TestNG have very nice support for this. They allow you to define datasets tuned for the class under test, so it only needs the part of the universe your class actually sees, and then it initializes the database before the tests start.
Check out Unitils.
We are using it and it just works fine. A lot better than the "MockRepositories" we used before, which do not test the HQL and, also important, the Hibernate transaction behavior.
In general, I write integration tests from my service/remoting layer down to the database so that I can check that the server-side layers are integrated and tested. I would like to keep rollback set to false; otherwise we miss out on database constraint-level validation. It is a personal preference.
We can follow different approaches
- Create data for each test case and delete it once executed
- Run with a certain amount of existing common data, such as a User
Some entities depend on several other entities, and to be able to test such flows it takes a lot of effort to create every entity for each test case or class. Alternatively, for a business flow we might decide to create a certain amount of data, execute the flow with a certain number of tests, and then clear the data. Either way, running such test cases can consume a lot of time.
Is there an effective approach or best practice followed in the industry for writing integration tests in continuous-integration environments? I normally use TestNG as it provides Spring support. Are there any Java-based frameworks?
I think it really depends on a project and there is no silver bullet solution here.
There are indeed many approaches as you state, I'll mention a few:
Take advantage of Spring's @Transactional annotation on the test (see the sketch after this list). In this case, Spring will execute a rollback after each test, so the data changed by the test won't actually be saved in the database even if the test passes.
Do not use @Transactional, but organize tests so that they won't interfere (each test uses its own set of data that can co-exist with the other tests' data). If a test fails and doesn't clean up its stuff, the other tests should still run. In addition, if the tests are run in parallel, they still should not interfere.
Use a new schema for each test (obviously expensive, but it can still be a viable option for some projects).
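As an illustration of the first approach, a minimal sketch with JUnit 4; TestConfig, BookDao and Book are hypothetical placeholders for your own test configuration, DAO and entity:

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = TestConfig.class)  // hypothetical test configuration
@Transactional                                     // each test runs in a transaction that is rolled back
public class BookDaoIntegrationTest {

    @Autowired
    private BookDao bookDao;                       // hypothetical DAO under test

    @Test
    public void savedBookIsVisibleWithinTheTestTransaction() {
        bookDao.save(new Book("Dune"));
        assertEquals(1, bookDao.findByTitle("Dune").size());
        // no cleanup needed: Spring rolls the transaction back after the test
    }
}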
Now, the real question is what you are testing.
If you are testing Java code, e.g. that your SQL is generated correctly, then the first approach is probably the way to go.
Of course, it also depends on what commands are executed during the tests; not all commands can run inside a transaction in every database (for example, in Postgres you can use DDL inside a transaction, in Oracle you can't, and so forth).
Another concern to think about during the continuous testing is the performance of tests.
Integration tests are slow, and if you have a monolithic application that runs hundreds of them, then the build will be really slow. Managing a build that runs for hours is a big pain.
I would like to mention two ideas that can help here:
Moving to microservices helps a lot in this case (each microservice runs only its own bunch of tests, so the build of each microservice on its own is much faster by nature).
Another interesting option to consider is running the tests against a Docker container of the database that is started right in the test case (it can also be cached so that not every test starts a new container). A big benefit of this approach is that everything runs locally (on the build server), so there is no interaction with a remote database (performance), and the clean-up of resources happens automatically even if some tests fail: the Docker container dies and all the data put in by the tests disappears with it. Take a look at the Testcontainers project; maybe you'll find it helpful.
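A rough sketch of that idea with JUnit 4 and the Testcontainers PostgreSQL module; the image tag, class name and test body are invented for illustration, and the container is exposed as a JUnit class rule:

import static org.junit.Assert.assertTrue;

import java.sql.Connection;
import java.sql.DriverManager;

import org.junit.ClassRule;
import org.junit.Test;
import org.testcontainers.containers.PostgreSQLContainer;

public class ThrowawayDatabaseTest {

    // One container for the whole test class; it is stopped and discarded afterwards,
    // so all data written by the tests disappears with it.
    @ClassRule
    public static final PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:15");

    @Test
    public void canTalkToTheContainerizedDatabase() throws Exception {
        try (Connection conn = DriverManager.getConnection(
                postgres.getJdbcUrl(), postgres.getUsername(), postgres.getPassword())) {
            assertTrue(conn.isValid(1));
        }
    }
}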
When we develop a Rails application, we use a local database in our development environment and make sure that our specs pass as part of TDD.
Is it the norm not to use a local database (similar to SQLite) while doing TDD in Java? I have been told an in-memory database (HSQL) is all that is needed for running unit and integration tests. Is this a standard practice?
We use SQLite in our Rails application for local development and for running our RSpec specs. But my question is about Java development. We are working on rewriting a part of our application in Java. I have been told that you do not need any database for development if you write integration tests covering all functionality, and that HSQL is sufficient for that. As I am used to having a database for local development in Rails, I am wondering how you debug any issues later on? It is quite helpful for analyzing issues if we can replicate the data and scenario in the local environment. How do you do the same in Java/Spring if you do not use any database for the development environment and rely completely on HSQL for testing?
For me, I never use any database, including HSQLDB, to write a unit test.
I prefer to create interfaces such as *Repository and let the SUT communicate with them. Then I write implementation classes that implement the interface I have created. The class hierarchy looks like this:
          <<uses>>
SUT -----------------------> Repository
                                 ^
                                 | <<implements>>
                                 |
             +----------+--------+--------+
             |          |        |        |
            JPA     Hibernate   JDBC     etc.
This approach is known as Separation of Concerns: the application domain is one concern, data access is another. Following it results in many plug-compatible components and independent modules, such as domain, jpa, jdbc, etc., but the important thing is that it makes your code more testable.
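As a rough sketch of this layering (the Page entity and the method names are just placeholders):

import javax.persistence.EntityManager;

// The domain only sees this interface; the persistence technology is a detail.
public interface PageRepository {
    Page findById(long id);
    void save(Page page);
}

// One implementation per technology, e.g. a JPA-backed one:
public class JpaPageRepository implements PageRepository {

    private final EntityManager em;

    public JpaPageRepository(EntityManager em) {
        this.em = em;
    }

    @Override
    public Page findById(long id) {
        return em.find(Page.class, id);
    }

    @Override
    public void save(Page page) {
        em.persist(page);
    }
}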
Then I use Test Doubles to mock/stub out the SUT's collaborators in the unit test and verify that they work together as expected. The pseudo-code looks like this:
Repository repo = mock(Repository.class);
when(repo.find(id)).thenReturn(entity);
SUT it = new SUT(repo);
assert it.exercise() == expectedResult;
assert it.currentState == expectedState;
But you must write some integration tests, using a database, to test each Repository implementation that operates on the third-party API. This is what Martin Fowler calls Test Isolation.
The answer to your question: it is very common to have your test-environment database as close to the development environment as possible.
I suppose that you are preoccupied with performance; there are more crucial things you could improve before considering an in-memory database.
Usually while TDD-ing you would only run the tests involved and later run your whole suite to check that you didn't break anything. If you are using RSpec you could use tags.
Another important thing is to clean the database at the beginning of every test, since tests should be isolated and never depend on the results of previous tests. This also matters for the complex search queries you may have in your system; there is a gem that can help you here.
Finally, if you are using some sort of continuous integration tool, remember to set it up using rake db:schema:load instead of rake db:migrate. This runs your schema file as a single step instead of running every single migration each time you commit. (Remember to keep this file version-controlled and always up to date.)
You are getting the terminology wrong. TDD is about writing test cases in general, but most of the time, and also in your question, one thinks about using TDD for unit testing.
And unfortunately, the terms are not very clear. When you turn to Wikipedia, you find there (my words) that "anything you do to test a piece of software" can be called a unit test.
But that isn't helpful. You should rather look for definitions such as here. And the main aspect there: unit tests work in isolation. Quoting from that link:
Runs in memory (no DB or File access, for example)
Thus:
when doing unit testing, you should not use any database
when you run integration tests, you want to ensure that your solution works "end to end". In that sense you might be using a special instance of your database, but not a different kind of database.
I'm trying to devise an optimal strategy to unit-test the DAO layer of my Spring app.
Many existing approaches, like in-memory DB usage, etc. (posts: 12289800, 12390813, 9940010, 12801926), do not appeal to me.
So, here is a straightforward way that occurs to me:
Create Spring test-context.xml and put there all the data needed for testing all the DAO classes;
For each test class create a template method to test CRUD operations and all 'select' operations;
Before testing, insert all needed data from test-context.xml into your real DB. We may also need some dependencies (references), so insert them as well, let's say in an @Before method.
After all CRUD operations, delete all dependencies (references) from the DB, let's say in an @After method.
If we have a lot of dependencies, this may become a terribly expensive and laborious approach. Also we have only one @Test method (a template method, to ensure the order of operations: create, read, etc.), so one test per test class.
So, I need advice: is this strategy viable? What did you do to test your DAOs?
In the end, I settled on this strategy for testing classes responsible for interacting with the database in a Spring-based app. Key thoughts:
Use an in-memory DB (H2 is OK) and a separate Spring profile with the test data source and settings.
The database is set up at the beginning of the entire test run from the schema.sql scripts, so we need the SQL sources to rebuild the test database. They may come from a DBA, or from yourself if you are designing the database on your own. Tools like Liquibase or Flyway help if you work on the database in a large team, where everybody needs the current state of the database by applying incremental scripts; in that case the resulting setup script is managed by the tool.
Obviously each test case will require its own set of data to be initialized before executing the test. We do this by writing sample.sql/clear_sample.sql scripts (a pair for each test case) to insert and delete data before and after each case. For this we can use either Spring's @Sql annotation or ScriptUtils (a sketch follows at the end of this answer).
To help with designing tests we can inject an EntityManager, for example to retrieve the data inserted by the SQL scripts.
Basic JUnit asserts are used to compare.
Thus, we have no additional software layer like DbUnit, and we write isolated and maintainable unit tests.
The unavoidable downside is that when a more or less significant change comes to the DB we need to rewrite the whole test, or even several.
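For illustration, a rough sketch of what one such test case can look like with @Sql; the configuration class, DAO and script paths are made up, and Spring 4.1+ is assumed for the repeatable @Sql annotation:

import static org.junit.Assert.assertEquals;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.jdbc.Sql;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = TestJpaConfig.class)   // hypothetical test configuration
@ActiveProfiles("test")                                // the profile with the in-memory data source
public class CustomerDaoTest {

    @Autowired
    private CustomerDao customerDao;                   // hypothetical class under test

    @PersistenceContext
    private EntityManager entityManager;               // handy for verifying what the scripts inserted

    @Test
    @Sql("/sql/sample.sql")                            // data for this case
    @Sql(scripts = "/sql/clear_sample.sql",
         executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)  // and its cleanup
    public void findsCustomerInsertedByScript() {
        assertEquals("Alice", customerDao.findById(1L).getName());
    }
}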
This is a bit of an odd question, but it has been bothering me for a few months now. I have built a JPA-based web application using Wicket + Hibernate (built with Maven), and want to test the DAO layer directly. I created a specific src/test/resources/META-INF/persistence.xml file that I used for testing, but have been running into conflicts with WTP and the like. To get around these issues, I created a separate test project where the unit tests live. Is there a better way to manage unit tests for a JPA project without having duels between persistence files?
Addendum: Would other test frameworks (TestNG, for example) make this any easier?
You may want to try Mockito. The test works like this:
You use Mockito to "implement" EntityManager. Instead of the real code, you use Mockito's methods to say "if the application calls getReference(), then return this object". In the background, Mockito creates a proxy instance which intercepts the Java method calls and returns the values you specify. Calls to other methods return null.
Mocking things like createQuery() works the same way, but you first need to create a mock of Query and then use the same approach as with getReference() (return the Query mock).
Since you don't use a real EM, you don't need a real persistence.xml.
A much simpler solution would be if you could set some property to change the name of the persistence.xml file, but I don't think that is possible.
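For illustration, a rough sketch of that Mockito approach; the Book entity, the BookDao and its methods are hypothetical stand-ins for your own classes:

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Collections;

import javax.persistence.EntityManager;
import javax.persistence.Query;

import org.junit.Test;

public class BookDaoMockTest {

    @Test
    public void findTitleUsesGetReference() {
        EntityManager em = mock(EntityManager.class);

        // "if the application calls getReference(), then return this object"
        Book book = new Book(42L, "Dune");                        // hypothetical entity
        when(em.getReference(Book.class, 42L)).thenReturn(book);

        BookDao dao = new BookDao(em);                            // hypothetical DAO taking an EM
        assertEquals("Dune", dao.findTitle(42L));
    }

    @Test
    public void findAllUsesCreateQuery() {
        EntityManager em = mock(EntityManager.class);
        Query query = mock(Query.class);

        // createQuery() is mocked the same way: return a Query mock, then stub it
        when(em.createQuery("SELECT b FROM Book b")).thenReturn(query);
        when(query.getResultList()).thenReturn(Collections.singletonList(new Book(1L, "Dune")));

        assertEquals(1, new BookDao(em).findAll().size());
    }
}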
Some other links that may help:
How to configure JPA for testing in Maven
Suggest a JPA Unit test framework
We use dual persistence.xml files for production and test runtimes but it is a classpath related issue only (we use Eclipse but do not rely on WTP plugins heavily). The only difference between the two is that the production version doesn't contain entity definitions.
We don't use a mocking framework to test JPA, as it wouldn't add any value to our tests. The tests run real data access with JPA talking to a PostgreSQL database.
Our approach to tests is based on the Spring test framework for the persistence layer: in-transaction testing. Our application is Spring-based, but this approach is equally usable for arbitrary applications that want to take advantage of the Spring test classes. The essence is that each test runs within a single transaction that never commits and at the end (in tearDown) is automatically rolled back. This solves the problem of data pollution and test dependency in a very nice, unobtrusive and transparent way.
The Spring test framework is flexible to allow multi-transaction testing but these are special cases that constitute not more than 10% of tests.
We still use legacy support for JUnit 3.8 but new Spring TestContext Framework for JUnit 4 looks very attractive.
For setting up in-transaction test data we use an in-house utility class that constructs business entities. Since it's shared between all tests, the overhead of maintaining and supporting it is greatly outweighed by the benefits of having a standard and reliable way to set up test data.
Spring DI helps to make tests concise and self-descriptive but it's not a critical feature.
Using Spring and Spring's unit testing support is the best way to go. With Spring you don't require two persistence.xml files, as your persistence.xml has almost nothing in it; everything is specified by Spring (all we specify in our persistence.xml is the persistence-unit name), and thus you can change the database configuration etc. with Spring.
And as topchef pointed out, Spring's transaction-based unit testing is great.
As mentioned here : http://www.devx.com/java/Article/36785/1954,
you can remove the following lines from your project's .settings/org.eclipse.wst.common.component to avoid deploying test resources with the web app.
<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/test/java"/>
<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/test/resources"/>
You can:
Have several persistence units
Have several persistence.xml and copy them on test, and restore them later
Set up your own properties for testing, and use Mockito to return your custom entity manager factory
Use spring: https://www.baeldung.com/spring-testing-separate-data-source
The first two options are the most discussed in all suggested questions, and are by far the ones I like the least.
Solution 3 would look like this:
private EntityManager entityManager;
private static EntityManagerFactory entityManagerFactory;

@BeforeClass
public static void mainTestInitClass() {
    Properties props = new Properties();
    // Override production properties with an in-memory H2 setup
    props.setProperty("hibernate.dialect", "org.hibernate.dialect.H2Dialect");
    props.setProperty("hibernate.connection.driver_class", "org.h2.Driver");
    props.setProperty("hibernate.connection.username", "sa");
    props.setProperty("hibernate.connection.url", "jdbc:h2:mem:some_test_db;DB_CLOSE_DELAY=-1;MVCC=TRUE;DATABASE_TO_UPPER=false");
    props.setProperty("hibernate.hbm2ddl.auto", "create");
    entityManagerFactory = Persistence.createEntityManagerFactory("your_unit", props);
}

@Before
public void mainTestORMSetUp() throws Exception {
    this.entityManager = entityManagerFactory.createEntityManager();
}
Now you have an entity manager available for every test. Use Mockito to inject it where needed.
Solution 4: Use Spring Data + Spring Boot to set up JPA, so you don't need the entity manager factory anymore; you simply use two different application.properties files (one for main and one for test) and then use your Spring Data repository. Alternatively you can use different Spring profiles (one for tests, another for production), which ends up allowing you to do the same. This solution is the one I use. Check the URL above for more details.
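For what it's worth, a hedged sketch of such a Spring Boot test; BookRepository and Book are hypothetical, and @DataJpaTest swaps in an embedded database and rolls each test back automatically:

import static org.junit.Assert.assertTrue;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@DataJpaTest
public class BookRepositoryTest {

    @Autowired
    private BookRepository bookRepository;   // hypothetical Spring Data repository

    @Test
    public void savesAndReloadsBook() {
        Book saved = bookRepository.save(new Book("Dune"));
        assertTrue(bookRepository.findById(saved.getId()).isPresent());
    }
}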
This may be a naive question, but I am new to both the JUnit and Hibernate frameworks and I was wondering what the best way is to go about unit testing an application that is largely calls to Hibernate, or if it is even necessary to do so?
What is the best practice here?
EDIT:
Spring seems to be the big suggestion here. Unfortunately this may be a little too much to bite off for one project. JUnit, Hibernate and Spring are all new to me, and while they are all technologies I want to get under my belt, I think trying to incorporate them all into one project may be too overwhelming for me.
Links to tutorials and/or book suggestions are welcome.
Keep in mind the difference between unit testing and integration testing.
Unit tests should be testing code without any outside dependencies. These dependencies are mocked using a framework like, for example, JMock.
Integration tests are important too but the major drawback of them is that they take a long time to run. You can run thousands of true unit tests in a couple of seconds, but it's not the same with integration tests.
Depending on the size of your project/development team you might want to prioritize true unit tests over integration tests. Both styles of test are important, but if you are pressed for resources, just going with unit testing may be a better idea.
I wrote an application by myself that unit tested the Web (with Spring MVC this is easy) and Service layers, as well as domain objects. But I left the DAO alone because I didn't want to write a bunch of slow integration tests. If I had more people on staff I would have gone with integration tests as well, but in this case I didn't feel the time spent would be worth it.
As for best practices:
Use an embedded database for running your tests if possible, so that you don't need a fully deployed relational database just to run them (locally, or on your continuous build server if you have one). That way you also don't (necessarily) need to worry about rolling back etc.; you can just recreate the database when you need to. Testing with an embedded database doesn't exercise peculiarities of your specific production database, but it does test your code, which should suffice.
You can also use DbUnit, an extension to JUnit, to easily fill the database with expected rows and put it in a known state, before you run your Hibernate tests.
Best practice? I use Spring and make all my tests transactional. I perform the test and rollback all the changes so I don't change the state of the database.
I like to use an in-memory HSQLDB for testing. The process for each Hibernate POJO is (sketched after the list):
Create the object
Persist it to the DB
Clear the session
Get it from the DB
Assert the objects are equal.
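Roughly, that cycle looks like this, assuming a SessionFactory built from a hibernate.cfg.xml pointing at the in-memory HSQLDB and a hypothetical Page entity with a value-based equals():

import static org.junit.Assert.assertEquals;

import java.io.Serializable;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.cfg.Configuration;
import org.junit.Test;

public class PagePersistenceTest {

    // assumption: hibernate.cfg.xml on the test classpath points at in-memory HSQLDB
    private static final SessionFactory sessionFactory =
            new Configuration().configure().buildSessionFactory();

    @Test
    public void pageSurvivesRoundTrip() {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();

        Page original = new Page("Intro");            // 1. create the object
        Serializable id = session.save(original);     // 2. persist it to the DB
        session.flush();
        session.clear();                              // 3. clear the session cache

        Page reloaded = (Page) session.get(Page.class, id);  // 4. get it from the DB
        assertEquals(original, reloaded);             // 5. assert the objects are equal

        tx.rollback();
        session.close();
    }
}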
For DAOs, I create and persist enough objects to accurately test the methods, then run the tests and delete the objects as necessary so they don't interfere with other tests.
Hibernate source includes a lot of unit tests, I would recommend going through those and adapting a similar approach.
You can also look at CaveatEmptor, the sample application developed for the book "Java Persistence with Hibernate".
If you're using Hibernate for domain-rich models, unit testing the domain logic is as simple as testing a POJO, and Hibernate doesn't get in your way. The only caveat: for bidirectional mappings, you might have to set the object on both sides in unit tests.
Integration testing with a database is generally not done for simple mappings. However, it is suggested for more involved mappings like single-table inheritance etc. The only thing to remember here is that you may sometimes have to explicitly flush to the database.
Sure, you'd unit test your persistence tier if it wasn't written in Hibernate, wouldn't you?
Create a given persistence interface that's implemented using Hibernate, instantiate some sample objects, perform CRUD operations, and ask JUnit to assert that the operations were successful. Same as any other class.
You could use Spring to help here.
It has a great unit test framework, you can use it to test CRUD ops and then rollback changes - great if you don't have the capability to reload a database every time.
Write a simple layer that passes requests to Hibernate. Then use a mocking library like EasyMock or JMock to assert that your Hibernate-veneer layer is correctly called by your application classes. This is nicely described in the partially-complete JMock book (scroll down to the test smell "everything is mocked").
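For instance, a rough EasyMock sketch of that idea, where BookStore is the application class under test and BookGateway is the thin layer in front of Hibernate; both names, and the Book entity, are invented:

import static org.easymock.EasyMock.createMock;
import static org.easymock.EasyMock.replay;
import static org.easymock.EasyMock.verify;

import org.junit.Test;

public class BookStoreTest {

    @Test
    public void addingABookSavesItThroughTheGateway() {
        Book book = new Book("Dune");                          // hypothetical entity
        BookGateway gateway = createMock(BookGateway.class);   // hypothetical Hibernate veneer

        gateway.save(book);                                    // record the expected call
        replay(gateway);

        new BookStore(gateway).addBook(book);                  // exercise the class under test

        verify(gateway);                                       // fails if save() was never called
    }
}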
Two cases are easy to test:
When practical, perform your various calculations and transformations in functions that don't know about saving or loading entities. If you can make these pure functions, so much better.
For functions that only save to the database without reading from it, you can choose not to save when testing.
The simplest (crudest) way to do #2 is by adding a reallyUpdate parameter to the function, then surrounding each "save" call with:
if (reallyUpdate) {
    HibernateUtil.saveOrUpdate(theThing);
}
For me these were the lowest hanging fruit.