TDD without local database? - java

When we develop a Rails application, we use a local database in our development environment and make sure that our specs pass as part of TDD.
Is it the norm in Java not to use a local database similar to SQLite while doing TDD? I have been told that an in-memory database (HSQLDB) is all that is needed for running unit and integration tests. Is this a standard practice?
We use SQLite in our Rails application for local development and for running our RSpec specs, but my question is about Java development. We are rewriting a part of our application in Java, and I have been told that you do not need any database for development if you write integration tests covering all the functionality, and that HSQLDB is sufficient for that. As I am used to having a database for local development in Rails, I am wondering how you debug issues later on. It is quite helpful when analyzing an issue to be able to replicate the data and the scenario locally. How do you do the same in Java/Spring if you do not use any database for development and rely completely on HSQLDB for testing?

Personally, I never use any database, including HSQLDB, to write a unit test.
I prefer to introduce an interface such as a *Repository and let the SUT communicate with it, and then write implementation classes that implement that interface. The class hierarchy looks like this:
          <<uses>>
SUT ----------------------> Repository
                                ^
                                | <<implements>>
                                |
      +------------+------------+-----------+
      |            |            |           |
     JPA       Hibernate      JDBC        etc.
This approach is known as Separation of Concerns: the application domain is one concern, data access is another. Following it results in plug-compatible components and independent modules such as domain, jpa, and jdbc, and, most importantly, it makes your code far easier to test.
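A minimal sketch of that separation, assuming JPA; OrderRepository, JpaOrderRepository, and Order are illustrative names (not anything from this answer), and the two types would live in separate files:

// The domain-facing interface: the SUT depends only on this abstraction.
public interface OrderRepository {
    Order find(long id);
    void save(Order order);
}

// One plug-compatible implementation; a plain-JDBC or Hibernate version
// could implement the same interface without touching the domain code.
import javax.persistence.EntityManager;

public class JpaOrderRepository implements OrderRepository {

    private final EntityManager entityManager;

    public JpaOrderRepository(EntityManager entityManager) {
        this.entityManager = entityManager;
    }

    @Override
    public Order find(long id) {
        return entityManager.find(Order.class, id);
    }

    @Override
    public void save(Order order) {
        entityManager.persist(order);
    }
}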
Then I use Test Doubles to mock/stub out the SUT's collaborators in the unit test and verify that they work together as expected. The pseudo-code looks like this:
Repository repo = mock(Repository.class);
when(repo.find(id)).thenReturn(entity);

SUT it = new SUT(repo);

assertEquals(expectedResult, it.exercise());
assertEquals(expectedState, it.currentState());
But you must still write integration tests against a database to test each Repository implementation that operates on the third-party API; this is what Martin Fowler calls Test Isolation.
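A rough sketch of such an integration test for the JPA implementation above, assuming JUnit 4 and a persistence unit named "test-pu" that is configured against an embedded database (all names are illustrative):

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class JpaOrderRepositoryIT {

    private EntityManagerFactory entityManagerFactory;
    private EntityManager entityManager;

    @Before
    public void setUp() {
        // "test-pu" is assumed to point at an in-memory HSQLDB/H2 instance.
        entityManagerFactory = Persistence.createEntityManagerFactory("test-pu");
        entityManager = entityManagerFactory.createEntityManager();
    }

    @After
    public void tearDown() {
        entityManager.close();
        entityManagerFactory.close();
    }

    @Test
    public void savesAndReloadsAnOrder() {
        JpaOrderRepository repository = new JpaOrderRepository(entityManager);
        Order order = new Order("forty-two");

        entityManager.getTransaction().begin();
        repository.save(order);
        entityManager.getTransaction().commit();
        entityManager.clear();                      // force a real reload from the database

        assertEquals("forty-two", repository.find(order.getId()).getDescription());
    }
}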

The answer to your question: it is very common to keep your test-environment database as close to the development environment as possible.
I suppose you are concerned about performance; there are more impactful things you could improve before considering an in-memory database.
Usually while TDD-ing you would only run the tests involved and later run your whole suite to check that you didn't break anything. If you are using RSpec you can use tags for this.
Another important thing is to clean the database at the beginning of every test, since tests should be isolated and never depend on the result of previous tests. This also matters for any complex search queries in your system, whose results depend on the database contents. There is a gem that can help you here.
Finally, if you are using some sort of continuous integration tool, remember to set it up using rake db:schema:load instead of rake db:migrate. This loads your schema file in a single step instead of running every single migration each time you commit. (Remember to keep the schema file version-controlled and always up to date.)

You are getting the terminology wrong. TDD is about writing test cases in general, but most of the time, and also in your question, one thinks about using TDD for unit testing.
And unfortunately, the terms are not very clear. When you turn to Wikipedia, you find there (my words): "anything you do to test a piece of software" can be called a unit test.
But that isn't helpful. You should rather look for definitions such as here. And the main aspect there: unit tests work in isolation. Quoting from that link:
Runs in memory (no DB or File access, for example)
Thus:
- when doing unit testing, you should not use any database
- when writing integration tests, you want to ensure that your solution works "end to end". In that sense you might be using a special instance of your database, but not a different kind of database.

Related

Best practices or an effective approach for writing integration tests which run in a continuous integration environment

In general, I write integration tests from my service/remoting layer down to the database so that I can check that the server-side layers are integrated and tested. I like to keep rollback set to false; otherwise we miss out on database-constraint-level validation. It is a personal preference.
We can follow different approaches:
- Create data for each test case and delete it once executed
- Run with a certain amount of existing common data such as (User)
Some entities depend on several other entities, and to test such flows it takes a lot of effort to create every entity for each test case or class. For a business flow we may decide to create a certain amount of data, execute the flow with a number of tests, and then clear the data. All of this can make such test cases very slow to run.
Is there an effective approach or best practice followed in the industry for writing integration tests in continuous integration environments? I normally use TestNG as it provides Spring support. Are there any other Java-based frameworks?
I think it really depends on a project and there is no silver bullet solution here.
There are indeed many approaches, as you state; I'll mention a few:
Take advantage of Spring's @Transactional annotation on the test. In this case Spring will roll back after each test, so the data changed by the test is never really committed to the database, even if the test passes (see the sketch after this list).
Do not use @Transactional but organize tests so that they won't interfere (each test uses its own set of data that can co-exist with the other tests' data). If a test fails and doesn't clean up its stuff, the other tests should still run. In addition, if the tests are run in parallel, they still should not interfere.
Use a new schema for each test (obviously expensive, but still a viable option for some projects).
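A minimal sketch of the first approach, assuming Spring Boot with JUnit 4 and an assumed Spring Data repository; UserRepository and User are illustrative names. With @Transactional on the test class, Spring rolls the transaction back after each test method, so nothing the test inserts survives:

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.transaction.annotation.Transactional;

import static org.junit.Assert.assertTrue;

@RunWith(SpringRunner.class)
@SpringBootTest
@Transactional                      // Spring rolls back after every test method
public class UserRepositoryTransactionalIT {

    @Autowired
    private UserRepository userRepository;

    @Test
    public void savedUserIsVisibleInsideTheTestTransaction() {
        User saved = userRepository.save(new User("alice"));

        assertTrue(userRepository.findById(saved.getId()).isPresent());
        // No explicit clean-up: the surrounding transaction is rolled back.
    }
}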
Now, the real question is what do you test.
If you are testing Java code, for example that your SQL statements are created correctly, then the first approach is probably the way to go.
Of course, it also depends on which commands are executed during the tests; not every command can run inside a transaction in every database (for example, in Postgres you can use DDL inside a transaction, in Oracle you can't, and so forth).
Another concern to think about for continuous testing is the performance of the tests.
Integration tests are slow, and if you have a monolithic application that runs hundreds of them, then the build will be really slow. Managing a build that runs for hours is a big pain.
I would like to mention two ideas that can help here:
Moving to microservices helps a lot in this case (each microservice runs only its own bunch of tests, so the build of each microservice on its own is naturally much faster).
Another interesting option is running the tests against a Docker container of the database that is started right from the test (the container can also be reused so that not every test starts a new one). A big benefit of this approach is that everything runs locally on the build server, so there is no interaction with a remote database (better performance), and clean-up happens automatically even if some tests fail: the Docker container dies and all the data put in by the tests disappears with it. Take a look at the Testcontainers project; maybe you'll find it helpful.
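A minimal sketch of that last idea, assuming the Testcontainers PostgreSQL module, JUnit 5, and the Postgres JDBC driver on the test classpath; the image tag and names are arbitrary:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import static org.junit.jupiter.api.Assertions.assertTrue;

@Testcontainers
class PostgresContainerIT {

    // Started once for the class, stopped (with all its data) afterwards,
    // even if tests fail.
    @Container
    private static final PostgreSQLContainer<?> POSTGRES =
            new PostgreSQLContainer<>("postgres:15-alpine");

    @Test
    void canTalkToTheContainerizedDatabase() throws Exception {
        try (Connection connection = DriverManager.getConnection(
                     POSTGRES.getJdbcUrl(), POSTGRES.getUsername(), POSTGRES.getPassword());
             Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT 1")) {
            assertTrue(resultSet.next());
        }
    }
}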

Is it bad practice to allow my JUnit tests to interact with a real DB?

I'm building a basic HTTP API and some actions like POST /users create a new user record in the database.
I understand that I could mock these calls, but at some level I'm wondering if it's easier to let my JUnit tests run against a real (test) database. Is this a bad practice? Should only integration tests run against a real DB?
I'm using flyway to maintain my test schema and maven for my build, so I can have it recreate the test DB with the proper schema on each build. But I'm also worried that I'd need some additional overhead to maintain/clean the state of the database between each test, and I'm not sure if there's a good way to do that.
Unit tests are used to test a single unit of code. This means that you write a unit test by writing something that tests one method only. If there are external dependencies, you mock them instead of actually calling and using those dependencies.
So, if your code interacts with the real database, then it is not a unit test. Say your call to the DB fails for some reason; then the unit test will also fail. The success or failure of your unit test should not depend on external dependencies like the DB in your case. You have to assume the DB call is successful, hard-code the data using a mocking framework (Mockito), and then test your method using that data.
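A minimal sketch of that, assuming JUnit 4 and Mockito; UserService, UserRepository, and User are illustrative names for your own classes, not a real API:

import org.junit.Test;

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class UserServiceTest {

    @Test
    public void returnsTheNameOfAnExistingUser() {
        // The repository is a mock, so no database is touched and the test
        // only fails if the service logic itself is wrong.
        UserRepository repository = mock(UserRepository.class);
        when(repository.findById(7L)).thenReturn(new User(7L, "alice"));

        UserService service = new UserService(repository);

        assertEquals("alice", service.nameOf(7L));
    }
}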
As often, it depends.
On big projects with lots of JUnit tests, the performance overhead can be an issue. The time needed to set up test data in the database, and the scheme needed to keep your tests from interfering with the test data of other tests during parallel JUnit execution, are also strong arguments for testing against a database only when needed and otherwise mocking it away.
On small projects these problems may be easier to handle, so you could always use a database, but I personally wouldn't do that even on small projects.
As several other answers suggest you should create unit tests for testing small pieces of code with mocking all external dependencies.
However, sometimes (in fact, a lot of the time) it is worth testing whole features, especially when you use some kind of framework like Spring, or you use a lot of annotations. When your classes or methods have annotations on them, the effects of those annotations usually cannot be tested via unit tests; you need the whole framework running during the test to make sure it works as expected.
In our current project we have almost as many integration tests as unit tests. We use the H2 in-memory DB for these tests, which lets us avoid failures caused by connectivity problems, and Spring's test package can collect and run multiple integration tests in the same test context, so it has to build the context only once for multiple tests; this keeps running these tests from being too expensive.
You can also create separate test contexts for different parts of the project (with different settings and DB content), so tests running under different contexts won't interfere with each other.
Do not be afraid of using a lot of integration tests. You need some anyway, and if you already have a test context it's not a big deal to add more tests to the same context.
Also, there are a lot of cases which would take a LOT of effort to cover with unit tests (or cannot be covered fully at all) but can be covered simply by an integration test.
A personal experience:
Our numerous integration tests were extremely useful when we switched from Spring Boot to Spring Boot 2.
Back to the original question:
Unit tests should not connect to a real DB, but feel free to use plenty of integration tests (with an in-memory DB).
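A minimal sketch of such an integration test, assuming Spring Boot's test support, JUnit 5, and H2 on the test classpath; OrderRepository and Order are illustrative names. @DataJpaTest boots only the JPA slice of the context against an embedded database and rolls back after each test.

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;

import static org.junit.jupiter.api.Assertions.assertEquals;

@DataJpaTest
class OrderRepositoryIT {

    @Autowired
    private OrderRepository orderRepository;

    @Test
    void persistsAndReloadsAnOrder() {
        Order saved = orderRepository.save(new Order("forty-two"));

        // Reload through the repository and compare state.
        assertEquals("forty-two",
                orderRepository.findById(saved.getId()).orElseThrow().getDescription());
    }
}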
Modern development practices recommend that every developer runs the full suite of unit tests often, and unit tests should be reliable (they should not fail if the code is OK). Using an external database can interfere with those desiderata.
If the database is shared, simultaneous runs of the testsuite by different developers could interfere with each other.
Setting up and tearing down the database for each test is typically expensive, and thus can make the tests too slow for frequent execution.
However, using a real database for integration tests is OK. If you use an in-memory database instead of a fully real database, even setting up and tearing down the database for each integration test can be acceptably fast.
A popular choice is the use of an in-memory database to run tests. This makes it easy to test, for example, repository methods and business logic involving database calls.
When opting for a "real" database, make sure that every developer has his/her own test database to avoid conflicts. The advantage of using a real database is that this prevents possible issues that could arise because of slight differences in behavior between in-memory and real database. However, test execution performance can be an issue when running a large test suite against a real database.
Some databases can be embedded in a way that the database doesn't even need to be installed for test execution. For example, there is an SO thread about firing up an embedded Postgres in Spring Boot tests.

Regression component tests with Cucumber. Is there any boundary to the layers that should be tested?

I found myself last week having to start thinking about how to refactor an old application that only contains unit tests. My first idea was to add some component test scenarios with Cucumber to get familiarised with the business logic and to ensure I don't break anything with my changes. But at that point I had a conversation with one of the architects in the company I work for that made me wonder whether it was worth it and what was actually the code I had to actually test.
This application has many different types of endpoints: rest endpoints to be called from and to call to, Oracle stored procedures and JMS topics and queues. It's deployed in a war file to a Tomcat server and the connection factory to the broker and the datasource to the database are configured in the server and fetched using JNDI.
My first idea was to load the whole application inside an embedded Jetty, pointing to the real web.xml so everything is loaded as it would be in a production environment, but mocking the connection factory and the datasource. That way, all the connectivity logic to the infrastructure where the application is deployed would be tested. Thinking about hexagonal architecture, this seems like too much effort, bearing in mind that those are only ports whose logic should be limited to transforming received data into application data. Shouldn't this just be unit tested?
My next idea was to just mock the stored procedures and load the Spring XMLs in my test without any web server, which makes it easier to mock classes. For this I would use libraries like Spring MockMvc for the REST endpoints and Mockrunner for JMS. But again, this approach would still test some adapters and complicate the test, as the results of the tests would be XML and JSON payloads. The transformations done in this application are quite heavy, and the same message type can contain different versions of a class (each message can contain many complex objects that implement several interfaces).
So right now I was thinking that maybe the best approach would be to just create my tests from the entry points to the application (the services called from the adapters) and verify that the services responsible for sending messages to the broker or calling other REST endpoints are actually invoked. Then ensure there are proper unit tests for the endpoints, and verify everything works once deployed by providing some smoke tests that are executed in a real environment. This would test the connectivity logic, and the business logic would be tested in isolation, without caring whether a new adapter is added or one is removed.
Is this approach correct? Would I be leaving something without testing this way? Or is it still too much and I should just trust the unit tests?
Thanks.
Your application and environment sound quite complicated. I would definitely want integration tests. I'd test the app outside-in as follows:
Write a smoke-test suite that runs against the application in the actual production environment. Cucumber would be a good tool to use. That suite should only do things that are safe in production, and should be as small as possible while giving you confidence that the application is correctly installed and configured and that its integrations with other systems are working.
Write an acceptance test suite that runs against the entire application in a test environment. Cucumber would be a good choice here too.
I would expect the acceptance-test environment to include a Tomcat server with test versions of all services that exist in your production Tomcat, and a database with a schema, stored procedures, etc. identical to production (but not production data). Handle external dependencies that you don't own by stubbing and mocking, by using a record/replay library such as Betamax, and/or by implementing test versions of them yourself. Acceptance tests should be free to do anything to the data, and they shouldn't have to worry about the availability of services that you don't own.
Write enough acceptance tests to both describe the app's major use cases and to test all of the important interactions between the parts of the application (both subsystems and classes). That is, use your acceptance tests as integration tests. I find that there is very little conflict between the goals of acceptance and integration tests. Don't write any more acceptance tests than you need for specification and integration coverage, however, as they're relatively slow.
Unit-test each class that does anything interesting whatsoever, leaving out only classes that are fully tested by your acceptance tests. Since you're already integration-testing, your unit tests can be true unit tests which stub or mock their dependencies. (Although there's nothing wrong with letting a unit-tested class use real dependencies that are simple enough not to cause issues in the unit tests.)
Measure code coverage to ensure that the combination of acceptance and unit tests tests all your code.
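As a rough illustration of how such a Cucumber acceptance scenario might be wired to Java step definitions (the scenario wording, the TestHttpClient helper, and the environment URL are all hypothetical placeholders, not from the question):

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

import static org.junit.Assert.assertEquals;

// Step definitions backing a Gherkin scenario such as:
//   Given an existing customer "alice"
//   When she requests her account summary
//   Then the response status is 200
public class AccountSummarySteps {

    // Hypothetical thin HTTP client pointed at the acceptance-test environment.
    private final TestHttpClient client = new TestHttpClient("https://acceptance.example.test");

    private String customerId;
    private int lastStatus;

    @Given("an existing customer {string}")
    public void anExistingCustomer(String name) {
        customerId = client.createCustomer(name);
    }

    @When("she requests her account summary")
    public void sheRequestsHerAccountSummary() {
        lastStatus = client.getAccountSummaryStatus(customerId);
    }

    @Then("the response status is {int}")
    public void theResponseStatusIs(int expected) {
        assertEquals(expected, lastStatus);
    }
}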

JUnit good practices

I have code which retrieves some information from the database.
For example, if you pass a person ID, the method will return that person's details, like:
Name: XXX X XXX
Address: XXXXXXXXXXXXX XXXXX
Phone: XXXXXX
In JUnit, what is the good practice for testing this type of code? Is it good practice for JUnit tests to have a DB connection?
Is it good practice for the JUnit test to connect to the DB, retrieve the information for the same person ID, and assert on it?
Thanks.
For testing the code that really needs to work with the database, you should look at dbunit. As little of the code as possible should know about the database though - allowing you to fake out the "fetch or update the data" parts when testing other components.
I'd strongly advise a mixture of DB tests - lots of unit tests which hit an in-memory database (e.g. HSQLDB) and "enough" integration tests which talk to the real kind of database that will be used in production. You may well want to make sure that all your tests can actually run against both environments - typically develop against HSQLDB, but then run against your production-like database (which is typically slower to set up/tear down) before check-in and in your continuous build.
It sounds like you're talking about something like a Data Access Object. I'd say it's essential to test that kind of thing with a real database. Look at H2 for a fast, in-memory database that's excellent for testing. Create your populated object, use your persistence code to save it to the database and then to load it back. Then make sure the object you get back has the same state as what you saved in the first place.
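A minimal sketch of that round trip, assuming JUnit 4 and an in-memory H2 database; PersonDao and Person are hypothetical classes standing in for your own persistence code:

import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class PersonDaoRoundTripTest {

    private PersonDao personDao;

    @Before
    public void setUp() {
        // In-memory H2 URL; the database vanishes when the JVM exits.
        personDao = new PersonDao("jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1");
    }

    @Test
    public void savedPersonComesBackWithTheSameState() {
        Person original = new Person("Ada Lovelace", "10 Example Street", "555-0100");

        long id = personDao.save(original);          // persist through your own code
        Person reloaded = personDao.findById(id);    // load it back

        assertEquals(original.getName(), reloaded.getName());
        assertEquals(original.getAddress(), reloaded.getAddress());
        assertEquals(original.getPhone(), reloaded.getPhone());
    }
}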
Consider using the Spring test framework for help managing transactions in persistence tests and for general test support if you're using Spring elsewhere.

Unit testing a Hibernate driven application? [closed]

This may be a naive question, but I am new to both the JUnit and Hibernate frameworks, and I was wondering what the best way is to go about unit testing an application that largely consists of calls to Hibernate, or whether it is even necessary to do so.
What is the best practice here?
EDIT:
Spring seems to be the big suggestion here. Unfortunately this may be a little too much to bite off for one project. JUnit, Hibernate and Spring are all new to me, and while they are all technologies I want to get under my belt, I think trying to incorporate them all into one project may be too overwhelming for me.
Links to tutorials and/or book suggestions are welcome.
Keep in mind the difference between unit testing and integration testing.
Unit tests should be testing code without any outside dependencies. These dependencies are mocked using a framework like, for example, JMock.
Integration tests are important too but the major drawback of them is that they take a long time to run. You can run thousands of true unit tests in a couple of seconds, but it's not the same with integration tests.
Depending on the size of your project/development team you might want to prioritize true unit tests over integration tests. Both styles of test are important, but if you are pressed for resources, just going with unit testing may be a better idea.
I wrote an application by myself that unit tested the Web (with Spring MVC this is easy) and Service layers, as well as domain objects. But I left the DAO alone because I didn't want to write a bunch of slow integration tests. If I had more people on staff I would have gone with integration tests as well, but in this case I didn't feel the time spent would be worth it.
As for best practices:
use an embedded database for running your tests if possible, so that you don't need a fully deployed relational database just to run your tests (locally, or on your continuous build server if you have one). That way you also don't (necessarily) need to worry about rolling back, etc.; you can just recreate the database when you need to. Testing with an embedded database doesn't exercise the peculiarities that could come up with your specific production database, but it does test your code, which should suffice.
You can also use DbUnit, an extension to JUnit, to easily fill the database with expected rows and put it in a known state, before you run your Hibernate tests.
Best practice? I use Spring and make all my tests transactional. I perform the test and roll back all the changes so I don't change the state of the database.
I like to use an in-memory HSQLDB for testing. The process for each Hibernate POJO is (a sketch follows the list):
Create the object
Persist it to the DB
Clear the session
Get it from the DB
Assert the objects are equal.
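A rough sketch of those five steps using the Hibernate 5-style API; Product is an illustrative entity that is assumed to implement equals(), and hibernate-test.cfg.xml is an assumed test configuration pointing at an in-memory HSQLDB:

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class ProductPersistenceTest {

    @Test
    public void persistedProductIsReloadedWithEqualState() {
        SessionFactory sessionFactory =
                new Configuration().configure("hibernate-test.cfg.xml").buildSessionFactory();
        Session session = sessionFactory.openSession();
        try {
            session.beginTransaction();

            Product original = new Product("widget", 3);        // 1. create the object
            Long id = (Long) session.save(original);            // 2. persist it to the DB
            session.flush();
            session.clear();                                     // 3. clear the session

            Product reloaded = session.get(Product.class, id);  // 4. get it from the DB
            assertEquals(original, reloaded);                    // 5. assert the objects are equal

            session.getTransaction().rollback();
        } finally {
            session.close();
            sessionFactory.close();
        }
    }
}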
For DAOs, I create and persist enough objects to accurately test the methods, then run the tests and delete the objects as necessary so they don't interfere with other tests.
The Hibernate source includes a lot of unit tests; I would recommend going through those and adapting a similar approach.
You can also look at CaveatEmptor, which is the sample application developed for the book "Java Persistence with Hibernate".
If you're using Hibernate with rich domain models, unit testing the domain logic is as simple as testing a POJO, and Hibernate doesn't get in your way. The only caveat is that for bidirectional mappings you might have to set the object on both sides in your unit tests.
Integration testing against a database is generally not done for simple mappings. However, it is recommended for more intricate mappings such as single-table inheritance. The only thing to remember is that you may sometimes have to flush to the database explicitly.
Sure, you'd unit test your persistence tier if it wasn't written in Hibernate, wouldn't you?
Create a given persistence interface that's implemented using Hibernate, instantiate some sample objects, perform CRUD operations, and ask JUnit to assert that the operations were successful. Same as any other class.
You could use Spring to help here.
It has a great unit test framework: you can use it to test CRUD operations and then roll back the changes, which is great if you don't have the capability to reload the database every time.
Write a simple layer that passes requests to Hibernate. Then use a mocking library like EasyMock or JMock to assert that your Hibernate-veneer layer is correctly called by your application classes. This is nicely described in the partially-complete JMock book (scroll down to the test smell "everything is mocked").
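As a rough sketch of that idea, here is what such an interaction test might look like, using Mockito rather than the EasyMock/JMock mentioned above; OrderService, OrderGateway (the thin Hibernate-veneer layer), and Order are illustrative names:

import org.junit.Test;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

public class OrderServiceInteractionTest {

    @Test
    public void placingAnOrderSavesItThroughThePersistenceLayer() {
        // No real Hibernate session: we only verify the collaboration.
        OrderGateway gateway = mock(OrderGateway.class);
        OrderService service = new OrderService(gateway);

        Order order = new Order("forty-two");
        service.place(order);

        verify(gateway).save(order);
    }
}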
Two cases are easy to test:
When practical, perform your various calculations and transformations in functions that don't know about saving or loading entities. If you can make these pure functions, so much better.
For functions that only save to the database without reading from it, you can choose not to save when testing.
The simplest (crudest) way to do #2 is by adding a reallyUpdate parameter to the function, then surrounding each "save" call with:
if (reallyUpdate) {
    HibernateUtil.saveOrUpdate(theThing);
}
For me these were the lowest hanging fruit.
