Flyway has several integration options.
I'm trying to determine what the pros/cons are of using the Gradle integration vs the Spring Boot integration given that your project is already using both Spring Boot and Gradle.
The only thing I can think of is that if you want to be able to run migrations without starting the application, or want to save time by not migrating every time you start the app, then the Gradle option could be better.
Think of it as build time vs run time.
In general you will build an artifact once and deploy it to many environments, so run time is a much better fit.
However, sometimes build time makes sense. This is primarily for situations where you need a fully migrated database as part of the build, in order to, for example, generate code based on the structure of that database using frameworks like jOOQ or QueryDSL.
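As a sketch of the build-time option, the Flyway Gradle plugin can be configured roughly like this (the plugin version, connection details, and the jOOQ task name are placeholders, not taken from the question):

```groovy
// build.gradle (illustrative)
plugins {
    id "org.flywaydb.flyway" version "9.22.3"
}

flyway {
    url = "jdbc:postgresql://localhost:5432/mydb"   // placeholder connection
    user = "myuser"
    password = "mypassword"
    locations = ["filesystem:src/main/resources/db/migration"]
}

// For the code-generation use case, wire generation after migration, e.g.:
// generateJooq.dependsOn flywayMigrate
```

With this in place, `./gradlew flywayMigrate` runs migrations without starting the application at all.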
I'm building a basic HTTP API and some actions like POST /users create a new user record in the database.
I understand that I could mock these calls, but at some level I'm wondering if it's easier to let my JUnit tests run against a real (test) database. Is this a bad practice? Should only integration tests run against a real DB?
I'm using Flyway to maintain my test schema and Maven for my build, so I can have it recreate the test DB with the proper schema on each build. But I'm also worried that I'd need some additional overhead to maintain/clean the state of the database between tests, and I'm not sure if there's a good way to do that.
Unit tests are used to test a single unit of code. This means that you write a unit test that exercises one method only. If there are external dependencies, you mock them instead of actually calling and using those dependencies.
So, if your code interacts with a real database, it is not a unit test. Say your call to the DB fails for some reason; then the unit test will also fail. The success or failure of your unit tests should not depend on external dependencies like the DB in your case. You have to assume the DB call is successful, hard-code the data using a mocking framework (such as Mockito), and then test your method with that data.
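A minimal sketch of that idea in plain Java (the names `UserRepository`, `UserService`, and `greet` are hypothetical; in a real project you would likely use Mockito rather than a hand-rolled stub):

```java
import java.util.HashMap;
import java.util.Map;

public class UserServiceTest {

    // Hypothetical DAO interface that the service depends on.
    interface UserRepository {
        String findNameById(long id);
    }

    // Unit under test: it depends on the interface, never on a real DB.
    static class UserService {
        private final UserRepository repo;
        UserService(UserRepository repo) { this.repo = repo; }
        String greet(long id) {
            String name = repo.findNameById(id);
            return name == null ? "Hello, stranger!" : "Hello, " + name + "!";
        }
    }

    public static void main(String[] args) {
        // Hand-rolled stub standing in for the database: hard-coded data,
        // so the test cannot fail because of DB connectivity.
        Map<Long, String> fakeRows = new HashMap<>();
        fakeRows.put(1L, "Alice");
        UserRepository stub = fakeRows::get;

        UserService service = new UserService(stub);
        assert service.greet(1L).equals("Hello, Alice!");
        assert service.greet(2L).equals("Hello, stranger!");
        System.out.println("ok");
    }
}
```

The test exercises only the logic in `greet`; whether the real DAO implementation works belongs in a separate integration test.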
As so often, it depends.
On big projects with lots of JUnit tests, the performance overhead can be significant. The time needed to set up test data in the database, and the concepts needed to keep tests from interfering with each other's test data when JUnit tests run in parallel, are also strong arguments for testing against a database only when needed and mocking it away otherwise.
On small projects these problems may be easier to handle, so you could always use a database, but I personally wouldn't do that even on small projects.
As several other answers suggest, you should create unit tests that test small pieces of code with all external dependencies mocked.
However, sometimes (in fact, quite often) it is worth testing whole features, especially when you use a framework like Spring, or rely heavily on annotations. When your classes or methods carry annotations, the effects of those annotations usually cannot be tested via unit tests: you need the whole framework running during the test to make sure it works as expected.
In our current project we have almost as many integration tests as unit tests. We use the H2 in-memory DB for these tests; this way we can avoid failures caused by connectivity problems, and Spring's test support can collect and run multiple integration tests in the same test context, so the context has to be built only once for multiple tests, which keeps them from being too expensive to run.
You can also create separate test contexts for different parts of the project (with different settings and DB content), so that tests running under different contexts won't interfere with each other.
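For reference, a typical H2 test configuration in Spring Boot might look like this (the file location and values are illustrative, not from the answer above; you would activate it with something like `@ActiveProfiles("test")`):

```properties
# src/test/resources/application-test.properties (illustrative)
spring.datasource.url=jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=
spring.jpa.hibernate.ddl-auto=create-drop
```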
Don't be afraid of using a lot of integration tests. You need some anyway, and if you already have a test context, adding more tests to the same context is not a big deal.
Also, there are a lot of cases that would take a LOT of effort to cover with unit tests (or cannot be covered fully at all) but can be covered simply by an integration test.
A personal experience:
Our numerous integration tests were extremely useful when we upgraded from Spring Boot 1 to Spring Boot 2.
Back to the original question:
Unit tests should not connect to a real DB, but feel free to use more integration tests (with an in-memory DB).
Modern development practices recommend that every developer runs the full suite of unit tests often. Unit tests should be reliable (they should not fail if the code is OK). Using an external database can interfere with those desiderata.
If the database is shared, simultaneous runs of the test suite by different developers could interfere with each other.
Setting up and tearing down the database for each test is typically expensive, and thus can make the tests too slow for frequent execution.
However, using a real database for integration tests is OK. If you use an in-memory database instead of a fully-fledged one, even setting up and tearing down the database for each integration test can be acceptably fast.
A popular choice is the use of an in-memory database to run tests. This makes it easy to test, for example, repository methods and business logic involving database calls.
When opting for a "real" database, make sure that every developer has their own test database to avoid conflicts. The advantage of using a real database is that it prevents issues that could arise from slight differences in behavior between an in-memory and a real database. However, test execution performance can be an issue when running a large test suite against a real database.
Some databases can be embedded in a way that the database doesn't even need to be installed for test execution. For example, there is an SO thread about firing up an embedded Postgres in Spring Boot tests.
I want to improve my DB access code tests.
I am using the GAE datastore. To test the DB classes, I used a Backdoor Servlet. Just wondering, is there a more efficient and elegant way to do DAO testing?
What are your views on unit vs integration tests for DAOs?
It depends a bit on how your database is set up. Here are a couple of other options apart from what you already have:
you can write unit tests directly against your DAOs, mocking the database calls away with Mockito.
you can write tests that record the interaction with the database and then replay it when you run the tests a second time; see the Betamax library for this.
you can run tests against the actual database. These are no longer unit tests but a kind of integration test. In this case you will need to think about how to get a clean database state to start from.
you can run integration tests against the entire system and use a code coverage tool to make sure that most of your database code is touched.
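For the "clean state" option, one common approach is a cleanup script run before each test, along these lines (table and column names are placeholders, and the identity-reset statement shown is H2 syntax; other databases differ):

```sql
-- illustrative per-test cleanup, child tables first to satisfy foreign keys
DELETE FROM orders;
DELETE FROM users;
-- reset the identity column so tests see predictable IDs (H2 syntax)
ALTER TABLE users ALTER COLUMN id RESTART WITH 1;
```

Alternatives include wrapping each test in a transaction that is rolled back afterwards, or dropping and re-migrating the schema with Flyway.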
I prefer to have full-blown integration tests on the whole thing, including the database and any other third-party integrations, and unit tests on the particulars, not necessarily involving the actual database calls. But, as always, your setup may lead you in other directions.
I've written a JEE6 application using CDI and JPA. My tests are written in JUnit. I'd like to run the database tests against an in-memory HSQLDB database in order to make sure my JPQL (which I consider 'code') is tested. My motivation is that, with a mocked-out EntityManager, changing a JPQL statement would still lead to successful test execution, even if the change broke the query.
I'm using Guice and Jukito to run other (non-jpa) tests.
Does anyone have an example for this? I've tried looking around and I've yet to find a good example or framework project to handle this.
Arquillian persistence? DBUnit?
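For the in-memory HSQLDB part, a test-only persistence unit can be declared roughly like this (the unit name and file location are illustrative; the `javax.persistence.jdbc.*` properties are standard JPA 2.0, while the schema-generation property requires JPA 2.1):

```xml
<!-- src/test/resources/META-INF/persistence.xml (illustrative) -->
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
  <persistence-unit name="testPU" transaction-type="RESOURCE_LOCAL">
    <properties>
      <property name="javax.persistence.jdbc.driver" value="org.hsqldb.jdbcDriver"/>
      <property name="javax.persistence.jdbc.url" value="jdbc:hsqldb:mem:testdb"/>
      <property name="javax.persistence.jdbc.user" value="sa"/>
      <property name="javax.persistence.jdbc.password" value=""/>
      <property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
    </properties>
  </persistence-unit>
</persistence>
```

Tests can then bootstrap an EntityManager from this unit and run real JPQL against the in-memory database, which catches broken queries that a mocked EntityManager would let through.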
I used to use DBUnit to populate my database with the classes/records expected by my unit tests, and I noticed that Grails does sort of the same thing using Bootstrap.groovy, but I am wondering if this is the kosher way of doing it in Grails.
Is it better just to set up DBUnit within Grails? Or does Grails have its own way of doing this?
I wouldn't recommend Bootstrap.groovy for loading test data. It's likely to become unwieldy, particularly if you want to use different datasets for different tests. There are a number of DBUnit Grails plugins that you could use to simplify integrating DBUnit into a Grails app (though you can also just use the JAR directly).
There are also some plugins that provide Grails-specific ways of loading test data. The Fixtures plugin seems to be one of the most popular.
I've always used a combination of Bootstrap.groovy with an environments block and the tests' setUp()/tearDown() methods, sometimes utilizing a base test class.
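For completeness, the environments block in BootStrap.groovy looks roughly like this (the `User` domain class and its data are made up for illustration):

```groovy
// grails-app/conf/BootStrap.groovy (illustrative)
class BootStrap {
    def init = { servletContext ->
        environments {
            test {
                // seed data only for the test environment
                new User(name: "alice").save(failOnError: true)
            }
            development {
                new User(name: "dev-user").save(failOnError: true)
            }
        }
    }
    def destroy = { }
}
```

This keeps test fixtures out of production startup, though as noted above it can become unwieldy once different tests need different datasets.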