My database of choice is MongoDB. I'm writing a data-layer API to abstract implementation details from client applications - that is, I'm essentially providing a single public interface (an object which acts as an IDL).
I'm testing my logic as I go in a TDD manner. Before each unit test, an @Before method is called to create a database singleton; after the test completes, an @After method is called to drop the database. This helps to promote independence among the unit tests.
Nearly all unit tests, e.g. those performing a contextual query, require some kind of insertion logic to occur beforehand. My public interface provides an insert method - yet it seems incorrect to use this method as precursor logic for each unit test.
Really I need some kind of mocking mechanism, yet I haven't had much experience with mocking frameworks, and Google seems to return nothing regarding a mocking framework one might use with MongoDB.
What do others do in these situations? That is, how do people unit test code that interacts with a database?
Also, my public interface connects to a database defined in an external configuration file - it seems incorrect to use this connection for my unit testing - again, a situation that would benefit from some kind of mocking?
Technically, tests that talk to a database (NoSQL or otherwise) are not unit tests, as they test interactions with an external system and not just an isolated unit of code. However, tests that talk to a database are often extremely useful, and are often fast enough to run alongside the other unit tests.
Usually I have a Service interface (e.g. UserService) which encapsulates all the logic for dealing with the database. Code that relies on UserService can use a mocked version of UserService and is easily tested, for example:
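A minimal sketch of that first case with plain Mockito (UserService, its findByName method, and the RegistrationService consumer are all made-up names for illustration):

import static org.mockito.Mockito.*;
import org.junit.Test;

public class RegistrationServiceTest {

    @Test
    public void registrationLooksUpTheExistingUser() {
        // Mock the service interface so no database is touched
        UserService userService = mock(UserService.class);
        when(userService.findByName("alice")).thenReturn(new User("alice"));

        // The class under test depends only on the UserService interface
        RegistrationService registration = new RegistrationService(userService);
        registration.register("alice");

        verify(userService).findByName("alice");
    }
}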
When testing the implementation of the service that talks to Mongo (e.g. MongoUserService), the easiest approach is to write some Java code that starts and stops a mongod process on the local machine and have your MongoUserService connect to that; see this question for some notes.
You could try to mock the functionality of the database while testing MongoUserService, but generally that is too error prone and doesn't test what you really want to test, which is the interaction with a real database. So when writing tests for MongoUserService, you set up a database state for each test. Look at DbUnit for an example of a framework for doing so with a SQL database.
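A rough sketch of that per-test setup against a locally running mongod, using the legacy com.mongodb driver; the MongoUserService constructor and findByName method are assumptions:

import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.MongoClient;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertNotNull;

public class MongoUserServiceTest {

    private MongoClient client;
    private DB db;

    @Before
    public void setUp() {
        client = new MongoClient("localhost", 27017);      // local test mongod
        db = client.getDB("userservice-test");
        DBCollection users = db.getCollection("users");
        users.insert(new BasicDBObject("name", "alice"));  // known fixture state
    }

    @After
    public void tearDown() {
        db.dropDatabase();   // keep tests independent of each other
        client.close();
    }

    @Test
    public void findsSeededUser() {
        // hypothetical constructor and query method
        MongoUserService service = new MongoUserService(client, "userservice-test");
        assertNotNull(service.findByName("alice"));
    }
}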
As sbridges wrote in this post, it is a bad idea not to have a dedicated service (sometimes also known as a repository or DAO) that abstracts the data access from the logic. You can then test the logic by providing a mock of the DAO.
Another approach I use is to create a mock of the Mongo object (e.g. with PowerMockito) and return the appropriate results.
This is because you don't have to test whether the database works in unit tests; rather, you should test whether the right query was sent to the database.
Mongo mongo = PowerMockito.mock(Mongo.class);
DB db = PowerMockito.mock(DB.class);
DBCollection dbCollection = PowerMockito.mock(DBCollection.class);
PowerMockito.when(mongo.getDB("foo")).thenReturn(db);
PowerMockito.when(db.getCollection("bar")).thenReturn(dbCollection);
MyService svc = new MyService(mongo); // Use some kind of dependency injection
svc.getObjectById(1);
Mockito.verify(dbCollection).findOne(new BasicDBObject("_id", 1));
That would also be an option. Of course, the creation of the mocks and the returning of the appropriate objects above is just coded as an example.
I wrote a MongoDB fake implementation in Java: mongo-java-server
The default is an in-memory backend that can easily be used in unit and integration tests.
Example
MongoServer server = new MongoServer(new MemoryBackend());
// bind on a random local port
InetSocketAddress serverAddress = server.bind();
MongoClient client = new MongoClient(new ServerAddress(serverAddress));
DBCollection coll = client.getDB("testdb").getCollection("testcoll");
// creates the database and collection in memory and inserts the object
coll.insert(new BasicDBObject("key", "value"));
assertEquals(1, coll.count());
assertEquals("value", coll.findOne().get("key"));
client.close();
server.shutdownNow();
Today I think the best practice is to use the Testcontainers library (Java) or its testcontainers-python port for Python. It allows you to use Docker images in your unit tests.
To run a container from Java code, just instantiate a GenericContainer object (example):
GenericContainer mongo = new GenericContainer("mongo:latest")
        .withExposedPorts(27017);
mongo.start(); // start the container before asking for the mapped port
MongoClient mongoClient = new MongoClient(mongo.getContainerIpAddress(), mongo.getMappedPort(27017));
MongoDatabase database = mongoClient.getDatabase("test");
MongoCollection<Document> collection = database.getCollection("testCollection");
Document doc = new Document("name", "foo")
.append("value", 1);
collection.insertOne(doc);
Document doc2 = collection.find(new Document("name", "foo")).first();
assertEquals("A record can be inserted into and retrieved from MongoDB", 1, doc2.get("value"));
or in Python (example):

mongo = GenericContainer('mongo:latest')
mongo.with_bind_ports(27017, 27017)

with mongo:
    def connect():
        return MongoClient("mongodb://{}:{}".format(mongo.get_container_host_ip(),
                                                    mongo.get_exposed_port(27017)))

    db = wait_for(connect).primer
    result = db.restaurants.insert_one(
        # JSON as dict object
    )
    cursor = db.restaurants.find({"field": "value"})
    for document in cursor:
        print(document)
I'm surprised no one has advised using fakemongo so far. It emulates the Mongo client pretty well, and it all runs in the same JVM as the tests - so integration tests become robust and technically much closer to true "unit tests", since no foreign-system interaction takes place. It's like using embedded H2 to unit test your SQL code.
I was very happy using fakemongo in unit tests that exercise database integration code in an end-to-end manner. Consider this configuration in a test Spring context:
@Configuration
@Slf4j
public class FongoConfig extends AbstractMongoConfiguration {

    @Override
    public String getDatabaseName() {
        return "mongo-test";
    }

    @Override
    @Bean
    public Mongo mongo() throws Exception {
        log.info("Creating Fake Mongo instance");
        return new Fongo("mongo-test").getMongo();
    }

    @Bean
    @Override
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(mongo(), getDatabaseName());
    }
}
With this you can test your code that uses MongoTemplate from the Spring context, and in combination with nosql-unit, jsonunit, etc. you get robust unit tests that cover your Mongo querying code.
@Test
@UsingDataSet(locations = {"/TSDR1326-data/TSDR1326-subject.json"}, loadStrategy = LoadStrategyEnum.CLEAN_INSERT)
@DatabaseSetup({"/TSDR1326-data/dbunit-TSDR1326.xml"})
public void shouldCleanUploadSubjectCollection() throws Exception {
    //given
    JobParameters jobParameters = new JobParametersBuilder()
            .addString("studyId", "TSDR1326")
            .addString("execId", UUID.randomUUID().toString())
            .toJobParameters();

    //when
    //next line runs a Spring Batch ETL process loading data from a SQL DB (H2) into Mongo
    final JobExecution res = jobLauncherTestUtils.launchJob(jobParameters);

    //then
    assertThat(res.getExitStatus()).isEqualTo(ExitStatus.COMPLETED);
    final String resultJson = mongoTemplate.find(new Query().with(new Sort(Sort.Direction.ASC, "topLevel.subjectId.value")),
            DBObject.class, "subject").toString();
    assertThatJson(resultJson).isArray().ofLength(3);
    assertThatDateNode(resultJson, "[0].topLevel.timestamp.value").isEqualTo(res.getStartTime());
    assertThatNode(resultJson, "[0].topLevel.subjectECode.value").isStringEqualTo("E01");
    assertThatDateNode(resultJson, "[0].topLevel.subjectECode.timestamp").isEqualTo(res.getStartTime());
    // ... etc
}
I used fakemongo without problems with the mongo 3.4 driver, and the community is really close to releasing a version that supports the 3.6 driver (https://github.com/fakemongo/fongo/issues/316).
I have written several unit tests and have now switched to writing integration tests in our Java (Spring Boot) app. We use the JUnit and Mockito libraries for testing.
As far as I know, integration tests check the entire flow rather than a single function. However, I am unsure whether I should also check the if conditions inside the methods while integration testing. Here is an example service method:
@Override
public CountryDTO create(CountryRequest request) {
    if (countryRepository.existsByCodeIgnoreCase(countryCode)) {
        throw new EntityAlreadyExistsException();
    }

    final Country country = new Country();
    country.setCode("UK");
    country.setName("United Kingdom");

    final Country created = countryRepository.save(country);
    return new CountryDTO(created);
}
My questions are:
1. Can I write integration tests for a Service or a Repository class?
2. When I test the create method in my service above, I think I just create the proper request values (CountryRequest) in my test class, pass them to the create method and then check the returned value. Is that true? Or do I also need to test the condition in the if clause (countryRepository.existsByCodeIgnoreCase(countryCode))?
3. When I test find methods, I think I should first create a record by calling the create method, and the proper place for this is a @BeforeEach setup() {} method. Is that true?
If you wrote unit tests that make sure your services and repositories work correctly (for example via validation and parameterized tests), I believe you don't have to write integration tests for them.
You should write integration tests to check the behavior of your app as a whole. By testing that your controller works correctly, you will also check that the service and repository are OK (see the sketch below this answer).
I believe a unit test should check that condition.
Are you asking whether you should create a record in the DB? If you want to test that the repository communicates correctly with the service, and the service with the controller, you have to do it with some data.
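A rough sketch of such a controller-level integration test, assuming a hypothetical CountryController mapped to POST /countries, a JSON body matching CountryRequest, and a standard Spring Boot test setup (MockMvc drives the controller, which in turn exercises the service and repository):

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MockMvc;

@SpringBootTest
@AutoConfigureMockMvc
class CountryIntegrationTest {

    @Autowired
    private MockMvc mockMvc;

    @Test
    void createCountryGoesThroughControllerServiceAndRepository() throws Exception {
        // Endpoint, payload and expected status/JSON depend on your real controller
        mockMvc.perform(post("/countries")
                        .contentType(MediaType.APPLICATION_JSON)
                        .content("{\"code\":\"UK\",\"name\":\"United Kingdom\"}"))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$.code").value("UK"));
    }
}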
I have a service with a jOOQ select in myMethod():
Result<MyRecord> records = dsl.selectFrom(MyTable).fetch();
And I am trying to write a unit test using Mockito. I want to mock my dsl and get two different results when I call service.myMethod():
Mockito.when(dsl.selectFrom(MyTable)).thenReturn(result)
But it doesn't work. How can I mock my select request?
Mocking the jOOQ API
If you want to mock the jOOQ API, the interesting methods to mock are the various fetch() and execute() methods, as they act as the "terminal" methods of the jOOQ DSL API; all the intermediate methods should produce new mocks, not actual results.
However, as jOOQ's DSL is vast, you might overlook a variety of edge cases, so mocking it entirely might not work too well.
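If you still want to go that route, Mockito's deep stubs make every intermediate DSL call return another mock automatically, so only the terminal fetch() needs explicit stubbing. A sketch, assuming MyTable is your generated table, firstResult and secondResult are Result fixtures you built beforehand (e.g. with DSL.using(SQLDialect.DEFAULT).newResult(...)), and your service receives the DSLContext through its constructor:

import static org.mockito.Mockito.RETURNS_DEEP_STUBS;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.jooq.DSLContext;

// Every intermediate DSL call (selectFrom, where, ...) returns another mock
DSLContext dsl = mock(DSLContext.class, RETURNS_DEEP_STUBS);

// The first call to myMethod() sees firstResult, the second call sees secondResult
when(dsl.selectFrom(MyTable).fetch()).thenReturn(firstResult, secondResult);

MyService service = new MyService(dsl);
service.myMethod();
service.myMethod();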
Mocking the JDBC API
If you really want to mock, I would rather do it at the JDBC level. jOOQ implements a mock JDBC Connection, allowing you to mock the entirety of JDBC using a single lambda expression, where you can do something like this (from the manual):
MockDataProvider provider = new MyProvider();
MockConnection connection = new MockConnection(provider);
DSLContext ctx = DSL.using(connection, SQLDialect.ORACLE);
Result<BookRecord> result = ctx.selectFrom(BOOK).where(BOOK.ID.eq(5)).fetch();
And then (simplified):
public class MyProvider implements MockDataProvider {

    @Override
    public MockResult[] execute(MockExecuteContext ctx) throws SQLException {

        // DSLContext used purely to construct the mock result set
        DSLContext create = DSL.using(SQLDialect.ORACLE);
        MockResult[] mock = new MockResult[1];
        String sql = ctx.sql();

        if (sql.toUpperCase().startsWith("DROP")) {
            throw new SQLException("Statement not supported: " + sql);
        }
        else if (sql.toUpperCase().startsWith("SELECT")) {
            Result<Record2<Integer, String>> result =
                create.newResult(AUTHOR.ID, AUTHOR.LAST_NAME);
            result.add(create
                .newRecord(AUTHOR.ID, AUTHOR.LAST_NAME)
                .values(1, "Orwell"));
            mock[0] = new MockResult(1, result);
        }

        return mock;
    }
}
There's also an out-of-the-box, file-based version of this mock implementation, the MockFileDatabase, which uses regular expressions to match SQL strings and provides a text-based syntax to construct result sets for your queries.
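From my reading of the jOOQ manual, such a file pairs each statement with its tabular result and a row count; a rough sketch (the file name and contents here are made up):

// src/test/resources/queries.txt -- a statement, its result rows prefixed with '>', then a row count:
//
//   select first_name, last_name from author;
//   > first_name last_name
//   > ---------- ---------
//   > George     Orwell
//   @ rows: 1

MockDataProvider provider = new MockFileDatabase(new File("src/test/resources/queries.txt"));
DSLContext ctx = DSL.using(new MockConnection(provider), SQLDialect.ORACLE);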
Mocking databases in general
Please beware that mocking is not a good way to test database interaction - it only gets you so far, in simple cases. A much better, more thorough approach is integration testing: if you must, use an in-memory database, but ideally use your production database product (e.g. Oracle, PostgreSQL, etc.) via something like Testcontainers.
I need to write the JUnit tests for an Oracle wrapper (basically a microservice written on Vert.x which interacts with an Oracle DB). How do I proceed? Mockito can't be used.
How about using an in-memory database, e.g. the H2 database, which can run in Oracle compatibility mode:
To use the Oracle mode, use the database URL
jdbc:h2:~/test;MODE=Oracle or the SQL statement SET MODE Oracle.
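For instance, a test could open an in-memory H2 connection in Oracle mode and run the wrapper's SQL against it; a minimal sketch (the schema is just an example):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// In-memory H2 in Oracle compatibility mode; DB_CLOSE_DELAY=-1 keeps the
// database alive for the whole JVM so several tests can share it.
Connection conn = DriverManager.getConnection(
        "jdbc:h2:mem:oraclewrapper;MODE=Oracle;DB_CLOSE_DELAY=-1", "sa", "");

try (Statement stmt = conn.createStatement()) {
    stmt.execute("CREATE TABLE student (id INT PRIMARY KEY, name VARCHAR(100))");
    stmt.execute("INSERT INTO student VALUES (1, 'Alice')");
}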
First you write unit tests focused on establishing that the DAO works properly, i.e. that every insert, delete, update and query works as intended, and so on. This approach assumes that network access from the clients to the microservice works properly.
Example:
public class MyFirstDaoTest {

    private static final MyFirstDao dao = new MyFirstDao(dbAddress, dbName, ...);

    @Test
    public void insert() {
        SomeResult result = dao.insert(InsertSomeObject);
        assertSomething(result);
    }

    ...
}
After that you can create a fake client that accesses the microservice and performs predefined operations. Though if you only have one type of client accessing your microservice, I would probably put these tests on the client rather than write the same code twice. I'm just speculating here, but I hope this is of use.
I have a method inside my Dao class like this:
@Override
public List<Dog> loadAllDog(Date pDate) {
    final MapSqlParameterSource lParameterSource = new MapSqlParameterSource();
    lParameterSource.addValue("jdate", pDate);

    final String lSql = readSqlQuery("LAD");
    final NamedParameterJdbcTemplate lTemplate = createNamedParameterJdbcTemplate();

    return lTemplate.query(lSql, lParameterSource, new DogExtractor());
}
I use the above method to load data for an integration test. Unfortunately, the result list contains about 300,000 rows.
For my test it is OK to work with only 100 rows, so I wrote a test SQL file (key LAD_TEST) that returns only 100 rows:
SELECT
*
FROM
DOG
WHERE
TO_CHAR(sell, 'dd.mm.yy') = TO_CHAR(:jdate,'dd.mm.yy')
and rownum <= 100
My question is: can I somehow use that test SQL (LAD_TEST) instead of the real production SQL (LAD) without changing the production code at final String lSql = readSqlQuery("LAD"); ???
I am using JMockit in my test class, but the DAO class (mDogDao) I am talking about is not mocked...
The call from my test:
List<Dog> lAllDog = mDogDao.loadAllDog(lNow.getTime());
Is there any way to manage this with jmockit without mocking mDogDao?
Some advice?
Thx
Stefan
You can mock the NamedParameterJdbcTemplate class and record an expectation so that the query(...) method returns your desired test data, for example:
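A sketch with JMockit; @Mocked on the field mocks every NamedParameterJdbcTemplate created inside createNamedParameterJdbcTemplate(), so mDogDao itself stays real. The DogDao class name, its no-arg constructor, and the stubbed list are assumptions:

import java.util.Arrays;
import java.util.Date;
import java.util.List;

import mockit.Expectations;
import mockit.Mocked;
import org.junit.Test;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;

import static org.junit.Assert.assertEquals;

public class DogDaoTest {

    // All NamedParameterJdbcTemplate instances created during the test are mocked
    @Mocked
    NamedParameterJdbcTemplate mockTemplate;

    private final DogDao mDogDao = new DogDao();  // the real, unmocked DAO (hypothetical constructor)

    @Test
    public void loadAllDogReturnsStubbedRows() {
        final List<Dog> stubbedDogs = Arrays.asList(new Dog(), new Dog());

        new Expectations() {{
            // Whatever SQL and parameters are passed in, return the small test list
            mockTemplate.query(anyString, (MapSqlParameterSource) any, (DogExtractor) any);
            result = stubbedDogs;
        }};

        List<Dog> lAllDog = mDogDao.loadAllDog(new Date());
        assertEquals(2, lAllDog.size());
    }
}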
Why would you want to query your live database in a unit test?
What I always do is work against a separate unit-test database schema or an in-memory database. That way I am certain that a bug in my queries doesn't influence the data used by other people.
Even if you need a certain amount of test data, you can always insert an extract of your data before your test and clean it up afterwards.
Moreover, this way your unit tests also run in isolation: if one test modifies data, your other tests don't suffer from the possible consequences.
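A bare-bones sketch of that insert/clean-up cycle, here against an in-memory H2 schema (the table and data are just examples):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import org.junit.After;
import org.junit.Before;

public class DogQueryTest {

    private Connection conn;

    @Before
    public void insertTestData() throws Exception {
        // A dedicated test schema, never the live database
        conn = DriverManager.getConnection("jdbc:h2:mem:dogtest;DB_CLOSE_DELAY=-1", "sa", "");
        try (Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE TABLE IF NOT EXISTS dog (id INT PRIMARY KEY, sell DATE)");
            stmt.execute("INSERT INTO dog VALUES (1, CURRENT_DATE)");
        }
    }

    @After
    public void cleanUpTestData() throws Exception {
        try (Statement stmt = conn.createStatement()) {
            stmt.execute("DELETE FROM dog"); // leave the schema empty for the next test
        }
        conn.close();
    }

    // ... tests that run the production queries against the seeded data ...
}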
I have a query regarding writing unit tests for web methods which actually communicate with a database and return some value.
Say, for example, I have a web service named "StudentInfoService".
That web service provides an API getStudentInfo(studentId).
Here is a sample snippet:
public class StudentInfoService
{
    public StudentInfo getStudentInfo(long studentId) {
        // Communicates with the DB and creates a
        // StudentInfo object with the necessary information
        // and returns it to the caller.
    }
}
How do we actually write unit tests for this getStudentInfo method?
Generally, how do we write unit tests for methods which involve a connection to a resource (database, files, JNDI, etc.)?
Firstly, the class StudentInfoService in your example is not testable, or at least not easily. This is for a very simple reason: there is no way to pass a database connection object into the class, at least not in the method that you've listed.
Making the class testable would require you to build your class in the following manner:
public class StudentInfoService
{
    private Connection conn;

    public StudentInfoService(Connection conn)
    {
        this.conn = conn;
    }

    public StudentInfo getStudentInfo(long studentId) {
        // Uses the conn object to communicate with the DB and creates a
        // StudentInfo object with the necessary information
        // and returns it to the caller.
    }
}
The above code allows for dependency injection via the constructor. You may use setter injection instead of constructor injection if that is more suitable, but it usually isn't for DAO/repository classes, as the class cannot be considered fully formed without a connection.
Dependency injection allows your test cases to create a connection to a database (which is a collaborator of your class/system under test) instead of having the class/system itself create its collaborator objects. In simpler words, you are decoupling the mechanism of establishing database connections from your class. If your class was previously looking up a JNDI datasource and then creating a connection, it would have been untestable unless you deployed it to a container using Apache Cactus or a similar framework like Arquillian, or used an embedded container. By isolating the concern of creating the connection from the class, you are now free to create connections in your unit tests outside the class and provide them to the class on an as-needed basis, allowing you to run tests in a plain Java SE environment.
This enables you to use a database-oriented unit testing framework like DbUnit, which allows you to set up the database in a known state before every test, then pass the connection to the StudentInfoService class, and then assert the state of the class (and of the collaborator, i.e. the database) after the test.
It must be emphasized that when you unit test your classes, your classes alone must be the only systems under test. Items like connections and datasources are mere collaborators that could and ought to be mocked. Some unit tests would use in-memory databases like H2, HSQL, or Derby, while the production-equivalent database installations are used for integration and functional testing.
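A rough sketch of that DbUnit flow against an in-memory H2 database; the dataset file, the pre-created schema, and the StudentInfo getter are assumptions:

import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;

import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.dbunit.operation.DatabaseOperation;
import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class StudentInfoServiceTest {

    private Connection conn;

    @Before
    public void seedDatabase() throws Exception {
        // In-memory database standing in for the production one;
        // the STUDENT table is assumed to be created elsewhere (e.g. by an SQL script)
        conn = DriverManager.getConnection("jdbc:h2:mem:students;DB_CLOSE_DELAY=-1", "sa", "");

        // Put the database into a known state before every test
        IDatabaseConnection dbUnitConn = new DatabaseConnection(conn);
        IDataSet dataSet = new FlatXmlDataSetBuilder().build(new File("src/test/resources/students.xml"));
        DatabaseOperation.CLEAN_INSERT.execute(dbUnitConn, dataSet);
    }

    @Test
    public void returnsStudentFromSeededData() {
        // The connection is injected, exactly as described above
        StudentInfoService service = new StudentInfoService(conn);
        StudentInfo info = service.getStudentInfo(42L);
        assertEquals("Alice", info.getName());
    }
}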
Try using DbUnit: http://www.dbunit.org/intro.html
The main idea: create a stub database with a known dataset, run your tests against it, and assert the results.
You will need to reload the dataset before each run to restore the initial state.
We are using the in-memory HSQL database. It is very fast and SQL-92 compliant. In order to make our PostgreSQL queries run on HSQL, we rewrite the queries using a self-written test SessionFactory (Hibernate). Advantages over a real database are:
much faster, which is important for unit tests
requires no configuration
runs everywhere, including our continuous integration server
When working with "legacy code", it can be difficult to write unit tests without some level of refactoring. When writing objects, I try to adhere to SOLID. As part of SOLID, the "D" stands for dependency inversion.
The problem with legacy code is that you may already have numerous clients that are using the no arg constructor of StudentInfoService, which can make adding a constructor that takes a Connection conn parameter difficult.
What I would suggest isn't generally best practice, because you're exposing test code in your production system, but it is sometimes optimal for working with legacy code.
public class StudentInfoService {

    private final Connection conn;

    /**
     * This no-arg constructor will automatically establish a connection for you. It
     * will remain around to support legacy code that depends on a no-arg constructor.
     */
    public StudentInfoService() throws Exception {
        conn = new ConcreteConnectionObject( ... );
    }

    /**
     * This constructor may be used by your unit tests (or new code).
     */
    public StudentInfoService( Connection conn ) {
        this.conn = conn;
    }

    public StudentInfo getStudentInfo() {
        // this method will need to be slightly refactored to use
        // the class variable "conn" instead of establishing its own connection inline.
    }
}