I have several DAO objects that are used to retrieve information from a database and I really want to write some automated tests for them but I'm having a hard time figuring out how to do it.
I'm using Spring's JdbcTemplate to run the actual query (via a prepared statement) and map the results to the model object (via the RowMapper class).
If I were to write unit tests, I'm not sure how I would/should mock the objects. For example, since there are only reads, I would use the actual database connection and not mock the jdbcTemplate, but I'm not sure that's right.
Here's the (simplified) code for the simplest DAO of the batch:
/**
 * Implementation of the {@link BusinessSegmentDAO} interface using JDBC.
 */
public class GPLBusinessSegmentDAO implements BusinessSegmentDAO {

    private JdbcTemplate jdbcTemplate;

    private static class BusinessSegmentRowMapper implements RowMapper<BusinessSegment> {
        public BusinessSegment mapRow(ResultSet rs, int rowNum) throws SQLException {
            // mapRow already declares SQLException, so there is no need to catch it
            // here and return null (a null row from a RowMapper is a bad idea anyway)
            return new BusinessSegment(rs.getString(...));
        }
    }
    private static class GetBusinessSegmentsPreparedStatementCreator
            implements PreparedStatementCreator {

        private final String cc, ll;

        private GetBusinessSegmentsPreparedStatementCreator(String cc, String ll) {
            this.cc = cc;
            this.ll = ll;
        }

        public PreparedStatement createPreparedStatement(Connection connection)
                throws SQLException {
            String sql = "SELECT ...";
            PreparedStatement ps = connection.prepareStatement(sql);
            ps.setString(1, cc);
            ps.setString(2, ll);
            return ps;
        }
    }
    public GPLBusinessSegmentDAO(DataSource dataSource) {
        jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public Collection<BusinessSegment> getBusinessSegments(String cc, String ll) {
        return jdbcTemplate.query(
                new GetBusinessSegmentsPreparedStatementCreator(cc, ll),
                new BusinessSegmentRowMapper());
    }
}
Any idea would be appreciated.
Thanks!
Please have a look at the links below:
Testing SQL queries with Spring and DbUnit
MockObjects or DBUnit for testing Code using JdbcTemplate
Hope that helps.
EDIT:
Here is the GitHub version of RowMapperTests for easy reference.
I recommend breaking your dependency on the JdbcTemplate class and using the JdbcOperations interface instead, e.g.
public class GPLBusinessSegmentDAO implements BusinessSegmentDAO {

    private final JdbcOperations jdbc;

    public GPLBusinessSegmentDAO(DataSource dataSource) {
        this(new JdbcTemplate(dataSource));
    }

    public GPLBusinessSegmentDAO(JdbcOperations jdbc) {
        this.jdbc = jdbc;
    }

    // ... DAO methods here
}
Your unit test can invoke the second constructor, passing in a mock JdbcOperations object. Since all DB operations are performed via the jdbc object, you can mock that easily enough.
Your live code can call the first constructor as before.
To write a true unit test for this, you would not be touching a real database.
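For instance, a unit test for getBusinessSegments() could look roughly like this (a sketch using Mockito and JUnit 4; it assumes the two-constructor DAO above, and mocks BusinessSegment to avoid assuming its constructor):

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Collection;
import java.util.Collections;
import org.junit.Test;
import org.springframework.jdbc.core.JdbcOperations;
import org.springframework.jdbc.core.PreparedStatementCreator;
import org.springframework.jdbc.core.RowMapper;

public class GPLBusinessSegmentDAOTest {

    @Test
    public void returnsWhateverTheTemplateProduces() {
        JdbcOperations jdbc = mock(JdbcOperations.class);
        BusinessSegment segment = mock(BusinessSegment.class);

        // Stub the query(PreparedStatementCreator, RowMapper) overload the DAO uses.
        when(jdbc.query(any(PreparedStatementCreator.class), any(RowMapper.class)))
                .thenReturn(Collections.singletonList(segment));

        GPLBusinessSegmentDAO dao = new GPLBusinessSegmentDAO(jdbc);
        Collection<BusinessSegment> result = dao.getBusinessSegments("US", "en");

        assertEquals(1, result.size());
    }
}
```

Note that this only verifies the wiring through JdbcOperations; it says nothing about the SQL itself.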
You may however find it more practical to pass in a real DataSource to your underlying db, and test the getBusinessSegments() method returns 0, 1 and many results depending on the cc and ll values you pass in.
Another option worth investigating would be to pass in the DataSource of an embedded Java database initialised with your schema in a setUp/@Before method. I guess what you really want to test is that the SELECT ... query maps correctly to the schema, so such a test would catch any errors that arise at runtime when, say, the schema changes.
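As a sketch of that last option (assuming an HSQL dependency on the test classpath, and that "schema.sql" and "test-data.sql" scripts exist on the classpath):

```java
import javax.sql.DataSource;
import org.junit.Before;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

public class GPLBusinessSegmentDAOIntegrationTest {

    private DataSource dataSource;

    @Before
    public void setUp() {
        // Builds an in-memory database and runs the DDL/seed scripts against it;
        // the script names here are assumptions for illustration.
        dataSource = new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.HSQL)
                .addScript("schema.sql")
                .addScript("test-data.sql")
                .build();
    }

    // Tests would construct new GPLBusinessSegmentDAO(dataSource) and assert on
    // the rows returned for cc/ll values seeded by test-data.sql.
}
```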
I have the following code in my Java class in a Spring Boot (v. 2.2.1.RELEASE) application:
@Inject
private JdbcTemplate jdbcTemplate;

@Inject
private MyRowCallbackHandler myRowCallbackHandler;

public void myMethod() {
    jdbcTemplate.query(MY_QUERY, myRowCallbackHandler);
}
The JDBC template object is an implementation of org.springframework.jdbc.core.JdbcTemplate and the handler is an implementation of org.springframework.jdbc.core.RowCallbackHandler.
With JUnit version 4 and Mockito, can I mimic the retrieval of one or more rows from a database by the query method, thus calling the handler's processRow() method?
Thanks for any assistance.
I ran into this problem in my own code, and thought I'd share the solution here, even though it's slightly different from the situation above, as I mock the jdbcTemplate as well.
@InjectMocks
private JdbcOperationRepository jdbcOperationRepository;

@Mock
private NamedParameterJdbcTemplate mockJdbcTemplate;

@Test
public void testMyResults() {
    final ResultSet mockResult1 = mock(ResultSet.class);
    when(mockResult1.getString(MY_COLUMN)).thenReturn(value);
    // ... other when statements to mock the returned data

    final ResultSet mockResult2 = mock(ResultSet.class);
    // ... stub mockResult2 the same way for a second row

    doAnswer(invocation -> {
        // the RowCallbackHandler is the third argument of
        // query(String, SqlParameterSource, RowCallbackHandler)
        RowCallbackHandler callbackHandler = invocation.getArgument(2);
        callbackHandler.processRow(mockResult1);
        callbackHandler.processRow(mockResult2);
        return null;
    }).when(mockJdbcTemplate).query(any(), any(), any(RowCallbackHandler.class));
}
I was struggling with the same problem. First, keep in mind that you can mock almost anything in Spring Boot using JUnit tests.
I am writing the solution in a generalized pattern so that everyone can get an idea of how it is implemented and used.
Suppose, we have a function implemented in our BookService.java class:
@Service
public class BookService {

    @Inject
    private NamedParameterJdbcOperations jdbcTemplate;

    @Inject
    private FileHelper fileHelper;

    public List<BookDTO> getBooks(Long libraryId) {
        String sql = fileHelper.getFileContents("SQL_FILE_PATH");
        Map<String, Object> parameterMap = new HashMap<>();
        if (libraryId != null) {
            parameterMap.put("libraryId", libraryId);
        }
        List<BookDTO> books = new ArrayList<>();
        jdbcTemplate.query(sql, parameterMap, rs -> {
            BookDTO book = new BookDTO();
            book.setId(rs.getLong("id"));
            book.setName(rs.getString("name"));
            if (rs.getObject("librarian") != null) {
                book.setLibrarian(rs.getString("librarian"));
            }
            books.add(book);
        });
        return books;
    }
}
Now, we want to mock the functionality of the jdbcTemplate.query() method for the above service class.
Our test should be written in this pattern:
public class BookServiceMockTest {

    @Mock
    private NamedParameterJdbcOperations jdbcTemplate;

    @Mock
    private FileHelper fileHelper;

    private BookService mockBookService;

    @Before
    public void setup() {
        mockBookService = new BookService();
        // use reflection utilities to set the fileHelper and jdbcTemplate fields
        // ....
    }

    @Test
    public void getBooksTest() {
        when(fileHelper.getFileContents("SQL_FILE_PATH")).thenReturn("SOME SQL");
        doAnswer(invocation -> {
            // we are going to mock the ResultSet class
            ResultSet rs = Mockito.mock(ResultSet.class);
            when(rs.getLong("id")).thenReturn(1L);
            when(rs.getString("name")).thenReturn("Game of Thrones");
            // this will mock the getObject() check in the if() statement
            when(rs.getObject("librarian")).thenReturn("John Doe");
            // this will mock the actual getString() call inside the if statement
            when(rs.getString("librarian")).thenReturn("John Doe");
            // the argument index is important here: getArgument(2) is the third
            // parameter of jdbcTemplate.query(sql, parameterMap, callbackHandler)
            // in the BookService.getBooks() function
            RowCallbackHandler rch = (RowCallbackHandler) invocation.getArgument(2);
            // as we are mocking only one row
            rch.processRow(rs);
            /* // if we wanted two or more rows:
            when(rs.getLong("id")).thenReturn(1L).thenReturn(2L);
            when(rs.getString("name")).thenReturn("Game of Thrones").thenReturn("Dance of the Dragon");
            int n = 2; // number of rows
            for (int i = 0; i < n; i++) {
                rch.processRow(rs);
            }
            */
            return null;
        })
        // the parameters used here are important; any mismatch means the stub will not match
        .when(jdbcTemplate).query(eq("SOME SQL"), anyMap(), any(RowCallbackHandler.class));

        // pass a real value here: Mockito matchers such as anyLong() must not be
        // used outside stubbing or verification
        List<BookDTO> books = mockBookService.getBooks(1L);

        verify(jdbcTemplate, times(1)).query(eq("SOME SQL"), anyMap(), any(RowCallbackHandler.class));
        assertThat(books).hasSize(1);
    }
}
I hope this answered what you were looking for!
I'm learning Mockito and unit testing in general. I want to learn how to unit test better by using ArgumentCaptor. I'm using JDBC to handle my SQL statements. I have a method that inserts a user into my DB.
public void insert(User user) {
    String sql = "INSERT INTO user (id) VALUES ?";
    jdbcTemplate.update(new PreparedStatementCreator() {
        @Override
        public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
            final PreparedStatement ps = connection.prepareStatement(sql);
            ps.setString(1, user.getId().trim());
            return ps;
        }
    });
}
Below is the test that I'm trying to write with ArgumentCaptor.
@Test
public void testInsert() {
    User user = new User("testID");
    ArgumentCaptor<PreparedStatementCreator> captor = ArgumentCaptor.forClass(PreparedStatementCreator.class);

    insert(user);

    verify(mockJdbc, times(1)).update(captor.capture());
    PreparedStatementCreator actual = captor.getValue();
    assertEquals(??, actual.createPreparedStatement(??));
}
Any advice or insight on what should be in the '??' for the assert statement or if this is the correct way to use Argument Captor?
Thank You
Edit:
@Test
public void testInsert() throws SQLException {
    User user = new User("testID");
    ArgumentCaptor<PreparedStatementCreator> captor = ArgumentCaptor.forClass(PreparedStatementCreator.class);
    PreparedStatement ps = mockConnection.prepareStatement("INSERT INTO user (id) VALUES ?");
    ps.setString(1, user.getId().trim());

    insert(user);

    verify(mockJdbcTemplate, times(1)).update(captor.capture());
    PreparedStatementCreator actual = captor.getValue();
    assertEquals(ps, actual.createPreparedStatement(mockConnection));
}
I like your approach of using ArgumentCaptors.
You are using the ArgumentCaptor correctly to capture the argument of the update method on the mocked JDBC template; however, you cannot extract the arguments used inside the PreparedStatementCreator, because that object is not a mock.
Conceptually, the difficulty in testing this part of your code comes from the fact that you don't control the creation of the PreparedStatementCreator. One possible solution is to take back control of how and when you create these objects, so that you can mock them in your tests.
Following a standard creational pattern, you could introduce a factory whose single responsibility is to create PreparedStatementCreator instances.
interface PreparedStatementCreatorFactory {
    PreparedStatementCreator newPreparedStatementCreator(Connection connection, String sql, User user);
}

public final class DefaultPreparedStatementCreatorFactory implements PreparedStatementCreatorFactory {
    @Override
    public PreparedStatementCreator newPreparedStatementCreator(Connection connection, String sql, User user) {
        // return a creator rather than the statement itself, so the declared
        // return type is honoured; the connection that JdbcTemplate passes to
        // createPreparedStatement is the one actually used
        return con -> {
            final PreparedStatement ps = con.prepareStatement(sql);
            ps.setString(1, user.getId().trim());
            return ps;
        };
    }
}
Then, in the class you are testing (which contains the JDBC mock), you can inject a mock of the PreparedStatementCreatorFactory. Then, instead of capturing the argument of the JDBC mock, you can capture the argument on the factory instead; and, of course, specify what the mocked factory returns.
PreparedStatementCreatorFactory factory = Mockito.mock(PreparedStatementCreatorFactory.class);
PreparedStatementCreator creator = Mockito.mock(PreparedStatementCreator.class);
// Mock the creator at your convenience.
when(factory.newPreparedStatementCreator(any(Connection.class), any(String.class), any(User.class))).thenReturn(creator);
...
User user = new User("testID");
ArgumentCaptor<Connection> connectionCaptor = ArgumentCaptor.forClass(Connection.class);
ArgumentCaptor<String> sqlCaptor = ArgumentCaptor.forClass(String.class);
ArgumentCaptor<User> userCaptor = ArgumentCaptor.forClass(User.class);

insert(user);

verify(factory, times(1)).newPreparedStatementCreator(connectionCaptor.capture(), sqlCaptor.capture(), userCaptor.capture());
assertEquals(user, userCaptor.getValue());
One drawback of this approach is that it adds a level of indirection and some complexity; the main advantage is, as we have seen, that it improves the separation of concerns in your design and ultimately the testability of your code.
I have a StoredProcedure which is compiled on initialization.
Occasionally the logic requires that I use a different stored procedure.
I have tried and failed to reset the name of the stored procedure. The data is still retrieved using the original stored procedure.
Is there a way to achieve this?
Here is a reduced version of the class showing initialization and attempted recompiling with the name of a different stored procedure:
import org.springframework.jdbc.object.StoredProcedure;

public class MyDAOImpl extends StoredProcedure implements MyDAO {

    @Autowired
    public MyDAOImpl(DataSource dataSource, String originalSPName) {
        super(dataSource, originalSPName); // invokes the StoredProcedure constructor
        compile();
    }

    public List<String> useOtherStoredProcedure() {
        super.setSql("otherSPName");
        compile();
        // Error: data is still retrieved with the original stored procedure name
        Map<String, Object> data = this.executeSP();
    }
}
Rather than org.springframework.jdbc.object.StoredProcedure, please use org.springframework.jdbc.core.simple.SimpleJdbcCall. The beauty of SimpleJdbcCall is that you can dynamically specify the schema, package, and stored procedure names. The respective code is below:
@Autowired
private JdbcTemplate jdbcTemplate;

public Date getSPData() {
    SimpleJdbcCall spCall = new SimpleJdbcCall(jdbcTemplate).withSchemaName("schema")
            .withCatalogName("catalog")
            .withProcedureName("proc")
            .withoutProcedureColumnMetaDataAccess()
            .useInParameterNames("ref_id")
            .declareParameters(
                    new SqlParameter("ref_id", Types.NUMERIC),
                    new SqlOutParameter("dt", Types.DATE));
    SqlParameterSource in = new MapSqlParameterSource()
            .addValue("ref_id", 12345);
    Map<String, Object> out = spCall.execute(in);
    Date date = (Date) out.get("dt");
    return date;
}
I believe this might be what you want:
public class MyDAOAdapter implements MyDAO {

    private volatile MyDAO currentDao;
    private final MyDAO myDAOImpl1;
    private final MyDAO myDAOImpl2;

    @Autowired
    public MyDAOAdapter(@Qualifier("myDAOImpl1") MyDAO myDAOImpl1, @Qualifier("myDAOImpl2") MyDAO myDAOImpl2) {
        this.myDAOImpl1 = myDAOImpl1;
        this.myDAOImpl2 = myDAOImpl2;
        currentDao = myDAOImpl1;
    }

    public void switchToFirstDao() {
        currentDao = myDAOImpl1;
    }

    public void switchToSecondDao() {
        currentDao = myDAOImpl2;
    }

    // do you really need this?
    public List<String> useOtherStoredProcedure() {
        return myDAOImpl2.executeSP();
    }

    // Delegate all the methods to the currently selected DAO
    public List<String> executeSP() {
        return currentDao.executeSP();
    }

    // implement all other methods by delegating calls to currentDao
}
This adapter would be autowired by all the client code, and the switch between the two StoredProcedure implementations would be hidden inside it. It isn't clear to me, though, whether you want this switch to be global for all code or only for some cases.
The solution I found for my own situation was to implement a third stored procedure on SQL Studio, which acts as a common accessor and routes the query to the right stored procedure based on a parameter.
CREATE PROCEDURE [dbo].[Router_SP]
    @spNumber AS INTEGER
AS
BEGIN
    IF (@spNumber = 1) EXECUTE originalSPName
    ELSE IF (@spNumber = 2) EXECUTE otherSPName
    ELSE RAISERROR('Unrecognised stored procedure', 16, 1) -- RETURN only takes an integer in T-SQL
END
Leaving this question open for now to see if a better solution comes up.
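On the Java side, the router can then be invoked with the selector as an ordinary IN parameter. A sketch using SimpleJdbcCall (the procedure and parameter names come from the T-SQL above; the wrapper class itself is an assumption):

```java
import java.util.Map;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

public class RouterSpClient {

    private final SimpleJdbcCall routerCall;

    public RouterSpClient(JdbcTemplate jdbcTemplate) {
        this.routerCall = new SimpleJdbcCall(jdbcTemplate).withProcedureName("Router_SP");
    }

    // spNumber = 1 routes to originalSPName, 2 to otherSPName (see the T-SQL above).
    public Map<String, Object> execute(int spNumber) {
        return routerCall.execute(new MapSqlParameterSource().addValue("spNumber", spNumber));
    }
}
```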
tl;dr: I have a method for creating a new database table on the fly, and I want to write a unit test for it. Unfortunately, the test runner does not properly roll back after the tests, and the table remains in the DB after they finish. What should I do?
Long story:
I am not very familiar with either Java Persistence or Spring, so if you find the current solution ugly (to me, it is rather ugly), please tell me how to improve it; I would very much appreciate your opinion.
I have a SysDatastoreService with the following implementation of addStaticDatastore method.
@Service
public class SysDatastoreServiceImpl implements SysDatastoreService {

    @Autowired
    private SysDatastoreRepository datastoreRepository;

    @Autowired
    private DataSource dataSource;

    @Override
    @Transactional
    public Optional<SysDatastore> addStaticDatastore(String name, String tableName, String ident, Long ord) {
        String createTableSql = PostgresTableSqlBuilder.createTableInPublicSchemaWithBigintPkAndFkId(
                tableName,
                SysObject.TABLE_NAME,
                Optional.of(SysObject.ID_COLUMN_NAME)).buildSql();
        Optional<SysDatastore> sysDatastore = Optional.empty();
        try (
                Connection connection = dataSource.getConnection();
                Statement statement = connection.createStatement()
        ) {
            connection.setAutoCommit(false);
            Savepoint beforeTableCreation = connection.setSavepoint();
            try {
                statement.execute(createTableSql);
                sysDatastore = Optional.ofNullable(
                        datastoreRepository.save(new SysDatastore(name, tableName, DatastoreType.STATIC, ident, ord)));
            } catch (SQLException e) {
                e.printStackTrace();
            }
            if (!sysDatastore.isPresent()) {
                connection.rollback(beforeTableCreation);
            } else {
                connection.commit();
            }
        } catch (SQLException e1) {
            e1.printStackTrace();
        }
        return sysDatastore;
    }
}
So, as you can see, I obtain a new connection from the DataSource and try to create the new table. On success, I create a new entry in SysDatastoreRepository; if that fails, I roll back the table creation.
There are some disadvantages to the current approach, one of them being that table creation and entry insertion operate on separate connections (am I right?).
But I have some problem while writing a unit test. This is what I tried:
@Transactional(propagation = Propagation.REQUIRED)
@RunWith(SpringJUnit4ClassRunner.class)
@TransactionConfiguration(transactionManager = "transactionManager", defaultRollback = true)
@ContextConfiguration(locations = "file:src/main/webapp/WEB-INF/rest-servlet.xml")
public class SysDatastoreServiceTest {

    @Autowired
    private SysDatastoreService sysDatastoreService;

    @Autowired
    private DataSource dataSource;

    @Test
    public void testAddStaticDatastore() throws Exception {
        Optional<SysDatastore> sysDatastore =
                sysDatastoreService.addStaticDatastore("New static datastore", "new_datastore_table",
                        "NEW_DATASTORE_TABLE", 42L);

        assertTrue(sysDatastore.isPresent());
        assertEquals("New static datastore", sysDatastore.get().getName());
        assertEquals("NEW_DATASTORE_TABLE", sysDatastore.get().getIdent());
        assertEquals("new_datastore_table", sysDatastore.get().getTableName());
        assertEquals(DatastoreType.STATIC, sysDatastore.get().getDynType());
        assertEquals(42L, sysDatastore.get().getOrd().longValue());

        assertTrue(dataSource.getConnection()
                .getMetaData()
                .getTables(null, null, sysDatastore.get().getTableName(), null)
                .next());
    }
}
This test seems pretty simple: I compare all the fields and then check the database for the new table.
However, the test fails when I run it twice or more. Looking at the database, I noticed that the table new_datastore_table remained in the schema. I guess it was not rolled back properly because of the hand-written transaction and raw SQL execution, but I am not sure.
Question: How should I write a test case for this method in a proper way? And, in case if the current approach is fundamentally wrong, how it should be changed?
Side notes: I use PostgreSQL database, and it cannot be replaced with non-relational database.
First, note that your service creates the table on its own connection obtained from the DataSource and explicitly commits it, so the test's transaction rollback cannot undo it (and in many databases DDL such as CREATE TABLE is not transactional at all, although PostgreSQL does support transactional DDL). If you want to clean your database, you must explicitly remove the table in an @After or @AfterClass method.
But do you really need to run the tests against the PostgreSQL database? Spring has great support for in-memory databases, and the default embedded database is HSQL, which has pretty good support for PostgreSQL syntax. Provided you have no complex statements, it could be enough, and it avoids cluttering the main database with (potentially destructive) unit tests.
You could create the database in a #BeforeClass method. Here is an oversimplified example:
private static DriverManagerDataSource dataSource;

@BeforeClass
public static void setupClass() throws Exception {
    ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
    populator.addScript(new ClassPathResource("path/to/package/defaults.sql"));
    dataSource = new DriverManagerDataSource();
    dataSource.setUrl("jdbc:hsqldb:mem:pgtest;sql.syntax_pgs=true");
    dataSource.setUsername("SA");
    Connection con = dataSource.getConnection();
    assertNotNull(con);
    populator.populate(con);
    con.close();
}
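If you do keep the test against the real PostgreSQL database, the cleanup suggested above could look something like this (a sketch: it assumes the dataSource field from the test class, and the table name matches the one the test creates):

```java
import java.sql.Connection;
import java.sql.Statement;
import org.junit.After;

// inside SysDatastoreServiceTest
@After
public void dropCreatedTable() throws Exception {
    // The service commits the CREATE TABLE on its own connection, so the test
    // transaction cannot undo it; drop the table explicitly instead.
    try (Connection con = dataSource.getConnection();
         Statement st = con.createStatement()) {
        st.execute("DROP TABLE IF EXISTS new_datastore_table");
    }
}
```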
I have two persistence providers I like to use - my own JDBC approach (DBAccess), and jooq (DSLContext). A DSLContext and my DBAccess both can be created using a Connection and some configuration details. I'm trying to convert a project to use Guice, and would like to create a DAO that has the ability to use both in one transaction, e.g.
class ThingDAO {

    final DBAccess dbAccess;
    final DSLContext dslContext;

    @Inject
    ThingDAO(DBAccess dbAccess, DSLContext dslContext) {
        this.dbAccess = dbAccess;
        this.dslContext = dslContext;
    }

    Thing getThingForId(int id) {
        return dslContext.select().from(OBJECT)....
    }

    void save(Thing t) {
        dbAccess.save(t);
    }

    Stuff joinThingToStuffTableAndGetStuff(Thing t) {
        // the Stuff I get may depend on what has been saved so far, so I need
        // the dslContext and dbAccess operating on the same connection
        dslContext....
    }
}
which I could then use along the lines of
@Transactional
doTheThings(int id, int[] data) {
    ThingDAO dao = thingDaoProvider.get();
    Thing t = dao.getThingForId(id);
    t.doTheThings(data);
    dao.save(t);
    Stuff s = dao.joinThingToStuffTableAndGetStuff(t);
    ....
}
I've been looking at this guice extension for jooq, which makes me think I want something along the lines of a UnitOfWork that grabs a connection from my datasource in order to give to a DBAccess and a DSLContext, but I'm unsure if that's right or how to proceed even if it is.
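For what it's worth, one way the shared-connection idea is often sketched is a Guice module that builds both providers from the same injected Connection. Everything here is an assumption rather than the extension's actual API: it presumes Connection is bound elsewhere (e.g. by a unit-of-work scope that opens one connection per @Transactional call) and that DBAccess has a constructor taking a raw connection.

```java
import com.google.inject.AbstractModule;
import com.google.inject.Provides;
import java.sql.Connection;
import org.jooq.DSLContext;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;

public class PersistenceModule extends AbstractModule {

    @Provides
    DSLContext provideDslContext(Connection connection) {
        // jOOQ can wrap an existing connection directly.
        return DSL.using(connection, SQLDialect.POSTGRES);
    }

    @Provides
    DBAccess provideDbAccess(Connection connection) {
        // Hypothetical DBAccess constructor taking a raw connection.
        return new DBAccess(connection);
    }
}
```

Because both providers receive the same injected Connection, a ThingDAO built inside one unit of work gets a DBAccess and a DSLContext sharing that connection, which is the property joinThingToStuffTableAndGetStuff needs.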