How to call a DataSource in JUnit - java

I have a Maven WAR package with JSF pages. Usually I use
@Resource(name = "jdbc/Oracle")
private DataSource ds;
to access the database when I deploy the WAR package on a GlassFish server. But for JUnit tests, when I build the package on my laptop with NetBeans, I cannot use this data source.
How can I solve this problem? I want to run the JUnit tests against database tables right after I build the package, but I don't have a data source.
What are the possible solutions?

Do you actually want to run your unit tests against the database? Personally I would try to avoid this, since it usually ties the test too closely to the state of the database and often prevents you from actually testing the "unit" and all the possible states you might like to handle. It will also probably make your unit tests take some time to run, which is not ideal.
An alternative would be to create a mock DataSource, using for example EasyMock or Mockito. Or you could create your own mock implementation of the DataSource interface, if you know you would like to define some common behaviour for DataSources across many tests.
If you really want to use the database, you would have to look at manually instantiating whatever implementation of DataSource you are using (e.g. OracleDataSource) and then using this in your class.
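For illustration, a minimal sketch of that manual approach, assuming the Oracle JDBC driver (ojdbc) is on the test classpath; the URL and credentials below are placeholders:
import java.sql.SQLException;
import javax.sql.DataSource;
import oracle.jdbc.pool.OracleDataSource;

public class TestDataSourceFactory {

    public static DataSource createOracleTestDataSource() throws SQLException {
        OracleDataSource ds = new OracleDataSource();
        ds.setURL("jdbc:oracle:thin:@localhost:1521:XE"); // placeholder URL
        ds.setUser("test_user");                          // placeholder credentials
        ds.setPassword("test_password");
        return ds;
    }
}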
In either case, you will probably have to switch to using constructor or method injection to make it a bit easier to set the DataSource on the instance you are testing. (Otherwise you will have to use reflection to set the private variable.)
For example, your class might look like this:
public class DatabaseOperations {

    private DataSource dataSource;

    @Resource(name = "jdbc/Oracle")
    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
    }
}
And then your test might look like this:
public class DatabaseOperationsTest {

    @Test
    public void testSomeOperation() {
        DatabaseOperations databaseOperations = new DatabaseOperations();
        databaseOperations.setDataSource(new MockDataSource());
    }
}
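If you would rather not write a MockDataSource by hand, here is a minimal sketch of the same test using Mockito; the mocked Connection is only there so that code calling getConnection() never touches a real database:
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.sql.Connection;
import javax.sql.DataSource;
import org.junit.Test;

public class DatabaseOperationsMockitoTest {

    @Test
    public void testSomeOperation() throws Exception {
        DataSource dataSource = mock(DataSource.class);
        Connection connection = mock(Connection.class);
        when(dataSource.getConnection()).thenReturn(connection);

        DatabaseOperations databaseOperations = new DatabaseOperations();
        databaseOperations.setDataSource(dataSource);
        // exercise the methods under test and verify the interactions you expect
    }
}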

If you really do need to run tests using an injected DataSource, you could consider using Arquillian, which will create a deployment unit for you and deploy it to either an embedded or remote GlassFish container, together with a DataSource configured specifically for testing if you wish. They have a guide for this specific scenario.
The advantage is that you will have a full-blown container with CDI. You control what gets packaged, so you can provide test stubs for CDI classes. You can also control the deployment configuration (test vs. production configuration). It is non-invasive.


@BeforeAll JUnit/spring-boot-test alternative that runs when the application context starts

I'm writing a @Repository/@Service integration test that leverages an embedded database. In my test class, I would like to preload my database with some data.
I'm currently using @BeforeEach to load in my sample data; however, this code is run before each test in my class.
Is there any way that I can load in my test data after the Spring application context has loaded, but before any test has been run?
My current approach:
@BeforeEach
public void before() {
    repository.save(...); // prepopulates the repository with sample data
}

@Test
public void testService() {
    service.get(...); // fetches an existing record
}

@Test
public void deleteById() {
    service.delete(...); // deletes an existing record
}
However... with this, I am required to flush out the records after every test. Otherwise any unique constraints can easily be violated.
Rather than using @BeforeEach, which is required to run before every test... is it possible to load this in a @BeforeAll kind of fashion that happens after the Spring application context has been loaded?
Is there any way that I can load in my test data after Spring application context has loaded
Basically yes, I think you can do that:
The idea is to load the SQL data when the application context is started or in the process of being started.
For example, Spring Boot's integration with Flyway works this way (the Flyway bean is created and runs its migrations on startup). So, in theory, you could simply use Flyway with test migrations that contain all the SQL scripts needed to generate the test data.
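A hedged sketch of that Flyway idea, defined in test code only; the classpath:db/testdata location and the migration names are assumptions, and if you rely on Spring Boot's own Flyway auto-configuration you would adapt its properties instead of declaring a second bean:
import javax.sql.DataSource;
import org.flywaydb.core.Flyway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TestDataFlywayConfig {

    // Runs the regular migrations plus test-only scripts (e.g. V900__test_data.sql)
    // placed under src/test/resources/db/testdata.
    @Bean(initMethod = "migrate")
    public Flyway testDataFlyway(DataSource dataSource) {
        return Flyway.configure()
                .dataSource(dataSource)
                .locations("classpath:db/migration", "classpath:db/testdata")
                .load();
    }
}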
How can you do this technically?
Here is one way:
Create a special bean (just like the way it works with Flyway) that depends on your repository and saves the data in a @PostConstruct method:
@Component
public class SqlGenerationBean {

    @Autowired
    private MyRepository repo;

    @PostConstruct
    public void init() {
        repo.save(); // save the test data here
    }
}
Another way of doing this is to create a listener that is invoked once the application context has started and, again, calls the same repo.save().
In both cases the bean/listener code should not be accessible from production code (it is only for tests), so put it somewhere under src/test/java, for example.
Now once the application context is started you can use a neat trick:
Mark your tests with the @Transactional annotation. Spring will wrap each test in an artificial transaction that is rolled back automatically (even if the test succeeds), so all the data you modify during the test is rolled back, and before each test you get the same state (identical to the state of the database after the application context starts). Of course, if you use DDL in a test, some databases cannot make it part of a transaction, but that really depends on the database.
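A minimal sketch of that rollback trick; MyService and its delete method are placeholders for whatever you are actually testing:
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.transaction.annotation.Transactional;

@SpringBootTest
@Transactional
class ServiceRollbackTest {

    @Autowired
    private MyService service; // placeholder for the service under test

    @Test
    void deleteById() {
        service.delete(1L); // any changes are rolled back when the test finishes
    }
}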
Another interesting point here is that the application context can be cached even between the test cases (created only once), so keep this in mind.
In this case I would just create a constructor for the test class. It will be triggered before everything else.
@BeforeEach runs before each test, but after all initialisations.
You can also just use Mockito and mock the result, without needing to clean up or overcomplicate things.
Just add the following snippet to your code. This is how you can detect that the Spring application has really started.
@Configuration
public class AppConfig implements ApplicationListener<ApplicationReadyEvent> {

    /**
     * Indicates in the logs when the application has actually started and everything is loaded.
     */
    @Override
    public void onApplicationEvent(ApplicationReadyEvent event) {
        ApplicationContext context = event.getApplicationContext();
        Environment env = context.getEnvironment();
        // do what you want on application start
    }
}
P.S. For database manipulation in tests, @Sql is the best candidate, as was mentioned in a comment.
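For completeness, a minimal sketch of the @Sql approach; test-data.sql is a hypothetical script under src/test/resources:
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.jdbc.Sql;

@SpringBootTest
class SqlScriptedTest {

    @Test
    @Sql("/test-data.sql") // executed against the test DataSource before this test method runs
    void readsPreloadedRows() {
        // query the repository and assert against the rows inserted by the script
    }
}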

How do I set up a second remote database for integration testing?

I need advice on what method will suit better for my case.
I have a Java application with Spring Boot, and right now, for the testing part, I use a localhost PostgreSQL database. The setup is quite simple; I just have the datasource URL/username/password and port set up in the application.properties file.
Now, I need to use a remote PostgreSQL database for a particular integration test class. So, as far as I know, I can do this in at least two ways:
I can add another configuration for the remote database in application.properties and use it in the test class with environment.getProperty().
Or, I can make a bean to use those properties:
@Bean
public DataSource dataSource() {
    DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
    dataSourceBuilder.url(dbUrl);
    dataSourceBuilder.username(username);
    dataSourceBuilder.password(password);
    return dataSourceBuilder.build();
}
And the things are not as simple as they may seem.
I have a class for tests configuration, ClassA.
I have a second class with actual tests, let's call it ClassB.
ClassB extends ClassA, and ClassA starts a JAR file, which is another Java application exposing some REST APIs that will be tested by the tests in ClassB.
Right now I use a local database, but I will want to use the remote one in the future.
I managed to download the JAR file from Artifactory as a Maven dependency. In the configuration class I locate the JAR file, get its directory, and run it with a command array like {"java", "-jar", directory.getCanonicalPath()}, using a ProcessStreamer to handle the process.
After I check that the JAR file is up and running, the tests from ClassB are triggered, and they perform CRUD operations against the freshly started application by making REST calls to its APIs.
Any suggestions on how I can set up the second remote database?
Thank you!
Since you are saying you want to use both databases at once, I would create a separate data source for the remote database:
@Configuration
public class MultipleDBConfig {

    @Bean(name = "dbRemote")
    @ConfigurationProperties(prefix = "spring.dbRemote")
    public DataSource dbRemoteDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "dbRemoteJdbcTemplate")
    public JdbcTemplate jdbcTemplate(@Qualifier("dbRemote") DataSource dbRemote) {
        return new JdbcTemplate(dbRemote);
    }
}
And then use them in your DAO/services:
@Autowired
@Qualifier("dbRemoteJdbcTemplate")
private JdbcTemplate dbRemoteTemplate;
If you can't use JdbcTemplate directly for that test case, then I would suggest using Spring's RoutingDataSource (AbstractRoutingDataSource).
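If you go that route, a minimal sketch of AbstractRoutingDataSource, assuming hypothetical "local" and "remote" lookup keys:
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

public class LocalRemoteRoutingDataSource extends AbstractRoutingDataSource {

    // Thread-local switch deciding which target DataSource the current test uses.
    private static final ThreadLocal<String> CURRENT = ThreadLocal.withInitial(() -> "local");

    public static void use(String key) {
        CURRENT.set(key);
    }

    @Override
    protected Object determineCurrentLookupKey() {
        return CURRENT.get();
    }
}
Register it as a bean and call setTargetDataSources(...) with a map from "local" and "remote" to the two actual DataSource beans; a test can then switch databases with LocalRemoteRoutingDataSource.use("remote").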

Does Spring provide stub implementations of JpaRepositories?

I am trying to unit test my service classes, which look similar to this:
@Service
public class SomeQueryService {

    private final SomeRepository repository;

    public SomeQueryService(SomeRepository repository) {
        this.repository = repository;
    }

    public void doSomething() {
        // code doing some magic
    }
}
SomeRepository is a simple repository interface extending the JpaRepository interface.
What I want to do is unit test this service to verify whether it is working properly.
I do not want to use Mockito to mock the repository behaviour; instead, I want an in-memory implementation (backed by a list or map) that imitates database behaviour.
Does Spring provide such fake implementations?
I want to avoid making Stub Implementation of such repository by myself as I will be using such tests in many other places.
RealLifeDeveloper has created an MIT-licensed helper class to do just what you want: implement the repository interface with a plain old Java object that just wraps a Collection. It is called "InMemoryJpaRepository". You will still have to implement some logic yourself¹, though it should be easy if your queries are not too complicated.
An article explaining how to do this, with an example: https://reallifedeveloper.com/creating-in-memory-versions-of-spring-data-jpa-repositories-for-testing/
The repository (which includes other stuff, too) is on GitHub: https://github.com/reallifedeveloper/rld-build-tool
The specific helper files for creating the in-memory DB are found at https://github.com/reallifedeveloper/rld-build-tools/tree/master/src/main/java/com/reallifedeveloper/tools/test/database/inmemory if you don't want the whole repo.
¹ The rule "Don't put logic in tests" exists for a reason and is clearly violated here. However, the well-known and widely used alternatives mentioned by the other answers, H2 testing and extensive mocking, have their drawbacks too.
The type of testing you are referring to is called "integration testing" or "end-to-end testing", since it tests the whole application or a big chunk of it, as opposed to unit tests, which test only one method.
https://www.guru99.com/unit-test-vs-integration-test.html
You should not unit test your repositories, since they are already well tested by the Spring team.
Solution:
You can create a test that starts the whole Spring container using Spring Boot:
Just create a class in your test folder and annotate it with:
@RunWith(SpringRunner.class)
@SpringBootTest
public class MyTestClass {

    @Test
    public void test() {
    }
}
You can then configure an embedded database using H2 so that your test does not use the production database; just follow the Spring Boot database initialization how-to:
https://docs.spring.io/spring-boot/docs/current/reference/html/howto-database-initialization.html
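A hedged sketch of an explicit embedded H2 DataSource for tests (Spring Boot will usually auto-configure one if H2 is simply on the test classpath; schema.sql is a hypothetical script on the test classpath):
import javax.sql.DataSource;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@TestConfiguration
public class EmbeddedDataSourceConfig {

    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .addScript("schema.sql") // hypothetical schema script
                .build();
    }
}
Pull it into a test with @Import(EmbeddedDataSourceConfig.class) on the test class.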
PS. You can also create a specific profile using the following annotation on your test class:
#ActiveProfiles("test")
No, Spring Data does not provide a fake implementation of that interface.
You'll have to create it on your own or use a mocking framework like Mockito.
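For reference, a minimal sketch of the Mockito route with the SomeQueryService from the question; the verified findAll() call is only an assumed example, since the real logic of doSomething() is not shown:
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.jupiter.api.Test;

public class SomeQueryServiceTest {

    @Test
    public void doSomethingWorksWithoutARealDatabase() {
        SomeRepository repository = mock(SomeRepository.class);
        SomeQueryService service = new SomeQueryService(repository);

        service.doSomething();

        // verify whichever repository interaction doSomething() is expected to perform
        verify(repository).findAll(); // assumed interaction; adjust to the real logic
    }
}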

How to test REST APIs that rely on persistence with Arquillian

I would like to test a class that provides a REST endpoint via JAX-RS. This class depends on a JPA EntityManager and thus on a database, which needs to be populated prior to test execution. I saw solutions for database population like DbUnit, but I want to populate the data directly from my test class (or delegate it via the object mother pattern). But when testing REST endpoints I need to use the annotation option @Deployment(testable = false), which prevents me from injecting the EntityManager into my test class.
So how can I solve this situation?
Or are there any better best practices? (maybe mocking, but that's also not possible for black box tests)
You could create a bean to generate your test data:
@Startup
@Singleton
public class TestDataGenerator {

    @PersistenceContext
    private EntityManager em;

    @PostConstruct
    private void generateTestData() {
        // Generate your test data
    }
}
The TestDataGenerator class defined above is annotated with @Singleton (ensuring there will be only one instance of the class) and @Startup (for eager initialization during the application startup sequence).
Add the TestDataGenerator class to your Arquillian deployment:
@RunWith(Arquillian.class)
public class MyArquillianTest {

    private Client client = ClientBuilder.newClient();

    @Deployment
    @RunAsClient
    public static WebArchive createDeployment() {
        return ShrinkWrap.create(WebArchive.class)
                .addClasses(TestDataGenerator.class, ...)
                .addAsResource("test-persistence.xml", "META-INF/persistence.xml")
                .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Test
    public void testAPI(@ArquillianResource URL deploymentUrl) throws Exception {
        // Test your REST service
        WebTarget target = client.target(deploymentUrl.toURI()).path("api");
    }
}
Observe that @RunAsClient is equivalent to @Deployment(testable = false).
The @ArquillianResource annotation allows you to inject the URL to test your web application.
For tests, I usually try and separate black box and unit testing completely (I suppose it's a preference on how you do it).
For example, my REST API could rely on whatever it wants, but usually it doesn't do much besides calling my database layer or some sort of facade that accesses my DB layer. The objects are injected, yes, but usually I make the fields package-private, which means you can set them from the same package (which works with JUnit as well, since tests can live in the same package).
For example:
public class Facade1 {

    @Inject
    Facade facade;

    public void doSomething() { ... }
}
This class represents my REST API. I could test doSomething by simply setting a mock object as the facade. Mind you, this is quite a useless test, but you get the idea. Unit tests should happen in isolation with as much mocking as possible.
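A minimal sketch of that package-private injection idea, using Mockito for the facade; the verified call is an assumed example of what doSomething() might delegate to:
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;

// Lives in the same package as Facade1, so the package-private field is reachable.
public class Facade1Test {

    @Test
    public void doSomethingDelegatesToFacade() {
        Facade1 api = new Facade1();
        api.facade = mock(Facade.class);

        api.doSomething();

        verify(api.facade).doSomething(); // assumed delegation; adjust to the real Facade method
    }
}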
Now, for testing the actual REST API I usually resort to a Python integration tester. Python has nice HTTP libraries that allow you to make requests very easily.
For that, I set up a staging environment for my REST server. The environment is a live-like representation for testing. Everything there needs to work and needs to be on the same version as the production deployment.
I then use Python to poke my REST API and verify the responses. Since I've set up my staging environment, I have complete control over the database content. Therefore it is easy for me to test that all responses are correct.
My typical process then is:
Compile
Build
Deploy
Integration test
I hope that helps. If you want clearer examples, you might want to post a bit more code, as it's a bit hard for me to imagine what exactly you'd like to do :)

How to test database calls with TestNG in Spring Boot

I have created a small web application in Spring Boot. I am new to TestNG. I am trying to test my services with TestNG; the services call DAOs for database operations. I want to do this using the in-memory database HSQLDB.
Following is my UserService:
@Service
class UserServiceImpl implements UserService {

    @Autowired
    private UserDao userDao;

    public void save(User user) {
        userDao.save(user);
    }

    public User update(User user) {
        return userDao.update(user);
    }
}
Following is my UserTest class
@Test
class UserTest {
    // ?
}
What is a good way to use HSQLDB to test the save and update methods in UserService using TestNG with a DataProvider? Please let me know if you need more information regarding the question ;)
Your response will be greatly appreciated!!
If you just want to test whether the DAO is called correctly, mock it with a mocking framework (I'd choose Mockito) and just verify that the correct methods were called by your service. This is more "unit"-like, since your tests reflect the clear separation of DAO and service.
If you are interested in real DB communication to create/load instances and so on, you can use an in-memory database like H2 and let your DAO run against that database. This becomes more of an integration test, but can be useful nonetheless.
Either way, you would set up a test application context that takes care of the data sources and mocks.
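A minimal sketch of the mocking option with TestNG and Mockito; UserDao is assumed from the question, and ReflectionTestUtils is used only because the service shown relies on field injection:
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.springframework.test.util.ReflectionTestUtils;
import org.testng.annotations.Test;

public class UserServiceMockTest {

    @Test
    public void saveDelegatesToDao() {
        UserDao userDao = mock(UserDao.class); // assumed DAO interface from the question
        UserServiceImpl service = new UserServiceImpl();
        ReflectionTestUtils.setField(service, "userDao", userDao);

        User user = new User();
        service.save(user);

        verify(userDao).save(user); // the service should simply delegate to the DAO
    }
}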
Acolyte provides a JDBC driver & tools, designed for such purposes (mock up, testing, ...): https://github.com/cchantep/acolyte
It's already used in several open source projects (Anorm, Youtube Vitess, ...), either in vanilla Java, or using its Scala DSL:
handler = handleStatement.withQueryDetection(...).
    withQueryHandler(/* which result for which query */).
    withUpdateHandler(/* which result for which update */);

// Register prepared handler with expected ID 'my-unique-id'
acolyte.Driver.register("my-unique-id", handler);

// then ...
Connection con = DriverManager.getConnection(jdbcUrl);
// ... Connection |con| is managed through |handler|
Note: I am the author of Acolyte.
