Context
I found this question here but my problem is different.
So we are using the Katharsis controller and Spring Data REST.
We only have one controller for the entire application; requests are then dispatched to the Spring Data REST repository classes.
We want to use Spring REST Docs to generate documentation, which requires us to write unit tests with MockMvc.
But when using MockMvc, it starts up the container and requires the datasources to be set up.
If we use standaloneSetup() and pass the mocked repository class, MockMvc won't load the Katharsis controller, so the request never reaches that repository.
I understand that we could create an in-memory database, but our project is big and a huge number of tables would need to be created; we want to avoid that, since these tests are only for documentation purposes.
Question
Is there any way to achieve this and mock only the target repository class?
Note
By repository I mean a CrudRepository interface in Spring Data REST.
As Andy Wilkinson suggested, you may consider creating a unit test where you wire the beans together yourself and use the MockMvc standalone setup.
If you want to create an integration test and create a Spring context anyway, there is a way to fake a Spring bean by using the @Primary, @ActiveProfiles and @Profile annotations. I wrote a blog post with a GitHub example showing how to do it. You just need to combine this approach with the WebApplicationContext-based MockMvc setup. It works without problems; I have written such tests in the past.
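The profile-based fake described above can be sketched like this (the ProductRepository name is hypothetical, and Mockito is assumed for creating the fake):

```java
// Hypothetical test-only configuration: under the "test" profile this
// @Primary bean shadows the real ProductRepository when the context starts.
@Configuration
@Profile("test")
public class FakeRepositoryConfig {

    @Bean
    @Primary
    public ProductRepository productRepository() {
        return Mockito.mock(ProductRepository.class);
    }
}
```

The documentation test then activates the profile with @ActiveProfiles("test") and builds MockMvc via MockMvcBuilders.webAppContextSetup(context), so the Katharsis controller is loaded but the repository behind it is the mock.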
Related
We have a large Spring Boot application and a number of integration tests which are annotated with @SpringBootTest and involve a number of services.
Some of these integration tests use @Sql to set up an in-memory DB (as in the real world) and involve services which read values from the database at service start time.
Based on our analysis/reading, we have found that the order of initialisation is:
ServletTestExecutionListener – configures Servlet API mocks for a WebApplicationContext
DirtiesContextBeforeModesTestExecutionListener – handles the @DirtiesContext annotation for "before" modes
DependencyInjectionTestExecutionListener – provides dependency injection for the test instance
DirtiesContextTestExecutionListener – handles the @DirtiesContext annotation for "after" modes
TransactionalTestExecutionListener – provides transactional test execution with default rollback semantics
SqlScriptsTestExecutionListener – runs SQL scripts configured using the @Sql annotation
(Hint: The above list is copied from https://www.baeldung.com/spring-testexecutionlistener).
This means, in consequence, that it is not possible to write an integration test with @SpringBootTest in which a service reads values from the database during its initialisation. It also means it's not possible to use @PostConstruct as an alternative.
The question is: Is there an easy way to change that order?
I already tried to register the test execution listeners in a different order like this:

@TestExecutionListeners({
    SqlScriptsTestExecutionListener.class,
    ServletTestExecutionListener.class,
    DirtiesContextBeforeModesTestExecutionListener.class,
    DependencyInjectionTestExecutionListener.class,
    DirtiesContextTestExecutionListener.class,
    TransactionalTestExecutionListener.class
})
but it does not execute my scripts first (maybe I'm doing something wrong).
Apart from that, if I start the Spring Boot application with a service as described above (reading values from the database), it works perfectly, which puzzles me a bit. Why is there such a big difference between an integration test and the real application?
Maybe I misunderstand something here, but it looks like the initialisation order in a Spring test (@SpringBootTest) is different than in the real application?
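One way around the listener ordering, sketched below with assumed script and bean names, is to run the SQL during context startup itself (via a DataSourceInitializer bean) rather than through @Sql, so the data is in place before any service bean that reads the database while being initialised:

```java
// Sketch (script name and bean names are assumptions): populate the in-memory
// database while the ApplicationContext is starting, before the service beans
// that read from it are created.
@TestConfiguration
public class TestDataConfig {

    @Bean
    public DataSourceInitializer testDataInitializer(DataSource dataSource) {
        ResourceDatabasePopulator populator =
                new ResourceDatabasePopulator(new ClassPathResource("test-data.sql"));
        DataSourceInitializer initializer = new DataSourceInitializer();
        initializer.setDataSource(dataSource);
        initializer.setDatabasePopulator(populator);
        return initializer;
    }
}
```

A service that must see this data at construction time can additionally be annotated with @DependsOn("testDataInitializer") to guarantee the initializer bean is created first.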
This question is related to Using JPA repository access in constructor of Service class
I'm trying to write an integration test for my Spring Boot microservice that interacts with another service inside the product ecosystem.
Since this kind of testing is considered functional/integration testing (depending on the nomenclature you use), it is usually done on some development environment.
However, I wanted to test a basic interaction between my service and a stub/dummy app which are connected via RPC (so not exactly a typical TestRestTemplate test).
I know that there is a way to embed a service while booting up the Spring context, but I have never done it myself.
Does anyone have any experience with the above, or maybe a few helpful links I can explore?
I have used WireMock in tests to mock services external to what I want to test that communicate over HTTP.
My test class annotated with @SpringBootTest is also annotated with @ContextConfiguration. In the classes attribute of @ContextConfiguration I explicitly specify the configuration classes required to set up the Spring context for the test in question. Here I can also include additional configuration classes in which I create beans only used in the test. In test configuration classes I can also override beans for the purpose of the test, create mock beans, etc.
Note that Spring Boot 2.1 and later disables bean overriding by default. It can be enabled by setting the following property to true:
spring.main.allow-bean-definition-overriding=true
To set the property for a single test, use the @TestPropertySource annotation like this:

@TestPropertySource(properties = {
    "spring.main.allow-bean-definition-overriding=true"
})
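Putting the pieces together, a test-only bean override might look like this (the RemoteClient type is hypothetical, and Mockito is assumed for the stub):

```java
// Hypothetical test configuration that redefines the production "remoteClient"
// bean; with spring.main.allow-bean-definition-overriding=true this definition
// wins over the real one for the duration of the test context.
@Configuration
public class StubClientConfig {

    @Bean
    public RemoteClient remoteClient() {
        return Mockito.mock(RemoteClient.class);
    }
}
```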
I started using Ehcache in my Spring Boot project. How can I prove that Ehcache is being used instead of the ConcurrentHashMap-based cache that Spring Boot provides by default? And how can I prove it automatically in the integration tests?
If I understand the question correctly, you are trying to validate your spring configuration.
A one time thing regarding Ehcache is to check logs - you should see the information about it being started and configured.
For automated testing, the easiest way is going to be to have the test configured to be injected with the cacheManager bean and then making sure it is of the right type.
This should give you the confidence that your setup is correct.
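For example, such a check could look like the sketch below (assuming Ehcache 2.x, where Spring's cache abstraction wraps it in an EhCacheCacheManager; with Ehcache 3 via JSR-107 you would check for JCacheCacheManager instead):

```java
// Sketch: fail the test if the cache abstraction silently falls back to the
// default ConcurrentMapCacheManager instead of the Ehcache-backed manager.
@SpringBootTest
public class CacheConfigurationTest {

    @Autowired
    private CacheManager cacheManager;

    @Test
    public void cacheManagerIsBackedByEhcache() {
        assertThat(cacheManager).isInstanceOf(EhCacheCacheManager.class);
    }
}
```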
Ok, I come from Rails and am having a bit of a hard time trying to get this working.
Right now, this is what I understand about the Spring framework (and please correct me if I'm wrong):
I am using Tomcat. So what it does, basically, is read web.xml and check any configuration files referenced from there in order to initialize and autowire beans, etc.
In my example, web.xml sets the context config location to application-config.xml.
It turns out that application-config.xml has other config files that will take care of the following:
Hibernate:
<util:properties id="hibernatePropertiesConfigurer" location="classpath:hibernate.properties"/>
JSon Converter:
<import resource="classpath:json-converter.xml"/>
Managers (@Component) scanner:
<import resource="classpath:scanner-config.xml"/>
Among others. In other words, these work just fine: the web server comes up, managers (@Components) are @Autowired, @Controllers can call them, persist objects in the database, etc.
Now, I want to test, i.e. run JUnit tests. I found a lot of examples of how to mock these layers (managers, controllers), but I want to try the real thing:
The test will instantiate the @Controller
The test will post to the @Controller object
The @Controller will have a real @Autowired (not mocked) @Component or @Bean
The @Component will persist an @Entity
And, most important of all, the database should see the changes. (I am using MySQL)
The test will then use an @Autowired instance of the @Component and query the database to confirm the persistence occurred.
Is this possible, at all, with Spring, JUnit testing?
If you want to verify the behavior of the entire stack like that, it's no longer a unit test, but an integration test.
If you're using Maven, you can use something like the Maven Jetty plugin to deploy your project into an embedded container during the pre-integration-test phase. Then in your tests, make the HTTP calls to the server at localhost, and verify you get the expected responses. Then, in the post-integration-test phase, shut down the Jetty server.
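With the Jetty approach, the plugin binding might look roughly like this in the pom.xml (plugin version and webapp configuration omitted; `start` and `stop` are the jetty-maven-plugin goal names):

```xml
<!-- Sketch: start an embedded Jetty before the integration tests and stop it
     afterwards; the tests themselves run in the integration-test phase
     (e.g. via the maven-failsafe-plugin). -->
<plugin>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>start-jetty</id>
      <phase>pre-integration-test</phase>
      <goals><goal>start</goal></goals>
    </execution>
    <execution>
      <id>stop-jetty</id>
      <phase>post-integration-test</phase>
      <goals><goal>stop</goal></goals>
    </execution>
  </executions>
</plugin>
```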
I have a requirement where I need to configure a Spring-based application to work with two databases: one that holds the live data, and another that is used as a data warehouse and contains archived data (which has exactly the same structure as the live DB).
To keep it simple, assume there is a request to search for a product. The application should search for the product details in the live database and, if not found, check the archive database.
If I need to configure such a setup, do I still need to configure two datasources, and would the search code have to query the first datasource to check the live database and, if nothing is found, run another query against the archive database?
The above is probably doable, but I am wondering whether there is a better way of doing this. For example, is it possible for the application to work with a single datasource even though behind the scenes it actually works with two databases?
The application is based on Spring, JPA/Hibernate, SOAP and a MySQL database, with JBoss 7 as the application server.
Any examples showing how this is configured using Spring and Jboss would be very useful.
Thanks
Spring has exactly what you want – the AbstractRoutingDataSource. See this blog post on how to use it. In your case you need to switch the datasource during a single request, so you'll need two transactions, switching the datasource between them by changing the datasource indicator on a ThreadLocal:
For these DAOs, demarcate the wrapping Service-layer either with distinct packages, class names, or method names
Indicate to Spring that the service-layer method calls should run in their own transactional contexts by annotating them with @Transactional(propagation = Propagation.REQUIRES_NEW)
Create an Aspect (using the AspectJ annotation @Aspect) to fire around the service-layer method calls (using @Around) to set the ThreadLocal value before the method call and unset it afterwards
In the @Controller, simply call the service-layer methods. The Aspect will take care of setting the value that indicates which datasource to use, and the AbstractRoutingDataSource will use that datasource in the context of each transaction.
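The ThreadLocal indicator used in these steps can be sketched as plain Java (names are illustrative); in the real setup, a subclass of AbstractRoutingDataSource would return DbContextHolder.get() from its determineCurrentLookupKey() method:

```java
// Illustrative ThreadLocal holder for the current datasource key. The Aspect
// sets it before a service-layer call and clears it afterwards; the routing
// datasource reads it when each new transaction obtains a connection.
public final class DbContextHolder {

    public enum DbType { LIVE, ARCHIVE }

    private static final ThreadLocal<DbType> CONTEXT =
            ThreadLocal.withInitial(() -> DbType.LIVE); // default to the live DB

    private DbContextHolder() {
    }

    public static void set(DbType type) {
        CONTEXT.set(type);
    }

    public static DbType get() {
        return CONTEXT.get();
    }

    public static void clear() {
        CONTEXT.remove(); // falls back to LIVE on the next get()
    }
}
```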