Ok, I come from Rails and am having a bit of a hard time trying to get this working.
Right now, this is what I understand about the Spring framework (and please correct me if I'm wrong):
I am using Tomcat. So what Spring does, basically, is read web.xml and follow any configuration files referenced from there in order to initialize the context, autowire beans, etc.
In my example, web.xml sets the context config location to application-config.xml.
It turns out that application-config.xml imports other config files that take care of the following:
Hibernate:
<util:properties id="hibernatePropertiesConfigurer" location="classpath:hibernate.properties"/>
JSon Converter:
<import resource="classpath:json-converter.xml"/>
Managers (@Component) Scanner:
<import resource="classpath:scanner-config.xml"/>
Among others. In other words, these work just fine: the web server starts up, managers (@Components) are @Autowired, @Controllers can call them, persist objects in the database, etc.
Now, I want to test, i.e. run JUnit tests. I found a lot of examples of how to mock these layers (managers, controllers), but I want to try the real thing, roughly as sketched below:
The test will instantiate the @Controller
The test will post an object to the @Controller
The @Controller will have a real @Autowired (not mocked) @Component or @Bean
The @Component will persist an @Entity
And, most important of all, the database should see the changes (I am using MySQL)
The test will then use an @Autowired instance of the @Component and query the database to confirm that the persistence occurred.
Is this possible at all with Spring and JUnit testing?
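For reference, here is a sketch of what I have in mind: a JUnit test bootstrapped with the Spring TestContext framework against the real application-config.xml, with real autowired beans. All class names (UserController, UserManager, UserForm) are placeholders standing in for the classes described above; whether this actually works is exactly my question.

import static org.junit.Assert.assertNotNull;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

// Loads the real XML configuration (not mocks), so the controller and the
// manager receive their real @Autowired dependencies.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:application-config.xml")
public class UserControllerIntegrationTest {

    @Autowired
    private UserController controller;  // a real @Controller

    @Autowired
    private UserManager userManager;    // a real @Component

    @Test
    public void createUserIsPersistedToMySql() {
        controller.create(new UserForm("alice"));        // post to the controller
        assertNotNull(userManager.findByName("alice"));  // verify against the database
    }
}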
If you want to verify the behavior of the entire stack like that, it's no longer a unit test, but an integration test.
If you're using Maven, you can use something like the Maven Jetty plugin to deploy your project into an embedded container during the pre-integration-test phase. Then in your tests, make the HTTP calls to the server at localhost, and verify you get the expected responses. Then, in the post-integration-test phase, shut down the Jetty server.
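A rough sketch of what such a test could look like once the application is deployed on the embedded container (the port, URL, payload and the *IT naming convention used by the failsafe plugin are assumptions, not taken from the project):

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;

// Runs in the integration-test phase, after the Jetty plugin has started the app.
public class UserEndpointIT {

    private final RestTemplate rest = new RestTemplate();

    @Test
    public void postCreatesUser() {
        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.APPLICATION_JSON);

        // Hypothetical endpoint and payload; adjust to the real controller mapping.
        ResponseEntity<String> response = rest.postForEntity(
                "http://localhost:8080/users",
                new HttpEntity<>("{\"name\":\"alice\"}", headers),
                String.class);

        assertEquals(201, response.getStatusCode().value());
    }
}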
Related
We have a larger Spring Boot application and a number of integration tests which are annotated with @SpringBootTest and involve a number of services.
A number of these integration tests use @Sql to set up an in-memory DB (as in the real world) and exercise services which read values from the database at service start time.
Based on our analysis/reading, we have found that the order of initialisation is:
ServletTestExecutionListener – configures Servlet API mocks for a WebApplicationContext
DirtiesContextBeforeModesTestExecutionListener – handles the @DirtiesContext annotation for “before” modes
DependencyInjectionTestExecutionListener – provides dependency injection for the test instance
DirtiesContextTestExecutionListener – handles the @DirtiesContext annotation for “after” modes
TransactionalTestExecutionListener – provides transactional test execution with default rollback semantics
SqlScriptsTestExecutionListener – runs SQL scripts configured using the @Sql annotation
(Hint: The above list is copied from https://www.baeldung.com/spring-testexecutionlistener).
This means, in consequence, that it is not possible to write an integration test with @SpringBootTest in which a service reads values from the database during its initialisation. It also means it's not possible to use @PostConstruct as an alternative.
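To make the failing pattern concrete, the kind of service we mean looks roughly like this (a simplified sketch; the table and class names are made up):

import java.util.List;

import javax.annotation.PostConstruct;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;

@Service
public class CountryCache {

    private final JdbcTemplate jdbcTemplate;
    private List<String> countryCodes;

    public CountryCache(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Runs while the ApplicationContext is being built, i.e. before
    // SqlScriptsTestExecutionListener has applied any @Sql scripts,
    // so under @SpringBootTest the table is still empty at this point.
    @PostConstruct
    public void load() {
        countryCodes = jdbcTemplate.queryForList("select code from country", String.class);
    }
}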
The question is: Is there an easy way to change that order?
I already tried to register a different set of test execution listeners like this:
@TestExecutionListeners({
    SqlScriptsTestExecutionListener.class,
    ServletTestExecutionListener.class,
    DirtiesContextBeforeModesTestExecutionListener.class,
    DependencyInjectionTestExecutionListener.class,
    DirtiesContextTestExecutionListener.class,
    TransactionalTestExecutionListener.class
})
but it does not execute my scripts first (maybe I'm doing something wrong).
Apart from that, if I start the Spring Boot application with a service as described above (reading values from the database), it works perfectly, which puzzles me a bit. Why is there such a big difference between an integration test and the real application?
Maybe I misunderstand something here, but it looks like the initialisation order in a Spring test (@SpringBootTest) is different from the one in the real application?
This question is related to Using JPA repository access in constructor of Service class
We have a Spring Boot application exposing some REST endpoints. We allow this application to be operated standalone (as an executable jar) or as a war deployed to a WildFly 11 application server.
The class defining the REST endpoints is marked @RestController and @Transactional(REQUIRES_NEW) (both at class level, obviously). When running standalone everything works as expected, but when deployed in WildFly the rollback on exceptions does not work. We established this by sending the exact same REST message while operating on the exact same database.
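For context, the endpoint class looks roughly like this (heavily simplified; the mapping, DTO and method names are made up):

import static org.springframework.transaction.annotation.Propagation.REQUIRES_NEW;

import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
@Transactional(propagation = REQUIRES_NEW)
public class OrderController {

    // Persists the incoming payload; any runtime exception thrown here is
    // expected to roll back the whole transaction.
    @PostMapping("/orders")
    public void createOrder(@RequestBody OrderDto order) {
        // ... persist via an injected repository/service ...
    }
}

// Made-up DTO, only here to keep the sketch self-contained.
class OrderDto {
    public String item;
}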
We have confirmed via debugging that the final frames of the stack trace are identical in both cases, and in particular that in both cases there is a transactional proxy around our REST controller bean.
One difference is that within WildFly the application uses a JNDI datasource prepared by WildFly, while standalone Spring Boot manages the database connections itself.
Any idea what is wrong here?
Edit
I just tried explicitly invoking setRollbackOnly on the JtaTransactionManager from within my code. The transaction still commits. This sort of looks like a bug in Spring Boot to me.
Edit 2
Debugging further reveals that the transaction seems to be running with auto-commit enabled: every statement is immediately written to the database. This seems to violate the @Transactional annotation and also contradicts the fact that Spring creates a transactional proxy around my bean.
This is not a full answer, just some reasoning. JNDI is usually used at the app-server layer, whereas JDBC is used at the application layer. At the app-server layer, global transaction settings apply that override the application's settings. See the Spring documentation for more details.
For reasons beyond my understanding, the default transactional behaviour when deploying a Spring Boot webapp to an application server is auto-commit.
The solution to this problem is to enrich your application configuration with the property spring.datasource.tomcat.default-auto-commit=false
I'm trying to write an integration test for my Spring Boot microservice that interacts with another service inside the product ecosystem.
Since this kind of testing is considered functional/integration testing (depending on what nomenclature you use), it is usually done in some development environment.
However, I wanted to test a basic interaction between my service and a stub/dummy app which are connected via RPC (so not exactly a typical TestRestTemplate test).
I know that there is a way to embed a service while booting up the Spring context, but I have never done it myself.
Does anyone have any experience with the above, or maybe a few helpful links I can explore?
I have used WireMock in tests to mock external services that the code under test communicates with over HTTP.
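For illustration, a minimal WireMock sketch (the port and the stubbed URL are made up):

import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static com.github.tomakehurst.wiremock.client.WireMock.get;
import static com.github.tomakehurst.wiremock.client.WireMock.stubFor;
import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;

import com.github.tomakehurst.wiremock.junit.WireMockRule;
import org.junit.Rule;
import org.junit.Test;

public class ExternalServiceClientTest {

    // Starts a stub HTTP server on port 8089 (arbitrary) before each test
    // and stops it afterwards.
    @Rule
    public WireMockRule wireMock = new WireMockRule(8089);

    @Test
    public void answersWithCannedStatus() {
        stubFor(get(urlEqualTo("/external/status"))
                .willReturn(aResponse()
                        .withStatus(200)
                        .withHeader("Content-Type", "application/json")
                        .withBody("{\"status\":\"UP\"}")));

        // ... point the code under test at http://localhost:8089 and assert on its behaviour ...
    }
}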
My test class, annotated with @SpringBootTest, is also annotated with @ContextConfiguration. In the classes attribute of @ContextConfiguration I explicitly specify the configuration classes required to set up the Spring context for the test in question. Here I can also include additional configuration classes in which I create beans only used in the test. In test configuration classes I can also override beans for the purpose of the test, create mock beans, etc.
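As an illustration of that setup, a sketch might look like this (all class and bean names are made up; the collaborator interface is declared inline only to keep the sketch self-contained):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
@ContextConfiguration(classes = { ExchangeRateTestConfig.class })
public class ExchangeRateIntegrationTest {

    @Autowired
    private ExchangeRateClient exchangeRateClient;  // the mock from the test config

    @Test
    public void usesTheStubbedClient() {
        Mockito.when(exchangeRateClient.rate("EUR", "USD")).thenReturn(1.1);
        // ... exercise the code under test that depends on ExchangeRateClient ...
    }
}

@Configuration
class ExchangeRateTestConfig {

    // If this bean replaces one defined in a production configuration class that is
    // also listed in @ContextConfiguration, Boot 2.1+ additionally needs
    // spring.main.allow-bean-definition-overriding=true (see below).
    @Bean
    public ExchangeRateClient exchangeRateClient() {
        return Mockito.mock(ExchangeRateClient.class);
    }
}

// Made-up collaborator interface, only here to keep the sketch self-contained.
interface ExchangeRateClient {
    double rate(String from, String to);
}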
Note that Spring Boot 2.1 and later disables bean overriding by default. It can be enabled by setting the following property to true:
spring.main.allow-bean-definition-overriding=true
To set the property for a single test, use the @TestPropertySource annotation like this:
@TestPropertySource(properties = {
    "spring.main.allow-bean-definition-overriding=true"
})
Context
I found this question here but my problem is different.
So we are using the Katharsis controller and Spring Data REST.
We only have one controller for the entire application, and requests are then dispatched to the Spring Data REST repository classes.
We want to use Spring REST Docs to generate documentation, which requires us to write unit tests with MockMvc.
But when using MockMvc, it starts up the container and requires datasources to be set up.
If we use standaloneSetup() and pass the mocked repository class, then MockMvc won't load the Katharsis controller and therefore the request won't reach that repository.
I understand that we could create an in-memory database, but our project is big and the database needs a huge number of tables to be created; we want to avoid that since these tests are only for documentation purposes.
Question
Is there any way to achieve this and only mock the target repository class?
Note
By repository I mean a CrudRepository interface in Spring Data REST.
As Andy Wilkinson suggested, you may consider creating a unit test where you wire beans together yourself and use the MockMvc standalone setup.
If you want to create an integration test and create the Spring context anyway, there is a way to fake a Spring bean by using the @Primary, @ActiveProfiles and @Profile annotations. I wrote a blog post with a GitHub example showing how to do it. You just need to combine this approach with a WebApplicationContext-based MockMvc setup. It works without problems; I wrote such tests in the past.
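A minimal sketch of the idea (the repository and entity are made-up stand-ins for your Spring Data REST repository):

import org.mockito.Mockito;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.Profile;
import org.springframework.data.repository.CrudRepository;

// Only loaded when the (made-up) "test" profile is active; the @Primary mock
// then wins over the real repository bean during autowiring.
@Configuration
@Profile("test")
public class FakeRepositoryConfig {

    @Bean
    @Primary
    public BookRepository bookRepository() {
        return Mockito.mock(BookRepository.class);
    }
}

// Made-up entity and repository, only here to keep the sketch self-contained.
class Book {
}

interface BookRepository extends CrudRepository<Book, Long> {
}

The documentation test then activates the profile with @ActiveProfiles("test") and builds MockMvc from the WebApplicationContext, so the Katharsis controller is loaded while the repository behind it is the mock.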
I am trying to figure out how I can dynamically update/reload externalized configuration in a Spring Boot application without restarting the whole application.
Most of the advice involves reloading ApplicationContext after changing the externalized configuration, but that is equivalent to restarting the entire application, so this is not really all that useful.
Reading through the Spring Boot reference documentation, I found chapter 23.7, Typesafe Configuration Properties.
If I understand it correctly, this allows you to define simple POJO classes that hold your application's (externalized) configuration values as attributes.
In theory at least, this scheme could be used to bind beans to the required configuration POJO only once, and upon a configuration change just update the values in the POJO. Components could easily pick up the changes the next time they access getters on the POJO...
However, I have not yet managed to figure out how to enable this type of behavior. Is there some glaringly obvious way to dynamically update components annotated with @ConfigurationProperties when the relevant configuration has changed?
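For reference, the kind of POJO I mean would look roughly like this (a minimal sketch; the prefix and field names are made up):

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Binds properties with the (made-up) prefix "app.mail" from the externalized
// configuration onto plain fields; other beans inject this POJO and call its getters.
@Component
@ConfigurationProperties(prefix = "app.mail")
public class MailProperties {

    private String host;
    private int port;

    public String getHost() { return host; }
    public void setHost(String host) { this.host = host; }

    public int getPort() { return port; }
    public void setPort(int port) { this.port = port; }
}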
It sounds like you're looking for @RefreshScope, which is provided by Spring Cloud. From the Spring Cloud documentation:
A Spring @Bean that is marked as @RefreshScope will get special treatment when there is a configuration change. This addresses the problem of stateful beans that only get their configuration injected when they are initialized. For instance if a DataSource has open connections when the database URL is changed via the Environment, we probably want the holders of those connections to be able to complete what they are doing. Then the next time someone borrows a connection from the pool he gets one with the new URL.
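For illustration, a minimal sketch of a bean in refresh scope (this assumes Spring Cloud is on the classpath and that a refresh event is triggered, for example via the actuator's refresh endpoint when it is enabled; the property key is made up):

import org.springframework.beans.factory.annotation.Value;
import org.springframework.cloud.context.config.annotation.RefreshScope;
import org.springframework.stereotype.Component;

// The bean is re-created with freshly bound property values after a refresh event,
// so callers see the updated configuration without an application restart.
@Component
@RefreshScope
public class GreetingService {

    @Value("${app.greeting:Hello}")
    private String greeting;

    public String greet(String name) {
        return greeting + ", " + name;
    }
}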