My application.properties file sets the active profile with spring.profiles.active=test, and I have a method that I schedule like so:
@Scheduled(initialDelay = 2500, fixedRate = 60 * 1000 * minutesRecheckRate)
@Profile("loop")
public void processingLoop() {
    System.out.println(Arrays.toString(env.getActiveProfiles()));
    // .. the rest is omitted for brevity.
}
To my understanding, under these circumstances I should never see this method get called while running my unit tests, because I do not change the default profile. This turns out not to be the case: it is still getting scheduled, and I see the output
[test]
in my console despite my best efforts to prevent it. What is happening? Why is this still running even with a different active profile?
UPDATE:
I can't give much more due to the fact this is a work-relevant application, but I'll give what I can.
The class is configured like so:
@Configuration
@EnableScheduling
public class BatchConfiguration {
The unit tests are all annotated like this:
@SpringApplicationConfiguration(classes = SpringBatchJsontestApplication.class)
public class SpringBatchJsontestApplicationTests extends AbstractTestNGSpringContextTests {
The main application class is this:
@SpringBootApplication
public class SpringBatchJsontestApplication {
None of them change anything else. There is no context.xml file; this is a Spring Boot application, so everything is annotation-based.
This is the end result that works very well for me:
@Profile("test")
@Bean(name = TaskManagementConfigUtils.SCHEDULED_ANNOTATION_PROCESSOR_BEAN_NAME)
@Role(BeanDefinition.ROLE_INFRASTRUCTURE)
public ScheduledAnnotationBeanPostProcessor scheduleBeanProcessorOverride() {
    logger.info("Test Profile is active, overriding ScheduledAnnotationBeanPostProcessor to prevent annotations from running during tests.");
    return new ScheduledAnnotationBeanPostProcessor() {
        @Override
        protected void processScheduled(Scheduled scheduled, Method method, Object bean) {
            logger.info(String.format("Preventing scheduling for %s, %s, %s", scheduled, method, bean.getClass().getCanonicalName()));
        }
    };
}
Here is the POM configuration that activates the testing profile, so I no longer have to do so explicitly in my application.properties:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.19</version>
<configuration>
<systemPropertyVariables>
<spring.profiles.active>test</spring.profiles.active>
</systemPropertyVariables>
</configuration>
</plugin>
The @Profile annotation doesn't do anything on a regular method, nor on a method annotated with @Scheduled. The javadoc states:
The @Profile annotation may be used in any of the following ways:
as a type-level annotation on any class directly or indirectly annotated with @Component, including @Configuration classes
as a meta-annotation, for the purpose of composing custom stereotype annotations
as a method-level annotation on any @Bean method
The last case is the only use of @Profile on a method.
If you want to enable the @Scheduled behavior under a specific profile, annotate the bean (definition) that contains it.
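For instance, a minimal sketch (the class name and rate here are illustrative, not from the question above): with @Profile moved to the type level, the whole bean stays out of the context unless the "loop" profile is active, so its @Scheduled method is never registered during tests:

```java
import org.springframework.context.annotation.Profile;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// The bean is only registered when the "loop" profile is active, so
// ScheduledAnnotationBeanPostProcessor never sees its @Scheduled method
// while the tests run under the "test" profile.
@Component
@Profile("loop")
public class ProcessingLoopTasks {

    @Scheduled(initialDelay = 2500, fixedRate = 60_000)
    public void processingLoop() {
        // scheduled work here
    }
}
```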
Related
Maybe I have some outdated knowledge, but it is the same as described here:
https://stackoverflow.com/a/2657465/2674303
But now I noticed that this example works without any exceptions:
@Service
@EnableScheduling
public final class MyService {

    @PostConstruct
    public void init() {
        System.out.println("MyService started");
    }

    @Scheduled(fixedDelay = 1000)
    public void scheduleCall() {
        System.out.println("scheduleCall");
    }
}
Could you please explain how this works?
The @Scheduled annotation does not require proxy creation; the mechanism is different. After bean initialization, Spring calls the post-processor ScheduledAnnotationBeanPostProcessor, which searches for all methods annotated with @Scheduled and registers them with the TaskScheduler for execution. The methods are then invoked via reflection.
See ScheduledAnnotationBeanPostProcessor source code.
@Scheduled
Processing of @Scheduled annotations is performed by registering a ScheduledAnnotationBeanPostProcessor. This can be done manually or, more conveniently, through the <task:annotation-driven/> XML element or the @EnableScheduling annotation.
ScheduledAnnotationBeanPostProcessor
Bean post-processor that registers methods annotated with @Scheduled to be invoked by a TaskScheduler according to the "fixedRate", "fixedDelay", or "cron" expression provided via the annotation. This post-processor is automatically registered by Spring's <task:annotation-driven> XML element, and also by the @EnableScheduling annotation.
Autodetects any SchedulingConfigurer instances in the container, allowing for customization of the scheduler to be used or for fine-grained control over task registration (e.g. registration of Trigger tasks). See the @EnableScheduling javadocs for complete usage details.
@PostConstruct is also implemented via a post-processor, InitDestroyAnnotationBeanPostProcessor. When dependency injection is performed for a bean, the method marked @PostConstruct is executed through reflection, without a proxy.
See InitDestroyAnnotationBeanPostProcessor source code
Summary:
In your example, Spring will create the bean without a proxy.
If you add a proxy-requiring annotation, for example @Transactional, you will get an exception because a proxy cannot be created for a final class: java.lang.IllegalArgumentException: Cannot subclass final class com.test.services.MyService
@Service
@EnableScheduling
public final class MyService {

    @PostConstruct
    public void init() {
        System.out.println("MyService started");
    }

    @Scheduled(fixedDelay = 1000)
    @Transactional
    public void scheduleCall() {
        System.out.println("scheduleCall");
    }
}
But you can also solve this problem by forcing the use of a JDK dynamic proxy: create an interface for the class and set the property spring.aop.proxy-target-class=false, as described under Proxying mechanisms.
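A rough sketch of what that looks like (the interface name and file layout are illustrative): the final class implements an interface, and with spring.aop.proxy-target-class=false Spring can build a JDK dynamic proxy that implements the same interface instead of trying to subclass the final class:

```java
// ScheduledService.java
public interface ScheduledService {
    void scheduleCall();
}

// MyService.java
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// application.properties: spring.aop.proxy-target-class=false
// The JDK proxy implements ScheduledService, so the final class is no longer a problem.
@Service
public final class MyService implements ScheduledService {

    @Override
    @Scheduled(fixedDelay = 1000)
    @Transactional
    public void scheduleCall() {
        System.out.println("scheduleCall");
    }
}
```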
I've written a Spring Boot test that writes into a JMS queue and expects some processing via a JMS listener. In the listener, I'm trying to read an object from S3. The AmazonS3 class should be replaced by a @MockBean. In my test I set up the mock like this:
@SpringBootTest
public class MyTest {

    @Autowired
    MyJmsPublisher jmsPublisher;

    @MockBean
    AmazonS3 amazonS3;

    @Test
    public void test() {
        final S3Object s3Object = mock(S3Object.class);
        when(s3Object.getObjectContent()).thenReturn(mock(S3ObjectInputStream.class));
        when(amazonS3.getObject(anyString(), anyString())).thenReturn(s3Object);

        jmsPublisher.publishMessage("mymessage");
        Awaitility.await().untilAsserted(() -> {
            // wait for something here
        });
    }
}
@Component
@RequiredArgsConstructor
public class MyJmsPublisher {

    private final JmsTemplate jmsTemplate;

    public void publishMessage(String message) {
        jmsTemplate.convertAndSend("destination", message);
    }
}
@Component
@RequiredArgsConstructor
public class MyJmsListener {

    private final AmazonS3 amazonS3;

    @JmsListener(destination = "destination")
    public void onMessageReceived(String message) {
        final S3ObjectInputStream objectContent = amazonS3.getObject("a", "b").getObjectContent();
        // some logic here
    }
}
But the issue is that when running multiple Spring Boot tests, the MyJmsListener class contains a mock that is different from the one created in the test. It's still a mock, but getObjectContent(), for example, returns null. When I run the test alone, everything works fine.
I've tried to inject the AmazonS3 bean into the MyJmsPublisher and call the mocked method there, and it worked. So I suspect it's because the JMS listener operates on a different thread.
I've found this thread and set the reset option to all available values, but it does not make any difference. I also tried the OP's approach that worked for them, creating a mock via the @Bean annotation like this:
@Configuration
public class MyConfig {

    @Bean
    @Primary
    public AmazonS3 amazonS3() {
        return Mockito.mock(AmazonS3.class);
    }
}
But this just has the same behavior as mentioned above.
So can you actually use the @MockBean annotation when different threads are involved, as with a @JmsListener? Or am I missing something?
Spring beans with methods annotated with @JmsListener can end up injecting beans leaked from previous test executions when activated by a secondary thread. A practical workaround is to configure the test executor to use an isolated VM for each class to avoid this issue.
For Maven executions you can configure the maven-failsafe-plugin or maven-surefire-plugin by setting the reuseForks option, e.g.:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<reuseForks>false</reuseForks>
</configuration>
</plugin>
You can also change the fork mode to Class in your IDE's JUnit run configuration for multiple test executions (for example, in IntelliJ).
Using @DirtiesContext does not work, and unfortunately I still couldn't find the root cause for this. My hunch is that it could be related to using an in-memory ActiveMQ broker instance, which lives in the VM shared by the executions.
We had a similar issue when using the @JmsListener annotation in combination with @MockBean/@SpyBean. In our case, using a separate destination for each test class solved the problem:
@JmsListener(destination = "${my.mq.topic.name}")
void consume(TextMessage message) {
    ...
}

@SpringBootTest
@TestPropertySource(properties = "my.mq.topic.name=UniqueTopicName")
class MyConsumerIT {
    ...
}
As far as I understand, Spring has to create a separate JMS consumer for each topic/queue. This configuration forces Spring to create a separate JMS consumer for this test class and inject it correctly into the test context.
Without this configuration, by comparison, Spring reuses the consumer created once across all test classes.
I'm facing some problems when trying to inject a bean with the @MockBean annotation inside a JUnit test.
As a result I get the real service injected instead of the mocked one, but the weird part is that this only happens when running the tests with maven verify (together with other integration tests).
Basically, the bean I want to mock is injected inside a listener (@Component) that is triggered by a message sent on the queue during the integration test. When the listener runs, the service inside it is the real one instead of the mock.
It seems to me that when other tests run first, the real bean is already injected into the context, and @MockBean, although it should refresh the Spring context, does not replace the existing bean of the same type with the mock.
This is really strange behavior, because the documentation says "Any existing single bean of the same type defined in the context will be replaced by the mock". Well, that is not happening.
Below you find snippets showing how this is done.
Service to be mocked is:
@Slf4j
@Service
@Transactional
public class SomeServiceImpl implements SomeService {

    @Override
    @Async
    public void doStuff() {
        ...
    }
}
A listener that injects my service like this:
@Slf4j
@Component
@Transactional
public class SagaListener {

    @Autowired
    private SomeService someService;

    @JmsListener(destination = "${mydestinationtopic}", containerFactory = "myFactory",
            subscription = "my-subscription", selector = "eventType = 'MY_EVENT'")
    public void receive(MyEventClass event) {
        someService.doStuff();
    }
}
And here is my test class
@Slf4j
@SpringBootTest
@RunWith(SpringRunner.class)
public class SagaListenerIT {

    @MockBean
    private SomeService someService;

    @Autowired
    private Sender sender;

    @Test
    public void createNamespaceSuccess() throws InterruptedException {
        ...
        sender.send(event, event.getEventType(), myTopic);
        BDDMockito.then(someService).should().doStuff();
    }
}
As a result, Mockito reports that someService had zero invocations, and this is because the real service is being called.
Why isn't @MockBean replacing the real bean? Shouldn't the context be reinitialized?
I've tried adding the @DirtiesContext annotation to the other tests, and in that case everything works, but this is not a clean solution.
Here is the portion of my pom where the failsafe plugin is defined. It's a really simple one, by the way:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
</goals>
</execution>
</executions>
</plugin>
Thank you
On a project I'm working on, we have some old dependencies that define their own Spring beans but need to be initialized from the main application. These beans are all constructed using Spring profiles, i.e. "default" for production code and "test" for test code. We want to move away from using Spring profiles, instead simply using @Import to explicitly wire up our context.
The idea is to encapsulate all these old dependencies so that no other components need to care about Spring profiles. Thus, from a test's point of view, the application context setup can be described as follows:
@ContextConfiguration(classes = {TestContext.class})
@RunWith(SpringJUnit4ClassRunner.class)
public class MyTest {
    // tests
}
TestContext further directs to two classes, one of which encapsulates the old dependencies:
@Configuration
@Import(value = {OldComponents.class, NewComponents.class})
public class TestContext {
    // common spring context
}
To encapsulate the old components' need for profiles, the OldComponents class looks as follows:
@Configuration
@Import(value = {OldContext1.class, OldContext2.class})
public class OldComponents {

    static {
        System.setProperty("spring.profiles.active", "test");
    }
}
The problem here is that the static block does not appear to be executed in time. When running mvn clean install, the test gets an IllegalStateException because the ApplicationContext could not be loaded. I have verified that the static block gets executed, but it would appear that OldContext1 and OldContext2 (which are profile dependent) are already loaded at this time, which means it is too late.
The frustrating thing is that IntelliJ runs the tests just fine this way. Maven, however, does not. Is there a way to force these profiles while keeping it encapsulated? I've tried creating an intermediary context class, but it didn't solve the problem.
If we use the @ActiveProfiles annotation on the test class, it runs just fine, but this kind of defeats the purpose. Naturally, we want to achieve the same in production, and this means that if we cannot encapsulate the need for profiles, it needs to be configured in the web.xml.
If your configuration class inherits from AbstractApplicationContext, you can call:
getEnvironment().setActiveProfiles("your_profile");
For example:
public class TestContext extends AnnotationConfigWebApplicationContext {
    public TestContext() {
        getEnvironment().setActiveProfiles("test");
        refresh();
    }
}
Hope it helps.
It definitely seems that OldContext1 and OldContext2 are being class-loaded and initialized before the static block in OldComponents is executed.
Whilst I can't explain why there is a difference between your IDE and Maven (to do so would require in-depth knowledge of some, if not all, of: Spring 3.x context initialization, the Maven Surefire plugin, SpringJUnit4ClassRunner, and the internal IntelliJ test runner), can I recommend trying this?
@Configuration
@Import(value = {UseTestProfile.class, OldContext1.class, OldContext2.class})
public class OldComponents {
    // moved the System.setProperty call to UseTestProfile.class
}
and
@Configuration
public class UseTestProfile {

    static {
        System.setProperty("spring.profiles.active", "test");
    }
}
If I am understanding your problem correctly, class UseTestProfile should be loaded first (you might want to investigate a way to guarantee this?) and the other two classes in the import list should have the system setting they need to initialize properly.
Hope this helps...
You need to make sure the environment takes effect first. This is how I do it:
@Component
public class ScheduledIni {

    @Autowired
    private Environment env;

    @PostConstruct
    public void initialization() {
        String machineName = env.getProperty("MACHINE_NAME");
        if ("test".equals(machineName) || "production".equals(machineName)) {
            System.setProperty("spring.profiles.default", "Scheduled");
            System.setProperty("spring.profiles.active", "Scheduled");
        }
    }
}
In the scheduler, add the @Profile and @DependsOn annotations to make it work:
@DependsOn("scheduledIni")
@Profile(value = { "Scheduled" })
@Component
Use the @Profile annotation on the class to load the configuration, like below:
@Configuration
@Profile("test")
public class UseTestProfile {
}
and set the value of the spring.profiles.active property either in a property file or as a runtime argument.
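For example (the jar name here is just a placeholder), the profile can be supplied at launch either as a JVM system property or as a Spring Boot command-line argument:

```shell
# As a JVM system property (read by Spring's Environment):
java -Dspring.profiles.active=test -jar myapp.jar

# Or as a Spring Boot command-line argument:
java -jar myapp.jar --spring.profiles.active=test
```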
If I have a @Transactional annotation on a private method in a Spring bean, does the annotation have any effect?
If the @Transactional annotation is on a public method, it works and opens a transaction.
public class Bean {
    public void doStuff() {
        doPrivateStuff();
    }

    @Transactional
    private void doPrivateStuff() {
    }
}
...
Bean bean = (Bean)appContext.getBean("bean");
bean.doStuff();
The answer to your question is no: @Transactional will have no effect if used to annotate private methods. The proxy generator will ignore them.
This is documented in the Spring Manual, chapter 10.5.6:
Method visibility and @Transactional
When using proxies, you should apply the @Transactional annotation only to methods with public visibility. If you do annotate protected, private or package-visible methods with the @Transactional annotation, no error is raised, but the annotated method does not exhibit the configured transactional settings. Consider the use of AspectJ (see below) if you need to annotate non-public methods.
The question is not private versus public; the question is how the method is invoked and which AOP implementation you use!
If you use (default) Spring proxy AOP, then all AOP functionality provided by Spring (like @Transactional) will only be taken into account if the call goes through the proxy. This is normally the case when the annotated method is invoked from another bean.
This has two implications:
Because private methods cannot be invoked from another bean (the exception is reflection), their @Transactional annotation is not taken into account.
If the method is public but invoked from the same bean, it will not be taken into account either (this statement is only correct if (default) Spring proxy AOP is used).
See Spring Reference: Chapter 9.6, Proxying mechanisms
IMHO you should use the AspectJ mode instead of Spring proxies; that will overcome the problem. AspectJ transactional aspects are woven even into private methods (checked for Spring 3.0).
By default, the @Transactional attribute works only when calling an annotated method on a reference obtained from the applicationContext.
public class Bean {
    public void doStuff() {
        doTransactionStuff();
    }

    @Transactional
    public void doTransactionStuff() {
    }
}
This will open a transaction:
Bean bean = (Bean)appContext.getBean("bean");
bean.doTransactionStuff();
This will not:
Bean bean = (Bean)appContext.getBean("bean");
bean.doStuff();
Spring Reference: Using @Transactional
Note: In proxy mode (which is the default), only 'external' method calls coming in through the proxy will be intercepted. This means that 'self-invocation', i.e. a method within the target object calling some other method of the target object, won't lead to an actual transaction at runtime even if the invoked method is marked with @Transactional!
Consider the use of AspectJ mode (see below) if you expect self-invocations to be wrapped with transactions as well. In this case, there won't be a proxy in the first place; instead, the target class will be 'weaved' (i.e. its byte code will be modified) in order to turn @Transactional into runtime behavior on any kind of method.
If you need to wrap a private method inside a transaction and don't want to use AspectJ, you can use TransactionTemplate.
@Service
public class MyService {

    @Autowired
    private TransactionTemplate transactionTemplate;

    private void process() {
        transactionTemplate.executeWithoutResult(status -> processInTransaction());
    }

    private void processInTransaction() {
        // ...
    }
}
Yes, it is possible to use @Transactional on private methods, but as others have mentioned, this won't work out of the box. You need to use AspectJ. It took me some time to figure out how to get it working, so I will share my results.
I chose to use compile-time weaving instead of load-time weaving because I think it's an overall better option. Also, I'm using Java 8 so you may need to adjust some parameters.
First, add the dependency for aspectjrt.
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>1.8.8</version>
</dependency>
Then add the AspectJ plugin to do the actual bytecode weaving in Maven (this may not be a minimal example).
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.8</version>
<configuration>
<complianceLevel>1.8</complianceLevel>
<source>1.8</source>
<target>1.8</target>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
Finally, add this to your config class:
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
Now you should be able to use @Transactional on private methods.
One caveat to this approach: you will need to configure your IDE to be aware of AspectJ; otherwise, if you run the app via Eclipse, for example, it may not work. Make sure you test against a direct Maven build as a sanity check.
The Spring docs explain that:
In proxy mode (which is the default), only external method calls coming in through the proxy are intercepted. This means that self-invocation, in effect, a method within the target object calling another method of the target object, will not lead to an actual transaction at runtime even if the invoked method is marked with @Transactional.
Consider the use of AspectJ mode (see mode attribute in table below) if you expect self-invocations to be wrapped with transactions as well. In this case, there will not be a proxy in the first place; instead, the target class will be weaved (that is, its byte code will be modified) in order to turn @Transactional into runtime behavior on any kind of method.
Another way is to use BeanSelfAware.
The answer is no. Please see Spring Reference: Using @Transactional:
The @Transactional annotation may be placed before an interface definition, a method on an interface, a class definition, or a public method on a class.
In the same way as @loonis suggested using TransactionTemplate, one may use this helper component (Kotlin):
@Component
class TransactionalUtils {
    /**
     * Execute any [block] of code (even private methods)
     * as if it was effectively [Transactional]
     */
    @Transactional
    fun <R> executeAsTransactional(block: () -> R): R {
        return block()
    }
}
Usage:
@Service
class SomeService(private val transactionalUtils: TransactionalUtils) {
    fun foo() {
        transactionalUtils.executeAsTransactional { transactionalFoo() }
    }

    private fun transactionalFoo() {
        println("This method is executed within transaction")
    }
}
I don't know whether TransactionTemplate reuses an existing transaction or not, but this code definitely does.
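For what it's worth, TransactionTemplate extends DefaultTransactionDefinition, whose default propagation is PROPAGATION_REQUIRED, so it joins an existing transaction and only starts a new one when none is active. A sketch of configuring that explicitly (the class and method names are illustrative, and the transaction manager is assumed to come from your context):

```java
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.support.TransactionTemplate;

public final class TxTemplates {

    static TransactionTemplate requiresNew(PlatformTransactionManager txManager) {
        TransactionTemplate template = new TransactionTemplate(txManager);
        // PROPAGATION_REQUIRED (the default) joins the caller's transaction;
        // PROPAGATION_REQUIRES_NEW suspends it and runs in a fresh one.
        template.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRES_NEW);
        return template;
    }
}
```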