Spring Framework Events - Java

I was reading through the Spring Framework documentation and found a section on raising events in Spring using ApplicationContext. After reading a few paragraphs I found that Spring events are raised synchronously. Is there any way to raise asynchronous events? I am looking for something similar that would help me complete my module. Your help is much appreciated.

Simplest asynchronous ApplicationListener:
Publisher:
@Autowired
private SimpleApplicationEventMulticaster simpleApplicationEventMulticaster;

@Autowired
private AsyncTaskExecutor asyncTaskExecutor;

// ...
simpleApplicationEventMulticaster.setTaskExecutor(asyncTaskExecutor);
// ...
ApplicationEvent event = new ApplicationEvent("");
simpleApplicationEventMulticaster.multicastEvent(event);
Listener:
@Component
static class MyListener implements ApplicationListener<ApplicationEvent> {
    public void onApplicationEvent(ApplicationEvent event) {
        // do stuff; typically check the event subclass (instanceof) to know which action to perform
    }
}
You should subclass ApplicationEvent with your specific events. You can configure SimpleApplicationEventMulticaster and its taskExecutor in an XML file.
You might want to implement ApplicationEventPublisherAware in your listener class and pass a source object (instead of empty string) in the event constructor.
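The same multicaster override can also be sketched in Java config. The class name below is illustrative, but the bean name applicationEventMulticaster is the well-known name the context looks up, so overriding it makes all published events go through this multicaster:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.event.ApplicationEventMulticaster;
import org.springframework.context.event.SimpleApplicationEventMulticaster;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class AsyncEventConfig {

    // Overriding the bean named "applicationEventMulticaster" makes the
    // context use it for all published events.
    @Bean(name = "applicationEventMulticaster")
    public ApplicationEventMulticaster applicationEventMulticaster() {
        SimpleApplicationEventMulticaster multicaster = new SimpleApplicationEventMulticaster();
        // Without a task executor, events are delivered synchronously;
        // with one, listeners run on the executor's threads.
        multicaster.setTaskExecutor(new SimpleAsyncTaskExecutor());
        return multicaster;
    }
}
```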

Alternative notification strategies can be achieved by implementing ApplicationEventMulticaster (see Javadoc) and its underlying (helper) class hierarchy. Typically you use either a JMS-based notification mechanism (as David already suggested) or attach to Spring's TaskExecutor abstraction (see Javadoc).

Spring itself (AFAIK) works synchronously, but what you can do is create your own ApplicationListener proxy: a class that implements this interface but, instead of handling the event itself, delegates it by passing it to another (or a new) thread, sending a JMS message, etc.
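That delegating proxy can be sketched without Spring at all. Listener below is a hypothetical stand-in for ApplicationListener, and the proxy simply hands each event to an executor instead of handling it on the publisher's thread:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical stand-in for ApplicationListener<E>.
interface Listener<E> {
    void onEvent(E event);
}

// Proxy that implements the same interface but delegates the actual
// handling to a worker thread instead of the calling thread.
class AsyncListenerProxy<E> implements Listener<E> {
    private final Listener<E> delegate;
    private final ExecutorService executor;

    AsyncListenerProxy(Listener<E> delegate, ExecutorService executor) {
        this.delegate = delegate;
        this.executor = executor;
    }

    @Override
    public void onEvent(E event) {
        // Returns immediately; the delegate runs on the executor's thread.
        executor.submit(() -> delegate.onEvent(event));
    }
}
```

The publisher calls onEvent as usual and is never blocked; the real listener runs whenever the executor schedules it.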

Try overriding the ApplicationEventMulticaster bean in resources.groovy so that it uses a thread pool. Something like this worked for me:
import java.util.concurrent.*
import org.springframework.context.event.*
beans = {
    applicationEventMulticaster(SimpleApplicationEventMulticaster) {
        taskExecutor = Executors.newCachedThreadPool()
    }
}

Related

Scope of @KafkaListener

I just want to understand what the scope of @KafkaListener is: prototype or singleton. In the case of multiple consumers of a single topic, does it return a single instance or multiple instances? In my case, multiple customers are subscribed to a single topic and get reports. I just wanted to know what would happen if multiple customers query for a report at the same time. In my case, I close the container after successful consumption of the messages, but if at the same time some other person wants to fetch reports, the container should stay open.
How can I change the scope to prototype (if it is not already), associated with the IDs of the container, so that a separate instance can be generated each time?
@KafkaListener(id = "id1", topics = "testTopic")
public void listen() {
    // code goes here
}
A single listener instance is invoked for all consuming threads.
The @KafkaListener annotation is not prototype scoped, and prototype scope is not possible with this annotation.
4.1.10. Thread Safety
When using a concurrent message listener container, a single listener instance is invoked on all consumer threads. Listeners, therefore, need to be thread-safe, and it is preferable to use stateless listeners. If it is not possible to make your listener thread-safe or adding synchronization would significantly reduce the benefit of adding concurrency, you can use one of a few techniques:
Use n containers with concurrency=1 with a prototype scoped MessageListener bean so that each container gets its own instance (this is not possible when using @KafkaListener).
Keep the state in ThreadLocal<?> instances.
Have the singleton listener delegate to a bean that is declared in SimpleThreadScope (or a similar scope).
To facilitate cleaning up thread state (for the second and third items in the preceding list), starting with version 2.2, the listener container publishes a ConsumerStoppedEvent when each thread exits. You can consume these events with an ApplicationListener or #EventListener method to remove ThreadLocal<?> instances or remove() thread-scoped beans from the scope. Note that SimpleThreadScope does not destroy beans that have a destruction interface (such as DisposableBean), so you should destroy() the instance yourself.
By default, the application context’s event multicaster invokes event listeners on the calling thread. If you change the multicaster to use an async executor, thread cleanup is not effective.
https://docs.spring.io/spring-kafka/reference/html/
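The ThreadLocal technique from the quoted list can be illustrated without Kafka at all. StatefulHandler below is a hypothetical listener-like class that keeps per-thread state and clears it the way a ConsumerStoppedEvent handler would:

```java
// Per-thread state for a handler invoked concurrently from many threads.
class StatefulHandler {
    // Each thread lazily gets its own StringBuilder.
    private final ThreadLocal<StringBuilder> buffer =
            ThreadLocal.withInitial(StringBuilder::new);

    void handle(String message) {
        buffer.get().append(message); // each thread sees only its own builder
    }

    String current() {
        return buffer.get().toString();
    }

    // Analogous to what a ConsumerStoppedEvent listener would do:
    // drop the state belonging to the exiting thread.
    void cleanup() {
        buffer.remove();
    }
}
```

After cleanup(), the next get() on the same thread re-initializes the state from scratch, which is exactly the reset you want when a consumer thread exits.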
=== Edited ===
Let's take their third option (declaring a SimpleThreadScope and delegating to it).
Register SimpleThreadScope; it is not picked up automatically. You need to register it like below:
@Bean
public static BeanFactoryPostProcessor beanFactoryPostProcessor() {
    return new BeanFactoryPostProcessor() {
        @Override
        public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
            beanFactory.registerScope("thread", new SimpleThreadScope());
        }
    };
}
Create a component with scopeName = "thread":
@Component
@Scope(scopeName = "thread", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class KafkaDelegate {
    public void handleMessageFromKafkaListener(String message) {
        // Do some stuff here with the message
    }
}
Create a @Service:
@Service
public class KafkaListenerService {

    @Autowired
    private KafkaDelegate kafkaDelegate;

    @KafkaListener(id = "id1", topics = "testTopic")
    public void listen(String message) {
        kafkaDelegate.handleMessageFromKafkaListener(message);
    }
}
Another example: How to implement a stateful message listener using Spring Kafka?
See this answer for an example of how to use a prototype scoped #KafkaListener bean.

Spring - Listen for dependent bean changes

In a Spring application, I have BeanA that is used by BeanX, BeanY and BeanZ. When BeanA changes during its lifecycle, I want BeanX, BeanY and BeanZ to get notified. Is there any out-of-the-box way to achieve this?
Spring provides Event handling in the ApplicationContext through the ApplicationEvent class and ApplicationListener interface.
The following standard event types are supported and can be used for event handling:
ContextRefreshedEvent
ContextStartedEvent
ContextStoppedEvent
ContextClosedEvent
RequestHandledEvent
To listen to a context event, a bean should implement the ApplicationListener interface, which has just one method, onApplicationEvent().
You will have to use the Observer pattern for this.
In Spring, the Observer pattern is used in the ApplicationContext's event mechanism. When you deploy a bean that implements the ApplicationListener interface, it will receive an ApplicationEvent every time the event is published by an event publisher.
If you are creating your own custom event, your event publisher must implement the ApplicationEventPublisherAware interface. The publisher can then publish an ApplicationEvent by calling the publishEvent() method of an ApplicationEventPublisher instance.
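As a rough illustration of the Observer pattern these answers describe, here is a minimal plain-Java sketch; EventBus is a hypothetical stand-in for the ApplicationContext's listener registry, and publish() plays the role of publishEvent():

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal observer registry, analogous to what the ApplicationContext
// does for its registered ApplicationListener beans.
class EventBus<E> {
    private final List<Consumer<E>> listeners = new ArrayList<>();

    // analogous to registering an ApplicationListener bean
    void subscribe(Consumer<E> listener) {
        listeners.add(listener);
    }

    // analogous to ApplicationEventPublisher.publishEvent(event):
    // every subscribed listener is notified, publisher knows none of them
    void publish(E event) {
        for (Consumer<E> l : listeners) {
            l.accept(event);
        }
    }
}
```

BeanX, BeanY and BeanZ would each subscribe once; BeanA publishes when it changes, without knowing who is listening.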

Spring Async for batch insertion

I'm using Spring Boot. I was new to Spring when I started this project, so I didn't know about the predefined repositories (JPA, CRUD) which can be easily implemented. I wanted to save bulk data, so I used a for loop and saved the records one by one, which takes a lot of time. So I tried to use @Async, but it doesn't work either. Is my concept wrong?
@Async has two limitations:
it must be applied to public methods only
self-invocation - calling the async method from within the same class - won't work
1) Controller
for (int i = 0; i < array.length(); i++) {
    // Other code
    gaugeCategoryService.saveOrUpdate(getEditCategory);
}
2) Dao implementation
@Repository
public class GaugeCategoryDaoImpl implements GaugeCategoryDao {

    // Other code

    @Async
    @Override
    public void saveOrUpdate(GaugeCategory gaugeCategory) {
        sessionFactory.getCurrentSession().saveOrUpdate(gaugeCategory);
    }
}
After removing @Async, it works normally, but with that annotation it doesn't. Is there an alternative way to handle this time-consuming operation? Thanks in advance.
The @Async annotation runs the method on a separate thread each time you call it, but you need to enable it with the @EnableAsync annotation on a configuration class.
You also need to configure the asyncExecutor bean.
You can find more details here : https://spring.io/guides/gs/async-method/
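A minimal configuration sketch along those lines; the class name, bean name, pool sizes, and thread-name prefix are illustrative choices, not prescribed values:

```java
import java.util.concurrent.Executor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
@EnableAsync
public class AsyncConfig {

    // @Async methods use this executor when it is the only candidate,
    // or when referenced by name: @Async("asyncExecutor")
    @Bean(name = "asyncExecutor")
    public Executor asyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(4);
        executor.setMaxPoolSize(8);
        executor.setQueueCapacity(100);
        executor.setThreadNamePrefix("async-");
        executor.initialize();
        return executor;
    }
}
```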
In my opinion, there are several issues with your code:
You override the saveOrUpdate() method without any need to do so. A simple call to super() should have been enough to make @Async work.
I guess that you declare a transactional context somewhere (within your controller class?). That one usually applies to the current thread. By using @Async, you might leave this transaction context because, due to the async DAO execution, the main thread may already be finished when saveOrUpdate() is called. And even though I don't know it for certain, there is a good chance that the declared transaction is only valid for the current thread.
One possible fix: create an additional component like AsyncGaugeCategoryService:
@Component
public class AsyncGaugeCategoryService {

    private final GaugeCategoryDao gaugeCategoryDao;

    @Autowired
    public AsyncGaugeCategoryService(GaugeCategoryDao gaugeCategoryDao) {
        this.gaugeCategoryDao = gaugeCategoryDao;
    }

    @Async
    @Transactional
    public void saveOrUpdate(GaugeCategory gaugeCategory) {
        gaugeCategoryDao.saveOrUpdate(gaugeCategory);
    }
}
Then inject the service instead of the DAO into your controller class. This way, you don't need to overwrite any methods, and you should have a valid transactional context within your async thread.
But be warned that your execution flow won't give you any hint if something goes wrong while storing into the database. You'll have to check the log files to detect any problems.

How to manage shutdown of ExecutorService when we allow to inject it?

Suppose I am writing a service which needs an executor service/separate thread. I provide a factory method so callers don't have to worry about the executor service, but I still want to allow passing an existing executor service (dependency injection).
How can I manage the call to executorService.shutdown()?
Example code:
public class ThingsScheduler {

    private final ExecutorService executorService;

    public ThingsScheduler(ExecutorService executorService) {
        this.executorService = executorService;
    }

    public static ThingsScheduler createDefaultSingleThreaded() {
        return new ThingsScheduler(Executors.newSingleThreadExecutor());
    }

    public void scheduleThing() {
        executorService.submit(new SomeTask());
    }

    // implement Closeable?
    // @PreDestroy?
    // .shutdown() + JavaDoc?
}
There are several problems:
We should have the ability to shut down the internally created executor, or better, handle it automatically (Spring's @PreDestroy, or in the worst case finalize()).
We should rather not shut down the executor if it is externally managed (injected).
We could add an attribute stating whether the executor was created by our class or injected, and then shut it down in finalize()/@PreDestroy/a shutdown hook, but that does not feel elegant to me.
Maybe we should drop the factory method entirely and always require injection, pushing executor lifecycle management to the client?
You may create an instance of an anonymous sub-inner class from your default factory, as shown below. The class defines a close/@PreDestroy method which will be called by your DI container.
e.g.
public class ThingsScheduler {

    final ExecutorService executorService;

    public ThingsScheduler(ExecutorService executorService) {
        this.executorService = executorService;
    }

    /**
     * Assuming you are using this method as a factory method to make the
     * returned bean managed by your DI container.
     */
    public static ThingsScheduler createDefaultSingleThreaded() {
        return new ThingsScheduler(Executors.newSingleThreadExecutor()) {
            @PreDestroy
            public void close() {
                System.out.println("closing the bean");
                executorService.shutdown();
            }
        };
    }
}
I would say the solution is fully up to you. Third-party libraries like Spring widely use a dedicated attribute to track who should release a particular resource, depending on its creator: mongoInstanceCreated in SimpleMongoDbFactory, localServer in SimpleHttpServerJaxWsServiceExporter, etc. But they do it because these classes are created only for external usage. If your class is only used in your application code, then you can either inject executorService and not care about releasing it, or create and release it inside the class which uses it. This choice depends on your class/application design (does your class work with any executorService? Is executorService shared and used by other classes?). Otherwise I don't see any option other than the dedicated flag.
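The dedicated-flag idea can be sketched in plain Java like this; the with() factory, the ownsExecutor field, and the use of AutoCloseable instead of a container callback are illustrative choices, and scheduleThing takes a Runnable since SomeTask is not shown in the question:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class ThingsScheduler implements AutoCloseable {

    private final ExecutorService executorService;
    private final boolean ownsExecutor; // true only if we created it ourselves

    private ThingsScheduler(ExecutorService executorService, boolean ownsExecutor) {
        this.executorService = executorService;
        this.ownsExecutor = ownsExecutor;
    }

    // Injected executor: the caller manages its lifecycle.
    static ThingsScheduler with(ExecutorService executorService) {
        return new ThingsScheduler(executorService, false);
    }

    // Internally created executor: we manage its lifecycle.
    static ThingsScheduler createDefaultSingleThreaded() {
        return new ThingsScheduler(Executors.newSingleThreadExecutor(), true);
    }

    void scheduleThing(Runnable task) {
        executorService.submit(task);
    }

    @Override
    public void close() {
        // Only shut down an executor we own; leave injected ones alone.
        if (ownsExecutor) {
            executorService.shutdown();
        }
    }
}
```

A DI container could call close() via @PreDestroy instead, but the ownership check stays the same either way.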
A more "elegant" solution would be to extend your ExecutorService and override its shutdown method. In the case of injection, you would return that extended type with its own shutdown logic; in the factory case, you keep the original logic.
After some more thinking I came to these conclusions:
Do not think about shutting it down if it is injected: someone else created it, so someone else will manage its lifecycle.
An executor factory could be injected instead of the executor itself; then we create the instance using the factory and manage closing it ourselves, since we manage its lifecycle (and in that case the advice from the other answers applies).

How to send an event to another bean in spring?

In Spring it is possible to do this. Does anybody have code samples?
If you want to notify a bean about something, simply call a method:
@Service
public class Notifier {

    @Autowired
    private Notified notified;

    public void something() {
        notified.notify(..);
    }
}
But event handling is usually asynchronous. In that case you will have to create a new thread (or use the executors framework, available since Java 5), pass a reference to / inject the target bean, and let it perform the notification.
And if instead you want to notify multiple beans without knowing exactly which ones, use the event mechanism that Spring provides as an implementation of the Observer pattern.
You can use Spring Integration for messaging between beans in your context. Look at MessageChannel and ServiceActivator. You can route, filter, and split messages to your beans however you need.
