I am new to Spring Boot and just implemented a normal Spring Boot application with HTTP endpoints that receive data and put it in a database. Now I want some of the data to go both into the database and into a class holding a data structure. Since there will be continuous operations on this data, I need to work with it as a separate process.
@Service
public class RulesManager {

    private HashMap<Integer, Rule> rules = new HashMap<Integer, Rule>();

    public void addRule(Rule rule) {
        // Add rule to the database
    }

    // should be running in the background
    public void updateRules() {
        // Continuous check of rules and update of this.rules HashMap
    }
}
@SpringBootApplication
public class RulesApplication {

    public static void main(String... args) {
        SpringApplication.run(RulesApplication.class, args);
        // How do I call RulesManager.updateRules() to run in the background and make changes to the rules hashmap?
    }
}
So while listening to HTTP requests I want my application to run a background process that never stops and keeps repeating itself. I am not sure how to call that class from the main RulesApplication class so that both HTTP requests and the background process are able to make changes to the this.rules HashMap. I would be grateful for any tip or advice.
If you are just looking to start an always-on process when the app starts (even better, when RulesManager gets initialized), then you should simply create a new thread in the constructor of RulesManager:
private void methodCalledByConstructor() {
    new Thread(() -> {
        // loop start
        // access and check the hashmap
        // do what is necessary
        // sleep for some time
        // loop end
    }).start();
}
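A minimal sketch of that idea fleshed out, assuming a thread-safe ConcurrentHashMap and a fixed five-second poll interval (both are illustrative choices, not taken from the question; the getId() accessor on Rule is also assumed):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.stereotype.Service;

@Service
public class RulesManager {

    // Shared between HTTP request handlers and the background thread,
    // so use a concurrent map instead of a plain HashMap.
    private final Map<Integer, Rule> rules = new ConcurrentHashMap<>();

    public RulesManager() {
        Thread updater = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    updateRules();            // refresh the shared map
                    Thread.sleep(5_000);      // wait before the next pass
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();  // restore the flag and exit
                }
            }
        });
        updater.setDaemon(true);              // don't keep the JVM alive on shutdown
        updater.start();
    }

    public void addRule(Rule rule) {
        rules.put(rule.getId(), rule);        // getId() is a hypothetical accessor
        // also persist the rule to the database here
    }

    public void updateRules() {
        // continuous check of rules and update of this.rules
    }
}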
But if the work is only required when some event occurs, then use the observer pattern for a more elegant solution.
Try defining a new Thread, for example "LocalRulesHandling", annotate it with @Component, and put your implementation regarding the rules hashmap inside this thread.
In the RulesApplication class, get the Spring context, get the execution thread bean, and then start the thread:
ApplicationContext context = SpringApplication.run(RulesApplication.class, args);
LocalRulesHandling handling = context.getBean(LocalRulesHandling.class);
handling.start();
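For completeness, a rough sketch of what such a component could look like; the class name LocalRulesHandling comes from the answer, while the constructor injection and the five-second interval are assumptions:

import org.springframework.stereotype.Component;

@Component
public class LocalRulesHandling extends Thread {

    private final RulesManager rulesManager;

    public LocalRulesHandling(RulesManager rulesManager) {
        this.rulesManager = rulesManager;     // injected by Spring
    }

    @Override
    public void run() {
        while (!isInterrupted()) {
            rulesManager.updateRules();       // work on the shared rules map
            try {
                Thread.sleep(5_000);          // illustrative interval
            } catch (InterruptedException e) {
                interrupt();                  // restore the flag so the loop exits
            }
        }
    }
}

Note that Spring only constructs the bean; the thread is started exactly once by the handling.start() call above.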
Disclaimer: I'm a newbie in RabbitMQ and/or Spring Integration and/or Spring Cloud Stream.
I have the following class:
@Component
public class RabbitMQChannelBindingFactory {
    ...
    private org.springframework.cloud.stream.binder.rabbit.RabbitMessageChannelBinder binder;
    private org.springframework.cloud.stream.config.BindingServiceProperties bindingServiceProperties;
    private org.springframework.cloud.stream.binding.BindingService bindingService;
    private org.springframework.beans.factory.config.ConfigurableListableBeanFactory beanFactory;
    private org.springframework.cloud.stream.binding.SubscribableChannelBindingTargetFactory bindingTargetFactory;
    private org.springframework.amqp.rabbit.connection.ConnectionFactory rabbitConnectionFactory;
    ...
}
What is needed?
I have a mechanism that creates an Exchange+Queue+Consumer and I have a mechanism that destroys these.
The exchange and the queue have the auto-delete set on true.
What is the problem?
The inherited mechanism that destroys all of those 3 elements does not work.
It deletes just the Exchange. The Queue doesn't get deleted because it still has a Consumer and I can see it in my application also.
What has been tried?
I tried using JVisualVM to get to the String instance of the consumer tag, and then walked up the hierarchy to remove the consumers.
I have changed the org.springframework.amqp.rabbit.listener.BlockingQueueConsumer inside my application so that it would be loaded first by the class loader.
Inside it I added something like this, in order to keep track of all the Consumers created in my application:
public class BlockingQueueConsumer {
    ...
    public static List<BlockingQueueConsumer> all = new ArrayList<>();

    public BlockingQueueConsumer(...) {
        ...
        all.add(this);
        ...
    }
    ...
}
Once I had done the previous step, I added another method inside the RabbitMQChannelBindingFactory class to call the cancel method for all the consumers, something like this:
class RabbitMQChannelBindingFactory {
    public void disconnect(...) {
        BlockingQueueConsumer lastBlockingQueueConsumer =
                BlockingQueueConsumer.all.get(BlockingQueueConsumer.all.size() - 1);
        lastBlockingQueueConsumer.getConsumerTags()
                .forEach(consumerTag -> basicCancel(lastBlockingQueueConsumer, consumerTag));
    }
}
At this point, in the browser with the RabbitMQ console loaded, we can see that the Queue is deleted (along with the Exchange and the Consumer).
What is the problem?
I can not find a way to connect the BlockingQueueConsumer to the autowired properties.
For example I have tried
public void deleteRabbitMQConsumer() {
    RabbitTemplate rabbitTemplate = new RabbitTemplate(rabbitConnectionFactory);
    rabbitTemplate.execute(channel -> {
        if (channel instanceof ChannelN) {
            ChannelN channelN = (ChannelN) channel;
            return true;
        }
        return false;
    });
}
but it seems that there are no consumers inside the ChannelN.
Can you please give me a direction what needs to be understood first?
Or what are some sources for helping me?
Or has anybody tried this action of cancelling the Consumer using these autowired properties?
Or do I need to add other autowired properties?
I have tried the https://stackoverflow.com/a/27633771/13622666 solution.
The solution
@Component
public class RabbitMQChannelBindingFactory {
    ...
    private org.springframework.cloud.stream.binder.rabbit.RabbitMessageChannelBinder binder;

    private void connectAndDisconnectConsumer(...) {
        ...
        Binding<MessageChannel> messageChannelBinding =
                binder.bindConsumer(exchangeName, "", channel, consumerProperties);
        ... // receive messages
        messageChannelBinding.stop();
        ...
    }
}
And the stacktrace:
messageChannelBinding.stop();
DefaultBinding#stop
AbstractEndpoint#stop()
AmqpInboundChannelAdapter#doStop
AbstractMessageListenerContainer#stop()
AbstractMessageListenerContainer#doStop
AbstractMessageListenerContainer#shutdown
SimpleMessageListenerContainer#doShutdown
BlockingQueueConsumer#basicCancel(boolean)
Voted to close. You must not abuse the Java class system like that; concentrate instead on learning the library you use. Probably someone has already asked about the solution you are looking for. As Gary said: there is just stop() on the Spring Cloud Stream binding, which is going to stop a MessageListenerContainer, which, in turn, will cancel all the consumers it has on the queue. And your auto-deleted queue is going to be removed from RabbitMQ. There is no reason to destroy exchanges, although you can do that via AmqpAdmin.deleteExchange().
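If you do want to remove the exchange explicitly, a minimal sketch using Spring AMQP's admin API could look like this (the ExchangeCleanup class and method names are illustrative, not part of the original code):

import org.springframework.amqp.core.AmqpAdmin;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitAdmin;

public class ExchangeCleanup {

    private final AmqpAdmin amqpAdmin;

    public ExchangeCleanup(ConnectionFactory connectionFactory) {
        // Alternatively inject an existing AmqpAdmin bean instead of creating one.
        this.amqpAdmin = new RabbitAdmin(connectionFactory);
    }

    public void removeExchange(String exchangeName) {
        amqpAdmin.deleteExchange(exchangeName);  // delete the exchange on the broker
    }
}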
I have a question about how to conceptually create an Observer and link it to another class: I currently have a class called Simulation that is supposed to create TransactionCreated objects and publish them as events. Another class called TransactionReceiver is supposed to be an Observer of every event that is published by the Simulation class and work with them.
The main method is included in the Simulation class and starts by creating an event in a static context and publishing it, which works. My question is how I am supposed to connect the TransactionReceiver as an Observer and let it subscribe to those events by receiving them in a method and working with the received objects. Do I need to create another class that would include the main method and create a Simulation and a TransactionReceiver object that are then linked together as Observable and Observer? What would that look like?
And if I were to extend that system with several different classes, would they all have to be linked together through one class that connects Observers and Observables?
Your app should only have one main method.
Conceptually, this should be where you do the initial setup of Simulation and TransactionReceiver, so perhaps you could move it to a separate class to help you visualise how things should work. You could try something like below:
class Application {

    private Simulation simulation;
    private TransactionReceiver transactionReceiver;

    public Application() {
        simulation = new Simulation(/* params here */);
        transactionReceiver = new TransactionReceiver(/* params here */);
    }

    public void go() {
        simulation.simulate().subscribe(transactionCreated -> transactionReceiver.doSomething(transactionCreated));
    }

    public static void main(String[] args) {
        Application application = new Application();
        application.go();
    }
}
Eventually as you get more fluent you could think about adding a dependency-injection framework like Guice or Dagger.
This will help you with managing the dependencies of the classes that you need to use throughout your application.
So you would end up with a simpler Application - it would just set up the DI framework, and then you can use the classes however you want.
UPDATE:
If you want to communicate between two different classes, you will need to use methods:
class Simulation {
    public Observable<TransactionCreated> simulate() {
        // use PublishSubject or whatever
    }
}
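A minimal sketch of what that could look like with RxJava 2, assuming a PublishSubject as the backing stream; the emit() method is an assumed trigger that the simulation would call whenever it produces a transaction:

import io.reactivex.Observable;
import io.reactivex.subjects.PublishSubject;

class Simulation {

    // The simulation pushes events into this subject; observers subscribe to it.
    private final PublishSubject<TransactionCreated> subject = PublishSubject.create();

    // Expose the event stream; Application.go() subscribes to this.
    public Observable<TransactionCreated> simulate() {
        return subject;
    }

    // Call this whenever the simulation produces a new transaction.
    public void emit(TransactionCreated transaction) {
        subject.onNext(transaction);
    }
}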
I'm working on a Spring application that downloads data from different APIs. For that purpose I need a Fetcher class that interacts with an API to fetch the needed data. One of the requirements of this class is that it has to have a method to start the fetching and a method to stop it. Also, it must download everything asynchronously, because users must be able to interact with a dashboard while data is being fetched.
What is the best way to accomplish this? I've been reading about task executors and the different Spring annotations to schedule tasks and execute them asynchronously, but these solutions don't seem to solve my problem.
Asynchronous task execution is what you're after, and since Spring 3.0 you can achieve this using annotations too, directly on the method you want to run asynchronously.
There are two ways of implementing this, depending on whether you are interested in getting a result from the async process:
@Async
public Future<ReturnPOJO> asyncTaskWithReturn() {
    //..
    return new AsyncResult<ReturnPOJO>(yourReturnPOJOInstance);
}
or not:
@Async
public void asyncTaskNoReturn() {
    //..
}
In the former method, the result of your computation, conveyed by the yourReturnPOJOInstance object, is stored in an instance of org.springframework.scheduling.annotation.AsyncResult<V>, which in turn implements java.util.concurrent.Future<V>; the caller can use it to retrieve the result of the computation later on.
To activate the above functionality in Spring you have to add in your XML config file:
<task:annotation-driven />
along with the needed task namespace.
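If you use Java configuration instead of XML, a minimal equivalent sketch is to put @EnableAsync on a configuration class:

import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;

@Configuration
@EnableAsync  // enables detection of @Async methods on Spring beans
public class AsyncConfig {
}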
The simplest way to do this is to use the Thread class. You supply a Runnable object that performs the fetching functionality in the run() method and when the Thread is started, it invokes the run method in a separate thread of execution.
So something like this:
public class Fetcher implements Runnable {
    public void run() {
        //do fetching stuff
    }
}

//in your code
Thread fetchThread = new Thread(new Fetcher());
fetchThread.start();
Now, if you want to be able to cancel, you can do that a couple of ways. The easiest (albeit most violent and least advisable) way to do it is to interrupt the thread:
fetchThread.interrupt();
The correct way to do it would be to implement logic in your Fetcher class that periodically checks a variable to see whether it should stop doing whatever it's doing or not.
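For example, a minimal sketch of such a cooperative stop flag, building on the Fetcher above (the field and method names are illustrative):

public class Fetcher implements Runnable {

    private volatile boolean running = true;  // volatile so the change is visible across threads

    public void stop() {
        running = false;                      // ask the fetch loop to finish
    }

    @Override
    public void run() {
        while (running) {
            // fetch the next chunk of data from the API
        }
        // release any resources here once the loop ends
    }
}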
Edit To your question about getting Spring to run it automatically: if you want it to run periodically, you'll need to use a scheduling framework like Quartz. However, if you just want it to run once, what you could do is use the @PostConstruct annotation. The method annotated with @PostConstruct will be executed after the bean is created. So you could do something like this:
@Service
public class Fetcher implements Runnable {

    public void run() {
        //do stuff
    }

    @PostConstruct
    public void goDoIt() {
        Thread trd = new Thread(this);
        trd.start();
    }
}
Edit 2 I actually didn't know about this, but check out the @Async discussion in the Spring documentation if you haven't already. Might also be what you want to do.
You might only need certain methods to run on a separate thread rather than the entire class. If so, the @Async annotation is simple and easy to use.
Simply add it to any method you want to run asynchronously; you can also use it on methods with return types thanks to Java's Future interface.
Check out this page: http://www.baeldung.com/spring-async
I have a singleton class in my Play app. This singleton class runs a long process that generates reports from the DB and consumes a huge amount of memory. When I run my application in dev mode, this singleton functionality executes several times. I want this functionality to run only once. What should I do for that?
My code is:
public class DataGridManagerImpl extends ComponentContainer implements DataGridManager {

    private static DataGridManager instance = null;

    private DataGridManagerImpl() {
        load();
    }

    @Override
    public void load() {
        //Myreports function
    }

    public static DataGridManager getInstance() {
        if (instance == null) {
            instance = new DataGridManagerImpl();
        }
        return instance;
    }
}
In my controller file, inside a template function:
DataGridManager dataGridMgr = DataGridManagerImpl.getInstance();
If I access the page, it executes the load reports function again.
Without code explaining how you created your class, it's hard to answer. From what I understand, what you want is to run a process only once.
Probably the best approach is to use a Scheduled Job. This will trigger the process at a certain time, and Play ensures that only one instance of this process is running at the same time, even if the schedule indicates another instance should run. Let's say you have a process scheduled every hour and the process takes 3 hours: the initial process will be the only one running for those 3 hours, until it finishes.
Now, I would assume you want your process to be recurring, as it generates reports. If not, if you only want to run it once, then you may want to use an asynchronous bootstrap job instead. This would run just once, at the start of the application.
EDIT on update: during development, the @OnApplicationStart job may execute several times, as Play may automatically reload the application when you make certain code changes. This is part of the dev process (in the same way that an @OnApplicationStart job won't start in Dev until the server gets a request).
As it's a job that you only want to run once, you may try to skip it in dev mode using the check:
if(Play.mode == Play.Mode.DEV)
If you need to run it at least once, add a dev-only url that you can access during dev to start the process.
Now, in your update you also mention that you are calling that code in a controller, and that every time the controller is accessed the method is called. That's expected. Singleton doesn't mean that it will run only once, but that there is only one object in the system. If your controller launches the calculation, that will happen every time you access the controller.
SECOND EDIT (on comments): Arasu, the other issue is that you are calling the method load() when you construct the object. A singleton doesn't guarantee that the object will only be constructed once. It guarantees that, once constructed, only one object will exist. But it may happen that the object is removed by GC; in that case, as per your code, if you construct it again then you'll call load() and redo the processing.
The best solution is not to call load() in the constructor, but to force the user (you) to call it after retrieving the instance. An alternative is to set a flag at the beginning of load() that detects whether the code has already run. Be aware that Play is stateless, so that flag will need to be stored in the database.
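A minimal sketch of that first suggestion, keeping the question's class names; the in-memory loaded flag is purely illustrative (as noted above, in a stateless Play app you would persist that state in the database):

public class DataGridManagerImpl implements DataGridManager {

    private static DataGridManagerImpl instance = null;
    private boolean loaded = false;           // illustrative in-memory guard

    private DataGridManagerImpl() {
        // no heavy work in the constructor
    }

    public static synchronized DataGridManagerImpl getInstance() {
        if (instance == null) {
            instance = new DataGridManagerImpl();
        }
        return instance;
    }

    public synchronized void loadOnce() {
        if (!loaded) {                        // only the first caller triggers the reports
            load();
            loaded = true;
        }
    }

    public void load() {
        // expensive report generation
    }
}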
The definition of a singleton is that it can run only once; it's practically the nature of the pattern. If you somehow manage to run it multiple times, you might have implementation errors in your singleton.
Recheck the singleton pattern in Wikipedia.
Edit:
This code makes it impossible to fetch more than one instance. How would you get more than one?
public class Singleton {

    private static Singleton _instance;

    private Singleton() { }

    public static synchronized Singleton getInstance() {
        if (null == _instance) {
            _instance = new Singleton();
        }
        return _instance;
    }
}
Or do you mean that you instantiate the Singleton class directly, instead of calling Singleton.getInstance()?
It is possible for a Singleton that does time-consuming processing to be called at the same time by two different threads. I think this is the situation here: the same Singleton object's method is called multiple times from the program.
I have run a little test with two threads calling the same Singleton object, and here is the result:
Thread[Thread 1,5,main] internal loop number = 0 Object = example.Singeton#164f1d0d
Thread[Thread 2,5,main] internal loop number = 0 Object = example.Singeton#164f1d0d
Thread[Thread 1,5,main] internal loop number = 1 Object = example.Singeton#164f1d0d
and here is the code.
package example;

public class Singeton {

    private static final Singeton INSTANCE = new Singeton();

    private Singeton() {}

    public static Singeton getInstance() {
        return INSTANCE;
    }

    public boolean doTimeConsumingThing() {
        for (int i = 0; i < 10000000; i++) {
            System.out.println(Thread.currentThread() + " internal loop number = " + i + " Object = " + toString());
        }
        return true;
    }
}
package example;

public class MulThread extends Thread {

    public MulThread(String name) {
        super(name);
    }

    @Override
    public void run() {
        while (true) {
            Singeton s = Singeton.getInstance();
            System.out.println("Thread " + getId());
            s.doTimeConsumingThing();
        }
    }

    public static void main(String[] args) {
        MulThread m1 = new MulThread("Thread 1");
        MulThread m2 = new MulThread("Thread 2");
        m1.start();
        m2.start();
    }
}
Please correct my notion above if I am wrong.
Hence what you need is a variable to keep track of the state of the time-consuming procedure (e.g. a boolean isRunning) or of how many times the procedure has been called.
You can also make the pertinent time-consuming method of the Singleton synchronized, so only one thread can access the method while it is running (in my example, if you make doTimeConsumingThing() synchronized, the second thread will block until the singleton's method called from the first thread has finished).
Hope it helps
I had the same problem in DEV mode, and what I did was create a module for the tasks I don't want to run at every @OnApplicationStart.
The trick is to launch those tasks in an overridden onLoad() method in the module:
public void onLoad() {
    // tasks to run one time only
}
The onLoad() method is called one time only, not each time the application is restarted.
I don't know if this will help, but here are some things to check:
The code in your question is not thread-safe. You're missing the synchronized keyword in getInstance. That could cause the constructor to be called more than once by different threads.
Could DataGridManagerImpl be getting loaded by different classloaders? That static instance variable isn't static for the whole JVM, just static for that class' classloader.
load is public. Could some other code be calling that method?
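On the first point, one way to sidestep the synchronization question entirely is the initialization-on-demand holder idiom, sketched below; it is lazy and thread-safe without an explicit synchronized block (it does not help with the classloader or public load() issues, though):

public class DataGridManagerImpl {

    private DataGridManagerImpl() {
        // runs exactly once per classloader
    }

    // The holder class is not initialized until getInstance() is first called,
    // and the JVM guarantees that this initialization happens safely, once.
    private static class Holder {
        static final DataGridManagerImpl INSTANCE = new DataGridManagerImpl();
    }

    public static DataGridManagerImpl getInstance() {
        return Holder.INSTANCE;
    }
}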
I have a J2EE application that receives messages (events) via a web service. The messages are of varying types (requiring different processing depending on type) and are sent in a specific sequence. I have identified a problem where some message types take longer to process than others. The result is that a message received second in a sequence may be processed before the first in the sequence. I have tried to address this problem by placing a synchronized block around the method that processes the messages. This seems to work, but I am not confident that this is the "correct" approach. Is there perhaps an alternative that may be more appropriate, or is this "acceptable"? I have included a small snippet of code below to try to explain more clearly. Any advice / guidance appreciated.
public class EventServiceImpl implements EventService {

    public String submit(String msg) {
        if (msg == null)
            return ("NAK");

        EventQueue.getInstance().submit(msg);
        return "ACK";
    }
}
public class EventQueue {

    private static EventQueue instance = null;
    private static int QUEUE_LENGTH = 10000;
    protected boolean done = false;

    BlockingQueue<String> myQueue = new LinkedBlockingQueue<String>(QUEUE_LENGTH);

    protected EventQueue() {
        new Thread(new Consumer(myQueue)).start();
    }

    public static EventQueue getInstance() {
        if (instance == null) {
            instance = new EventQueue();
        }
        return instance;
    }

    public void submit(String event) {
        try {
            myQueue.put(event);
        } catch (InterruptedException ex) {
        }
    }

    class Consumer implements Runnable {
        protected BlockingQueue<String> queue;

        Consumer(BlockingQueue<String> theQueue) { this.queue = theQueue; }

        public void run() {
            try {
                while (true) {
                    Object obj = queue.take();
                    process(obj);
                    if (done) {
                        return;
                    }
                }
            } catch (InterruptedException ex) {
            }
        }

        void process(Object obj) {
            Event event = new Event((String) obj);
            EventHandler handler = EventHandlerFactory.getInstance(event);
            handler.execute();
        }
    }

    // Close queue gracefully
    public void close() {
        this.done = true;
    }
}
I am not sure which framework (EJB(MDB)/JMS) you are working with. Generally, using synchronization inside a managed environment like that of EJB/JMS should be avoided (it's not good practice). One way to get around it is:
the client should wait for the acknowledgement from the server before it sends the next message.
This way your client itself will control the sequence of events.
Please note this won't work if there are multiple clients submitting messages.
EDIT:
You have a situation wherein the client of the web service sends messages in sequence without taking into account the message processing time. It simply dumps the messages one after another. This is a good case for a queue (first in, first out) based solution. I suggest the following ways to accomplish this:
Use JMS. This will have the additional overhead of adding a JMS provider and writing some plumbing code.
Use some multithreading pattern like Producer-Consumer, wherein your web service handler dumps the incoming message into a Queue and a single-threaded consumer consumes one message at a time. See this example using the java.util.concurrent package; a minimal sketch also follows at the end of this answer.
Use a database. Dump the incoming messages into a database and use a separate scheduler-based program to scan the database (based on sequence number) and process the messages accordingly.
The first and third solutions are very standard for these types of problems. The second approach would be quick and won't need any additional libraries in your code.
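A minimal sketch of option 2 using a single-threaded executor rather than a hand-rolled consumer loop (the SerialEventProcessor name is illustrative; Event, EventHandler and EventHandlerFactory are taken from the question). Because the executor has exactly one worker thread, tasks run strictly in submission order:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SerialEventProcessor {

    // One worker thread => messages are handled one at a time, in FIFO order.
    private final ExecutorService worker = Executors.newSingleThreadExecutor();

    public void submit(String msg) {
        worker.submit(() -> {
            Event event = new Event(msg);
            EventHandler handler = EventHandlerFactory.getInstance(event);
            handler.execute();
        });
    }

    public void shutdown() {
        worker.shutdown();  // stop accepting new work, let queued tasks finish
    }
}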
If the events are to be processed in a specific sequence, then why not try adding "eventID" and "orderID" fields to the messages? This way your EventServiceImpl class can sort, order, and then execute them in the proper order (regardless of the order in which they are created and/or delivered to the handler).
Synchronizing the handler.execute() block will not get the desired results, I expect. All the synchronized keyword does is prevent multiple threads from executing that block at the same time. It does nothing in the realm of properly ordering which thread goes next.
If the synchronized block does seem to make things work, then I assert you are getting very lucky in that the messages are being created, delivered and then acted upon in the proper order. In a multithreaded environment, this is not assured! I'd take steps to assure you are controlling this, rather than relying on good fortune.
Example:
Messages are created in the order 'client01-A', 'client01-C', 'client01-B', 'client01-D'.
Messages arrive at the handler in the order 'client01-D', 'client01-B', 'client01-A', 'client01-C'.
EventHandler can distinguish messages from one client to another and starts to cache 'client01' 's messages.
EventHandler receives the 'client01-A' message, knows it can process this, and does so.
EventHandler looks in the cache for message 'client01-B', finds it and processes it.
EventHandler cannot find 'client01-C' because it hasn't arrived yet.
EventHandler receives 'client01-C' and processes it.
EventHandler looks in the cache for 'client01-D', finds it, processes it, and considers the 'client01' interaction complete.
Something along these lines would assure proper processing and would promote good use of multiple threads.
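A rough sketch of that caching idea, assuming each message carries a numeric sequence number assigned by the sender (the InOrderDispatcher name, the Map-based cache and the long sequence type are all illustrative):

import java.util.HashMap;
import java.util.Map;

public class InOrderDispatcher {

    private long nextExpected = 0;                              // next sequence number to process
    private final Map<Long, String> pending = new HashMap<>();  // out-of-order messages, keyed by sequence

    // Called for every incoming message; 'seq' is the sender-assigned position in the sequence.
    public synchronized void onMessage(long seq, String msg) {
        pending.put(seq, msg);
        // Drain everything that is now contiguous with what has already been processed.
        String next;
        while ((next = pending.remove(nextExpected)) != null) {
            process(next);
            nextExpected++;
        }
    }

    private void process(String msg) {
        // hand the message to the real handler here
    }
}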