I want to create a server that handles socket connections from users, and inside the server I want to have one RabbitMQ connection per user connection. However, the examples on the RabbitMQ website only show "while" loops waiting for messages, which would force me to create a thread per connection just to process the messages from RabbitMQ.
Is there a way to do this in Java, using Spring or any other framework, where I just register a callback for RabbitMQ instead of using while loops?
I was using node.js, where this is pretty straightforward, and I would like to hear some proposals for Java.
You should take a look at Channel.basicConsume and the DefaultConsumer abstract class: https://www.rabbitmq.com/api-guide.html#consuming
Each message callback still needs a thread to run on, but you can use a thread pool so that threads are reused.
static final ExecutorService threadPool;
static {
    threadPool = Executors.newCachedThreadPool();
}
Now you need to create a consumer that will handle each delivery by creating a Runnable instance that will be passed to the thread pool to execute.
channel.basicConsume(queueName, false, new DefaultConsumer(channel) {
    @Override
    public void handleDelivery(String consumerTag, Envelope envelope, AMQP.BasicProperties properties, byte[] body) throws IOException {
        final byte[] msgBody = body; // a 'final' copy of the body that you can pass to the runnable
        final long msgTag = envelope.getDeliveryTag();
        Runnable runnable = new Runnable() {
            @Override
            public void run() {
                // handle the message here
                doStuff(msgBody);
                try {
                    channel.basicAck(msgTag, false);
                } catch (IOException e) {
                    // basicAck throws a checked IOException; log or handle it here
                }
            }
        };
        threadPool.submit(runnable);
    }
});
This shows how you can handle concurrent deliveries on a single connection and channel without a while loop in a single thread blocking on each delivery. For your sanity, you will probably want to factor the Runnable implementation into its own class that accepts the channel, msgBody, msgTag and any other data as constructor parameters, so that they are accessible when run() is called, as sketched below.
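A minimal sketch of such a class, assuming doStuff(byte[]) is your own processing logic (MessageTask is just an illustrative name, not part of the RabbitMQ API):

import java.io.IOException;
import com.rabbitmq.client.Channel;

public class MessageTask implements Runnable {

    private final Channel channel;
    private final byte[] msgBody;
    private final long msgTag;

    public MessageTask(Channel channel, byte[] msgBody, long msgTag) {
        this.channel = channel;
        this.msgBody = msgBody;
        this.msgTag = msgTag;
    }

    @Override
    public void run() {
        doStuff(msgBody);
        try {
            channel.basicAck(msgTag, false);
        } catch (IOException e) {
            // log or handle the failed ack
        }
    }

    private void doStuff(byte[] body) {
        // placeholder for your message processing
    }
}

Inside handleDelivery you would then simply call threadPool.submit(new MessageTask(channel, body, envelope.getDeliveryTag())).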
I'm using Hazelcast IMDG for my app, and I use queues for internal communication. I added an item listener to a queue and it works great: whenever the queue gets a message, the listener wakes up and the needed processing is done.
The problem is that it's single threaded. Sometimes a message takes 30 seconds to process, and the messages in the queue simply have to wait until the previous message is done processing. I'm told to use the Java executor service to have a pool of threads, with the item listener handing work to them, so that multiple messages can be processed at the same time.
Is there a better way to do it? Maybe configure some kind of MDB, or make the processing asynchronous so that my listener can process the messages faster.
@PostConstruct
public void init() {
    logger.info(LogFormatter.format(BG_GUID, "Starting up GridMapper Queue reader"));
    HazelcastInstance hazelcastInstance = dc.getInstance();
    queue = hazelcastInstance.getQueue(FactoryConstants.QUEUE_GRIDMAPPER);
    queue.addItemListener(new Listener(), true);
}

class Listener implements ItemListener<QueueMessage> {

    @Override
    public void itemAdded(ItemEvent<QueueMessage> item) {
        try {
            QueueMessage message = queue.take();
            processor.process(message.getJobId());
        } catch (Exception ex) {
            logger.error(LogFormatter.format(BG_GUID, ex));
        }
    }

    @Override
    public void itemRemoved(ItemEvent<QueueMessage> item) {
        logger.info("Item removed: " + item.getItem().getJobId());
    }
}
Hazelcast IQueue does not offer an asynchronous interface, and asynchronous access would not be faster anyway. An MDB requires JMS, which is pure overhead here.
What you really need is a multithreaded executor. You can use the default executor:
private final ExecutorService execService = ForkJoinPool.commonPool();
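A sketch of how the listener could hand processing off to that pool, reusing the queue, processor and logger fields from the question (poll() is used instead of take() so the listener callback never blocks):

class Listener implements ItemListener<QueueMessage> {

    @Override
    public void itemAdded(ItemEvent<QueueMessage> item) {
        QueueMessage message = queue.poll();
        if (message != null) {
            execService.submit(() -> {
                try {
                    processor.process(message.getJobId());
                } catch (Exception ex) {
                    logger.error(LogFormatter.format(BG_GUID, ex));
                }
            });
        }
    }

    @Override
    public void itemRemoved(ItemEvent<QueueMessage> item) {
        logger.info("Item removed: " + item.getItem().getJobId());
    }
}

This way the listener returns immediately, and a slow message no longer holds up the ones behind it.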
I have a Kafka producer which sends messages to Kafka, and I log each message to the database in both onSuccess and onFailure with the help of a stored procedure. As shown in the code, I am sending asynchronously.
Should I mark my callStoredProcedure method in the repository as synchronized to avoid deadlocks? I believe synchronized is not needed, as the callbacks will be executed sequentially in a single thread.
From the link below:
https://kafka.apache.org/10/javadoc/org/apache/kafka/clients/producer/KafkaProducer.html
Note that callbacks will generally execute in the I/O thread of the producer and so should be reasonably fast or they will delay the sending of messages from other threads. If you want to execute blocking or computationally expensive callbacks it is recommended to use your own Executor in the callback body to parallelize processing.
Should I execute the callbacks in another thread?
And can you share a code snippet showing how to execute the callback in another thread, e.g. how to parallelize the callbacks across 3 threads?
My code snippet
@Autowired
private Myrepository myrepository;

public void sendMessageToKafka(List<String> message) {

    for (String s : message) {
        future = kafkaTemplate.send(topicName, s);

        future.addCallback(new ListenableFutureCallback<SendResult<String, String>>() {

            @Override
            public void onSuccess(SendResult<String, String> result) {
                System.out.println("Message Sent " + result.getRecordMetadata().timestamp());
                myrepository.callStoredProcedure(result, "SUCCESS");
            }

            @Override
            public void onFailure(Throwable ex) {
                System.out.println(" sending failed ");
                myrepository.callStoredProcedure(result, "FAILED");
            }
        });
    }
}
private final ExecutorService exec = Executors.newSingleThreadExecutor();
...
this.exec.submit(() -> myrepository.callStoredProcedure(result,"SUCCESS"));
The tasks will still be run on a single thread (but not the Kafka IO thread).
If it can't keep up with your publishing rate, you might need to use a different executor such as a cached thread pool executor or Spring's ThreadPoolTaskExecutor.
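Since the question asks about parallelizing the callbacks across 3 threads, a minimal variation of the snippet above would be to size the pool explicitly (whether 3 is the right number depends on how fast the stored procedure is):

private final ExecutorService exec = Executors.newFixedThreadPool(3);
...
// inside onSuccess (and similarly onFailure), hand the database call to the pool
this.exec.submit(() -> myrepository.callStoredProcedure(result, "SUCCESS"));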
I have a Spring Boot application and I want to send email with JavaMail using SES on AWS. But while an email is being sent, no other processing is executed.
I want to send the email through a thread, but even though I've implemented a thread as shown below, the email sending is still not asynchronous.
When I make the request that sends the email and then call the endpoint that lists all users to see how the processing goes, the list request is not executed until the email sending has finished.
@GetMapping
public ResponseEntity<?> listarUsuarios() {
    System.out.println("--------begin send mail------------");
    new SendMail(emailService).run();
    System.out.println("--------finish send mail------------");
    List<Usuario> usuariosList = usuarioRepository.findAll(); // <- this is not executed until the email has been sent
    return new ResponseEntity<>(usuariosList, HttpStatus.OK);
}
public class SendMail extends Thread {

    public EmailService emailService;

    public SendMail(EmailService emailService) {
        this.emailService = emailService;
    }

    public void run() {
        try {
            emailService.EnviarEmailDeConfirmacao("daviresio@gmail.com", 1, "Subject test mail", "body test mail");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
You are not starting a new thread. Instead, you are calling the run() method directly:
new SendMail(emailService).run();
Call start() instead to start a new thread:
new SendMail(emailService).start();
By the way, starting new threads like this from a web application is bad practice. It's better to use, for example, an ExecutorService to manage the threads that send e-mails, so that you don't end up with a potentially unlimited number of threads when many users call this functionality at the same time.
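A sketch of how the controller from the question could hand the email off to a small pool instead of starting threads directly (the pool size of 2 is arbitrary, and in a real application the executor would usually live in a dedicated service rather than in the controller):

private final ExecutorService mailExecutor = Executors.newFixedThreadPool(2);

@GetMapping
public ResponseEntity<?> listarUsuarios() {
    mailExecutor.submit(() -> {
        try {
            emailService.EnviarEmailDeConfirmacao("daviresio@gmail.com", 1, "Subject test mail", "body test mail");
        } catch (Exception e) {
            e.printStackTrace();
        }
    });
    List<Usuario> usuariosList = usuarioRepository.findAll(); // runs immediately, no longer blocked by the email
    return new ResponseEntity<>(usuariosList, HttpStatus.OK);
}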
You should use the start() method to spawn a new thread. If you call run() directly, it runs in the same thread. See https://docs.oracle.com/javase/tutorial/essential/concurrency/runthread.html
Use start() instead of run().
run() will execute it on the existing thread.
start() will execute it on a new thread.
So change your code to the following if you want it to execute asynchronous:
new SendMail(emailService).start();
new SendMail(emailService).start(); - starts a new Thread and executes SendMail.run() in that new thread.
new SendMail(emailService).run(); - is just a method call, executed in the same thread.
I am familiar with Netty basics and have used it to build a typical application server running on TCP, designed to serve many clients/connections. However, I recently got a requirement to build a server designed to handle a handful of clients, or only one client most of the time. But the client is the gateway to many devices and therefore generates substantial traffic to the server I am trying to design.
My questions are:
Is it possible / recommended at all to use Netty for this use case? I have seen the discussion here.
Is it possible to use a multithreaded EventExecutor for the channel handlers in the pipeline, so that instead of the channel EventLoop, concurrency is achieved by the EventExecutor thread pool? Will it ensure that one message from the client is handled by one thread through all handlers, while the next message is handled by another thread?
Is there any example implementation available?
According to the documentation of io.netty.channel.oio, you can use it if you don't have lots of clients. In that case, every connection is handled in a separate thread and uses old blocking Java I/O under the hood. Take a look at OioByteStreamChannel::activate:
/**
 * Activate this instance. After this call {@link #isActive()} will return {@code true}.
 */
protected final void activate(InputStream is, OutputStream os) {
    if (this.is != null) {
        throw new IllegalStateException("input was set already");
    }
    if (this.os != null) {
        throw new IllegalStateException("output was set already");
    }
    if (is == null) {
        throw new NullPointerException("is");
    }
    if (os == null) {
        throw new NullPointerException("os");
    }
    this.is = is;
    this.os = os;
}
As you can see, the oio Streams will be used there.
According to your comment: you can specify an EventExecutorGroup while adding a handler to the pipeline, like this:
final EventExecutorGroup handlerGroup = new DefaultEventExecutorGroup(16); // thread count is up to you

new ChannelInitializer<Channel>() {
    @Override
    public void initChannel(Channel ch) {
        ch.pipeline().addLast(handlerGroup, new YourHandler());
    }
};
Let's take a look at the AbstractChannelHandlerContext:
@Override
public EventExecutor executor() {
    if (executor == null) {
        return channel().eventLoop();
    } else {
        return executor;
    }
}
Here we see that if you don't register your EventExecutor it will use the child event group you specified while creating the ServerBootstrap.
new ServerBootstrap()
        .group(new OioEventLoopGroup(),  // acceptor group
               new OioEventLoopGroup()); // child group
Here is how reading from a channel is invoked, in AbstractChannelHandlerContext::invokeChannelRead:
static void invokeChannelRead(final AbstractChannelHandlerContext next, Object msg) {
    final Object m = next.pipeline.touch(ObjectUtil.checkNotNull(msg, "msg"), next);
    EventExecutor executor = next.executor();
    if (executor.inEventLoop()) {
        next.invokeChannelRead(m);
    } else {
        executor.execute(new Runnable() { // invoked by the EventExecutor you specified
            @Override
            public void run() {
                next.invokeChannelRead(m);
            }
        });
    }
}
Even for a few connections I would go with NioEventLoopGroup.
Regarding your question:
Is it possible to use multithreaded EventExecutor to the channel handlers in the pipeline so that instead of channel EventLoop, the concurrency is achieved by the EventExecutor thread pool? Will it ensure that one message from the client will be handled by one thread through all handlers, while the next message by another thread?
Netty's Channel guarantees that all processing of an inbound or outbound message happens in the same thread, so you don't have to hack together an EventExecutor of your own to get this. If serving inbound messages doesn't require long-running processing, your code will look like basic usage of ServerBootstrap. You might find it useful to tune the number of threads in the pool.
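For reference, a minimal sketch of sizing the NIO event loop groups on a ServerBootstrap (the worker thread count of 4 is only illustrative; if you don't pass a number, Netty defaults to twice the number of available cores):

EventLoopGroup bossGroup = new NioEventLoopGroup(1);   // accepts connections
EventLoopGroup workerGroup = new NioEventLoopGroup(4); // handles channel I/O and your handlers

ServerBootstrap bootstrap = new ServerBootstrap()
        .group(bossGroup, workerGroup)
        .channel(NioServerSocketChannel.class)
        .childHandler(new ChannelInitializer<SocketChannel>() {
            @Override
            public void initChannel(SocketChannel ch) {
                ch.pipeline().addLast(new YourHandler()); // YourHandler as in the snippet above
            }
        });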
I would like to have an application which either loads or saves data through an HTTP request; however, the data must interact with the UI thread. Ideally, I would like a single thread to use an IF statement on a message to determine whether the request is to "load" or "save".
What would be the simplest way of doing this with the smallest amount of code?
Also, do instances of Handler run on individual threads?
EDIT: This is the code I am using now:
Handler doStuff = new Handler() {
    @Override
    public void handleMessage(Message msg) {
        if (msg.what == 1) {
            // Load all the information.
            // Get the ID from sharedPrefs
            SharedPreferences patDetails = getSharedPreferences("details", 0);
            String ID = patDetails.getString("id", "error");
            // Load up the ID from HTTP
            String patInfo = httpInc.getURLContent("info.php?no=" + AES.encrypt("387gk3hjbo8sgslksjho87s", ID));
            // Separate all the details
            patientInfo = patInfo.split("~");
        }
        if (msg.what == 2) {
            // Save the data
        }
    }
};
Eclipse halts debugging and displays "Source not found" for StrictMode.class.
I suppose that's because the internet is being accessed on the main thread, even though I expected it to run in a separate thread.
Any ideas?
A Handler runs on the thread whose Looper it is bound to - by default the thread that created it - so a Handler created on the main thread still does its work on the main thread. Check that link. You should also check out AsyncTask.
I would propose submitting the jobs as Runnable to a single-threaded ExecutorService:
public class SomeClass {
private ExecutorService execService = Executors.newSingleThreadExecutor();
public void doSomething() {
final String someUiData = // retrieve data from UI
execService.submit(new Runnable() {
#Override
public void run() {
// so something time-consuming, which will be executed asynchronously from the UI thread
// you can also access someUiData here...
}
});
}
}
This way, the UI thread will not block, you can easily submit a different Runnable for each operation, and the ExecutorService completely takes care of keeping it asynchronous.
Edit: If you need to interact with the UI, do so before becoming asynchronous and keep the result in final variables.
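If you also need to push a result back to the UI once the background work is done, one common approach (a sketch only; loadFromServer() and the TextView parameter are placeholders for your own HTTP call and view) is to post back through a Handler bound to the main Looper:

private final Handler uiHandler = new Handler(Looper.getMainLooper());

public void loadData(final TextView target) {
    execService.submit(new Runnable() {
        @Override
        public void run() {
            final String result = loadFromServer(); // placeholder for the HTTP/load call
            uiHandler.post(new Runnable() {
                @Override
                public void run() {
                    target.setText(result); // back on the UI thread
                }
            });
        }
    });
}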