REST based message queue for microservices [closed] - java

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I've been given a task to implement a message queue to publish and consume messages, but my requirement is that I need to interact with the queue using a REST API. For example, ActiveMQ has a REST API, but the problem with ActiveMQ is that when implementing a consumer there is no way to keep waiting for the queue to deliver a message; we can't listen to the queue using a REST client.
So I'm leaving my problem to you to suggest a better alternative.
NOTE - the solution should use only open source products.

The issue you are describing is the fundamental difference between messaging (stateful connections) and HTTP-based services (stateless). A stateful consumer can process messages as they arrive, because the broker knows the connection is active. This is also known as "push" semantics. HTTP-based services are "pull". WebSockets provide a level of "push" available to web browsers, but in the end you are really just doing STOMP or MQTT over WebSockets.
If you are building a web application, look to WebSockets. If it is a backend application, use JMS over OpenWire.
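The push-vs-pull distinction can be sketched in plain Java with a BlockingQueue standing in for the broker: a stateful consumer blocks until a message is delivered, while a REST-style client can only poll and may come up empty. The class and message names are illustrative only:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class PushVsPull {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();

        // "Pull": a REST-style consumer polls and may find nothing.
        String polled = queue.poll();
        System.out.println("poll before publish: " + polled); // null

        // A publisher thread delivers a message shortly after.
        new Thread(() -> {
            try {
                TimeUnit.MILLISECONDS.sleep(100);
                queue.put("order-created");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }).start();

        // "Push": a stateful consumer blocks until the broker delivers.
        String pushed = queue.take();
        System.out.println("take after publish: " + pushed);
    }
}
```

This is why the REST consumer in the question cannot "keep waiting": HTTP gives it only the poll-shaped call, not the blocking take.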

Related

How to make proper server connection without storing client URLs on the server [closed]

Closed 3 months ago.
I am implementing a rather simple communicator: one broadcast chat and multiple private channels for messaging between users/clients. The purpose of this app is educational, and I want to use solutions that are (preferably :)) simple yet modern, flexible, and used in real life. I started the implementation using a RestController and stored clients' URLs in the database, but I quickly understood that this is not good practice. I want to ask for links and resources that may address the following questions:
How do I make a flexible and secure connection between server and client without storing the latter's URL?
What is the correct protocol for information exchange in this kind of application?
One way to make a flexible and secure connection between a server and its clients without storing their URLs is to use WebSockets. WebSocket is a protocol that allows full-duplex communication between a server and its clients: both sides can send and receive messages at any time, and the connection remains open until one of the parties closes it.
Using WebSockets, the server can send messages to specific clients or broadcast messages to all connected clients. This allows for both private messaging and broadcast chat functionality in your application.
You can use a WebSockets library such as Socket.io for Java (https://github.com/socketio/socket.io-client-java). This library makes it easy to set up WebSockets on the server and handle incoming and outgoing messages from clients.
Some resources that may be helpful for learning more about WebSockets and implementing them:
The WebSockets API: https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API
Socket.io: https://socket.io/
A tutorial on using Socket.io for real-time communication: https://socket.io/get-started/chat/
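Independent of the transport, the routing the answer describes (broadcast to every connected client, or deliver to one client privately) can be sketched in plain Java. This is a minimal in-memory hub, not a WebSocket server: in a real application each Consumer would wrap a client session's send method, and all names here are illustrative:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Consumer;

// Minimal chat hub: in a WebSocket server, Consumer<String> would wrap
// the per-client session's text-send operation.
public class ChatHub {
    private final Map<String, Consumer<String>> clients = new ConcurrentHashMap<>();

    public void join(String userId, Consumer<String> outbound) {
        clients.put(userId, outbound);
    }

    public void leave(String userId) {
        clients.remove(userId);
    }

    // Broadcast chat: every connected client receives the message.
    public void broadcast(String from, String text) {
        clients.forEach((id, out) -> out.accept(from + ": " + text));
    }

    // Private messaging: only the addressed client receives it.
    public void whisper(String from, String to, String text) {
        Consumer<String> out = clients.get(to);
        if (out != null) {
            out.accept("(private) " + from + ": " + text);
        }
    }
}
```

The server-side state is keyed by user ID rather than by client URL, which is the property the question was after: clients connect in, so the server never needs to know how to reach them.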

Spring Cloud Stream: Republish to other amqp connection if current connection throws exception [closed]

Closed 4 years ago.
I have 2 identical RabbitMQ servers, and I want to publish each event to one of them (processed exactly once), failing over to the other in case of publishing failure.
Spring Cloud Stream has one binder for each server.
The MQ servers have the same exchanges and durable queues configured, but the queues don't have an HA policy.
The questions are:
What is the best design to do this? Preferably by making changes in high level Cloud Stream configuration, not diving into Spring AMQP.
How can I hook into asynchronous publishing and its result? Preferably not by making it synchronous. Override some bean?
Can RabbitMQ HA help in any way? As I understand it, a durable queue is present only on a single node to preserve the order of messages (I don't actually need the order). So if I configure HA and the node with the durable HA queue fails/stops, will processing and publishing crash?
See the boot documentation:
spring.rabbitmq.addresses= # Comma-separated list of addresses to which the client should connect. The connection factory will automatically fail over.
If you set the producer errorChannelEnabled you will get returned messages in the error channel; this needs a connection factory configured for returns. There is no current mechanism to get async positive acks, unless you use Spring AMQP directly to publish messages.
With HA, a new master node is selected for the queue(s) hosted by the failed node. While the queue is hosted on one node, it is copied to the others.
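The failover idea itself (publish to the first broker, republish to the next only if the first throws) can be sketched independently of Spring AMQP. Publisher below is a hypothetical stand-in for whatever client you use; in real code it might wrap a RabbitTemplate bound to one server:

```java
import java.util.List;

public class FailoverPublisher {
    // Hypothetical stand-in for a broker client bound to one server.
    public interface Publisher {
        void publish(String message) throws Exception;
    }

    private final List<Publisher> brokers;

    public FailoverPublisher(List<Publisher> brokers) {
        this.brokers = brokers;
    }

    // Try each broker in order; the message is sent at most once,
    // to the first broker that accepts it (the "exactly once" intent).
    public void publish(String message) {
        Exception last = null;
        for (Publisher broker : brokers) {
            try {
                broker.publish(message);
                return; // success: do not republish elsewhere
            } catch (Exception e) {
                last = e; // fall through to the next broker
            }
        }
        throw new IllegalStateException("all brokers failed", last);
    }
}
```

Note this only covers failures that surface as a synchronous exception on publish; as the answer says, catching asynchronous negative outcomes needs returned messages via the error channel or Spring AMQP directly.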

Integrate existing application interface with Kafka [closed]

Closed 5 years ago.
I would like to integrate an existing application with Kafka.
The application is not under my control so I am not able to change the way it communicates. Application sends JSON request to REST API backend.
How can I put Kafka between application and its backend without changing the code?
Simply inserting Kafka between two existing applications is not necessarily a good idea, especially if they won't be taking advantage of it (scale, for example). As you didn't describe your use case, I can't tell.
That said if it's what you want to do, you can use Kafka Connect to integrate existing applications with Kafka. You should be able to build:
a Source connector: to receive JSON requests from the app and insert them as records into Kafka
a Sink connector: to extract records from Kafka and send them to the backend
See the Connect docs: http://kafka.apache.org/documentation/#connect
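Since the application can't be changed, whatever sits in front of Kafka has to speak the backend's REST contract. One way to picture that boundary is a small HTTP shim that accepts the application's JSON POSTs and hands each body to a producer callback. This is a hypothetical sketch using only the JDK's built-in HttpServer; the send callback is a stand-in for a real Kafka producer call, and the /ingest path is invented:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

public class JsonToKafkaShim {
    // Starts an HTTP endpoint that forwards each JSON request body to `send`.
    // In real use, `send` would hand the payload to a Kafka producer.
    public static HttpServer start(int port, Consumer<String> send) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/ingest", exchange -> {
            byte[] body = exchange.getRequestBody().readAllBytes();
            send.accept(new String(body, StandardCharsets.UTF_8));
            // 202 Accepted: the record is queued for Kafka, not yet processed.
            exchange.sendResponseHeaders(202, -1);
            exchange.close();
        });
        server.start();
        return server;
    }
}
```

A sink consumer on the other side would then replay the records against the original backend, which is exactly the source/sink split Kafka Connect formalizes.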

best design pattern for handling multiple incoming and outgoing connections [closed]

Closed 7 years ago.
I have a scenario where I am dealing with multiple incoming and outgoing connections. Which design pattern in Java would be suitable for such a scenario?
I have multiple incoming connections (FTP, SFTP, HTTP, database) and multiple outgoing connections (also FTP, SFTP, HTTP, database). I am new to design patterns; I just want to know which design pattern best fits my case.
I strongly recommend Half-Sync/Half-Async (http://www.cs.wustl.edu/~schmidt/PDF/PLoP-95.pdf) as a general way to deal with the complexity of having (possibly) blocking communication create asynchronous tasks that need to be executed in order to give a result back to the caller.
It is a very general design pattern, so it certainly fits the several client-server protocols you cited.
An ESB, suggested in another answer, is not adequate for what you are looking for, since it is based on a model in which you have several processes all connected to a message bus. All those processes exchange messages, and they are all typically connected to one or more message queues or topics. Think of it as the postal service: all houses (processes) have the same role, and all of them talk to the postal service in order to exchange messages.
In your problem, you have two distinct roles: a client role and a server role. Your problem seems to be how to organize the server internally, not how to coordinate servers or equal peers.
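A compressed sketch of the Half-Sync/Half-Async idea in plain Java, where the ExecutorService plays the role of both the queueing layer and the asynchronous layer, and the blocking get() is the synchronous half. The protocol handling is simulated; all names are illustrative:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

// Half-Sync/Half-Async sketch: callers block synchronously for results
// while the actual work runs asynchronously on a pooled thread, with a
// work queue (inside the ExecutorService) decoupling the two layers.
public class HalfSyncHalfAsync {
    private final ExecutorService asyncLayer = Executors.newFixedThreadPool(4);

    // Async half: hand the task to the queue + thread pool.
    public Future<String> submit(String protocol, String payload) {
        return asyncLayer.submit(() -> handle(protocol, payload));
    }

    // Stand-in for protocol-specific work (FTP, SFTP, HTTP, database...).
    private String handle(String protocol, String payload) {
        return protocol + " handled: " + payload;
    }

    // Sync half: a caller thread blocks until its result is ready.
    public String call(String protocol, String payload) throws Exception {
        return submit(protocol, payload).get(5, TimeUnit.SECONDS);
    }

    public void shutdown() {
        asyncLayer.shutdown();
    }
}
```

In the full pattern each protocol (FTP, SFTP, HTTP, database) would contribute its own handler behind a common interface, which is where the incoming/outgoing variety from the question plugs in.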

Restful services and messaging [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
We are planning to design a system where data comes in through web services and is processed asynchronously. I have been assigned to pick Java open source technologies to do this. For the web services we have decided to go with RESTful services. I have never worked with messaging technologies; can anyone please suggest the best open source technology available for processing data asynchronously?
Try Apache CXF - see the DOCS
It has everything you want, I guess.
Your use case is asynchronous processing of data. This typically happens in the following steps:
Receive the data and store it somewhere (in memory or a persistent location).
Return an Acknowledged/Received response immediately.
Either immediately start a thread to process the data, or let some scheduled thread scan the received data and process it.
Optionally send an acknowledgement to the sending application if such an interface is available.
There is no single standard library or framework in Java to do this. There are individual pieces, each known to solve a standard problem, and combining them is one option.
The producer-consumer pattern is a typical pattern that satisfies your need here.
You can build a producer-consumer pattern using Java's concurrent APIs (Here is an example)
This producer-consumer piece can be wrapped behind a servlet (or some other server-side class which handles requests).
Each incoming request will be put on the shared queue by the producer, and the call returns.
A consumer will pick it up from the queue and process it asynchronously.
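The steps above can be sketched with Java's concurrent APIs: the receive method plays the servlet's role (store, then acknowledge immediately), and a background consumer thread drains the queue. All names are illustrative:

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

// Producer-consumer sketch: the "servlet" enqueues each request and
// acknowledges immediately; a consumer thread processes it later.
public class AsyncIngest {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    private final List<String> processed = new CopyOnWriteArrayList<>();

    public AsyncIngest() {
        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    // Step 3: a looping consumer picks up and processes data.
                    processed.add("processed:" + queue.take());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // shutdown signal
            }
        });
        consumer.setDaemon(true);
        consumer.start();
    }

    // Steps 1-2: store the data and acknowledge immediately.
    public String receive(String payload) {
        queue.add(payload);
        return "ACCEPTED";
    }

    public List<String> processed() {
        return processed;
    }
}
```

The caller gets "ACCEPTED" back before processing happens, which is the whole point of the asynchronous design.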
Another option would be to use asynchronous processing in Servlet 3.0.
