I am trying to do event sourcing with Apache Camel.
For the message bus I am using ActiveMQ.
Use cases
I want to audit every message that is pushed to ActiveMQ, using MongoDB as persistent storage. I have tried mirrored queues in ActiveMQ, which push each message to a topic with the same name as the queue.
But I have to implement a worker-based (load-balancing) approach, and that is not possible with a topic, because every subscriber receives its own copy of each message.
So I planned to use ActiveMQ with Camel, using the wiretap pattern.
Desired outcome:
Can I pull the message from the wiretap destination and insert it into MongoDB, or is there a way for Camel to insert it into MongoDB directly?
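For illustration, a rough sketch of the kind of wiretap route I have in mind, assuming camel-activemq and camel-mongodb are on the classpath; the MongoClient bean name "myMongoClient", the queue name and the processing bean are placeholders:

import org.apache.camel.builder.RouteBuilder;

public class AuditRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Normal consumption of the queue; wireTap sends a copy of each
        // message down a separate route without affecting the main flow.
        from("activemq:queue:orders")
            .wireTap("direct:audit")
            .to("bean:orderService");

        // The tapped copy is written to MongoDB by the camel-mongodb component.
        from("direct:audit")
            .convertBodyTo(String.class)
            .to("mongodb:myMongoClient?database=audit&collection=messages&operation=insert");
    }
}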
One possible way to tackle this on the broker side is with composite destinations. You can instruct the broker to forward messages sent to a queue on to another queue. Some care needs to be taken when doing this: by default the forwarding only happens while the target queue exists (static configuration of destinations gets around this), there is an option to always forward, and you can also apply selectors to reduce what gets sent. The thing to keep in mind is that unless something periodically purges the audit queue, you will eventually run out of space.
You can configure the forwarding as follows:
<compositeQueue name="myQueue" forwardOnly="false">
  <forwardTo>
    <queue physicalName="myAuditQueue" />
  </forwardTo>
</compositeQueue>
Related
I have an application that uses Java on the backend and Angular on the frontend, and I'm trying to use STOMP messaging between the two to exchange state data.
What I would like to do is have my services, on startup, publish their states and have that data stay in the queue for any client that later connects to the server.
(edit)
For clarification, I don't mean that I want messages to survive a server reboot. What I want is for certain message queues to retain all messages until the server reboots.
How do I tell Spring Boot's STOMP implementation to not delete the contents of a /queue?
You can configure ActiveMQ Artemis as an "external broker" and use a "non-destructive" queue. When a STOMP client receives and acknowledges a message from a non-destructive queue the broker will not remove it. You can define a special "initialization" queue which all clients connect to initially to receive the state data which you care about and then they can connect to whatever other queues they need to complete their normal work.
In this kind of use-case the queue is typically configured as non-destructive and as a "last value" queue. This way each client can use its own "last value" and can keep their state data up-to-date without the complication of stale state data on the queue.
I realize your question was asking about how to do this with Spring's built-in broker, but all my research indicates that Spring's simple in-memory broker supports neither last-value queue semantics, nor non-destructive queue semantics, nor even persistent messages. From what I understand, Spring's broker is only meant for the most basic use-cases, which is why they enable integration with 3rd-party brokers that can support more advanced use-cases (e.g. like yours).
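For illustration, a minimal sketch of pointing Spring's STOMP support at an external broker through the STOMP broker relay; the host, port and endpoint path here are assumptions, and the external broker (e.g. ActiveMQ Artemis) is expected to accept STOMP connections on 61613:

import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.simp.config.MessageBrokerRegistry;
import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
import org.springframework.web.socket.config.annotation.StompEndpointRegistry;
import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer;

@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

    @Override
    public void configureMessageBroker(MessageBrokerRegistry registry) {
        // Hand /queue and /topic destinations to the external broker
        // instead of Spring's simple in-memory broker.
        registry.enableStompBrokerRelay("/queue", "/topic")
                .setRelayHost("localhost")
                .setRelayPort(61613);
        registry.setApplicationDestinationPrefixes("/app");
    }

    @Override
    public void registerStompEndpoints(StompEndpointRegistry registry) {
        registry.addEndpoint("/ws").withSockJS();
    }
}

The non-destructive / last-value behaviour itself would then be configured on the broker side, not in Spring.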
I found the following link for reading messages from a JMS queue, and it's working:
https://blogs.oracle.com/soaproactive/entry/jms_step_3_using_the
Now I want to read JMS queue statistics programmatically, such as the number of messages, the number of pending messages, and message in/out times. Is this possible in WebLogic, or does WebLogic provide an API for this purpose?
Please help.
Statistics are part of a message broker implementation and are thus vendor-specific. One popular implementation is ActiveMQ. It can be run in WebLogic Server or WebLogic Express.
Note: there are obviously many other JMS implementations around, and you should carefully evaluate for yourself which implementation suits your needs. Nevertheless, I shall use ActiveMQ as an example to point out the relevant features for your case:
Beginning with version 5.3, ActiveMQ ships with a statistics plugin that can be used to retrieve statistics from the broker or its destinations.
You should be able to actively poll statistics from within your code by sending messages to specific destinations within the broker; see the linked documentation for details.
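A rough sketch of that polling approach, assuming the statistics plugin is enabled in activemq.xml, the broker listens on tcp://localhost:61616, and MY.QUEUE is the destination of interest:

import java.util.Enumeration;
import javax.jms.*;
import org.apache.activemq.ActiveMQConnectionFactory;

public class QueueStatsPoller {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        try {
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

            // The plugin replies to messages sent to ActiveMQ.Statistics.Destination.<name>.
            Queue statsQueue = session.createQueue("ActiveMQ.Statistics.Destination.MY.QUEUE");
            TemporaryQueue replyTo = session.createTemporaryQueue();
            MessageConsumer consumer = session.createConsumer(replyTo);

            Message request = session.createMessage();
            request.setJMSReplyTo(replyTo);
            session.createProducer(statsQueue).send(request);

            // The reply is a MapMessage with entries such as size, enqueueCount, dequeueCount.
            MapMessage reply = (MapMessage) consumer.receive(5000);
            if (reply != null) {
                for (Enumeration<?> names = reply.getMapNames(); names.hasMoreElements();) {
                    String name = (String) names.nextElement();
                    System.out.println(name + " = " + reply.getObject(name));
                }
            }
        } finally {
            connection.close();
        }
    }
}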
Another feature of ActiveMQ is advisory messages. Enable them in your broker's configuration and they allow you to watch the system using regular JMS messages. In this way you can passively react to certain events in the messaging system, e.g. when a queue exceeds some threshold.
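A small sketch of listening to one such advisory topic; the broker URL and queue name are placeholders, and note that consumer advisories happen to be on by default while other advisory types need to be enabled per destination policy:

import javax.jms.Connection;
import javax.jms.Session;
import javax.jms.Topic;
import org.apache.activemq.ActiveMQConnectionFactory;

public class AdvisoryListener {
    public static void main(String[] args) throws Exception {
        Connection connection =
                new ActiveMQConnectionFactory("tcp://localhost:61616").createConnection();
        connection.start();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

        // Advisory topics follow the ActiveMQ.Advisory.* naming convention;
        // this one fires whenever a consumer starts or stops on MY.QUEUE.
        Topic advisories = session.createTopic("ActiveMQ.Advisory.Consumer.Queue.MY.QUEUE");
        session.createConsumer(advisories).setMessageListener(
                message -> System.out.println("Advisory received: " + message));

        Thread.sleep(60_000);   // keep the listener alive for the demo
        connection.close();
    }
}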
There is no API for statistics in the JMS spec. However, you can use JMX to monitor the statistics.
From the docs:
Monitoring JMS Servers
You can monitor statistics on active JMS servers defined in your domain via the Administration Console or through the JMSServerRuntimeMBean. JMS servers act as management containers for JMS queue and topic resources within JMS modules that are specifically targeted to JMS servers.
This post (new way) may be helpful.
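A hedged sketch of the JMX route; the host, port, credentials and object name pattern below are assumptions, the attribute names come from the JMSDestinationRuntimeMBean documentation, and the WebLogic client jars need to be on the classpath:

import java.util.Hashtable;
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;
import javax.naming.Context;

public class WebLogicQueueStats {
    public static void main(String[] args) throws Exception {
        // Connect to the WebLogic Runtime MBean Server.
        JMXServiceURL url = new JMXServiceURL("iiop", "localhost", 7001,
                "/jndi/weblogic.management.mbeanservers.runtime");
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.SECURITY_PRINCIPAL, "weblogic");
        env.put(Context.SECURITY_CREDENTIALS, "password");
        env.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES, "weblogic.management.remote");

        JMXConnector connector = JMXConnectorFactory.connect(url, env);
        try {
            MBeanServerConnection connection = connector.getMBeanServerConnection();
            // Each queue/topic is represented by a JMSDestinationRuntimeMBean.
            ObjectName pattern = new ObjectName("com.bea:Type=JMSDestinationRuntime,*");
            for (ObjectName destination : connection.queryNames(pattern, null)) {
                System.out.println(destination.getKeyProperty("Name")
                        + " current=" + connection.getAttribute(destination, "MessagesCurrentCount")
                        + " pending=" + connection.getAttribute(destination, "MessagesPendingCount")
                        + " received=" + connection.getAttribute(destination, "MessagesReceivedCount"));
            }
        } finally {
            connector.close();
        }
    }
}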
The JMS API doesn't provide such information. It is meant for sending and receiving messages, not for gathering statistics from the underlying middleware.
Check the native API of the underlying MQ which you use. For instance, IBM WebSphere MQ has such an API.
Is it possible to develop a bi-directional messaging system using Apache Kafka?
I need to subscribe to a topic from my consumer, and I also need to send messages from my consumer.
You could do it one of two ways: either set up a prefix system for the message keys, or put content inside the message that allows the consumer to avoid messages it has produced itself.
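A rough sketch of the key-prefix idea (the bootstrap address, topic name and "serviceB" identifier are placeholders):

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class BidirectionalClient {
    private static final String SELF = "serviceB";   // identifier for this process

    public static void main(String[] args) {
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", SELF);
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
             KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(Collections.singletonList("chatter"));

            // Outgoing messages are keyed with this process's prefix...
            producer.send(new ProducerRecord<>("chatter", SELF + ":greeting", "hello"));

            // ...so anything carrying our own prefix can simply be skipped on the way back in.
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                if (record.key() != null && record.key().startsWith(SELF + ":")) {
                    continue;   // ignore what we produced ourselves
                }
                System.out.println("received: " + record.value());
            }
        }
    }
}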
Now as to whether you should design it like this, that depends on your message traffic. If you're not slamming it with events, it might be better to consider something like Thrift as a way to have your message components do bidirectional communication. Where Kafka really excels relative to its complexity is when you need to produce and consume massive volumes of data. That might not be the case for you.
For example, one common use case with Kafka is to hook it up to a service like Storm, Apex or Samza for doing distributed processing of hundreds of GB or even TB of data. If your system has a high throughput requirement, that architecture would be a good one to consider as a starting point with Kafka for handling messages. With Storm, if you need to send messages back for reprocessing, you can always use the Kafka bolt to republish a message into Kafka to ensure it gets completely reprocessed.
I have Apache ActiveMQ embedded into my Java 8 server-side project. It's working fine, and I am able to send and consume messages from pre-configured queues. I now need to be able to programmatically remove messages from a queue upon request. After reading some docs I found that Apache ActiveMQ has a sub-project called Artemis that seems to provide the required functionality. But I am a bit confused about how to do it. Is Artemis a sort of plugin on top of ActiveMQ, where I just need to add the required dependencies and use its tools, or is it a separate product that doesn't work with ActiveMQ and stands on its own? If so, how do I manage individual messages (in particular, delete a requested message) in ActiveMQ?
First off, 'ActiveMQ Artemis' is a sub-project within the ActiveMQ project that represents an entirely new broker with a radically different underlying architecture than the main ActiveMQ broker. You would run one or the other.
To manage messages in the ActiveMQ broker you would use the JMX Management API and the Queue#remove methods it exposes to remove specific messages. This can be done using the message ID, or more broadly using a message selector to capture more than one message if need be. The JMX API is also exposed via Jolokia, so you can manage the broker via simple REST calls instead of JMX if you prefer.
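A minimal sketch of that JMX approach against a broker whose JMX connector is enabled; the JMX URL, broker name, queue name, message ID and selector below are placeholders:

import javax.management.MBeanServerConnection;
import javax.management.MBeanServerInvocationHandler;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;
import org.apache.activemq.broker.jmx.QueueViewMBean;

public class RemoveMessageExample {
    public static void main(String[] args) throws Exception {
        JMXServiceURL url = new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:1099/jmxrmi");
        JMXConnector connector = JMXConnectorFactory.connect(url);
        try {
            MBeanServerConnection connection = connector.getMBeanServerConnection();
            ObjectName queueName = new ObjectName(
                    "org.apache.activemq:type=Broker,brokerName=localhost,"
                            + "destinationType=Queue,destinationName=MY.QUEUE");
            QueueViewMBean queue = MBeanServerInvocationHandler.newProxyInstance(
                    connection, queueName, QueueViewMBean.class, true);

            // Remove a single message by its JMS message ID...
            queue.removeMessage("ID:example-host-12345-1-1:1:1:1:1");

            // ...or everything that matches a JMS selector.
            queue.removeMatchingMessages("orderId = 42");
        } finally {
            connector.close();
        }
    }
}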
In any case this sort of message level management on the broker is a bit of an anti-pattern in the messaging world. If you find yourself needing to treat the broker as a database then you should ask yourself why you aren't using a database since a broker is not a database. Often you will run into many more issues trying to manage your messages this way as opposed to just putting them into a database.
I have two instances of Tomcat running the same web application for redundancy.
These web applications consume some queues/topics from ActiveMQ using the Apache Camel library.
My issue is how to coordinate these two consumers so that only one consumer gets any particular message, i.e. ActiveMQ should deliver different messages to each node.
If you have two consumers subscribed to the same queue/topic, you can use a selector to make sure only one consumer gets a particular message. You can find some explanations here.
The Camel JMS component has a selector option that could be used.
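For example, a small sketch of a route using that option (the queue name, selector expression and target bean are placeholders):

import org.apache.camel.builder.RouteBuilder;

public class SelectiveConsumerRoute extends RouteBuilder {
    @Override
    public void configure() {
        // This node only consumes messages whose "node" header was set to
        // "node1" by the producer; the other instance would use node='node2'.
        from("activemq:queue:work?selector=node='node1'")
            .to("bean:workService");
    }
}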