How to group/batch events with an aggregator in Spring Integration - Java

I have a channel whose payload is different POJOs that implement an interface called Event.
public interface Event {
    String getEventType();
}
Events of many event types are added to the channel one by one via a gateway. I want to group the events by event type and call a service activator. The service has the following signature:
void processEventsInBatch(String eventType, List<Event> events);
It is important to receive multiple events of the same event type in the list so they can be processed as a batch, reducing the number of calls to external services.
How to achieve this with spring integration?

The aggregator in Spring Integration uses the correlation-id header (by default) to identify messages that belong to the same group. So the first step is to set eventType as the correlation-id header. Later we can receive this header as the eventType parameter in the service activator, since the correlation-id header is present on the groups created by the aggregator. This can be done with the following XML config:
<int:header-enricher>
    <int:correlation-id expression="payload.getEventType()"/>
</int:header-enricher>
Now the aggregator can be used as shown below.
<int:aggregator release-strategy-expression="size() >= 25"
        group-timeout="5000"
        expire-group-upon-completion="true"
        send-partial-results-on-expiry="true" />
The above aggregator will release a group when it has at least 25 events in one group or it has waited for 5 seconds. We can adjust the first two parameters to control how big we want the list to be and how much delay we are willing to introduce. The expire-group-upon-completion attribute is required so that the aggregator keeps creating new groups for the same correlation-id. And send-partial-results-on-expiry is required so that, if fewer than 25 events arrive within 5 seconds, the aggregator still sends a group with what it has.
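To receive the correlation-id as the eventType parameter, the service activator method can bind the header explicitly. Below is a minimal sketch, assuming annotation-based parameter mapping and made-up channel/bean names:

<int:service-activator input-channel="batchedEventsChannel"
        ref="eventBatchService" method="processEventsInBatch"/>

import java.util.List;
import org.springframework.messaging.handler.annotation.Header;

public class EventBatchService {

    // the aggregator's output payload is the List<Event> of one group;
    // the correlation-id header still carries the eventType used for grouping
    public void processEventsInBatch(@Header("correlationId") String eventType,
            List<Event> events) {
        // one call to the external service per batch
    }
}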

Related

Request level monitoring for hystrix stream

Does Hystrix support request-level monitoring?
For example, I have certain types of requests which I send to a single external API.
Request A : CITY REQUEST | Request B : COUNTRY REQUEST | Request C : GLOBAL REQUEST
I want to monitor these types of requests separately on my dashboard and I don't want to create separate services/methods and annotate them separately with different command keys.
Solved this. I realized that it currently can't be done with annotations. I created a HystrixCommand instead and set the command key conditionally based on the request type string.
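For reference, a minimal sketch of that approach, assuming a made-up requestType string ("CITY", "COUNTRY" or "GLOBAL") that distinguishes the request kinds:

import com.netflix.hystrix.HystrixCommand;
import com.netflix.hystrix.HystrixCommandGroupKey;
import com.netflix.hystrix.HystrixCommandKey;

public class ExternalApiCommand extends HystrixCommand<String> {

    private final String requestType;

    public ExternalApiCommand(String requestType) {
        // using the request type as the command key gives each kind of
        // request its own metrics stream on the dashboard
        super(Setter.withGroupKey(HystrixCommandGroupKey.Factory.asKey("ExternalApi"))
                .andCommandKey(HystrixCommandKey.Factory.asKey(requestType)));
        this.requestType = requestType;
    }

    @Override
    protected String run() throws Exception {
        return callExternalApi(requestType);
    }

    private String callExternalApi(String requestType) throws Exception {
        return "response"; // placeholder for the single external API call
    }
}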

CQRS events between multiple services

We have a microservice that listens for events; let's call it AuditService for now. It listens for audit events (AuditEvent) on RabbitMQ. Anyone who wants to call the AuditService needs to create and fire an AuditEvent. We have placed the AuditEvent POJO in a common module so it can be shared.
There is an event listener in the AuditService that listens for the AuditEvent on the RabbitMQ queue. When we get a message, we do some processing/validation on the AuditEvent and then save it to the AuditEntry database table.
We then want to publish another event; let's call it AuditPublishEvent. To do this we create another command (AuditPublishCommand), which in turn fires the AuditPublishEvent. This event again goes on the queue, and other services listen for it. There will be a service to send it as an email, another to send it as a push notification, etc.
At the moment what we are doing on the AuditService is
Listen for AuditEvent
|
v
Trigger AuditEvent event handler
|
v
Validate audit event and process it
|
v
Save it to the database
|
v
If save is successful then send AuditPublishEvent to queue via AuditPublishCommand
Note that the last part needs to be synchronous: if the DB save failed, we don't want to send an email or the like. This is currently done by calling the commandGateway from within the event handler in the AuditService. Is it correct to call the commandGateway from the EventListener, and if not, what is the alternative?
The question is: is this the correct way/best practice of doing things with the Axon framework and Spring?
Whether this is the best way to address the problem is hard to say, as it would require much more information about your domain.
What I can say is that what you're doing is technically all right. You mentioned you are unsure whether the event published after the AuditEvent is stored is only published once the database changes are committed. That depends on how the event is published. If you use the EventBus to publish it and use the SpringAMQPPublisher, you're safe.
If you publish it directly, this may not be the case.
Axon uses a UnitOfWork to coordinate activities in the different phases of processing. Handlers are called in the 'started' phase. A database commit is done in the phase after that: 'commit'.
If you want to be sure the message goes to AMQP only after the commit, register a handler for the afterCommit phase. Handlers for this phase are not invoked on a rollback.
You can add the UnitOfWork as a parameter to your @EventHandler annotated method. Axon will automatically inject it for you.
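A minimal sketch of that, assuming Axon 3's UnitOfWork API and a made-up AuditPublishCommand constructor:

import org.axonframework.commandhandling.gateway.CommandGateway;
import org.axonframework.eventhandling.EventHandler;
import org.axonframework.messaging.unitofwork.UnitOfWork;
import org.springframework.stereotype.Component;

@Component
public class AuditEventListener {

    private final CommandGateway commandGateway;

    public AuditEventListener(CommandGateway commandGateway) {
        this.commandGateway = commandGateway;
    }

    @EventHandler
    public void on(AuditEvent event, UnitOfWork<?> unitOfWork) {
        // validate the event and save the AuditEntry here ('started' phase)...

        // only fire the follow-up command once the database commit succeeded;
        // afterCommit handlers are not invoked on a rollback
        unitOfWork.afterCommit(u -> commandGateway.send(new AuditPublishCommand(event)));
    }
}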

Spring Cloud Stream with RabbitMQ binder, how to apply @Transactional?

I have a Spring Cloud Stream application that receives events from RabbitMQ using the Rabbit Binder. My application can be summarized as this:
@Transactional
@StreamListener(MySink.SINK_NAME)
public void processEvents(Flux<Event> events) {
    // Transform events and store them in MongoDB using
    // spring-boot-data-mongodb-reactive
    ...
}
The problem is that @Transactional doesn't seem to work with Spring Cloud Stream (or at least that's my impression), because if there's an exception while writing to MongoDB, the event appears to have already been acked to RabbitMQ and the operation is not retried.
Given that I want to achieve essentially the same behavior as putting @Transactional around a method with spring-amqp:
Do I have to manually ack the messages to RabbitMQ when using Spring Cloud Stream with the Rabbit binder?
If so, how can I achieve this?
There are several issues here.
Transactions are not required for acknowledging messages.
Reactor-based @StreamListener methods are invoked exactly once, just to set up the Flux, so @Transactional on that method is meaningless; messages then flow through the flux, so anything pertaining to an individual message has to be done within the context of the flux.
Spring transactions are bound to the thread, while Reactor is non-blocking; the message will be acked at the first handoff.
Yes, you would need to use manual acks, presumably based on the result of the MongoDB store operation. You would probably need to use Flux<Message<Event>> so you have access to the channel and delivery-tag headers.
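A rough sketch of what that could look like, assuming the binder's acknowledge mode is set to MANUAL for this consumer and a made-up storeInMongo reactive save:

import java.io.IOException;
import com.rabbitmq.client.Channel;
import org.springframework.amqp.support.AmqpHeaders;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.messaging.Message;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@StreamListener(MySink.SINK_NAME)
public void processEvents(Flux<Message<Event>> events) {
    events.concatMap(message ->
            storeInMongo(message.getPayload())     // hypothetical reactive save
                    .then(Mono.fromRunnable(() -> acknowledge(message, true)))
                    .onErrorResume(e -> Mono.fromRunnable(() -> acknowledge(message, false))))
            .subscribe();
}

private void acknowledge(Message<?> message, boolean success) {
    // the Rabbit binder exposes the channel and delivery tag as headers
    Channel channel = message.getHeaders().get(AmqpHeaders.CHANNEL, Channel.class);
    Long deliveryTag = message.getHeaders().get(AmqpHeaders.DELIVERY_TAG, Long.class);
    try {
        if (success) {
            channel.basicAck(deliveryTag, false);
        } else {
            channel.basicNack(deliveryTag, false, true); // requeue for a retry
        }
    } catch (IOException e) {
        throw new IllegalStateException(e);
    }
}

private Mono<Void> storeInMongo(Event event) {
    return Mono.empty(); // placeholder for the spring-data-mongodb-reactive call
}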

Spring integration dynamic message selector

I have an application with two queues; the first queue has control messages and the other has data messages. Based on the JMSCorrelationID of the message from the control queue, I need to read only messages with that JMSCorrelationID from the data queue.
I am able to selectively read messages from the data queue using a selector defined as below.
<int-jms:message-driven-channel-adapter id="messageDrivenInboundAdapter"
channel="inboundChannel" destination-name="inboundMQ"
selector="JMSCorrelationID = 'JMSCORELIS1234'"
connection-factory="connectionFactory" extract-payload="false"/>
I need to dynamically update the JMSCorrelationID value in the selector based on messages received on a different channel.
Is it possible to do that? Is there a different way to implement this solution in spring integration?
It's not possible with the message-driven adapter; the selector is baked into the message listener container, which is constructed during initialization.
You can change the message selector of a polled <inbound-channel-adapter/>; the change will take effect on the next poll.
You can get a handle to the JmsDestinationPollingSource by autowiring, or via the bean name (adapterId.source).
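A minimal sketch of that approach, with made-up bean/channel names; the control-message handler updates the selector via the polling source:

<int-jms:inbound-channel-adapter id="jmsInboundAdapter"
        channel="inboundChannel" destination-name="inboundMQ"
        connection-factory="connectionFactory">
    <int:poller fixed-delay="1000"/>
</int-jms:inbound-channel-adapter>

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.integration.jms.JmsDestinationPollingSource;

public class SelectorUpdater {

    @Autowired
    @Qualifier("jmsInboundAdapter.source")
    private JmsDestinationPollingSource source;

    // invoke this (e.g. from a service activator on the control channel)
    // for each control message; the change takes effect on the next poll
    public void updateSelector(String correlationId) {
        source.setMessageSelector("JMSCorrelationID = '" + correlationId + "'");
    }
}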

How should I build my Messages in Spring Integration?

I have an application I coded which I am refactoring to make better use of Spring Integration. The application processes the contents of files.
The problem (as I see it) is that my current implementation passes Files around instead of Messages, i.e. Spring Integration Messages.
To avoid rolling more of my own code, which I would then have to maintain, I'm wondering if there is a recommended structure for constructing Messages in Spring Integration. Specifically, is there some recommended combination of channel and something like MessageBuilder that I should use?
Process/Code (eventually)
I don't yet have the code to configure it, but I would like to end up with the following components/process:
Receive a file, remove the header and footer, take each line and convert it into a Message<String> (this, it seems, will actually be a splitter), which I send on to...
Channel/Endpoint sends message to Router
Router detects the format String in the payload and routes to the appropriate channel, similar to the order router here...
The selected channel then builds the appropriate type of Message, i.e. specifically typed messages. For example, I have the following builder to build a Message...
public class ShippedBoxMessageBuilder implements CustomMessageBuilder {

    @Override
    public Message buildMessage(String input) {
        ShippedBox shippedBox = (ShippedBox) ShippedBoxFactory.manufactureShippedFile(input);
        return MessageBuilder.withPayload(shippedBox).build();
    }
    ...
Message is routed by type to the appropriate processing channel
My intended solution does seem like I've over-complicated it. However, I've purposefully separated two tasks: 1) breaking a file into many Message<String> lines and 2) converting each Message<String> into a Message<someType>. Because of that I think I need an additional router/message builder for the second task.
Actually, there is MessageBuilder support in Spring Integration.
The general purpose of such frameworks is to help back-end developers decouple their domain code from the messaging infrastructure. Ultimately, to work with Spring Integration you need to follow the POJO and method-invocation principles.
You write your own services, transformers and domain models. Then you just use some out-of-the-box components (e.g. <int-file:inbound-channel-adapter>) and refer from there to your POJOs, but not vice versa.
I recommend reading the Spring Integration in Action book to get a fuller picture on the matter.
Can you explain why you need to deal with Spring Integration components directly?
UPDATE
1) Breaking a file into many lines of Message<String>
The <splitter> is for you. You should write some POJO which returns List<String> - the lines from your file without the header and footer. Reading lines from a File isn't a task for Spring Integration, especially if a "line" is something logical rather than a physical file line.
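For example, a sketch of such a POJO wired as a splitter (bean and channel names are made up):

<int:splitter input-channel="filesChannel" output-channel="linesChannel"
        ref="fileLineSplitter" method="split"/>

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.List;

public class FileLineSplitter {

    // the framework emits one Message<String> per returned element
    public List<String> split(File file) throws IOException {
        List<String> lines = Files.readAllLines(file.toPath());
        return lines.subList(1, lines.size() - 1); // drop header and footer
    }
}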
2) Converting Message<String> into Message<someType>
One more time: there is no reason to build a Message object yourself. It is enough to build the new payload in some transformer (again, a POJO); the framework will wrap it into a Message to send.
The payload-type router speaks for itself: it checks the payload type, not the Message type.
Of course, the payload can be a Message too, and even any header can be as well.
Anyway, your builder snippet shows exactly the creation of a plain Spring Integration Message in the end. And as I said: it is enough to just transform one payload into another and return it from some POJO, which you then use as a transformer reference.
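Applied to the ShippedBox example, a sketch of that transformer approach (channel names are made up):

public class ShippedBoxTransformer {

    // return the new payload; the framework wraps it into a Message itself
    public ShippedBox transform(String input) {
        return (ShippedBox) ShippedBoxFactory.manufactureShippedFile(input);
    }
}

<int:transformer input-channel="shippedLinesChannel" output-channel="shippedBoxChannel"
        ref="shippedBoxTransformer" method="transform"/>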
