Triggering Spring Integration - java

I'm still new to Spring Integration and the Spring framework as a whole, so please bear with me. I've been looking at these examples:
https://github.com/benjaminwootton/spring-integration-examples/blob/master/target/classes/direct-channel-example.xml
https://github.com/benjaminwootton/spring-integration-examples/tree/master/src/main/java/examples/components
I'm wondering how exactly I, or Spring, trigger the method.
I'm trying to do a round-robin using a direct channel for my REST services. My REST services consume messages and process them.
I understand that with a direct channel, Spring will round-robin through the subscribers, but I'm not sure how Spring actually triggers that method.
Thank you for any help or advice.

The first-class citizen in Spring Integration is the MessageChannel, so to let a message (the HTTP request) travel through the integration flow, we should place the message into some <channel>.
Since you say that you are in the REST service, I assume you use:
<int-http:inbound-gateway path="/path1,/path2"
request-channel="myChannel"/>
Here, myChannel is the component to which the HTTP request will be sent after conversion to a Spring Integration Message.
Of course, a MessageChannel is a pipe: we push something into one side, and there really should be something on the other side to consume it. In the case of a DirectChannel, that is a subscriber, and here we involve the second first-class citizen: the MessageHandler.
And if you use something like <service-activator input-channel="myChannel" ref="foo" method="service"> there, the call stack may look like:
DirectChannel#send -> UnicastingDispatcher#dispatch ->
ServiceActivatingHandler#handleMessage -> MethodInvokingMessageProcessor#processMessage ->
MessagingMethodInvokerHelper#process -> foo#service
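To make this concrete, here is a minimal standalone sketch (not the real gateway internals; the channel and payloads are made up) showing that send() itself is the trigger: the subscribed handler runs synchronously on the sender's thread, and with several subscribers the DirectChannel round-robins between them.
import org.springframework.integration.channel.DirectChannel;
import org.springframework.messaging.support.GenericMessage;

public class TriggerSketch {

    public static void main(String[] args) {
        DirectChannel myChannel = new DirectChannel();

        // Two subscribers: the default load balancer alternates
        // between them on successive sends (round-robin).
        myChannel.subscribe(message ->
                System.out.println("subscriber 1: " + message.getPayload()));
        myChannel.subscribe(message ->
                System.out.println("subscriber 2: " + message.getPayload()));

        // This send() is what the HTTP inbound gateway does for you;
        // each subscriber runs on this same calling thread.
        myChannel.send(new GenericMessage<>("request 1"));
        myChannel.send(new GenericMessage<>("request 2"));
    }
}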
HTH

Related

Auto create KafkaListeners

I work with Apache Kafka and WebFlux (Spring Boot), and I want to know if there is a way to auto-create a KafkaListener for each topic I add in application.yml (or .properties).
This is not what a consumer is for. A Kafka topic is a constantly changing stream of data. What is the business purpose of that HTTP request? Maybe you want to stream such a topic into a Flux? Then consider using Spring Integration dynamic flows and the toReactivePublisher() feature:
https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#java-dsl-runtime-flows
https://docs.spring.io/spring-integration/docs/current/reference/html/reactive-streams.html#java-dsl
This sample shows something about Kafka and dynamic flows: https://github.com/spring-projects/spring-integration-samples/tree/main/dsl/kafka-dsl.
Also this one demonstrates a "to WebFlux" technique: https://github.com/artembilan/sandbox/tree/master/amqp-to-webflux.
Or you can look into Reactor Kafka: https://projectreactor.io/docs/kafka/release/reference/.
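For illustration, here is a rough sketch of the toReactivePublisher() idea (the topic name, String types, and ConsumerFactory wiring are placeholders, not verified against your versions): an integration flow bridges a Kafka message-driven adapter into a Publisher that WebFlux can consume.
import org.reactivestreams.Publisher;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.kafka.dsl.Kafka;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.messaging.Message;

@Configuration
public class KafkaFluxConfig {

    // The flow is registered as a bean; toReactivePublisher() exposes
    // its output as a Reactive Streams Publisher that a WebFlux handler
    // can stream, e.g. Flux.from(kafkaPublisher).map(Message::getPayload).
    @Bean
    public Publisher<Message<String>> kafkaPublisher(
            ConsumerFactory<String, String> consumerFactory) {
        return IntegrationFlows
                .from(Kafka.messageDrivenChannelAdapter(consumerFactory, "my-topic"))
                .toReactivePublisher();
    }
}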

Queuing multiple request before making http call to external service - spring webflux and projectreactor

I am using Spring WebFlux, and I would like to queue multiple requests before sending an HTTP call to an external system using WebClient. I would like to know the best practice for such a requirement.
@GetMapping
Mono<Result> get(@RequestParam Optional<String> input) {
    // Here I have to somehow keep the input and make a call when I have X items,
    // then I can write to the response. How does this need to be done?
    // Also, what sort of Mono has to be returned here?
}
I have found this, which could be a solution, but it is still a bit unclear: https://ducmanhphan.github.io/2019-08-25-How-to-use-Processor-in-Reactor-Java/
It would be appreciated if anybody could explain.
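For what it's worth, the Processor approach in that article has since been superseded by Reactor's Sinks API; here is a rough sketch of the buffering idea under that assumption (the batch size, endpoint, and types are made up, and correlating each caller with its own result is not solved here):
import java.util.List;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Sinks;

public class BatchingCaller {

    private static final int BATCH_SIZE = 10; // the "X items" threshold

    private final WebClient webClient = WebClient.create("http://external-system");

    // Incoming request inputs are queued in this sink.
    private final Sinks.Many<String> queue =
            Sinks.many().unicast().onBackpressureBuffer();

    public BatchingCaller() {
        // buffer(X) holds inputs until X have arrived, then emits the batch
        // as a List, which triggers one call to the external system.
        this.queue.asFlux()
                .buffer(BATCH_SIZE)
                .flatMap(this::callExternal)
                .subscribe();
    }

    public void enqueue(String input) {
        this.queue.tryEmitNext(input);
    }

    private Flux<String> callExternal(List<String> batch) {
        return this.webClient.post()
                .uri("/batch") // hypothetical endpoint
                .bodyValue(batch)
                .retrieve()
                .bodyToFlux(String.class);
    }
}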

Best way to use Websocket with Spring Boot and Vuejs

I am trying to use WebSocket with a Spring Boot backend (as an API) and a Vue.js frontend.
I'll take a simple use case to illustrate my question. Some users are logged in on my website, and there is a messaging feature. User A sends a message to User B. User B is currently logged in, and I want to notify User B that a new message has arrived.
I see 3 ways to do it with websockets:
1 - When User A sends a message, an Axios POST is made to the API to save the message, and, if the Axios response is a success, I call something like
this.stompClient.send("/app/foo", JSON.stringify(bar), {})
2 - When User A sends a message, I only call something like
this.stompClient.send("/app/foo", JSON.stringify(bar), {})
and it's my controller's method (annotated with @MessageMapping("/xxxx") and @SendTo("/topic/yyyy")) that calls the facade, service, and DAO to first save the message, then return it to subscribers (see the sketch after this list).
3 - I keep my actual controllers, facades, services, and DAOs, and just add, when the save is successful, something like:
@Autowired SimpMessagingTemplate webSocket;
...
@GetMapping("/send-message")
public ResponseEntity<?> sendMessage(@AuthenticationPrincipal User user, ....) {
    service.saveMessage(....);
    webSocket.convertAndSend("/ws/message-from", message);
    return ResponseEntity.ok().build();
}
without a new controller containing @MessageMapping("/xxxx") @SendTo("/topic/yyyy"). User B is just subscribed to "/ws/message-from".
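For comparison, here is a minimal sketch of the option 2 controller mentioned above (ChatMessage, MessageService, and the destinations are hypothetical names): the method's return value is broadcast to every subscriber of the topic.
import org.springframework.messaging.handler.annotation.MessageMapping;
import org.springframework.messaging.handler.annotation.SendTo;
import org.springframework.stereotype.Controller;

@Controller
public class ChatController {

    private final MessageService service; // hypothetical facade/service

    public ChatController(MessageService service) {
        this.service = service;
    }

    // The client calls stompClient.send("/app/foo", ...); the saved
    // message is then pushed to every subscriber of /topic/yyyy.
    @MessageMapping("/foo")
    @SendTo("/topic/yyyy")
    public ChatMessage relay(ChatMessage incoming) {
        return service.saveMessage(incoming); // save first, then fan out
    }

    public record ChatMessage(String from, String to, String text) {}

    public interface MessageService {
        ChatMessage saveMessage(ChatMessage message);
    }
}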
Could you help me? Is one of these 3 ways the right approach?
Thank you.
Methods one and two are not very different: in one you use axios from npm to send the request, and in the other you send it directly, while in the third one you use the controller, facade, and DAO in a single place. It is about architecture and how you want to send your requests in your framework, as a requirement.
Each serves best at its own level, until you come up with a specific requirement.
The suggestion would be to use axios.
It has advantages:
supports older browsers (Fetch needs a polyfill)
has a way to abort a request
has a way to set a response timeout
has built-in CSRF protection
supports upload progress
performs automatic JSON data transformation
works in Node.js

Spring boot - Threads / Feign-Client / Messaging / Streamlistener

We are struggling to find a solution for the following scenario:
Situation
Receive a message via Spring Cloud Streamlistener
Invoke a REST-Service via Feign-Client
We have configured several Feign RequestInterceptors to enrich request header data.
We want to avoid passing every request header on the method call, and we like the central configuration approach of the request interceptors.
Problem:
How do we access data from a specific message, which contains information that needs to be added to every request call via the Feign RequestInterceptor?
We don't have a Request-Context, as we come from a message.
Can we be sure that the message consumption and the REST call happen on the same thread? If yes, we could use a NamedThreadLocal to store the information.
Yes, unless you hand off to another thread in your StreamListener, the REST call will be made on the same thread (assuming you are using RestTemplate and not the reactive web client).
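Building on that answer (and assuming no thread hand-off), here is a rough sketch of the NamedThreadLocal idea; the holder, header name, and wiring are made up:
import feign.RequestInterceptor;
import feign.RequestTemplate;
import org.springframework.core.NamedThreadLocal;

// Sketch: the listener stores message data in a ThreadLocal before
// invoking the Feign client; the interceptor reads it on the same
// thread and adds it as a request header.
public class MessageHeaderInterceptor implements RequestInterceptor {

    // Hypothetical holder, populated by the StreamListener.
    public static final ThreadLocal<String> TENANT =
            new NamedThreadLocal<>("tenant-from-message");

    @Override
    public void apply(RequestTemplate template) {
        String tenant = TENANT.get();
        if (tenant != null) {
            template.header("X-Tenant", tenant); // made-up header name
        }
    }
}
The listener would set the value before the Feign call and clear it in a finally block (TENANT.remove()), so nothing leaks to the next message processed on a pooled thread.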

Spring Cloud Stream with RabbitMQ binder, how to apply #Transactional?

I have a Spring Cloud Stream application that receives events from RabbitMQ using the Rabbit Binder. My application can be summarized as this:
@Transactional
@StreamListener(MySink.SINK_NAME)
public void processEvents(Flux<Event> events) {
    // Transform events and store them in MongoDB using
    // spring-boot-data-mongodb-reactive
    ...
}
The problem is that @Transactional doesn't seem to work with Spring Cloud Stream (or at least that's my impression), since if there's an exception when writing to MongoDB, the event seems to have already been acked to RabbitMQ and the operation is not retried.
Given that I want to achieve basically the same functionality as when using @Transactional around a function with spring-amqp:
Do I have to manually ACK the messages to RabbitMQ when using Spring Cloud Stream with the Rabbit Binder?
If so, how can I achieve this?
There are several issues here:
Transactions are not required for acknowledging messages.
Reactor-based @StreamListener methods are invoked exactly once, just to set up the Flux, so @Transactional on that method is meaningless; messages then flow through the flux, so anything pertaining to individual messages has to be done within the context of the flux.
Spring transactions are bound to the thread; Reactor is non-blocking, so the message will be acked at the first handoff.
Yes, you would need to use manual acks, presumably on the result of the MongoDB store operation. You would probably need to use Flux<Message<Event>> so you would have access to the channel and delivery tag headers.
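A rough, untested sketch of that suggestion, reusing the question's names (Event, MySink, a reactive Mongo repository, and a transform step are assumed to exist); it also assumes the binding's acknowledge mode is set to MANUAL (e.g. via the Rabbit binder consumer property acknowledgeMode):
import com.rabbitmq.client.Channel;
import org.springframework.amqp.support.AmqpHeaders;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.messaging.Message;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class EventProcessor {

    @StreamListener(MySink.SINK_NAME)
    public void processEvents(Flux<Message<Event>> events) {
        events.concatMap(message ->
                // Reactive Mongo save; ack only after it succeeds, so a
                // failed write leaves the message unacked for redelivery.
                repository.save(transform(message.getPayload()))
                        .then(Mono.fromCallable(() -> {
                            // The headers carry the channel and delivery tag.
                            Channel channel = message.getHeaders()
                                    .get(AmqpHeaders.CHANNEL, Channel.class);
                            Long deliveryTag = message.getHeaders()
                                    .get(AmqpHeaders.DELIVERY_TAG, Long.class);
                            channel.basicAck(deliveryTag, false);
                            return message;
                        })))
                .subscribe();
    }
}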
