I have this scenario: I have many queues in Azure Service Bus. The implementation below works, but it is not flexible, since I would need to replicate it for each queue. I would like to form the queue name dynamically, perhaps as a parameter of the send() method, and likewise the OUTPUT_CHANNEL for @ServiceActivator and @MessagingGateway. Is that possible?
import com.azure.spring.cloud.service.servicebus.properties.ServiceBusEntityType;
import com.azure.spring.integration.core.handler.DefaultMessageHandler;
import com.azure.spring.messaging.servicebus.core.ServiceBusTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.integration.annotation.MessagingGateway;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.MessageHandler;
import org.springframework.stereotype.Component;
@Component
public class TestIntegration {
private static final String OUTPUT_CHANNEL = "output";
private static final String QUEUE_NAME = "myQueue";
@Autowired
private QueueOutboundGateway messagingGateway;
public void send(String message) {
this.messagingGateway.send(message);
}
@Bean
@ServiceActivator(inputChannel = OUTPUT_CHANNEL)
public MessageHandler queueMessageSender(ServiceBusTemplate serviceBusTemplate) {
serviceBusTemplate.setDefaultEntityType(ServiceBusEntityType.QUEUE);
return new DefaultMessageHandler(QUEUE_NAME, serviceBusTemplate);
}
@MessagingGateway(defaultRequestChannel = OUTPUT_CHANNEL)
public interface QueueOutboundGateway {
void send(String text);
}
}
The com.azure.spring.integration.core.handler.DefaultMessageHandler supports a dynamic destination resolution from message headers:
private String toDestination(Message<?> message) {
if (message.getHeaders().containsKey(AzureHeaders.NAME)) {
return message.getHeaders().get(AzureHeaders.NAME, String.class);
}
return this.destination;
}
So, what you need is a @Header(name = AzureHeaders.NAME) String destination argument on your gateway's send() method. The OUTPUT_CHANNEL does not need to be dynamic at all: one gateway and one service activator for that DefaultMessageHandler are enough. You call send() with the payload and the target destination as parameters.
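A minimal sketch of such a gateway, assuming a Spring Cloud Azure 4.x setup where AzureHeaders lives in com.azure.spring.messaging (the package differs in older versions):
import com.azure.spring.messaging.AzureHeaders; // package may vary by Spring Cloud Azure version
import org.springframework.integration.annotation.MessagingGateway;
import org.springframework.messaging.handler.annotation.Header;

@MessagingGateway(defaultRequestChannel = "output") // same channel as OUTPUT_CHANNEL above
public interface QueueOutboundGateway {
    // the queue name travels as a message header and overrides the handler's default destination
    void send(String text, @Header(AzureHeaders.NAME) String queueName);
}
The call site then becomes something like messagingGateway.send(message, "myOtherQueue"), and the single DefaultMessageHandler routes each message to whichever queue the header names.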
Related
I'm just trying to use Camel Reactive Streams together with Spring Boot Reactor, using the following code:
package com.manning.camel.reactive;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.reactive.streams.api.CamelReactiveStreamsService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;
/**
* A simple Camel route that triggers from a timer and calls a bean and prints to system out.
* <p/>
* Use <tt>@Component</tt> to make Camel auto-detect this route when starting.
*/
@RestController
public class MySpringBootRouter extends RouteBuilder {
@Autowired
private ProducerTemplate template;
@Autowired
private CamelReactiveStreamsService crss;
@GetMapping
public Mono<String> sayHi() {
template.asyncSendBody("direct:works", "Hi");
//return Mono.from(crss.fromStream("greet", String.class));
return Mono.from(crss.fromStream("greet", String.class));
}
@Override
public void configure() {
from("direct:works")
.log("Fired")
.to("reactive-streams:greet");
}
}
After running the code I get:
java.lang.IllegalStateException: The stream has no active subscriptions
After a long time, I solved the error. As can be noticed, the router class logic was changed a little: the subscriber is now registered in a separate service (in @PostConstruct) before the route starts publishing to the reactive-streams endpoint, so the stream has an active subscription.
@Slf4j
@Service
@AllArgsConstructor
public class MyService {
final CamelContext context;
@PostConstruct
public void consumerData() {
var rCamel = CamelReactiveStreams.get(context);
var numbers = rCamel.fromStream("numbers", Integer.class);
Flux.from(numbers).subscribe(e -> log.info("{}", e));
}
}
@Component
@NoArgsConstructor
public class MyRouter extends RouteBuilder {
// Injects the Subscriber
@Autowired MyService service;
@Override
public void configure() {
//onException(ReactiveStreamsNoActiveSubscriptionsException.class)
// .continued(true);
from("timer://reactiveApp?fixedRate=true&period=2s")
.transform(method(Random.class, "nextInt(100)"))
//.log("${body}");
.to("direct:message");
from("direct:message")
//.log("${body}")
.to("reactive-streams:numbers");
}
}
I am trying to build a Java Spring Boot application that would post & get the messages from Confluent Cloud Kafka.
I followed the article for publishing a Kafka message into Confluent Cloud and it works.
Below is the implementation
KafkaController.java
package com.seroter.confluentboot.controller;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import com.seroter.confluentboot.dto.Product;
import com.seroter.confluentboot.engine.Producer;
@RestController
@RequestMapping(value = "/kafka")
public class KafkaController {
private final Producer producer;
private final com.seroter.confluentboot.engine.Consumer consumer;
@Autowired
KafkaController(Producer producer,com.seroter.confluentboot.engine.Consumer consumer) {
this.producer = producer;
this.consumer=consumer;
}
@PostMapping(value = "/publish")
public void sendMessageToKafkaTopic(@RequestParam("message") String message) {
this.producer.sendMessage(message);
}
@PostMapping(value="/publishJson")
public ResponseEntity<Product> publishJsonMessage(@RequestBody Product product) {
producer.sendJsonMessage(product);
ResponseEntity<Product> responseEntity=new ResponseEntity<>(product,HttpStatus.CREATED);
return responseEntity;
}
}
Product.java
package com.seroter.confluentboot.dto;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
@AllArgsConstructor
@JsonPropertyOrder(value = {"product_id","product_name","quantity","price"})
public class Product {
@JsonProperty(value = "product_id")
private int productId;
@JsonProperty(value="product_name")
private String productName;
private int quantity;
private double price;
}
Producer.java
package com.seroter.confluentboot.engine;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.messaging.support.GenericMessage;
import org.springframework.stereotype.Service;
import com.seroter.confluentboot.dto.Product;
@Service
@EnableBinding(Source.class)
public class Producer {
private static final Logger logger = LoggerFactory.getLogger(Producer.class);
private static final String TOPIC = "users";
@Autowired
private Source source;
public void sendMessage(String message) {
logger.info(String.format("#### -> Producing message -> %s", message));
this.source.output().send(new GenericMessage<>(message));
}
public void sendJsonMessage(Product product)
{
logger.info(String.format("#### -> Producing message -> %s",product.toString()));
this.source.output().send(new GenericMessage<>(product));
}
}
ConfluentBootApplication.java
package com.seroter.confluentboot;
import org.apache.tomcat.util.net.WriteBuffer.Sink;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.http.ResponseEntity;
import org.springframework.messaging.support.GenericMessage;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import com.seroter.confluentboot.dto.Product;
@SpringBootApplication
@EnableBinding(Source.class)
@RestController
@RequestMapping(value = "/confluent")
public class ConfluentBootApplication {
@Autowired
private com.seroter.confluentboot.engine.Consumer consumer;
public static void main(String[] args) {
SpringApplication.run(ConfluentBootApplication.class, args);
}
}
application.properties
spring.cloud.stream.kafka.binder.brokers=pkc-epwny.eastus.azure.confluent.cloud:9092
spring.cloud.stream.bindings.output.destination=test
spring.cloud.stream.kafka.binder.configuration.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="password";
spring.cloud.stream.kafka.binder.configuration.sasl.mechanism=PLAIN
spring.cloud.stream.kafka.binder.configuration.security.protocol=SASL_SSL
server.port=9000
It works, and I could verify it.
Now I want to build a Spring Boot consumer REST endpoint. How do I do it?
Update:
ConfluentConsumer.java
package com.seroter.confluentboot.controller;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;
import com.seroter.confluentboot.dto.Product;
//@RestController
@EnableBinding(Sink.class)
public class ConfluentConsumer {
@StreamListener(Sink.INPUT)
public void consumeMessage(Product product)
{
System.out.println("******************************");
System.out.println("============= "+product.getProductId()+" ================");
System.out.println("******************************");
}
}
Consumer.java
package com.seroter.confluentboot.engine;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.annotation.PropertySource;
import org.springframework.stereotype.Service;
@Service
@PropertySource("classpath:application.properties")
public class Consumer {
private final Logger logger = LoggerFactory.getLogger(Producer.class);
}
I believe what you are trying to do here is pick the latest message from the Kafka consumer via a REST endpoint, i.e. you want to manually poll the Kafka topic. Publishing a message via a REST endpoint is logical, but consuming messages through an endpoint doesn't sound like a good idea. If you want queue behavior, you should use RabbitMQ instead of Kafka.
But still, if you want to use Kafka and poll the messages manually, you can use one of the two approaches below.
Approach 1: Create a ConsumerFactory and get a Consumer from the factory, and then poll Kafka using a Consumer
@Configuration
class KafkaConsumerConfig {
private static final String TOPIC_NAME = "test";
private final String userName = "username";
private final String password = "password";
@Bean
public ConsumerFactory<String, String> consumerFactory() {
Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,"pkc-epwny.eastus.azure.confluent.cloud:9092");
props.put(ConsumerConfig.GROUP_ID_CONFIG,"conumer-gp-1");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,StringDeserializer.class);
props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
props.put(SaslConfigs.SASL_JAAS_CONFIG, "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"" + userName + "\" password=\"" + password + "\";");
return new DefaultKafkaConsumerFactory<>(props);
}
@Bean
public Consumer<String, String> createConsumer(ConsumerFactory<String, String> consumerFactory) {
Consumer<String, String> consumer = consumerFactory.createConsumer("consumer-group-1", "client-1");
consumer.subscribe(List.of(TOPIC_NAME));
return consumer;
}
}
You can read the topic name, group-id, bootstrap servers, SSL configs, etc. from the application.properties
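For example, the hard-coded values in the KafkaConsumerConfig above could be externalized with @Value; the property keys below are assumptions, so align them with whatever you define in application.properties:
import org.springframework.beans.factory.annotation.Value;

// fields inside KafkaConsumerConfig (property keys are illustrative)
@Value("${spring.cloud.stream.kafka.binder.brokers}")
private String bootstrapServers;

@Value("${app.kafka.consumer.topic:test}")
private String topicName;

@Value("${app.kafka.consumer.group-id:consumer-group-1}")
private String groupId;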
Now you can consume messages by injecting the consumer in the RestController.
private final Consumer<String, String> consumer;
@Autowired
ConsumerController(Consumer<String, String> consumer) {
this.consumer = consumer;
}
@GetMapping("retrieveMessage")
public String getMessage() {
// Kafka might return more than one record, so be careful
ConsumerRecords<String, String> consumerRecords = consumer.poll(Duration.ofMillis(1000));
if (!consumerRecords.isEmpty()) {
Iterator<ConsumerRecord<String, String>> iterator = consumerRecords.iterator();
String value = iterator.next().value();
consumer.commitSync();
return value;
} else {
return "no message";
}
}
Approach 2: Store the messages in an in-memory queue and then poll the in-memory queue.
First, bind the Sink input to the topic in application.properties:
spring.cloud.stream.bindings.input.destination=test
Then store the messages in a queue and retrieve them via the REST endpoint:
@RestController
@EnableBinding(Sink.class)
class ConsumerController {
private final Queue<String> queue;
ConsumerController() {
this.queue = new ConcurrentLinkedQueue<>();
}
@StreamListener(target = Sink.INPUT)
public void consume(String message) {
this.queue.add(message);
}
@GetMapping("getMessage")
public String retrieveMessage() {
return this.queue.poll();
}
}
Cons: you'll lose all the in-memory messages if your application restarts. Thus, storing the messages in a distributed cache such as Redis would be a better solution.
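A rough sketch of that Redis-backed variant, assuming spring-boot-starter-data-redis is on the classpath so Spring Boot auto-configures a StringRedisTemplate (class and key names are illustrative):
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@EnableBinding(Sink.class)
class RedisBackedConsumerController {
    private static final String KEY = "kafka-messages";
    private final StringRedisTemplate redisTemplate;

    RedisBackedConsumerController(StringRedisTemplate redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    @StreamListener(target = Sink.INPUT)
    public void consume(String message) {
        // push each incoming message onto a Redis list so it survives application restarts
        redisTemplate.opsForList().rightPush(KEY, message);
    }

    @GetMapping("getMessage")
    public String retrieveMessage() {
        // pop the oldest message; returns null when the list is empty
        return redisTemplate.opsForList().leftPop(KEY);
    }
}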
I'm trying to access the request body from WebFlux's HandlerFilterFunction, but I am getting java.lang.IllegalStateException: Only one connection receive subscriber allowed.
I want to do something similar to below code block
public class ExampleHandlerFilterFunction
implements HandlerFilterFunction<ServerResponse, ServerResponse> {
@Override
public Mono<ServerResponse> filter(ServerRequest serverRequest,
HandlerFunction<ServerResponse> handlerFunction) {
if (serverRequest.pathVariable("name").equalsIgnoreCase("test")) {
return serverRequest.bodyToMono(Player.class)
.doOnNext(loggerService :: logAndDoSomethingElse)
.then(handlerFunction.handle(serverRequest));
}
return handlerFunction.handle(serverRequest);
}
}
I tried serverRequest.bodyToMono(Player.class).cache() too, but it did not work.
Update: Adding handler and router functions
Handler Function
@Component
public class PlayerHandler {
private final PlayerRepository playerRepository;
public PlayerHandler(PlayerRepository playerRepository) {
this.playerRepository = playerRepository;
}
public Mono<ServerResponse> savePlayer(ServerRequest request) {
Mono<String> id = request.bodyToMono(Player.class)
.map(playerRepository::save)
.map(Player::getId);
return ok().body(id, String.class);
}
}
Router function
@Bean
public RouterFunction<ServerResponse> route(PlayerHandler playerHandler) {
return RouterFunctions
.route(POST("/players/"), playerHandler::savePlayer)
.filter(new ExampleHandlerFilterFunction());
}
Logger service
public <T> Mono<Void> logAndDoSomethingElse(T t) {
// ---- auditing business logic ----
return loggerRepository.save(asJsonb);
}
Can someone help me? Thanks
import org.json.JSONObject;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.web.reactive.function.server.HandlerFilterFunction;
import org.springframework.web.reactive.function.server.HandlerFunction;
import org.springframework.web.reactive.function.server.ServerRequest;
import org.springframework.web.reactive.function.server.ServerResponse;
import reactor.core.publisher.Mono;
@Component
public class FundsAuthorizationFilter implements HandlerFilterFunction<ServerResponse, ServerResponse> {
@Override
public Mono<ServerResponse> filter(ServerRequest request, HandlerFunction<ServerResponse> handlerFunction) {
String block = request.bodyToMono(String.class).block();
JSONObject jsonObj = new JSONObject(block);
ServerRequest.Builder newRequestBuilder = ServerRequest.from(request);
newRequestBuilder.body(block);
return handlerFunction.handle(newRequestBuilder.build());
}
}
I found a solution for this: clone the ServerRequest and set the body on the new request, as shown above.
I realize that these are internal APIs, but if they're available internally, why not make them usable by the less privileged masses? They're also extremely useful. Even though these APIs were internal in Jersey 2.25, they could be used, and I'd like to upgrade my Jersey version without breaking my custom Jersey extensions.
It's certainly possible to extend ValueParamProvider in Jersey 2.27, but I no longer see a way to register that provider along with its triggering annotation. Looking at how Jersey does this for its own implementations, it now uses a BootstrapConfigurator, which seems to be internalized to such an extent that external implementations can't use the same methodology.
Maybe I'm wrong about that, and if someone has a clear description of how, that would be great. Otherwise, does anyone know of a method for doing the same thing?
This used to work...
ResourceConfig resourceConfig = ...
resourceConfig.register(new AbstractBinder() {
@Override
protected void configure() {
bind(MyParamValueFactoryProvider.class).to(ValueFactoryProvider.class).in(Singleton.class);
bind(MyParamInjectionResolver.class).to(new TypeLiteral<InjectionResolver<EntityParam>>() {
}).in(Singleton.class);
}
});
With appropriate implementations of AbstractValueFactoryProvider and ParamInjectionResolver.
Now it looks like you need to implement ValueParamProvider, which is easy enough, but I'm not sure how to register that properly with the Jersey framework anymore. Any help appreciated.
You don't need to use any BootstrapConfigurator. All you need to do is add the services to the injector and they will be added later to the list of value providers.
To configure it, you can still use the AbstractBinder, but instead of the HK2 one, use the Jersey one. The ValueParamProvider can still be bound the same way, but for the InjectionResolver, you should make sure to implement not the HK2 resolver, but the Jersey one. Then instead of binding to TypeLiteral, bind to GenericType.
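As a sketch, the HK2 binder from the question could be rewritten with the Jersey classes roughly like this (MyParamValueProvider and MyParamInjectionResolver stand in for your own implementations of the Jersey ValueParamProvider and InjectionResolver; EntityParam is the annotation from the question):
import javax.inject.Singleton;
import javax.ws.rs.core.GenericType;
import org.glassfish.jersey.internal.inject.AbstractBinder;    // the Jersey binder, not the HK2 one
import org.glassfish.jersey.internal.inject.InjectionResolver; // the Jersey resolver, not the HK2 one
import org.glassfish.jersey.server.spi.internal.ValueParamProvider;

resourceConfig.register(new AbstractBinder() {
    @Override
    protected void configure() {
        bind(MyParamValueProvider.class).to(ValueParamProvider.class).in(Singleton.class);
        // only needed if you also want field/constructor injection (see below)
        bind(MyParamInjectionResolver.class)
                .to(new GenericType<InjectionResolver<EntityParam>>() {})
                .in(Singleton.class);
    }
});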
I just want to add that a misconception people have when trying to implement parameter injection is that we also need an InjectionResolver to use a custom annotation on the method parameter. This is not the case. The method parameter annotation is just a marker annotation that we should check inside the ValueParamProvider#getValueProvider() method. An InjectionResolver is only needed for non-method-parameter injections, for instance field and constructor injection. If you don't need that, then you don't need the InjectionResolver.
Below is a complete example using Jersey Test Framework. I didn't use an InjectionResolver, just to show that it's not needed.
import org.glassfish.jersey.internal.inject.AbstractBinder;
import org.glassfish.jersey.server.ContainerRequest;
import org.glassfish.jersey.server.ResourceConfig;
import org.glassfish.jersey.server.model.Parameter;
import org.glassfish.jersey.server.spi.internal.ValueParamProvider;
import org.glassfish.jersey.test.JerseyTest;
import org.junit.Test;
import javax.inject.Singleton;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.client.Entity;
import javax.ws.rs.core.Feature;
import javax.ws.rs.core.FeatureContext;
import javax.ws.rs.core.Response;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.function.Function;
import static org.assertj.core.api.Assertions.assertThat;
public class ParamInjectTest extends JerseyTest {
@Target({ElementType.PARAMETER, ElementType.FIELD})
@Retention(RetentionPolicy.RUNTIME)
public @interface Auth {
}
private static class User {
private String username;
public User(String username) {
this.username = username;
}
public String getUsername() {
return this.username;
}
}
public static class AuthValueParamProvider implements ValueParamProvider {
@Override
public Function<ContainerRequest, ?> getValueProvider(Parameter parameter) {
if (parameter.getRawType().equals(User.class)
&& parameter.isAnnotationPresent(Auth.class)) {
return new UserParamProvider();
}
return null;
}
private class UserParamProvider implements Function<ContainerRequest, User> {
@Override
public User apply(ContainerRequest containerRequest) {
return new User("Peeskillet");
}
}
@Override
public PriorityType getPriority() {
return Priority.HIGH;
}
}
public static class AuthFeature implements Feature {
@Override
public boolean configure(FeatureContext context) {
context.register(new AbstractBinder() {
@Override
protected void configure() {
bind(AuthValueParamProvider.class)
.to(ValueParamProvider.class)
.in(Singleton.class);
}
});
return true;
}
}
@Path("test")
@Consumes("text/plain")
public static class TestResource {
@POST
@Produces("text/plain")
public Response post(String text, @Auth User user) {
return Response.ok(user.getUsername() + ":" + text).build();
}
}
@Override
public ResourceConfig configure() {
return new ResourceConfig()
.register(TestResource.class)
.register(AuthFeature.class);
}
@Test
public void testIt() {
final Response response = target("test")
.request()
.post(Entity.text("Test"));
assertThat(response.getStatus()).isEqualTo(200);
assertThat(response.readEntity(String.class)).isEqualTo("Peeskillet:Test");
}
}
Another thing I'll mention is that in previous versions where you extended AbstractValueFactoryProvider and implemented a ParamInjectionResolver, most people did this to follow how Jersey implemented parameter injection while still allowing for other injection points (field and constructor). If you still want to use this pattern, you can.
Below is the AuthFeature from the above test refactored
public static class AuthFeature implements Feature {
@Override
public boolean configure(FeatureContext context) {
InjectionManager im = InjectionManagerProvider.getInjectionManager(context);
AuthValueParamProvider authProvider = new AuthValueParamProvider();
im.register(Bindings.service(authProvider).to(ValueParamProvider.class));
Provider<ContainerRequest> request = () -> {
RequestProcessingContextReference reference = im.getInstance(RequestProcessingContextReference.class);
return reference.get().request();
};
im.register(Bindings.injectionResolver(new ParamInjectionResolver<>(authProvider, Auth.class, request)));
return true;
}
}
I figured this stuff out just by digging through the source. I saw all of this configuration in the ValueParamProviderConfigurator. You don't need to implement your own ParamInjectionResolver; Jersey already has a concrete class that we can just use, as done in the feature above.
If you change the TestResource to inject by field, it should work now:
@Path("test")
@Consumes("text/plain")
public static class TestResource {
@Auth User user;
@POST
@Produces("text/plain")
public Response post(String text) {
return Response.ok(user.getUsername() + ":" + text).build();
}
}
I'm trying to create a simple Spring Boot app that "produces" messages to a RabbitMQ exchange/queue, and another sample Spring Boot app that "consumes" these messages.
So I have two apps (or microservices if you wish).
1) "producer" microservice
2) "consumer" microservice
The "producer" has 2 domain objects. Foo and Bar which should be converted to json and send to rabbitmq.
The "consumer" should receive and convert the json message into a domain Foo and Bar respectively.
For some reason I can not make this simple task. There are not much examples about this.
For the message converter I want to use org.springframework.messaging.converter.MappingJackson2MessageConverter
Here is what I have so far:
PRODUCER MICROSERVICE
package demo.producer;
import org.springframework.amqp.core.Binding;
import org.springframework.amqp.core.BindingBuilder;
import org.springframework.amqp.core.Queue;
import org.springframework.amqp.core.TopicExchange;
import org.springframework.amqp.rabbit.core.RabbitMessagingTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.messaging.converter.MappingJackson2MessageConverter;
import org.springframework.stereotype.Service;
@SpringBootApplication
public class ProducerApplication implements CommandLineRunner {
public static void main(String[] args) {
SpringApplication.run(ProducerApplication.class, args);
}
@Bean
Queue queue() {
return new Queue("queue", false);
}
@Bean
TopicExchange exchange() {
return new TopicExchange("exchange");
}
@Bean
Binding binding(Queue queue, TopicExchange exchange) {
return BindingBuilder.bind(queue).to(exchange).with("queue");
}
@Bean
public MappingJackson2MessageConverter jackson2Converter() {
MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
return converter;
}
@Autowired
private Sender sender;
@Override
public void run(String... args) throws Exception {
sender.sendToRabbitmq(new Foo(), new Bar());
}
}
@Service
class Sender {
@Autowired
private RabbitMessagingTemplate rabbitMessagingTemplate;
@Autowired
private MappingJackson2MessageConverter mappingJackson2MessageConverter;
public void sendToRabbitmq(final Foo foo, final Bar bar) {
this.rabbitMessagingTemplate.setMessageConverter(this.mappingJackson2MessageConverter);
this.rabbitMessagingTemplate.convertAndSend("exchange", "queue", foo);
this.rabbitMessagingTemplate.convertAndSend("exchange", "queue", bar);
}
}
class Bar {
public int age = 33;
}
class Foo {
public String name = "gustavo";
}
CONSUMER MICROSERVICE
package demo.consumer;
import org.springframework.amqp.rabbit.annotation.EnableRabbit;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.stereotype.Service;
@SpringBootApplication
@EnableRabbit
public class ConsumerApplication implements CommandLineRunner {
public static void main(String[] args) {
SpringApplication.run(ConsumerApplication.class, args);
}
@Autowired
private Receiver receiver;
@Override
public void run(String... args) throws Exception {
}
}
@Service
class Receiver {
@RabbitListener(queues = "queue")
public void receiveMessage(Foo foo) {
System.out.println("Received <" + foo.name + ">");
}
@RabbitListener(queues = "queue")
public void receiveMessage(Bar bar) {
System.out.println("Received <" + bar.age + ">");
}
}
class Foo {
public String name;
}
class Bar {
public int age;
}
And here is the exception I'm getting:
org.springframework.amqp.rabbit.listener.exception.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message
Endpoint handler details:
Method [public void demo.consumer.Receiver.receiveMessage(demo.consumer.Bar)]
Bean [demo.consumer.Receiver#1672fe87]
at org.springframework.amqp.rabbit.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:116)
at org.springframework.amqp.rabbit.listener.adapter.MessagingMessageListenerAdapter.onMessage(MessagingMessageListenerAdapter.java:93)
at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.doInvokeListener(AbstractMessageListenerContainer.java:756)
at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.invokeListener(AbstractMessageListenerContainer.java:679)
at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.access$001(SimpleMessageListenerContainer.java:83)
at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer$1.invokeListener(SimpleMessageListenerContainer.java:170)
at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.invokeListener(SimpleMessageListenerContainer.java:1257)
at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.executeListener(AbstractMessageListenerContainer.java:660)
at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.doReceiveAndExecute(SimpleMessageListenerContainer.java:1021)
at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.receiveAndExecute(SimpleMessageListenerContainer.java:1005)
at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.access$700(SimpleMessageListenerContainer.java:83)
at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer$AsyncMessageProcessingConsumer.run(SimpleMessageListenerContainer.java:1119)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.springframework.amqp.support.converter.MessageConversionException: Cannot handle message
... 13 common frames omitted
Caused by: org.springframework.messaging.converter.MessageConversionException: No converter found to convert to class demo.consumer.Bar, message=GenericMessage [payload=byte[10], headers={amqp_receivedRoutingKey=queue, amqp_receivedExchange=exchange, amqp_deliveryTag=1, amqp_deliveryMode=PERSISTENT, amqp_consumerQueue=queue, amqp_redelivered=false, id=87cf7e06-a78a-ddc1-71f5-c55066b46b11, amqp_consumerTag=amq.ctag-msWSwB4bYGWVO2diWSAHlw, contentType=application/json;charset=UTF-8, timestamp=1433989934574}]
at org.springframework.messaging.handler.annotation.support.PayloadArgumentResolver.resolveArgument(PayloadArgumentResolver.java:115)
at org.springframework.messaging.handler.invocation.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java:77)
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:127)
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:100)
at org.springframework.amqp.rabbit.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:113)
... 12 common frames omitted
The exception says there is no converter, and that is true; my problem is that I have no idea how to set the MappingJackson2MessageConverter on the consumer side (please note that I want to use org.springframework.messaging.converter.MappingJackson2MessageConverter and not org.springframework.amqp.support.converter.JsonMessageConverter).
Any thoughts?
Just in case, you can fork this sample project at:
https://github.com/gustavoorsi/rabbitmq-consumer-receiver
Ok, I finally got this working.
Spring uses a PayloadArgumentResolver to extract, convert, and set the converted message on the method parameter annotated with @RabbitListener. Somehow we need to set the mappingJackson2MessageConverter into this object.
So, in the CONSUMER app, we need to implement RabbitListenerConfigurer. By overriding configureRabbitListeners(RabbitListenerEndpointRegistrar registrar) we can set a custom DefaultMessageHandlerMethodFactory; on this factory we set the message converter, and the factory will create our PayloadArgumentResolver with the correct converter.
Here is a snippet of the code, I've also updated the git project.
ConsumerApplication.java
package demo.consumer;
import org.springframework.amqp.rabbit.annotation.EnableRabbit;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.amqp.rabbit.annotation.RabbitListenerConfigurer;
import org.springframework.amqp.rabbit.listener.RabbitListenerEndpointRegistrar;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.messaging.converter.MappingJackson2MessageConverter;
import org.springframework.messaging.handler.annotation.support.DefaultMessageHandlerMethodFactory;
import org.springframework.stereotype.Service;
@SpringBootApplication
@EnableRabbit
public class ConsumerApplication implements RabbitListenerConfigurer {
public static void main(String[] args) {
SpringApplication.run(ConsumerApplication.class, args);
}
@Bean
public MappingJackson2MessageConverter jackson2Converter() {
MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
return converter;
}
@Bean
public DefaultMessageHandlerMethodFactory myHandlerMethodFactory() {
DefaultMessageHandlerMethodFactory factory = new DefaultMessageHandlerMethodFactory();
factory.setMessageConverter(jackson2Converter());
return factory;
}
@Override
public void configureRabbitListeners(RabbitListenerEndpointRegistrar registrar) {
registrar.setMessageHandlerMethodFactory(myHandlerMethodFactory());
}
@Autowired
private Receiver receiver;
}
@Service
class Receiver {
@RabbitListener(queues = "queue")
public void receiveMessage(Foo foo) {
System.out.println("Received <" + foo.name + ">");
}
@RabbitListener(queues = "queue")
public void receiveMessage(Bar bar) {
System.out.println("Received <" + bar.age + ">");
}
}
class Foo {
public String name;
}
class Bar {
public int age;
}
So, if you run the Producer microservice it will add 2 messages to the queue: one that represents a Foo object and another that represents a Bar object.
By running the consumer microservice you will see that both are consumed by the respective method in the Receiver class.
Updated issue:
There is a conceptual problem about queuing on my side, I think. What I wanted to achieve cannot be done by declaring 2 methods annotated with @RabbitListener that point to the same queue. The solution above was not working properly. If you send to RabbitMQ, let's say, 6 Foo messages and 3 Bar messages, they won't be received 6 times by the listener with the Foo parameter. It seems that the listeners are invoked in parallel, so there is no way to discriminate which listener to invoke based on the method argument type.
My solution (and I'm not sure if this is the best way, I'm open to suggestions here) is to create a queue for each entity.
So now, I have queue.bar and queue.foo, and I update the listeners to @RabbitListener(queues = "queue.foo") and @RabbitListener(queues = "queue.bar") respectively, as sketched below.
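A rough sketch of that per-entity setup (the matching queue declarations and bindings on the producer side are assumed):
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Service;

@Service
class Receiver {

    @RabbitListener(queues = "queue.foo")
    public void receiveFoo(Foo foo) {
        // only Foo JSON payloads arrive on this queue, so the converter target type is unambiguous
        System.out.println("Received <" + foo.name + ">");
    }

    @RabbitListener(queues = "queue.bar")
    public void receiveBar(Bar bar) {
        System.out.println("Received <" + bar.age + ">");
    }
}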
Once again, I've updated the code and you can check it out in my git repository.
I have not done this myself, but it seems like you need to register the appropriate conversions by setting up a RabbitTemplate. Take a look at section 3.1.8 in the Spring AMQP documentation. I know it is configured using the AMQP classes, but if the messaging class you are mentioning is compatible, there is no reason you can't substitute it. That reference also explains how you might do it using Java configuration rather than XML. I have not really used Rabbit, so I don't have any personal experience, but I would love to hear what you find out.
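For what it's worth, a hedged sketch of that Java-style configuration, using the AMQP-level Jackson2JsonMessageConverter (not the messaging MappingJackson2MessageConverter the question asks about) on both the template and the listener container factory:
import org.springframework.amqp.rabbit.config.SimpleRabbitListenerContainerFactory;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.amqp.support.converter.Jackson2JsonMessageConverter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RabbitJsonConfig {

    @Bean
    public Jackson2JsonMessageConverter jsonMessageConverter() {
        return new Jackson2JsonMessageConverter();
    }

    @Bean
    public RabbitTemplate rabbitTemplate(ConnectionFactory connectionFactory) {
        // outbound: serialize payloads to JSON
        RabbitTemplate template = new RabbitTemplate(connectionFactory);
        template.setMessageConverter(jsonMessageConverter());
        return template;
    }

    @Bean
    public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory(ConnectionFactory connectionFactory) {
        // inbound: let @RabbitListener methods receive deserialized objects
        SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
        factory.setConnectionFactory(connectionFactory);
        factory.setMessageConverter(jsonMessageConverter());
        return factory;
    }
}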