Consume 2 topics but process only the topic that is enabled - Java

For our application we are writing the same messages to 2 different Kafka brokers, with 2 different topics. On the consumer side I have to consume both topics but process only the one that is enabled in the Spring Boot property file.
How can I do this using Spring Boot and @KafkaListener?

You can configure this simply with:
@KafkaListener(topics = { "${spring.kafka.topic1}", "${spring.kafka.topic2}" })
Now, if you want to consume both but process one and discard the other, you can use keys in the messages and create the key accordingly on both topics.
Ref: Java consumer API: ConsumerRecord#key()
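A minimal sketch of that idea, assuming a hypothetical property app.kafka.enabled-topic names the topic to process; instead of keys it filters on the record's source topic, which achieves the same selection here:
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class DualTopicListener {

    // Hypothetical property selecting which of the two topics to process.
    @Value("${app.kafka.enabled-topic}")
    private String enabledTopic;

    @KafkaListener(topics = { "${spring.kafka.topic1}", "${spring.kafka.topic2}" })
    public void listen(ConsumerRecord<String, String> record) {
        // Consume both topics, but only act on records from the enabled one.
        if (record.topic().equals(enabledTopic)) {
            process(record.value());
        }
    }

    private void process(String payload) {
        // business logic for the enabled topic
    }
}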

Related

Exactly once Producer and Consumption - Apache Kafka and SpringBoot

I am working on a microservices project. In this project, Microservice A performs a process in several steps. At the completion of each step, Microservice A sends a message to a Kafka topic. Another Microservice B then consumes the message from the Kafka topic and sends an email notifying of the successful completion of the step. I need exactly-once semantics for this. I am using KafkaTemplate.send in Microservice A and @KafkaListener to read the message in Microservice B. My question is whether the KafkaTemplate producer and the @KafkaListener consumer are idempotent and, if not, how I can make them idempotent.
I am autowiring the KafkaTemplate using the following code:
@Autowired
public EventProducer(NewTopic topic, KafkaTemplate<String, Event> kafkaTemplate) {
    this.kafkaTemplate = kafkaTemplate;
}
Exactly-once semantics in Kafka apply to consume->process->produce operations within the same application. Even then, only the entire consume->process->produce sequence is "exactly once"; the consume->process part is at least once. Consumption on its own is always at least once (or at most once), including in your scenario (for B), so B's side effect (the email) has to be made idempotent by the application itself.
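A minimal sketch of such deduplication on B's side, assuming each record's key uniquely identifies a step completion; the in-memory set and the topic name step-completions are only illustrative (a durable store would be needed to survive restarts):
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class StepCompletionListener {

    // Illustrative in-memory dedup store; use a durable store in production.
    private final Set<String> processedKeys = ConcurrentHashMap.newKeySet();

    @KafkaListener(topics = "step-completions") // hypothetical topic name
    public void onStepCompleted(ConsumerRecord<String, String> record) {
        // add() returns false for keys seen before, so redeliveries are skipped.
        if (processedKeys.add(record.key())) {
            sendEmail(record.value());
        }
    }

    private void sendEmail(String payload) {
        // email-sending logic
    }
}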

Auto create KafkaListeners

I work with Apache Kafka and WebFlux (Spring Boot), and I want to know whether there is a way to auto-create a KafkaListener for each topic I add in application.yml (or .properties).
This is not what a consumer is for. A Kafka topic is a constantly changing stream of data. What is the business purpose of that HTTP request? Maybe you want to stream such a topic into a Flux? Then consider Spring Integration dynamic flows and their toReactivePublisher() feature:
https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#java-dsl-runtime-flows
https://docs.spring.io/spring-integration/docs/current/reference/html/reactive-streams.html#java-dsl
This sample shows something about Kafka and dynamic flows: https://github.com/spring-projects/spring-integration-samples/tree/main/dsl/kafka-dsl.
Also, this one demonstrates a "to WebFlux" technique: https://github.com/artembilan/sandbox/tree/master/amqp-to-webflux.
Or you can look into Reactor Kafka: https://projectreactor.io/docs/kafka/release/reference/.
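If you go the Reactor Kafka route, a minimal sketch of one receiver per configured topic could look like the following (the property name app.kafka.topics and the localhost:9092 broker address are assumptions):
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;
import reactor.core.publisher.Flux;
import reactor.kafka.receiver.KafkaReceiver;
import reactor.kafka.receiver.ReceiverOptions;
import reactor.kafka.receiver.ReceiverRecord;

@Component
public class DynamicTopicReceivers {

    @Value("${app.kafka.topics}") // e.g. app.kafka.topics=orders,payments,shipments
    private List<String> topics;

    @EventListener(ApplicationReadyEvent.class)
    public void subscribeAll() {
        // One reactive receiver per configured topic, instead of one @KafkaListener each.
        for (String topic : topics) {
            Map<String, Object> props = new HashMap<>();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "webflux-" + topic);
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
            ReceiverOptions<String, String> options = ReceiverOptions
                    .<String, String>create(props)
                    .subscription(Collections.singleton(topic));
            Flux<ReceiverRecord<String, String>> flux = KafkaReceiver.create(options).receive();
            flux.subscribe(record -> {
                // handle the record, then acknowledge its offset
                record.receiverOffset().acknowledge();
            });
        }
    }
}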

Spring cloud kafka binder query

We have a requirement where we consume messages from one topic, apply some enrichment, and then publish the message to another topic. Below are the steps:
Consumer - consume the message
Enrichment - enrich the consumed message
Producer - publish the enriched message to the other topic
I am using the Spring Cloud Stream Kafka binder and things were working fine. Then we suddenly observed that the producer was sending duplicate messages to the topic, so we made the producer idempotent. We have autoCommitOffset set to false for better control. Below is what we are doing in the method:
#StreamListener("INPUT")
#SendTo("OUTPUT")
public void consumer(Message message){
String inputMessage = message.getPayload.toString();
String enrichMessage = // Enrichment on inputMessage
return enrichMessage;
}
We observed that if ack.acknowledge() fails for some reason, the message is still sent to the outbound channel. How can we handle the entire consumer/producer flow as one transaction, so that if the acknowledgment fails the message is not sent to the topic?
I have set the transaction properties below as well:
spring.cloud.stream.kafka.binder.transaction.transactionIdPrefix=TX-
spring.cloud.stream.kafka.binder.transaction.producer.configuration.ack=all
spring.cloud.stream.kafka.binder.transaction.producer.configuration.retries=1
spring.cloud.stream.kafka.bindings.input.consumer.autoCommitOffset=true
spring.cloud.stream.kafka.bindings.input.consumer.enableDlq=true
spring.cloud.stream.kafka.bindings.input.consumer.dlqName=error.topic
spring.cloud.stream.kafka.bindings.input.consumer.autoCommitOnError=true
If there is an example available, that would be really helpful.
Cheers
You need to make the binder transactional. See the documentation
https://docs.spring.io/spring-cloud-stream-binder-kafka/docs/3.1.4/reference/html/spring-cloud-stream-binder-kafka.html#_kafka_binder_properties
spring.cloud.stream.kafka.binder.transaction.transactionIdPrefix
Enables transactions in the binder. See transaction.id in the Kafka documentation and Transactions in the spring-kafka documentation. When transactions are enabled, individual producer properties are ignored and all producers use the spring.cloud.stream.kafka.binder.transaction.producer.* properties.
Default: null (no transactions).
Note that consumers on the output topic must be configured with isolation.level=read_committed to avoid receiving rolled-back records.
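Pulling those pieces together, a hedged sketch of the minimal property set (TX- is just an example prefix; the last line is a Spring Boot property for the application that consumes the output topic):
spring.cloud.stream.kafka.binder.transaction.transactionIdPrefix=TX-
spring.cloud.stream.kafka.binder.transaction.producer.configuration.acks=all
# in the downstream consuming application:
spring.kafka.consumer.properties.isolation.level=read_committed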

Connecting to multiple clusters spring kafka

I would like to consume messages from one Kafka cluster and publish them to another Kafka cluster. I would like to know how to configure this using spring-kafka.
Simply configure the consumer and producer factories with different bootstrap.servers properties.
If you are using Spring Boot, see
https://docs.spring.io/spring-boot/docs/current/reference/html/appendix-application-properties.html#spring.kafka.consumer.bootstrap-servers
and
https://docs.spring.io/spring-boot/docs/current/reference/html/appendix-application-properties.html#spring.kafka.producer.bootstrap-servers
If you are creating your own factory @Beans, set the properties there.
https://docs.spring.io/spring-kafka/docs/current/reference/html/#connecting
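For the factory-bean route, a minimal sketch (the addresses cluster-a:9092 and cluster-b:9092 and the group id are placeholders):
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class DualClusterConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        // Consume from cluster A.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "cluster-a:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "bridge-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        // Produce to cluster B.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "cluster-b:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate(ProducerFactory<String, String> pf) {
        return new KafkaTemplate<>(pf);
    }
}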
Alternatively, you can use Spring Cloud Stream Kafka binders.
Create two streams, one for consuming and one for producing.
For the consumer:
public interface InputStreamExample {
    String INPUT = "consumer-in";

    @Input(INPUT)
    SubscribableChannel readFromKafka();
}
For the producer:
public interface ProducerStreamExample {
    String OUTPUT = "produce-out";

    @Output(OUTPUT)
    MessageChannel produceToKafka();
}
For consuming messages use:
@StreamListener(InputStreamExample.INPUT)
public void processMessage(Message<?> message) {
    /*
     code goes here
    */
}
For producing:
// here producerStreamExample is an instance of ProducerStreamExample
producerStreamExample.produceToKafka().send(/* message goes here */);
Now configure the consumer and producer clusters using binders; you can use consumer-in for the consumer cluster and produce-out for the producing cluster.
Properties file:
# binder for the consumer cluster
spring.cloud.stream.binders.kafka-a.environment.spring.cloud.stream.kafka.binder.brokers=<consumer cluster>
# other properties for this binder

# bind kafka-a to consumer-in
spring.cloud.stream.bindings.consumer-in.binder=kafka-a
# similarly, other properties of consumer-in, like
spring.cloud.stream.bindings.consumer-in.destination=<topic>
spring.cloud.stream.bindings.consumer-in.group=<consumer group>

# now configure the cluster to produce to
spring.cloud.stream.binders.kafka-b.environment.spring.cloud.stream.kafka.binder.brokers=<cluster where to produce>
# bind kafka-b to produce-out
spring.cloud.stream.bindings.produce-out.binder=kafka-b
# similarly, you can do other configuration, like the topic
spring.cloud.stream.bindings.produce-out.destination=<topic>
Refer to this for more configuration: https://cloud.spring.io/spring-cloud-stream-binder-kafka/spring-cloud-stream-binder-kafka.html

Consume messages from RabbitMQ queue using spring cloud stream 3.0+

I have a producer publishing messages to a RabbitMQ queue via a direct exchange:
queue name: TEMP_QUEUE,
exchange name: TEMP_DIRECT_EXCHANGE
Producing to this queue is easy, since in my producer application I use Spring AMQP, which I am familiar with.
In my consumer application I need to use Spring Cloud Stream version 3.0+.
I want to avoid using legacy annotations like @EnableBinding and @StreamListener because they are about to be deprecated.
Legacy code for my application would look like this:
@EnableBinding(Bindings.class)
public class TempConsumer {

    @StreamListener(target = "TEMP_QUEUE")
    public void consumeFromTempQueue(MyObject object) {
        // do stuff with the object
    }
}

public interface Bindings {

    @Input("TEMP_QUEUE")
    SubscribableChannel myInputBinding();
}
From their docs I have found out I can do something like this:
@Bean
public Consumer<MyObject> consumeFromTempQueue() {
    return obj -> {
        // do stuff with the object
    };
}
It is not clear to me how I specify that this bean will consume from TEMP_QUEUE. Also, what if I want to consume from multiple queues?
See Consuming from Existing Queues/Exchanges.
You can consume from multiple queues with:
spring.cloud.stream.bindings.consumeFromTempQueue-in-0.destination=q1,q2,q3
spring.cloud.stream.bindings.consumeFromTempQueue-in-0.consumer.multiplex=true
Without multiplex you'll get 3 bindings; with multiplex you'll get 1 listener container listening to multiple queues.
You need to use application.yml to bind your bean:
spring.cloud.stream:
  function.definition: consumeFromTempQueue
You can use this configuration to configure sources, processors, and sinks as well. In your case you are just using a consumer (a sink).
You can read this post for more information.
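As a hedged sketch of wiring the functional consumer to the existing queue (property names per the RabbitMQ binder's documentation on consuming from existing queues/exchanges; group=TEMP_QUEUE plus queueNameGroupOnly makes the binder use the queue name as-is):
spring.cloud.stream.function.definition=consumeFromTempQueue
spring.cloud.stream.bindings.consumeFromTempQueue-in-0.destination=TEMP_DIRECT_EXCHANGE
spring.cloud.stream.bindings.consumeFromTempQueue-in-0.group=TEMP_QUEUE
# use the group as the literal queue name and don't re-declare existing infrastructure
spring.cloud.stream.rabbit.bindings.consumeFromTempQueue-in-0.consumer.queueNameGroupOnly=true
spring.cloud.stream.rabbit.bindings.consumeFromTempQueue-in-0.consumer.bindQueue=false
spring.cloud.stream.rabbit.bindings.consumeFromTempQueue-in-0.consumer.declareExchange=false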
