Kafka Spring Integration authorization with SASL - Java

I am trying to connect to a Kafka server via the Spring Integration module with a SASL config and get this error:
java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is not set
But when I construct a simple consumer and poll messages, everything works fine. Can someone tell me how to turn off JAAS authorization or how to connect via it properly?
Here is my KafkaConfig.java and SaslConfiguration.java. Thanks for any answer!

There is a DefaultKafkaConsumerFactory which can simply accept the same set of properties you mention in your gist: https://docs.spring.io/spring-kafka/docs/2.1.10.RELEASE/reference/html/_reference.html#_receiving_messages
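A minimal sketch of the idea, with placeholder broker address, group id, and credentials; the point is that sasl.jaas.config can be passed inline as a consumer property, so no JAAS file and no java.security.auth.login.config system property are needed:

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Bean
public ConsumerFactory<String, String> consumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "my-broker:9093"); // placeholder
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");                // placeholder
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    // The same SASL settings the working plain consumer uses, passed inline:
    props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
    props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
    props.put(SaslConfigs.SASL_JAAS_CONFIG,
            "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"user\" password=\"secret\";");
    return new DefaultKafkaConsumerFactory<>(props);
}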

Related

Configure Spring Kafka schema registry security

I am using Spring Kafka and Avro.
The schema registry is secured with the SASL PLAIN protocol.
I must develop a Kafka consumer and producer, but I can't find a way with Spring Kafka to configure the security properties for the schema registry.
Does the Spring Kafka library already integrate this configuration?
Do I need to configure kafkastore... in a schema registry properties file?
I didn't find any examples with Spring Kafka.
There are docs like this related to your question, and demo examples on GitHub.
Note that with Spring you can have common properties for Kafka; just make sure to use the Spring Kafka dependency. Maybe first try to establish a connection between both consumer and producer, and then add SASL PLAIN security.
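For the registry part specifically, a sketch using Spring Boot's spring.kafka.properties.* passthrough, assuming the Confluent Avro serializers and a placeholder registry URL; basic.auth.credentials.source=SASL_INHERIT reuses the Kafka SASL credentials for the registry:

spring.kafka.properties.schema.registry.url=https://my-registry:8081
spring.kafka.properties.basic.auth.credentials.source=SASL_INHERIT
# or with explicit credentials instead:
# spring.kafka.properties.basic.auth.credentials.source=USER_INFO
# spring.kafka.properties.basic.auth.user.info=myuser:mypassword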

Elasticsearch Client is defaulting to localhost

I am using spring.data.elasticsearch with Elasticsearch 7.14.1 in my Spring Boot (2.5.4) application.
My application.properties is something like this
spring.elasticsearch.rest.uris=elasticsearch:9200
spring.data.elasticsearch.cluster=elasticsearch:9200
spring.data.elasticsearch.repositories.enabled=true
This works fine as long as the invocation is from localhost; no issues. However, when I try to bring up my Spring Boot container, I see a failure with NoReachableHostException:
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.data.elasticsearch.client.NoReachableHostException: Host 'localhost:9200' not reachable. Cluster state is offline.
blah-blah-service
Caused by: org.springframework.data.elasticsearch.client.NoReachableHostException: Host 'localhost:9200' not reachable. Cluster state is offline.
blah-blah-service
at org.springframework.data.elasticsearch.client.reactive.SingleNodeHostProvider.lambda$lookupActiveHost$3(SingleNodeHostProvider.java:101)
Clearly, my suggestion to use the "elasticsearch" host (a defined network that is tested and accessible from within and outside the Docker containers) hasn't gone well with Spring Data for whatever reason. I have even used
@PropertySource("classpath:mysearch.properties")
in my application to try and coax these properties into the app, but it doesn't look like anything works. Is there something I am missing in my Elasticsearch configuration or otherwise?
PS: I have exercised curl http://elasticsearch:9200 from within the container and find no issues.
These configurations are Spring Boot specific; Spring Data Elasticsearch does not use them. But as far as I can see, you are configuring the transport client (the cluster entry, which should not be used anyway) and the imperative REST client, whereas according to the error message you are using the reactive REST client.
According to the Spring Boot documentation you would need to set spring.data.elasticsearch.client.reactive.endpoints.
In the .properties file, I used the following, which solved the issue for me:
spring.elasticsearch.rest.uris=http://localhost:<port_number>
spring.data.elasticsearch.client.reactive.endpoints=localhost:<port_number>

Apache Camel route with RabbitMQ fails with "No endpoint could be found for"

I have tried to implement a simple application with Apache Camel and RabbitMQ. Below is the route that I have:
from("direct:startQueuePoint")
.id("idOfQueueHere")
.marshal(jsonDataFormat)
.to("rabbitmq:tasks?hostname=localhost&port=5672&autoDelete=false&routingKey=camel")
.end();
When I run the Spring Boot application which runs this route, it throws an error:
because of No endpoint could be found for: rabbitmq://tasks?autoDelete=false&hostname=localhost&port=5672&routingKey=camel, please check your classpath contains the needed Camel component jar
I created an exchange named 'tasks' in the RabbitMQ management console and bound it to the queue 'task_queue' with the routing key 'camel'. I can see in netstat that port 5672 is running the Erlang exe.
I am not sure what mistake I am making here. Could someone please help me out?
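The error message itself hints at the likely fix: the RabbitMQ component jar is missing from the classpath. A sketch of the dependency to add, assuming a Maven build (for Spring Boot setups the camel-rabbitmq-starter artifact is the usual choice; ${camel.version} is a placeholder for your Camel release):

<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-rabbitmq</artifactId>
    <version>${camel.version}</version>
</dependency>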

How to request HTTPS endpoints through Quarkus

I am creating an endpoint that depends on another endpoint. I have created the interface and did all of that stuff, but when I request https://example.com for info, it does not respond and a request timeout exception comes up. Quarkus does not seem to support HTTPS requests, even though I have added certificates. Kindly let me know what I am doing wrong or what I need to do.
quarkus.http.ssl.certificate.file=META-INF/dev.crt
quarkus.http.ssl.certificate.key-file=META-INF/dev.com.key
com.package.xyz/mp-rest/url=https://example.com
You need to specify some properties in your application.properties file:
quarkus.http.ssl-port=8443
quarkus.http.insecure-requests=enabled
quarkus.http.ssl.certificate.key-store-file=keystore.jks
quarkus.http.ssl.certificate.key-store-password=password
The documentation source I used for this was the Quarkus Cookbook, available from Red Hat (https://developers.redhat.com/books/quarkus-cookbook); see section 3.8.
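Note that the quarkus.http.ssl.* properties configure the server side. If it is the outbound MicroProfile REST client call that times out, the trust store can be set per client instead; a sketch, assuming MicroProfile REST Client 1.3+ and placeholder store path and password:

com.package.xyz/mp-rest/url=https://example.com
com.package.xyz/mp-rest/trustStore=classpath:/META-INF/truststore.jks
com.package.xyz/mp-rest/trustStorePassword=changeit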

How can you use TLS for Kafka in Quarkus?

The Kafka guide from Quarkus works nicely when running Kafka locally in Docker. I'm trying to change this sample by replacing the local Kafka service with a hosted Kafka service in the cloud which requires TLS.
Does anyone know how I can configure this? In the Quarkus documentation and the SmallRye documentation I don't see any properties for this.
I'd like to use the Kafka service in the IBM Cloud. Based on the documentation I've tried the following configuration in application.properties:
kafka.bootstrap.servers=broker-0-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-4-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-3-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-5-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-2-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-1-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093
kafka.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="token" password="...";
kafka.sasl.mechanism=PLAIN
kafka.security.protocol=SASL_SSL
kafka.ssl.protocol=TLSv1.2
Update:
I've also tried Gunnar's suggestion below, but it doesn't work. When I use the following application.properties ...
mp.messaging.outgoing.generated-price.connector=smallrye-kafka
mp.messaging.outgoing.generated-price.topic=prices
mp.messaging.outgoing.generated-price.value.serializer=org.apache.kafka.common.serialization.IntegerSerializer
mp.messaging.outgoing.generated-price.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="token" password="...";
mp.messaging.outgoing.generated-price.sasl.mechanism=PLAIN
mp.messaging.outgoing.generated-price.security.protocol=SASL_SSL
mp.messaging.outgoing.generated-price.ssl.protocol=TLSv1.2
mp.messaging.incoming.prices.connector=smallrye-kafka
mp.messaging.incoming.prices.topic=prices
mp.messaging.incoming.prices.value.deserializer=org.apache.kafka.common.serialization.IntegerDeserializer
mp.messaging.outgoing.prices.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="token" password="...";
mp.messaging.outgoing.prices.sasl.mechanism=PLAIN
mp.messaging.outgoing.prices.security.protocol=SASL_SSL
mp.messaging.outgoing.prices.ssl.protocol=TLSv1.2
kafka.bootstrap.servers=broker-0-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-4-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-3-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-5-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-2-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-1-8c8cph49mx2p2wqy.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093
... I get an error:
javax.enterprise.inject.spi.DeploymentException: java.lang.IllegalArgumentException: Invalid channel configuration - the connector attribute must be set for channel prices
at io.quarkus.smallrye.reactivemessaging.runtime.SmallRyeReactiveMessagingLifecycle.onApplicationStart(SmallRyeReactiveMessagingLifecycle.java:22)
Is TLS currently possible for Kafka in Quarkus?
Thanks
Have you tried specifying the relevant properties at the channel level? E.g.
mp.messaging.outgoing.generated-price.connector=smallrye-kafka
mp.messaging.outgoing.generated-price.topic=mytopic
mp.messaging.outgoing.generated-price.ssl.protocol=...
mp.messaging.outgoing.generated-price.ssl.keystore.location=...
mp.messaging.outgoing.generated-price.ssl.keystore.password=...
You could also refer to variables when you need the same values for multiple topics, as in the sketch below.
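A sketch of that, assuming SmallRye Config property expressions (${...}); my.sasl.jaas is a hypothetical property name used only to hold the shared value, with the password elided as in the question:

my.sasl.jaas=org.apache.kafka.common.security.plain.PlainLoginModule required username="token" password="...";
mp.messaging.outgoing.generated-price.sasl.jaas.config=${my.sasl.jaas}
mp.messaging.incoming.prices.sasl.jaas.config=${my.sasl.jaas}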
One property is incorrect in the accepted answer by @Gunnar. It should be "security" instead of "ssl" in the property name.
mp.messaging.outgoing.generated-price.security.protocol=SSL
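Putting both answers together, a channel-level sketch for a SASL_SSL broker (topic name taken from the question, credentials elided):

mp.messaging.outgoing.generated-price.connector=smallrye-kafka
mp.messaging.outgoing.generated-price.topic=prices
mp.messaging.outgoing.generated-price.value.serializer=org.apache.kafka.common.serialization.IntegerSerializer
mp.messaging.outgoing.generated-price.security.protocol=SASL_SSL
mp.messaging.outgoing.generated-price.sasl.mechanism=PLAIN
mp.messaging.outgoing.generated-price.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="token" password="...";

Note, too, that in the updated configuration in the question the last four prices properties say outgoing where they presumably should say incoming; that alone would explain the "connector attribute must be set for channel prices" error, since they declare an outgoing channel named prices that has no connector.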
