java.lang.InterruptedException in Spark's rdd.collect() method - java

I am getting an InterruptedException in Spark's rdd.collect() method. The stack trace is:
java.lang.Error: java.lang.InterruptedException
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:202)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:222)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:633)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2027)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2048)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2067)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2092)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:939)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
at com.cybernetix.kaafka.EnrichEventSparkConsumer.lambda$startEnrichEventConsumer$ef798c17$1(EnrichEventSparkConsumer.java:76)
at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
... 2 more
And my code:
mapper = new ObjectMapper();
Collection<String> topics = Arrays.asList(sparkConfiguration.getEnrichEventTopic());

Map<String, Object> kafkaParams = new HashMap<>();
kafkaParams.put("bootstrap.servers", sparkConfiguration.getBootStrapServers());
kafkaParams.put("key.deserializer", StringDeserializer.class);
kafkaParams.put("value.deserializer", StringDeserializer.class);
kafkaParams.put("group.id", "group1");
kafkaParams.put("auto.offset.reset", "latest");
kafkaParams.put("enable.auto.commit", true);

JavaInputDStream<ConsumerRecord<String, String>> enrichEventRDD =
        KafkaUtils.createDirectStream(javaStreamingContext, LocationStrategies.PreferConsistent(),
                ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams));

JavaDStream<String> enrichEventDStream = enrichEventRDD.map((x) -> x.value());
JavaDStream<List<Map<String, Object>>> enrichDataModelDStream = enrichEventDStream.map(convertIntoMapList);
// JavaDStream<EnrichEventDataModel> enrichDataModelDStream = enrichEventDStream.map(convertIntoEnrichModel);

enrichDataModelDStream.foreachRDD(enrichDataModelRdd -> {
    if (enrichDataModelRdd.count() > 0) {
        List<List<Map<String, Object>>> enrichEventDataModelList = enrichDataModelRdd.collect();
        saveDataToElasticSearch(enrichEventDataModelList);
        writeToLogsFile("EnrichEvent consumer SparkStreaming job started.", Constants.INFO);
    }
});

Related

JEE MDB with Kafka consumer throws .KafkaException: Failed to construct kafka consumer when deploying

I'm experimenting with Kafka and trying to implement a consumer in a JEE message-driven bean.
But when I try to deploy my consumer I receive the following error (in my Payara log):
Exception while loading the app : EJB Container initialization error
java.lang.Exception
at com.sun.enterprise.connectors.inbound.ConnectorMessageBeanClient.setup(ConnectorMessageBeanClient.java:215)
at org.glassfish.ejb.mdb.MessageBeanContainer.<init>(MessageBeanContainer.java:221)
at org.glassfish.ejb.mdb.MessageBeanContainerFactory.createContainer(MessageBeanContainerFactory.java:63)
at org.glassfish.ejb.startup.EjbApplication.loadContainers(EjbApplication.java:225)
at org.glassfish.ejb.startup.EjbDeployer.load(EjbDeployer.java:286)
at org.glassfish.ejb.startup.EjbDeployer.load(EjbDeployer.java:104)
at org.glassfish.internal.data.ModuleInfo.load(ModuleInfo.java:218)
at org.glassfish.internal.data.ApplicationInfo.load(ApplicationInfo.java:334)
at com.sun.enterprise.v3.server.ApplicationLifecycle.prepare(ApplicationLifecycle.java:570)
at org.glassfish.deployment.admin.DeployCommand.execute(DeployCommand.java:588)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2$1.run(CommandRunnerImpl.java:556)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2$1.run(CommandRunnerImpl.java:552)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at java.base/javax.security.auth.Subject.doAs(Subject.java:376)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2.execute(CommandRunnerImpl.java:551)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$3.run(CommandRunnerImpl.java:582)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$3.run(CommandRunnerImpl.java:574)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at java.base/javax.security.auth.Subject.doAs(Subject.java:376)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:573)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:1497)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.access$1300(CommandRunnerImpl.java:120)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1879)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1755)
at org.glassfish.admin.rest.utils.ResourceUtil.runCommand(ResourceUtil.java:272)
at org.glassfish.admin.rest.utils.ResourceUtil.runCommand(ResourceUtil.java:240)
at org.glassfish.admin.rest.utils.ResourceUtil.runCommand(ResourceUtil.java:294)
at org.glassfish.admin.rest.resources.TemplateListOfResource.createResource(TemplateListOfResource.java:136)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:124)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:167)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:176)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:79)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:475)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:397)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:81)
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:255)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244)
at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
at org.glassfish.jersey.internal.Errors.process(Errors.java:244)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:234)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680)
at org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpContainer.service(GrizzlyHttpContainer.java:356)
at org.glassfish.admin.rest.adapter.RestAdapter$2.service(RestAdapter.java:335)
at org.glassfish.admin.rest.adapter.RestAdapter.service(RestAdapter.java:189)
at com.sun.enterprise.v3.services.impl.ContainerMapper$HttpHandlerCallable.call(ContainerMapper.java:520)
at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:217)
at org.glassfish.grizzly.http.server.HttpHandler.runService(HttpHandler.java:182)
at org.glassfish.grizzly.http.server.HttpHandler.doHandle(HttpHandler.java:156)
at org.glassfish.grizzly.http.server.HttpServerFilter.handleRead(HttpServerFilter.java:218)
at org.glassfish.grizzly.filterchain.ExecutorResolver$9.execute(ExecutorResolver.java:95)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeFilter(DefaultFilterChain.java:260)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeChainPart(DefaultFilterChain.java:177)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.execute(DefaultFilterChain.java:109)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.process(DefaultFilterChain.java:88)
at org.glassfish.grizzly.ProcessorExecutor.execute(ProcessorExecutor.java:53)
at org.glassfish.grizzly.nio.transport.TCPNIOTransport.fireIOEvent(TCPNIOTransport.java:524)
at org.glassfish.grizzly.strategies.AbstractIOStrategy.fireIOEvent(AbstractIOStrategy.java:89)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.run0(WorkerThreadIOStrategy.java:94)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.access$100(WorkerThreadIOStrategy.java:33)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy$WorkerThreadRunnable.run(WorkerThreadIOStrategy.java:114)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:569)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.run(AbstractThreadPool.java:549)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:823)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:664)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:645)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:625)
at fish.payara.cloud.connectors.kafka.inbound.KafkaAsynchWorker.<init>(KafkaAsynchWorker.java:112)
at fish.payara.cloud.connectors.kafka.inbound.KafkaResourceAdapter.endpointActivation(KafkaResourceAdapter.java:107)
at com.sun.enterprise.connectors.inbound.ConnectorMessageBeanClient.setup(ConnectorMessageBeanClient.java:207)
... 70 more
Caused by: org.apache.kafka.common.KafkaException: class org.apache.kafka.common.serialization.StringDeserializer is not an instance of org.apache.kafka.common.serialization.Deserializer
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:403)
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:434)
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:419)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:708)
... 76 more
My platform is:
Payara Server 5.2021.10 #badassfish (build 879)
Java 17
I have followed some tutorials
My Producer:
import java.util.Properties;
import java.util.Random;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class Producer implements Runnable {

    KafkaProducer<String, String> producer;
    String topic = null;

    public Producer() {
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(producerProps);
        topic = System.getenv().getOrDefault("TOPIC_NAME", "emails");
    }

    static Random rnd = new Random();

    @Override
    public void run() {
        System.out.println("Producing to topic " + topic);
        while (true) {
            producer.send(new ProducerRecord<>(topic, "key-" + rnd.nextInt(10), "val-" + rnd.nextInt(10)), new Callback() {
                @Override
                public void onCompletion(RecordMetadata rm, Exception excptn) {
                    System.out.println("Sent data....");
                }
            });
            try {
                Thread.sleep(10000); // take it easy!
            } catch (InterruptedException ex) {
                Logger.getLogger(Producer.class.getName()).log(Level.SEVERE, null, ex);
            }
        }
    }
}
And my consumer:
import fish.payara.cloud.connectors.kafka.api.KafkaListener;
import fish.payara.cloud.connectors.kafka.api.OnRecord;
import javax.ejb.ActivationConfigProperty;
import javax.ejb.MessageDriven;
import org.apache.kafka.clients.consumer.ConsumerRecord;

@MessageDriven(activationConfig = {
    @ActivationConfigProperty(propertyName = "clientId", propertyValue = "mailClient"),
    @ActivationConfigProperty(propertyName = "groupIdConfig", propertyValue = "mailGroup"),
    @ActivationConfigProperty(propertyName = "topics", propertyValue = "emails"),
    @ActivationConfigProperty(propertyName = "bootstrapServersConfig", propertyValue = "localhost:9092"),
    @ActivationConfigProperty(propertyName = "autoCommitInterval", propertyValue = "100"),
    @ActivationConfigProperty(propertyName = "retryBackoff", propertyValue = "1000"),
    @ActivationConfigProperty(propertyName = "keyDeserializer", propertyValue = "org.apache.kafka.common.serialization.StringDeserializer"),
    @ActivationConfigProperty(propertyName = "valueDeserializer", propertyValue = "org.apache.kafka.common.serialization.StringDeserializer"),
    @ActivationConfigProperty(propertyName = "pollInterval", propertyValue = "30000"),
})
public class KafkaMDB implements KafkaListener {

    public KafkaMDB() {
    }

    @OnRecord(topics = {"emails"})
    public void test(ConsumerRecord<Object, Object> record) {
        System.out.println("Payara Kafka MDB record " + record);
    }
}
I can't see what the problem is.
I installed kafka-rar-0.7.0 before deploying my consumer, and the RAR itself deploys without problems.
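For what it's worth, Kafka's "class X is not an instance of Y" message from AbstractConfig.getConfiguredInstance is the signature of a classloader clash: the StringDeserializer it instantiated and the Deserializer interface it checks against were loaded by different classloaders, which can happen on an app server when kafka-clients is packaged both inside the RAR and inside the deployed application. A small hedged diagnostic (a line you could drop into the MDB, not part of the original code):

// Hedged diagnostic: if these two print different classloaders, two copies
// of kafka-clients are visible and the instanceof check fails even though
// the class names match.
System.out.println(org.apache.kafka.common.serialization.StringDeserializer.class.getClassLoader());
System.out.println(org.apache.kafka.common.serialization.Deserializer.class.getClassLoader());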

Spring Integration with Kafka throwing ClassCastException

I have a case where I want to publish a message from a Kafka producer; my message is just a POJO, e.g. CreateRequest. For consuming I have added the code below:
@Bean
@InboundChannelAdapter(channel = "inputchannel", poller = @Poller(fixedDelay = "5000", errorChannel = "errorChannel"))
public KafkaMessageSource consumeMsg() {
    ConsumerFactory<String, String> cf = consumerFactory();
    KafkaMessageSource kafkaMessageSource = new KafkaMessageSource(cf, new ConsumerProperties("Kafka_Topic"));
    kafkaMessageSource.getConsumerProperties().setGroupId("group_id");
    kafkaMessageSource.getConsumerProperties().setClientId("clientid");
    kafkaMessageSource.setMessageConverter(messageConverter());
    kafkaMessageSource.setPayloadType(CreateResponse.class);
    return kafkaMessageSource;
}

@Bean
public ConsumerFactory consumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, this.bootstrapServers);
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, 60000);
    props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 10);
    props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "30000");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, GROUP_ID);
    props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, CreateResponse.class);
    return new DefaultKafkaConsumerFactory(props);
}

@Bean
RecordMessageConverter messageConverter() {
    return new StringJsonMessageConverter();
}
Also, I have added setMessageConverter and setPayloadType to get a payload of type CreateResponse, but I am still getting a payload of type KafkaMessageSource, which throws java.lang.ClassCastException: cannot cast KafkaMessageSource to CreateResponse:
kafkaMessageSource.setMessageConverter(messageConverter());
kafkaMessageSource.setPayloadType(CreateResponse.class);
Stack trace:
org.springframework.messaging.MessageHandlingException: error occurred during processing message in 'MethodInvokingMessageProcessor' [org.springframework.integration.handler.MethodInvokingMessageProcessor#6e6d85cd]; nested exception is java.lang.ClassCastException: org.springframework.integration.kafka.inbound.KafkaMessageSource cannot be cast to com.kafka.response.domain.CreateResponse
at org.springframework.integration.support.utils.IntegrationUtils.wrapInHandlingExceptionIfNecessary(IntegrationUtils.java:192) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.MethodInvokingMessageProcessor.processMessage(MethodInvokingMessageProcessor.java:111) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.ServiceActivatingHandler.handleRequestMessage(ServiceActivatingHandler.java:104) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:134) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:62) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:115) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:133) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:106) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:72) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:570) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:520) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:187) ~[spring-messaging-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:166) ~[spring-messaging-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:47) ~[spring-messaging-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:109) ~[spring-messaging-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.integration.endpoint.SourcePollingChannelAdapter.handleMessage(SourcePollingChannelAdapter.java:196) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.messageReceived(AbstractPollingEndpoint.java:444) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.doPoll(AbstractPollingEndpoint.java:428) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.pollForMessage(AbstractPollingEndpoint.java:376) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.lambda$null$3(AbstractPollingEndpoint.java:323) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.util.ErrorHandlingTaskExecutor.lambda$execute$0(ErrorHandlingTaskExecutor.java:57) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50) ~[spring-core-5.2.19.RELEASE.jar!/:5.2.19.RELEASE]
at org.springframework.integration.util.ErrorHandlingTaskExecutor.execute(ErrorHandlingTaskExecutor.java:55) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.lambda$createPoller$4(AbstractPollingEndpoint.java:320) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54) [spring-context-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:93) [spring-context-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_311]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_311]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_311]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_311]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_311]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_311]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_311]
Caused by: java.lang.ClassCastException: org.springframework.integration.kafka.inbound.KafkaMessageSource cannot be cast to com.kafka.response.domain.CreateResponse
at com.kafka.response.service.MessageConsumerQueue.consume(MessageConsumerQueue.java:55) ~[classes!/:1.0-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_311]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_311]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_311]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_311]
at org.springframework.expression.spel.support.ReflectiveMethodExecutor.execute(ReflectiveMethodExecutor.java:129) ~[spring-expression-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.expression.spel.ast.MethodReference.getValueInternal(MethodReference.java:112) ~[spring-expression-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.expression.spel.ast.MethodReference.access$000(MethodReference.java:55) ~[spring-expression-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.expression.spel.ast.MethodReference$MethodValueRef.getValue(MethodReference.java:387) ~[spring-expression-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:92) ~[spring-expression-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.expression.spel.ast.SpelNodeImpl.getTypedValue(SpelNodeImpl.java:117) ~[spring-expression-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:375) ~[spring-expression-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.integration.util.AbstractExpressionEvaluator.evaluateExpression(AbstractExpressionEvaluator.java:171) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.util.AbstractExpressionEvaluator.evaluateExpression(AbstractExpressionEvaluator.java:156) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.support.MessagingMethodInvokerHelper.invokeExpression(MessagingMethodInvokerHelper.java:637) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.support.MessagingMethodInvokerHelper.fallbackToInvokeExpression(MessagingMethodInvokerHelper.java:630) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.support.MessagingMethodInvokerHelper.processInvokeExceptionAndFallbackToExpressionIfAny(MessagingMethodInvokerHelper.java:614) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.support.MessagingMethodInvokerHelper.invokeHandlerMethod(MessagingMethodInvokerHelper.java:585) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.support.MessagingMethodInvokerHelper.processInternal(MessagingMethodInvokerHelper.java:477) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.support.MessagingMethodInvokerHelper.process(MessagingMethodInvokerHelper.java:355) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.handler.MethodInvokingMessageProcessor.processMessage(MethodInvokingMessageProcessor.java:108) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
... 31 more
Can somebody tell me where I am going wrong?
Here is a simple Spring Boot app which demonstrates that the intention of the mentioned configuration is correct:
@SpringBootApplication
public class So71806313Application {

    public static void main(String[] args) {
        SpringApplication.run(So71806313Application.class, args);
    }

    @Bean
    RecordMessageConverter messageConverter() {
        return new StringJsonMessageConverter();
    }

    @Bean
    @InboundChannelAdapter(channel = "inputChannel", poller = @Poller(fixedDelay = "5000"))
    public KafkaMessageSource<String, String> consumeMsg(ConsumerFactory<String, String> consumerFactory,
            RecordMessageConverter messageConverter) {
        KafkaMessageSource<String, String> kafkaMessageSource = new KafkaMessageSource<>(consumerFactory,
                new ConsumerProperties("Kafka_Topic"));
        kafkaMessageSource.getConsumerProperties().setGroupId("group_id");
        kafkaMessageSource.getConsumerProperties().setClientId("clientid");
        kafkaMessageSource.setMessageConverter(messageConverter);
        kafkaMessageSource.setPayloadType(CreateResponse.class);
        return kafkaMessageSource;
    }

    @Bean
    QueueChannel inputChannel() {
        return new QueueChannel();
    }
}
I'm not sure what your CreateResponse is, so I have it as simple as this:
public class CreateResponse {

    private String name;

    public void setName(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }
}
Then I have an integration test to verify it against an embedded Kafka:
@SpringBootTest
@DirtiesContext
@EmbeddedKafka(bootstrapServersProperty = "spring.kafka.bootstrap-servers", topics = "Kafka_Topic")
class So71806313ApplicationTests {

    @Autowired
    QueueChannel inputChannel;

    @Autowired
    KafkaTemplate<String, String> kafkaTemplate;

    @Test
    void contextLoads() {
        this.kafkaTemplate.send("Kafka_Topic", "{\"name\" : \"foo\"}");
        Message<?> receive = this.inputChannel.receive(10_000);
        assertThat(receive).isNotNull();
        assertThat(receive.getPayload())
                .isInstanceOf(CreateResponse.class)
                .extracting("name")
                .isEqualTo("foo");
    }
}
If I change that consumeMsg() to this (pay attention to the commented-out @Bean):
@Autowired
ConsumerFactory<String, String> consumerFactory;

// @Bean
@InboundChannelAdapter(channel = "inputChannel", poller = @Poller(fixedDelay = "5000"))
public KafkaMessageSource consumeMsg() {
    KafkaMessageSource kafkaMessageSource = new KafkaMessageSource(consumerFactory,
            new ConsumerProperties("Kafka_Topic"));
    kafkaMessageSource.getConsumerProperties().setGroupId("group_id");
    kafkaMessageSource.getConsumerProperties().setClientId("clientid");
    kafkaMessageSource.setMessageConverter(new StringJsonMessageConverter());
    kafkaMessageSource.setPayloadType(CreateResponse.class);
    return kafkaMessageSource;
}
Then it indeed fails with:
java.lang.AssertionError:
Expecting actual:
org.springframework.integration.kafka.inbound.KafkaMessageSource#305289b3
to be an instance of:
org.springframework.integration.stackoverflow.so71806313.CreateResponse
but was instance of:
org.springframework.integration.kafka.inbound.KafkaMessageSource
So, please revise the configuration on your side. Without the @Bean annotation, @InboundChannelAdapter wraps the annotated method itself, so each poll invokes it and uses its return value as the message payload, which is why the payload you receive is the KafkaMessageSource instance rather than a converted CreateResponse.

Kafka with PublishSubscribeChannel throwing java.lang.OutOfMemoryError: Direct buffer memory

Let's say I have the configuration below for my KafkaConsumer; my Kafka setup uses a PublishSubscribeChannel with a task executor.
@Bean
@InboundChannelAdapter(channel = "someInputChannel", poller = @Poller(fixedDelay = "5000", taskExecutor = "taskexecutor"))
public KafkaMessageSource getKafkaMessageSource() {
    KafkaMessageSource kafkaMessageSource = new KafkaMessageSource(consumerFactory, new ConsumerProperties("topic"));
    kafkaMessageSource.getConsumerProperties().setClientId("listner");
    kafkaMessageSource.setMessageConverter(messageConverter());
    kafkaMessageSource.setPayloadType(CustomRequest.class);
    return kafkaMessageSource;
}
ThreadPoolTaskExecutor
@Bean(name = "taskexecutor")
public ThreadPoolTaskExecutor queryRequestTaskExecutor() {
    ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor();
    threadPoolTaskExecutor.setCorePoolSize(poolSize);
    threadPoolTaskExecutor.setMaxPoolSize(maxPoolSize);
    threadPoolTaskExecutor.setThreadNamePrefix("Request-");
    threadPoolTaskExecutor.setWaitForTasksToCompleteOnShutdown(true);
    return threadPoolTaskExecutor;
}
The doRequest channel:
@Bean
public MessageChannel doRequest() {
    return new PublishSubscribeChannel(taskexecutor);
}
The issue I am facing is java.lang.OutOfMemoryError: Direct buffer memory.
Below is the log stack trace for my issue:
2022-04-08 09:25:06.859 ERROR 9 --- [scheduling-1] o.s.i.c.MessagePublishingErrorHandler : failure occurred in messaging task
java.lang.OutOfMemoryError: Direct buffer memory
at java.nio.Bits.reserveMemory(Bits.java:695) ~[?:1.8.0_311]
at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123) ~[?:1.8.0_311]
at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:311) ~[?:1.8.0_311]
at sun.nio.ch.Util.getTemporaryDirectBuffer(Util.java:241) ~[?:1.8.0_311]
at sun.nio.ch.IOUtil.read(IOUtil.java:195) ~[?:1.8.0_311]
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:378) ~[?:1.8.0_311]
at org.apache.kafka.common.network.PlaintextTransportLayer.read(PlaintextTransportLayer.java:103) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:118) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:452) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:402) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:674) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:576) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.common.network.Selector.poll(Selector.java:481) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:561) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:265) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:236) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:227) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.awaitMetadataUpdate(ConsumerNetworkClient.java:164) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureCoordinatorReady(AbstractCoordinator.java:257) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll(ConsumerCoordinator.java:480) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.KafkaConsumer.updateAssignmentMetadataIfNeeded(KafkaConsumer.java:1261) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1230) ~[kafka-clients-2.8.1.jar!/:?]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1210) ~[kafka-clients-2.8.1.jar!/:?]
at org.springframework.integration.kafka.inbound.KafkaMessageSource.doReceive(KafkaMessageSource.java:441) ~[spring-integration-kafka-3.3.1.RELEASE.jar!/:3.3.1.RELEASE]
at org.springframework.integration.endpoint.AbstractMessageSource.receive(AbstractMessageSource.java:184) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.SourcePollingChannelAdapter.receiveMessage(SourcePollingChannelAdapter.java:212) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.doPoll(AbstractPollingEndpoint.java:407) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.pollForMessage(AbstractPollingEndpoint.java:376) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.lambda$null$3(AbstractPollingEndpoint.java:323) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.util.ErrorHandlingTaskExecutor.lambda$execute$0(ErrorHandlingTaskExecutor.java:57) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50) ~[spring-core-5.2.19.RELEASE.jar!/:5.2.19.RELEASE]
at org.springframework.integration.util.ErrorHandlingTaskExecutor.execute(ErrorHandlingTaskExecutor.java:55) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.integration.endpoint.AbstractPollingEndpoint.lambda$createPoller$4(AbstractPollingEndpoint.java:320) ~[spring-integration-core-5.3.2.RELEASE.jar!/:5.3.2.RELEASE]
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54) [spring-context-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:93) [spring-context-5.2.9.RELEASE.jar!/:5.2.9.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_311]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_311]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_311]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_311]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_311]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_311]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_311]
Can someone help me out with this issue? Thanks.
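The trace itself shows where the direct memory goes: sun.nio.ch.IOUtil.read allocates a temporary direct buffer for every consumer socket read, sized to the pending fetch. A hedged sketch of one common mitigation, assuming the consumers are simply fetching more than the JVM's direct-memory budget allows (the keys are standard kafka-clients configs to merge into the existing consumer factory properties; the values are illustrative):

// Hedged sketch, not a confirmed fix: cap how much each poll can pull off
// the socket so the temporary direct buffers stay small. Raising
// -XX:MaxDirectMemorySize is the complementary knob on the JVM side.
props.put(ConsumerConfig.MAX_PARTITION_FETCH_BYTES_CONFIG, 1048576); // 1 MiB per partition
props.put(ConsumerConfig.FETCH_MAX_BYTES_CONFIG, 5242880);           // 5 MiB per fetch
props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 50);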

Kafka TimeoutException stops Application Startup - How do I properly handle this case?

We are using Kafka consumers in our Spring Boot application. When the beans are created, the KafkaListenerEndpointRegistry is built and tries to fetch metadata for the containers. The problem is that we sometimes receive a TimeoutException, which stops the application startup. See the full stack trace:
2019-08-28 07:49:46,053 [main] ERROR o.s.boot.SpringApplication - Application run failed
org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:185)
at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:53)
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:360)
at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:158)
at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:122)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:893)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.finishRefresh(ServletWebServerApplicationContext.java:161)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:552)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:742)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:389)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:311)
at org.springframework.boot.web.servlet.support.SpringBootServletInitializer.run(SpringBootServletInitializer.java:151)
at org.springframework.boot.web.servlet.support.SpringBootServletInitializer.createRootApplicationContext(SpringBootServletInitializer.java:131)
at com.pack.MyApplication.createRootApplicationContext(MyApplication.java:45)
at org.springframework.boot.web.servlet.support.SpringBootServletInitializer.onStartup(SpringBootServletInitializer.java:91)
at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:171)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5132)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:717)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:690)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:705)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:978)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1849)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:773)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:427)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1576)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:309)
at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:123)
at org.apache.catalina.util.LifecycleBase.setStateInternal(LifecycleBase.java:423)
at org.apache.catalina.util.LifecycleBase.setState(LifecycleBase.java:366)
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:936)
at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:841)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1384)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1374)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:909)
at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:262)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.StandardService.startInternal(StandardService.java:421)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:932)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.startup.Catalina.start(Catalina.java:633)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:344)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:475)
Caused by: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
Since I'm new to Kafka, I'm not sure how to handle this correctly so that my application will still start even when the metadata cannot be fetched.
I thought about setting the ErrorHandler on the ContainerFactory, but I'm not sure whether this will resolve my issue.
@Bean
ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setRetryTemplate(getRetryTemplate());
    factory.setErrorHandler(new LoggingErrorHandler());
    factory.setAutoStartup(false);
    return factory;
}

@Bean
public RetryPolicy getRetryPolicy() {
    SimpleRetryPolicy simpleRetryPolicy = new SimpleRetryPolicy();
    simpleRetryPolicy.setMaxAttempts(getMaxRetryAttempts());
    return simpleRetryPolicy;
}

@Bean
public FixedBackOffPolicy getBackOffPolicy() {
    FixedBackOffPolicy backOffPolicy = new FixedBackOffPolicy();
    backOffPolicy.setBackOffPeriod(getRetryInterval());
    return backOffPolicy;
}

@Bean
public RetryTemplate getRetryTemplate() {
    RetryTemplate retryTemplate = new RetryTemplate();
    retryTemplate.setRetryPolicy(getRetryPolicy());
    retryTemplate.setBackOffPolicy(getBackOffPolicy());
    return retryTemplate;
}

@Bean
public ConsumerFactory<String, String> consumerFactory() {
    return new DefaultKafkaConsumerFactory<>(consumerConfigs());
}

@Bean
public Map<String, Object> consumerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, env.getProperty("spring.kafka.consumer.bootstrap-servers"));
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, env.getProperty("spring.kafka.consumer.key-deserializer"));
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, env.getProperty("spring.kafka.consumer.value-deserializer"));
    props.put(ConsumerConfig.GROUP_ID_CONFIG, env.getProperty("spring.kafka.consumer.group-id"));
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, env.getProperty("spring.kafka.consumer.auto-offset-reset"));
    props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    props.put(ConsumerConfig.CLIENT_ID_CONFIG, env.getProperty("spring.kafka.consumer.client-id"));
    props.put(SaslConfigs.SASL_MECHANISM, env.getProperty("kafka.sasl.mechanism"));
    props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, env.getProperty("kafka.security.protocol.config"));
    props.put(SaslConfigs.SASL_JAAS_CONFIG, env.getProperty("kafka.sasl.jaas.configuration"));
    try {
        Resource res = new ClassPathResource(env.getProperty("spring.kafka.ssl.trust-store-location"));
        props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, res.getFile().getAbsolutePath());
    } catch (IOException e) {
        logger.error("SSL Truststore for Kafka connection could not be found!", e);
    }
    props.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, env.getProperty("avro.kafka.consumer.deserializer.schema-url"));
    props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG,
            env.getProperty("avro.kafka.consumer.deserializer.specific-avro-reader"));
    return props;
}

private int getMaxRetryAttempts() {
    return this.retryAttempts;
}

private int getRetryInterval() {
    return this.retryInterval;
}
Can anyone explain how the ContainerStoppingErrorHandler works in this case and whether it might resolve my issue?
I also thought about creating the KafkaListenerEndpointRegistry bean myself, but I don't know how to configure it so that the exception is handled properly and application startup continues. Maybe autoStartup is something I should consider here; if so, what might be the correct way to control and handle the startup of the containers?
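Building on that autoStartup idea, a minimal hedged sketch: with setAutoStartup(false) (already present in the factory above) the registry no longer starts the containers during context refresh, so a metadata timeout cannot abort startup; the containers are then started once the application is up, and a failure is reduced to a log entry. The class and method names here are illustrative, not from the original configuration:

private static final Logger LOG = LoggerFactory.getLogger(KafkaStartupRunner.class);

@Autowired
private KafkaListenerEndpointRegistry registry;

@EventListener(ApplicationReadyEvent.class)
public void startKafkaListeners() {
    try {
        registry.start(); // starts every container registered with autoStartup = false
    } catch (Exception e) {
        LOG.error("Kafka listener containers could not be started", e);
    }
}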
The versions used:
spring-kafka version 2.2.7.RELEASE
spring-boot-starter-parent 2.1.6.RELEASE
Thanks in advance.

java.lang.StackOverflowError when doing Spark Streaming

I'm using Spark Streaming to parse some Kafka messages in real time. Before I parse the messages, I read some files from local disk and construct two variables, GridMatrix GM and LinkMatcher LM, which are needed for parsing. Here is the code that gives me java.lang.StackOverflowError when I submit it using spark-submit xxx.jar:
public class Stream implements Serializable {
    GridMatrix GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
    LinkMatcher LM = new LinkMatcher();

    public void parse_rdd_record(String[] fields) {
        try {
            System.out.println(InetAddress.getLocalHost().getHostName() + "---->" + Thread.currentThread());
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println(LM.GF.toString());
        System.out.println(GM.topleft_x);
    }

    public void Streaming_process() throws Exception {
        SparkConf conf = new SparkConf()
                .setAppName("SparkStreaming")
                .setMaster("local[*]");
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        conf.registerKryoClasses(new Class<?>[]{
                Class.forName("Streaming.Stream")
        });
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.setLogLevel("WARN");
        JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(2000));

        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "xxx.xx.xx.xx:20103,xxx.xx.xx.xx:20104,xxx.xx.xx.xx:20105");
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "use_a_separate_group_id_for_each_stream");
        kafkaParams.put("auto.offset.reset", "latest");
        kafkaParams.put("enable.auto.commit", false);

        Collection<String> topics = Arrays.asList("nc_topic_gis_test");
        JavaInputDStream<ConsumerRecord<String, String>> GPS_DStream =
                KafkaUtils.createDirectStream(
                        ssc,
                        LocationStrategies.PreferConsistent(),
                        ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams)
                );

        JavaPairDStream<String, String> GPS_DStream_Pair = GPS_DStream.mapToPair(
                (PairFunction<ConsumerRecord<String, String>, String, String>) record ->
                        new Tuple2<>("GPSValue", record.value()));

        GPS_DStream_Pair.foreachRDD(PairRDD -> PairRDD.foreach(rdd -> {
            String[] fields = rdd._2.split(",");
            this.parse_rdd_record(fields);
        }));

        ssc.start();
        ssc.awaitTermination();
    }

    public static void main(String[] args) throws Exception {
        new Stream().Streaming_process();
    }
}
It gives me the following error:
Exception in thread "streaming-job-executor-0" java.lang.StackOverflowError
at java.io.Bits.putDouble(Bits.java:121)
at java.io.ObjectStreamClass$FieldReflector.getPrimFieldValues(ObjectStreamClass.java:2168)
at java.io.ObjectStreamClass.getPrimFieldValues(ObjectStreamClass.java:1389)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1533)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at java.util.HashMap.internalWriteEntries(HashMap.java:1790)
at java.util.HashMap.writeObject(HashMap.java:1363)
at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1140)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at java.util.HashMap.internalWriteEntries(HashMap.java:1790)
at java.util.HashMap.writeObject(HashMap.java:1363)
at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
However, if I change GM and LM to static variables, it runs well, changing the two field declarations at the top of the class to:
private static final GridMatrix GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
private static final LinkMatcher LM = new LinkMatcher();
Could someone tell me why it won't work with non-static variables?
The difference between the static and non-static versions is that, when non-static, the fields are shipped to all the workers as part of the Stream closure; when static they are not shipped by default, unless one of the streaming lambdas uses them directly, which is not the case here.
While sending the closure to the workers, Spark tries to serialize the object, and that is what fails, according to the provided stack trace. The reason is most probably that those structures contain deeply nested or cyclic references inside and cannot be properly serialized.
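If making them static is undesirable, a hedged alternative sketch (reusing the class names from the question) is to keep the fields non-static but transient and rebuild them lazily on each worker, so Java serialization of the Stream closure never touches them:

// Hedged sketch: transient fields are skipped by Java serialization, so the
// closure ships without GM/LM and each executor rebuilds them on first use.
private transient GridMatrix GM;
private transient LinkMatcher LM;

private GridMatrix gm() {
    if (GM == null) {
        GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
    }
    return GM;
}

private LinkMatcher lm() {
    if (LM == null) {
        LM = new LinkMatcher();
    }
    return LM;
}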
