Spring Cloud Kafka and Avro serialization issue - Java

I use spring-cloud-stream-schema to read Avro messages from Kafka. I configured the input channel in MessagesChannels:
@Input("topicName1")
SubscribableChannel fromInput1();
I have a configuration file like this:
@Configuration
@EnableBinding(MessagesChannels.class)
@EnableSchemaRegistryClient
public class MessageConfiguration {
    @Bean
    public MessageConverter topic1MessageConverter() throws IOException {
        return new AvroSchemaMessageConverter(MimeType.valueOf("avro/bytes"));
    }
}
And my consumer is wired up with:
fromInput1().subscribe(this::onMessage);

void onMessage(Message message) {
}
When I actually sent a message I got this error:
nested exception is java.lang.ClassCastException:
org.apache.avro.generic.GenericData$Record cannot be cast to [B
The raw bytes are actually parsed correctly into an org.apache.avro.generic.GenericData$Record, but Spring requires a Message class. How can I cast the GenericData$Record to a Message, or cast the GenericData$Record directly to the class generated by avro-tools?
More details:
2017-03-06 11:23:10.695 ERROR 19690 --- [afka-listener-1] o.s.kafka.listener.LoggingErrorHandler : Error while processing: ConsumerRecord(topic = topic1, partition = 0, offset = 7979, CreateTime = 1488784987569, checksum = 623709057, serialized key size = -1, serialized value size = 36, key = null, value = {"foor": "bar"})
org.springframework.messaging.MessageHandlingException: error occurred in message handler [org.springframework.cloud.stream.binder.AbstractMessageChannelBinder$ReceivingHandler@4bf9d802]; nested exception is java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to [B
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:139)
at org.springframework.integration.channel.FixedSubscriberChannel.send(FixedSubscriberChannel.java:70)
at org.springframework.integration.channel.FixedSubscriberChannel.send(FixedSubscriberChannel.java:64)

I think you need to set the contentType for the incoming message channel to use application/*+avro, as specified here.
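For example, a minimal sketch of that setting in application.properties, assuming the binding is named topicName1 as in the @Input declaration above (the converter's MIME type would presumably need to match whatever contentType you choose):
spring.cloud.stream.bindings.topicName1.contentType=application/*+avro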

Related

ClassCastException with Stacktrace Hazelcast version 4.2.5 using ReplicatedMap

Using Hazelcast version 4.2.5 in a webapp deployed on Tomcat on Kubernetes, we're frequently (every 5 seconds) seeing a ClassCastException with a stack trace in the application logs.
Here's the ClassCastException:
java.lang.ClassCastException: class java.lang.String cannot be cast to class com.hazelcast.internal.serialization.impl.HeapData (java.lang.String is in module java.base of loader 'bootstrap'; com.hazelcast.internal.serialization.impl.HeapData is in unnamed module of loader org.apache.catalina.loader.ParallelWebappClassLoader @2f04993d)
27-Oct-2022 22:57:56.357 WARNING [hz.rogueUsers.cached.thread-2] com.hazelcast.internal.metrics.impl.MetricsCollectionCycle.null Collecting metrics from source com.hazelcast.replicatedmap.impl.ReplicatedMapService failed
at com.hazelcast.replicatedmap.impl.LocalReplicatedMapStatsProvider.getLocalReplicatedMapStats(LocalReplicatedMapStatsProvider.java:85)
at com.hazelcast.replicatedmap.impl.ReplicatedMapService.getLocalReplicatedMapStats(ReplicatedMapService.java:197)
at com.hazelcast.replicatedmap.impl.ReplicatedMapService.getStats(ReplicatedMapService.java:357)
at com.hazelcast.replicatedmap.impl.ReplicatedMapService.provideDynamicMetrics(ReplicatedMapService.java:387)
at com.hazelcast.internal.metrics.impl.MetricsCollectionCycle.collectDynamicMetrics(MetricsCollectionCycle.java:88)
at com.hazelcast.internal.metrics.impl.MetricsRegistryImpl.collect(MetricsRegistryImpl.java:316)
at com.hazelcast.internal.metrics.impl.MetricsService.collectMetrics(MetricsService.java:160)
at com.hazelcast.internal.metrics.impl.MetricsService.collectMetrics(MetricsService.java:154)
at com.hazelcast.spi.impl.executionservice.impl.DelegateAndSkipOnConcurrentExecutionDecorator$DelegateDecorator.run(DelegateAndSkipOnConcurrentExecutionDecorator.java:77)
at com.hazelcast.internal.util.executor.CachedExecutorServiceDelegate$Worker.run(CachedExecutorServiceDelegate.java:217)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
at com.hazelcast.internal.util.executor.HazelcastManagedThread.executeRun(HazelcastManagedThread.java:76)
at com.hazelcast.internal.util.executor.HazelcastManagedThread.run(HazelcastManagedThread.java:102)
Here's how we're setting up Hazelcast:
private static HazelcastInstance setupHazelcastConfig() {
    Config config = new Config();
    config.setInstanceName("rogueUsers");
    NetworkConfig network = config.getNetworkConfig();
    network.setPort(5701).setPortCount(20);
    network.setPortAutoIncrement(true);
    JoinConfig join = network.getJoin();
    join.getMulticastConfig().setEnabled(true);
    // join.getTcpIpConfig()
    //     .setEnabled(true);
    HazelcastInstance hz = Hazelcast.getOrCreateHazelcastInstance(config);

    ReplicatedMapConfig replicatedMapConfig =
            config.getReplicatedMapConfig("rogueUsers");
    replicatedMapConfig.setInMemoryFormat(InMemoryFormat.BINARY);
    replicatedMapConfig.setAsyncFillup(true);
    replicatedMapConfig.setStatisticsEnabled(true);
    replicatedMapConfig.setSplitBrainProtectionName("splitbrainprotection-name");

    ReplicatedMap<String, String> map = hz.getReplicatedMap("rogueUsers");
    map.addEntryListener(new RogueEntryListener());
    return hz;
}
Is this a configuration issue? How do I fix this?
Thanks very much.
The exception is being thrown from the following line:
if (isBinary) {
    memoryUsage += ((HeapData) record.getValueInternal()).getHeapCost(); // <-- exception
}
which is line 85 of the com.hazelcast.replicatedmap.impl.LocalReplicatedMapStatsProvider class (per the stack trace above). The condition being checked is the following:
boolean isBinary = (replicatedMapConfig.getInMemoryFormat() == InMemoryFormat.BINARY);
So basically, it is related to the format in which you are saving the data (in the config above you have chosen BINARY).
However, I don't think you are using it consistently, since you do the following in your config: ReplicatedMap<String, String> map = hz.getReplicatedMap("rogueUsers");
From the Javadoc of the com.hazelcast.internal.serialization.Data class:
Data is basic unit of serialization. It stores binary form of an object serialized by SerializationService.toData(Object).
Therefore, try editing your config to this:
ReplicatedMap<Data, Data> map = hz.getReplicatedMap("rogueUsers");
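As a side note (an assumption, not part of the answer above): given the isBinary condition quoted, a hypothetical workaround for plain String keys and values is to store the map in OBJECT format, so the stats provider never takes the HeapData branch:
replicatedMapConfig.setInMemoryFormat(InMemoryFormat.OBJECT); // hypothetical: isBinary becomes false, so the cast is skipped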

Kafka-streams application error using the mapValues() method with Gson

I am writing a kafka-streams application that gets data from the topic "topic_one" (the data was received from MySQL). I want to extract a part of this data (the "after" section, see below) with the KStream interface in order to perform further operations. But I get a serialization error when I use mapValues(). I am new to kafka-streams and have no idea how to create and use a proper serde. Can anybody help me?
Source data from topic_one:
[KSTREAM-SOURCE-0000000000]: null, {"before": null, "after": {"id": 1, "category": 1, "item": "abc"}, "source": {"version": "0.8.3.Final", "name": "example", "server_id": 1, "ts_sec": 1581491071, "gtid": null, "file": "mysql-bin.000013", "pos": 217827349, "row": 0, "snapshot": false, "thread": 95709, "db": "example", "table": "item", "query": null}, "op": "c", "ts_ms": 1581491071727}
I want to get:
{"id": 1, "category": 1, "item": "abc"}
My code:
Properties properties = getProperties();
try {
    StreamsBuilder builder = new StreamsBuilder();

    KStream<String, String> resourceStream = builder.stream("topic_one");
    resourceStream.print(Printed.toSysOut());

    KStream<String, String> resultStream = resourceStream.mapValues(value ->
            new Gson().fromJson(value, JsonObject.class).get("after").getAsJsonObject().toString());
    resultStream.print(Printed.toSysOut());

    Topology topology = builder.build();
    KafkaStreams streams = new KafkaStreams(topology, properties);
    streams.cleanUp();
    streams.start();
} catch (Exception e) {
    System.out.println(e.getMessage());
}
}
private static Properties getProperties() {
    Properties properties = new Properties(); // TODO: move these settings into a separate file?
    properties.put(StreamsConfig.APPLICATION_ID_CONFIG, "app_id");
    properties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    properties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    properties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
    properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    properties.put("schema.registry.url", "http://localhost:8081");
    return properties;
}
Error:
Exception in thread "streams_id-db618fbf-c3e4-468b-a5a2-18e6b0b9c6be-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_0, processor=KSTREAM-SOURCE-0000000000, topic=matomo.matomo.matomo_scenarios_directory, partition=0, offset=30, stacktrace=org.apache.kafka.streams.errors.StreamsException: ClassCastException invoking Processor. Do the Processor's input types match the deserialized types? Check the Serde setup and change the default Serdes in StreamConfig or provide correct Serdes via method parameters. Make sure the Processor can accept the deserialized input of type key: unknown because key is null, and value: org.apache.avro.generic.GenericData$Record.
Note that although incorrect Serdes are a common cause of error, the cast exception might have another cause (in user code, for example). For example, if a processor wires in a store, but casts the generics incorrectly, a class cast exception could be raised during processing, but the cause would not be wrong Serdes.
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:122)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:87)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:429)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:474)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:536)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:792)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:698)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:671)
Caused by: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to java.lang.String
at org.apache.kafka.streams.kstream.internals.AbstractStream.lambda$withKey$1(AbstractStream.java:103)
at org.apache.kafka.streams.kstream.internals.KStreamMapValues$KStreamMapProcessor.process(KStreamMapValues.java:40)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:118)
... 10 more
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:446)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:474)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:536)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:792)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:698)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:671)
Caused by: org.apache.kafka.streams.errors.StreamsException: ClassCastException invoking Processor. Do the Processor's input types match the deserialized types? Check the Serde setup and change the default Serdes in StreamConfig or provide correct Serdes via method parameters. Make sure the Processor can accept the deserialized input of type key: unknown because key is null, and value: org.apache.avro.generic.GenericData$Record.
Note that although incorrect Serdes are a common cause of error, the cast exception might have another cause (in user code, for example). For example, if a processor wires in a store, but casts the generics incorrectly, a class cast exception could be raised during processing, but the cause would not be wrong Serdes.
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:122)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:87)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:429)
... 5 more
Caused by: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to java.lang.String
at org.apache.kafka.streams.kstream.internals.AbstractStream.lambda$withKey$1(AbstractStream.java:103)
at org.apache.kafka.streams.kstream.internals.KStreamMapValues$KStreamMapProcessor.process(KStreamMapValues.java:40)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:118)
... 10 more
In your getProperties() method, you defined your value serde as GenericAvroSerde.class, but when you create the streams, you are using String as value type. That's why you get the exception at runtime.
KStream<String, String> resourceStream = ...
KStream<String, String> resultStream = ...
If you really use Avro as the message format, then you have to use the correct types when defining your KStreams (a sketch follows below). But since it seems you just have JSON strings as values, you can probably just set the correct value serde by replacing
properties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
with
properties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
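If the values really are Avro, a minimal typed sketch instead (assuming the GenericAvroSerde default stays in place and the records carry the "after" field shown in the sample above; the value type is org.apache.avro.generic.GenericRecord):
KStream<String, GenericRecord> resourceStream = builder.stream("topic_one");
KStream<String, String> resultStream = resourceStream.mapValues(record ->
        record.get("after").toString()); // "after" is itself a record; toString() renders it as JSON-like text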
Hope it helps.

Apache Ignite: How do I fix the "Unknown pair" error when using a StreamReceiver?

How do I fix the following serialization error when using a DataStreamer with a StreamReceiver? I am guessing that it is not able to find the class when deserializing the RowStreamReceiver.
Error: SEVERE: Failure in Java callback class org.apache.ignite.IgniteException: Platform error:Apache.Ignite.Core.Binary.BinaryObjectException: Unknown pair [platformId=1, typeId=113114]
I'm using Apache Ignite 2.0, and I am trying to employ the same kind of code demonstrated in this test:
https://github.com/apache/ignite/blob/master/modules/platforms/dotnet/Apache.Ignite.Core.Tests/Dataload/DataStreamerTest.cs#L436
I've tried adding the assembly to the configuration, but that didn't seem to help:
config.Assemblies.Add(typeof(RowStreamReceiver).Assembly.FullName);
Here's the relevant code:
DataStreamer:
using (var ds = m_ignite.GetDataStreamer<string, IBinaryObject>(CacheName)) {
    ds.AllowOverwrite = true;
    ds.Receiver = new RowStreamReceiver(); // If I comment this out, the error goes away

    Parallel.ForEach(rows.Select((r, i) => new KeyValuePair<long, string>(i, r)), r => {
        var pair = BuildRow(r.Key, r.Value);
        ds.AddData(pair);
    });
}
StreamReceiver:
[Serializable]
public class RowStreamReceiver : IStreamReceiver<string, IBinaryObject> {
    public void Receive(ICache<string, IBinaryObject> cache, ICollection<ICacheEntry<string, IBinaryObject>> entries) {
        var bin = cache.Ignite.GetBinary();
        cache.PutAll(entries.ToDictionary(x => x.Key, x => {
            var builder = bin.GetBuilder(x.Value);
            SetColumnFields(builder);
            return builder.Build();
        }));
    }

    private static void SetColumnFields(IBinaryObjectBuilder builder) {
        /* logic to set fields */
    }
}
Stack Trace:
Jul 07, 2017 11:25:22 AM java.util.logging.LogManager$RootLogger log
SEVERE: Failure in Java callback class org.apache.ignite.IgniteException: Platform error:Apache.Ignite.Core.Binary.BinaryObjectException: Unknown pair [platformId=1, typeId=113114] ---> Apache.Ignite.Core.Common.JavaException: class org.apache.ignite.binary.BinaryObjectException: Unknown pair [platformId=1, typeId=113114]
at org.apache.ignite.internal.processors.platform.binary.PlatformBinaryProcessor.processInStreamOutStream(PlatformBinaryProcessor.java:119)
at org.apache.ignite.internal.processors.platform.PlatformTargetProxyImpl.inStreamOutStream(PlatformTargetProxyImpl.java:155)
at org.apache.ignite.internal.processors.platform.callback.PlatformCallbackUtils.inLongLongLongObjectOutLong(Native Method)
at org.apache.ignite.internal.processors.platform.callback.PlatformCallbackGateway.dataStreamerStreamReceiverInvoke(PlatformCallbackGateway.java:464)
at org.apache.ignite.internal.processors.platform.datastreamer.PlatformStreamReceiverImpl.receive(PlatformStreamReceiverImpl.java:100)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerUpdateJob.call(DataStreamerUpdateJob.java:137)
at org.apache.ignite.internal.processors.datastreamer.DataStreamProcessor.localUpdate(DataStreamProcessor.java:382)
at org.apache.ignite.internal.processors.datastreamer.DataStreamProcessor.processRequest(DataStreamProcessor.java:301)
at org.apache.ignite.internal.processors.datastreamer.DataStreamProcessor.access$000(DataStreamProcessor.java:58)
at org.apache.ignite.internal.processors.datastreamer.DataStreamProcessor$1.onMessage(DataStreamProcessor.java:88)
at org.apache.ignite.internal.managers.communication.GridIoManager.invokeListener(GridIoManager.java:1257)
at org.apache.ignite.internal.managers.communication.GridIoManager.processRegularMessage0(GridIoManager.java:885)
at org.apache.ignite.internal.managers.communication.GridIoManager.access$2100(GridIoManager.java:114)
at org.apache.ignite.internal.managers.communication.GridIoManager$7.run(GridIoManager.java:802)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748) Caused by: java.lang.ClassNotFoundException: Unknown pair [platformId=1, typeId=113114]
at org.apache.ignite.internal.MarshallerContextImpl.getClassName(MarshallerContextImpl.java:385)
at org.apache.ignite.internal.processors.platform.binary.PlatformBinaryProcessor.processInStreamOutStream(PlatformBinaryProcessor.java:113)
... 16 more
--- End of inner exception stack trace --- at Apache.Ignite.Core.Impl.Unmanaged.UnmanagedCallbacks.Error(Void* target, Int32 errType, SByte* errClsChars, Int32 errClsCharsLen, SByte* errMsgChars, Int32 errMsgCharsLen, SByte* stackTraceChars, Int32 stackTraceCharsLen, Void* errData, Int32 errDataLen) at Apache.Ignite.Core.Impl.Unmanaged.IgniteJniNativeMethods.TargetInStreamOutStream(Void* ctx, Void* target, Int32 opType, Int64 inMemPtr, Int64 outMemPtr) at Apache.Ignite.Core.Impl.PlatformTarget.DoOutInOp[TR](Int32 type, Action`1 outAction, Func`2 inAction) at Apache.Ignite.Core.Impl.Binary.BinaryProcessor.GetType(Int32 id) at Apache.Ignite.Core.Impl.Binary.Marshaller.GetDescriptor(Boolean userType, Int32 typeId, Boolean requiresType) at Apache.Ignite.Core.Impl.Binary.BinaryReader.ReadFullObject[T](Int32 pos) at Apache.Ignite.Core.Impl.Binary.BinaryReader.TryDeserialize[T](T& res) at Apache.Ignite.Core.Impl.Binary.BinaryReader.Deserialize[T]() at Apache.Ignite.Core.Impl.Binary.BinaryReader.ReadBinaryObject[T](Boolean do Detach) at Apache.Ignite.Core.Impl.Binary.BinaryReader.TryDeserialize[T](T& res) at Apache.Ignite.Core.Impl.Binary.BinaryReader.Deserialize[T]() at Apache.Ignite.Core.Impl.Datastream.StreamReceiverHolder.InvokeReceiver[TK, TV](IStreamReceiver`2 receiver, Ignite grid, IUnmanagedTarget cache, IBinaryStream stream, Boolean keepBinary) at Apache.Ignite.Core.Impl.Datastream.StreamReceiverHolder.Receive(Ignite grid, IUnmanagedTarget cache, IBinaryStream stream, Boolean keepBinary) at Apache.Ignite.Core.Impl.Unmanaged.UnmanagedCallbacks.DataStreamerStreamReceiverInvoke(Int64 memPtr, Int64 unused, Int64 unused1, Void* cache) at Apache.Ignite.Core.Impl.Unmanaged.UnmanagedCallbacks.InLongLongLongObjectOutLong(Void* target, Int32 type, Int64 val1, Int64 val2, Int64 val3, Void* arg)
at org.apache.ignite.internal.processors.platform.PlatformProcessorImpl.loggerLog(PlatformProcessorImpl.java:497)
at org.apache.ignite.internal.processors.platform.callback.PlatformCallbackUtils.inLongLongLongObjectOutLong(Native Method)
at org.apache.ignite.internal.processors.platform.callback.PlatformCallbackGateway.dataStreamerStreamReceiverInvoke(PlatformCallbackGateway.java:464)
at org.apache.ignite.internal.processors.platform.datastreamer.PlatformStreamReceiverImpl.receive(PlatformStreamReceiverImpl.java:100)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerUpdateJob.call(DataStreamerUpdateJob.java:137)
at org.apache.ignite.internal.processors.datastreamer.DataStreamProcessor.localUpdate(DataStreamProcessor.java:382)
at org.apache.ignite.internal.processors.datastreamer.DataStreamProcessor.processRequest(DataStreamProcessor.java:301)
at org.apache.ignite.internal.processors.datastreamer.DataStreamProcessor.access$000(DataStreamProcessor.java:58)
at org.apache.ignite.internal.processors.datastreamer.DataStreamProcessor$1.onMessage(DataStreamProcessor.java:88)
at org.apache.ignite.internal.managers.communication.GridIoManager.invokeListener(GridIoManager.java:1257)
at org.apache.ignite.internal.managers.communication.GridIoManager.processRegularMessage0(GridIoManager.java:885)
at org.apache.ignite.internal.managers.communication.GridIoManager.access$2100(GridIoManager.java:114)
at org.apache.ignite.internal.managers.communication.GridIoManager$7.run(GridIoManager.java:802)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
The "Unknown pair" issue is caused by not using GetDataStreamer correctly. My cache was originally created like this:
ignite.GetOrCreateCache<string, object>(cacheConfig)
so when getting the DataStreamer, I needed to use the same types, and then use KeepWithBinary
var ds = m_ignite.GetDataStreamer<string, object>(CacheName).KeepWithBinary<string, IBinaryObject>()
typeId=113114 corresponds to the Row class name. It looks like somewhere in RowStreamReceiver you try to deserialize such an object, but the class can't be found.
Can you attach a debugger to the server node and see where the exception is thrown?

Kafka Streamer: Issue with user defined 'Serdes'

I am using Confluent-3.2.1 as a Kafka streamer. I am trying to aggregate my KGroupedStream<String, MyClass1> into a KTable<Windowed<String>, MsgAggr>. For the aggregation I am using TimeWindows.of(TimeUnit.SECONDS.toMillis(5)) and a user-defined "Serdes" as an argument. The code for the user-defined "Serdes" is:
Map<String, Object> serdeProps = new HashMap<>();

final Serializer<MsgAggr> pageViewSerializer = new JsonPOJOSerializer<>();
serdeProps.put("JsonPOJOClass", MsgAggr.class);
pageViewSerializer.configure(serdeProps, false);

final Deserializer<MsgAggr> pageViewDeserializer = new JsonPOJODeserializer<>();
serdeProps.put("JsonPOJOClass", MsgAggr.class);
pageViewDeserializer.configure(serdeProps, false);

final Serde<MsgAggr> pageViewSerde = Serdes.serdeFrom(pageViewSerializer, pageViewDeserializer);
The code for streaming is:
KGroupedStream<String, MyClass1> msg_grp = message
        .groupByKey();

KTable<Windowed<String>, MsgAggr> msg_win = msg_grp
        //.reduce(new Reduced(), arg1, arg2);
        .aggregate(new Init(),
                new Aggr(),
                TimeWindows.of(TimeUnit.SECONDS.toMillis(5)),
                pageViewSerde,
                "MySample_out");
When I run the code, I get these errors:
[2017-05-23 18:16:45,648] ERROR stream-thread [StreamThread-1] Streams application error during processing: (org.apache.kafka.streams.processor.internals.StreamThread:249)
java.lang.ClassCastException: my.kafka.strm.MyClass1 cannot be cast to java.lang.String
at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:24)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:64)
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:82)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:202)
at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:82)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:202)
at org.apache.kafka.streams.kstream.internals.KStreamMap$KStreamMapProcessor.process(KStreamMap.java:43)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:82)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:202)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:66)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:180)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:436)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:242)
Exception in thread "StreamThread-1" java.lang.ClassCastException: my.kafka.strm.MyClass1 cannot be cast to java.lang.String
at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:24)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:64)
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:82)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:202)
at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:82)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:202)
at org.apache.kafka.streams.kstream.internals.KStreamMap$KStreamMapProcessor.process(KStreamMap.java:43)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:82)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:202)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:66)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:180)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:436)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:242)
The problem is with message.groupByKey(): it is using the String serde for your custom class MyClass1. Please implement a custom serializer and deserializer for MyClass1 and use them in the overloaded version of groupByKey: https://kafka.apache.org/0102/javadoc/org/apache/kafka/streams/kstream/KStream.html#groupByKey(org.apache.kafka.common.serialization.Serde,%20org.apache.kafka.common.serialization.Serde)
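A minimal sketch of that overload (assuming a MyClass1 serde built with the same JsonPOJOSerializer/JsonPOJODeserializer pattern used for MsgAggr above; the variable names here are made up):
final Serde<MyClass1> myClass1Serde = Serdes.serdeFrom(myClass1Serializer, myClass1Deserializer);

KGroupedStream<String, MyClass1> msg_grp = message
        .groupByKey(Serdes.String(), myClass1Serde); // explicit key and value serdes instead of the String defaults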

HTTP 400 Bad Request : javax.ws.rs.BadRequestException

I created a RESTful web service and wrote a client to use it, but when I run it I get an HTTP 400 Bad Request: javax.ws.rs.BadRequestException. This is my client code:
String webserviceURI = "http://localhost:8084/fsc-access";
ClientConfig clientConfig = new ClientConfig();
Client client = ClientBuilder.newClient(clientConfig);
URI serviceURI = UriBuilder.fromUri(webserviceURI).build();
WebTarget webTarget = client.target(serviceURI);

MultivaluedMap formData = new MultivaluedMapImpl();
formData.add("plate", plate);
formData.add("startTime", start.toString());
formData.add("endTime", end.toString());

Weightings weightings = new Weightings();
weightings.getWeightings().addAll((Collection<? extends Weighting>) webTarget.path("rest").path("report").path("loadWeightingByPlate").
        request().accept(MediaType.APPLICATION_XML).post(javax.ws.rs.client.Entity.form(formData), Weightings.class));
And this is my web service:
#Path("/report")
public class WeightingRESTfulService {
#POST
#Path("/loadWeightingByPlate")
#Produces(MediaType.APPLICATION_XML)
#Consumes(MediaType.APPLICATION_FORM_URLENCODED)
public Weightings LoadWeightingInSpecTimeInSpecPlate(
#FormParam("plate") String plate,
#FormParam("startTime") String _startTime,
#FormParam("endTime") String _endTime,
#Context HttpServletRequest req) {
Long startTime = new Long(_startTime);
Long endTime = new Long(_endTime);
try {
Weightings weightings = new Weightings();
weightings.getWeightings().addAll(Weighting.LoadWeightingInSpecTimeInSpecPlate(startTime, endTime, plate));
System.out.println("no error");
return weightings;
} catch (Exception ex) {
System.out.println("Exception = " + ex);
return null;
}
}
}
Can anyone help me use this web service?
There is a warning:
21-Aug-2015 23:18:11.797 WARNING [http-nio-8084-exec-123] org.glassfish.jersey.servlet.WebComponent.filterFormParameters A servlet request to the URI http://localhost:8084/fsc-access/rest/report/loadWeightingByPlate contains form parameters in the request body but the request body has been consumed by the servlet or a servlet filter accessing the request parameters. Only resource methods using #FormParam will work as expected. Resource methods consuming the request body by other means will not work as expected.
And there are some exceptions:
Exception in thread "C3P0PooledConnectionPoolManager[identityToken->1hge1379bmmvkmpse6n4w|7936e088]-AdminTaskTimer" java.lang.IllegalStateException: Can't overwrite cause with java.lang.IllegalStateException: Illegal access: this web application instance has been stopped already. Could not load com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask. The eventual following stack trace is caused by an error thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access, and has no functional impact.
at java.lang.Throwable.initCause(Throwable.java:457)
at org.apache.catalina.loader.WebappClassLoader.checkStateForClassLoading(WebappClassLoader.java:1335)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1216)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1177)
at com.mchange.v2.resourcepool.BasicResourcePool.destroyResource(BasicResourcePool.java:1040)
at com.mchange.v2.resourcepool.BasicResourcePool.removeResource(BasicResourcePool.java:1507)
at com.mchange.v2.resourcepool.BasicResourcePool.removeResource(BasicResourcePool.java:1477)
at com.mchange.v2.resourcepool.BasicResourcePool.cullExpired(BasicResourcePool.java:1565)
at com.mchange.v2.resourcepool.BasicResourcePool.access$1900(BasicResourcePool.java:44)
at com.mchange.v2.resourcepool.BasicResourcePool$CullTask.run(BasicResourcePool.java:2089)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
Caused by: java.lang.ClassNotFoundException
at org.apache.catalina.loader.WebappClassLoader.checkStateForClassLoading(WebappClassLoader.java:1334)
... 10 more
Exception in thread "C3P0PooledConnectionPoolManager[identityToken->1hge1379bmmw228sz1sso|53826b99]-AdminTaskTimer" java.lang.IllegalStateException: Can't overwrite cause with java.lang.IllegalStateException: Illegal access: this web application instance has been stopped already. Could not load com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask. The eventual following stack trace is caused by an error thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access, and has no functional impact.
at java.lang.Throwable.initCause(Throwable.java:457)
at org.apache.catalina.loader.WebappClassLoader.checkStateForClassLoading(WebappClassLoader.java:1335)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1216)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1177)
at com.mchange.v2.resourcepool.BasicResourcePool.destroyResource(BasicResourcePool.java:1040)
at com.mchange.v2.resourcepool.BasicResourcePool.removeResource(BasicResourcePool.java:1507)
at com.mchange.v2.resourcepool.BasicResourcePool.removeResource(BasicResourcePool.java:1477)
at com.mchange.v2.resourcepool.BasicResourcePool.cullExpired(BasicResourcePool.java:1565)
at com.mchange.v2.resourcepool.BasicResourcePool.access$1900(BasicResourcePool.java:44)
at com.mchange.v2.resourcepool.BasicResourcePool$CullTask.run(BasicResourcePool.java:2089)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
Caused by: java.lang.ClassNotFoundException
at org.apache.catalina.loader.WebappClassLoader.checkStateForClassLoading(WebappClassLoader.java:1334)
... 10 more
Logging filter output:
22-Aug-2015 00:32:32.969 INFO [http-nio-8084-exec-37] org.glassfish.jersey.filter.LoggingFilter.log 1 * Sending client request on thread http-nio-8084-exec-37
1 > POST http://localhost:8084/fsc-access/rest/report/loadWeightingByPlate
1 > Accept: application/xml
1 > Content-Type: application/x-www-form-urlencoded
22-Aug-2015 00:32:33.015 INFO [http-nio-8084-exec-37] org.glassfish.jersey.filter.LoggingFilter.log 2 * Client response received on thread http-nio-8084-exec-37
2 < 200
2 < Content-Length: 1026
2 < Content-Type: application/xml
2 < Date: Fri, 21 Aug 2015 19:54:48 GMT
2 < Server: Apache-Coyote/1.1
Your resource is returning an instance of Weightings, so you just need to cast it; you don't need the addAll().
Weightings weightings = new Weightings();
weightings.getWeightings().addAll((Collection<? extends Weighting>) webTarget.path("rest").path("report").path("loadWeightingByPlate").
        request().accept(MediaType.APPLICATION_XML).post(javax.ws.rs.client.Entity.form(formData), Weightings.class));
Should be:
Weightings weightings = (Weightings) webTarget.path("rest").path("report").path("loadWeightingByPlate").
        request().accept(MediaType.APPLICATION_XML).post(javax.ws.rs.client.Entity.form(formData), Weightings.class);
This won't fix your 400 error, but without the change you would get a ClassCastException.
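To investigate the 400 itself, one option (a sketch using the standard javax.ws.rs.core.Response API, not specific to this service) is to post without an entity type, so Jersey returns the raw Response instead of throwing BadRequestException, and then inspect the status and body:
Response response = webTarget.path("rest").path("report").path("loadWeightingByPlate").
        request().accept(MediaType.APPLICATION_XML).post(javax.ws.rs.client.Entity.form(formData));
if (response.getStatus() == 200) {
    Weightings weightings = response.readEntity(Weightings.class);
} else {
    // see what the server actually complains about
    System.out.println(response.getStatus() + ": " + response.readEntity(String.class));
}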
