Is the Zookeeper property still a requirement in the Kafka consumer? - Java

I was referring to a post here:
Connecting to Zookeeper in a Apache Kafka Multi Node cluster
It's mentioned there that from Kafka 0.9 onward, producers and consumers no longer have to use the zookeeper.connect property; bootstrap.servers alone is enough to produce/consume data.
My POM.xml looks like this in the consumer side:
<properties>
    <java.version>1.7</java.version>
    <kafka.version>0.9.0.1-cp1</kafka.version>
    <kafka.scala.version>2.11</kafka.scala.version>
    <confluent.version>2.0.1</confluent.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.10.0.0</version>
</dependency>
I run into the following issue on the consumer side without the zookeeper.connect property. Does anyone have the consumer side working without the zookeeper.connect property?
[WARNING]
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:297)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: requirement failed: Missing required property 'zookeeper.connect'
at scala.Predef$.require(Predef.scala:233)
at kafka.utils.VerifiableProperties.getString(VerifiableProperties.scala:177)
at kafka.utils.ZKConfig.<init>(ZkUtils.scala:902)
at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:101)
at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:105)
at io.confluent.examples.consumer.ConsumerGroup.<init>(ConsumerGroup.java:30)
at io.confluent.examples.consumer.ConsumerGroup.main(ConsumerGroup.java:113)
... 6 more

Only the new consumer works without connecting to Zookeeper, and it is available in the kafka-clients artifact. You have to add the dependency:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.0.0</version>
</dependency>
and use the implementation from the org.apache.kafka.clients.consumer package.
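For reference, a minimal sketch of the new consumer from kafka-clients, configured with bootstrap.servers only; the broker address, topic name and group id below are placeholders:
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class NewConsumerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Only the brokers are required; there is no zookeeper.connect property.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "example-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(100);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d, key=%s, value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
With this consumer, group coordination and offsets are handled by the brokers themselves, which is why Zookeeper is no longer needed on the client side.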

Related

Connect to an SMB endpoint with Camel v3 and camel-jcifs from camel-extra

I have the following Maven dependencies in my pom.xml (among others):
<dependency>
    <groupId>org.apache.camel.springboot</groupId>
    <artifactId>camel-spring-boot-starter</artifactId>
    <version>3.7.0</version>
</dependency>
<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-core</artifactId>
    <version>3.7.0</version>
</dependency>
<dependency>
    <groupId>org.apache-extras.camel-extra</groupId>
    <artifactId>camel-jcifs</artifactId>
    <version>2.25.2</version>
    <exclusions>
        <exclusion>
            <groupId>com.sun.xml.bind</groupId>
            <artifactId>jaxb-impl</artifactId>
        </exclusion>
    </exclusions>
</dependency>
The Camel Spring Boot version is 3.7.0, and I want to connect to an SMB endpoint like this:
smb://sharedriveuser#server-instance.sub.domain.net/folder?initialDelay=0&delay=9000&autoCreate=false&noop=true&idempotent=true&password=ThePassWorD&filter=#csvFileFilter
I read the Camel 3 Migration Guide and found nothing regarding the camel-extra components.
When trying to connect, I get an error as if the password option were no longer supported:
Caused by: org.apache.camel.ResolveEndpointFailedException: Failed to resolve endpoint: smb://sharedriveuser#server-instance.sub.domain.net/folder?initialDelay=0&delay=9000&autoCreate=false&noop=true&idempotent=true&password=xxxxxx&filter=#csvFileFilter due to: There are 1 parameters that couldn't be set on the endpoint. Check the uri if the parameters are spelt correctly and that they are properties of the endpoint. Unknown parameters=[{password=ThePassWorD}]
The documentation link that Google keeps turning up seems to be dead.
On Maven Central there is no 3.x version of camel-jcifs, so I am wondering whether the library is still compatible with Camel 3.x.x; otherwise, is there an alternative for Camel 3?
I also tried downgrading camel-jcifs to 2.24.3, with the same error.
Camel-extra is a separate project from Apache Camel. There is some work in progress in the camel-extra repository to support Camel 3 [1], but it is not yet complete and there is no release in sight.
[1] https://github.com/camel-extra/camel-extra/commit/f028dfdfaa467958c58abea0d604f8fe2f17be04
There is now a pull request to add camel-jcifs to the 3.x version:
https://github.com/camel-extra/camel-extra/pull/39
You might also get my fork and build it yourself:
https://github.com/bebbo/camel-extra
It got merged and is in the official repository:
https://github.com/camel-extra/camel-extra
To use it with Quarkus, you have to convert some List types to arrays.
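Once a camel-jcifs build that supports Camel 3 is on the classpath, a consumer route could look roughly like the sketch below. This is only an illustration: the username/password options and the csvFileFilter bean are taken from the URI above, and RAW() is used merely to keep special characters in the password from being interpreted; verify the exact option names against the component version you end up building.
import org.apache.camel.builder.RouteBuilder;

public class SmbPollRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Sketch only: assumes a Camel 3 compatible camel-jcifs and a bean
        // named "csvFileFilter" registered in the Camel registry.
        from("smb://server-instance.sub.domain.net/folder"
                + "?username=sharedriveuser"
                + "&password=RAW(ThePassWorD)"
                + "&initialDelay=0&delay=9000"
                + "&autoCreate=false&noop=true&idempotent=true"
                + "&filter=#csvFileFilter")
            .log("Picked up ${header.CamelFileName}")
            .to("file://target/inbox");
    }
}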

How to fix NotSerializableException when using Beam's KafkaIO on FlinkRunner

I'm trying to run an Apache Beam application on a Flink cluster, but it fails with an error translating the Kafka UnboundedSource, saying that [partitions type:ARRAY pos:0] is not serializable. The application is a word count example reading from a Kafka topic and publishing to a Kafka topic, and it works fine using Beam's direct runner.
I created a pom.xml by following Beam's Java quickstart and then added the KafkaIO SDK. I'm running a single-node local Flink 1.8.1 cluster and Kafka 2.3.0.
pom.xml snippets
<properties>
    <beam.version>2.14.0</beam.version>
    <flink.artifact.name>beam-runners-flink-1.8</flink.artifact.name>
    <flink.version>1.8.1</flink.version>
</properties>
...
<profile>
    <id>flink-runner</id>
    <!-- Makes the FlinkRunner available when running a pipeline. -->
    <dependencies>
        <dependency>
            <groupId>org.apache.beam</groupId>
            <!-- Please see the Flink Runner page for an up-to-date list
                 of supported Flink versions and their artifact names:
                 https://beam.apache.org/documentation/runners/flink/ -->
            <artifactId>${flink.artifact.name}</artifactId>
            <version>${beam.version}</version>
            <scope>runtime</scope>
        </dependency>
        <!-- Tried with and without this flink-avro dependency -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>${flink.version}</version>
        </dependency>
    </dependencies>
</profile>
...
<dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-io-kafka</artifactId>
    <version>${beam.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.3.0</version>
</dependency>
KafkaWordCount.java snippet
// Create the Pipeline object with the options we defined above.
Pipeline p = Pipeline.create(options);
PCollection<KV<String, Long>> counts = p.apply(KafkaIO.<String, String>read()
        .withBootstrapServers(options.getBootstrapServer())
        .withTopics(Collections.singletonList(options.getInputTopic()))
        .withKeyDeserializer(StringDeserializer.class)
        .withValueDeserializer(StringDeserializer.class)
        .updateConsumerProperties(ImmutableMap.of("auto.offset.reset", (Object) "latest"))
        .withoutMetadata() // PCollection<KV<Long, String>> instead of KafkaRecord type
)
The full error message below is the result of submitting the Beam jar to Flink via /opt/flink/bin/flink run -c org.apache.beam.examples.KafkaWordCount target/word-count-beam-bundled-0.1.jar --runner=FlinkRunner --bootstrapServer=localhost:9092
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Error while translating UnboundedSource: org.apache.beam.sdk.io.kafka.KafkaUnboundedSource#65be88ae
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:546)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:423)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
Caused by: java.lang.RuntimeException: Error while translating UnboundedSource: org.apache.beam.sdk.io.kafka.KafkaUnboundedSource#65be88ae
at org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$UnboundedReadSourceTranslator.translateNode(FlinkStreamingTransformTranslators.java:233)
at org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$ReadSourceTranslator.translateNode(FlinkStreamingTransformTranslators.java:281)
at org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.applyStreamingTransform(FlinkStreamingPipelineTranslator.java:157)
at org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.visitPrimitiveTransform(FlinkStreamingPipelineTranslator.java:136)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:665)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:251)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:458)
at org.apache.beam.runners.flink.FlinkPipelineTranslator.translate(FlinkPipelineTranslator.java:38)
at org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.translate(FlinkStreamingPipelineTranslator.java:88)
at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:116)
at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:108)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:299)
at org.apache.beam.examples.KafkaWordCount.runWordCount(KafkaWordCount.java:99)
at org.apache.beam.examples.KafkaWordCount.main(KafkaWordCount.java:106)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
... 9 more
Caused by: org.apache.flink.api.common.InvalidProgramException: [partitions type:ARRAY pos:0] is not serializable. The object probably contains or references non serializable fields.
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:140)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:115)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:115)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:115)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:115)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:115)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:1558)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1470)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1414)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1396)
at org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$UnboundedReadSourceTranslator.translateNode(FlinkStreamingTransformTranslators.java:218)
... 32 more
Caused by: java.io.NotSerializableException: org.apache.avro.Schema$Field
at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1185)
at java.base/java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:349)
at java.base/java.util.ArrayList.writeObject(ArrayList.java:896)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at java.base/java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1130)
at java.base/java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1497)
at java.base/java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1433)
at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1179)
at java.base/java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:349)
at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:576)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:122)
... 42 more
UPDATE
It turns out there is a Beam issue about running on Flink that seems related to this: https://issues.apache.org/jira/browse/BEAM-7478. One of the comments on it specifically mentions that using KafkaIO on the Flink runner doesn't work because Avro's Schema.Field is not serializable: https://issues.apache.org/jira/browse/BEAM-7478?focusedCommentId=16902419&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16902419
UPDATE 2
As mentioned in the comments, a workaround is to downgrade Flink to 1.8.0.

Keycloak admin client - javax.ws.rs.ProcessingException: RESTEASY003215

I'm using the Keycloak admin client (version 4.5.0.Final) and am trying to do some simple queries, such as looking up a user. The client code runs in a plugin module inside another Java server, not standalone. The code looks like this:
...
try {
    Keycloak kc = Keycloak.getInstance(URL, REALM, USER, PWD, CLIENT_ID);
    UserRepresentation kcuser = kc.realm(REALM).users().get(USER).toRepresentation();
    trace(String.format("Got user: %s", kcuser.toString()));
} catch (Exception e) {
    trace("Error authenticating: " + e);
}
...
It creates the kc instance successfully, but barfs when trying to look up the user.
This is the error:
javax.ws.rs.ProcessingException: RESTEASY003215: could not find writer for content-type application/x-www-form-urlencoded type: javax.ws.rs.core.Form$1
at org.jboss.resteasy.core.interception.ClientWriterInterceptorContext.throwWriterNotFoundException(ClientWriterInterceptorContext.java:40)
at org.jboss.resteasy.core.interception.AbstractWriterInterceptorContext.getWriter(AbstractWriterInterceptorContext.java:146)
at org.jboss.resteasy.core.interception.AbstractWriterInterceptorContext.proceed(AbstractWriterInterceptorContext.java:121)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.writeRequestBody(ClientInvocation.java:394)
at org.jboss.resteasy.client.jaxrs.engines.ApacheHttpClient4Engine.writeRequestBodyToOutputStream(ApacheHttpClient4Engine.java:666)
at org.jboss.resteasy.client.jaxrs.engines.ApacheHttpClient4Engine.buildEntity(ApacheHttpClient4Engine.java:631)
at org.jboss.resteasy.client.jaxrs.engines.ApacheHttpClient4Engine.loadHttpMethod(ApacheHttpClient4Engine.java:509)
at org.jboss.resteasy.client.jaxrs.engines.ApacheHttpClient4Engine.invoke(ApacheHttpClient4Engine.java:310)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.invoke(ClientInvocation.java:439)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientInvoker.invokeSync(ClientInvoker.java:148)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientInvoker.invoke(ClientInvoker.java:112)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientProxy.invoke(ClientProxy.java:76)
at com.sun.proxy.$Proxy362.grantToken(Unknown Source)
at org.keycloak.admin.client.token.TokenManager.grantToken(TokenManager.java:89)
at org.keycloak.admin.client.token.TokenManager.getAccessToken(TokenManager.java:69)
at org.keycloak.admin.client.token.TokenManager.getAccessTokenString(TokenManager.java:64)
at org.keycloak.admin.client.resource.BearerAuthFilter.filter(BearerAuthFilter.java:52)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.filterRequest(ClientInvocation.java:587)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.invoke(ClientInvocation.java:436)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientInvoker.invokeSync(ClientInvoker.java:148)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientInvoker.invoke(ClientInvoker.java:112)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientProxy.invoke(ClientProxy.java:76)
at com.sun.proxy.$Proxy372.toRepresentation(Unknown Source)
...
My pom has the latest dependencies and the classpath seems OK. Any ideas why this is not working?
<properties>
    <keycloak.version>4.5.0.Final</keycloak.version>
    <resteasy.version>3.6.1.Final</resteasy.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.keycloak</groupId>
        <artifactId>keycloak-admin-client</artifactId>
        <version>${keycloak.version}</version>
    </dependency>
    <dependency>
        <groupId>org.jboss.resteasy</groupId>
        <artifactId>resteasy-client</artifactId>
        <version>${resteasy.version}</version>
    </dependency>
    <dependency>
        <groupId>org.jboss.resteasy</groupId>
        <artifactId>resteasy-multipart-provider</artifactId>
        <version>${resteasy.version}</version>
    </dependency>
    <dependency>
        <groupId>org.jboss.resteasy</groupId>
        <artifactId>resteasy-jackson2-provider</artifactId>
        <version>${resteasy.version}</version>
    </dependency>
</dependencies>
I noticed that during the instantiation of a new Keycloak instance, RESTEasy checks for the available providers with the help of the current thread. This is the case in version 3.9.1.Final, which is used by the latest keycloak-admin-client so far (version 11.0.0).
In my specific case we are using the keycloak-admin-client in combination with graphql-java and CompletableFuture.supplyAsync for our data loaders. This means that in some cases, without further configuration, the current thread is not a plain Thread but a ForkJoinWorkerThread, which apparently breaks the retrieval of the providers.
I am still a beginner in Java, so I would be glad if someone could explain why the registerProviders method does not work with a ForkJoinWorkerThread.
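A quick way to see which kind of thread an async task lands on (a minimal standalone sketch, no Keycloak involved):
import java.util.concurrent.CompletableFuture;

public class CommonPoolDemo {
    public static void main(String[] args) throws Exception {
        System.out.println("main:  " + Thread.currentThread().getClass().getName());
        CompletableFuture.runAsync(() ->
                // On a machine with more than two CPUs this prints a
                // ForkJoinWorkerThread subclass from the common pool, the thread
                // type that trips up the provider lookup described above.
                System.out.println("async: " + Thread.currentThread().getClass().getName())
        ).get();
    }
}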
What I learned on DZone is that the JVM sizes the commonPool to two threads when you have more than 2 CPUs available. So I tried it and noticed that my app works with 2 CPUs, but I get the same error (RESTEASY003215) with 3 CPUs.
My current "workaround" is to use CompletableFuture.completedStage when loading data with the keycloak-admin-client.
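Two possible shapes for such a data loader are sketched below: either keep supplyAsync but hand it a dedicated executor whose workers are plain threads, or call the admin client synchronously and wrap the result, which is essentially what completedStage gives you. The class and method names here are illustrative, not from the original code.
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.keycloak.admin.client.Keycloak;
import org.keycloak.representations.idm.UserRepresentation;

public class KeycloakDataLoader {

    // Dedicated executor: its workers are plain Threads, so RESTEasy's
    // provider discovery behaves as it does on the main thread.
    private static final ExecutorService KEYCLOAK_POOL = Executors.newFixedThreadPool(4);

    // Option 1: keep the async API but avoid the ForkJoinPool common pool.
    public static CompletableFuture<UserRepresentation> loadUserAsync(Keycloak kc, String realm, String userId) {
        return CompletableFuture.supplyAsync(
                () -> kc.realm(realm).users().get(userId).toRepresentation(),
                KEYCLOAK_POOL);
    }

    // Option 2: call the admin client synchronously and wrap the result.
    public static CompletableFuture<UserRepresentation> loadUserSync(Keycloak kc, String realm, String userId) {
        UserRepresentation user = kc.realm(realm).users().get(userId).toRepresentation();
        return CompletableFuture.completedFuture(user);
    }
}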

NoSuchMethodError : Spark + Kafka : Java

Use Case
Simple message fetching and printing from a Kafka topic using Spark, with Java as the programming language.
Background
Experienced with Kafka-Storm integration; developed and maintained a Kafka cluster and Storm topologies for more than a year.
No experience with Apache Spark or Scala.
A simple word count application was built and tested successfully on a standalone Spark cluster.
Problem
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:64)
at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:110)
at org.apache.spark.streaming.kafka.KafkaUtils.createStream(KafkaUtils.scala)
at com.random.spark.EventsToFileAggregator.main(EventsToFileAggregator.java:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
At EventsToFileAggregator.java:54
JavaPairReceiverInputDStream<String, String> messages =
KafkaUtils.createStream(jsc, args[0], args[1], topicMap,
StorageLevel.MEMORY_AND_DISK_SER());
pom.xml
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>1.6.1</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>1.6.1</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_2.11</artifactId>
        <version>1.6.1</version>
    </dependency>
</dependencies>
Build
Successful without any warnings
Command
./bin/spark-submit --class com.random.spark.EventsToFileAggregator --master spark://host:7077 /usr/local/spark/stats/target/stats-1.0-SNAPSHOT-jar-with-dependencies.jar localhost:2181 test topic 2
NoSuchMethodError is almost always an indication that two libraries are not at compatible versions. In this case spark-streaming-kafka is attempting to use a Scala language feature that doesn't exist at runtime. Check that the Scala version of your spark-streaming-kafka dependency matches the Scala version that the Spark installation you submit to was built against; mixing a _2.11 dependency with a Spark build for Scala 2.10 (or vice versa) produces exactly this kind of error.

Importing a new Cassandra-related POM dependency into a project results in runtime errors

I'm getting the following errors:
Caused by: javax.persistence.PersistenceException: Failed to load provider from META-INF/services
at javax.persistence.spi.PersistenceProviderResolverHolder$DefaultPersistenceProviderResolver.getPersistenceProviders(PersistenceProviderResolverHolder.java:115)
at javax.persistence.Persistence$PersistenceUtilImpl.isLoaded(Persistence.java:278)
at org.hibernate.validator.engine.resolver.JPATraversableResolver.isReachable(JPATraversableResolver.java:62)
at org.hibernate.validator.engine.resolver.DefaultTraversableResolver.isReachable(DefaultTraversableResolver.java:94)
at org.hibernate.validator.engine.resolver.SingleThreadCachedTraversableResolver.isReachable(SingleThreadCachedTraversableResolver.java:47)
at org.hibernate.validator.engine.ValidatorImpl.isValidationRequired(ValidatorImpl.java:757)
... 96 more
Caused by: java.lang.ClassNotFoundException: me.prettyprint.hom.CassandraPersistenceProvider
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1858)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1709)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.geronimo.osgi.locator.ProviderLocator.loadClass(ProviderLocator.java:195)
at org.apache.geronimo.osgi.locator.ProviderLocator.locateServiceClasses(ProviderLocator.java:524)
at org.apache.geronimo.osgi.locator.ProviderLocator.getServices(ProviderLocator.java:315)
at javax.persistence.spi.PersistenceProviderResolverHolder$DefaultPersistenceProviderResolver.getPersistenceProviders(PersistenceProviderResolverHolder.java:108)
... 101 more
I have imported a POM dependency into my project; that POM in turn pulls in some Cassandra-related dependencies, shown below:
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>3.0.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-mapping</artifactId>
    <version>3.0.0</version>
</dependency>
The Cassandra project works fine on its own. Can someone help me with this?
Your project is complaining about ClassNotFoundException: me.prettyprint.hom.CassandraPersistenceProvider, which belongs to the Cassandra Hector client.
I am guessing your project was using hector-core, which is no longer active (see the hector-client GitHub page). You have to migrate all dependencies to DataStax's Cassandra drivers and remove all hector-client related dependencies.
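After the migration, connecting with the DataStax 3.x driver looks roughly like the sketch below; the contact point and the query are placeholders.
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;

public class DataStaxConnectExample {
    public static void main(String[] args) {
        // Replace the contact point with one of your Cassandra node addresses.
        try (Cluster cluster = Cluster.builder()
                .addContactPoint("127.0.0.1")
                .build();
             Session session = cluster.connect()) {
            ResultSet rs = session.execute("SELECT release_version FROM system.local");
            Row row = rs.one();
            System.out.println("Connected to Cassandra " + row.getString("release_version"));
        }
    }
}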
