NoSuchMethodError: akka.actor.LocalActorRefProvider.log()Lakka/event/LoggingAdapter - java

When I start a simple akka-http server, I get an exception. I searched all over the internet and found two suggested solutions, but neither works for me:
Mixed Scala versions (I checked my POM: it uses Scala 2.11 throughout, with no mixing).
Changing the Akka actor version (I tried almost ten versions with Scala 2.11).
I have no idea how to resolve this problem; any help would be appreciated.
application.conf
akka {
  actor {
    provider = "akka.remote.RemoteActorRefProvider"
  }
  remote {
    enabled-transports = ["akka.remote.netty.tcp"]
    netty.tcp {
      hostname = "127.0.0.1"
      port = 2552
    }
  }
}
main class
package com.akkademo;

import akka.actor.ActorSystem;
import akka.actor.Props;
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public class Main {
    public static void main(String[] args) {
        ActorSystem actorSystem = ActorSystem.create("akkademo", ConfigFactory.load());
        Config config = actorSystem.settings().config();
        actorSystem.actorOf(Props.create(AkkademoDb.class), "akkademo-db");
    }
}
pom.xml
<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <jdk.version>1.8</jdk.version>
  <scala>2.11</scala>
  <scala.version>2.11.8</scala.version>
  <akka.version>2.4.14</akka.version>
  <akka.remote>2.3.6</akka.remote>
  <akka.compat>0.7.0</akka.compat>
  <akka.test>2.5.9</akka.test>
</properties>
<dependencies>
  <dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-api</artifactId>
    <version>RELEASE</version>
    <scope>compile</scope>
  </dependency>
  <dependency>
    <groupId>com.typesafe.akka</groupId>
    <artifactId>akka-testkit_${scala}</artifactId>
    <version>${akka.test}</version>
  </dependency>
  <dependency>
    <groupId>com.typesafe.akka</groupId>
    <artifactId>akka-remote_${scala}</artifactId>
    <version>${akka.remote}</version>
  </dependency>
  <dependency>
    <groupId>org.scala-lang.modules</groupId>
    <artifactId>scala-java8-compat_${scala}</artifactId>
    <version>${akka.compat}</version>
  </dependency>
  <dependency>
    <groupId>com.typesafe.akka</groupId>
    <artifactId>akka-actor_${scala}</artifactId>
    <version>${akka.version}</version>
  </dependency>
  <dependency>
    <groupId>com.typesafe.akka</groupId>
    <artifactId>akka-slf4j_${scala}</artifactId>
    <version>${akka.version}</version>
  </dependency>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.11</version>
  </dependency>
</dependencies>
<repositories>
  <repository>
    <id>aliyun</id>
    <name>aliyun</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public</url>
  </repository>
</repositories>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>2.3.2</version>
      <configuration>
        <source>${jdk.version}</source>
        <target>${jdk.version}</target>
      </configuration>
    </plugin>
  </plugins>
</build>
</project>
Exception in thread "main" java.lang.NoSuchMethodError:
akka.actor.LocalActorRefProvider.log()Lakka/event/LoggingAdapter;
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:128)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(ReflectiveDynamicAccess.scala:32)
at scala.util.Try$.apply(Try.scala:192)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(ReflectiveDynamicAccess.scala:27)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(ReflectiveDynamicAccess.scala:38)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(ReflectiveDynamicAccess.scala:38)
at scala.util.Success.flatMap(Try.scala:231)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(ReflectiveDynamicAccess.scala:38)
at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:620)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:613)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
at akka.actor.ActorSystem$.create(ActorSystem.scala:67)
at akka.actor.ActorSystem.create(ActorSystem.scala)
at com.akkademo.Main.main(Main.java:15)

Your issue is probably caused by using different minor versions of akka-remote and the core akka-actor package: your POM pins akka-actor to 2.4.14 but akka-remote to 2.3.6 (and akka-testkit to 2.5.9).
Set the akka.version, akka.remote, and akka.test properties to the same value. The current release is 2.5.19.
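For example, the properties block from the question could collapse to a single shared version along these lines (a sketch; the exact value matters less than every Akka module resolving to the same one):
<properties>
  <scala>2.11</scala>
  <scala.version>2.11.8</scala.version>
  <!-- one version shared by akka-actor, akka-remote, akka-slf4j and akka-testkit -->
  <akka.version>2.5.19</akka.version>
</properties>
<dependency>
  <groupId>com.typesafe.akka</groupId>
  <artifactId>akka-remote_${scala}</artifactId>
  <version>${akka.version}</version>
</dependency>
<dependency>
  <groupId>com.typesafe.akka</groupId>
  <artifactId>akka-testkit_${scala}</artifactId>
  <version>${akka.version}</version>
</dependency>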

Related

Many errors while running Renjin with packages in Eclipse

I've been trying to run some simple code from the first lesson of R for Data Science using Renjin in Eclipse. The reason I'm using Renjin is that I'm eventually going to develop this into a Java data analysis program. Here is the link to the R for Data Science lesson I'm following: https://r4ds.had.co.nz/data-visualisation.html
I got all my dependencies in the pom.xml file of my Maven project, and when I test my pom.xml file, it builds successfully. Here's my pom.xml just in case:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>Tests</groupId>
<artifactId>Test1</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>Test1</name>
<!-- FIXME change it to the project's website -->
<url>http://www.example.com</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.7</maven.compiler.source>
<maven.compiler.target>1.7</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>org.renjin</groupId>
<artifactId>renjin-script-engine</artifactId>
<version>3.5-beta76</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>ggplot2</artifactId>
<version>3.2.0-b8</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>dplyr</artifactId>
<version>0.8.2-b6</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>feather</artifactId>
<version>0.3.1-b9</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>hms</artifactId>
<version>0.4.2-b18</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>stringr</artifactId>
<version>1.4.0-b6</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>lubridate</artifactId>
<version>1.7.4-b4</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>forcats</artifactId>
<version>0.3.0-b9</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>haven</artifactId>
<version>2.0.0-b1</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>httr</artifactId>
<version>1.4.0-b1</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>jsonlite</artifactId>
<version>1.6-b1</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>readxl</artifactId>
<version>1.3.1-b11</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>rvest</artifactId>
<version>0.3.2-b115</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>xml2</artifactId>
<version>1.2-renjin-8</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>modelr</artifactId>
<version>0.1.1-b31</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>broom</artifactId>
<version>0.4.4-b1</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>tidyr</artifactId>
<version>0.8.0-b1</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>readr</artifactId>
<version>1.3.1-b3</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>purrr</artifactId>
<version>0.3.2-b11</version>
</dependency>
<dependency>
<groupId>org.renjin.cran</groupId>
<artifactId>tibble</artifactId>
<version>2.1.3-b9</version>
</dependency>
</dependencies>
<build>
<pluginManagement><!-- lock down plugins versions to avoid using Maven defaults (may be moved to parent pom) -->
<plugins>
<!-- clean lifecycle, see https://maven.apache.org/ref/current/maven-core/lifecycles.html#clean_Lifecycle -->
<plugin>
<artifactId>maven-clean-plugin</artifactId>
<version>3.1.0</version>
</plugin>
<!-- default lifecycle, jar packaging: see https://maven.apache.org/ref/current/maven-core/default-bindings.html#Plugin_bindings_for_jar_packaging -->
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>3.0.2</version>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
</plugin>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.22.1</version>
</plugin>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<version>3.0.2</version>
</plugin>
<plugin>
<artifactId>maven-install-plugin</artifactId>
<version>2.5.2</version>
</plugin>
<plugin>
<artifactId>maven-deploy-plugin</artifactId>
<version>2.8.2</version>
</plugin>
<!-- site lifecycle, see https://maven.apache.org/ref/current/maven-core/lifecycles.html#site_Lifecycle -->
<plugin>
<artifactId>maven-site-plugin</artifactId>
<version>3.7.1</version>
</plugin>
<plugin>
<artifactId>maven-project-info-reports-plugin</artifactId>
<version>3.0.0</version>
</plugin>
</plugins>
</pluginManagement>
</build>
<repositories>
<repository>
<id>bedatadriven</id>
<name>bedatadriven public repo</name>
<url>https://nexus.bedatadriven.com/content/groups/public/</url>
</repository>
</repositories>
</project>
However, when I try to run my App.java program with the R for Data Science code, it gives me loads of errors. Here is my App.java:
package Tests.Test1;

import javax.script.*;
import org.renjin.script.*;
import java.awt.*;
import javax.swing.*;
// ... add additional imports here ...

public class App {
    public static void main(String[] args) throws Exception {
        // create a script engine manager:
        RenjinScriptEngineFactory factory = new RenjinScriptEngineFactory();
        // create a Renjin engine:
        ScriptEngine engine = factory.getScriptEngine();
        // ... put your Java code here ...
        engine.eval("library(dplyr)");
        engine.eval("library(ggplot2)");
        engine.eval("library(tidyr)");
        engine.eval("library(readr)");
        engine.eval("library(purrr)");
        engine.eval("library(tibble)");
        engine.eval("?mpg");
        engine.eval("ggplot(data = mpg) + geom_point(mapping = aes(x = displ, y = hwy))");
    }
}
And here's the long list of errors I'm getting:
EEK! colSums.computeMeans() called through getElementAsDouble()
EEK! colSums.computeMeans() called through getElementAsDouble()
EEK! colSums.computeMeans() called through getElementAsDouble()
EEK! colSums.computeMeans() called through getElementAsDouble()
org.renjin.eval.EvalException: Exception initializing compiled GNU R library class org.renjin.cran.dplyr.dplyr
at org.renjin.primitives.packaging.DllInfo.initialize(DllInfo.java:141)
at org.renjin.primitives.packaging.Namespace.loadDynamicLibrary(Namespace.java:383)
at org.renjin.primitives.packaging.Namespace.importDynamicLibrary(Namespace.java:296)
at org.renjin.primitives.packaging.Namespace.initImports(Namespace.java:274)
at org.renjin.primitives.packaging.NamespaceRegistry.load(NamespaceRegistry.java:175)
at org.renjin.primitives.packaging.NamespaceRegistry.getNamespace(NamespaceRegistry.java:143)
at org.renjin.primitives.packaging.NamespaceRegistry.getNamespace(NamespaceRegistry.java:114)
at org.renjin.primitives.packaging.Packages.library(Packages.java:39)
at org.renjin.primitives.R$primitive$library.doApply(R$primitive$library.java:68)
at org.renjin.primitives.R$primitive$library.applyPromised(R$primitive$library.java:33)
at org.renjin.sexp.BuiltinFunction.apply(BuiltinFunction.java:100)
at org.renjin.primitives.special.InternalFunction.apply(InternalFunction.java:46)
at org.renjin.sexp.FunctionCall.eval(FunctionCall.java:80)
at org.renjin.primitives.special.BeginFunction.apply(BeginFunction.java:39)
at org.renjin.sexp.FunctionCall.eval(FunctionCall.java:80)
at org.renjin.sexp.Closure.applyPromised(Closure.java:200)
at org.renjin.sexp.Closure.apply(Closure.java:133)
at org.renjin.sexp.FunctionCall.eval(FunctionCall.java:80)
at org.renjin.sexp.ExpressionVector.eval(ExpressionVector.java:85)
at org.renjin.eval.Context.evaluate(Context.java:280)
at org.renjin.script.RenjinScriptEngine.eval(RenjinScriptEngine.java:174)
at org.renjin.script.RenjinScriptEngine.eval(RenjinScriptEngine.java:133)
at Tests.Test1.App.main(App.java:19)
Caused by: java.lang.ArithmeticException: / by zero
at java.base/java.lang.Integer.remainderUnsigned(Integer.java:1564)
at org.renjin.cran.dplyr.hybrid__._ZN5boost9unordered6detail12prime_policyIjE9to_bucketEjj(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._ZNK5boost9unordered6detail5tableINS1_3mapISaISt4pairIKP7SEXPRECN5dplyr6hybrid15hybrid_functionEEES6_SA_NS_4hashIS6_EESt8equal_toIS6_EEEE14hash_to_bucketEj(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._ZNK5boost9unordered6detail5tableINS1_3mapISaISt4pairIKP7SEXPRECN5dplyr6hybrid15hybrid_functionEEES6_SA_NS_4hashIS6_EESt8equal_toIS6_EEEE14find_node_implIS6_SG_EEPNS1_8ptr_nodeISB_EEjRKT_RKT0_(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._ZNK5boost9unordered6detail5tableINS1_3mapISaISt4pairIKP7SEXPRECN5dplyr6hybrid15hybrid_functionEEES6_SA_NS_4hashIS6_EESt8equal_toIS6_EEEE9find_nodeEjRS7_(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._ZN5boost9unordered6detail5tableINS1_3mapISaISt4pairIKP7SEXPRECN5dplyr6hybrid15hybrid_functionEEES6_SA_NS_4hashIS6_EESt8equal_toIS6_EEEE14emplace_uniqueINS1_13emplace_args1ISB_EEEES4_INS0_15iterator_detail8iteratorINS1_8ptr_nodeISB_EEEEbERS7_RKT_(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._ZN5boost9unordered13unordered_mapIP7SEXPRECN5dplyr6hybrid15hybrid_functionENS_4hashIS3_EESt8equal_toIS3_ESaISt4pairIKS3_S6_EEE7emplaceISD_EESB_INS0_15iterator_detail8iteratorINS0_6detail8ptr_nodeISD_EEEEbERKT_(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._ZN5boost9unordered13unordered_mapIP7SEXPRECN5dplyr6hybrid15hybrid_functionENS_4hashIS3_EESt8equal_toIS3_ESaISt4pairIKS3_S6_EEE6insertERKSD_(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._ZN5dplyr6hybrid11hybrid_initEP7SEXPRECS2_S2_NS0_9hybrid_idE(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._ZN5dplyr6hybrid4initEv(hybrid.cpp)
at org.renjin.cran.dplyr.hybrid__._Z22init_hybrid_inline_mapP8_DllInfo(hybrid.cpp)
at org.renjin.cran.dplyr.RcppExports__.R_init_dplyr(RcppExports.cpp)
at org.renjin.cran.dplyr.dplyr.R_init_dplyr(Unknown Source)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:78)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at org.renjin.primitives.packaging.DllInfo.initialize(DllInfo.java:137)
... 22 more
org.renjin.eval.EvalException: Exception initializing compiled GNU R library class org.renjin.cran.tidyr.tidyr
at org.renjin.primitives.packaging.DllInfo.initialize(DllInfo.java:141)
at org.renjin.primitives.packaging.Namespace.loadDynamicLibrary(Namespace.java:383)
at org.renjin.primitives.packaging.Namespace.importDynamicLibrary(Namespace.java:296)
at org.renjin.primitives.packaging.Namespace.initImports(Namespace.java:274)
at org.renjin.primitives.packaging.NamespaceRegistry.load(NamespaceRegistry.java:175)
at org.renjin.primitives.packaging.NamespaceRegistry.getNamespace(NamespaceRegistry.java:143)
at org.renjin.primitives.packaging.NamespaceRegistry.getNamespace(NamespaceRegistry.java:114)
at org.renjin.primitives.packaging.Packages.library(Packages.java:39)
at org.renjin.primitives.R$primitive$library.doApply(R$primitive$library.java:68)
at org.renjin.primitives.R$primitive$library.applyPromised(R$primitive$library.java:33)
at org.renjin.sexp.BuiltinFunction.apply(BuiltinFunction.java:100)
at org.renjin.primitives.special.InternalFunction.apply(InternalFunction.java:46)
at org.renjin.sexp.FunctionCall.eval(FunctionCall.java:80)
at org.renjin.primitives.special.BeginFunction.apply(BeginFunction.java:39)
at org.renjin.sexp.FunctionCall.eval(FunctionCall.java:80)
at org.renjin.sexp.Closure.applyPromised(Closure.java:200)
at org.renjin.sexp.Closure.apply(Closure.java:133)
at org.renjin.sexp.FunctionCall.eval(FunctionCall.java:80)
at org.renjin.sexp.ExpressionVector.eval(ExpressionVector.java:85)
at org.renjin.eval.Context.evaluate(Context.java:280)
at org.renjin.script.RenjinScriptEngine.eval(RenjinScriptEngine.java:174)
at org.renjin.script.RenjinScriptEngine.eval(RenjinScriptEngine.java:133)
at Tests.Test1.App.main(App.java:21)
Caused by: java.lang.NoSuchFieldError: _ZTVN10__cxxabiv120__si_class_type_infoE
at org.renjin.cran.tidyr.RcppExports__._ZTIN4Rcpp9exceptionE$$clinit(RcppExports.cpp)
at org.renjin.cran.tidyr.RcppExports__.<clinit>(RcppExports.cpp)
at org.renjin.cran.tidyr.tidyr.R_init_tidyr(Unknown Source)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:78)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at org.renjin.primitives.packaging.DllInfo.initialize(DllInfo.java:137)
... 22 more
I'm not quite sure how to get the code to work. If you have any ideas on how to fix this, I'd love to hear them. Please keep in mind that I am a beginner though, and I may need things in simpler/more thorough terms. If you need any more information, let me know, and I'll respond as soon as possible. Any and all help is appreciated. Thank you.
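(Since both failures in the traces above happen while library() initializes a package's compiled code, a minimal sketch like the following, which loads one package per eval call, can at least narrow down exactly which packages are affected. It assumes only the renjin-script-engine dependency from the POM above; LoadCheck is a hypothetical class name.)
import javax.script.ScriptEngine;
import org.renjin.script.RenjinScriptEngineFactory;

public class LoadCheck {
    public static void main(String[] args) {
        ScriptEngine engine = new RenjinScriptEngineFactory().getScriptEngine();
        // Try each package on its own so the first failing one is obvious.
        for (String pkg : new String[] {"dplyr", "ggplot2", "tidyr", "readr", "purrr", "tibble"}) {
            try {
                engine.eval("library(" + pkg + ")");
                System.out.println(pkg + ": OK");
            } catch (Exception e) {
                System.out.println(pkg + ": FAILED - " + e.getMessage());
            }
        }
    }
}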

Twitter Spark Streaming using Apache Spark 2.2.1 - Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging [duplicate]

I'm always getting the following error. Can somebody help me, please?
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.datastax.spark.connector.japi.DStreamJavaFunctions.<init>(DStreamJavaFunctions.java:24)
at com.datastax.spark.connector.japi.CassandraStreamingJavaUtil.javaFunctions(CassandraStreamingJavaUtil.java:55)
at SparkStream.main(SparkStream.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 20 more
It happens when I compile the following code. I've searched the web but didn't find a solution. The error appeared when I added the saveToCassandra call.
import com.datastax.spark.connector.japi.CassandraStreamingJavaUtil;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;
import java.io.Serializable;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;
/**
* Created by jonas on 10/10/16.
*/
public class SparkStream implements Serializable {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf(true)
                .setAppName("TwitterToCassandra")
                .setMaster("local[*]")
                .set("spark.cassandra.connection.host", "127.0.0.1")
                .set("spark.cassandra.connection.port", "9042");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(5000));
        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "localhost:9092");
        Set<String> topics = Collections.singleton("Test");
        JavaPairInputDStream<String, String> directKafkaStream = KafkaUtils.createDirectStream(
                ssc,
                String.class,
                String.class,
                kafka.serializer.StringDecoder.class,
                kafka.serializer.StringDecoder.class,
                kafkaParams,
                topics
        );
        JavaDStream<Tweet> createTweet = directKafkaStream.map(s -> createTweet(s._2));
        CassandraStreamingJavaUtil.javaFunctions(createTweet)
                .writerBuilder("mykeyspace", "rawtweet", mapToRow(Tweet.class))
                .saveToCassandra();
        ssc.start();
        ssc.awaitTermination();
    }

    public static Tweet createTweet(String rawKafka) {
        String[] splitted = rawKafka.split("\\|");
        Tweet t = new Tweet(splitted[0], splitted[1], splitted[2], splitted[3]);
        return t;
    }
}
My pom is the following.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.company</groupId>
<artifactId>Sentiment</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<repositories>
<repository>
<id>twitter4j.org</id>
<name>twitter4j.org Repository</name>
<url>http://twitter4j.org/maven2</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.0.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
<version>2.0.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10 -->
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.10</artifactId>
<version>1.6.2</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.9.0.0</version>
</dependency>
<dependency>
<groupId>org.twitter4j</groupId>
<artifactId>twitter4j-core</artifactId>
<version>[4.0,)</version>
</dependency>
<dependency>
<groupId>org.twitter4j</groupId>
<artifactId>twitter4j-stream</artifactId>
<version>4.0.4</version>
</dependency>
<dependency>
<groupId>org.twitter4j</groupId>
<artifactId>twitter4j-async</artifactId>
<version>4.0.4</version>
</dependency>
</dependencies>
</project>
org.apache.spark.Logging is available in Spark 1.5.2 and earlier versions; it is not in 2.0.0. Please change the versions as follows:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <version>1.5.2</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.2</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.5.2</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
  <version>1.6.2</version>
</dependency>
The error is because you are using Spark 2.0 libraries with the connector from Spark 1.6 (which looks for the Spark 1.6 logging class). Use the 2.0.5 version of the connector.
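If you stay on Spark 2.0, that would mean coordinates along these lines (a sketch; the _2.11 Scala suffix is an assumption and must match the rest of your build):
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <version>2.0.5</version>
</dependency>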
It's because the org.apache.spark.Logging class has been missing since 1.5.2, just like everybody says. (Only org.apache.spark.internal.Logging exists in later versions...)
But it seems none of the Maven-based solutions can resolve this dependency, so I just added the class to lib manually. Here is how I fixed the problem:
Package the Scala org.apache.spark.internal.Logging class into a public jar, or download it from https://raw.githubusercontent.com/swordsmanliu/SparkStreamingHbase/master/lib/spark-core_2.11-1.5.2.logging.jar (thanks to this host).
Move the jar into your Spark cluster's jars directory.
Submit your project again; hopefully it helps.
I got the solution by changing the jar mentioned above.
Initially, I had an outdated jar for spark-streaming-kafka:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
  <version>2.1.1</version>
</dependency>
I also removed multiple slf4j-log4j jars and log4j jars which I had added externally from the Spark and Kafka jar libraries.
If you're using IntelliJ, just check the 'Include dependencies with "Provided" scope' box, and that will fix the issue for you without mucking around with the POM or manually downloading files.
Download spark-core_2.11-1.5.2.logging.jar and pass it with the --jars option:
spark-submit --class com.SentimentTwiteer --packages "org.apache.spark:spark-streaming-twitter_2.11:1.6.3" --jars /root/Desktop/spark-core_2.11-1.5.2.logging.jar /root/Desktop/SentimentTwiteer.jar consumerKey consumerSecret accessToken accessTokenSecret yoursearchTag
https://github.com/sinhavicky4/SentimentTwiteer
One reason that may cause this problem is a lib and class conflict.
I faced this problem and solved it using some Maven exclusions:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.0.0</version>
  <scope>provided</scope>
  <exclusions>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <version>2.0.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
  <version>2.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
This pom.xml resolved my issue:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.1</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.6.1</version>
</dependency>
Download the jar and use --jars in spark-submit; it worked for me:
spark-submit --class com.SentimentTwiteer --packages "org.apache.spark:spark-streaming-twitter_2.11:1.6.3" --jars /root/Desktop/spark-core_2.11-1.5.2.logging.jar /root/Desktop/SentimentTwiteer.jar XX XX XX XX
Download the jar below, add it to your library, and it will work as expected:
https://raw.githubusercontent.com/swordsmanliu/SparkStreamingHbase/master/lib/spark-core_2.11-1.5.2.logging.jar
It's a version issue; try with the latest versions:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.1.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
  <version>2.1.1</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <version>2.1.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.1.0</version>
</dependency>
The logging jar is missing from your dependency list. Download the spark-core_2.11-1.5.2.logging jar from the mvn repository, then add it as an external jar to your Spark project, and you won't get the "java.lang.NoClassDefFoundError: org/apache/spark/Logging" error.
Based on your Scala version, you can download the matching jar (2.10, 2.11, etc.).

How to connect a Java project to an AWS mysql (mariadb) instance using Lambda Functions

Since my project is on AWS and I have to connect with other AWS-related instances (such as API Gateway, for example), I am trying to use a Lambda function to connect to an RDS instance (which I know for a fact is up and running). I am running in Eclipse with the AWS plugin, on Java 8. The idea is that I code in a local Eclipse, upload the function, and run it; then, in a serverless manner, it should execute the code.
No matter what combination of code I try, or what advice I follow, I do not get a successful connection, so I can't query the database.
With the current iteration of the code, I get:
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
I can't for the life of me figure out why I cannot connect from an Eclipse Java AWS Lambda function to a MySQL AWS instance. If I try to access the database from a terminal with the same credentials, I am allowed in. I have read and followed the official documentation 4-5 times already (originally I was following this; now it is slightly tweaked). I am unsure whether this is enough for a simple connection (I have been told, and have read, that it is), but I wonder if there is some other way of doing it.
This is what I execute. Since the plugin makes me give "a" JSON even though I do not need it, I give it one like this: {"key1" : "value1"}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import org.apache.log4j.Logger;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public class Adapt_LambdaConnection implements RequestHandler<Object, String> {

    private static Logger log = Logger.getLogger(Adapt_LambdaConnection.class.getName());

    @Override
    public String handleRequest(Object input, Context context) {
        context.getLogger().log("Input: " + input);
        Connection con = null;
        try {
            con = DriverManager.getConnection(
                    "jdbc:mysql://cmf6lelghjzq.eu-west-1.rds.amazonaws.com:3306/botdb", "bot", "PassWordRandom");
        } catch (SQLException e) {
            log.warn(e.toString());
            System.out.println(e + "\nSQLException");
        } catch (Exception e) {
            System.out.println(e + "\ngenericException");
        }
        String status = null;
        if (con != null) {
            status = "connection established";
            System.out.println(status);
        } else {
            status = "connection failed";
        }
        return status;
    }
}
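(As a hedged aside on the JDBC side, not a confirmed fix: with the older mysql-connector-java 5.1.x driver from the POM below packed into a shaded Lambda jar, the driver sometimes has to be registered explicitly before the DriverManager.getConnection call above, e.g.:)
// Register the 5.1.x driver class explicitly before opening the connection;
// com.mysql.jdbc.Driver is the class shipped with mysql-connector-java 5.1.45.
Class.forName("com.mysql.jdbc.Driver");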
And this is the POM. (I was using a much lighter POM, but since nothing was working, I am now using one I know for sure is right, since it is cloned from an unrelated project we run that has no issues.)
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.spa</groupId>
<artifactId>demo</artifactId>
<version>1.0.0</version>
<packaging>jar</packaging>
<properties>
<jdk.version>1.8</jdk.version>
<encoding>UTF-8</encoding>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<hadoop.version>2.7.1</hadoop.version>
<scala.version>2.11.8</scala.version>
<scala.tools.version>2.11</scala.tools.version>
<spark.version>2.2.1</spark.version>
<aws.version>1.11.191</aws.version>
<spark-csv.version>1.5.0</spark-csv.version>
<commons-codec.version>1.10</commons-codec.version>
<log4j.version>1.2.17</log4j.version>
<json.version>20160212</json.version>
<slack.version>1.3.0</slack.version>
</properties>
<repositories>
<repository>
<id>SparkPackagesRepo</id>
<url>http://dl.bintray.com/spark-packages/maven</url>
</repository>
</repositories>
<build>
<finalName>TypeformSurveysTransformer</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
<configuration>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
</plugin>
</plugins>
</build>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-bom</artifactId>
<version>1.11.283</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.45</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-reflect</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.tools.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_2.10</artifactId>
<version>${spark-csv.version}</version>
</dependency>
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>${commons-codec.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>${json.version}</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-events</artifactId>
<version>1.3.0</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-core</artifactId>
<version>1.1.0</version>
</dependency>
<dependency>
<groupId>net.gpedro.integrations.slack</groupId>
<artifactId>slack-webhook</artifactId>
<version>${slack.version}</version>
</dependency>
</dependencies>
</project>

migrating from stash to bitbucket

My company is migrating from Stash to Bitbucket, and they have some REST extension plugins for Stash that no longer work in Bitbucket. The plugins were developed by them, internally, and uploaded in the add-ons section (Manage add-ons).
I would like to know whether there is any possibility to adapt those plugins and make them work in Bitbucket, or whether we must rewrite the code.
According to this tutorial:
https://confluence.atlassian.com/bitbucketserver/how-to-update-your-add-on-779302412.html
all I have to do is rename our package namespace from com.atlassian.stash to com.atlassian.bitbucket. I did that in the pom.xml file and in the code as well, but now I have lots of errors.
For example, the class CommitService is no longer recognized after I changed the imports in the code from:
import com.atlassian.stash.commit.CommitService;
to
import com.atlassian.bitbucket.commit.CommitService;
(Now this line:
private final CommitService commitService;
generates the error "Cannot resolve symbol CommitService").
Can anyone help, please? I don't have much experience with plugin development.
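(A hedged aside: in Bitbucket Server, the com.atlassian.bitbucket.commit.CommitService interface ships in the bitbucket-api jar under the com.atlassian.bitbucket.server group, so for the import to resolve, the POM below needs that dependency resolvable to a concrete version, along the lines of this sketch; 4.14.4 is a purely illustrative version.)
<dependency>
  <groupId>com.atlassian.bitbucket.server</groupId>
  <artifactId>bitbucket-api</artifactId>
  <version>4.14.4</version>
  <scope>provided</scope>
</dependency>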
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.myCompany.bitbucket.plugins</groupId>
<artifactId>myCompany-rest-extensions</artifactId>
<version>1.2.2</version>
<organization>
<name>MyCompany</name>
<url>http://www.myCompany.com/</url>
</organization>
<name>myCompany-rest-extensions</name>
<description>This is the myCompany-rest-extensions plugin for Atlassian bitbucket.</description>
<packaging>atlassian-plugin</packaging>
<dependencies>
<dependency>
<groupId>com.atlassian.bitbucket</groupId>
<artifactId>bitbucket-parent</artifactId>
<version>${bitbucket.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>com.atlassian.sal</groupId>
<artifactId>sal-api</artifactId>
<version>${bitbucket.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.atlassian.bitbucket</groupId>
<artifactId>bitbucket-api</artifactId>
<version>${bitbucket.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.atlassian.bitbucket</groupId>
<artifactId>bitbucket-spi</artifactId>
<version>${bitbucket.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.atlassian.bitbucket</groupId>
<artifactId>bitbucket-page-objects</artifactId>
<version>${bitbucket.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.atlassian.bitbucket.server</groupId>
<artifactId>bitbucket-api</artifactId>
<version>${bitbucket.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>${bitbucket.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>2.6</version>
<scope>provided</scope>
</dependency>
<!-- WIRED TEST RUNNER DEPENDENCIES -->
<dependency>
<groupId>com.atlassian.plugins</groupId>
<artifactId>atlassian-plugins-osgi-testrunner</artifactId>
<version>${plugin.testrunner.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>jsr311-api</artifactId>
<version>1.1.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.sun.xml.bind</groupId>
<artifactId>jaxb-impl</artifactId>
<version>2.2.7</version>
</dependency>
<dependency>
<groupId>javax.xml.bind</groupId>
<artifactId>jaxb-api</artifactId>
<version>2.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.2.2-atlassian-1</version>
</dependency>
<dependency>
<groupId>com.atlassian.plugins.rest</groupId>
<artifactId>atlassian-rest-common</artifactId>
<version>1.0.2</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.atlassian.maven.plugins</groupId>
<!-- <artifactId>maven-bitbucket-plugin</artifactId>-->
<artifactId>bitbucket-maven-plugin</artifactId>
<version>${amps.version}</version>
<extensions>true</extensions>
<configuration>
<products>
<product>
<id>bitbucket</id>
<instanceId>bitbucket</instanceId>
<version>${bitbucket.version}</version>
<dataVersion>${bitbucket.data.version}</dataVersion>
</product>
</products>
</configuration>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
</plugins>
</build>
<properties>
<bitbucket.version>${bitbucket.version}</bitbucket.version>
<bitbucket.data.version>${bitbucket.version}</bitbucket.data.version>
<amps.version>5.0.13</amps.version>
<plugin.testrunner.version>1.2.3</plugin.testrunner.version>
</properties>
</project>
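(One thing that stands out in the properties above, as an aside: bitbucket.version and bitbucket.data.version are defined in terms of themselves, so Maven cannot resolve them to a concrete version. A sketch of what they presumably need to look like, again with 4.14.4 as a purely illustrative Bitbucket Server version:)
<properties>
  <!-- illustrative value only; use the Bitbucket Server version you target -->
  <bitbucket.version>4.14.4</bitbucket.version>
  <bitbucket.data.version>${bitbucket.version}</bitbucket.data.version>
  <amps.version>5.0.13</amps.version>
  <plugin.testrunner.version>1.2.3</plugin.testrunner.version>
</properties>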
import com.atlassian.bitbucket.commit.CommitService;
// other imports

@Path("/stash-projects")
public class ProjectsListRestResource {

    private final CommitService commitService;
    // other declarations

    public ProjectsListRestResource(RefService refService, CommitService commitService, ......) {
        this.commitService = commitService;
        // other initializations
    }

    @GET
    @AnonymousAllowed
    @Produces({MediaType.APPLICATION_JSON, MediaType.APPLICATION_XML})
    public Response getMessage() {
        log.info("Rest plugin main function start");
        List<ProjectRestResourceModel> projects = new ArrayList<ProjectRestResourceModel>();
        PageRequest repoPageReq = new PageRequestImpl(0, 100);
        for (String projectkey : projectService.findAllKeys()) {
            Project project = projectService.getByKey(projectkey);
            ProjectRestResourceModel projectRestResourceModel = new ProjectRestResourceModel();
            projectRestResourceModel.setKey(projectkey);
            // ..............
            Page<? extends Repository> repoPage = repositoryService.findByProjectKey(projectkey, repoPageReq);
            Set<HashMap<String, String>> repoList = new HashSet<HashMap<String, String>>();
            for (Repository r : repoPage.getValues()) {
                HashMap<String, String> repo = new HashMap<String, String>();
                repo.put("repoName", r.getName());
                // .....
                if (mirrorHook != null && mirrorHook.isEnabled()) {
                    //....
                }
                repoList.add(repo);
            }
            projectRestResourceModel.setRepoList(repoList);
            Map<Permission, Set<String>> groupMap = new HashMap<Permission, Set<String>>();
            //....
            groupMap.put(Permission.PROJECT_ADMIN, new HashSet<String>());
            for (PermittedGroup pg : permittedGroupProjectPage.getValues()) {
                //...
            }
            projectRestResourceModel.setAdminGroups(groupMap.get(Permission.PROJECT_ADMIN));
            userMap.put(Permission.PROJECT_ADMIN, new HashSet<String>());
            for (PermittedUser pu : permittedUserProjectPage.getValues()) {
                //...
            }
            projectRestResourceModel.setAdminUsers(userMap.get(Permission.PROJECT_ADMIN));
            projects.add(projectRestResourceModel);
        }
        return Response.ok(projects, MediaType.APPLICATION_JSON_TYPE).build();
    }
}

