Apache Spark: issue with Scala example - java

I'm trying to learn to use Apache Spark and I have a problem with a simple example, but I cannot find a solution. I'm working on Ubuntu 13.04 with Java 7 (Oracle) and Scala 2.9.3.
When I try to run SparkPi examples I get this output:
filippo#filippo-HP-Pavilion-dv6-Notebook-PC:/usr/local/spark$ ./bin/run-example SparkPi 10
java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:337)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
This is the example shown in the Spark documentation, but I don't understand what the problem is :(

You may have downloaded a source release rather than a pre-built distribution?
To build and assemble with sbt, you can run sbt assembly in the Spark root directory.
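Roughly, a sketch assuming a Spark source checkout in /usr/local/spark (the bundled sbt launcher path varies slightly between versions):
cd /usr/local/spark
sbt/sbt assembly            # builds Spark plus the bundled examples jar
./bin/run-example SparkPi 10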

Your installation directory is /usr/local/spark, which doesn't contain the required class.
Try simply extracting a pre-built tgz file from http://spark.apache.org/downloads.html, cd into the directory, and run the example command.
Make sure that lib/spark-examples-XXX-YYY.jar is present when you run bin/run-example; see the check below.
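For example, with a pre-built package (the file names below are placeholders for whichever version you downloaded):
tar xzf spark-*-bin-*.tgz
cd spark-*-bin-*
ls lib/spark-examples-*.jar    # should list the examples jar
./bin/run-example SparkPi 10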

Related

Raspberry Pi - Java - Serial Com

I've got a serious problem with my Raspberry Pi (OS: Raspbian) and Java (JDK-7-Armhf): my code won't execute without throwing an exception.
I've been reading and trying several proposals and nothing has worked yet.
So now I'm confused about what went wrong.
It goes like this:
I've got Java source code that runs fine in my Eclipse IDE.
But when I export the .jar file, even with the libraries (JRE System lib. _86, JavaSE1.7, javax.comm), as a "Runnable JAR" or a plain .jar, and execute it on my Raspberry Pi, I get "NoClassDefFoundError: SerialPortEventListener"!
I just don't know why it won't find the library and use it.
On my Pi I have librxtx-java installed and JDK-7-oracle-armhf.
librxtx-java should set its JAVA_PATH on its own when installed, but I'm not sure that was done correctly.
(I've got a folder /usr/jni that contains librxtxSerial.so, and
/usr/lib/jvm/jdk-7-oracle-armhf/.)
Looking at the javax.RXTXcomm library in Eclipse, I've got RXTXcomm.jar, which contains SerialPortEventListener, so it is defined in the .jar.
I'm wondering: could there be a missing link between my Java JVM and the Linux serial-port driver? My JRE is working in some way, because it will execute code that reads out the HOSTNAME and IP address.
So, does anyone know how to fix this?
This is a readout of the Java exception:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/comm/SerialPortEventListener
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:792)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
Caused by: java.lang.ClassNotFoundException: javax.comm.SerialPortEventListener
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 13 more
A copy of my code, "SimpleRead.java", is the original from the Java Comm API samples.
You have to change the references from javax.comm to gnu.io in your source code; RXTX ships the same serial API under the gnu.io package.
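For example, a quick way to rewrite the references in place, assuming the imports match the stock SimpleRead.java sample (sed keeps a .bak backup in case other javax.comm usages need manual attention):
sed -i.bak 's/javax\.comm/gnu.io/g' SimpleRead.java
You still need RXTXcomm.jar on the classpath and librxtxSerial.so on the java.library.path when you run the program.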
Try the following:
Switch to root and run the program you created.
If that helps, run
sudo adduser username dialout
then log out and log back in from all terminals and GUIs so the new group membership takes effect.
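You can verify the change after logging back in (username and the device name are placeholders; the Pi's on-board serial port is usually /dev/ttyAMA0):
groups username        # should now list "dialout"
ls -l /dev/ttyAMA0     # the serial device is typically owned by group "dialout"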

Exception in thread "main" java.lang.NoClassDefFoundError: java/util/function/Predicate

I have created a jar file using
mvn assembly:assembly -DdescriptorId=jar-with-dependencies
and I run it on Windows, where it works as expected. Then I run it on Ubuntu and it gives the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: java/util/function/Predicate
at Maxima_ImageJ.run(Maxima_ImageJ.java:13)
at Maxima_ImageJ.main(Maxima_ImageJ.java:27)
Caused by: java.lang.ClassNotFoundException: java.util.function.Predicate
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 2 more
I have no idea why it behaves differently on Windows and Ubuntu. If someone does, please help. Is it related to the Java versions?
Set your class path to Java 1.8, as java.util.function.Predicate is part of Java SE 8 and is not available in 1.7. Here are some of the set commands to use before executing the Maven command (paths are from my machine; adjust to yours):
set PATH=C:\Program Files\Java\jdk1.8.0_05\bin;%PATH%
set JRE_HOME=C:\Program Files\Java\jre8
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_05
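After setting these variables, you can confirm which JDK the shell and Maven actually pick up:
java -version
mvn -version    # also prints the Java version Maven builds with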
On Ubuntu, check your Java version with the command below:
readlink -f $(which java)
If it is less than 1.8, then you have to update your Java version.
One way is to edit your .bashrc file.
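A minimal sketch of the .bashrc approach, assuming a Java 8 JDK is already unpacked under /usr/lib/jvm (the directory name is a placeholder; adjust to your installation):
# in ~/.bashrc
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_05
export PATH=$JAVA_HOME/bin:$PATH
Then run source ~/.bashrc and re-check with readlink -f $(which java).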

java.lang.ClassNotFoundException when trying to run camus

I downloaded the Confluent package, which includes the Camus jars, and followed the instructions online.
Hadoop is properly set up (meaning I can use hadoop fs -ls and other hadoop jar commands). However, when I tried to run
hadoop jar confluent-camus-1.0.jar com.linkedin.camus.etl.kafka.CamusJob
I got a ClassNotFoundException in thread "main":
Exception in thread "main" java.lang.ClassNotFoundException: com.linkedin.camus.etl.kafka.CamusJob
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at org.apache.hadoop.util.RunJar.main(RunJar.java:205)
The path to confluent-camus-1.0.jar is correct (right under the folder). I didn't start the Kafka service; I just tried to run the job.
Has anyone had similar problems?
Thanks.
You should try to inspect your jar file:
jar tvf confluent-camus-1.0.jar | grep com.linkedin.camus.etl.kafka.CamusJob
If you do not find this class, look for it in the other jars generated by Camus.
Once you find it, add the target jar with
hadoop jar confluent-camus-1.0.jar com.linkedin.camus.etl.kafka.CamusJob -libjars {JAR_NAME}
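If you are unsure which jar ships the class, a small loop over the jars in the Confluent package can locate it (run it in the directory containing the jars):
for j in *.jar; do
  jar tf "$j" | grep -q 'com/linkedin/camus/etl/kafka/CamusJob.class' && echo "$j"
done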

Hadoop external jars

I am trying to run a hadoop job on a server. The version is 0.20.2.
I have a large number of jars, and I am running:
hadoop jar GenData.jar -libjars /path/jar1,path/jar2,...
I am getting the error below even though the corresponding classes are inside the jars:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/mapreduce/AvroKeyInputFormat
at GenerateTrainningData.main(GenerateTrainningData.java:256)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.mapreduce.AvroKeyInputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
It looks like you are getting this exception on the Hadoop client side; MapReduce driver code executes in the client JVM. In Hadoop, -libjars is a generic option used for adding dependent jars to the mapper/reducer classpath. In your case, to add jars to the client classpath, set the following environment variable before executing the hadoop command:
export HADOOP_CLASSPATH=<PATH_to_jar>/Jar1:<PATH_to_jar>/Jar2
(A colon ":" separates multiple jars. In your case, add the jar that contains the class org.apache.avro.mapreduce.AvroKeyInputFormat.)
Edit:
First of all, you need to find the jar containing the class org.apache.avro.mapreduce.AvroKeyInputFormat. The class lives inside avro-mapred*.jar (get a compatible avro-mapred-version.jar from the internet) and you should include it in your classpath using the command above.
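Putting both steps together, something like this (the avro-mapred version and paths are placeholders; use whichever build matches your Avro/Hadoop setup):
export HADOOP_CLASSPATH=/path/to/avro-mapred-1.7.6-hadoop1.jar:$HADOOP_CLASSPATH    # client-side JVM
hadoop jar GenData.jar -libjars /path/to/avro-mapred-1.7.6-hadoop1.jar              # ships it to tasks as well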
You are missing the avro-mapred dependency.

Hadoop 2.2.0 mapreduce job not running after upgrading from hadoop 1.0.4

I have upgraded my Hadoop version from 1.0.4 to 2.2.0. The MapReduce job was running fine earlier. I have now added almost all the jars provided with Hadoop 2.2.0, but it still gives me this exception. Let me know where I am going wrong.
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/protobuf/ServiceException
at org.apache.hadoop.ipc.ProtobufRpcEngine.<clinit>(ProtobufRpcEngine.java:69)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1659)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1624)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1718)
at org.apache.hadoop.ipc.RPC.getProtocolEngine(RPC.java:203)
at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:328)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:235)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:139)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:510)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:453)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2433)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:166)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.ServiceException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
Thanks in advance.
Please check your protobuf dependency. AFAIR Hadoop 2.x needs protobuf 2.5.x (please check your Hadoop dependencies), and you could have picked up an outdated version from some external component.
Just as a note: if your job uses the org.apache.hadoop.mapreduce package, it could be binary-incompatible with Hadoop 2.x, and you should recompile in that case. There should be no such issue with the "old" API, org.apache.hadoop.mapred, but I'd recommend recompiling in any case.
Hope this at least helps.
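If the job is built with Maven, one way to see which protobuf version it actually pulls in is the dependency tree (assuming a Maven build):
mvn dependency:tree -Dincludes=com.google.protobuf:protobuf-java
Hadoop 2.2.0 itself was built against protobuf-java 2.5.0, so that is the version to align on.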
