java.lang.ClassNotFoundException when trying to run camus - java

I downloaded the Confluent package, which includes the Camus jars, and followed the setup instructions online.
Hadoop is set up properly (meaning I can use hadoop fs -ls and other hadoop jar commands). However, when I tried to run
hadoop jar confluent-camus-1.0.jar com.linkedin.camus.etl.kafka.CamusJob
I got a ClassNotFoundException in the "main" thread:
Exception in thread "main" java.lang.ClassNotFoundException: com.linkedin.camus.etl.kafka.CamusJob
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at org.apache.hadoop.util.RunJar.main(RunJar.java:205)
The path to "confluent-camus-1.0.jar" is correct (right under the folder). I hadn't started the Kafka service; I was just trying to run the job.
Has anyone run into similar problems?
Thanks.

You should try to inspect your jar file:
jar tvf confluent-camus-1.0.jar | grep com.linkedin.camus.etl.kafka.CamusJob
If you do not find this class there, look for it in the other jars generated by the Camus build.
Then add the target jar with:
hadoop jar confluent-camus-1.0.jar com.linkedin.camus.etl.kafka.CamusJob -libjars {JAR_NAME}
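Note that -libjars is a generic Hadoop option and is only honored when the driver parses generic options, typically by running through ToolRunner; also, -libjars only ships dependencies, so the main class itself must still be inside the jar you pass to hadoop jar. A minimal sketch of such a driver (the class name MyEtlDriver is illustrative, not Camus's actual code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyEtlDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf(); // jars passed via -libjars are already registered in this conf
        // ... build and submit the MapReduce job using conf ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips generic options (-libjars, -D, -files) before calling run()
        System.exit(ToolRunner.run(new Configuration(), new MyEtlDriver(), args));
    }
}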

Related

Hadoop MapReduce ClassNotFoundException Error

I have extracted a jar from a Maven project that runs a MapReduce job. However, I keep receiving the error "java.lang.ClassNotFoundException". The things I have tried in order to fix this:
Configured the classpath
Tried doing job.setJar(.jar)
Attempted job.setJarByClass(.class)
Changing JobConf path file
Caress Hadoop and tell it everything is going to be okay
I extracted the jar file from Maven, transferred it to a Linux server, and am running it from there.
The full error message is:
Exception in thread "main" java.lang.ClassNotFoundException: BLAMapAttempt2
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
If there is any missing information needed to answer this question, please let me know, and thank you for reading.
When exporting the jar out of Eclipse, I unchecked "classpath" because I was not running it locally. This fixed my error.
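For reference, a common programmatic safeguard against this error is to tell Hadoop which jar to ship by referencing a class inside it. A minimal driver sketch (BLAMapAttempt2 stands in for the asker's class; mapper/reducer wiring is elided):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class BLAMapAttempt2 {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "bla-map-attempt");
        // Point Hadoop at the jar this class was loaded from, so tasks
        // can resolve it by its fully qualified name at runtime.
        job.setJarByClass(BLAMapAttempt2.class);
        // ... setMapperClass / setReducerClass / input and output paths ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}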

Hbase example, Exception in thread "main" java.lang.NoClassDefFoundError

We are trying to execute a basic HBase example on the Hortonworks sandbox (2.3):
hadoop jar /usr/hdp/2.3.0.0-2557/hbase/lib/hbase-examples.jar org.apache.hadoop.hbase.mapreduce.IndexBuilder
We get the exception below after executing this program:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
at org.apache.hadoop.hbase.mapreduce.IndexBuilder.<clinit>(IndexBuilder.java:67)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.util.Bytes
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 5 more
Based on this error we tried to set the Hadoop classpath in hbase-env.sh:
/usr/hdp/2.3.0.0-2557/hbase/lib/hbase-client-1.1.1.2.3.0.0-2557.jar:/usr/hdp/2.3.0.0-2557/hbase/lib/hbase-common-1.1.1.2.3.0.0-2557.jar:/usr/hdp/2.3.0.0-2557/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.0.0-2557/hbase/lib/guava-12.0.1.jar:/usr/hdp/2.3.0.0-2557/hbase/lib/zookeeper.jar:/usr/hdp/2.3.0.0-2557/hbase/lib/hbase-protocol-1.1.1.2.3.0.0-2557.jar:/usr/hdp/2.3.0.0-2557/hbase/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.0.0-2557/hbase/lib/hadoop-common.jar:/usr/hdp/2.3.0.0-2557/hbase/lib/hbase-0.94.27.jar
But still getting the same error.
Instead of manually adding jars to the classpath, you can use the command below directly.
$(hbase classpath) recursively searches the Hortonworks Hadoop folders and finds the required jars in the sandbox:
HADOOP_CLASSPATH=$(hbase classpath):/usr/hdp/2.3.0.0-2557/hbase/conf hadoop jar /usr/hdp/2.3.0.0-2557/hbase/lib/hbase-examples.jar org.apache.hadoop.hbase.mapreduce.IndexBuilder
When I face a NoClassDefFoundError with MapReduce, I add the jar by referencing one of its classes in the job setup, e.g.:
Job job = Job.getInstance(conf); // Job.getInstance replaces the deprecated new Job(conf)
// Ship the jar containing Bytes (hbase-common) with the job
job.setJarByClass(org.apache.hadoop.hbase.util.Bytes.class);
Alternatively, supply the jars to your job using the -libjars parameter, e.g.:
LIB=hbase-x.x.x.jar
hadoop jar /usr/hdp/2.3.0.0-2557/hbase/lib/hbase-examples.jar org.apache.hadoop.hbase.mapreduce.IndexBuilder -libjars ${LIB}
You can also add the jar to the HADOOP_CLASSPATH variable before launching the job.
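For HBase jobs specifically, another option is to let HBase ship its own dependency jars to the tasks via the distributed cache. A minimal sketch, assuming the HBase client jars are already on the driver's classpath (the class name IndexBuilderDriver is hypothetical); note this addresses task-side errors, while a client-side launch failure like the one above still needs HADOOP_CLASSPATH:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class IndexBuilderDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "index-builder");
        job.setJarByClass(IndexBuilderDriver.class);
        // Ships the HBase client jars (hbase-common, zookeeper, guava, ...)
        // through the distributed cache so map/reduce tasks can load them.
        TableMapReduceUtil.addDependencyJars(job);
        // ... mapper/reducer and table input/output setup elided ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}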
Is all the latest code included in the jar? Use a Java decompiler such as JD-GUI to look inside the jar file and make sure the class you are referencing is actually there. Also check that the necessary import statements are present in the Java class.

Hadoop external jars

I am trying to run a Hadoop job on a server. The version is 0.20.2.
I have a large number of jars, and I am running:
hadoop jar GenData.jar -libjars /path/jar1,path/jar2,...
I am getting the error below even though the corresponding classes are inside the jars:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/mapreduce/AvroKeyInputFormat
at GenerateTrainningData.main(GenerateTrainningData.java:256)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.mapreduce.AvroKeyInputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
It looks like you are getting this exception on the Hadoop client side; the MapReduce driver code runs in the client JVM. In Hadoop, -libjars is a generic option used to ship dependent jars to the mappers and reducers. In your case, to add jars to the client classpath, set the following environment variable before executing the hadoop command:
export HADOOP_CLASSPATH=<PATH_to_jar>/Jar1:<PATH_to_jar>/Jar2;
(A colon ":" separates multiple jars. In your case, add the jar that contains the class org.apache.avro.mapreduce.AvroKeyInputFormat.)
Update:
First you need to find the jar containing the class org.apache.avro.mapreduce.AvroKeyInputFormat. The class is inside avro-mapred-*.jar (get a version of avro-mapred compatible with your setup); include it in your classpath using the command above.
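To double-check whether a class is actually visible on the client classpath, a throwaway utility like this can help (hypothetical helper, not part of Hadoop; compile it and run it with the same classpath you export):

public class ClasspathCheck {
    public static void main(String[] args) {
        String name = args.length > 0
                ? args[0]
                : "org.apache.avro.mapreduce.AvroKeyInputFormat";
        try {
            Class<?> c = Class.forName(name);
            // CodeSource is null only for classes from the bootstrap classpath
            System.out.println(name + " loaded from "
                    + c.getProtectionDomain().getCodeSource());
        } catch (ClassNotFoundException e) {
            System.out.println(name + " is NOT on the classpath");
        }
    }
}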
You are missing the avro-mapred dependency.

Java running code from command line

I'm on Mac OS X 10.9.2 (Mavericks). I have code that uses an external Java library (twitter4j). It runs fine when I run it through NetBeans. However, trying to run almost identical code in the terminal gives me errors.
My directory structure is straightforward: I have a 'src' folder with the .java file and a 'lib' folder with the external .jar files I use.
From the src folder, I call javac -cp "../lib/*" MyProgram.java which seems to work alright. Now, if I call java MyProgram, I get an exception:
Exception in thread "main" java.lang.NoClassDefFoundError: twitter4j/StreamListener
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2693)
at java.lang.Class.privateGetMethodRecursive(Class.java:3040)
at java.lang.Class.getMethod0(Class.java:3010)
at java.lang.Class.getMethod(Class.java:1776)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: twitter4j.StreamListener
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
It seems to have trouble properly importing the external twitter4j library. What am I missing here?
These are my import statements in the code:
import twitter4j.*;
import twitter4j.conf.ConfigurationBuilder;
UPDATE: Using suggestions below, I also tried running it via java -cp "../lib/*" MyProgram which gives: "Could not find or load main class MyProgram"
You need to specify the classpath when running the program as well as when compiling it, and it must also include the directory containing your compiled MyProgram.class (here the current directory); otherwise the JVM cannot find the main class itself. E.g., from the src folder:
java -cp ".:../lib/*" MyProgram

Error running spoon on Ubuntu 14.04 64 bit

I have been using the Spoon tool from Pentaho Data Integration for a long time, and it used to work fine on my system. But since I moved it to /opt I am unable to run it again. I have Oracle Java 8 installed, and each time I try to run it I end up with the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/eclipse/swt/widgets/Composite
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2688)
at java.lang.Class.getMethod0(Class.java:2937)
at java.lang.Class.getMethod(Class.java:1771)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:149)
Caused by: java.lang.ClassNotFoundException: org.eclipse.swt.widgets.Composite
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Please help me resolve this error; I haven't found any solution yet.
I found a solution to this problem. I removed all the hidden directories generated by Kettle, as well as its copy under /opt, and then extracted a fresh copy. After that I added /opt/data-integration to my PATH variable and tried to run it from my home directory. That run was not successful, but it generated all the dependent hidden folders required to run the tool. Then I had to go to the directory by issuing
cd /opt/data-integration
and then I was able to run it successfully by issuing
sh spoon.sh
I had to go to that directory because the Pentaho developers set it up that way, using a relative path to the launcher folder in the main command in spoon.sh.
