I am using the Restlet framework.
I am trying to run my project from a jar file created using Eclipse, by doing: Export->Runnable JAR File, and selecting the option Package required libraries into generated jar.
However, when I try to execute the jar file on the command line by typing:
java -Djava.security.policy=Client.Policy -jar identiscopeRunnable.jar
I get the following:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
Caused by: java.lang.NoClassDefFoundError: org/restlet/service/TunnelService
at rest.IdentiscopeServer.main(IdentiscopeServer.java:24)
... 5 more
Caused by: java.lang.ClassNotFoundException: org.restlet.service.TunnelService
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 6 more
I have added all the jar files downloaded from the Restlet Framework to my project, so I presume it is not a problem with them. Does anyone have any clue about this?
Just in case anyone asks, line 24 of IdentiscopeServer.java is:
IdentiscopeServerApplication identiscopeServerApp = new IdentiscopeServerApplication();
The class IdentiscopeServerApplication basically does this:
@Override
public Restlet createInboundRoot() {
    Router router = new Router(getContext());
    // attaches the /tweet path to the TweetRest class
    router.attach("/collectionPublic", CollectionPublicREST.class);
    router.attach("/collectionPrivate", CollectionPrivateREST.class);
    router.attach("/analysis", AnalysisREST.class);
    return router;
}
Adding the jars to your Eclipse project will not add them to your command-line classpath. Note also that java ignores -cp entirely when -jar is given, so put the jars on the classpath and name the main class explicitly:
java -Djava.security.policy=Client.Policy -cp identiscopeRunnable.jar:<your jars separated by ';' (Windows) or ':' (Linux)> rest.IdentiscopeServer
See if this helps.
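If it still fails, check whether the Restlet jars were actually packaged into the runnable jar. With the "Package required libraries" option, Eclipse nests the library jars whole inside the runnable jar and lists them in the manifest, so both should be visible (a quick sanity check, assuming the jar sits in the current directory):
jar tf identiscopeRunnable.jar | grep -i restlet
unzip -p identiscopeRunnable.jar META-INF/MANIFEST.MF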
Related
I'm a Hadoop & HBase newbie. I've already run the WordCount example successfully. Now I've modified the Mapper to use an HBase row as input data, so I need to import some HBase classes.
After I rebuild WordCount.jar and run:
$ hadoop jar ./out/artifacts/WordCount_jar/WordCount.jar WordCount
I got an error like:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at WordCount.main(WordCount.java:83)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
So I copied all the HBase libraries to one folder and set HADOOP_CLASSPATH:
$ export HADOOP_CLASSPATH=/home/kayuuzu/jar/*
$ hadoop fs -put /home/kayuuzu/jar/* /home/kayuuzu/jar/
$ hadoop jar ./out/artifacts/WordCount_jar/WordCount.jar WordCount
Now it finds the HBase classes, but prints an error like:
Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://mycluster/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/mapreduce2/hadoop-mapreduce-client-core-2.3.0-cdh5.0.1.jar
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1128)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
at WordCount.main(WordCount.java:101)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
My hadoop classpath:
$ hadoop classpath
/home/kayuzu/jar/*:/ldata/bin/hadoop-2.3.0-cdh5.0.1/etc/hadoop:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/common/lib/*:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/common/*:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/hdfs:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/hdfs/lib/*:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/hdfs/*:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/yarn/lib/*:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/yarn/*:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/mapreduce/lib/*:/ldata/bin/hadoop-2.3.0-cdh5.0.1/share/hadoop/mapreduce/*
Strangely, it seems to use "/ldata/bin/hadoop-2.3.0-cdh5.0.1" (my Hadoop installation path) to expand the classpath, and it tries to load the jars from the HDFS filesystem just as it would from the local one.
If I move hadoop-mapreduce-client-core-2.3.0-cdh5.0.1.jar to /home/kayuuzu/jar/ and upload it to hdfs://home/kayuuzu/jar/, this error goes away, but then it fails to load another class. It seems Hadoop tries to load classes from HDFS using the same paths as on my local machine.
I guess it would work if I moved all the Hadoop library files to one directory and uploaded them to HDFS under the same path, but that would wreck my local Hadoop installation, and there are a lot of jar files.
Have I misunderstood something? How do I specify the library path for a remote MapReduce job?
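A common way to handle this, rather than mirroring local jar paths into HDFS, is to keep the jars local: put them on the client classpath via HADOOP_CLASSPATH and ship them to the tasks with -libjars. A sketch, assuming the hbase launcher script is on your PATH (hbase classpath prints the full HBase classpath) and with illustrative jar names; -libjars also only applies if the driver goes through ToolRunner, as shown further below:
$ export HADOOP_CLASSPATH=$(hbase classpath)
$ hadoop jar ./out/artifacts/WordCount_jar/WordCount.jar WordCount -libjars /home/kayuuzu/jar/hbase-client.jar,/home/kayuuzu/jar/hbase-common.jar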
I downloaded the Confluent package, which includes the Camus jars, and followed the instructions online.
Hadoop is properly set up (meaning I can use hadoop fs -ls and other hadoop jar commands). However, when I tried to run:
hadoop jar confluent-camus-1.0.jar com.linkedin.camus.etl.kafka.CamusJob
I got "main" classNotFound error
Exception in thread "main" java.lang.ClassNotFoundException: com.linkedin.camus.
etl.kafka.CamusJob
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at org.apache.hadoop.util.RunJar.main(RunJar.java:205)
The path to confluent-camus-1.0.jar is correct (right under the folder). I didn't start the Kafka service; I just tried to run the job.
Has anyone had similar problems?
Thanks.
You should try to inspect your jar file:
jar tvf confluent-camus-1.0.jar | grep com/linkedin/camus/etl/kafka/CamusJob
If you do not find the class there, look for it in the other jars generated by Camus (a quick way to search them all is shown below).
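If you need to search a whole directory of jars for the class, a small shell loop (a sketch) does it:
for j in *.jar; do jar tf "$j" | grep -q CamusJob && echo "$j"; done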
Then add the target jar with:
hadoop jar confluent-camus-1.0.jar com.linkedin.camus.etl.kafka.CamusJob -libjars {JAR_NAME}
I am trying to run a Hadoop job on a server. The version is 0.20.2.
I have a large number of jars, and I am running:
hadoop jar GenData.jar -libjars /path/jar1,path/jar2,...
I am getting the error below even though the corresponding classes are inside the jars:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/mapreduce/AvroKeyInputFormat
at GenerateTrainningData.main(GenerateTrainningData.java:256)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.mapreduce.AvroKeyInputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
It looks like you are getting this exception on the Hadoop client side; MapReduce driver code executes in the client JVM. In Hadoop, -libjars is a generic option used for adding dependent jars to the mapper/reducer classpath. In your case, to add jars to the client's classpath, set the following environment variable before executing the hadoop command:
export HADOOP_CLASSPATH=<PATH_to_jar>/Jar1:<PATH_to_jar>/Jar2
(A colon ":" separates multiple jars. In your case, add the jar that contains the class org.apache.avro.mapreduce.AvroKeyInputFormat.)
Edit:
First you need to find the jar containing the class org.apache.avro.mapreduce.AvroKeyInputFormat. It lives in avro-mapred-<version>.jar (get a version compatible with your Avro and Hadoop setup) and include it in your classpath using the command above.
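One caveat worth adding: -libjars is handled by GenericOptionsParser, so it only takes effect when the job's main class runs through ToolRunner. A minimal driver sketch (the class name GenDataDriver is illustrative, not from the original post):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class GenDataDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        // getConf() already reflects any -libjars / -D generic options
        // that GenericOptionsParser stripped from the command line.
        Job job = new Job(getConf(), "gendata"); // Job.getInstance(...) on Hadoop 2.x
        job.setJarByClass(GenDataDriver.class);
        // ... configure mapper, reducer, input/output formats and paths here ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner invokes GenericOptionsParser, which is what makes -libjars work.
        System.exit(ToolRunner.run(new Configuration(), new GenDataDriver(), args));
    }
}

It would then be invoked as hadoop jar GenData.jar GenDataDriver -libjars /path/jar1,/path/jar2 <args>; generic options go after the class name and before the application arguments.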
You are missing the avro-mapred dependency.
I have a problem with Java libraries. I'm using javax.mail and mysql-connector.
Compiling works fine, but if I try to execute the program with sudo:
$ sudo java Server -jar mysql-connector-java-5.1.28.jar
It gives me this error:
java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:259)
at dbConnect.dbQuery(dbConnect.java:192)
at Server.main(Server.java:39)
while if I try to execute it without sudo:
$ java Server -jar mysql-connector-java-5.1.28.jar
It gives me this error:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/mail/internet/AddressException
at dbConnect.registration(dbConnect.java:161)
at dbConnect.splitUsrPass(dbConnect.java:87)
at dbConnect.dbQuery(dbConnect.java:196)
at Server.main(Server.java:39)
Caused by: java.lang.ClassNotFoundException: javax.mail.internet.AddressException
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 4 more
I can understand why it doesn't work without administrator privileges (I guess the mail library requires them), but it seems strange to me that it can't find the MySQL library when I do give it administrator privileges.
Does anyone know if this is a known issue?
When you want to run something with Java on the command line, you have two options:
Run a class: java MyClass
Run a jar: java -jar MyJar.jar
What you are trying to do is both at once; everything after the class name is passed to Server as program arguments, so the -jar part is effectively ignored.
I'm guessing that what you want is to run the class Server with the MySQL jar on the classpath. Note that JVM options such as -cp must come before the class name, so the correct command is:
java -cp .:mysql-connector-java-5.1.28.jar Server
You could also add the directory containing the MySQL jar to the CLASSPATH environment variable. The reason you get different error messages with and without sudo is probably that the root user has a different CLASSPATH from your user (your user's classpath lacks the javax.mail jar, while root's has it).
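For example (the jar paths are placeholders for wherever the jars live on your machine):
export CLASSPATH=.:/path/to/mysql-connector-java-5.1.28.jar:/path/to/javax.mail.jar
java Server
Keep in mind that sudo normally resets environment variables, so an exported CLASSPATH will only survive under sudo -E or with a matching entry in root's environment.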
I am trying to run the DiscardServer example of Netty, a Maven-built project which I have downloaded as source code. I'm using a GNU/Linux command-line terminal and trying to follow the instructions in the user guide: http://netty.io/wiki/user-guide-for-4.x.html
I'm assuming I should run DiscardServer from the directory example/src/main/java, but when I move to that directory and type:
$ java -cp "~/norbert/netty-master/all/target/netty-all-5.0.0.Alpha1-SNAPSHOT.jar:~/norbert/netty-master/all/target/netty-all-5.0.0.Alpha1-SNAPSHOT-sources.jar" io.netty.example.discard.DiscardServer
the response is "Error: Could not find or load main class io.netty.example.discard.DiscardServer"
I know the class can be run from a source directory somehow, because
$ java io.netty.example.discard.DiscardServer
produces
Exception in thread "main" java.lang.NoClassDefFoundError: io/netty/channel/EventLoopGroup
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2442)
at java.lang.Class.getMethod0(Class.java:2685)
at java.lang.Class.getMethod(Class.java:1620)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:492)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:484)
Caused by: java.lang.ClassNotFoundException: io.netty.channel.EventLoopGroup
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 6 more
Having additionally downloaded the binary distribution netty-4.0.13.Final, I was able to start the discard server from the netty-4.0.13.Final/jar directory as follows:
$ java -cp "netty-example-4.0.13.Final.jar:netty-transport-4.0.13.Final.jar:netty-common-4.0.13.Final.jar:netty-buffer-4.0.13.Final.jar" io.netty.example.discard.DiscardServer
How can I run the DiscardServer from directory example/src/main/java or the relevant source directory, though? Thanks, any help would be appreciated.
Inside the example folder, you should run:
mvn install
then
mvn exec:java -Dexec.mainClass="io.netty.example.discard.DiscardServer"
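mvn exec:java works because it runs the class with the example module's compile classpath resolved from the POM, which is exactly what the hand-built -cp attempt was missing. Note also that in the failing command the ~ sits inside double quotes, where the shell does not expand it; writing the paths with $HOME (or leaving ~ unquoted) lets them resolve, e.g.:
java -cp "$HOME/norbert/netty-master/all/target/netty-all-5.0.0.Alpha1-SNAPSHOT.jar" io.netty.example.discard.DiscardServer
(whether that jar actually contains the example classes can be checked with jar tf).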