#!/bin/bash
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_60.jdk
CLASSPATH=/Users/sunny/CronTest/out/production/CronTest
$JAVA_HOME/Contents/Home/bin/java -cp $CLASSPATH ".:/Users/sunny/Downloads/sqlite-jdbc-3.8.11.2.jar" sample.Main
exit 0
The SQLite jar file is at /Users/sunny/Downloads/sqlite-jdbc-3.8.11.2.jar.
The compiled Java class file is at /Users/sunny/CronTest/out/production/CronTest/sample/Main.class.
I've set the cron job to run every minute. The Java class is getting executed, but I am getting java.lang.ClassNotFoundException: org.sqlite.JDBC.
The same command that is in the script works when run from the terminal.
My question is: how can I add the jar file to the classpath properly so it is picked up when run from the shell script?
It seems there is a space between your $CLASSPATH and ".:/Users/sunny/Downloads/sqlite-jdbc-3.8.11.2.jar", and the colon (:) joining them is missing.
Please try the following with export:
export CLASSPATH=.:/Users/sunny/CronTest/out/production/CronTest:/Users/sunny/Downloads/sqlite-jdbc-3.8.11.2.jar
and call the java command with -cp "$CLASSPATH".
If it still does not work, try passing -cp directly:
$JAVA_HOME/Contents/Home/bin/java -cp ".:/Users/sunny/CronTest/out/production/CronTest:/Users/sunny/Downloads/sqlite-jdbc-3.8.11.2.jar" sample.Main
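Putting that together, a corrected version of the cron script might look like this (a sketch using the paths from your question; adjust them to your machine):
#!/bin/bash
# Use absolute paths, since cron runs with a minimal environment
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_60.jdk
CLASSPATH=.:/Users/sunny/CronTest/out/production/CronTest:/Users/sunny/Downloads/sqlite-jdbc-3.8.11.2.jar
# The whole classpath goes in a single -cp argument, entries separated by colons
"$JAVA_HOME/Contents/Home/bin/java" -cp "$CLASSPATH" sample.Main
exit 0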
I am trying to run two Java applications one after the other in my Docker container.
In my Dockerfile I have specified invoker.sh as the entry point.
ENTRYPOINT ["sh", "/opt/invoker.sh"]
Then I use this script to run the two jar files:
#!/bin/sh
java -jar loader.jar
java -jar service.jar
But this does not work. It gives
Error: Unable to access jarfile javaimpl-loader.jar
and only service.jar is executed. When I tried echo $(ls), it showed that both jar files are there.
But if I change the script to
#!/bin/sh
echo $(java -jar loader.jar)
java -jar service.jar
then both jars work. Why can't I use the first script? Any help regarding this is highly appreciated.
It appears the first script is being treated as a single line; you could work with that. Also, I would prefer bash to /bin/sh. For example:
#!/usr/bin/env bash
java -jar loader.jar && java -jar service.jar
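For reference, a fuller invoker.sh along the same lines, assuming both jars live in /opt (that location is an assumption, not taken from the question):
#!/usr/bin/env bash
set -e                   # abort immediately if the loader fails
cd /opt                  # assumed directory containing both jars
java -jar loader.jar     # run the loader to completion first
java -jar service.jar    # then start the service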
I am following the documentation found at this link:
https://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html#Usage
When I try to compile WordCount.java and create a jar, I get the following error:
bin/hadoop com.sun.tools.javac.Main WordCount.java
Error: Could not find or load main class com.sun.tools.javac.Main
I have verified my $JAVA_HOME and $HADOOP_CLASSPATH in the hadoop-env.sh file and also verified that I have the JDK installed.
Here are the relevant contents of hadoop-env.sh:
export JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_111.jdk/Contents/Home/"
.......
.........
for f in $HADOOP_HOME/contrib/capacity-scheduler/*.jar; do
  if [ "$HADOOP_CLASSPATH" ]; then
    export HADOOP_CLASSPATH="$JAVA_HOME/lib/tools.jar"
  else
    export HADOOP_CLASSPATH=$f
  fi
I am not sure of the reason behind this error, or whether I am missing another key configuration.
Exporting tools.jar inside that loop doesn't make sense... nor does checking for the existence of the variable first:
if [ "$HADOOP_CLASSPATH" ]; then
export HADOOP_CLASSPATH="$JAVA_HOME/lib/tools.jar"
else
You need to set HADOOP_CLASSPATH="$JAVA_HOME/lib/tools.jar" on its own, as the documentation says, for that class to be found. And that class is only available in the JDK, not the JRE.
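A minimal sketch of the relevant part of hadoop-env.sh, following the tutorial's own export (the JAVA_HOME path is the one from your question):
# hadoop-env.sh
export JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_111.jdk/Contents/Home"
# tools.jar ships only with the JDK and contains com.sun.tools.javac.Main
export HADOOP_CLASSPATH="$JAVA_HOME/lib/tools.jar"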
But you could also just run the javac command to compile the code; it's not clear why the docs have you calling that class directly.
How to compile a Hadoop program:
$ javac -classpath ${HADOOP_CLASSPATH} -d WordCount/ WordCount.java
To create the jar:
$ jar -cvf WordCount.jar -C WordCount/ .
To run:
$ hadoop jar WordCount.jar WordCount input/ output
Suggestion: please use Maven/Gradle to create proper JAR files, and an IDE to write the code.
P.S. Not many people actually write plain MapReduce anymore.
I'm running my HBase program using:
java -classpath run.jar com.mycompany.app.HBaseImporter test2 /home/rahulko/Downloads/my-app/xaa
I have specified HADOOP_CLASSPATH in hadoop-env.sh like this:
for f in $HADOOP_HOME/contrib/capacity-scheduler/*.jar; do
  if [ "$HADOOP_CLASSPATH" ]; then
    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$f
  else
    export HADOOP_CLASSPATH=$f
  fi
done
export HADOOP_CLASSPATH="$HBASE_HOME/lib/hbase-client-0.98.18-hadoop2.jar:\
$HBASE_HOME/lib/hbase-common-0.98.18-hadoop2.jar:\
$HBASE_HOME/lib/protobuf-java-2.5.0.jar:\
$HBASE_HOME/lib/guava-12.0.1.jar:\
$HBASE_HOME/lib/zookeeper-3.4.6.jar:\
$HBASE_HOME/lib/hbase-protocol-0.98.18-hadoop2.jar"
I have also specified this in my .bashrc:
export CLASSPATH=$CLASSPATH:/usr/local/hbase1/lib/*:/usr/local/hadoop/share/hadoop/common/*
But I'm still getting java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MasterNotRunningException.
However, when I run the program from Eclipse it runs successfully.
Solved it using:
java -cp "run.jar:/usr/local/hbase1/lib/*" com.mycompany.app.HBaseImporter test2 /home/rahulko/Downloads/my-app/xaa
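For context, HADOOP_CLASSPATH is only read by the hadoop launcher script; a plain java command ignores it, which is why the -cp wildcard above fixes the problem. An equivalent invocation that also pulls in the Hadoop common jars (directories taken from the .bashrc line above) would be something like:
java -cp "run.jar:/usr/local/hbase1/lib/*:/usr/local/hadoop/share/hadoop/common/*" \
  com.mycompany.app.HBaseImporter test2 /home/rahulko/Downloads/my-app/xaa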
Make a directory such as libs, put your dependency jars into it, and run this command:
java -Djava.ext.dirs=/xxx/xxx/libs -jar /xxx/xxx/your-program.jar
That way the dependency jars are picked up automatically. (Note that -Djava.ext.dirs relies on the extension mechanism, which was removed in Java 9, so this only works on Java 8 and earlier.)
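On Java 9 and later, where the extension mechanism no longer exists, the portable alternative is a classpath wildcard; a sketch, with com.example.Main standing in for your actual main class (with -jar, the -cp option is ignored, so the main class has to be named explicitly):
java -cp "/xxx/xxx/libs/*:/xxx/xxx/your-program.jar" com.example.Main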
I am trying to learn MapReduce from the official documentation. To make a jar file for the WordCount class, the documentation says to run the following command:
javac -classpath ${HADOOP_HOME}/hadoop-${HADOOP_VERSION}-core.jar -d wordcount_classes WordCount.java
But I found that my Hadoop directory has no such core jar present. I suppose my Hadoop installation is fine, as I can execute the Hadoop shell script from the bin folder.
Try it with this:
javac -classpath `hadoop classpath` -d wordcount_classes WordCount.java
It isn't best practice, I think, but it works for me.
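Since hadoop classpath just prints the jars that the hadoop launcher itself uses, the same approach extends to packaging and running the job; a sketch, where the main class argument must match whatever package is declared in WordCount.java (WordCount here assumes the default package) and input/ and output/ are placeholder HDFS paths:
mkdir -p wordcount_classes
javac -classpath "$(hadoop classpath)" -d wordcount_classes WordCount.java
jar -cvf wordcount.jar -C wordcount_classes/ .
hadoop jar wordcount.jar WordCount input/ output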
Check your hadoop-1.2.1 folder (as in my case), which you unzipped in the "Prepare to Start Cluster" step of the single-node setup. There you should find hadoop-1.2.1-core.jar.
That is the file used for compiling here.
I'm trying to compile a simple WordCount.java MapReduce example on a Linux (CentOS) installation of Cloudera 4. I keep hitting compiler errors when I reference any of the Hadoop classes, but I can't figure out which of the hundreds of jars under /usr/lib/hadoop I need to add to my classpath to get things to compile. Any help would be greatly appreciated! What I'd like most is a Java file for word count (just in case the one I found is bad for some reason), along with the associated commands to compile and run it.
I am trying to do this using just javac rather than Eclipse. My main issue either way is which Hadoop libraries from the Cloudera 4 install I need to include in order to get the classic WordCount example to compile. Basically, I need to put the Java MapReduce API classes (Mapper, Reducer, etc.) on my classpath.
I have a script that builds my Hadoop classes. Try:
#!/bin/bash
# Strip the .java extension to get the program name
program=`echo $1 | awk -F "." '{print $1}'`
# Create the output directory for the compiled classes if it does not exist
if [ ! -d "${program}_classes" ]
then mkdir ${program}_classes/;
fi
# Compile against the CDH4 common and mapreduce-client-core jars
javac -classpath /usr/lib/hadoop/hadoop-common-2.0.0-cdh4.0.1.jar:/usr/lib/hadoop/client/hadoop-mapreduce-client-core-2.0.0-cdh4.0.1.jar \
  -d ${program}_classes/ $1
# Package the compiled classes into a jar
jar -cvf ${program}.jar -C ${program}_classes/ .;
You were probably missing the key jars:
/usr/lib/hadoop/hadoop-common-2.0.0-cdh4.0.1.jar
and
/usr/lib/hadoop/client/hadoop-mapreduce-client-core-2.0.0-cdh4.0.1.jar
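Usage would then be along these lines (the script name build.sh, the assumption that WordCount is in the default package, and the input/ and output/ paths are all placeholders):
./build.sh WordCount.java
hadoop jar WordCount.jar WordCount input/ output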
If you are running the Cloudera CDH4 Virtual Machine, then the following should get you running:
javac -classpath /usr/lib/hadoop/hadoop-common-2.0.0-cdh4.0.0.jar:/usr/lib/hadoop/client/hadoop-mapreduce-client-core-2.0.0-cdh4.0.0.jar -d wordcount_classes WordCount.java
Or you can export the environment variables:
export JAVA_HOME=/usr/java/default
export PATH=${JAVA_HOME}/bin:${PATH}
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar
and use the commands below:
$ bin/hadoop com.sun.tools.javac.Main WordCount.java
$ jar cf wc.jar WordCount*.class
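To run the resulting jar (the input/ and output/ paths below are placeholders for your HDFS directories):
$ bin/hadoop jar wc.jar WordCount input/ output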
If you are using Eclipse, please do add the Hadoop packages; you may get them from java2s or any similar site. I couldn't say more without knowing anything about what you have done so far.