I have MyClass.java, which defines the map-reduce task: it contains the mapper, the reducer, and main. It works properly, but if I try to use/add an external jar, I get a ClassNotFoundException.
To compile I use the command:
javac -classpath hadoop_library_path:my_library_path -sourcepath code_path/ -d class_path/ path/MyClass.java
I create the jar, and then I run the task:
hadoop jar maclass.jar MyClass input output -target target
Does the external jar also need to be added to the "hadoop jar" command?
I tried the -libjars option with no result. Any ideas?
As I commented, I see two options (there could be more):
Use Eclipse and generate a runnable jar (I am not sure about NetBeans or IntelliJ).
Use Maven and its shade plugin to generate an uber jar. You should add all the external libraries you use as dependencies.
I recommend the latter option.
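For the shade route, the pom.xml addition is the standard plugin snippet below; all compile-scope dependencies then end up inside the job jar (coordinates are the stock ones, nothing project-specific):

```xml
<!-- pom.xml: bundle all compile-scope dependencies into one uber jar -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

After mvn package, the shaded jar in target/ can be passed straight to hadoop jar without -libjars.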
I was going through the spring-boot-maven-plugin documentation and came across the term "auto-executable jar".
Could someone please explain what an auto-executable jar is, how it differs from normal jar files, and how it is auto-executed?
The spring-boot-maven-plugin documentation mentions the term but does not explain it further:
repackage: create a jar or war file that is auto-executable. It can replace the regular artifact or can be attached to the build lifecycle with a separate classifier.
Could someone please explain what an auto-executable jar is
A fully executable jar can be executed like any other executable
binary or it can be registered with init.d or systemd. This makes it
very easy to install and manage Spring Boot applications in common
production environments.
So, in conclusion, when you use an executable jar it behaves like any other executable.
how is it different from normal jar files and how is it auto-executed?
Well, a normal jar file you need to run with java -jar, while a fully executable jar can be run directly.
From Spring Docs
The Maven build of a Spring Boot application first builds your own application and packs it into a JAR file.
In the second stage (repackage) it wraps that jar, together with all the jar files from the dependency tree, into a new wrapper jar archive. It also generates a manifest file that defines the application's main class (also in the wrapper jar).
After mvn package you can see two jar files in your target directory: the original file and the wrapped jar file.
You can start a Springboot application with a simple command like:
java -jar my-springboot-app.jar
I would suggest that auto-executable means that you supplied a main method, so the jar can be launched with java -jar; otherwise it may be just a Java library.
Here is a quote from https://docs.spring.io/spring-boot/docs/current/maven-plugin/repackage-mojo.html
Repackages existing JAR and WAR archives so that they can be executed from the command line using java -jar. With layout=NONE can also be used simply to package a JAR with nested dependencies (and no main class, so not executable).
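For reference, the repackage goal is typically bound to the build in pom.xml with the standard snippet below (nothing here is project-specific):

```xml
<!-- pom.xml: run repackage during package to produce the executable jar -->
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>repackage</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```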
Executable jar - one that has a main class declared in the manifest and can be run with the java -jar yourJarFile.jar command.
Other jars - jars without a declared main class. Can be anything - an application, a library, etc. You can still run an application by providing a fully.qualified.class.name as the entry point, like java -cp yourJarFile.jar my.bootstrap.BootstrapClass.
Auto-executable jars - never heard of them :)
I have a jar file that is created by Spring Boot. The application runs smoothly when started with java -jar. I want to create an InstallAnywhere launcher with this jar file.
What I have tried is passing the Spring Boot main class (PropertiesLauncher) to the launcher. The issue is that calling it like this won't load the nested jars inside my executable jar, and the loader.path doesn't seem to work either.
Is there a way to call the executable jar like java -jar does from the InstallAnywhere launcher?
I was thinking that another option would be to create an InstallAnywhere launcher for a script file that contains the java -jar call. So another question would be:
How do I create an InstallAnywhere launcher for a script file?
'execute command' step will do the trick:
Use this command line:
java -jar <path.to.jar.file>
Use EXECUTE_STDOUT, EXECUTE_STDERR and EXECUTE_EXITCODE built-in variables to catch errors and parse the jar's execution result.
Important notes:
You'll have to make sure your jar includes all of the dependencies (or at least set the classpath in the command line);
To include the dependencies within your jar using eclipse you can:
Export your project as a 'runnable jar file' and select the
'Extract/Package required libraries into generated JAR' option/s
Use Maven to build the project with dependencies; the
maven-assembly-plugin is required.
The 'execute command' will work for batch/cmd/shell scripts as well, but you'll have to make sure the scripts are extracted to a local folder such as %TEMP% or /tmp before you can use them.
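If you go the script route, the wrapper can be tiny. A sketch of generating one (the script name and jar path are assumptions, not from the question):

```shell
# generate a minimal wrapper script next to the jar
cat > run-app.sh <<'EOF'
#!/bin/sh
# resolve the jar relative to this script, forward all arguments,
# and propagate the application's exit code to the caller
exec java -jar "$(dirname "$0")/my-springboot-app.jar" "$@"
EOF
chmod +x run-app.sh
```

The exec keeps the exit code intact, so EXECUTE_EXITCODE sees the application's real result rather than the shell's.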
Good luck!
I am very new to gradle, so I need help.
I have a simple bat file with below code.
set cp=%CLASSPATH%;.
set cp=%cp%;.\abc.jar
set cp=%cp%;.\xyz.jar
javac -d classes -cp "%cp%" com\myproject\convert\plugins\plugin1\MyPlugin1.java
jar -cvf CustomPlugin1.jar -C classes .
Now I want to convert this bat file into a build.gradle file.
Like plugin1, I have 10 plugins in total to build, but I want to build a separate jar for each plugin, and every plugin consists of a single Java file. So I guess I have to define 10 different tasks in Gradle, as I want the ability to build any single plugin whenever required.
So how can I achieve this in an easy way?
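One way to sketch this in build.gradle (Groovy DSL): generate a compile/jar task pair per plugin in a loop instead of writing 10 tasks by hand. The paths and jar names below mirror the bat file; everything else is an assumption about your layout.

```groovy
// build.gradle sketch: one compile + jar task pair per plugin,
// generated in a loop so any single plugin can be built on demand
plugins { id 'java' }

(1..10).each { n ->
    def compile = tasks.register("compilePlugin${n}", JavaCompile) {
        // source layout taken from the bat file (javac run from the project root)
        source = fileTree('.') {
            include "com/myproject/convert/plugins/plugin${n}/*.java"
        }
        classpath = files('abc.jar', 'xyz.jar')
        destinationDirectory = file("classes/plugin${n}")
    }
    tasks.register("jarPlugin${n}", Jar) {
        dependsOn compile
        archiveFileName = "CustomPlugin${n}.jar"
        from "classes/plugin${n}"
    }
}
```

Running gradle jarPlugin1 then builds only CustomPlugin1.jar, and gradle jarPlugin7 only CustomPlugin7.jar, which gives you the per-plugin builds you asked for without duplicating the task definitions.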
I need to be able to reproduce the following Eclipse process:
Export
Runnable JAR file with option "Extract required libraries into generated JAR"
This results in a JAR that includes all the referenced libraries and they have been compiled.
However, when I use this command on Linux:
javac -cp lib/lib1.jar:lib/lib2.jar -d newJAR src/Main.java
I get a JAR that includes just the class files of my own code.
I need to find a command that could produce the same output as with Eclipse, on the Linux command-line.
What you are asking for here is non-trivial: you want to package all dependencies into the JAR and also have the classpath set up so that you can run a class from the new JAR.
Two plugins I have used in the past to do this are the maven-assembly-plugin and the One-JAR plugin.
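With the assembly plugin, the predefined jar-with-dependencies descriptor does roughly what the Eclipse "Extract required libraries" export does. A standard pom.xml snippet (the main class name is an assumption):

```xml
<!-- pom.xml: build a fat jar with the predefined jar-with-dependencies
     descriptor; mainClass below is a placeholder for your own class -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <mainClass>Main</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

mvn package then produces a *-jar-with-dependencies.jar in target/ that runs with plain java -jar.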
I have Cloudera Hadoop version 4 installed on my cluster.
It comes packaged with the Google Protocol Buffers jar, version 2.4.
In my application code I use protobuf classes compiled with protobuf version 2.5.
This causes unresolved compilation problems at run time.
Is there a way to run the map-reduce jobs with an external jar, or am I stuck until Cloudera upgrades their service?
Thanks.
Yes you can run MR jobs with external jars.
Be sure to add any dependencies to both the HADOOP_CLASSPATH and -libjars upon submitting a job like in the following examples:
You can use the following to add all the jar dependencies from current and lib directories:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:`echo *.jar`:`echo lib/*.jar | sed 's/ /:/g'`
Bear in mind that when starting a job through hadoop jar you'll need to also pass it the jars of any dependencies through use of -libjars. I like to use:
hadoop jar <jar> <class> -libjars `echo ./lib/*.jar | sed 's/ /,/g'` [args...]
NOTE: The sed commands use different delimiter characters; HADOOP_CLASSPATH is :-separated, while -libjars needs to be ,-separated.
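A quick way to see the two forms side by side (throwaway lib/ directory with made-up jar names, just for the demo):

```shell
# create two dummy jars so the glob has something to expand to
mkdir -p lib && touch lib/a.jar lib/b.jar
echo lib/*.jar | sed 's/ /:/g'   # -> lib/a.jar:lib/b.jar  (HADOOP_CLASSPATH form)
echo lib/*.jar | sed 's/ /,/g'   # -> lib/a.jar,lib/b.jar  (-libjars form)
```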
EDIT: If you need your classpath to be interpreted first to ensure your jar (and not the pre-packaged jar) is the one that gets used, you can set the following:
export HADOOP_USER_CLASSPATH_FIRST=true