I've set up a cluster running Hadoop 2.1 beta on 64-bit Linux. However, each time I run the hadoop command-line tools, a warning message pops up:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for
your platform...
using builtin-java classes where applicable
Then I found out that it is missing the native library for 64-bit Linux. The official Hadoop 2.1 tarball only provides the native library for 32-bit Linux in the /lib/native folder.
I read the official documentation for the Hadoop native library; the guide says:
Once you installed the prerequisite packages use the standard hadoop
build.xml file and pass along the compile.native flag (set to true) to
build the native hadoop library:
$ant -Dcompile.native=true <target>
I searched the Hadoop folder, but there is no file named build.xml. I don't have much knowledge of Java programming or Hadoop, so how can I compile the native library for a 64-bit Linux system? Thanks.
The build system has changed to Maven. You can find the build instructions here:
https://svn.apache.org/repos/asf/hadoop/common/trunk/BUILDING.txt
Specifically, you can run this:
mvn package -Pdist,native,docs -DskipTests -Dtar
(once you've installed protobuf 2.5.0)
Download and install protobuf
wget -P /tmp http://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
cd /tmp
tar -xvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
sudo make install
sudo ldconfig
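Once installed, it's worth confirming that the protoc on your PATH reports the version Hadoop expects before starting a long Maven build. A minimal sketch (the helper name check_protoc_version is mine, not part of Hadoop or protobuf):

```shell
# check_protoc_version takes the output of `protoc --version`
# (e.g. "libprotoc 2.5.0") and compares it against the expected version
check_protoc_version() {
  ver=$(echo "$1" | awk '{print $2}')
  if [ "$ver" = "2.5.0" ]; then
    echo "OK"
  else
    echo "MISMATCH: ${ver:-none}"
  fi
}

# Typical use after installing: check_protoc_version "$(protoc --version)"
check_protoc_version "libprotoc 2.5.0"
```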
Install cmake
sudo apt-get install cmake
Build native libraries using maven
mvn compile -Pnative
If there are any errors, run Maven with the -e and -X switches, which output detailed debugging information. Look at what the error is and make suitable changes.
For example, I got the following errors:
[ERROR] Could not find goal 'protoc' in plugin org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT among available goals -> [Help 1]
This means you have an incorrect protobuf version.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/tmp/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
cmake is not installed.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs) on project hadoop-annotations: MavenReportException: Error while creating archive:
[ERROR] Exit code: 1 - /tmp/hadoop-2.5.0-src/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/InterfaceStability.java:27: error: unexpected end tag: </ul>
[ERROR] * </ul>
I don't know what the issue is here; just skip the javadoc generation by passing the -Dmaven.javadoc.skip=true flag to Maven.
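Putting these flags together, a full build line might look like the following (a sketch; drop or add profiles to match your needs):

```shell
# Native build: dist layout, skip tests, build the tarball, skip javadoc
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
```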
Related
I have a Java program that I wrote. The main pieces are OpenJDK 8, Maven, and JavaFX. The program builds and runs on its own, but when I try to put it in a Docker container, I'm having trouble getting it to build.
Here is my Dockerfile:
FROM openjdk:8-jdk
FROM maven:3.3-jdk-8-onbuild
RUN apt-get update && apt-get install -y --no-install-recommends openjfx && rm -rf /var/lib/apt/lists/*
CMD ["java","-jar","target/CodeDemo-1.0-SNAPSHOT.jar"]
Here is what I ran to build the container:
sudo docker build -t java-maven-code-demo .
Here is the error I keep getting, complaining that there is no javafxpackager:
Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec
(unpack-dependencies) on project CodeDemo: Command execution failed.
Cannot run program
"/usr/lib/jvm/java-8-openjdk-amd64/jre/../bin/javafxpackager" (in
directory "/usr/src/app"): error=2, No such file or directory -> [Help
1]
I have all the files in a CodeDemo directory. At the top level, I have src, target, Dockerfile, pom.xml. In target, I have the compiled jar.
I'm confused by the error because I thought Java 8 OpenJDK came with JavaFX. So if I'm pulling OpenJDK, I should be getting the things I need for JavaFX (there is a similar question on GitHub, but its solution still gave the error).
Can anyone point me in the direction of what I could be doing wrong? Is there something else I should be doing to get the proper libraries?
You have multiple FROM lines. Based on your Dockerfile, only the last base image (maven) will be used. Maybe try installing openjdk through another RUN statement before installing openjfx?
I also don't see any COPY statement in your Dockerfile. I initially assumed CodeDemo-1.0-SNAPSHOT.jar existed by default on the maven image, but I just tried building the image and it doesn't. If that's the jar file from your Java program, don't forget to add it through a COPY statement in your Dockerfile.
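A minimal sketch of what that could look like, assuming the jar is already built in target/ on the host (the base image tag and paths are assumptions to adapt):

```dockerfile
# Single base image that already contains JDK 8 and Maven
FROM maven:3.3-jdk-8
# JavaFX is packaged separately from OpenJDK on Debian-based images
RUN apt-get update && apt-get install -y --no-install-recommends openjfx \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /usr/src/app
# Copy the prebuilt jar into the image (filename from the question)
COPY target/CodeDemo-1.0-SNAPSHOT.jar target/CodeDemo-1.0-SNAPSHOT.jar
CMD ["java", "-jar", "target/CodeDemo-1.0-SNAPSHOT.jar"]
```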
I am trying to compile Hadoop 3.1 from its source code.
Once inside the Docker container, I am building it with Maven, following the instructions included in BUILDING.txt in Hadoop's source tree.
While Apache Hadoop Common ............................... FAILURE [ 0.458 s] is building, I get the following error regarding protoc --version:
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.1.0:protoc (compile-protoc) on project hadoop-common:org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.1.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version
However, the protoc command is working on my installation:
$ protoc
Missing input file.
$ protoc --version
libprotoc 2.5.0
The same thing happens with protoc version 3.0.
Has anyone experienced something similar?
I didn't find an exact solution, so here is a workaround.
Hadoop's source code comes with ./start-build-env.sh. Running the script launches a Docker container in which Hadoop can be built with Maven.
All the necessary libraries exist in the container, but they are only on the user's $PATH, not root's $PATH. That's a problem because it is handy to build the package with sudo, and it is the reason protoc --version could not return anything.
Every container launched by ./start-build-env.sh seems to be temporary, and you can't edit root's $PATH via ~/.bashrc. A workaround is to launch the container through ./start-build-env.sh and, from another terminal window, log into it with the command sudo docker exec -it <container> bash. Now you can install a simple editor (like nano or vi, through apt-get) and edit the default $PATH in visudo.
Then the command sudo mvn package -Pdist... inside the container won't get stuck on the problem I described above.
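The steps above can be sketched as the following two-terminal session (the container id is whatever docker ps reports; the editor choice and the exact secure_path value are assumptions to adapt):

```shell
# Terminal 1: launch the build container that ships with the Hadoop source
./start-build-env.sh

# Terminal 2: attach to the same container as root
sudo docker ps                 # note the container id
sudo docker exec -it <container> bash

# Inside the container: install an editor, then fix root's PATH for sudo
apt-get update && apt-get install -y nano
visudo                         # edit the "Defaults secure_path" line so it
                               # includes the directory containing protoc
```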
I am compiling a proto file using the command line mentioned below.
protoc -I ./ --python_out=. --grpc_out=. --plugin=protoc-gen-grpc=`which grpc_python_plugin` ./test.proto
The above command generates Python-based code and works fine.
Then I tried to issue the same command for Java-based code, but I get errors.
protoc -I ./ --java_out=. --grpc_out=. --plugin=protoc-gen-grpc=`which grpc_java_plugin` ./test.proto
I looked at my plugin directory and found that the following plugins are installed, but not the Java one.
grpc_csharp_plugin
grpc_cpp_plugin
grpc_objective_c_plugin
grpc_node_plugin
grpc_python_plugin
grpc_ruby_plugin
How can I find the Java plugin?
The plugin is hosted on Maven Central: http://repo1.maven.org/maven2/io/grpc/protoc-gen-grpc-java/
While there isn't a protobuf plugin for Java (it is built into protoc), there is a plugin for the gRPC generated stubs. Normally this is provided for you by default via a Gradle plugin, but you can manually download and use the plugin. Currently there are Linux, OS X, and Windows builds.
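A sketch of the manual route, mirroring the Python command from the question (the version 1.0.0 is an assumption; pick any version and platform classifier actually listed in the Maven Central directory above):

```shell
# Download the grpc-java codegen plugin binary from Maven Central
wget -O grpc_java_plugin \
  http://repo1.maven.org/maven2/io/grpc/protoc-gen-grpc-java/1.0.0/protoc-gen-grpc-java-1.0.0-linux-x86_64.exe
chmod +x grpc_java_plugin

# Use it exactly like the python plugin: --java_out for messages,
# --grpc_out for the gRPC stubs generated by the plugin
protoc -I ./ --java_out=. --grpc_out=. \
  --plugin=protoc-gen-grpc=./grpc_java_plugin ./test.proto
```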
I am trying to install OpenCV on OpenShift, following the post mentioned here:
http://codingexodus.blogspot.in/2013/04/how-to-install-opencv-on-openshift.html
I am developing in Java, so I have installed JDK 1.8 into $OPENSHIFT_DATA_DIR.
I had to install Apache Ant as well, so I did the same again in $OPENSHIFT_DATA_DIR.
My JAVA_HOME is set to the JDK, and ANT_HOME is also pointing to the right place:
JAVA_HOME="$OPENSHIFT_DATA_DIR/jdk1.8.0_05"
In the end I run cmake to generate the makefile with the command:
cmake -D CMAKE_BUILD_TYPE=RELEASE -D BUILD_SHARED_LIBS=OFF ..
Going through the cmake output, I do see that for Java it has found ant, JNI, etc. Finally, when I execute
make
make install
I get the error:
make: *** No targets specified and no makefile found. Stop.
There is no makefile generated. What am I missing?
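That make error usually means the cmake step itself failed before a Makefile was written. One way to make the failure visible is to gate make on cmake's exit status (a sketch, assuming an out-of-source build directory under the OpenCV source tree):

```shell
# Run cmake from a separate build directory and only call make if it succeeds
mkdir -p build && cd build
if cmake -D CMAKE_BUILD_TYPE=RELEASE -D BUILD_SHARED_LIBS=OFF .. ; then
  make && make install
else
  echo "cmake failed; scroll up for the first error (often a missing dependency)" >&2
fi
```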
I am trying to get a new android-maven project up and running in NetBeans 7.1.1. I keep getting this error:
Failed to execute goal com.jayway.maven.plugins.android.generation2:android-maven-plugin:3.1.1:dex (default-dex) on project myproject: MojoExecutionException: ANDROID-040-001: Could not execute: Command = cmd.exe /X /C ""C:\Program Files\Java\jdk1.7.0_03\jre\bin\java" -Xmx1024M -jar "C:\Program Files\Android\android-sdk\platform-tools\lib\dx.jar" --dex "--output=C:\Documents and Settings\Administrator\My Documents\NetBeansProjects\MyProject\target\classes.dex" "C:\Documents and Settings\Administrator\My Documents\NetBeansProjects\MyProject\target\classes"", Result = 1 -> [Help 1]
This is the pom file: http://pastebin.com/k1ZzfEYY
No. You don't have to downgrade from the Java 7 SDK to the Java 6 SDK to use Maven with Android.
The problem is that after you made some updates with the SDK Manager, something got corrupted and you no longer have the dx.jar file in your android-sdk\platform-tools\lib folder; when you try to build with Maven, your project doesn't have that jar on the path and the build fails.
To fix this, open the SDK Manager, delete the "Android SDK Platform-tools" that you currently have installed, and reinstall it. After that, go to the android-sdk\platform-tools\lib folder, check that the dx.jar file is there, and try to build your project with mvn clean install android:deploy.
I've tested this with the sample helloflashlight application, and it worked correctly after I reinstalled the platform tools.
Also make sure that the environment variables JAVA_HOME, ANDROID_HOME, and M2_HOME point to the right paths.
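As a quick sanity check for that last point, something like this prints each variable or flags it as unset (check_env is a hypothetical helper, not a standard tool):

```shell
# check_env prints NAME=value for each named variable, or a warning if unset
check_env() {
  for v in "$@"; do
    eval "val=\${$v:-}"
    if [ -n "$val" ]; then
      echo "$v=$val"
    else
      echo "$v is NOT set"
    fi
  done
}

check_env JAVA_HOME ANDROID_HOME M2_HOME
```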