I am trying to compile Hadoop 3.1 from its source code.
Once inside the Docker container, I am building it with Maven, following the instructions included in BUILDING.txt in Hadoop's source tree.
While Apache Hadoop Common ............................... FAILURE [ 0.458 s] is being built, I get the following error regarding protoc --version:
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.1.0:protoc (compile-protoc) on project hadoop-common:org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.1.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version
However, the protoc command is working on my installation:
$ protoc
Missing input file.
$ protoc --version
libprotoc 2.5.0
The same thing happens with protoc version 3.0.
Has anyone experienced something similar?
Since I could not find an exact solution, here is a workaround.
Hadoop's source code comes with ./start-build-env.sh. Running that script launches a Docker container in which Hadoop can be built with Maven.
All the necessary libraries exist in the container, but they are only on the build user's $PATH, not root's $PATH. That is a problem because it is handy to use sudo to build the package, and it is the reason protoc --version did not return anything.
Every container launched by ./start-build-env.sh seems to be temporary, and you can't edit root's $PATH via ~/.bashrc. A workaround is to launch the container through ./start-build-env.sh and, from another terminal window, log into it with sudo docker exec -it <container> bash. You can then install a simple editor (such as nano or vi via apt-get) and edit the default $PATH that sudo uses (the secure_path setting) in visudo.
After that, running sudo mvn package -Pdist... inside the container no longer gets stuck on the problem described above.
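For reference, here is a rough sketch of that workaround as shell commands (the editor choice and the directory that contains protoc are assumptions; check where protoc actually lives in your image):
sudo docker ps                              # find the name/ID of the build container
sudo docker exec -it <container> bash       # open a root shell in it
apt-get update && apt-get install -y nano   # any simple editor will do
visudo                                      # append the directory holding protoc to the "Defaults secure_path=..." line
                                            # (find that directory as the normal build user with: which protoc)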
Related
I have a Java program that I wrote; the main pieces are OpenJDK 8, Maven, and JavaFX. The program builds and runs on its own. I want to put it in a Docker container, but I'm having trouble getting it to build.
Here is my Dockerfile:
FROM openjdk:8-jdk
FROM maven:3.3-jdk-8-onbuild
RUN apt-get update && apt-get install -y --no-install-recommends openjfx && rm -rf /var/lib/apt/lists/*
CMD ["java","-jar","target/"CodeDemo-1.0-SNAPSHOT.jar"]
Here is what I ran to build the container:
sudo docker build -t java-maven-code-demo .
Here is the error I keep getting, complaining that javafxpackager is missing:
Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec
(unpack-dependencies) on project CodeDemo: Command execution failed.
Cannot run program
"/usr/lib/jvm/java-8-openjdk-amd64/jre/../bin/javafxpackager" (in
directory "/usr/src/app"): error=2, No such file or directory -> [Help
1]
I have all the files in a CodeDemo directory. At the top level, I have src, target, Dockerfile, pom.xml. In target, I have the compiled jar.
I'm confused by the error because I thought Java 8 OpenJDK came with JavaFX. So, if I'm pulling OpenJDK, I should be getting the things I need for JavaFX (there is a similar question on GitHub, but its solution still gave the error).
Can anyone point me in the direction of what I could be doing wrong? Is there something else I should be doing to get the proper libraries?
You have multiple FROM lines. Based on your Dockerfile, only the maven base image will be used. Maybe try installing OpenJDK through another RUN statement before installing openjfx?
I also don't see any COPY statement in your Dockerfile. I initially assumed CodeDemo-1.0-SNAPSHOT.jar existed by default in the maven image, but I just tried building the image and it doesn't. If that's the jar file from your Java program, don't forget to add it through a COPY statement in your Dockerfile.
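For illustration, a single-FROM Dockerfile along those lines might look like the sketch below (untested; it assumes the non-onbuild maven:3.3-jdk-8 tag, that installing openjfx actually provides javafxpackager on that base image, and that the jar is pre-built on the host as in the question):
FROM maven:3.3-jdk-8
# JavaFX bits for the OpenJDK base image
RUN apt-get update \
    && apt-get install -y --no-install-recommends openjfx \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /usr/src/app
# copy the jar that was already built on the host
COPY target/CodeDemo-1.0-SNAPSHOT.jar CodeDemo-1.0-SNAPSHOT.jar
CMD ["java", "-jar", "CodeDemo-1.0-SNAPSHOT.jar"]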
I am using Jenkins, with a GitHub repo as the source code.
In the Build section, I am executing this as a Windows Batch command:
set path=%path%;C:\Program Files\Java\jdk1.8.0_144\bin
cd \Users\harr\JenkinsServer\JenkinsTest\src
javac SimpleTest.java
REM the error occurs after the line above executes
java SimpleTest
I know it has something to do with the classpath, but I am unsure how to solve this problem in Jenkins.
Let me know if more information would be helpful.
Suppose you deploy the Jenkins server on a Linux platform; then you have to install the JDK, Tomcat, and so on, and set the environment PATH as well. After that you don't have to execute set path before every build.
You can also create a script, copy the commands into it, and have Jenkins execute that script when it performs the build task; a sketch follows below. Refer to the Jenkins tutorial to learn more about it.
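For example, the batch commands from the question could be collected into a script along these lines (a sketch only; the JDK path and directories are taken from the question, and the explicit -cp . is an assumption about the classpath issue):
REM build.bat - called by Jenkins from an "Execute Windows batch command" step
set PATH=%PATH%;C:\Program Files\Java\jdk1.8.0_144\bin
cd \Users\harr\JenkinsServer\JenkinsTest\src
javac -cp . SimpleTest.java
java -cp . SimpleTest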
I inherited Android Java code at my company, without Gradle files etc., and I want to be able to compile it on my dev server (I program from a ChromeOS machine, hence a CLI SSH connection to a server where I do dev work). I found some Docker images like this one (which doesn't even have a working command-line example), but I haven't managed to create an APK yet. What am I missing, and how would you do this?
You have three steps to do:
Migrate your project to Gradle.
It isn't too difficult, since there are plenty of Gradle projects out there that you can follow, or you can just read the "Migrating to Gradle" article; a minimal build.gradle sketch also follows after these steps.
Build the project with Gradle on your local machine.
If you migrated properly, you can build your project from the terminal like this:
./gradlew assembleDebug
but it might also be assembleDevDebug or assembleProdRelease, depending on your buildType and flavor in Gradle. Check which assemble tasks are available by running:
./gradlew tasks
Build the project using Docker.
Based on the image you linked:
docker run -t -i -v /path/to/project:/workspace -w /workspace ksoichiro/android /bin/sh -c "./gradlew assembleDebug"
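As a starting point for step 1, a minimal app-module build.gradle might look like this sketch (plugin, SDK, and build-tools versions are illustrative, and the applicationId is hypothetical; adjust them to your project):
// build.gradle - minimal Android application module
buildscript {
    repositories { jcenter() }
    dependencies { classpath 'com.android.tools.build:gradle:2.3.3' }
}

apply plugin: 'com.android.application'

repositories { jcenter() }

android {
    compileSdkVersion 25
    buildToolsVersion '25.0.3'
    defaultConfig {
        applicationId 'com.example.inherited'   // hypothetical package name
        minSdkVersion 16
        targetSdkVersion 25
        versionCode 1
        versionName '1.0'
    }
}
You would also generate the Gradle wrapper (gradle wrapper) so that ./gradlew is available for steps 2 and 3.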
I am re-setting up Jenkins 1.5888 on our Mac OS X box. I have googled this problem extensively and have come up with the following steps.
I upgraded all plugins as requested.
From Configure System, I set up the Ant plugin to automatically download from the Apache site. I have called this installation Default.
I added an Invoke Ant step to my build and selected Default as my Ant installation.
I ran the build. Here is the part of the output that is causing my frustration:
[participant-test] $ ant -file build.xml clean emma debug install test
FATAL: command execution failed. Maybe you need to configure the job to choose one of your Ant installations?
java.io.IOException: Cannot run program "ant" (in directory "/Users/bob/.jenkins/jobs/participant/workspace/participant-test"): error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at hudson.Proc$LocalProc.<init>(Proc.java:244)
at hudson.Proc$LocalProc.<init>(Proc.java:216)
at hudson.Launcher$LocalLauncher.launch(Launcher.java:803)
at hudson.Launcher$ProcStarter.start(Launcher.java:381)
at hudson.Launcher$ProcStarter.join(Launcher.java:388)
at hudson.tasks.Ant.perform(Ant.java:217)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:770)
at hudson.model.Build$BuildExecution.build(Build.java:199)
at hudson.model.Build$BuildExecution.doRun(Build.java:160)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:533)
at hudson.model.Run.execute(Run.java:1759)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:89)
at hudson.model.Executor.run(Executor.java:240)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:184)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
... 15 more
Build step 'Invoke Ant' marked build as failure
I can run ant from the command line just fine for the same exact build. I've been at this for a day. I've found some similar issues that appear to be for older versions of Jenkins and don't seem to apply.
The exception seems to be caused by a missing Ant home in the job configuration; check whether you have one. Before executing from Jenkins, also check that you are able to invoke ant commands from the command line.
In the end, we just kept trying things until it worked. The automatic install appears not to have been working.
I installed the JDK automatically. I'm not sure whether part of the problem with Ant was that there was no Java configured inside Jenkins.
I then manually downloaded Ant from the Apache site and installed it within the .jenkins directory in the build user's home directory, under tools. I added the ANT_HOME environment variable, added its bin directory to my PATH, and rebooted to make sure everything picked it up.
I pointed the manually configured Ant installation at the home directory above and then pointed my build at this configuration. I believe this combination is what fixed my issue.
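For example, assuming Ant ended up under the build user's ~/.jenkins/tools directory as described above (the exact version directory is an assumption), the environment part might look roughly like:
# added to the build user's shell profile (e.g. ~/.bash_profile)
export ANT_HOME="$HOME/.jenkins/tools/apache-ant-1.9.4"   # hypothetical unpack location
export PATH="$ANT_HOME/bin:$PATH"
The manually configured Ant installation in Jenkins then points at the same ANT_HOME directory.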
In my case I was running an incompatible Java version.
Jenkins requires Java in order to run; however, yum install jenkins does not enforce that Java is already installed. Check that you already have Java installed by running java -version. To further make things difficult for CentOS users, the default CentOS version of Java is not compatible with Jenkins. Jenkins typically works best with a Sun implementation of Java, which is not included in CentOS for licensing reasons.
If java -version reports the default (GCJ) implementation, that version will not work with Jenkins; replace it as follows:
Remove old java version: # yum remove java
Install new version : # yum install java-1.7.0-openjdk
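After swapping the packages, it's worth confirming the runtime and restarting Jenkins (the service name jenkins is assumed from the standard yum package):
# confirm the new JDK is now the default
java -version
# restart Jenkins so it picks up the new runtime
sudo service jenkins restart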
I've set up a cluster running Hadoop 2.1 beta on 64-bit Linux. However, each time I run a hadoop command-line tool, a warning message pops up:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Then I found out that it lacks the native library for 64-bit Linux; the official Hadoop 2.1 tarball only provides the native library for 32-bit Linux in the /lib/native folder.
I read the official documentation for the Hadoop native library, and the guide says:
Once you installed the prerequisite packages use the standard hadoop
build.xml file and pass along the compile.native flag (set to true) to
build the native hadoop library:
$ant -Dcompile.native=true <target>
I searched the Hadoop folder, and there is no file named build.xml. I don't have enough knowledge of Java programming and Hadoop, so I want to know: how can I compile the native library for a 64-bit Linux system? Thanks.
The build system has changed to Maven. You can find the instructions for building here:
https://svn.apache.org/repos/asf/hadoop/common/trunk/BUILDING.txt
Specifically, you can run this:
mvn package -Pdist,native,docs -DskipTests -Dtar
(once you've installed protobuf 2.5.0)
Download and install protobuf
wget -P /tmp http://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
cd /tmp
tar -xvf protobuf-2.5.0.tar.gz
cd /tmp/protobuf-2.5.0
./configure
make
sudo make install
sudo ldconfig
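A quick check before rerunning Maven, since the Hadoop build plugin probes exactly this command:
# should print: libprotoc 2.5.0
protoc --version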
Install cmake
sudo apt-get install cmake
Build native libraries using maven
mvn compile -Pnative
If there are any errors, run Maven with the -e -X switches, which will output detailed debugging information. Look at what the error is and make suitable changes.
For example I got the following errors:
[ERROR] Could not find goal 'protoc' in plugin org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT among available goals -> [Help 1]
This means you have an incorrect protobuf version.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/tmp/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
This means cmake is not installed.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs) on project hadoop-annotations: MavenReportException: Error while creating archive:
[ERROR] Exit code: 1 - /tmp/hadoop-2.5.0-src/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/InterfaceStability.java:27: error: unexpected end tag: </ul>
[ERROR] * </ul>
I don't know what the issue is there; just skip the javadoc generation by passing the -Dmaven.javadoc.skip=true flag to Maven.
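Putting the flags together, the full distribution build from BUILDING.txt with javadoc generation skipped would look like this (dropping the docs profile since javadoc is skipped anyway):
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true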