I installed Hadoop by following a YouTube video.
When I try to run this command in the terminal, it gives me this error:
$ /usr/local/hadoop-2.6.0/bin/hadoop version
/usr/local/hadoop-2.6.0/bin/hadoop: line 144: /usr/iib/java/jdk1.8.0_31/bin//bin/java: No such file or directory
I have Java in /usr/lib/java/jdk1.8.0_31/bin/, but I don't know why the error shows a different path.
I have set up the path in hadoop-env.sh like this:
JAVA_HOME=/usr/iib/java/jdk1.8.0_31/bin/
export JAVA_HOME=${JAVA_HOME}
I did the same in .bashrc:
export HADOOP_HOME=/usr/local/hadoop-2.6.0/bin/hadoop
JAVA_HOME=/usr/iib/java/jdk1.8.0_31/bin
export JAVA_HOME
PATH=$PATH:$JAVA_HOME
export PATH
Please help me. If you need any extra info, please ask.
Your JAVA_HOME should point to /usr/iib/java/jdk1.8.0_31 and not the bin folder under it.
Also, the directory is usually lib, not iib, so check for a typo in the path.
The PATH should include $JAVA_HOME/bin, though.
You need to change JAVA_HOME from:
JAVA_HOME=/usr/iib/java/jdk1.8.0_31/bin/
TO
JAVA_HOME=/usr/iib/java/jdk1.8.0_31
i.e. remove the bin directory from the JAVA_HOME variable.
And change PATH from:
PATH=$PATH:$JAVA_HOME
TO
PATH=$PATH:$JAVA_HOME/bin
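As a quick sanity check, a JDK root is the directory that has bin/java directly under it; with a trailing /bin in JAVA_HOME, scripts end up looking for bin/bin/java. A small sketch (the JDK layout is simulated in a temp directory, standing in for the real /usr/lib/java/jdk1.8.0_31):

```shell
# Simulate a JDK layout in a temp directory (stand-in for the real JDK install)
jdk="$(mktemp -d)/jdk1.8.0_31"
mkdir -p "$jdk/bin"
printf '#!/bin/sh\n' > "$jdk/bin/java" && chmod +x "$jdk/bin/java"

# Hadoop's launcher effectively runs $JAVA_HOME/bin/java, so:
check_java_home() { [ -x "$1/bin/java" ]; }

check_java_home "$jdk" && echo "JDK root: java found"
check_java_home "$jdk/bin" || echo "JDK root + /bin: looks for bin/bin/java, not found"
```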
OK, I see your problem: either the JDK is not installed, or your JDK path is wrong.
You can test it now. All of the following goes in ~/.bashrc.
Test your JDK location: whereis java
Find your path and set it (note lib, not iib):
export JAVA_HOME=/usr/lib/java/jdk1.8.0_31
export JRE_HOME=/usr/lib/java/jdk1.8.0_31
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
Once that is done, set your Hadoop path the same way.
I have installed the JDK on my Mac, ran /usr/libexec/java_home, and found the path to Java to be: /Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk/Contents/Home
I added this line to my ~/.bashrc file:
export PATH=$PATH:/Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk/Contents/Home
I still get this error message:
java: command not found
Can anyone help? I have been trying Stack Overflow solutions for hours now.
Thanks!
While adding the "bin" folder to your PATH is sufficient to run java itself, it will leave several standard Java tools (like Maven, Ant, sbt, Scala, and Groovy) unable to find your JDK, since they rely on JAVA_HOME. Instead, first set JAVA_HOME, and then add it with "bin" to your PATH. Like:
export JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk/Contents/Home"
export PATH="$PATH:$JAVA_HOME/bin"
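A runnable sketch of how both lookups then succeed (using a stub java in a temp directory in place of the real .../jdk-9.0.4.jdk/Contents/Home, so it can be tried anywhere):

```shell
# Stub JDK home standing in for the real Contents/Home directory
home="$(mktemp -d)"
mkdir -p "$home/bin"
printf '#!/bin/sh\necho 9.0.4\n' > "$home/bin/java"
chmod +x "$home/bin/java"

export JAVA_HOME="$home"
export PATH="$PATH:$JAVA_HOME/bin"

# Tools that honor JAVA_HOME resolve java like this:
"$JAVA_HOME/bin/java"     # prints: 9.0.4
```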
You have set your PATH to the wrong directory. The java binary is inside a bin folder, and that is what you have to append to your PATH. The correct command would be:
export PATH=$PATH:/Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk/Contents/Home/bin
I am working on Ubuntu 16.04. I need to install Gradle. Gradle shows as installed when I check with the sudo apt list --installed command, but when I run gradle -version I get the following error:
JAVA_HOME is set to an invalid directory: /usr/lib/jvm/java-8-oracle/jre/bin/java
My /etc/environment file (opened with sudo vim /etc/environment) contains:
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64/"
http_proxy="http://username:password#IP:port no/"
https_proxy="https://IP:port no/"
ftp_proxy="ftp://IP:port no/"
I don't know where I made a mistake. Please help me.
Thanks.
On a 64-bit openSUSE 42.1 box,
readlink -f $(which java)
returned:
/usr/lib64/jvm/java-1.8.0-openjdk-1.8.0/jre/bin/java
But
export JAVA_HOME=/usr/lib64/jvm/jre-1.8.0-openjdk
is the path that worked and allowed Java to run.
So I think we have to manually browse the file system and see which path to choose.
Today I faced this problem. I am using the default Java that comes with the Linux distro (in my case, Linux Mint).
$ whereis java
This command gave me
java: /usr/bin/java /usr/share/java
So I opened /usr/bin. There was a link to Java; I right-clicked it and selected "follow original link". This led me to /usr/lib/jvm/java-11-openjdk-amd64/bin/java.
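The same link-following can be done in a terminal with readlink -f. A sketch (the symlink chain below is rebuilt inside a temp directory, since the real /usr/bin/java chain varies per machine):

```shell
# Rebuild a chain like /usr/bin/java -> /etc/alternatives/java -> real binary
tmp="$(cd "$(mktemp -d)" && pwd -P)"      # canonical temp dir
mkdir -p "$tmp/jvm/java-11-openjdk-amd64/bin"
touch "$tmp/jvm/java-11-openjdk-amd64/bin/java"
ln -s "$tmp/jvm/java-11-openjdk-amd64/bin/java" "$tmp/alternatives-java"
ln -s "$tmp/alternatives-java" "$tmp/usr-bin-java"

readlink -f "$tmp/usr-bin-java"           # the fully resolved path to java
# JAVA_HOME is then two directories above the resolved binary:
java_home="$(dirname "$(dirname "$(readlink -f "$tmp/usr-bin-java")")")"
echo "$java_home"
```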
Now that I knew where this Java was, I opened my .bashrc file and edited JAVA_HOME.
For my case:
## My Custom variables
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
This solved the problem.
Now, if you are using some other Java (say, one you downloaded from Oracle and extracted from the archive), then you have to point at that location instead. For example, if your Java is in /home/user/.sdkman/candidates/java/current, then:
export JAVA_HOME=/home/user/.sdkman/candidates/java/current
export PATH=$JAVA_HOME/bin:$PATH
I see a mismatch. In your environment file, JAVA_HOME is set to "/usr/lib/jvm/java-8-openjdk-amd64/", but you mentioned that the error refers to JAVA_HOME as "/usr/lib/jvm/java-8-oracle/jre/bin/java".
If your Java is really installed in the /usr/lib/jvm/java-8-oracle directory, then you need to ensure that JAVA_HOME is set to that directory, and also that your PATH includes $JAVA_HOME/bin.
I typically install the Oracle JDK/JRE in its own directory, such as /usr/local/jdk1.8.0.
Check the JVM installation folder in your file manager,
e.g. /usr/lib/jvm/java-12-oracle
Then, in a terminal, run sudo nano /etc/environment and add the line:
JAVA_HOME="/usr/lib/jvm/java-12-oracle"
Then open a terminal and run:
export JAVA_HOME="/usr/lib/jvm/java-12-oracle"
I installed a Hadoop 2.7.0 single-node cluster on Ubuntu 15.04, following the commands from here, and nearly everything went fine until the command for formatting HDFS.
When I enter
$ hdfs namenode -format
I get the error: Could not find or load main class Djava.librarary.path=.usr.local.hadoop.lib
In .bashrc I have:
#hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-8-oracle/lib/amd64
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
In hadoop-env.sh I have
export JAVA_HOME="/usr/lib/jvm/java-8-oracle"
What do I do to get rid of the error?
I'm assuming something's wrong with the Java path, but I can't figure out what...
Can anybody help?...
Thanks in advance.
Set only the following variables in .bashrc. Remove all the other variables you have set, and then try:
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
Let me know if you still have any problems.
Yes, it looks like you are getting a Java error. You need to configure JAVA_HOME properly; the JAVA_HOME in .bashrc and in hadoop-env.sh should be the same.
$java -version
What is the location of your bin/java and bin/javac?
I had the same issue. The problem was that I was setting the JAVA_HOME env variable in the hadoop-env.sh file.
I believe the '.'-separated path appears when you set the JAVA_HOME variable using the following command:
set JAVA_HOME=/my/path/to/java
Instead of this you should simply have
JAVA_HOME=/my/path/to/java
And everything will start up correctly.
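A small demonstration of the pitfall: in bash/sh, `set` assigns positional parameters, not variables. (The sketch first unsets JAVA_HOME so it starts from a clean state; the path is the placeholder from above.)

```shell
unset JAVA_HOME
set JAVA_HOME=/my/path/to/java     # NOT an assignment: the string becomes $1
echo "${JAVA_HOME:-unset}"         # prints: unset

JAVA_HOME=/my/path/to/java         # a plain assignment is what actually works
echo "$JAVA_HOME"                  # prints: /my/path/to/java
```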
I am a beginner with Hadoop, trying to install and run it on my Ubuntu machine as a single-node cluster. This is my JAVA_HOME in my hadoop_env.sh:
# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386/
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop"}
But when I run it, the following errors appear:
Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
How do I remove this error?
I debugged the code and found that even though JAVA_HOME is set in the environment, the value is lost as SSH connections to other hosts are made inside the code: the JAVA_HOME variable that showed as properly set in start-dfs.sh was unset by the time hadoop-env.sh ran.
The solution is to set the JAVA_HOME variable in hadoop-env.sh, and it should work properly.
I had the same error and solved it with Soil Jain's remark, but to make it even a bit clearer: hadoop-env.sh uses an expression such as
export JAVA_HOME=${JAVA_HOME}
If you hard-code the path to your JVM installation, it works:
export JAVA_HOME=/usr/lib/jvm/java...
Resolution through the environment variable, as shipped, seems to fail; hard-coding fixed the problem for me.
Under your HADOOP_HOME/conf directory, please update the hadoop-env.sh file. It has an entry to export JAVA_HOME.
Setting the appropriate JAVA_HOME in this file should solve your issue.
Are you loading hadoop_env.sh? You may mean hadoop-env.sh (a dash instead of an underscore; it is under the conf directory).
By the way, this is a very useful guide for a quick installation:
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
First, you must set JAVA_HOME in your hadoop_env.sh (your local JAVA_HOME in .bashrc is likely to be ignored).
# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/default-java
Then, set HADOOP_CONF_DIR to point to the directory containing your hadoop_env.sh. In ~/.bashrc, add the following lines:
HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
export HADOOP_CONF_DIR
where /usr/local/hadoop/etc/hadoop is the directory that contains hadoop_env.sh.
I'm using hadoop 2.8.0. Even though I exported JAVA_HOME (I put it in .bashrc), I still caught this error while trying to run start-dfs.sh.
user@host:/opt/hadoop-2.8.0 $ echo $JAVA_HOME
<path_to_java>
user@host:/opt/hadoop-2.8.0 $ $JAVA_HOME/bin/java -version
java version "1.8.0_65"
...
user@host:/opt/hadoop-2.8.0 $ sbin/start-dfs.sh
...
Starting namenodes on []
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.
The only way I could get it to run was to add JAVA_HOME=path_to_java to etc/hadoop/hadoop-env.sh and then source it:
user@host:/opt/hadoop-2.8.0 $ grep JAVA_HOME etc/hadoop/hadoop-env.sh
#export JAVA_HOME=${JAVA_HOME}
export JAVA_HOME=path_to_java
user@host:/opt/hadoop-2.8.0 $ source etc/hadoop/hadoop-env.sh
Maybe that (sourcing hadoop-env.sh) was implied in the posts above. Just thought someone should say it out loud. Now it runs. I've encountered other issues (due, I suspect, to the limited resources on the server I'm using), but at least I got past this one.
The above answers should work as long as you are using the default conf directory, $HADOOP_HOME/conf or $HADOOP_HOME/etc/hadoop. Here are a few things you should do if you're using a different conf folder.
Copy the hadoop-env.sh file from the default conf directory to your conf folder, say /home/abc/hadoopConf.
Replace the line
#export JAVA_HOME=${JAVA_HOME}
with the following:
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export HADOOP_CONF_DIR=/home/abc/hadoopConf
Change the values appropriately. If you have any other environment variables related to hadoop configured in your .bashrc or .profile or .bash_profile consider adding them next to the above lines.
The problem is the space in "Program Files": some scripts cannot handle a path with a space in it.
So I copied the JDK folder to C:\ (or any folder whose name contains no spaces) and set export JAVA_HOME=Name_Path_Copied. I saw it run OK.
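A sketch of why the space matters (simulated with a temp directory named "Program Files"; unquoted variable expansions split on the space):

```shell
tmp="$(mktemp -d)"
mkdir -p "$tmp/Program Files/jdk/bin"
JAVA_HOME="$tmp/Program Files/jdk"

# Unquoted: the shell passes two arguments, ".../Program" and "Files/jdk/bin"
ls $JAVA_HOME/bin 2>/dev/null || echo "unquoted: split into two arguments, fails"
# Quoted: the path survives as a single argument
ls "$JAVA_HOME/bin" >/dev/null && echo "quoted: works"
```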
I have made a jar file which I tested on Windows, and it works fine. Now I want to test it on Red Hat Enterprise Linux 5, but I don't know how to run jar files in RHEL 5.
I've tried java -jar My.jar, but it says bash: java: command not found. I've set the JAVA_HOME variable with export JAVA_HOME=/root/jdk1.6.0_21, but it is still not working.
Can anybody tell me how to run a jar file in RHEL 5?
You need to set the PATH variable, something like:
export PATH=$PATH:/usr/java/jdk1.5.0_07/bin
Replace /usr/java/jdk1.5.0_07/bin with the path to your JDK's bin directory.
The problem is that your terminal tries to find the java command through PATH, but cannot find it there.
Update:
You need to set up the global configuration in the /etc/profile OR /etc/bash.bashrc file for all users:
# vi /etc/profile
Next, set up the PATH variable as follows:
export PATH=$PATH:/usr/java/jdk1.5.0_07/bin
Top tip but slightly off topic.
1) Install your JDK in /usr/local/jdkX.X.X_XX/
2) Create a symbolic link /usr/local/java -> your chosen JDK installation
When you install new versions of java or if you want to revert to an older version, just change the symbolic link.
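A runnable sketch of this scheme (using a temp directory in place of /usr/local, with made-up version directories, so no sudo is needed to try it):

```shell
root="$(mktemp -d)"
mkdir -p "$root/jdk1.6.0_21/bin" "$root/jdk1.8.0_31/bin"

# Step 2: a stable symlink that JAVA_HOME and PATH can point at
ln -s "$root/jdk1.6.0_21" "$root/java"
JAVA_HOME="$root/java"

readlink "$root/java"                    # currently points at jdk1.6.0_21
# Upgrading (or reverting) means repointing the link only:
ln -sfn "$root/jdk1.8.0_31" "$root/java"
readlink "$root/java"                    # now points at jdk1.8.0_31; JAVA_HOME unchanged
```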