I am following this tutorial to set up Hadoop on my Windows 10 machine. I successfully followed the tutorial up to page 5, and after setting everything up I tried to check the Hadoop version on the command line. But it returns the following error:
D:\hadoop-2.6.0\bin>hadoop version
Error: Could not find or load main class Azfar
However, if I run the following command instead, it somehow works:
D:\hadoop-2.6.0\bin>hadoop
Usage: hadoop [--config confdir] COMMAND
       where COMMAND is one of:
  fs                   run a generic filesystem user client
  version              print the version
  jar <jar>            run a jar file
  checknative [-a|-h]  check native hadoop and compression libraries availability
  distcp <srcurl> <desturl>  copy file or directories recursively
  archive -archiveName NAME -p <parent path> <src>* <dest>  create a hadoop archive
  classpath            prints the class path needed to get the
                       Hadoop jar and the required libraries
  credential           interact with credential providers
  key                  manage keys via the KeyProvider
  daemonlog            get/set the log level for each daemon
 or
  CLASSNAME            run the class named CLASSNAME
Most commands print help when invoked w/o parameters.
I think this is happening because my username "Azfar Faizan" contains a space, but I didn't use any path that includes my user folder while setting things up. Can anybody point me to where exactly the problem is, or am I doing this completely wrong?
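(For illustration only: the lines below are not the actual contents of hadoop.cmd, and the variable name is made up. They just show how an unquoted expansion of a value containing a space makes java treat the stray token as the main class, which matches the error above.)
rem illustrative cmd session, not the real hadoop.cmd logic
set IDENT=Azfar Faizan
java -Xmx512m %IDENT% org.apache.hadoop.util.VersionInfo
rem java parses "Azfar" as the class to run and prints:
rem Error: Could not find or load main class Azfar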
I am trying to build a customized Stanford NER model; the training data and properties file are ready.
But when I try to run the following command:
java -cp "stanford-ner.jar:lib/*" -mx4g edu.stanford.nlp.ie.crf.CRFClassifier -prop download.txt
This error pops up:
Error: Could not find or load main class
edu.stanford.nlp.ie.crf.CRFClassifier
Steps followed:
Downloaded and extracted the stanford-ner-2018-10-16.zip file.
Java 8 is installed and $JAVA_HOME has been set.
The properties file (download.txt) has been placed in the folder where stanford-ner-2018-10-16.zip was extracted.
If you are seeing errors like that, it means your CLASSPATH is not properly configured.
You need to run that command in the same folder as the NER download, or it won't find the needed jars. That command should be run in whatever directory contains stanford-ner.jar and lib. Alternatively, you can set the CLASSPATH environment variable and remove the -cp option from the command.
More info on Java CLASSPATH here: https://docs.oracle.com/javase/tutorial/essential/environment/paths.html
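For example, a sketch only (the extraction path below is an assumption; adjust it to wherever you unzipped the download):
# run from the directory that contains stanford-ner.jar and lib/
cd ~/stanford-ner-2018-10-16
java -cp "stanford-ner.jar:lib/*" -mx4g edu.stanford.nlp.ie.crf.CRFClassifier -prop download.txt
# or set CLASSPATH once and drop the -cp option
export CLASSPATH="$HOME/stanford-ner-2018-10-16/stanford-ner.jar:$HOME/stanford-ner-2018-10-16/lib/*"
java -mx4g edu.stanford.nlp.ie.crf.CRFClassifier -prop download.txt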
I'm trying to install and start Hadoop 2.7.1 on my computer (Windows 10) from the command line, and I have followed steps from different websites. I have configured the system variables and Hadoop (edited some files in the etc folder: hadoop-env.cmd, core-site.xml, mapred-site.xml, yarn-site.xml, hdfs-site.xml) and downloaded a new bin folder. I'm currently trying to start Hadoop, and I have executed the command hdfs namenode -format successfully.
However, when I point the command prompt to the sbin folder and try to execute start-dfs.cmd, I get an error message saying: The system cannot find the file hadoop. Does anyone have an idea of what I should do or what I have done wrong?
Set the HADOOP_HOME and Path variables in your environment variables,
change the file name from hadoop to hadoop.cmd in the bin folder,
then run start-all in cmd and check whether it works.
Also check the JAVA_HOME path in your environment variables; a sketch of these settings is below.
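(A minimal sketch of those variables for the current cmd session, assuming Hadoop was extracted to C:\hadoop-2.7.1 and the JDK to C:\Java\jdk1.8.0 — both paths are assumptions; permanent values belong in System Properties > Environment Variables.)
rem assumed install locations; adjust to your machine
set HADOOP_HOME=C:\hadoop-2.7.1
set JAVA_HOME=C:\Java\jdk1.8.0
set PATH=%PATH%;%HADOOP_HOME%\bin;%HADOOP_HOME%\sbin;%JAVA_HOME%\bin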
This is old, but for others starting out, this is how I fixed the problem. I will assume you have followed:
https://github.com/MuhammadBilalYar/Hadoop-On-Window/wiki/Step-by-step-Hadoop-2.8.0-installation-on-Window-10
and are having trouble.
Open start-all.cmd in 'C:\hadoop-2.8.0\sbin' in a text editor like Notepad++.
Replace line 24 with 'set HADOOP_BIN_PATH=C:\hadoop-2.8.0\bin'.
In this file, note the calls to 'hadoop-config.cmd', 'start-dfs.cmd', and 'start-yarn.cmd'. Open these in a text editor.
Replace the Hadoop path in each of them as in step 2: set HADOOP_BIN_PATH=C:\hadoop-2.8.0\bin
Save the files and re-run start-all.cmd.
Hope this helps.
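(For context, a hedged sketch of what that edit might look like inside those scripts; the surrounding lines are illustrative rather than a verbatim copy, since the stock scripts derive HADOOP_BIN_PATH from their own location.)
@rem in start-all.cmd / start-dfs.cmd / start-yarn.cmd / hadoop-config.cmd
@rem original (roughly): set HADOOP_BIN_PATH=%~dp0
set HADOOP_BIN_PATH=C:\hadoop-2.8.0\bin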
I am able to get localhost:16010 running, but somehow the HBase shell does not launch when I use:
01HW993798:bin tcssig$ cd /Users/tcssig/Downloads/hbase-1.0.3/bin
01HW993798:bin tcssig$ hbase shell
-bash: hbase: command not found
When I launch the HBase Unix executable directly, it generates the error log below.
Error: JAVA_HOME is not set
This is even though I have set it. It is only after doing this that localhost:16010 starts running.
NOTE: I know there is a similar question, but there are no relevant answers there.
Using this I am able to invoke the command, but now it gives this error:
./hbase: line 403: /Users/tcssig/Downloads/hbase-1.0.3/bin/JAVA_HOME:/Library/Java/JavaVirtualMachines/jdk1.8.0_101.jdk/Contents/Home/bin/java: No such file or directory
although the java binary is there.
Your hbase invocation should be like this:
cd /Users/tcssig/Downloads/hbase-1.0.3/bin
./hbase shell [Note the ./]
When you just type hbase shell, Linux searches for the hbase executable in the directories listed in the PATH environment variable. Since the bin directory above is not included, the command is not found.
Alternatively, you can update your PATH variable. Depending on your Linux distribution and shell the exact file may vary, but the command should be something like:
export PATH=/Users/tcssig/Downloads/hbase-1.0.3/bin:$PATH
Put this command in your .bashrc or .bash_profile and then source that file. That way the bin directory is included in PATH and the hbase command is available from any directory, as in the sketch below.
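(A minimal sketch, using the path from the question; the startup file on your system may differ.)
# append the HBase bin directory to PATH and reload the shell configuration
echo 'export PATH=/Users/tcssig/Downloads/hbase-1.0.3/bin:$PATH' >> ~/.bash_profile
source ~/.bash_profile
hbase shell   # now resolves without the ./ prefix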
Go into the $HBASE_HOME/bin directory and try:
./hbase shell
I'm trying to run the wordcount topology on Apache Storm via the command line in Ubuntu; it uses the multilang feature to split sentences into words with a program written in Python.
I've set the classpath of the multilang directory in the .bashrc file, but at execution time it still gives this error:
java.lang.RuntimeException: Error when launching multilang subprocess
Caused by: java.io.IOException: Cannot run program "python" (in directory "/tmp/eaf0b6b3-67c1-4f89-b3d8-23edada49b04/supervisor/stormdist/word-count-1-1414559082/resources"): error=2, No such file or directory
I found my answer. I was submitting the jar to Storm, but the topology was using a local cluster, so the classpath did not take effect when the jar was uploaded to Storm. I modified the code to submit to the Storm cluster instead of the local cluster, and it was then uploaded successfully. Along with this, I also set the classpath of the multilang folder in the Eclipse IDE itself instead of putting it in the .bashrc file.
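(A hedged sketch of that change, using the pre-1.0 backtype.storm API that matches this era of Storm; the topology name is a placeholder and the spout/bolt wiring is elided.)
import backtype.storm.Config;
import backtype.storm.StormSubmitter;
import backtype.storm.topology.TopologyBuilder;

public class SubmitWordCount {
    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        // ... wire up the spout and the Python multilang bolt here ...
        Config conf = new Config();
        // before (runs only inside a local in-process cluster):
        // new LocalCluster().submitTopology("word-count", conf, builder.createTopology());
        // after: submit to the real cluster so the packaged jar (with multilang/resources) is used
        StormSubmitter.submitTopology("word-count", conf, builder.createTopology());
    }
}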
The Python installed on the system usually has a default location such as /usr/bin or /usr/local/bin, and Python modules may live in different paths.
Do not fully override the $PATH environment variable in .bashrc.
Alternatively, you can set the execute bit on the Python script you want to run and call it as a normal program from Storm.
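(A sketch of both suggestions; the directory layout follows the usual storm-starter wordcount and the script name is only an example.)
# confirm where python actually lives on the supervisor machine
which python                          # e.g. /usr/bin/python
# extend PATH in ~/.bashrc instead of replacing it
export PATH="$PATH:/extra/tools/bin"  # /extra/tools/bin is only an illustrative addition
# or make the multilang script directly executable
chmod +x multilang/resources/splitsentence.py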
My site is hosted on a shared server, so I don't have su access. I needed to run a piece of code with Java, but it's not available on the server, so I got a self-extracting version of Java and put it in my home directory on the server. Then I gave executable permissions to java and tried running the code. I have to use relative paths when running the file because of the server's restrictions.
Trying to run the jar with ../java/bin/java -jar 'javafile.jar' gives me the following:
error while loading shared libraries: libjli.so: cannot open shared object file: No such file or directory
I looked, and libjli.so is located at ../java/lib/i386/jli/libjli.so. So I'm thinking that because I'm running java using a relative path, it doesn't know how to find the other files. I'm hoping that if I can add absolute/path/to/java/bin to $PATH then this issue will be resolved.
So once I'm running my PHP, I can use dirname(__FILE__) to get the full path of my java bin directory. I've tried the following code:
exec('export PATH='.$bin_path.':$PATH', $output, $return);
print_r(array(getenv('PATH'), $output, $return));
Prints:
Array(
[0] => /usr/local/admin/bin:/usr/local/admin/bin/servers:/sbin:/usr/sbin:/usr/local/sbin:/usr/local/bin:/bin:/usr/bin,
[1] => Array(),
[2] => 0
)
So nothing was added to $PATH, no output was given, and the command returned a successful exit value. Is it just a restriction of the server that is preventing me from getting this to work?
Firstly, this is not going to work.
exec('export PATH='.$bin_path.':$PATH', $output, $return);
It will launch a child process with a shell, run the export command in that shell, and then the shell will exit. The export command only changes $PATH for that short-lived shell, so the change is gone by the time exec returns.
I'm not sure, but I suspect that you need to use putenv, something like the sketch below.
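(A hedged sketch, assuming $bin_path holds the absolute path of the uploaded java/bin directory obtained from dirname(__FILE__).)
// prepend the bundled java/bin directory to PATH for this PHP process and its children
putenv('PATH=' . $bin_path . ':' . getenv('PATH'));
exec('java -jar javafile.jar', $output, $return);
print_r(array(getenv('PATH'), $output, $return));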
I'm hoping that if I can add absolute/path/to/java/bin to $PATH then this issue will be resolved.
Well, it could only help if you used a simple command name (just java) to invoke the java command.
And it would be simpler to just run java using the full absolute pathname; e.g. "/absolute/path/to/java/bin/java"
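(Again only a sketch, with $bin_path as the assumed absolute path from dirname(__FILE__).)
// invoke the bundled java by its absolute path instead of relying on PATH
$java = $bin_path . '/java';
exec(escapeshellarg($java) . ' -jar javafile.jar', $output, $return);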