Hadoop with Eclipse is not connecting - java

I am using Ubuntu 12.04 and I am trying to connect to Hadoop from Eclipse. I successfully installed the plugin for Hadoop 1.0.4, and I am using Java 1.7 for this.
My configuration data are:
username: hduser, location name: test, Map/Reduce host/port: localhost:9101, and M/R master host: localhost:9100.
My temp directory is /app/hduser/temp.
Based on this location I set the advanced parameters. However, I was not able to set fs.s3.buffer.dir, as no directory like /app/hadoop/tmp//s3 had been created, and I was unable to set the Map/Reduce master directory; I only found a local directory. I also did not find mapred.jobtracker.persist.job.dir, nor the map/reduce temp dir.
When I ran Hadoop in pseudo-distributed mode, I also did not find any DataNode running with jps.
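A quick way to check which daemons are actually up, and whether the plugin's ports match the conf files (a minimal sketch, assuming a standard Hadoop 1.x layout under $HADOOP_HOME; the log file name is illustrative):
# a healthy pseudo-distributed node should list NameNode, DataNode,
# SecondaryNameNode, JobTracker and TaskTracker
jps
# if the DataNode is missing, its log usually says why
tail -n 50 $HADOOP_HOME/logs/hadoop-hduser-datanode-*.log
# the ports entered in the Eclipse plugin must match these two properties
grep -A1 fs.default.name $HADOOP_HOME/conf/core-site.xml
grep -A1 mapred.job.tracker $HADOOP_HOME/conf/mapred-site.xml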
I am not sure what the problem is. In Eclipse I got the following error while setting up the DFS server:
An internal error occurred during: "Connecting to DFS test".
org/apache/commons/configuration/Configuration
Thanks all

I was facing the same issue. Later I found this: Hadoop eclipse mapreduce is not working?
The main blog post is this one. HTH anyone who is looking for a solution.
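For anyone who cannot follow the links: the message org/apache/commons/configuration/Configuration is the tail of a NoClassDefFoundError, which usually means the Eclipse plugin jar does not bundle some jars it needs from $HADOOP_HOME/lib. A hedged sketch of the commonly suggested repackaging fix (jar names and version numbers are what a typical Hadoop 1.0.4 lib folder ships; adjust to whatever is actually in yours):
# unpack the plugin jar (path and jar name are illustrative)
cd /tmp
unzip ~/eclipse/plugins/hadoop-eclipse-plugin-1.0.4.jar -d plugin
# copy the missing jars from Hadoop's lib folder into the plugin's lib folder
cp $HADOOP_HOME/lib/commons-configuration-1.6.jar plugin/lib/
cp $HADOOP_HOME/lib/commons-lang-2.4.jar plugin/lib/
cp $HADOOP_HOME/lib/jackson-core-asl-1.8.8.jar plugin/lib/
cp $HADOOP_HOME/lib/jackson-mapper-asl-1.8.8.jar plugin/lib/
# add the copied jars to the Bundle-ClassPath line in
# plugin/META-INF/MANIFEST.MF, then repack and replace the original jar
cd plugin && zip -r ../hadoop-eclipse-plugin-1.0.4.jar . && cd ..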

Related

Apache Livy : Could not find or load main class org.apache.livy.server.LivyServer

I am trying to start an Apache Livy 0.8.0 server on my Windows 10 machine for Spark 3.1.2 and Hadoop 3.2.1. I am taking help from here. I have successfully built Apache Livy using Maven (I have attached a screenshot of it), but I am not able to run the Livy server. When I run it I get the following error -
> starting C:/AmazonJDK/jdk1.8.0_332/bin/java -cp /d/ApacheLivy/incubator-livy-master/incubator-livy-master/server/target/jars/*:/d/ApacheLivy/incubator-livy-master/incubator-livy-master/conf:D:/Program_files/spark/conf:D:/ApacheHadoop/hadoop-3.2.1/etc/hadoop: org.apache.livy.server.LivyServer, logging to D:/ApacheLivy/incubator-livy-master/incubator-livy-master/logs/livy--server.out
ps: unknown option -- o
Try `ps --help' for more information.
failed to launch C:/AmazonJDK/jdk1.8.0_332/bin/java -cp /d/ApacheLivy/incubator-livy-master/incubator-livy-master/server/target/jars/*:/d/ApacheLivy/incubator-livy-master/incubator-livy-master/conf:D:/Program_files/spark/conf:D:/ApacheHadoop/hadoop-3.2.1/etc/hadoop: org.apache.livy.server.LivyServer:
Error: Could not find or load main class org.apache.livy.server.LivyServer
full log in D:/ApacheLivy/incubator-livy-master/incubator-livy-master/logs/livy--server.out
I am using Git Bash. If you need more information, I will provide it.
The error got resolved when I used Windows Subsystem for Linux (WSL).
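For context: the ps: unknown option -- o line suggests the launch script calls ps with options that Git Bash's ps does not support, which is why a Linux environment helps. A minimal sketch of launching Livy from a WSL shell (all paths are illustrative; /mnt/d maps to the Windows D: drive):
cd /mnt/d/ApacheLivy/incubator-livy-master/incubator-livy-master
export SPARK_HOME=/mnt/d/Program_files/spark
export HADOOP_CONF_DIR=/mnt/d/ApacheHadoop/hadoop-3.2.1/etc/hadoop
./bin/livy-server start
# tail the log if it fails to come up
tail -f logs/livy--server.out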

Jenkins fails to start up with an empty plugins directory

I am new to Jenkins. I upgraded a couple of plugins (I don't remember which), and after that, when I try java -jar jenkins.war, I end up getting the following error:
jenkins.InitReactorRunner$1 onTaskFailed
SEVERE: Failed Loading global config
java.io.IOException: Unable to read /home/.jenkins/config.xml
I went through several links that address this issue, but no luck yet. In this link that I found, https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=764711, it says some plugins are missing, and surprisingly, my /home/.jenkins/plugins/ is empty!
How do I restore the necessary plugins from my command line?
I am using CentOS release 6.8 (Final)
Thank you :)
I had encountered this issue some days back; restarting the Jenkins service solved it for me.
The best you can do now is rename your config file. That way Jenkins will load with the default startup configuration.
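A hedged sketch of both steps from the command line (assuming JENKINS_HOME is /home/.jenkins as in the error message; the plugin name is just an example):
# move the unreadable config aside so Jenkins starts with defaults
mv /home/.jenkins/config.xml /home/.jenkins/config.xml.bak
# re-download any plugins you need into the empty plugins directory
cd /home/.jenkins/plugins
wget https://updates.jenkins.io/download/plugins/git/latest/git.hpi
# start Jenkins again
java -jar jenkins.war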

cannot find org.neo4j.batchimport

I am continuing to have trouble with the import.bat file for the Neo4j batch importer. I started a new thread, as my original problem was resolved.
From the command prompt I run:
import.bat test.db sample\nodes.csv sample\rels.csv
with some variations on the path listing for the files, including absolute paths. I continue to get the following error message:
The system cannot find the path specified.
Error: Could not find or load main class org.neo4j.batchimport.Importer
I also tried running import.sh from Cygwin and in my Debian VM, but I keep getting the same error:
Error: Could not find or load main class org.neo4j.batchimport.Importer
What am I doing wrong?
Please download the zip file, not the GitHub clone.
It is a pre-built binary, as outlined in the README, so it doesn't require you to have Maven installed to build it.
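A minimal sketch of what that looks like (the zip file name is illustrative; use whichever release the README points to):
# download the pre-built release zip linked from the project README, then:
unzip batch_importer.zip -d batch-importer
cd batch-importer
# run the importer from inside the unpacked folder
./import.sh test.db sample/nodes.csv sample/rels.csv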

PHP - Error installing JavaBridge.war

I'm trying to install JavaBridge, as I want to make connections between my PHP files and Java files.
I need my PHP to send a GET request to a Java file; the Java file will get the ID, run some script, and return an array.
I tried to install JavaBridge, but I get errors while trying to access localhost:8080/JavaBridge:
javax.servlet.ServletException: php.java.bridge.http.FCGIConnectException: Could not connect to server
php.java.bridge.http.FCGIConnectException: Could not connect to server
java.io.IOException: File \\.\pipe\C:\apache-tomcat-7.0.39\temp\JavaBridge939398813756155712.socket not writable
java.io.IOException: PHP not found. Please install php-cgi. PHP test command was: [php-cgi, -v]
I didn't show all the errors here; I showed only the first line of each root cause.
I will give more details on my setup.
I have been using XAMPP all along, with PHP and MySQL in XAMPP.
Then I installed Tomcat in order to install JavaBridge.
I copied JavaBridge.war into c:/tomcats/webapps, and copied JavaBridge.jar and php-servlet.jar into c:/tomcats/lib.
I assume the problem lies with PHP, but I don't know how to fix it.
Should I install PHP again? I'm not sure.
What's the problem with my setup?
Or is there another way to make connections between PHP and Java as I described above?
Thank you.
Have you enabled php-cgi?
Here's how you can do it. Go to xampp\apache\conf\extra\httpd-xampp.conf and uncomment this:
<FilesMatch "\.php$">
    SetHandler application/x-httpd-php-cgi
</FilesMatch>
<IfModule actions_module>
    Action application/x-httpd-php-cgi "/php-cgi/php-cgi.exe"
</IfModule>
Restart Apache & Tomcat.
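Since the last root cause says the PHP test command was [php-cgi, -v], it is also worth checking that php-cgi itself is reachable. A small sketch from a Git Bash-style shell (C:\xampp\php is the usual XAMPP location and is an assumption; adjust if yours differs):
# check that php-cgi runs at all
/c/xampp/php/php-cgi.exe -v
# if that prints a PHP version, make sure the directory is on PATH so the
# bridge's test command [php-cgi, -v] can find it
export PATH="$PATH:/c/xampp/php"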

Failure to Login

I copied the Hadoop-Eclipse plugin to the plugin folder of Eclipse and also made SSH passwordless. When I try to create a new DFS location, it gets created, but it throws an error like: An internal error occurred during: "Map/Reduce location status updater". org/codehaus/jackson/map/JsonMappingException. I am also unable to connect to the DFS directory; it says failure to login. I am new to Hadoop; please help me get rid of this.
I keep the same port numbers in the Map/Reduce and Master DFS settings as in the conf files.
I am getting the same error with Hadoop 0.20.203. I think there are some issues with having this version work with Eclipse. Try reading about them and see if your issue is related.
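As with the first question above, org/codehaus/jackson/map/JsonMappingException points at a NoClassDefFoundError for a jar the plugin does not bundle. A quick hedged check (the plugin jar name is illustrative for 0.20.203):
# see whether the plugin jar actually bundles the Jackson classes
unzip -l ~/eclipse/plugins/hadoop-eclipse-plugin-0.20.203.0.jar | grep -i jackson
# if nothing shows up, repackage the plugin with the jackson jars from
# $HADOOP_HOME/lib, as sketched in the answer to the first question above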
