Hadoop Connection Refused Error - java

I was trying to install the RMR (RHadoop) package and somehow managed to mess up my Hadoop setup. Now it gives a connection refused error that I just can't find a solution for. Any help would be appreciated. Thanks
java.net.ConnectException: Call to master/***.***.***.***:54310 failed on connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
at org.apache.hadoop.ipc.Client.call(Client.java:1071)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at $Proxy2.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
at org.apache.hadoop.mapred.Child$4.run(Child.java:254)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
at org.apache.hadoop.ipc.Client.call(Client.java:1046)
... 18 more

When you see this, it basically means that you are unable to connect to the NameNode. It's either not running or running on a different port. If you backed up your working *-site.xml files, you may be able to go back to the working version without the complete re-install you suggest in the comment to your question.
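A quick way to confirm which case you are in (a minimal sketch, assuming a typical single-node Linux setup and the port 54310 from your stack trace):
jps                        # should list a NameNode process if it is running
netstat -lnt | grep 54310  # should show a LISTEN entry on the expected port
If jps shows no NameNode at all, start HDFS again before retrying.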

I struggled for two days and the night in between to find the answer to this problem.
In my case (and I'm sure this is the problem in most cases) I had to create the Hadoop temporary folders by hand and add them to hdfs-site.xml:
<property>
    <name>dfs.data.dir</name>
    <value>/home/stefan/Downloads/hadoop-2.7.1/tmp/dfs/name/data</value>
    <final>true</final>
</property>
<property>
    <name>dfs.name.dir</name>
    <value>/home/stefan/Downloads/hadoop-2.7.1/tmp/dfs/name</value>
    <final>true</final>
</property>
I hope this helps you guys avoid the hell I went through.
Besides that, make sure the user running Hadoop owns those folders and has the right permissions:
chown user_name hadoop_folder hadoop_temp_folder
chmod 755 hadoop_folder hadoop_temp_folder
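Putting it together, a minimal sketch of the whole sequence under the paths above, run from the hadoop-2.7.1 directory (the format step is only for a fresh setup, since it erases existing HDFS metadata):
mkdir -p /home/stefan/Downloads/hadoop-2.7.1/tmp/dfs/name/data
chown -R stefan /home/stefan/Downloads/hadoop-2.7.1/tmp
chmod -R 755 /home/stefan/Downloads/hadoop-2.7.1/tmp
bin/hdfs namenode -format   # fresh setups only: wipes existing HDFS metadata
sbin/start-dfs.sh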

Related

UnknownHostException error, building java grpc example

I am following this grpc tutorial and I haven't even been able to make it through the first step. The first step is to git clone the project and then run
cd examples
./gradlew installDist
I am hit with this stack trace
Downloading https://services.gradle.org/distributions/gradle-2.13-bin.zip
Exception in thread "main" java.net.UnknownHostException: services.gradle.org
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:668)
at sun.security.ssl.BaseSSLSocketImpl.connect(BaseSSLSocketImpl.java:173)
at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
at sun.net.www.protocol.https.HttpsClient.<init>(HttpsClient.java:264)
at sun.net.www.protocol.https.HttpsClient.New(HttpsClient.java:367)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(AbstractDelegateHttpsURLConnection.java:191)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:177)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1513)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1441)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:254)
at org.gradle.wrapper.Download.downloadInternal(Download.java:58)
at org.gradle.wrapper.Download.download(Download.java:44)
at org.gradle.wrapper.Install$1.call(Install.java:61)
at org.gradle.wrapper.Install$1.call(Install.java:48)
at org.gradle.wrapper.ExclusiveFileAccessManager.access(ExclusiveFileAccessManager.java:65)
at org.gradle.wrapper.Install.createDist(Install.java:48)
at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:128)
at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:61)
I thought that was a proxy issue, so I took some inspiration from this post and opened up the file
gradle/wrapper/gradle-wrapper.properties
and added the lines
systemProp.http.proxyHost=<my proxy>
systemProp.http.proxyPort=<my port>
I also replaced the distributionUrl line with this
distributionUrl=http\://services.gradle.org/distributions/gradle-2.13-bin.zip
That is, I switched https --> http.
After all of this, I am still getting the same stack trace.
Anybody have advice?
EDIT : I added a
gradle.properties
file in the home directory, and added the fields
systemProp.http.proxyHost
systemProp.http.proxyPort
and
systemProp.https.proxyHost
systemProp.https.proxyPort
but I still got the same error as before. HOWEVER, the build script appeared to stall on this line
Downloading https://services.gradle.org/distributions/gradle-2.13-bin.zip
For about 10 seconds or so, before failing. So... progress?
I had the same problem and finally found this solution after 2 hours...
Go to the file gradlew.bat in your project and change the DEFAULT_JVM_OPTS variable as below:
set DEFAULT_JVM_OPTS=-Dhttp.proxyHost=YOUR_HOST -Dhttp.proxyPort=PORT -Dhttp.proxyUser=USERNAME -Dhttp.proxyPassword=PASSWORD -Dhttps.proxyHost=YOUR_HOST -Dhttps.proxyPort=PORT -Dhttps.proxyUser=USERNAME -Dhttps.proxyPassword=PASSWORD
then run gradlew clean build.
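If you build on Linux/Mac with the gradlew shell script instead, the equivalent edit (a sketch with the same placeholder values) is its DEFAULT_JVM_OPTS line:
DEFAULT_JVM_OPTS="-Dhttp.proxyHost=YOUR_HOST -Dhttp.proxyPort=PORT -Dhttps.proxyHost=YOUR_HOST -Dhttps.proxyPort=PORT"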
I don't know if it is an optimal solution, but what eventually worked for me was to replace
distributionUrl=https\://services.gradle.org/distributions/gradle-2.13-bin.zip
with
distributionUrl=https\://services.gradle.org/distributions/gradle-3.0-bin.zip
Essentially upgrading the version of gradle I am grabbing.
I also added a global
~/.gradle/gradle.properties
in my home directory with all the proxy info I had previously declared. My attempt to build after doing this was unsuccessful, but I can't say for sure whether it made any difference.
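For reference, a minimal sketch of what that ~/.gradle/gradle.properties can contain (host and port are placeholders):
systemProp.http.proxyHost=proxy.example.com
systemProp.http.proxyPort=8080
systemProp.https.proxyHost=proxy.example.com
systemProp.https.proxyPort=8080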

Cassandra won't start on OSX "Reason: Connection refused."

I'm working on a Java project in which Cassandra is included in the repository itself. However, I'm having trouble getting it to run, receiving the following error:
/Users/xxx/dev/xxxx/build/cassandra/bin/cassandra-cli -h localhost -p 9052 -f
/Users/xxx/dev/xxxx/schema.txt
return code: 0
stderr: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
at org.apache.thrift.transport.TSocket.open(TSocket.java:183)
at org.apache.thrift.transport.TFramedTransport.open(TFramedTransport.java:81)
at org.apache.cassandra.cli.CliMain.connect(CliMain.java:73)
at org.apache.cassandra.cli.CliMain.main(CliMain.java:249)
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at org.apache.thrift.transport.TSocket.open(TSocket.java:178)
... 3 more
Exception connecting to localhost/9052. Reason: Connection refused.
stdout: Not connected to a cassandra instance.
Not connected to a cassandra instance.
I've tried altering the port and changing the localhost hostname to 127.0.0.1 and 0.0.0.0, but this makes no real difference.
I'm using java version "1.7.0_71"
Any ideas would be appreciated, thanks
The issue was Cassandra trying to start on a hostname that was not in my /etc/hosts file.
I found out which hostname by running /Users/xxx/dev/xxxx/build/cassandra/bin/cassandra -f, which gave better output as to what the issue was.
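Once you know which hostname it is trying to bind, mapping it to loopback in /etc/hosts should fix the startup (a sketch, with my-machine.local standing in for whatever hostname the foreground output reports):
127.0.0.1    localhost my-machine.local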

ConnectException when submitting hadoop job from eclipse

I'm trying to submit a job (a simple word count) to hadoop-2.5.0 (installed on an Ubuntu 14.04.1 server running in a virtual machine) from Eclipse on Windows. In the job configuration, I've set "fs.defaultFS" to "hdfs://192.168.2.216:8020" (as suggested in this thread), but when I run the main program I get the following exception:
WARN - NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ERROR - Shell - Failed to locate the winutils binary in the hadoop binary path
Exception in thread "main" java.net.ConnectException: Call From EL-OUED/192.168.2.8 to 192.168.2.216:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
at org.apache.hadoop.ipc.Client.call(Client.java:1414)
at org.apache.hadoop.ipc.Client.call(Client.java:1363)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
at com.heavenize.hadoop.WordCountMR.main(WordCountMR.java:55)
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:735)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
at org.apache.hadoop.ipc.Client.call(Client.java:1381)
... 28 more
Also, when checking the connection configuration on Hadoop, it seems it is listening for connections only on 127.0.0.1:8020.
$netstat -lent | grep 8020
tcp 0 0 127.0.0.1:8020 0.0.0.0:* LISTEN 1001 10380
This is the content of core-site.xml; I wonder if it is the source of this problem and how to fix it:
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost</value>
    </property>
</configuration>
Basically, your NameNode is listening on the localhost interface, so it accepts connections only from 127.0.0.1. As you suspected, the error is indeed in the fs.default.name parameter, which should be modified to use the hostname instead of localhost.
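For example, a corrected core-site.xml might look like this (a sketch, using the placeholder hostname below and the port 8020 from your error message):
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://hostname.fully.qualified.domain.com:8020</value>
    </property>
</configuration>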
Beware that /etc/hosts should contain a line like
192.168.2.216 hostname.fully.qualified.domain.com hostname
You can verify that the hostname is properly set by running the commands "hostname" and "hostname -f". "hostname" should return the name of the system as returned by gethostname, while "hostname -f" should return the FQDN of the system.
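For instance, on a correctly configured system the two commands would print something like this (a sketch based on the example /etc/hosts line above):
$ hostname
hostname
$ hostname -f
hostname.fully.qualified.domain.com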

ConnectException (timed out) running groovy Koans with gradle wrapper

I'm trying to run the Groovy Koans (http://groovykoans.org/), and when I use the gradlew script it tries to download Gradle from the internet (from http://services.gradle.org/distributions/gradle-1.8-bin.zip),
but it crashes with a connection timed out exception. I'm able to download the file fine from Firefox. I've included HTTP proxy args on the command line as per the instructions, and I can ping services.gradle.org from my machine.
I'm on Windows.
C:\Users\me\My Documents\documents\work\build_system\groovykoans-master>gradlew removeSolutions -Dhttp.proxyHost=proxy.blah.com -Dhttp.proxyPort=8000
Downloading http://services.gradle.org/distributions/gradle-1.8-bin.zip
Exception in thread "main" java.lang.RuntimeException: java.net.ConnectException: Connection timed out: connect
at org.gradle.wrapper.ExclusiveFileAccessManager.access(ExclusiveFileAccessManager.java:78)
at org.gradle.wrapper.Install.createDist(Install.java:47)
at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:129)
at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:48)
Caused by: java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.connect0(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:69)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:157)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:391)
at java.net.Socket.connect(Socket.java:579)
at java.net.Socket.connect(Socket.java:528)
at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:378)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:473)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:203)
at sun.net.www.http.HttpClient.New(HttpClient.java:290)
at sun.net.www.http.HttpClient.New(HttpClient.java:306)
at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:995)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:931)
at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:849)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1299)
at org.gradle.wrapper.Download.downloadInternal(Download.java:59)
at org.gradle.wrapper.Download.download(Download.java:45)
at org.gradle.wrapper.Install$1.call(Install.java:60)
at org.gradle.wrapper.Install$1.call(Install.java:47)
at org.gradle.wrapper.ExclusiveFileAccessManager.access(ExclusiveFileAccessManager.java:65)
... 3 more
If I can't solve the connect issue, is there a way I can manually install the Gradle that I've successfully downloaded via my browser and bypass the download step in the Gradle wrapper?
Well, I managed to work out where the Gradle zip was being put (C:\Users\me\.gradle\wrapper\dists\gradle-1.8-bin\vruqmccc8532n7gr46qavsii8), so I dropped my separately downloaded zip in there and it got me past the issue.
However, since then I've also realized I was specifying the -Dhttp properties after the command and not before it, so I suspect that had I done that it would have worked. (I haven't retried it with a cleaned install area, though.) I.e., I should have had
gradlew -Dhttp.proxyHost=proxy.blah.com -Dhttp.proxyPort=8000 removeSolutions
instead of:
gradlew removeSolutions -Dhttp.proxyHost=proxy.blah.com -Dhttp.proxyPort=8000
duh!

hadoop running application- ERROR security.UserGroupInformation: PriviledgedActionException

I have written the Hadoop WordCount code as a Java application in Eclipse to test running applications on Hadoop, but when I try to run it as the hdfs user, this error appears:
./hadoop jar /home/masi/eclipse_workspace/WordCount_apacheSample/bin/test2.jar WordCountApacheSample /user/hdfs/wordCountInput /user/hdfs/wordCountOutput
13/10/02 17:14:50 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is inited.
13/10/02 17:14:50 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is started.
13/10/02 17:14:50 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.net.ConnectException: Call From virtual-machine/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Exception in thread "main" java.net.ConnectException: Call From virtual-machine/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:532)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:780)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:727)
at org.apache.hadoop.ipc.Client.call(Client.java:1239)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:630)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1559)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:811)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1345)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:140)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:418)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:333)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1218)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1215)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:416)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1478)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1215)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1236)
at WordCountApacheSample.main(WordCountApacheSample.java:71)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:597)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:526)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:490)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:508)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:603)
at org.apache.hadoop.ipc.Client$Connection.access$2100(Client.java:253)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1288)
at org.apache.hadoop.ipc.Client.call(Client.java:1206)
... 29 more
Although I have tested the input and output paths with hdfs://localhost:9000/, there is no difference!
BTW, I have studied many posts related to my problem, but they were not useful.
Any help is appreciated. Thanks.
Finally I solved the problem by myself and decided to state the reason here to help others :) The reason sounds somewhat silly, but the problem was this: the Hadoop daemons were stopped! My VM shut down suddenly, and after restarting it I had forgotten to start the daemons (datanode, namenode, ...) again. So the cause of this problem was simply that the datanode, namenode, and other daemons were not running.
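For anyone hitting the same thing, a minimal sketch of bringing the daemons back and verifying them (assuming a standard single-node install with Hadoop's sbin directory on the PATH):
start-dfs.sh    # starts the NameNode, DataNode and SecondaryNameNode
start-yarn.sh   # starts the ResourceManager and NodeManager
jps             # each daemon should now appear in this process list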
If you discover that your HDFS is corrupt, then you can do the following:
sudo -su hdfs
hadoop fsck /
hadoop dfsadmin -safemode leave
then delete the corrupted files, if any, using the following:
hadoop fs -rmr -skipTrash <folder with your files>
hadoop fsck -files delete /
Check the status:
hadoop fsck /
The status should be HEALTHY after this; then manually restart everything in Ambari.
I tried this on a small cluster and managed to get it up and running again after hitting a similar error to the one mentioned above.
