I am trying to run Hazelcast server v3.2.4 on the 0.0.0.0 interface of an Ubuntu VM, on port 5701. As far as I can tell, the port is not used by any other service and there are no firewall rules preventing me from listening on 5701. However, I keep seeing the following in my log files (the relevant hazelcast.xml config is copied below as well):
hazelcast.xml:
<network>
<port auto-increment="true" port-count="100">5701</port>
<outbound-ports>
<!--
Allowed port range when connecting to other nodes.
0 or * means use system provided port.
-->
<ports>0</ports>
</outbound-ports>
<join>
<multicast enabled="false">
<multicast-group>224.2.2.3</multicast-group>
<multicast-port>54327</multicast-port>
</multicast>
<tcp-ip enabled="true">
<interface>0.0.0.0</interface> <!-- 127.0.0.1 -->
</tcp-ip>
<aws enabled="false">
</aws>
</join>
<interfaces enabled="false">
<interface>10.10.1.*</interface>
</interfaces>
<ssl enabled="false" />
<socket-interceptor enabled="false" />
<symmetric-encryption enabled="false">
<algorithm>PBEWithMD5AndDES</algorithm>
<salt>fakesalt</salt>
<password>fakepwd</password>
<iteration-count>19</iteration-count>
</symmetric-encryption>
</network>
Logs:
2014-09-04 13:13:21,752 INFO c.h.i.DefaultAddressPicker [main] null [dev] [3.2.4] Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [0.0.0.0]
2014-09-04 13:13:21,754 INFO c.h.i.DefaultAddressPicker [main] null [dev] [3.2.4] Prefer IPv4 stack is true.
2014-09-04 13:13:21,755 WARN c.h.i.DefaultAddressPicker [main] null [dev] [3.2.4] Could not find a matching address to start with! Picking one of non-loopback addresses.
2014-09-04 13:13:21,762 INFO c.h.i.DefaultAddressPicker [main] null [dev] [3.2.4] Picked Address[xxx.xxx.xxx.xxx]:5701, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5701], bind any local is true
2014-09-04 13:13:21,950 INFO c.h.system [main] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Hazelcast 3.2.4 (20140721) starting at Address[xxx.xxx.xxx.xxx]:5701
2014-09-04 13:13:21,950 INFO c.h.system [main] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Copyright (C) 2008-2014 Hazelcast.com
2014-09-04 13:13:21,952 INFO c.h.i.Node [main] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Creating TcpIpJoiner
2014-09-04 13:13:21,956 INFO c.h.c.LifecycleService [main] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Address[xxx.xxx.xxx.xxx]:5701 is STARTING
2014-09-04 13:13:22,042 INFO c.h.c.TcpIpJoiner [main] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connecting to possible member: Address[0.0.0.0]:5702
2014-09-04 13:13:22,045 INFO c.h.c.TcpIpJoiner [main] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connecting to possible member: Address[0.0.0.0]:5701
2014-09-04 13:13:22,050 INFO c.h.c.TcpIpJoiner [main] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connecting to possible member: Address[0.0.0.0]:5703
2014-09-04 13:13:22,058 INFO c.h.n.SocketConnector [hz._hzInstance_1_dev.cached.thread-4] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connecting to /0.0.0.0:5703, timeout: 0, bind-any: true
2014-09-04 13:13:22,058 INFO c.h.n.SocketConnector [hz._hzInstance_1_dev.cached.thread-2] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connecting to /0.0.0.0:5702, timeout: 0, bind-any: true
2014-09-04 13:13:22,058 INFO c.h.n.SocketConnector [hz._hzInstance_1_dev.cached.thread-3] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connecting to /0.0.0.0:5701, timeout: 0, bind-any: true
2014-09-04 13:13:22,061 INFO c.h.n.SocketConnector [hz._hzInstance_1_dev.cached.thread-2] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Could not connect to: /0.0.0.0:5702. Reason: SocketException[Connection refused to address /0.0.0.0:5702]
2014-09-04 13:13:22,062 INFO c.h.n.SocketConnector [hz._hzInstance_1_dev.cached.thread-4] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Could not connect to: /0.0.0.0:5703. Reason: SocketException[Connection refused to address /0.0.0.0:5703]
2014-09-04 13:13:22,065 INFO c.h.n.SocketAcceptor [hz._hzInstance_1_dev.IO.thread-Acceptor] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Accepting socket connection from /xxx.xxx.xxx.xxx:50460
2014-09-04 13:13:22,085 INFO c.h.n.TcpIpConnectionManager [hz._hzInstance_1_dev.cached.thread-3] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] 50460 accepted socket connection from /0.0.0.0:5701
2014-09-04 13:13:22,085 INFO c.h.n.TcpIpConnectionManager [hz._hzInstance_1_dev.IO.thread-Acceptor] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] 5701 accepted socket connection from /xxx.xxx.xxx.xxx:50460
2014-09-04 13:13:22,103 WARN c.h.n.TcpIpConnectionManager [hz._hzInstance_1_dev.global-operation.thread-1] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Wrong bind request from Address[xxx.xxx.xxx.xxx]:5701! This node is not requested endpoint: Address[0.0.0.0]:5701
2014-09-04 13:13:22,104 INFO c.h.n.TcpIpConnection [hz._hzInstance_1_dev.IO.thread-in-0] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connection [Address[0.0.0.0]:5701] lost. Reason: java.io.EOFException[Remote socket closed!]
2014-09-04 13:13:22,104 WARN c.h.n.ReadHandler [hz._hzInstance_1_dev.IO.thread-in-0] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] hz._hzInstance_1_dev.IO.thread-in-0 Closing socket to endpoint Address[0.0.0.0]:5701, Cause:java.io.EOFException: Remote socket closed!
2014-09-04 13:13:22,116 INFO c.h.n.TcpIpConnection [hz._hzInstance_1_dev.global-operation.thread-1] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connection [/xxx.xxx.xxx.xxx:50460] lost. Reason: Socket explicitly closed
2014-09-04 13:13:23,051 INFO c.h.n.SocketConnector [hz._hzInstance_1_dev.cached.thread-3] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Connecting to /0.0.0.0:5701, timeout: 0, bind-any: true
2014-09-04 13:13:23,051 INFO c.h.n.TcpIpConnectionManager [hz._hzInstance_1_dev.cached.thread-3] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] 45113 accepted socket connection from /0.0.0.0:5701
2014-09-04 13:13:23,051 INFO c.h.n.SocketAcceptor [hz._hzInstance_1_dev.IO.thread-Acceptor] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Accepting socket connection from /xxx.xxx.xxx.xxx:45113
2014-09-04 13:13:23,052 INFO c.h.n.TcpIpConnectionManager [hz._hzInstance_1_dev.IO.thread-Acceptor] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] 5701 accepted socket connection from /xxx.xxx.xxx.xxx:45113
2014-09-04 13:13:23,054 WARN c.h.n.TcpIpConnectionManager [hz._hzInstance_1_dev.global-operation.thread-2] [xxx.xxx.xxx.xxx]:5701 [dev] [3.2.4] Wrong bind request from Address[xxx.xxx.xxx.xxx]:5701! This node is not requested endpoint: Address[0.0.0.0]:5701
Update
I changed the tcp-ip setting to 127.0.0.1 and it now seems to bind on port 5701. However, I keep seeing the following in my server output:
Sep 04, 2014 6:58:07 PM com.hazelcast.config.FileSystemXmlConfig
INFO: Configuring Hazelcast from '/opt/xxx/resources/hazelcast.xml'.
I am unable to get a client to connect to the server!
Have you tried forcing your machine to use the public address? I had a similar problem, solved by adding this to hazelcast.xml:
<network>
<public-address>X.X.X.X</public-address>
...
</network>
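For the configuration in the question, a fuller sketch might look like the following (the member addresses are placeholders, borrowed from the question's own 10.10.1.* interfaces section). The point is that the tcp-ip join entries should name concrete member addresses: listing 0.0.0.0 makes Hazelcast try to connect to Address[0.0.0.0]:5701 as a peer, which is what triggers the "Wrong bind request ... not requested endpoint" warnings in the log above.

```xml
<network>
    <public-address>X.X.X.X</public-address>
    <port auto-increment="true" port-count="100">5701</port>
    <join>
        <multicast enabled="false"/>
        <tcp-ip enabled="true">
            <!-- concrete member addresses, not the 0.0.0.0 wildcard -->
            <member>10.10.1.11</member>
            <member>10.10.1.12</member>
        </tcp-ip>
    </join>
</network>
```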
Have you tried switching the listening host to 127.0.0.1 instead of 0.0.0.0? I could be wrong, but I believe the version you are using does not work with 0.0.0.0 on Linux (or at least that was the case when I tried it some time ago).
Related
I'm using Hazelcast on a java project, but I only use a single node and do not want any clustering. It's actually causing us issues when several dev instances discover each other and form a cluster.
Is there a way to fully disable clustering in the xml configuration?
Edit: I'm using Hazelcast 3.8.4. I've tried disabling multicast like so, but it looks like clustering is still enabled:
<hz:hazelcast id="hazelcast">
<hz:config>
<hz:group name="dev" password="password"/>
<hz:properties>
<hz:property name="hazelcast.logging.type">slf4j</hz:property>
</hz:properties>
<hz:network port="5701" port-auto-increment="true">
<hz:join>
<hz:multicast enabled="false"/>
</hz:join>
</hz:network>
<hz:topic name="topicStuff"/>
</hz:config>
</hz:hazelcast>
Logs:
12:54:51,273 INFO [com.hazelcast.hibernate.HazelcastLocalCacheRegionFactory] - Starting up HazelcastLocalCacheRegionFactory
12:54:51,284 INFO [com.hazelcast.config.XmlConfigLocator] - Loading 'hazelcast-default.xml' from classpath.
12:54:51,398 INFO [com.hazelcast.instance.DefaultAddressPicker] - [LOCAL] [dev] [3.8.4] Prefer IPv4 stack is true.
12:54:51,409 INFO [com.hazelcast.instance.DefaultAddressPicker] - [LOCAL] [dev] [3.8.4] Picked [10.212.134.200]:5701, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5701], bind any local is true
12:54:51,426 INFO [com.hazelcast.system] - [10.212.134.200]:5701 [dev] [3.8.4] Hazelcast 3.8.4 (20170809 - 297a77e) starting at [10.212.134.200]:5701
12:54:51,426 INFO [com.hazelcast.system] - [10.212.134.200]:5701 [dev] [3.8.4] Copyright (c) 2008-2016, Hazelcast, Inc. All Rights Reserved.
12:54:51,426 INFO [com.hazelcast.system] - [10.212.134.200]:5701 [dev] [3.8.4] Configured Hazelcast Serialization version : 1
12:54:51,648 INFO [com.hazelcast.spi.impl.operationservice.impl.BackpressureRegulator] - [10.212.134.200]:5701 [dev] [3.8.4] Backpressure is disabled
12:54:52,088 INFO [com.hazelcast.instance.Node] - [10.212.134.200]:5701 [dev] [3.8.4] Creating MulticastJoiner
12:54:52,219 INFO [com.hazelcast.spi.impl.operationexecutor.impl.OperationExecutorImpl] - [10.212.134.200]:5701 [dev] [3.8.4] Starting 12 partition threads
12:54:52,220 INFO [com.hazelcast.spi.impl.operationexecutor.impl.OperationExecutorImpl] - [10.212.134.200]:5701 [dev] [3.8.4] Starting 7 generic threads (1 dedicated for priority tasks)
12:54:52,241 INFO [com.hazelcast.core.LifecycleService] - [10.212.134.200]:5701 [dev] [3.8.4] [10.212.134.200]:5701 is STARTING
12:54:54,286 INFO [com.hazelcast.system] - [10.212.134.200]:5701 [dev] [3.8.4] Cluster version set to 3.8
12:54:54,289 INFO [com.hazelcast.internal.cluster.impl.MulticastJoiner] - [10.212.134.200]:5701 [dev] [3.8.4]
Members [1] { Member [10.212.134.200]:5701 - 820072f4-f1ef-4a5e-9722-eb2bd038f37e this }
12:54:54,361 INFO [com.hazelcast.core.LifecycleService] - [10.212.134.200]:5701 [dev] [3.8.4] [10.212.134.200]:5701 is STARTED
When no join method is enabled, Hazelcast starts in standalone mode. Prior to version 4.1, the default join method is multicast, so disabling it makes the member start standalone:
<network>
<join>
<multicast enabled="false"/>
</join>
</network>
which yields the log:
WARNING: [192.168.1.3]:5701 [dev] [4.0] No join method is enabled! Starting standalone.
With version 4.1, this becomes
<network>
<join>
<auto-detection enabled="false"/>
</join>
</network>
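For the 3.x line the asker is on (3.8.4), the same standalone effect can be sketched by disabling every join mechanism explicitly. This assumes the plain hazelcast.xml format; the Spring hz: namespace used in the question has equivalent elements:

```xml
<network>
    <join>
        <multicast enabled="false"/>
        <tcp-ip enabled="false"/>
        <aws enabled="false"/>
    </join>
</network>
```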
I'm trying to connect to an SFTP host using the vfs2 connector, which internally uses JSch.
However, when I try to connect, I get the following exception:
12:50:31.747 INFO o.a.c.v.p.sftp.SftpClientFactory - Connecting to <host> port <port>
12:50:32.074 INFO o.a.c.v.p.sftp.SftpClientFactory - Connection established
12:50:32.376 INFO o.a.c.v.p.sftp.SftpClientFactory - Remote version string: SSH-2.0-OpenSSH_7.4
12:50:32.376 INFO o.a.c.v.p.sftp.SftpClientFactory - Local version string: SSH-2.0-JSCH-0.1.54
12:50:32.376 INFO o.a.c.v.p.sftp.SftpClientFactory - CheckCiphers: aes256-ctr,aes192-ctr,aes128-ctr,aes256-cbc,aes192-cbc,aes128-cbc,3des-ctr,arcfour,arcfour128,arcfour256
12:50:33.489 INFO o.a.c.v.p.sftp.SftpClientFactory - CheckKexes: diffie-hellman-group14-sha1,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521
12:50:34.192 INFO o.a.c.v.p.sftp.SftpClientFactory - CheckSignatures: ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - SSH_MSG_KEXINIT sent
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - SSH_MSG_KEXINIT received
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server: curve25519-sha256@libssh.org,diffie-hellman-group-exchange-sha256
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server: ssh-rsa,rsa-sha2-512,rsa-sha2-256,ssh-ed25519
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server: chacha20-poly1305@openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,aes192-cbc,aes256-cbc,blowfish-cbc,cast128-cbc,3des-cbc
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server: chacha20-poly1305@openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,aes192-cbc,aes256-cbc,blowfish-cbc,cast128-cbc,3des-cbc
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server: hmac-sha2-512-etm@openssh.com,hmac-sha2-256-etm@openssh.com,umac-128-etm@openssh.com
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server: hmac-sha2-512-etm@openssh.com,hmac-sha2-256-etm@openssh.com,umac-128-etm@openssh.com
12:50:34.203 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server: none,zlib@openssh.com
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server: none,zlib@openssh.com
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server:
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: server:
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client: ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group14-sha1,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group1-sha1
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client: ssh-rsa,ssh-dss,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client: aes128-ctr,aes128-cbc,3des-ctr,3des-cbc,blowfish-cbc,aes192-ctr,aes192-cbc,aes256-ctr,aes256-cbc
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client: aes128-ctr,aes128-cbc,3des-ctr,3des-cbc,blowfish-cbc,aes192-ctr,aes192-cbc,aes256-ctr,aes256-cbc
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client: hmac-md5,hmac-sha1,hmac-sha2-256,hmac-sha1-96,hmac-md5-96
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client: hmac-md5,hmac-sha1,hmac-sha2-256,hmac-sha1-96,hmac-md5-96
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client: none
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client: none
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client:
12:50:34.204 INFO o.a.c.v.p.sftp.SftpClientFactory - kex: client:
12:50:34.205 INFO o.a.c.v.p.sftp.SftpClientFactory - Disconnecting from <host> port <port>
12:50:34.243 ERROR c.s.saw.export.SFTPUploader - IO SFTP Error:
org.apache.commons.vfs2.FileSystemException: Could not connect to SFTP server at "sftp://<user>@<host>:<port>/".
at org.apache.commons.vfs2.provider.sftp.SftpFileProvider.createSession(SftpFileProvider.java:72)
at org.apache.commons.vfs2.provider.sftp.SftpFileProvider.doCreateFileSystem(SftpFileProvider.java:93)
at org.apache.commons.vfs2.provider.AbstractOriginatingFileProvider.getFileSystem(AbstractOriginatingFileProvider.java:93)
at org.apache.commons.vfs2.provider.AbstractOriginatingFileProvider.findFile(AbstractOriginatingFileProvider.java:72)
at org.apache.commons.vfs2.provider.AbstractOriginatingFileProvider.findFile(AbstractOriginatingFileProvider.java:56)
at org.apache.commons.vfs2.impl.DefaultFileSystemManager.resolveFile(DefaultFileSystemManager.java:717)
at org.apache.commons.vfs2.impl.DefaultFileSystemManager.resolveFile(DefaultFileSystemManager.java:654)
at com.synchronoss.saw.export.SFTPUploader.uploadFile(SFTPUploader.java:99)
at com.synchronoss.saw.export.SFTPUploader.main(SFTPUploader.java:134)
Caused by: org.apache.commons.vfs2.FileSystemException: Could not connect to SFTP server at "<host>".
at org.apache.commons.vfs2.provider.sftp.SftpClientFactory.createConnection(SftpClientFactory.java:163)
at org.apache.commons.vfs2.provider.sftp.SftpFileProvider.createSession(SftpFileProvider.java:65)
... 8 common frames omitted
Caused by: com.jcraft.jsch.JSchException: Algorithm negotiation fail
at com.jcraft.jsch.Session.receive_kexinit(Session.java:590)
at com.jcraft.jsch.Session.connect(Session.java:320)
at com.jcraft.jsch.Session.connect(Session.java:183)
at org.apache.commons.vfs2.provider.sftp.SftpClientFactory.createConnection(SftpClientFactory.java:161)
... 9 common frames omitted
I referred to a few blogs and Stack Overflow links and tried multiple things, but still without success. I also tried multiple versions of vfs2 and JSch, to no avail.
Here are some of the links I referred:
JSchException: Algorithm negotiation fail
JSch Algorithm negotiation fail
https://dzone.com/articles/install-java-cryptography-extension-jce-unlimited
The cipher 'aes256-cbc' is required, but it is not available
Can someone please help me with this?
Your server supports only these MACs:
hmac-sha2-512-etm@openssh.com,hmac-sha2-256-etm@openssh.com,umac-128-etm@openssh.com
Those are not supported by JSch.
If you need to connect with JSch, you will have to configure your server to allow some MACs that JSch does support:
hmac-md5,hmac-sha1,hmac-sha2-256,hmac-sha1-96,hmac-md5-96
Use the MACs directive in sshd_config.
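For example (assuming OpenSSH 7.4's sshd_config syntax; hmac-sha2-256 appears in both lists above, so offering it server-side should let the handshake succeed — reload sshd afterwards):

```
# /etc/ssh/sshd_config — keep the existing MACs and add one that JSch 0.1.54 offers
MACs hmac-sha2-512-etm@openssh.com,hmac-sha2-256-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256
```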
Though hmac-sha2-*-etm@openssh.com and hmac-sha2-* are possibly just aliases, so configuring JSch appropriately may help too:
config.put("hmac-sha2-256-etm@openssh.com", "com.jcraft.jsch.jce.HMACSHA256");
But I'm not sure whether the vfs2 interface allows such configuration.
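The negotiation failure itself is mechanical: SSH takes the first algorithm in the client's proposal that the server also offers, and the two MAC lists in the KEXINIT log above have an empty intersection. A minimal sketch of that selection rule, using the names from the log (`negotiate` is a hypothetical helper for illustration, not a JSch API):

```java
import java.util.List;
import java.util.Optional;

public class MacNegotiation {
    // SSH chooses the first client-proposed algorithm the server also offers (RFC 4253, §7.1).
    static Optional<String> negotiate(List<String> client, List<String> server) {
        return client.stream().filter(server::contains).findFirst();
    }

    public static void main(String[] args) {
        // MAC lists copied from the KEXINIT exchange in the question's log
        List<String> serverMacs = List.of(
                "hmac-sha2-512-etm@openssh.com",
                "hmac-sha2-256-etm@openssh.com",
                "umac-128-etm@openssh.com");
        List<String> jschMacs = List.of(
                "hmac-md5", "hmac-sha1", "hmac-sha2-256", "hmac-sha1-96", "hmac-md5-96");

        // Empty intersection -> JSch throws "Algorithm negotiation fail"
        System.out.println(negotiate(jschMacs, serverMacs).isPresent()); // false

        // Allowing hmac-sha2-256 on the server side makes negotiation succeed
        List<String> patchedServer = List.of(
                "hmac-sha2-512-etm@openssh.com", "hmac-sha2-256");
        System.out.println(negotiate(jschMacs, patchedServer).orElse("none")); // hmac-sha2-256
    }
}
```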
$ hadoop jar target/projeto5-1.0-SNAPSHOT-fatjar.jar br.edu.ufam.anibrata.HBaseWordCount -input shakespeare.txt -output wcount -numReducers 1
17/07/15 20:23:29 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x6022bb5c connecting to ZooKeeper ensemble=localhost:2181
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-cdh5.11.1--1, built on 06/01/2017 17:37 GMT
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:host.name=quickstart.cloudera
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.7.0_67-cloudera/jre
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/etc/hadoop/conf:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/avro.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/azure-data-lake-store-sdk-2.1.4.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/aws-java-sdk-dynamodb-1.10.6.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/aws-java-sdk-sts-1.10.6.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/aws-java-sdk-s3-1.10.6.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/aws-java-sdk-core-1.10.6.jar:/usr/lib/hadoop/lib/logredactor-1.0.3.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/hamcrest-c
ore-1.3.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/hue-plugins-3.9.0-cdh5.11.1.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.5.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/slf4j-log4j12.jar:/usr/lib/hadoop/lib/aws-java-sdk-kms-1.10.6.jar:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop/lib/commons-beanutils-1.9.2.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop/lib/zookeeper.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/.//parquet-tools.jar:/usr/lib/hadoop/.//parquet-hadoop-bundle.jar:/usr/lib/hadoop/.//parquet-format-javadoc.jar:/usr/lib/hadoop/.//parquet-avro.jar:/usr/lib/hadoop/.//parquet-pig.jar:/usr/lib/hadoop/.//hadoop-nfs.jar:/usr/lib/hadoop/.//parquet-pig-bundle.jar:/usr/lib/hadoop/.//parquet-column.jar:/usr/lib/hadoop/.//hadoop-common.jar:/usr/lib/hadoop/.//hadoop-aws-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop/.//parquet-format-sources.jar:/usr/lib/hadoop/.//hadoop-auth-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop/.//hadoop-common-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop/.//hadoop-nfs-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop/.//parquet-protobuf.jar:/usr/lib/hadoop/.//hadoop-azure-datalake-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop/.//parquet-scrooge_2.10.jar:/usr/lib/hadoop/.//hadoop-common-2.6.0-cdh5.11.1-tests.jar:/usr/lib/hadoop/.//parquet-generator.jar:/usr/lib/hadoop/.//hadoop-common-tests.jar:/usr/lib/hadoop/.
//parquet-common.jar:/usr/lib/hadoop/.//hadoop-aws.jar:/usr/lib/hadoop/.//parquet-test-hadoop2.jar:/usr/lib/hadoop/.//parquet-scala_2.10.jar:/usr/lib/hadoop/.//hadoop-annotations-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop/.//parquet-jackson.jar:/usr/lib/hadoop/.//hadoop-auth.jar:/usr/lib/hadoop/.//parquet-encoding.jar:/usr/lib/hadoop/.//parquet-format.jar:/usr/lib/hadoop/.//parquet-cascading.jar:/usr/lib/hadoop/.//parquet-hadoop.jar:/usr/lib/hadoop/.//hadoop-azure-datalake.jar:/usr/lib/hadoop/.//hadoop-annotations.jar:/usr/lib/hadoop/.//parquet-thrift.jar:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/lib/hadoop-hdfs/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs-2.6.0-cdh5.11.1-tests.jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.
jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/spark-1.6.0-cdh5.11.1-yarn-shuffle.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-yarn/lib/jline-2.11.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/spark-yarn-shuffle.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/lib/zookeeper.jar:/usr/lib/hadoop-yarn/.//hadoop
-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-registry-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-common-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-common-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-client-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-api-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/avro.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hado
op-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/.//commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/.//activation-1.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-datajoin-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/.//hadoop-azure-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-mapreduce/.//hadoop-gridmix-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/.//avro.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-
mapreduce/.//commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-rumen-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//okio-1.4.0.jar:/usr/lib/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/.//commons-el-1.0.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/lib/hadoop-mapreduce/.//hadoop-distcp-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//junit-4.11.jar:/usr/lib/hadoop-mapreduce/.//hadoop-auth-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-archive-logs-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-streaming-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/.//metrics-core-3.0.2.jar:/usr/lib/hadoop-mapreduce/.//jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-mapreduce/.//hadoop-azure.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/.//jettison-1.1.jar:/usr/lib/hadoop-mapreduce/.//xz-1.0.jar:/usr/lib/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0-cdh5.11.1.jar:/usr/lib/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/.//hadoop-extras.jar:/usr/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.1-tests.jar:/usr/lib/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/lib/hadoop
-mapreduce/.//hadoop-mapreduce-client-common-2.6.0-cdh5.11.1.jar:... [remainder of long CDH 5.11.1 classpath trimmed]
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/lib/hadoop/lib/native
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-696.3.2.el6.x86_64
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:user.name=cloudera
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/cloudera
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/cloudera/topicosBD-pis/topicosBD-pis/projeto5
17/07/15 20:23:29 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=90000 watcher=hconnection-0x6022bb5c0x0, quorum=localhost:2181, baseZNode=/hbase
17/07/15 20:23:29 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
17/07/15 20:23:29 INFO zookeeper.ClientCnxn: Socket connection established, initiating session, client: /127.0.0.1:52300, server: localhost/127.0.0.1:2181
17/07/15 20:23:29 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x15d42b07712000e, negotiated timeout = 40000
Exception in thread "main" org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Sat Jul 15 20:24:23 PDT 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=69886: row 'wcount,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=quickstart.cloudera,60020,1500062548162, seqNum=0
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:270)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:219)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:57)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:293)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:268)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:140)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:135)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:886)
at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:601)
at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:365)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:310)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:320)
at br.edu.ufam.anibrata.HBaseWordCount.run(HBaseWordCount.java:175)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at br.edu.ufam.anibrata.HBaseWordCount.main(HBaseWordCount.java:224)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=69886: row 'wcount,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=quickstart.cloudera,60020,1500062548162, seqNum=0
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:404)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:710)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:881)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:850)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1174)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:216)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:300)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:31889)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:349)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:193)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:332)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:306)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
... 4 more
I am trying to run a WordCount program in MapReduce with HBase as the sink. I haven't made any configuration changes to HBase; I am using the Cloudera VirtualBox VM as provided. I am not familiar with the HBase configuration files or how they map to MapReduce and HDFS. Any help or assistance is much appreciated.
This error can be a sign that your HBase client is trying to initiate unsecured RPCs while HBase expects Kerberos authentication. In your case, though, the log shows that a timeout occurred. Note that hbase.cells.scanned.per.heartbeat.check sets the minimum number of cells that must be scanned before a timeout check occurs; its default is 10000, and a smaller value causes timeout checks to happen more often.
Edit hbase-site.xml and add the following properties, modifying the values as needed.
<property>
  <name>hbase.rpc.timeout</name>
  <value>60000</value>
</property>
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>60000</value>
</property>
<property>
  <name>hbase.cells.scanned.per.heartbeat.check</name>
  <value>10000</value>
</property>
Copy hbase-site.xml to all your cluster hosts and restart the HBase Master and RegionServer processes for the changes to take effect.
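That said, the root cause in the stack trace is java.net.ConnectException: Connection refused against the RegionServer at quickstart.cloudera:60020, so it is worth confirming that port is actually reachable before tuning timeouts. A minimal standalone sketch (the class name and 3-second timeout are my own choices; the host and ports come from the logs above):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {

    // Returns true if something accepts a TCP connection on host:port
    // within timeoutMs; false on refusal, timeout, or unresolved host.
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Addresses taken from the failing stack trace / ZooKeeper log.
        System.out.println("RegionServer reachable: "
                + isReachable("quickstart.cloudera", 60020, 3000));
        System.out.println("ZooKeeper reachable: "
                + isReachable("localhost", 2181, 3000));
    }
}
```

Run it on the client machine: "Connection refused" means nothing was listening on that port, which no amount of timeout tuning will fix; check that the RegionServer process is actually up first.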
I have to launch an HBase process on a private Hadoop cluster and need to simply demonstrate minimum functionality; installing Hadoop is unnecessary for the demonstration. Following the Apache HBase Quick Start docs failed via the shell, and I have been able to duplicate the failures using a self-contained Maven project. /etc/hosts is not an issue.
I located a Maven project which appears to have been built to demonstrate this same functionality and updated the deps.
That project is here: hbase-demo
To reproduce my results, with Maven installed, run the following from a shell:
$ git clone https://github.com/jnorthrup/hbase-demo
$ cd hbase-demo
$ mvn clean package exec:java
[INFO] Scanning for projects...
[INFO]
[...]
Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] --- maven-jar-plugin:2.3.2:jar (default-jar) @ demo ---
[INFO] Building jar: /vol/big240/snap/jim/work/hbase-demo/target/demo-0.0.1-SNAPSHOT.jar
[INFO]
[INFO] >>> exec-maven-plugin:1.2.1:java (default-cli) @ demo >>>
[INFO]
[INFO] <<< exec-maven-plugin:1.2.1:java (default-cli) @ demo <<<
[INFO]
[INFO] --- exec-maven-plugin:1.2.1:java (default-cli) @ demo ---
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:host.name=localhost
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_40
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:java.home=/opt/jdk1.7.0_40/jre
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/opt/maven/boot/plexus-classworlds-2.4.jar
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:java.library.path=:/opt/AMDAPP/lib/x86_64/:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:os.version=3.8.0-32-generic
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:user.name=jim
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/jim
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Client environment:user.dir=/vol/big240/snap/jim/work/hbase-demo
13/10/30 14:30:49 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
13/10/30 14:30:49 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 13494@keyframe
13/10/30 14:30:49 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
13/10/30 14:30:49 WARN zookeeper.ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:735)
at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:350)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1068)
13/10/30 14:30:49 WARN zookeeper.RecoverableZooKeeper: Possibly transient ZooKeeper exception: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/master
13/10/30 14:30:49 INFO util.RetryCounter: Sleeping 2000ms before retry #1...
13/10/30 14:30:50 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
13/10/30 14:30:50 WARN zookeeper.ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
[repeats]
Any help reviving this demo code would be greatly appreciated. Twiddling hbase-site.xml makes no appreciable difference to the ZooKeeper failures in standalone mode, either with the shell or with Maven.
Thanks
Change hbase-site.xml to point to your HBase server:
<property>
  <name>hbase.rootdir</name>
  <value>hdfs://<hbase machine name>:8020/hbase</value>
</property>
<property>
  <name>hbase.zookeeper.quorum</name>
  <value><zookeeper machine name></value>
</property>
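If the target cluster runs fully distributed (as the hdfs:// rootdir suggests), its server-side hbase-site.xml also needs the standard distributed-mode switch; I'm assuming a distributed deployment here:

```xml
<property>
  <name>hbase.cluster.distributed</name>
  <value>true</value>
</property>
```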
I've set everything up as the HBase documentation says. Here is my hbase-site.xml:
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///users/urijvoskresenskij/hbase-0.94.3/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/users/urijvoskresenskij/hbase-0.94.3/zookeeper</value>
  </property>
</configuration>
Java Code:
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;
public class MainHBase {

    /**
     * @param args
     */
    public static void main(String[] args) throws Exception {
        Configuration config = HBaseConfiguration.create();
        config.set("hbase.zookeeper.quorum", "localhost");
        HBaseAdmin admin = new HBaseAdmin(config);
        try {
            HTable table = new HTable(config, "test-table");
            Put put = new Put(Bytes.toBytes("test-key"));
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
            table.put(put);
            table.close();
        } finally {
            admin.close();
        }
    }
}
And here is the exception:
2012-12-26 11:08:07.980 java[10204:1c03] Unable to load realm info from SCDynamicStore
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.3-1240972, built on 02/06/2012 10:48 GMT
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:host.name=192.168.1.101
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0-ea
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:java.home=/Library/Java/JavaVirtualMachines/JDK 1.7.0 Developer Preview.jdk/Contents/Home/jre
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/Users/urijvoskresenskij/Documents/workspace/Hbase/bin:... [long HBase 0.94.3 + Spring 3.1.3 classpath trimmed]
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/Users/urijvoskresenskij/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/var/folders/_q/z5blmlxs39d4pmmv1fj9mh500000gn/T/
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:os.name=Mac OS X
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:os.arch=x86_64
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:os.version=10.7.5
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:user.name=urijvoskresenskij
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:user.home=/Users/urijvoskresenskij
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Client environment:user.dir=/Users/urijvoskresenskij/Documents/workspace/Hbase
12/12/26 10:08:08 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
12/12/26 10:08:08 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:2181
12/12/26 10:08:08 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 10204@MacBook-Air-Urij.local
12/12/26 10:08:08 INFO client.ZooKeeperSaslClient: Client will not SASL-authenticate because the default JAAS configuration section 'Client' could not be found. If you are not using SASL, you may ignore this. On the other hand, if you expected SASL to work, please fix your JAAS configuration.
12/12/26 10:08:08 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
12/12/26 10:08:08 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13bd4823ba20007, negotiated timeout = 40000
12/12/26 10:08:08 WARN client.HConnectionManager$HConnectionImplementation: Encountered problems when prefetch META table:
org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in .META. for table: test-table, row=test-table,,99999999999999
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:164)
at org.apache.hadoop.hbase.client.MetaScanner.access$000(MetaScanner.java:54)
at org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:133)
at org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:130)
at org.apache.hadoop.hbase.client.HConnectionManager.execute(HConnectionManager.java:360)
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:130)
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:105)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:922)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:977)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:864)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:821)
at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:133)
at MainHBase.main(MainHBase.java:20)
Exception in thread "main" org.apache.hadoop.hbase.TableNotFoundException: test-table
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:999)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:864)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:821)
at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:133)
at MainHBase.main(MainHBase.java:20)
I've tried Googling it already, but wasn't able to find a solution. My operating system is OS X. The table that I've tried to create cannot be scanned from the hbase shell. Please help :)
Have you created "test-table"? The table must exist before you can put data into it:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

Configuration conf = HBaseConfiguration.create();
HBaseAdmin hbase = new HBaseAdmin(conf);
// Use the same name your client code expects ("test-table", not "test_table")
HTableDescriptor desc = new HTableDescriptor("test-table");
HColumnDescriptor cf = new HColumnDescriptor("cf".getBytes());
desc.addFamily(cf);
hbase.createTable(desc);
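Equivalently, if you just need the table to exist before running the Java client, you can create it once from the hbase shell (column-family name matching the Put in your code):

```
create 'test-table', 'cf'
```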