WSO2: Point-to-Point Messaging - java

I am following this tutorial: https://docs.wso2.com/display/EI630/Point-to-Point+Messaging#865c10b8d4d64ac688d6a0799cfb6012
After reaching Step 2 (running the JMS Publisher), I get the following error in JMeter:
Thread Name: Thread Group 1-1
Sample Start: 2019-09-25 06:11:40 NPT
Load time: 0
Connect Time: 0
Latency: 0
Size in bytes: 1209
Sent bytes:0
Headers size in bytes: 0
Body size in bytes: 1209
Sample Count: 1
Error Count: 1
Data type ("text"|"bin"|""):
Response code: 000
Response message: javax.naming.NamingException: javax.naming.NoInitialContextException: Cannot instantiate class: org.wso2.andes.jndi.PropertiesFileInitialContextFactory [Root exception is java.lang.ClassNotFoundException: org.wso2.andes.jndi.PropertiesFileInitialContextFactory ]
SampleResult fields:
ContentType:
DataEncoding: UTF-8
I have added the required jar files to the JMeter lib folder as instructed, and I am running JMeter with admin permissions.
Still I am getting this error:
Response message: javax.naming.NamingException: javax.naming.NoInitialContextException: Cannot instantiate class: org.wso2.andes.jndi.PropertiesFileInitialContextFactory [Root exception is java.lang.ClassNotFoundException: org.wso2.andes.jndi.PropertiesFileInitialContextFactory ]
How can I fix this? What am I missing?
Even after adding the jar at the Test Plan level via the browse option (so that I don't have to restart), it still gives the class-not-found error. Here are the logs:
2019-09-25 15:10:19,299 INFO o.j.r.JARSourceHTTP: Requesting https://jmeter-plugins.org/repo/?installID=docker-461b0856afade2414c8e3dfbab5ca751-gui
2019-09-25 15:10:20,264 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for text/html is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
2019-09-25 15:10:20,266 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for application/xhtml+xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
2019-09-25 15:10:20,266 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for application/xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
2019-09-25 15:10:20,266 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for text/xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
2019-09-25 15:10:20,266 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for text/vnd.wap.wml is org.apache.jmeter.protocol.http.parser.RegexpHTMLParser
2019-09-25 15:10:20,267 INFO o.a.j.p.h.s.HTTPSamplerBase: Parser for text/css is org.apache.jmeter.protocol.http.parser.CssParser
2019-09-25 15:10:22,056 INFO o.a.j.e.KeyToolUtils: keytool found at 'keytool'
2019-09-25 15:10:22,057 INFO o.a.j.p.h.p.ProxyControl: HTTP(S) Test Script Recorder SSL Proxy will use keys that support embedded 3rd party resources in file E:\apache-jmeter-5.0\bin\proxyserver.jks
2019-09-25 15:10:22,781 INFO o.a.j.s.FileServer: Default base='C:\Windows\system32'
2019-09-25 15:10:25,004 INFO o.a.j.s.SampleResult: Note: Sample TimeStamps are START times
2019-09-25 15:10:25,004 INFO o.a.j.s.SampleResult: sampleresult.default.encoding is set to ISO-8859-1
2019-09-25 15:10:25,004 INFO o.a.j.s.SampleResult: sampleresult.useNanoTime=true
2019-09-25 15:10:25,004 INFO o.a.j.s.SampleResult: sampleresult.nanoThreadSleep=5000
2019-09-25 15:10:28,324 WARN o.j.r.PluginManagerMenuItem: Failed to load plugin updates info
java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
at org.apache.commons.logging.impl.SLF4JLocationAwareLog.debug(SLF4JLocationAwareLog.java:131) ~[jcl-over-slf4j-1.7.25.jar:1.7.25]
at org.apache.http.client.protocol.RequestAuthCache.process(RequestAuthCache.java:77) ~[httpclient-4.5.6.jar:4.5.6]
at org.apache.http.protocol.ImmutableHttpProcessor.process(ImmutableHttpProcessor.java:133) ~[httpcore-4.4.10.jar:4.4.10]
at org.apache.http.protocol.HttpRequestExecutor.preProcess(HttpRequestExecutor.java:167) ~[httpcore-4.4.10.jar:4.4.10]
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:484) ~[httpclient-4.5.6.jar:4.5.6]
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:835) ~[httpclient-4.5.6.jar:4.5.6]
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83) ~[httpclient-4.5.6.jar:4.5.6]
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56) ~[httpclient-4.5.6.jar:4.5.6]
at org.jmeterplugins.repository.JARSourceHTTP.execute(JARSourceHTTP.java:499) ~[jmeter-plugins-manager-1.3.jar:?]
at org.jmeterplugins.repository.JARSourceHTTP.execute(JARSourceHTTP.java:494) ~[jmeter-plugins-manager-1.3.jar:?]
at org.jmeterplugins.repository.JARSourceHTTP.getJSON(JARSourceHTTP.java:152) ~[jmeter-plugins-manager-1.3.jar:?]
at org.jmeterplugins.repository.JARSourceHTTP.getRepositories(JARSourceHTTP.java:276) ~[jmeter-plugins-manager-1.3.jar:?]
at org.jmeterplugins.repository.JARSourceHTTP.getRepo(JARSourceHTTP.java:304) ~[jmeter-plugins-manager-1.3.jar:?]
at org.jmeterplugins.repository.PluginManager.load(PluginManager.java:71) ~[jmeter-plugins-manager-1.3.jar:?]
at org.jmeterplugins.repository.PluginManagerMenuItem$1.run(PluginManagerMenuItem.java:41) [jmeter-plugins-manager-1.3.jar:?]
2019-09-25 15:18:30,504 INFO o.a.j.g.a.Load: Loading file: F:\Training\MQ\MQPubslisher.jmx
2019-09-25 15:18:30,505 INFO o.a.j.s.FileServer: Set new base='F:\Training\MQ'
2019-09-25 15:18:31,128 INFO o.a.j.s.SaveService: Testplan (JMX) version: 2.2. Testlog (JTL) version: 2.2
2019-09-25 15:18:31,168 INFO o.a.j.s.SaveService: Using SaveService properties file encoding UTF-8
2019-09-25 15:18:31,171 INFO o.a.j.s.SaveService: Using SaveService properties version 5.0
2019-09-25 15:18:31,178 INFO o.a.j.s.SaveService: Loading file: F:\Training\MQ\MQPubslisher.jmx
2019-09-25 15:18:32,513 INFO o.a.j.s.FileServer: Set new base='F:\Training\MQ'
2019-09-25 15:27:32,675 INFO o.a.j.s.FileServer: Set new base='F:\Training\MQ'
2019-09-25 15:28:58,900 INFO o.a.j.e.StandardJMeterEngine: Running the test!
2019-09-25 15:28:58,902 INFO o.a.j.s.SampleEvent: List of sample_variables: []
2019-09-25 15:28:58,903 INFO o.a.j.s.SampleEvent: List of sample_variables: []
2019-09-25 15:28:58,950 INFO o.a.j.t.TestPlan: added E:\apache-jmeter-5.0\lib\andes-client-4.0.0.jar to classpath
2019-09-25 15:28:58,952 INFO o.a.j.g.u.JMeterMenuBar: setRunning(true, *local*)
2019-09-25 15:28:59,559 INFO o.a.j.e.StandardJMeterEngine: Starting ThreadGroup: 1 : Thread Group
2019-09-25 15:28:59,564 INFO o.a.j.e.StandardJMeterEngine: Starting 1 threads for group Thread Group.
2019-09-25 15:28:59,565 INFO o.a.j.e.StandardJMeterEngine: Thread will continue on error
2019-09-25 15:28:59,566 INFO o.a.j.t.ThreadGroup: Starting thread group... number=1 threads=1 ramp-up=1 perThread=1000.0 delayedStart=false
2019-09-25 15:28:59,590 INFO o.a.j.t.ThreadGroup: Started thread group number 1
2019-09-25 15:28:59,590 INFO o.a.j.e.StandardJMeterEngine: All thread groups have been started
2019-09-25 15:28:59,593 INFO o.a.j.t.JMeterThread: Thread started: Thread Group 1-1
2019-09-25 15:28:59,759 INFO o.a.j.t.JMeterThread: Thread is done: Thread Group 1-1
2019-09-25 15:28:59,760 INFO o.a.j.t.JMeterThread: Thread finished: Thread Group 1-1
2019-09-25 15:28:59,761 INFO o.a.j.e.StandardJMeterEngine: Notifying test listeners of end of test
2019-09-25 15:28:59,766 INFO o.a.j.p.j.c.InitialContextFactory: InitialContextFactory.close() called and Context instances cleaned up
2019-09-25 15:28:59,766 INFO o.a.j.g.u.JMeterMenuBar: setRunning(false, *local*)

Remember that you need to restart JMeter after any property change and after adding .jar files under the JMeter classpath; otherwise the changes will not be picked up.
According to The Real Secret to Building a Database Test Plan With JMeter article, it should also be possible to add the libraries to the JMeter classpath at the Test Plan level (the browse option you used); in that case a JMeter restart will not be required.
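For completeness, a minimal sketch of the property-based route, assuming JMeter's documented user.classpath property; the F:/Training/MQ/libs path is hypothetical (the jar name is the one from your log), and a jar dropped straight into JMeter's lib folder is picked up on restart without any property at all:
# user.properties -- restart JMeter after editing
user.classpath=F:/Training/MQ/libs/andes-client-4.0.0.jar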

I had the same issue; I solved it by removing a whitespace at the end of the Initial Context Factory field.
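For anyone comparing, the field must contain exactly this value (the class named in the error message itself), with no leading or trailing spaces:
org.wso2.andes.jndi.PropertiesFileInitialContextFactory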

Error when starting HBase standalone on Linux Fedora Hyper-V Virtual Machine

UPDATE
I have fixed the issue below (thanks to Mike for pointing it out); however, when I ran the command "jps" to check for the HMaster process, as the quick start guide suggests, I got a "command not found" error:
I searched around; this command is part of Java. So here is the Java configuration on my machine:
In .bashrc and .bash_profile:
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.222.b10-1.fc31.x86_64/jre
export JRE_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.222.b10-1.fc31.x86_64/jre
export PATH=$PATH:$HOME/bin:$JAVA_HOME/bin
In hbase-env.sh:
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.222.b10-1.fc31.x86_64/jre
Location of my java:
[hadoop@new-hbase-shuti logs]$ whereis java
java: /usr/bin/java /usr/lib/java /etc/java /usr/share/java /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.222.b10-1.fc31.x86_64/jre/bin/java /usr/share/man/man1/java.1.gz
My java version:
[hadoop@new-hbase-shuti logs]$ java -version
openjdk version "1.8.0_222"
OpenJDK Runtime Environment (build 1.8.0_222-b10)
OpenJDK 64-Bit Server VM (build 25.222-b10, mixed mode)
Here is the new log file from HBase (hbase-hadoop-master-new-hbase-shuti.log):
I followed the quick start guide to install HBase standalone. Here is my configuration:
I was not quite sure which HBase package to use, but the guide said to choose the stable one, so I downloaded this: http://mirrors.standaloneinstaller.com/apache/hbase/stable/hbase-2.2.3-bin.tar.gz
The conf/hbase-env.sh, where I just have the JAVA_HOME path:
The conf/hbase-site.xml:
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///home/testuser/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/home/testuser/zookeeper</value>
  </property>
  <property>
    <name>hbase.unsafe.stream.capability.enforce</name>
    <value>false</value>
    <description>
      Controls whether HBase will check for stream capabilities (hflush/hsync).
      Disable this if you intend to run on LocalFileSystem, denoted by a rootdir
      with the 'file://' scheme, but be mindful of the NOTE below.
      WARNING: Setting this to false blinds you to potential data loss and
      inconsistent system state in the event of process and/or node failures. If
      HBase is complaining of an inability to use hsync or hflush it's most
      likely not a false positive.
    </description>
  </property>
</configuration>
Then, from bin, I run the script start-hbase.sh.
But I get this error:
/home/hadoop/hadoop/bin/../libexec/hadoop-functions.sh: line 2360: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: invalid variable name
/home/hadoop/hadoop/bin/../libexec/hadoop-functions.sh: line 2455: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: invalid variable name
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase-2-2-3/hbase-2.2.3/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
running master, logging to /home/hadoop/hbase-2-2-3/hbase-2.2.3/bin/../logs/hbase-hadoop-master-new-hbase-shuti.out
/home/hadoop/hadoop/bin/../libexec/hadoop-functions.sh: line 2360: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: invalid variable name
/home/hadoop/hadoop/bin/../libexec/hadoop-functions.sh: line 2455: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: invalid variable name
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase-2-2-3/hbase-2.2.3/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
I have also attached the error log files from HBase below. Could anyone who is familiar with HBase help me please? Thank you very much in advance.
Error from "hbase-hadoop-master-new-hbase-shuti.log"
Thu 26 Mar 2020 08:59:07 PM CET Starting master on new-hbase-shuti
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 7523
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 7523
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
2020-03-26 20:59:07,933 INFO [main] master.HMaster: STARTING service HMaster
2020-03-26 20:59:07,934 INFO [main] util.VersionInfo: HBase 2.2.3
2020-03-26 20:59:07,934 INFO [main] util.VersionInfo: Source code repository git://hao-OptiPlex-7050/home/hao/open_source/hbase revision=6a830d87542b766bd3dc4cfdee28655f62de3974
2020-03-26 20:59:07,934 INFO [main] util.VersionInfo: Compiled by hao on 2020年 01月 10日 星期五 18:27:51 CST
2020-03-26 20:59:07,934 INFO [main] util.VersionInfo: From source with checksum 097925184b85f6995e20da5462b10f3f
2020-03-26 20:59:08,190 INFO [main] master.HMasterCommandLine: Starting a zookeeper cluster
2020-03-26 20:59:08,204 INFO [main] server.ZooKeeperServer: Server environment:zookeeper.version=3.4.10-39d3a4f269333c922ed3db283be479f9deacaa0f, built on 03/23/2017 10:13 GMT
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:host.name=new-hbase-shuti.mshome.net
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:java.version=1.8.0_222
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:java.vendor=Oracle Corporation
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.222.b10-1.fc31.x86_64/jre
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: vices-core-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-api-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-client-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-3.1.3.jar:/home/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-common-3.1.3.jar:/home/hadoop/hbase-2-2-3/hbase-2.2.3/bin/../lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:java.library.path=/home/hadoop/hadoop//lib/native
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:java.io.tmpdir=/tmp
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:java.compiler=<NA>
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:os.name=Linux
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:os.arch=amd64
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:os.version=5.3.7-301.fc31.x86_64
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:user.name=hadoop
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:user.home=/home/hadoop
2020-03-26 20:59:08,205 INFO [main] server.ZooKeeperServer: Server environment:user.dir=/home/hadoop/hbase-2-2-3/hbase-2.2.3/bin
2020-03-26 20:59:08,207 ERROR [main] master.HMasterCommandLine: Master exiting
java.io.IOException: Unable to create data directory /home/testuser/zookeeper/zookeeper_0/version-2
at org.apache.zookeeper.server.persistence.FileTxnSnapLog.<init>(FileTxnSnapLog.java:85)
at org.apache.zookeeper.server.ZooKeeperServer.<init>(ZooKeeperServer.java:224)
at org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster.startup(MiniZooKeeperCluster.java:229)
at org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster.startup(MiniZooKeeperCluster.java:187)
at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:210)
at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:140)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:149)
at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:2940)
Error from "hbase-hadoop-master-new-hbase-shuti.out":
/home/hadoop/hadoop/bin/../libexec/hadoop-functions.sh: line 2360: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: invalid variable name
/home/hadoop/hadoop/bin/../libexec/hadoop-functions.sh: line 2455: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: invalid variable name
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase-2-2-3/hbase-2.2.3/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
After digging into Hadoop I found that, in my case, it has something to do with Ubuntu user permissions:
vi /opt/hadoop/libexec/hadoop-functions.sh
function hadoop_verify_user_resolves
{
...
}
so I decided to add these lines in
/opt/hbase/conf/hbase-env.sh
export HBASE_SSH_OPTS="-p 22 -l daniel"
export HBASE_OPTS="$HBASE_OPTS -XX:+UseConcMarkSweepGC"
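On the same permissions theme: the "Unable to create data directory /home/testuser/zookeeper/zookeeper_0/version-2" error in the master log above suggests that HBase, running as the hadoop user, cannot write to the /home/testuser paths from the quick-start config. A minimal sketch of one fix, assuming you keep running as hadoop (paths taken from the question):
# Option 1: create the directories with the right ownership (as root)
mkdir -p /home/testuser/hbase /home/testuser/zookeeper
chown -R hadoop:hadoop /home/testuser/hbase /home/testuser/zookeeper
# Option 2: point hbase.rootdir and hbase.zookeeper.property.dataDir in
# conf/hbase-site.xml at directories the hadoop user already owns, e.g.
# file:///home/hadoop/hbase and /home/hadoop/zookeeper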
I have corrected my JAVA_HOME environment path to make sure it points to the JDK instead of the JRE.
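Concretely, a sketch of that correction for the paths in the question (jps ships in the JDK's bin directory, not the JRE's, which is why the command was not found; on Fedora it may additionally require the java-1.8.0-openjdk-devel package):
# .bashrc / .bash_profile -- drop the trailing /jre so JAVA_HOME is the JDK
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.222.b10-1.fc31.x86_64
export PATH=$PATH:$HOME/bin:$JAVA_HOME/bin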

Error when running mapreduce job from windows client

I am new to the Hadoop world. I have set up a Hadoop cluster on Linux and I am trying to run a MapReduce job from Windows.
The wordcount example works fine, but when I try to run another job that I wrote myself, I get this error:
16/07/05 16:04:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/07/05 16:04:42 INFO client.RMProxy: Connecting to ResourceManager at /myIP:8050
16/07/05 16:04:42 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
16/07/05 16:04:42 WARN mapreduce.JobResourceUploader: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
16/07/05 16:04:42 INFO input.FileInputFormat: Total input paths to process : 3
16/07/05 16:04:42 INFO mapreduce.JobSubmitter: number of splits:3
16/07/05 16:04:43 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1467727275250_0002
16/07/05 16:04:43 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
16/07/05 16:04:43 INFO impl.YarnClientImpl: Submitted application application_1467727275250_0002
16/07/05 16:04:43 INFO mapreduce.Job: The url to track the job: http://hadoopMaster:8088/proxy/application_1467727275250_0002/
16/07/05 16:04:43 INFO mapreduce.Job: Running job: job_1467727275250_0002
16/07/05 16:04:46 INFO mapreduce.Job: Job job_1467727275250_0002 running in uber mode : false
16/07/05 16:04:46 INFO mapreduce.Job: map 0% reduce 0%
16/07/05 16:04:46 INFO mapreduce.Job: Job job_1467727275250_0002 failed with state FAILED due to: Application application_1467727275250_0002 failed 2 times due to AM Container for appattempt_1467727275250_0002_000002 exited with exitCode: 1
For more detailed output, check application tracking page:http://hadoopMaster:8088/cluster/app/application_1467727275250_0002Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1467727275250_0002_02_000001
Exit code: 1
Exception message: /bin/bash: Zeile 0: fg: Keine Job-Steuerung in dieser Shell.
Stack trace: ExitCodeException exitCode=1: /bin/bash: line 0: fg: no Job control in this Shell.
at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
at org.apache.hadoop.util.Shell.run(Shell.java:456)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
16/07/05 16:04:46 INFO mapreduce.Job: Counters: 0
Can someone help me please?
In mapred-site.xml, add these properties:
<property>
  <name>mapred.remote.os</name>
  <value>Linux</value>
  <description>Remote MapReduce framework's OS, can be either Linux or Windows</description>
</property>
<property>
  <name>mapreduce.app-submission.cross-platform</name>
  <value>true</value>
</property>
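If editing the cluster's mapred-site.xml is not convenient, the cross-platform switch can also be set from the Windows client code. A minimal sketch (the class name and job wiring are illustrative; mapreduce.app-submission.cross-platform is the standard Hadoop property):
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class CrossPlatformSubmit {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Generate container launch commands the Linux cluster can execute,
        // even though the submitting client runs on Windows.
        conf.set("mapreduce.app-submission.cross-platform", "true");
        Job job = Job.getInstance(conf, "my-job");
        // ... set jar, mapper, reducer and input/output paths as usual ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}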

Eureka Service fails with exception after client connects

I am following the example from here.
I also have the Eureka server running on localhost:8080.
As a next step I attempt to run the sample service, like this:
./gradlew :eureka-examples:runExampleService
Here is the output that I get:
$ ./gradlew :eureka-examples:runExampleService --stacktrace
Inferred project: eureka, version: 1.4.6-SNAPSHOT
Publication mavenNebula not found in project :.
[buildinfo] Properties file path was not found! (Relevant only for builds running on a CI Server)
Publication named 'mavenNebula' does not exist for project ':' in task ':artifactoryPublish'.
None of the specified publications matched for project ':' - nothing to publish.
:eureka-client:compileJava UP-TO-DATE
:eureka-client:processResources UP-TO-DATE
:eureka-client:classes UP-TO-DATE
:eureka-client:writeManifestProperties UP-TO-DATE
:eureka-client:jar
:eureka-examples:compileJava
warning: [options] bootstrap class path not set in conjunction with -source 1.7
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
1 warning
:eureka-examples:processResources UP-TO-DATE
:eureka-examples:classes
:eureka-examples:runExampleService
[main] WARN com.netflix.config.sources.URLConfigurationSource - No URLs will be polled as dynamic configuration sources.
[main] INFO com.netflix.config.sources.URLConfigurationSource - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
[main] INFO com.netflix.config.DynamicPropertyFactory - DynamicPropertyFactory is initialized with configuration sources: com.netflix.config.ConcurrentCompositeConfiguration#46ee7fe8
[main] INFO com.netflix.config.util.ConfigurationUtils - Loaded properties file file:/Users/lenok/Documents/Programming/Github/eureka/eureka-examples/conf/sample-eureka-service.properties
[main] WARN com.netflix.config.util.ConfigurationUtils - file:/Users/lenok/Documents/Programming/Github/eureka/eureka-examples/conf/sample-eureka-service.properties is already loaded
[main] INFO com.netflix.appinfo.providers.EurekaConfigBasedInstanceInfoProvider - Setting initial instance status as: STARTING
[main] INFO com.netflix.discovery.provider.DiscoveryJerseyProvider - Using JSON encoding codec LegacyJacksonJson
[main] INFO com.netflix.discovery.provider.DiscoveryJerseyProvider - Using JSON decoding codec LegacyJacksonJson
[main] INFO com.netflix.discovery.provider.DiscoveryJerseyProvider - Using XML encoding codec XStreamXml
[main] INFO com.netflix.discovery.provider.DiscoveryJerseyProvider - Using XML decoding codec XStreamXml
[main] INFO com.netflix.discovery.shared.resolver.aws.ConfigClusterResolver - Resolving eureka endpoints via configuration
[main] INFO com.netflix.discovery.DiscoveryClient - Disable delta property : false
[main] INFO com.netflix.discovery.DiscoveryClient - Single vip registry refresh property : null
[main] INFO com.netflix.discovery.DiscoveryClient - Force full registry fetch : false
[main] INFO com.netflix.discovery.DiscoveryClient - Application is null : false
[main] INFO com.netflix.discovery.DiscoveryClient - Registered Applications size is zero : true
[main] INFO com.netflix.discovery.DiscoveryClient - Application version is -1: true
[main] INFO com.netflix.discovery.DiscoveryClient - Getting all instance registry info from the eureka server
[main] INFO com.netflix.discovery.DiscoveryClient - The response status is 200
[main] INFO com.netflix.discovery.DiscoveryClient - Starting heartbeat executor: renew interval is: 30
[main] INFO com.netflix.discovery.InstanceInfoReplicator - InstanceInfoReplicator onDemand update allowed rate per min is 4
Registering service to eureka with STARTING status
Simulating service initialization by sleeping for 2 seconds...
Done sleeping, now changing status to UP
[main] INFO com.netflix.discovery.DiscoveryClient - Saw local status change event StatusChangeEvent [timestamp=1458260744041, current=UP, previous=STARTING]
Waiting ... verifying service registration with eureka ...
[DiscoveryClient-InstanceInfoReplicator-0] INFO com.netflix.discovery.DiscoveryClient - DiscoveryClient_SAMPLEREGISTERINGSERVICE/Alenas-MacBook-Pro.local: registering service...
[DiscoveryClient-InstanceInfoReplicator-0] INFO com.netflix.discovery.DiscoveryClient - DiscoveryClient_SAMPLEREGISTERINGSERVICE/Alenas-MacBook-Pro.local - registration status: 204
Waiting ... verifying service registration with eureka ...
Waiting ... verifying service registration with eureka ...
Service started and ready to process requests..
> Building 88% > :eureka-examples:runExampleService
After I start the client, like this:
$ ./gradlew :eureka-examples:runExampleClient
Inferred project: eureka, version: 1.4.6-SNAPSHOT
Publication mavenNebula not found in project :.
[buildinfo] Properties file path was not found! (Relevant only for builds running on a CI Server)
Publication named 'mavenNebula' does not exist for project ':' in task ':artifactoryPublish'.
None of the specified publications matched for project ':' - nothing to publish.
:eureka-client:compileJava UP-TO-DATE
:eureka-client:processResources UP-TO-DATE
:eureka-client:classes UP-TO-DATE
:eureka-client:writeManifestProperties UP-TO-DATE
:eureka-client:jar
:eureka-examples:compileJava
warning: [options] bootstrap class path not set in conjunction with -source 1.7
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
1 warning
:eureka-examples:processResources UP-TO-DATE
:eureka-examples:classes
:eureka-examples:runExampleClient
[main] WARN com.netflix.config.sources.URLConfigurationSource - No URLs will be polled as dynamic configuration sources.
[main] INFO com.netflix.config.sources.URLConfigurationSource - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
[main] INFO com.netflix.config.DynamicPropertyFactory - DynamicPropertyFactory is initialized with configuration sources: com.netflix.config.ConcurrentCompositeConfiguration#46ee7fe8
[main] INFO com.netflix.config.util.ConfigurationUtils - Loaded properties file file:/Users/lenok/Documents/Programming/Github/eureka/eureka-examples/conf/sample-eureka-client.properties
[main] WARN com.netflix.config.util.ConfigurationUtils - file:/Users/lenok/Documents/Programming/Github/eureka/eureka-examples/conf/sample-eureka-client.properties is already loaded
[main] INFO com.netflix.appinfo.providers.EurekaConfigBasedInstanceInfoProvider - Setting initial instance status as: STARTING
[main] INFO com.netflix.discovery.provider.DiscoveryJerseyProvider - Using JSON encoding codec LegacyJacksonJson
[main] INFO com.netflix.discovery.provider.DiscoveryJerseyProvider - Using JSON decoding codec JacksonJson
[main] INFO com.netflix.discovery.provider.DiscoveryJerseyProvider - Using XML encoding codec XStreamXml
[main] INFO com.netflix.discovery.provider.DiscoveryJerseyProvider - Using XML decoding codec XStreamXml
[main] INFO com.netflix.discovery.shared.resolver.aws.ConfigClusterResolver - Resolving eureka endpoints via configuration
[main] INFO com.netflix.discovery.DiscoveryClient - Disable delta property : false
[main] INFO com.netflix.discovery.DiscoveryClient - Single vip registry refresh property : null
[main] INFO com.netflix.discovery.DiscoveryClient - Force full registry fetch : false
[main] INFO com.netflix.discovery.DiscoveryClient - Application is null : false
[main] INFO com.netflix.discovery.DiscoveryClient - Registered Applications size is zero : true
[main] INFO com.netflix.discovery.DiscoveryClient - Application version is -1: true
[main] INFO com.netflix.discovery.DiscoveryClient - Getting all instance registry info from the eureka server
[main] INFO com.netflix.discovery.DiscoveryClient - The response status is 200
[main] INFO com.netflix.discovery.DiscoveryClient - Not registering with Eureka server per configuration
Found an instance of example service to talk to from eureka: sampleservice.mydomain.net:8001
healthCheckUrl: http://Alenas-MacBook-Pro.local:8001/healthcheck
override: UNKNOWN
Connected to server. Sending a sample request: FOO Thu Mar 17 17:11:33 PDT 2016
Waiting for server response..
Received response from server: BAR Thu Mar 17 17:11:33 PDT 2016
Exiting the client. Demo over..
BUILD SUCCESSFUL
Total time: 9.497 secs
It basically connects to the server, waits for the response, and so on.
At the same time, on the service terminal, I see the following. Sometimes when I do these actions I see a successful response on the Eureka service:
Client got connected... processing request from the client
Received a request from the example client: FOO Thu Mar 17 17:30:16 PDT 2016
Sending the response to the client: BAR Thu Mar 17 17:30:16 PDT 2016
Simulating service doing work by sleeping for 10 seconds...
Removing registration from eureka
[main] INFO com.netflix.discovery.DiscoveryClient - DiscoveryClient_SAMPLEREGISTERINGSERVICE/Alenas-MacBook-Pro.local - deregister status: 200
Shutting down server. Demo over.
BUILD SUCCESSFUL
Total time: 4 mins 53.331 secs
And sometimes it gives errors, like here:
Client got connected... processing request from the client
Received a request from the example client: FOO Thu Mar 17 17:33:30 PDT 2016
Sending the response to the client: BAR Thu Mar 17 17:33:30 PDT 2016
Simulating service doing work by sleeping for 10 seconds...
Removing registration from eureka
Exception in thread "main" java.lang.NoClassDefFoundError: com/netflix/discovery/shared/transport/decorator/EurekaHttpClientDecorator$2
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator.cancel(EurekaHttpClientDecorator.java:71)
at com.netflix.discovery.DiscoveryClient.unregister(DiscoveryClient.java:886)
at com.netflix.discovery.DiscoveryClient.shutdown(DiscoveryClient.java:869)
at com.netflix.eureka.ExampleServiceBase.stop(ExampleServiceBase.java:89)
at com.netflix.eureka.ExampleServiceBase.start(ExampleServiceBase.java:80)
at com.netflix.eureka.ExampleEurekaService.main(ExampleEurekaService.java:45)
Caused by: java.lang.ClassNotFoundException: com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator$2
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more
:eureka-examples:runExampleService FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':eureka-examples:runExampleService'.
> Process 'command '/Library/Java/JavaVirtualMachines/jdk1.8.0_73.jdk/Contents/Home/bin/java'' finished with non-zero exit value 1
* Try:
Run with --info or --debug option to get more log output.
* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':eureka-examples:runExampleService'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:69)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:46)
at org.gradle.api.internal.tasks.execution.PostExecutionAnalysisTaskExecuter.execute(PostExecutionAnalysisTaskExecuter.java:35)
at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:64)
at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:42)
at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:52)
at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:53)
at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
at org.gradle.api.internal.AbstractTask.executeWithoutThrowingTaskFailure(AbstractTask.java:305)
at org.gradle.execution.taskgraph.AbstractTaskPlanExecutor$TaskExecutorWorker.executeTask(AbstractTaskPlanExecutor.java:79)
at org.gradle.execution.taskgraph.AbstractTaskPlanExecutor$TaskExecutorWorker.processTask(AbstractTaskPlanExecutor.java:63)
at org.gradle.execution.taskgraph.AbstractTaskPlanExecutor$TaskExecutorWorker.run(AbstractTaskPlanExecutor.java:51)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor.process(DefaultTaskPlanExecutor.java:23)
at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter.execute(DefaultTaskGraphExecuter.java:88)
at org.gradle.execution.SelectedTaskExecutionAction.execute(SelectedTaskExecutionAction.java:29)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:62)
at org.gradle.execution.DefaultBuildExecuter.access$200(DefaultBuildExecuter.java:23)
at org.gradle.execution.DefaultBuildExecuter$2.proceed(DefaultBuildExecuter.java:68)
at org.gradle.execution.DryRunBuildExecutionAction.execute(DryRunBuildExecutionAction.java:32)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:62)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:55)
at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(DefaultGradleLauncher.java:149)
at org.gradle.initialization.DefaultGradleLauncher.doBuild(DefaultGradleLauncher.java:106)
at org.gradle.initialization.DefaultGradleLauncher.run(DefaultGradleLauncher.java:86)
at org.gradle.launcher.exec.InProcessBuildActionExecuter$DefaultBuildController.run(InProcessBuildActionExecuter.java:80)
at org.gradle.launcher.cli.ExecuteBuildAction.run(ExecuteBuildAction.java:33)
at org.gradle.launcher.cli.ExecuteBuildAction.run(ExecuteBuildAction.java:24)
at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:36)
at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:26)
at org.gradle.launcher.cli.RunBuildAction.run(RunBuildAction.java:51)
at org.gradle.internal.Actions$RunnableActionAdapter.execute(Actions.java:171)
at org.gradle.launcher.cli.CommandLineActionFactory$ParseAndBuildAction.execute(CommandLineActionFactory.java:237)
at org.gradle.launcher.cli.CommandLineActionFactory$ParseAndBuildAction.execute(CommandLineActionFactory.java:210)
at org.gradle.launcher.cli.JavaRuntimeValidationAction.execute(JavaRuntimeValidationAction.java:35)
at org.gradle.launcher.cli.JavaRuntimeValidationAction.execute(JavaRuntimeValidationAction.java:24)
at org.gradle.launcher.cli.CommandLineActionFactory$WithLogging.execute(CommandLineActionFactory.java:206)
at org.gradle.launcher.cli.CommandLineActionFactory$WithLogging.execute(CommandLineActionFactory.java:169)
at org.gradle.launcher.cli.ExceptionReportingAction.execute(ExceptionReportingAction.java:33)
at org.gradle.launcher.cli.ExceptionReportingAction.execute(ExceptionReportingAction.java:22)
at org.gradle.launcher.Main.doAction(Main.java:33)
at org.gradle.launcher.bootstrap.EntryPoint.run(EntryPoint.java:45)
at org.gradle.launcher.bootstrap.ProcessBootstrap.runNoExit(ProcessBootstrap.java:54)
at org.gradle.launcher.bootstrap.ProcessBootstrap.run(ProcessBootstrap.java:35)
at org.gradle.launcher.GradleMain.main(GradleMain.java:23)
at org.gradle.wrapper.BootstrapMainStarter.start(BootstrapMainStarter.java:30)
at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:127)
at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:56)
Caused by: org.gradle.process.internal.ExecException: Process 'command '/Library/Java/JavaVirtualMachines/jdk1.8.0_73.jdk/Contents/Home/bin/java'' finished with non-zero exit value 1
at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:365)
at org.gradle.process.internal.DefaultJavaExecAction.execute(DefaultJavaExecAction.java:31)
at org.gradle.api.tasks.JavaExec.exec(JavaExec.java:60)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:63)
at org.gradle.api.internal.project.taskfactory.AnnotationProcessingTaskFactory$StandardTaskAction.doExecute(AnnotationProcessingTaskFactory.java:218)
at org.gradle.api.internal.project.taskfactory.AnnotationProcessingTaskFactory$StandardTaskAction.execute(AnnotationProcessingTaskFactory.java:211)
at org.gradle.api.internal.project.taskfactory.AnnotationProcessingTaskFactory$StandardTaskAction.execute(AnnotationProcessingTaskFactory.java:200)
at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:579)
at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:562)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:80)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:61)
... 47 more
BUILD FAILED
Total time: 1 mins 1.267 secs
Please help me get the example running properly. What might be the problem?

Hadoop wordcount Pseudodistributed mode error Exit code:127

I have installed Hadoop 2.7.1 (stable version), following Tom White's book for installation in pseudo-distributed mode. I set all the environment variables (JAVA_HOME, HADOOP_HOME, PATH, etc.) and configured yarn-site.xml, hdfs-site.xml, core-site.xml, and mapred-site.xml.
I copied the sample file file.txt using the following command:
$hadoop fs -copyFromLocal textFiles/file.txt file.txt
which shows me
Found 2 items
-rw-r--r-- 1 RAMA supergroup 3737 2015-12-27 21:52 file.txt
drwxr-xr-x - RAMA supergroup 0 2015-12-27 22:17 input
When I execute the wordcount program in hadoop-mapreduce-examples-2.7.1.jar using the command below:
RAMAs-MacBook-Pro:hadoop-2.7.1 RAMA$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar wordcount file.txt output
it throws the following exception, for which I have not been able to find a workable solution:
15/12/27 22:41:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/12/27 22:41:53 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:8032
15/12/27 22:41:53 INFO input.FileInputFormat: Total input paths to process : 1
15/12/27 22:41:53 INFO mapreduce.JobSubmitter: number of splits:1
15/12/27 22:41:53 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1451216397139_0020
15/12/27 22:41:54 INFO impl.YarnClientImpl: Submitted application application_1451216397139_0020
15/12/27 22:41:54 INFO mapreduce.Job: The url to track the job: http://192.168.1.6:8088/proxy/application_1451216397139_0020/
15/12/27 22:41:54 INFO mapreduce.Job: Running job: job_1451216397139_0020
15/12/27 22:41:57 INFO mapreduce.Job: Job job_1451216397139_0020 running in uber mode : false
15/12/27 22:41:57 INFO mapreduce.Job: map 0% reduce 0%
15/12/27 22:41:57 INFO mapreduce.Job: Job job_1451216397139_0020 failed with state FAILED due to: Application application_1451216397139_0020 failed 2 times due to AM Container for appattempt_1451216397139_0020_000002 exited with exitCode: 127
For more detailed output, check application tracking page:http://192.168.1.6:8088/cluster/app/application_1451216397139_0020Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1451216397139_0020_02_000001
Exit code: 127
Stack trace: ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
at org.apache.hadoop.util.Shell.run(Shell.java:456)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 127
Failing this attempt. Failing the application.
15/12/27 22:41:57 INFO mapreduce.Job: Counters: 0
Any suggestions or comments would be a great help.
In hadoop-env.sh, explicitly set the JAVA_HOME variable to your Java home path.
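A sketch of that fix for a macOS setup like the one in the question (/usr/libexec/java_home is the standard macOS helper for locating the JDK; a hard-coded JDK path works just as well):
# etc/hadoop/hadoop-env.sh
export JAVA_HOME=$(/usr/libexec/java_home)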

using hive got exception java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning

After configuring Hadoop, I could run HDFS. I then installed Hive and edited the conf file to make it run on Tez by default, but I am running into an issue when using hive directly:
hive
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:353)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Version info:
hadoop: 2.5
hive: 0.13
tez: 0.41
Has anyone met this before?
It does not seem like a PATH-related error.
My problem is:
Could not open connection to jdbc:hive2://localhost:10000: java.net.ConnectException: Connection refused (state=08S01,code=0)
Here is my solution and the process I went through to figure it out:
When running the command below to start hiveserver2:
hive --service hiveserver2
the log shows the exception:
Error starting HiveServer2 on attempt 1, will retry in 60000ms
java.lang.NoClassDefFoundError: org/apache/tez/dag/api/TezConfiguration
But continuing to look up the log, we find that:
2018-11-16T18:45:14,836 INFO [main] server.HiveServer2: HS2 interactive HA not enabled. Starting tez sessions..
2018-11-16T18:45:14,836 INFO [main] server.HiveServer2: Starting/Reconnecting tez sessions..
So the reason is that the default setting disables the HS2 interactive HA configuration.
Just change it to true in hive-site.xml to fix this:
<property>
  <name>hive.server2.active.passive.ha.enable</name>
  <!-- change the default false to true -->
  <value>true</value>
</property>
Problem solved!
Part of the logs (related to Tez):
2018-11-16T18:45:14,835 INFO [main] server.HiveServer2: Web UI has started on port 10002
2018-11-16T18:45:14,836 INFO [main] server.HiveServer2: HS2 interactive HA not enabled. Starting tez sessions..
2018-11-16T18:45:14,836 INFO [main] server.HiveServer2: Starting/Reconnecting tez sessions..
2018-11-16T18:45:14,836 INFO [main] server.HiveServer2: Initializing tez session pool manager
2018-11-16T18:45:14,847 INFO [main] server.HiveServer2: Shutting down HiveServer2
2018-11-16T18:45:14,847 INFO [main] service.AbstractService: Service:ThriftBinaryCLIService is stopped.
2018-11-16T18:45:14,847 INFO [main] service.AbstractService: Service:OperationManager is stopped.
2018-11-16T18:45:14,848 INFO [main] service.AbstractService: Service:SessionManager is stopped.
2018-11-16T18:45:14,850 INFO [main] service.AbstractService: Service:CLIService is stopped.
2018-11-16T18:45:14,850 INFO [main] service.AbstractService: Service:HiveServer2 is stopped.
2018-11-16T18:45:14,847 INFO [main] thrift.ThriftCLIService: Thrift server has stopped
2018-11-16T18:45:14,866 INFO [main] server.HiveServer2: Stopping/Disconnecting tez sessions.
2018-11-16T18:45:14,866 INFO [main] server.HiveServer2: Stopped tez session pool manager.
2018-11-16T18:45:14,869 WARN [main] server.HiveServer2: Error starting HiveServer2 on attempt 1, will retry in 60000ms
java.lang.NoClassDefFoundError: org/apache/tez/dag/api/TezConfiguration
at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolSession$AbstractTriggerValidator.startTriggerValidator(TezSessionPoolSession.java:74) ~[hive-exec-3.1.1.jar:3.1.1]
I temporarily solved this by adding a hiveconf option to force Hive to use the MR engine instead of Tez, like this:
hive -hiveconf hive.execution.engine=mr -e "my sql"
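To make that fallback persistent rather than per-invocation, the same property can be set in hive-site.xml (a sketch using the standard hive.execution.engine property):
<property>
  <name>hive.execution.engine</name>
  <value>mr</value>
</property>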
But as I want to use Tez, could anyone help?
For me, I had to add the directories below to the classpath. This is MapR-specific, but other distros will have a similar path:
/opt/mapr/tez/tez-0.9/lib/*:/opt/mapr/tez/tez-0.9/*
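One way to apply that, assuming the classpath is set via the shell that launches hive (the exact file depends on your environment, e.g. hadoop-env.sh or the launching user's profile):
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/opt/mapr/tez/tez-0.9/lib/*:/opt/mapr/tez/tez-0.9/*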
