Hive startup - [ERROR] Terminal initialization failed; falling back to unsupported - java

I have downloaded Hive and modified HADOOP_HOME to
HADOOP_HOME=${bin}/../../usr/local/hadoop
My actual Hadoop path is
/usr/local/hadoop
In .bashrc I have added the following environment variables:
export HIVE_HOME=/usr/lib/hive/apache-hive-1.1.0-bin
export PATH=$PATH:$HIVE_HOME/bin
export CLASSPATH=$CLASSPATH:/usr/local/Hadoop/lib/*:.
export CLASSPATH=$CLASSPATH:/usr/local/hive/lib/*:.
Then I tried starting Hive using bin/hive and got the error below:
Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-common-1.1.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:158)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.console.ConsoleReader.<init>(ConsoleReader.java:230)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

I had the same problem and got it working thanks to this link:
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
Hive has upgraded to JLine2, but jline 0.9.x still exists in the Hadoop lib directory.
So you should follow these steps:
Delete jline from the Hadoop lib directory (it's only pulled in transitively from ZooKeeper).
export HADOOP_USER_CLASSPATH_FIRST=true
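To locate the stale jar for the first step, a quick sketch (the exact path varies by Hadoop distribution, so check yours first):
# find the old jline jar that ZooKeeper pulls in
$ find $HADOOP_HOME -name 'jline-*.jar'
Add the export to your shell profile (e.g. .bashrc) if you want it to persist across sessions.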

Try again after removing the jline-0.9.94.jar file at $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar.
Here's the link to the JIRA ticket:
https://issues.apache.org/jira/browse/HIVE-8609
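For example (a sketch; consider moving the jar aside rather than deleting it outright, so you can roll back):
$ mv $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar ~/jline-0.9.94.jar.bak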

I just set
HADOOP_USER_CLASSPATH_FIRST=true
and it fixed this issue for me.

Try deleting one of these two files:
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
Then SLF4J will have only one binding left to choose, and the multiple-bindings warning will go away.
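For instance, a sketch that removes the Hadoop-side binding while keeping a backup:
$ mv /usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar ~/slf4j-log4j12-1.7.5.jar.bak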

I had the same problem with Cloudera CDH 5.4. Removing jline-0.9.94.jar from the yarn/lib folder worked for me.

I had the same issue with Hadoop 2.4.1 and Hive 1.2.0. After setting HADOOP_USER_CLASSPATH_FIRST=true in .bashrc,
it worked like a charm!

Run locate jline.
The file jline-0.9.94.jar shows up in three locations; delete it from all three and do the necessary export:
$ export HADOOP_USER_CLASSPATH_FIRST=true
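A sketch that removes every copy found by locate (this assumes the locate database is current; run updatedb first if it is not):
$ for jar in $(locate jline-0.9.94.jar); do sudo rm "$jar"; done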

With the newer version of Hive (1.2.1), I just had to replace jline-2.12.jar with jline-2.13.jar in the installation's lib folder.
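A sketch of that swap, assuming the jline:jline:2.13 artifact on Maven Central (verify the URL and version for your setup before downloading):
$ cd $HIVE_HOME/lib
$ mv jline-2.12.jar jline-2.12.jar.bak
$ wget https://repo1.maven.org/maven2/jline/jline/2.13/jline-2.13.jar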

Check this link; it might help you: Facing issue while running hive from CLI

You should set the Hadoop classpath variable in your shell profile:
$ vi ~/.bashrc
# add this line inside .bashrc:
export HADOOP_USER_CLASSPATH_FIRST=true
$ source ~/.bashrc
$ hive

I renamed
addJava "-Djline.terminal=jline.UnixTerminal"
to
addJava "-Djline.terminal=jline2.UnixTerminal"
in the "activator" file.

Related

Apache Ignite - error 'JavaLoggerFileHandler' when running Spark shell

I am starting spark-shell (Spark 2.2) with a bunch of jars added on the command line (from the Ignite 2.1 directory), and I am still getting the error:
Can't load log handler "org.apache.ignite.logger.java.JavaLoggerFileHandler"
I also followed the recommendation from here:
https://apacheignite.readme.io/v1.2/docs/installation--deployment
# Optionally set IGNITE_HOME here.
# IGNITE_HOME=/path/to/ignite
IGNITE_LIBS="${IGNITE_HOME}/libs/*"
for file in ${IGNITE_HOME}/libs/*
do
    if [ -d ${file} ] && [ "${file}" != "${IGNITE_HOME}"/libs/optional ]; then
        IGNITE_LIBS=${IGNITE_LIBS}:${file}/*
    fi
done
export SPARK_CLASSPATH=$IGNITE_LIBS
I also set the logging level to ERROR, but I still get the error:
Can't load log handler "org.apache.ignite.logger.java.JavaLoggerFileHandler"
java.lang.ClassNotFoundException: org.apache.ignite.logger.java.JavaLoggerFileHandler
java.lang.ClassNotFoundException: org.apache.ignite.logger.java.JavaLoggerFileHandler
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.util.logging.LogManager$5.run(LogManager.java:965)
at java.security.AccessController.doPrivileged(Native Method)
It looks like you are using the documentation for the old Ignite version 1.2 while you are running Ignite 2.1. Check the documentation for the latest version here: https://apacheignite-fs.readme.io/v2.2/docs/installation-deployment
Also, please make sure that you have configured IGNITE_HOME in your environment. JavaLoggerFileHandler is placed in the ignite-core module; it looks like the Spark classpath doesn't see any Ignite libs at all.
The documentation describes the issue here:
https://apacheignite-fs.readme.io/v2.2/docs/troubleshooting
This issue appears when you do not have any loggers included in the classpath and Ignite tries to use standard Java logging. By default, Spark loads all user jar files using a separate class loader. The Java logging framework, on the other hand, uses the application class loader to initialize log handlers. To resolve this, you can either add the ignite-log4j module to the list of used jars, so that Ignite uses Log4j as its logging subsystem, or alter the default Spark classpath as described in that documentation.
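A sketch of the first option, adding the ignite-log4j module when launching spark-shell (the jar names and the libs/optional layout are assumptions based on a stock Ignite 2.x binary distribution; adjust the versions to match yours):
$ spark-shell --jars ${IGNITE_HOME}/libs/ignite-core-2.1.0.jar,${IGNITE_HOME}/libs/optional/ignite-log4j/ignite-log4j-2.1.0.jar,${IGNITE_HOME}/libs/optional/ignite-log4j/log4j-1.2.17.jar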

Maven1.x: javax.xml.parsers.FactoryConfigurationError: Provider org.apache.xerces.jaxp.SAXParserFactoryImpl not found

After updating the OS from Windows 7 to Windows 10, I'm facing the following error:
Apache Maven
intelligent projects ~
v. 1.0.2
javax.xml.parsers.FactoryConfigurationError: Provider org.apache.xerces.jaxp.SAXParserFactoryImpl not found
at javax.xml.parsers.SAXParserFactory.newInstance(SAXParserFactory.java:93)
at org.apache.maven.jelly.JellyUtils.compileScript(JellyUtils.java:202)
at org.apache.maven.jelly.JellyUtils.compileScript(JellyUtils.java:180)
at org.apache.maven.jelly.JellyUtils.compileScript(JellyUtils.java:146)
at org.apache.maven.plugin.PluginManager.loadScript(PluginManager.java:1109)
at org.apache.maven.plugin.PluginManager.runScript(PluginManager.java:1135)
at org.apache.maven.plugin.PluginManager.initialiseHousingPluginContext(PluginManager.java:770)
at org.apache.maven.plugin.PluginManager.prepAttainGoal(PluginManager.java:725)
at org.apache.maven.plugin.PluginManager.attainGoals(PluginManager.java:656)
at org.apache.maven.MavenSession.attainGoals(MavenSession.java:263)
at org.apache.maven.cli.App.doMain(App.java:488)
at org.apache.maven.cli.App.main(App.java:1239)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:324)
at com.werken.forehead.Forehead.run(Forehead.java:551)
at com.werken.forehead.Forehead.main(Forehead.java:581)
You have encountered an unknown error running Maven. Please help us to correct
this problem by following these simple steps:
- read the Maven FAQ at http://maven.apache.org/faq.html
- run the same command again with the '-e' parameter, eg maven -e jar
As my application uses Maven 1 and JDK 1.4, I'm running the eclipse:generate-classpath command.
I also tried adding the xerces and xercesImpl jars via Maven, and placed both jars in the C:\Dev\Jdks\jdk1.8.0_31\jre\lib\ext directory, but I'm still getting the same error. Can anyone please help? I'll be grateful. Thanks in advance.

How to resolve NoClassDefFoundError while using Hadoop?

I am getting
Exception in thread "main" java.lang.NoClassDefFoundError: com/linkedin/camus/etl/IEtlKey
when running the command:
hadoop jar camus-etl-kafka-0.1.0-SNAPSHOT.jar com.linkedin.camus.etl.kafka.CamusJob -P camus.properties
The full output is below:
2016-04-27 11:34:04.622 java[13567:351959] Unable to load realm mapping info from SCDynamicStore
[NativeCodeLoader] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: com/linkedin/camus/etl/IEtlKey
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:252)
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:235)
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:691)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at com.linkedin.camus.etl.kafka.CamusJob.main(CamusJob.java:646)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.linkedin.camus.etl.IEtlKey
I have included camus-example-0.1.0-SNAPSHOT-shaded.jar in the classpath.
Please let me know if I am missing something.
Thanks in advance,
Soumyajit
You should try including camus-api, which you can find on LinkedIn's "previous generation Kafka to HDFS pipeline" page, since the missing class is contained in that package, as you can see here.
Pay attention to other transitive dependencies that may be required by Camus.
In addition, to be sure that the classes are found on the classpath when you run hadoop jar from the command line, you can add the -libjars command-line option, as described in "Using the libjars option with Hadoop":
$ export LIBJARS=/path/jar1,/path/jar2
$ hadoop jar my-example.jar com.example.MyTool -libjars ${LIBJARS} -mytoolopt value
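Applied to the command from the question, that would look something like this (the camus-api jar name here is an assumption; point LIBJARS at whichever artifact actually contains com.linkedin.camus.etl.IEtlKey):
$ export LIBJARS=/path/to/camus-api-0.1.0-SNAPSHOT.jar
$ hadoop jar camus-etl-kafka-0.1.0-SNAPSHOT.jar com.linkedin.camus.etl.kafka.CamusJob -libjars ${LIBJARS} -P camus.properties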
It could be useful to know that Camus is going to be superseded by Gobblin:
"Camus is being phased out and replaced by Gobblin. For those using or interested in Camus, we suggest taking a look at Gobblin."
For instructions on migrating from Camus to Gobblin, please take a look at Camus Gobblin Migration.

ClassNotFoundException is thrown when running ExportSnapshot

I'm getting a confusing ClassNotFoundException when I try to run ExportSnapshot from my HBase master node. hbase shell and other commands work just fine, and my cluster is fully operational.
This feels like a classpath issue, but I don't know what I'm missing.
$ /usr/bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot ambarismoketest-snapshot -copy-to hdfs://10.0.1.90/apps/hbase/data -mappers 16
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2015-10-13 20:05:02,339 INFO [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
2015-10-13 20:05:04,351 INFO [main] util.FSVisitor: No logs under directory:hdfs://cinco-de-nameservice/apps/hbase/data/.hbase-snapshot/impression_event_production_hbase-transfer-to-staging-20151013/WALs
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/Job
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.runCopyJob(ExportSnapshot.java:529)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:646)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:697)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:701)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.Job
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 5 more
Problem
It turns out this happens because the MapReduce2 JARs are not available on the classpath. The classpath itself was set up properly, but I did not have the MapReduce2 client installed on that node. HBase's ExportSnapshot apparently depends on those client JARs when exporting snapshots to another cluster, because it writes to HDFS.
Fix
If you use Ambari:
Load Ambari UI
Pull up node where you were running the ExportSnapshot from and getting the above error
Under "components", click "Add"
Click "Mapreduce 2 client"
Background
There's a ticket here, https://issues.apache.org/jira/browse/HBASE-9687, titled "ClassNotFoundException is thrown when ExportSnapshot runs against hadoop cluster where HBase is not installed on the same node as resourcemanager". The title implies that installing the ResourceManager is the fix, and that may work; however, the crux is that you need the Hadoop MapReduce2 JARs on the classpath, and you can get them by simply installing the MapReduce2 client.
For us specifically, the reason our snapshot exports worked one day and broke the next is that our HBase master failed over because of an unrelated issue. Our backup HBase master did not have the MapReduce2 client JARs, but the original primary master did.

Mahout - Error when trying out the Wikipedia examples

Note: this post is similar to
Caused by: java.lang.ClassNotFoundException: classpath
but with a different error message.
I am trying the Wikipedia Bayes example from https://cwiki.apache.org/confluence/display/MAHOUT/Wikipedia+Bayes+Example
When I run the following command:
lis-macbook-pro:mahout-distribution-0.8 Li$ mahout wikipediaXMLSplitter -d examples/temp/enwiki-latest-pages-articles10.xml -o wikipedia/chunks -c 64
I get this error message:
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Li/File/Java/mahout-distribution-0.8/examples/target/mahout-examples-0.8-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Li/File/Java/mahout-distribution-0.8/examples/target/dependency/slf4j-jcl-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JCLLoggerFactory]
Oct 21, 2013 4:25:47 PM org.slf4j.impl.JCLLoggerAdapter warn
WARNING: Unable to add class: wikipediaXMLSplitter
java.lang.ClassNotFoundException: wikipediaXMLSplitter
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:171)
at org.apache.mahout.driver.MahoutDriver.addClass(MahoutDriver.java:236)
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:127)
Oct 21, 2013 4:25:47 PM org.slf4j.impl.JCLLoggerAdapter warn
WARNING: No wikipediaXMLSplitter.props found on classpath, will use command-line arguments only
Unknown program 'wikipediaXMLSplitter' chosen.
I am using Hadoop 1.2 and Mahout 0.8.
mahout-distribution-0.8/bin has been added to $PATH.
$MAHOUT_LOCAL is set to "true", so it runs locally.
I don't know why I get "Unable to add class: wikipediaXMLSplitter".
To the original question: the reason you are seeing the error is that there is no entry for wikipediaXmlSplitter in $MAHOUT_HOME/src/conf/driver.classes.default.props. Add the following line to that file:
org.apache.mahout.text.wikipedia.WikipediaXmlSplitter = wikipediaXmlSplitter : wikipedia splitter
You should now be able to invoke it via:
mahout wikipediaXmlSplitter
Note the case sensitivity in 'wikipediaXmlSplitter': the Mahout wiki erroneously spelled it 'wikipediaXMLSplitter' as opposed to 'wikipediaXmlSplitter'; this has since been fixed on the new Mahout website at http://mahout.apache.org/users/classification/wikipedia-bayes-example.html
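With that entry in place, the command from the question needs only the corrected casing (the arguments are the question's own):
$ mahout wikipediaXmlSplitter -d examples/temp/enwiki-latest-pages-articles10.xml -o wikipedia/chunks -c 64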
Alternatively, you could try using the fully qualified class name
org.apache.mahout.text.wikipedia.WikipediaXmlSplitter
rather than just wikipediaXmlSplitter.
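For example, reusing the question's arguments:
$ mahout org.apache.mahout.text.wikipedia.WikipediaXmlSplitter -d examples/temp/enwiki-latest-pages-articles10.xml -o wikipedia/chunks -c 64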
