Note: this post is similar to
Caused by: java.lang.ClassNotFoundException: classpath
but with a different error message.
I am trying to run the Wikipedia Bayes Example from https://cwiki.apache.org/confluence/display/MAHOUT/Wikipedia+Bayes+Example
When I ran the following command:
lis-macbook-pro:mahout-distribution-0.8 Li$ mahout wikipediaXMLSplitter -d examples/temp/enwiki-latest-pages-articles10.xml -o wikipedia/chunks -c 64
I got this error message:
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Li/File/Java/mahout-distribution-0.8/examples/target/mahout-examples-0.8-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Li/File/Java/mahout-distribution-0.8/examples/target/dependency/slf4j-jcl-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JCLLoggerFactory]
Oct 21, 2013 4:25:47 PM org.slf4j.impl.JCLLoggerAdapter warn
WARNING: Unable to add class: wikipediaXMLSplitter
java.lang.ClassNotFoundException: wikipediaXMLSplitter
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:171)
at org.apache.mahout.driver.MahoutDriver.addClass(MahoutDriver.java:236)
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:127)
Oct 21, 2013 4:25:47 PM org.slf4j.impl.JCLLoggerAdapter warn
WARNING: No wikipediaXMLSplitter.props found on classpath, will use command-line arguments only
Unknown program 'wikipediaXMLSplitter' chosen.
I am using Hadoop 1.2 and Mahout 0.8.
mahout-distribution-0.8/bin has been added to $PATH.
$MAHOUT_LOCAL is set to "True", so it runs locally.
I don't know why I got "Unable to add class: wikipediaXMLSplitter".
To answer the original question: the reason you are seeing the error is that there is no entry for wikipediaXmlSplitter in $MAHOUT_HOME/src/conf/driver.classes.default.props. Add the following line to that file:
org.apache.mahout.text.wikipedia.WikipediaXmlSplitter = wikipediaXmlSplitter : wikipedia splitter
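For example, you could append the entry from a shell like this (a minimal sketch, assuming $MAHOUT_HOME points at your Mahout checkout; the conf path may differ in a binary distribution):
# Append the missing driver entry to the driver props file
echo 'org.apache.mahout.text.wikipedia.WikipediaXmlSplitter = wikipediaXmlSplitter : wikipedia splitter' >> "$MAHOUT_HOME/src/conf/driver.classes.default.props"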
You should now be able to invoke it via:
mahout wikipediaXmlSplitter
Note the case sensitivity of 'wikipediaXmlSplitter'.
There was an error on the Mahout wiki, which read 'wikipediaXMLSplitter' instead of 'wikipediaXmlSplitter'; this has since been fixed on the new Mahout website at http://mahout.apache.org/users/classification/wikipedia-bayes-example.html
You could also try using the fully qualified class name
org.apache.mahout.text.wikipedia.WikipediaXmlSplitter
rather than just WikipediaXmlSplitter.
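For instance, a hedged sketch of such an invocation (the Mahout driver falls back to loading the program name as a class, so no props entry should be needed):
# Run the splitter by its full class name instead of the short driver alias
mahout org.apache.mahout.text.wikipedia.WikipediaXmlSplitter -d examples/temp/enwiki-latest-pages-articles10.xml -o wikipedia/chunks -c 64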
Related
I encountered a problem when I tried to run a simple REST web service.
Information:
Java platform: 1.6.0_38
Oracle IDE: 11.1.1.9.40.71.67
Below is the error message:
<Apr 18, 2018 7:28:24 PM PDT> <Warning> <Deployer> <BEA-149078> <Stack trace for message 149004
weblogic.application.ModuleException: Failed to load webapp: 'lib-RESTWebService-context-root'
at weblogic.servlet.internal.WebAppModule.prepare(WebAppModule.java:395)
at weblogic.application.internal.flow.ScopedModuleDriver.prepare(ScopedModuleDriver.java:176)
at weblogic.application.internal.flow.ModuleListenerInvoker.prepare(ModuleListenerInvoker.java:199)
at weblogic.application.internal.flow.DeploymentCallbackFlow$1.next(DeploymentCallbackFlow.java:517)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:52)
Truncated. see log file for complete stacktrace
Caused By: java.lang.ClassNotFoundException: oracle.adfinternal.view.faces.taglib.region.RichDeclarativeComponentTag
at weblogic.utils.classloaders.GenericClassLoader.findLocalClass(GenericClassLoader.java:297)
at weblogic.utils.classloaders.GenericClassLoader.findClass(GenericClassLoader.java:270)
at weblogic.utils.classloaders.ChangeAwareClassLoader.findClass(ChangeAwareClassLoader.java:64)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Truncated. see log file for complete stacktrace
I have searched a lot but failed to find a solution. Could you please give your advice on it?
Thanks!
This particular class,
oracle.adfinternal.view.faces.taglib.region.RichDeclarativeComponentTag
is part of the ADF Faces Runtime 11 library. Add that library to your project.
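If you are unsure which JAR on disk actually provides that class, a quick hedged check from a shell (the search root below is a placeholder; adjust it to your middleware home) is:
# Scan candidate JARs for the missing tag class
for jar in $(find /path/to/oracle_middleware -name '*.jar'); do
  unzip -l "$jar" 2>/dev/null | grep -q 'RichDeclarativeComponentTag' && echo "$jar"
done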
Also make sure to add the
javax.faces.context.ExternalContext
class, which is part of the Java EE 1.5 API library.
Also add the JSF 1.2 library to your project.
I have installed Apache Spark 2.1.1 on Windows 10, with Java 1.8 and Python 3.6 (Anaconda 4.3.1). I have also downloaded winutils.exe, set up the environment variables JAVA_HOME, HADOOP_HOME and SPARK_HOME, and updated the Path variable. I have also run winutils.exe chmod -R 777 \tmp\hive. But I am getting the error below when running pyspark at the cmd prompt.
Please can someone help? Let me know if I missed any important detail.
Thanks in advance!
c:\Spark>bin\pyspark
Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
File "c:\Spark\python\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "c:\Spark\python\lib\py4j-0.10.4-src.zip\py4j\protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o22.sessionState.
: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
I still get errors when launching spark-shell, but it looks like Spark launches, since I get the "Welcome to Spark" piece. The error I get is:
C:\Spark>bin\spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar."
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
... 47 elided
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 58 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 63 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 71 more
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
... 76 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:478)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:305)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:639)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:561)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
... 84 more
14: error: not found: value spark
import spark.implicits._
^
14: error: not found: value spark
import spark.sql
^
Welcome to
The setup that worked for me is as follows (I didn't use winutils.exe):
Install pyspark and findspark using the Anaconda Command Prompt:
pip3 install pyspark
and
pip3 install findspark
Since you have already downloaded the Spark setup, unzip it and keep it in the C: drive, i.e. C:\spark-2.2.0-bin-hadoop2.7, and create a new environment variable SPARK_HOME set to C:\spark-2.2.0-bin-hadoop2.7\bin. Then open the Path variable in the system variables and add the same there as well.
Now open your command prompt, go from C:\Users\* up to C:\ by running cd.. twice, and run the following command
set SPARK_HOME='spark-2.2.0-bin-hadoop2.7'
and you are good to go.
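Note that set only affects the current command prompt session; to persist the variable across sessions you could instead use Windows' setx (a hedged alternative; the new value only shows up in freshly opened prompts):
REM Persist SPARK_HOME for future command prompt sessions
setx SPARK_HOME "C:\spark-2.2.0-bin-hadoop2.7"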
Now you just have to set the Spark file location before importing pyspark in your Jupyter notebook. Use the following code:
import findspark
findspark.init(r'C:\spark-2.2.0-bin-hadoop2.7')  # raw string so the backslashes are not treated as escape sequences
import pyspark
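After that, a quick smoke test from a prompt (a sketch; adjust the Spark path to wherever you unzipped it):
# Verify findspark locates the install and pyspark imports cleanly
python -c "import findspark; findspark.init('C:/spark-2.2.0-bin-hadoop2.7'); import pyspark; print(pyspark.__version__)"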
I'm getting a confusing ClassNotFoundException when I try to run ExportSnapshot from my HBase master node. hbase shell and other commands work just fine, and my cluster is fully operational.
This feels like a classpath issue, but I don't know what I'm missing.
$ /usr/bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot ambarismoketest-snapshot -copy-to hdfs://10.0.1.90/apps/hbase/data -mappers 16
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2015-10-13 20:05:02,339 INFO [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
2015-10-13 20:05:04,351 INFO [main] util.FSVisitor: No logs under directory:hdfs://cinco-de-nameservice/apps/hbase/data/.hbase-snapshot/impression_event_production_hbase-transfer-to-staging-20151013/WALs
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/Job
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.runCopyJob(ExportSnapshot.java:529)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:646)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:697)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:701)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.Job
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 5 more
Problem
It turns out this happens because the mapreduce2 JARs are not available on the classpath. The classpath itself was set up properly, but I did not have the mapreduce2 client installed on that node. HBase's ExportSnapshot apparently depends on those client JARs when exporting snapshots to another cluster, because it writes to HDFS.
Fix
If you use Ambari:
Load Ambari UI
Pull up the node where you were running ExportSnapshot and getting the above error
Under "components", click "Add"
Click "Mapreduce 2 client"
Background
There's a ticket here, https://issues.apache.org/jira/browse/HBASE-9687, titled "ClassNotFoundException is thrown when ExportSnapshot runs against hadoop cluster where HBase is not installed on the same node as resourcemanager". The title implies that installing the resource manager is the fix, and that may work; however, the crux is that you need the hadoop mapreduce2 JARs on the classpath, and you can get that by simply installing the mapreduce2 client.
For us, specifically, the reason our snapshot exports were working one day and broken the next is that our HBase master switched on us because of another issue we had. Our backup HBase master did not have the mapreduce2 client JARs, but the original primary master did.
I have downloaded Hive and modified HADOOP_HOME to
HADOOP_HOME=${bin}/../../usr/local/hadoop
My actual Hadoop path is
/usr/local/hadoop
In .bashrc I have added the environment variables below:
export HIVE_HOME=/usr/lib/hive/apache-hive-1.1.0-bin
export PATH=$PATH:$HIVE_HOME/bin
export CLASSPATH=$CLASSPATH:/usr/local/Hadoop/lib/*:.
export CLASSPATH=$CLASSPATH:/usr/local/hive/lib/*:.
Then I tried starting Hive using bin/hive and got the error below:
Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-common-1.1.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:158)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.console.ConsoleReader.<init>(ConsoleReader.java:230)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
I had the same problem and got it working from this link:
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
Hive has upgraded to jline2, but jline 0.9x still exists in the Hadoop lib.
So you should follow these steps:
Delete jline from the Hadoop lib directory (it's only pulled in transitively from ZooKeeper).
export HADOOP_USER_CLASSPATH_FIRST=true
Try again after removing the jline-0.9.94.jar file at $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar.
Here's the link to the JIRA ticket:
https://issues.apache.org/jira/browse/HIVE-8609
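Putting those two steps together, a minimal sketch (assuming the usual Hadoop layout; verify the paths on your system first):
# Locate the stale jline pulled in via ZooKeeper
find "$HADOOP_HOME" -name 'jline-0.9*.jar'
# Move it aside rather than deleting outright
mv "$HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar" /tmp/
# Let Hive's jline2 win on the classpath
export HADOOP_USER_CLASSPATH_FIRST=true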
I just set
HADOOP_USER_CLASSPATH_FIRST=true
and it fixed this issue for me.
Try deleting one of these files:
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
Then I think SLF4J will bind to only one, so the multiple-bindings warning will no longer appear.
I had the same problem with Cloudera CDH 5.4. Removing jline-0.9.94.jar from the yarn lib folder worked for me.
With Hadoop 2.4.1 and Hive 1.2.0 I had the same issue. After setting HADOOP_USER_CLASSPATH_FIRST=true in .bashrc,
it worked like a charm!
Run locate jline.
The file jline-0.9.94.jar exists in three locations; delete it from all three and do the necessary export:
$ export HADOOP_USER_CLASSPATH_FIRST=true
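As a one-liner sketch of that cleanup (double-check what locate returns before deleting anything):
# Remove every stale jline-0.9.94.jar that locate finds
locate jline-0.9.94.jar | xargs -r rm -v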
With the new version of Hive (1.2.1), I just had to replace jline-2.12.jar with jline-2.13.jar in the installation / lib folder.
Check this link, it might help you: Facing issue while running hive from CLI
You should set the Hadoop classpath variable in your environment:
$ vi ~/.bashrc
$ export HADOOP_USER_CLASSPATH_FIRST=true
$ source .bashrc
$ hive
I renamed
addJava "-Djline.terminal=jline.UnixTerminal"
to
addJava "-Djline.terminal=jline2.UnixTerminal"
in the "activator" file
Good day,
I am working my way through the official JavaFX FXML tutorial (see source code here). However, when I compile it using the NetBeans IDE I get the following error.
Can anyone help me with this?
I'm running JDK 1.7 and JavaFX 2.0.
init:
Deleting: C:\Users\riash\Documents\Riaz\Personal\Java\Samples\FXMLExample\build\built-jar.properties
deps-jar:
Updating property file: C:\Users\riash\Documents\Riaz\Personal\Java\Samples\FXMLExample\build\built-jar.properties
compile:
Detected JavaFX Ant API version 1.1
Launching task from C:\Program Files (x86)\Oracle\JavaFX 2.0 SDK\tools\ant-javafx.jar
Signing JAR: C:\Users\riash\Documents\Riaz\Personal\Java\Samples\FXMLExample\dist\FXMLExample.jar to C:\Users\riash\Documents\Riaz\Personal\Java\Samples\FXMLExample\dist\FXMLExample.jar as nb-jfx
Warning: The signer certificate will expire within six months.
Enter Passphrase for keystore:
Enter key password for nb-jfx:
Launching task from C:\Program Files (x86)\Oracle\JavaFX 2.0 SDK\tools\ant-javafx.jar
Skip jar copy to itself: FXMLExample.jar
jfx-deployment:
jar:
run:
Jun 19, 2012 9:10:33 PM javafx.fxml.FXMLLoader logException
SEVERE: The following error occurred at line 48 in file /C:/Users/riash/Documents/Riaz/Personal/Java/Samples/FXMLExample/build/classes/fxmlexample/fxml_example.fxml
Exception in Application start method
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.javafx.main.Main.launchApp(Main.java:453)
at com.javafx.main.Main.main(Main.java:537)
Caused by: java.lang.RuntimeException: Exception in Application start method
at com.sun.javafx.application.LauncherImpl.launchApplication1(Unknown Source)
at com.sun.javafx.application.LauncherImpl.access$000(Unknown Source)
at com.sun.javafx.application.LauncherImpl$1.run(Unknown Source)
at java.lang.Thread.run(Thread.java:722)
Caused by: javafx.fxml.LoadException: javafx.scene.layout.GridPane does not have a default property.
Upgrading your JavaFX runtime to at least 2.1 will fix your problem.
The sample source you reference is designed for a 2.1 runtime, not a 2.0 runtime.
The reason the new source is incompatible with 2.0 is that 2.1 adds an inherited @DefaultProperty annotation to the Pane class (this annotated behaviour is inherited by GridPane). Because of this, when you write FXML against 2.1 you can omit certain tags, which get defaulted, making 2.1 FXML less verbose than what is required for 2.0. A full explanation of this is provided by Dustin Marx in his blog.
Upgrading from JavaFX 2.0 to JavaFX 2.2 will also fix this problem.