I have installed Apache Spark 2.1.1 on Windows 10, with Java 1.8 and Python 3.6 (Anaconda 4.3.1). I have also downloaded winutils.exe, set up environment variables for JAVA_HOME, HADOOP_HOME, and SPARK_HOME, and updated the PATH variable. I have also run winutils.exe chmod -R 777 \tmp\hive. But I get the error below when running pyspark at the command prompt.
Can someone please help? Let me know if I missed any important detail.
Thanks in advance!
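For reference, the setup described above corresponds roughly to these commands (the JDK and Hadoop install paths are assumptions, not taken from the question):
REM set the environment variables once (paths are examples; winutils.exe lives in %HADOOP_HOME%\bin)
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_131"
setx HADOOP_HOME "C:\hadoop"
setx SPARK_HOME "C:\Spark"
setx PATH "%PATH%;%JAVA_HOME%\bin;%HADOOP_HOME%\bin;%SPARK_HOME%\bin"
REM grant Hive its scratch directory permissions
%HADOOP_HOME%\bin\winutils.exe chmod -R 777 \tmp\hive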
c:\Spark>bin\pyspark
Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
File "c:\Spark\python\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "c:\Spark\python\lib\py4j-0.10.4-src.zip\py4j\protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o22.sessionState.
: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
I still get errors when launching spark-shell, though it looks like Spark launches, since I get the 'Welcome to Spark' banner. The error I get is:
C:\Spark>bin\spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar."
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
... 47 elided
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 58 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 63 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 71 more
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
... 76 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:478)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:305)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:639)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:561)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
... 84 more
14: error: not found: value spark
import spark.implicits._
^
14: error: not found: value spark
import spark.sql
^
Welcome to
The setup that worked for me is as follows (I didn't use winutils.exe):
Install pyspark and findspark from the "Anaconda Command Prompt":
pip3 install pyspark
pip3 install findspark
Since you have already downloaded the Spark distribution, unzip it and keep it on the C: drive, i.e. "C:\spark-2.2.0-bin-hadoop2.7". Create a new environment variable "SPARK_HOME", set it to "C:\spark-2.2.0-bin-hadoop2.7\bin", and add the same path to the "Path" variable under system variables.
Now open your command prompt, go from "C:\User*" to "C:\" by running cd.. twice, and run the following command:
set SPARK_HOME='spark-2.2.0-bin-hadoop2.7'
and you are good to go.
Now you just have to point findspark at the Spark location before importing pyspark in your Jupyter notebook. Use the following code:
import findspark
findspark.init(r'C:\spark-2.2.0-bin-hadoop2.7')  # raw string so the backslash is not treated as an escape
import pyspark
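As a quick sanity check (my own addition, not part of the original steps), the same initialization can be run from the Anaconda Command Prompt:
REM should print the pyspark version (e.g. 2.2.0) if the install and SPARK_HOME path are correct
python -c "import findspark; findspark.init(r'C:\spark-2.2.0-bin-hadoop2.7'); import pyspark; print(pyspark.__version__)"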
I am working on Spark with Scala using the IntelliJ IDE. I recently installed Scala and Spark locally, and there was a system update to the macOS version, so I am not sure what broke it.
I now get an error when I try to build my project, which was working fine a day before.
I checked JRE vs. JDK as suggested in other answers, and I am sure my project points to JDK 1.8. I also removed Scala and Spark from the machine to make sure it is in the same state as before. I still get this error. I checked existing answers for the same error, but they did not help.
[INFO] --- maven-surefire-plugin:2.7:test (default-test) @ dotcom-jobs ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- scalatest-maven-plugin:2.0.0:test (small-tests) @ dotcom-jobs ---
*** RUN ABORTED ***
java.lang.IllegalStateException: Could not initialize plugin: interface org.mockito.plugins.MockMaker (alternate: null)
at org.mockito.internal.configuration.plugins.PluginLoader$1.invoke(PluginLoader.java:74)
at com.sun.proxy.$Proxy2.isTypeMockable(Unknown Source)
at org.mockito.internal.util.MockUtil.typeMockabilityOf(MockUtil.java:29)
at org.mockito.internal.util.MockCreationValidator.validateType(MockCreationValidator.java:22)
at org.mockito.internal.creation.MockSettingsImpl.validatedSettings(MockSettingsImpl.java:241)
at org.mockito.internal.creation.MockSettingsImpl.build(MockSettingsImpl.java:229)
at org.mockito.internal.MockitoCore.mock(MockitoCore.java:62)
at org.mockito.Mockito.spy(Mockito.java:1992)
at com.homelabs.sc.rbac.utils.ConfigHelper$class.configUtil(ConfigHelper.scala:26)
at com.homelabs.sc.rbac.base.BaseSmallTest.configUtil$lzycompute(BaseSmallTest.scala:7)
...
Cause: java.lang.IllegalStateException: Failed to load interface org.mockito.plugins.MockMaker implementation declared in sun.misc.CompoundEnumeration#517d4a0d
at org.mockito.internal.configuration.plugins.PluginInitializer.loadImpl(PluginInitializer.java:54)
at org.mockito.internal.configuration.plugins.PluginLoader.loadPlugin(PluginLoader.java:57)
at org.mockito.internal.configuration.plugins.PluginLoader.loadPlugin(PluginLoader.java:44)
at org.mockito.internal.configuration.plugins.PluginRegistry.<init>(PluginRegistry.java:22)
at org.mockito.internal.configuration.plugins.Plugins.<clinit>(Plugins.java:19)
at org.mockito.internal.util.MockUtil.<clinit>(MockUtil.java:24)
at org.mockito.internal.util.MockCreationValidator.validateType(MockCreationValidator.java:22)
at org.mockito.internal.creation.MockSettingsImpl.validatedSettings(MockSettingsImpl.java:241)
at org.mockito.internal.creation.MockSettingsImpl.build(MockSettingsImpl.java:229)
at org.mockito.internal.MockitoCore.mock(MockitoCore.java:62)
...
Cause: org.mockito.exceptions.base.MockitoInitializationException: Could not initialize inline Byte Buddy mock maker. (This mock maker is not supported on Android.)
Are you running a JRE instead of a JDK? The inline mock maker needs to be run on a JDK.
Java : 1.8
JVM vendor name : Oracle Corporation
JVM vendor version : 25.221-b11
JVM name : Java HotSpot(TM) 64-Bit Server VM
JVM version : 1.8.0_221-b11
JVM info : mixed mode
OS name : Mac OS X
OS version : 10.16
at org.mockito.internal.creation.bytebuddy.InlineByteBuddyMockMaker.<init>(InlineByteBuddyMockMaker.java:170)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at org.mockito.internal.configuration.plugins.PluginInitializer.loadImpl(PluginInitializer.java:49)
at org.mockito.internal.configuration.plugins.PluginLoader.loadPlugin(PluginLoader.java:57)
at org.mockito.internal.configuration.plugins.PluginLoader.loadPlugin(PluginLoader.java:44)
at org.mockito.internal.configuration.plugins.PluginRegistry.<init>(PluginRegistry.java:22)
...
Cause: java.lang.IllegalStateException: No compatible attachment provider is available
at net.bytebuddy.agent.ByteBuddyAgent.install(ByteBuddyAgent.java:597)
at net.bytebuddy.agent.ByteBuddyAgent.install(ByteBuddyAgent.java:581)
at net.bytebuddy.agent.ByteBuddyAgent.install(ByteBuddyAgent.java:533)
at net.bytebuddy.agent.ByteBuddyAgent.install(ByteBuddyAgent.java:510)
at org.mockito.internal.creation.bytebuddy.InlineByteBuddyMockMaker.<clinit>(InlineByteBuddyMockMaker.java:104)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
...
The problem was due to the macOS Big Sur update, contrary to what I thought the root cause was (installing Scala).
I solved it by following this answer on the Apple forum: https://developer.apple.com/forums/thread/666681
sudo rm -fr /Library/Internet\ Plug-Ins/JavaAppletPlugin.plugin
sudo rm -fr /Library/PreferencePanes/JavaControlPanel.prefpane
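After removing those, it may also help to confirm that the build really sees a full JDK, since the inline mock maker refuses to attach under a plain JRE (these checks are my suggestion, not part of the linked answer):
# list the JDKs macOS knows about
/usr/libexec/java_home -V
# should report a HotSpot JDK build, e.g. 1.8.0_221
java -version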
I am trying to install open source Accumulo on RHEL 7.x. I have two GB of swap space. I have installed Java 1.8, Hadoop 3, and Zookeeper. I have run the bootstrap_config.sh script for Accumulo 1.9.2.
I ran this (and expected it to work):
/bin/accumulo-1.9.2/bin/accumulo init
But I get this error:
[start.Main] ERROR: Uncaught exception
java.util.ServiceConfigurationError: org.apache.accumulo.start.spi.KeywordExecutable: Provider org.apache.accumulo.proxy.Proxy could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at org.apache.accumulo.start.Main.checkDuplicates(Main.java:237)
at org.apache.accumulo.start.Main.getExecutables(Main.java:228)
at org.apache.accumulo.start.Main.main(Main.java:84)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
at java.lang.Class.newInstance(Class.java:412)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
... 5 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
I used the Accumulo bootstrap_config.sh script to configure Hadoop version 3. How do I get "/bin/accumulo-1.9.2/bin/accumulo init" to work?
Accumulo 1.9.2 expects Hadoop 2 out of the box, but does have a build profile to rebuild a tarball specifically for use with Hadoop 3. You can build Accumulo with the Hadoop 3 profile by downloading the source tarball and doing:
mvn clean package -Dhadoop.profile=3 -DskipTests
If you're not interested in rebuilding from source, it may be possible to fix the classpath issue simply by reading the error message and adjusting your classpath accordingly. In this case, you appear to be missing a commons-configuration jar.
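For example, a minimal sketch of the classpath route (the jar version and source path are assumptions; Accumulo 1.9.x puts its lib/ directory on the classpath by default):
# drop a commons-configuration jar (version here is an example) into Accumulo's lib directory
cp /path/to/commons-configuration-1.6.jar /bin/accumulo-1.9.2/lib/
# then retry the init
/bin/accumulo-1.9.2/bin/accumulo init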
Elasticsearch Version - 5.5.1,
Java Version - 8-u131
When I run the elasticsearch version command (or elasticsearch-plugin install), instead of displaying the version it shows this stack trace:
>./elasticsearch --version
Exception in thread "main" java.lang.IllegalArgumentException: Invalid Configuration class specified
at org.apache.logging.log4j.core.config.builder.impl.DefaultConfigurationBuilder.build(DefaultConfigurationBuilder.java:198)
at org.apache.logging.log4j.core.config.builder.impl.DefaultConfigurationBuilder.build(DefaultConfigurationBuilder.java:159)
at org.apache.logging.log4j.core.config.builder.impl.DefaultConfigurationBuilder.build(DefaultConfigurationBuilder.java:55)
at org.elasticsearch.common.logging.LogConfigurator.configureStatusLogger(LogConfigurator.java:175)
at org.elasticsearch.common.logging.LogConfigurator.configureWithoutConfig(LogConfigurator.java:99)
at org.elasticsearch.cli.Command.main(Command.java:85)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:91)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:84)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.logging.log4j.core.config.builder.impl.DefaultConfigurationBuilder.build(DefaultConfigurationBuilder.java:170)
... 7 more
Caused by: java.lang.NoSuchMethodError: org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(Lorg/apache/logging/log4j/core/LoggerContext;Lorg/apache/logging/log4j/core/config/ConfigurationSource;)V
at org.apache.logging.log4j.core.config.builder.impl.BuiltConfiguration.<init>(BuiltConfiguration.java:58)
... 12 more
This seems like a case of missing or incorrect versions of the log4j jars packaged with the elasticsearch 5.5.1 rpm.
It ships with 2.9.1, and the missing method appeared in 2.7.x. Do you get the same exception with the zip version? You might want to add set -x as the second line of the elasticsearch script file and check what is passed in -cp "$ES_CLASSPATH" - maybe it picks up some old jars from some other path.
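For example, a minimal version of that check (my suggestion, not from the Elasticsearch docs):
# add "set -x" as the second line of bin/elasticsearch so every command is echoed,
# including the final java -cp "$ES_CLASSPATH" ... invocation, then rerun:
./elasticsearch --version 2>&1 | grep log4j
# any pre-2.7 log4j jar from outside the elasticsearch install on that classpath is the suspect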
I have been following the HBase installation instructions as given on this page:
https://hbase.apache.org/book.html#quickstart
I am using HBase version 1.1.3 (stable release) and have configured it in standalone mode. I have installed OpenJDK 7.
When I try to start HBase, it crashes after a few seconds and I get the following error:
2016-01-29 17:37:04,136 ERROR [main] master.HMasterCommandLine: Master exiting
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster
at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:219)
at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:224)
at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:139)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:2355)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.util.Bytes$LexicographicalComparerHolder$UnsafeComparer
at org.apache.hadoop.hbase.util.Bytes.putInt(Bytes.java:899)
at org.apache.hadoop.hbase.KeyValue.createByteArray(KeyValue.java:1082)
at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:652)
at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:580)
at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:483)
at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:370)
at org.apache.hadoop.hbase.KeyValue.<clinit>(KeyValue.java:267)
at org.apache.hadoop.hbase.HConstants.<clinit>(HConstants.java:978)
at org.apache.hadoop.hbase.HTableDescriptor.<clinit>(HTableDescriptor.java:1488)
at org.apache.hadoop.hbase.util.FSTableDescriptors.<init>(FSTableDescriptors.java:124)
at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:570)
at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:365)
at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:307)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
... 7 more
Can anyone tell me the reason for this error? Please let me know if you need more info.
The exception causing this is NoClassDefFoundError, which usually means you are missing something from your classpath. In this case, you're missing org.apache.hadoop.hbase.util.Bytes, which comes from hbase-common.jar. However, if one jar is missing, there are probably others too. This might mean you're starting HBase the wrong way. Have you tried the included start-up scripts?
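For reference, standalone HBase is normally started with the scripts shipped in the tarball (a minimal sketch; the directory name assumes the 1.1.3 binary distribution):
cd hbase-1.1.3
# the start script builds the classpath from the bundled lib/ jars (hbase-common.jar included) before launching the master
./bin/start-hbase.sh
# confirm the master stays up, then poke it with the shell
./bin/hbase shell
# if it crashes again, the logs land in ./logs
tail logs/hbase-*-master-*.log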
I have installed gerrit on a Linux Mint machine, and I'm trying to use a mysql database connection for HTTP authentication.
But when I try to start gerrit, I get the following error:
ERROR com.google.gerrit.pgm.Daemon : Unable to start daemon
com.google.gerrit.common.Die: Cannot connect to SQL database
at com.google.gerrit.pgm.util.AbstractProgram.die(AbstractProgram.java:88)
at com.google.gerrit.pgm.util.SiteProgram.createDbInjector(SiteProgram.java:158)
at com.google.gerrit.pgm.Daemon.start(Daemon.java:275)
at com.google.gerrit.pgm.Daemon.run(Daemon.java:204)
at com.google.gerrit.pgm.util.AbstractProgram.main(AbstractProgram.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.google.gerrit.launcher.GerritLauncher.invokeProgram(GerritLauncher.java:166)
at com.google.gerrit.launcher.GerritLauncher.mainImpl(GerritLauncher.java:93)
at com.google.gerrit.launcher.GerritLauncher.main(GerritLauncher.java:50)
at Main.main(Main.java:25)
Caused by: java.sql.SQLException: Driver class com.mysql.jdbc.Driver not available
at com.google.gwtorm.jdbc.SimpleDataSource.loadDriver(SimpleDataSource.java:171)
at com.google.gwtorm.jdbc.SimpleDataSource.<init>(SimpleDataSource.java:85)
at com.google.gerrit.server.schema.DataSourceProvider.open(DataSourceProvider.java:144)
at com.google.gerrit.server.schema.DataSourceProvider.get(DataSourceProvider.java:65)
at com.google.gerrit.pgm.util.SiteLibraryBasedDataSourceProvider.get(SiteLibraryBasedDataSourceProvider.java:52)
at com.google.gerrit.pgm.util.SiteLibraryBasedDataSourceProvider.get(SiteLibraryBasedDataSourceProvider.java:32)
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86)
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:73)
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:66)
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:63)
at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1066)
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
at com.google.inject.Scopes$1$1.get(Scopes.java:65)
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:205)
at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:199)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1059)
at com.google.inject.internal.InternalInjectorCreator.loadEagerSingletons(InternalInjectorCreator.java:199)
at com.google.inject.internal.InternalInjectorCreator.injectDynamically(InternalInjectorCreator.java:180)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:110)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.gerrit.pgm.util.SiteProgram.createDbInjector(SiteProgram.java:152)
... 11 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at com.google.gwtorm.jdbc.SimpleDataSource.loadDriver(SimpleDataSource.java:168)
... 33 more
I copied mysql-connector-java-5.1.28.jar to /usr/share/java, but this error still persists.
Can somebody point me in the right direction?
Better to leave /usr/share/java untouched: anything you put there will be lost the next time you upgrade. Instead, define your own common location for your drivers. If it is just for you, a subdirectory of your home directory might be good: mkdir ~/jdbc.
Then copy the .jar for your JDBC driver to that location.
Then, depending on how you run your Java code, you will have to figure out how to include your jars in your classpath.
For a standalone Java program started with command-line java, use the -cp option.
For application servers like Tomcat or WebLogic, refer to their documentation. Note that many suggest a drop location within the app server install, but for the same reasons as with /usr/share/java, I would advise against that. Tomcat actually lets you separate CATALINA_HOME from CATALINA_BASE to solve this issue.
Either way, it appears the exception was generated from a simple main, not a web app, so I assume you can go for the first option.
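A sketch of that first option (the jar location, application jar, and main class here are placeholders):
# keep drivers in your own location so upgrades don't wipe them
mkdir -p ~/jdbc
cp mysql-connector-java-5.1.28.jar ~/jdbc/
# standalone: put the driver jar on the classpath explicitly, alongside your own code
java -cp ~/jdbc/mysql-connector-java-5.1.28.jar:your-app.jar Main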