I'm trying to set up Apache Cassandra on my OS X MacBook. I'm testing NoSQL databases and have already set up memcached and Redis, but with Cassandra I'm having trouble.
I'm using JDK 1.7.0_09 from Oracle.
I've followed the installation instructions, but when I try to start the server, I get the following in the console:
MacBook-Air-Urij:bin urijvoskresenskij$ ./cassandra -f
xss = -ea -javaagent:./../lib/jamm-0.2.5.jar -XX:+UseThreadPriorities -XX:ThreadPriorityPolicy=42 -Xms1024M -Xmx1024M -Xmn200M -XX:+HeapDumpOnOutOfMemoryError
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /var/log/cassandra/system.log (Permission denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:212)
at java.io.FileOutputStream.<init>(FileOutputStream.java:136)
at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:809)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:502)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:395)
at org.apache.log4j.PropertyWatchdog.doOnChange(PropertyConfigurator.java:922)
at org.apache.log4j.helpers.FileWatchdog.checkAndConfigure(FileWatchdog.java:89)
at org.apache.log4j.helpers.FileWatchdog.<init>(FileWatchdog.java:58)
at org.apache.log4j.PropertyWatchdog.<init>(PropertyConfigurator.java:914)
at org.apache.log4j.PropertyConfigurator.configureAndWatch(PropertyConfigurator.java:461)
at org.apache.cassandra.service.AbstractCassandraDaemon.initLog4j(AbstractCassandraDaemon.java:100)
at org.apache.cassandra.thrift.CassandraDaemon.<clinit>(CassandraDaemon.java:61)
INFO 17:40:17,717 Logging initialized
INFO 17:40:17,720 JVM vendor/version: Java HotSpot(TM) 64-Bit Server VM/1.7.0_09
INFO 17:40:17,720 Heap size: 1052770304/1052770304
INFO 17:40:17,721 Classpath: ./../conf:./../build/classes/main:./../build/classes/thrift:./../lib/antlr-3.2.jar:./../lib/apache-cassandra-1.1.7.jar:./../lib/apache-cassandra-clientutil-1.1.7.jar:./../lib/apache-cassandra-thrift-1.1.7.jar:./../lib/avro-1.4.0-fixes.jar:./../lib/avro-1.4.0-sources-fixes.jar:./../lib/commons-cli-1.1.jar:./../lib/commons-codec-1.2.jar:./../lib/commons-lang-2.4.jar:./../lib/compress-lzf-0.8.4.jar:./../lib/concurrentlinkedhashmap-lru-1.3.jar:./../lib/guava-r08.jar:./../lib/high-scale-lib-1.1.2.jar:./../lib/jackson-core-asl-1.9.2.jar:./../lib/jackson-mapper-asl-1.9.2.jar:./../lib/jamm-0.2.5.jar:./../lib/jline-0.9.94.jar:./../lib/json-simple-1.1.jar:./../lib/libthrift-0.7.0.jar:./../lib/log4j-1.2.16.jar:./../lib/metrics-core-2.0.3.jar:./../lib/servlet-api-2.5-20081211.jar:./../lib/slf4j-api-1.6.1.jar:./../lib/slf4j-log4j12-1.6.1.jar:./../lib/snakeyaml-1.6.jar:./../lib/snappy-java-1.0.4.1.jar:./../lib/snaptree-0.1.jar:./../lib/jamm-0.2.5.jar
INFO 17:40:17,722 JNA not found. Native methods will be disabled.
INFO 17:40:17,734 Loading settings from file:/Users/urijvoskresenskij/apache-cassandra-1.1.7/conf/cassandra.yaml
INFO 17:40:17,924 DiskAccessMode 'auto' determined to be mmap, indexAccessMode is mmap
INFO 17:40:18,149 Global memtable threshold is enabled at 334MB
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:317)
at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:219)
at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
at org.apache.cassandra.io.compress.SnappyCompressor.create(SnappyCompressor.java:46)
at org.apache.cassandra.io.compress.SnappyCompressor.isAvailable(SnappyCompressor.java:56)
at org.apache.cassandra.io.compress.SnappyCompressor.<clinit>(SnappyCompressor.java:38)
at org.apache.cassandra.config.CFMetaData.<clinit>(CFMetaData.java:76)
at org.apache.cassandra.config.KSMetaData.systemKeyspace(KSMetaData.java:84)
at org.apache.cassandra.config.DatabaseDescriptor.loadYaml(DatabaseDescriptor.java:438)
at org.apache.cassandra.config.DatabaseDescriptor.<clinit>(DatabaseDescriptor.java:114)
at org.apache.cassandra.service.AbstractCassandraDaemon.setup(AbstractCassandraDaemon.java:127)
at org.apache.cassandra.service.AbstractCassandraDaemon.activate(AbstractCassandraDaemon.java:389)
at org.apache.cassandra.thrift.CassandraDaemon.main(CassandraDaemon.java:106)
Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1860)
at java.lang.Runtime.loadLibrary0(Runtime.java:845)
at java.lang.System.loadLibrary(System.java:1084)
at org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
... 17 more
WARN 17:40:18,269 Cannot initialize native Snappy library. Compression on new tables will be disabled.
ERROR 17:40:18,317 Exception encountered during startup
java.lang.AssertionError: Directory /var/lib/cassandra/data is not accessible.
at org.apache.cassandra.service.AbstractCassandraDaemon.setup(AbstractCassandraDaemon.java:155)
at org.apache.cassandra.service.AbstractCassandraDaemon.activate(AbstractCassandraDaemon.java:389)
at org.apache.cassandra.thrift.CassandraDaemon.main(CassandraDaemon.java:106)
java.lang.AssertionError: Directory /var/lib/cassandra/data is not accessible.
at org.apache.cassandra.service.AbstractCassandraDaemon.setup(AbstractCassandraDaemon.java:155)
at org.apache.cassandra.service.AbstractCassandraDaemon.activate(AbstractCassandraDaemon.java:389)
at org.apache.cassandra.thrift.CassandraDaemon.main(CassandraDaemon.java:106)
Exception encountered during startup: Directory /var/lib/cassandra/data is not accessible.
I don't know how to solve the problem. Please, help me, guys :)
You don't have permission to write to /var/log, which is to be expected. You'll need to run Cassandra with sudo or create the directories yourself with the proper permissions.
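For example, a minimal sketch of the second option, assuming the default /var paths from the error output:
sudo mkdir -p /var/log/cassandra /var/lib/cassandra
sudo chown -R "$(whoami)" /var/log/cassandra /var/lib/cassandra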
conf/cassandra.yaml contains the paths to all the directories Cassandra requires. Change those paths to locations where your user has write access, so that you don't need to run it with sudo.
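For example, something like the following in conf/cassandra.yaml (the directory values are only illustrative; any path your user owns works):
data_file_directories:
    - /Users/urijvoskresenskij/cassandra/data
commitlog_directory: /Users/urijvoskresenskij/cassandra/commitlog
saved_caches_directory: /Users/urijvoskresenskij/cassandra/saved_caches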
A potentially less intrusive solution for a local dev machine is not to touch root-owned locations at all: just change the log locations to anywhere you want by editing the log4j properties file in the <cassandra_install_dir>/conf directory.
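In Cassandra 1.1 the file should be conf/log4j-server.properties; the line to change looks roughly like this (the target path is only an example, any user-writable location works):
log4j.appender.R.File=/Users/urijvoskresenskij/cassandra/system.log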
Related
I have installed Apache Spark 2.1.1 on Windows 10, with Java 1.8 and Python 3.6 (Anaconda 4.3.1). I have also downloaded winutils.exe, set up the environment variables JAVA_HOME, HADOOP_HOME, and SPARK_HOME, and updated the Path variable. I have also run winutils.exe chmod -R 777 \tmp\hive. But I get the error below when running pyspark at the cmd prompt.
Please can someone help? Let me know if I missed any important detail.
Thanks in advance!
c:\Spark>bin\pyspark
Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
File "c:\Spark\python\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "c:\Spark\python\lib\py4j-0.10.4-src.zip\py4j\protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o22.sessionState.
: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
I still get errors when launching spark-shell, but it looks like Spark does launch, since I get the 'Welcome to Spark' banner. The error I get is:
C:\Spark>bin\spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar."
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
... 47 elided
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 58 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 63 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 71 more
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
... 76 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:478)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:305)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:639)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:561)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
... 84 more
14: error: not found: value spark
import spark.implicits._
^
14: error: not found: value spark
import spark.sql
^
Welcome to
The setup that worked for me is as follows (I didn't use winutils.exe):
Install pyspark and findspark from the Anaconda Command Prompt:
pip3 install pyspark
and
pip3 install findspark
Since you have already downloaded the Spark distribution, unzip it and keep it on the C: drive, e.g. C:\spark-2.2.0-bin-hadoop2.7. Create a new environment variable SPARK_HOME set to C:\spark-2.2.0-bin-hadoop2.7, and add C:\spark-2.2.0-bin-hadoop2.7\bin to the Path variable under system variables.
Now open your command prompt, go from C:\Users\* up to C:\ by running cd .. twice, and run the following command:
set SPARK_HOME=C:\spark-2.2.0-bin-hadoop2.7
and you are good to go.
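Note that set only lasts for the current cmd session; to persist the variable across sessions you could use setx instead (path assumed from above):
setx SPARK_HOME "C:\spark-2.2.0-bin-hadoop2.7"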
Now you just have to point findspark at the Spark location before importing pyspark in your Jupyter notebook. Use the following code:
import findspark
findspark.init(r'C:\spark-2.2.0-bin-hadoop2.7')
import pyspark
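To sanity-check the install, a minimal sketch using only standard PySpark API (the path is the assumed install location from above; the app name is arbitrary):
import findspark
findspark.init(r'C:\spark-2.2.0-bin-hadoop2.7')  # assumed install path

from pyspark.sql import SparkSession

# Start a local session and run a trivial job to confirm everything loads.
spark = SparkSession.builder.master('local[*]').appName('smoke-test').getOrCreate()
print(spark.range(5).count())  # expect: 5
spark.stop()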
I am trying to configure hadoop and format namenode using this command:
$ hdfs namenode -format
However, I keep getting this error. How can I fix it?
2017-06-20 12:22:25,792 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-06-20 12:22:28,825 WARN ipc.Client: Failed to connect to server: localhost/127.0.0.1:9000: try once and fail.
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:681)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:777)
at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:408)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1542)
at org.apache.hadoop.ipc.Client.call(Client.java:1373)
at org.apache.hadoop.ipc.Client.call(Client.java:1337)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:115)
at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:812)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1638)
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1367)
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1364)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1379)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:64)
at org.apache.hadoop.fs.Globber.doGlob(Globber.java:269)
at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1960)
at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:239)
at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:222)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
at org.apache.hadoop.fs.shell.Command.run(Command.java:166)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:326)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:389)
ls: Call From ubuntu/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
ERROR: JAVA_HOME /opt/jdk1.8.0_91/ does not exist.
2017-06-20 12:22:25,792 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Apache Hadoop can integrate with an optional native library that implements deeper OS integration and performance enhancements for certain features. For more details, see the Native Libraries Guide documentation page.
In many cases, for client-side usage or developer setups, the native library extensions are optional. However, the log warning can be a nuisance. If you'd like to get it out of the way, there are a few options:
Some Apache Hadoop releases include a pre-built native library. You could potentially use this. This bundled library would use whatever OS/architecture executed the Apache Hadoop release build, so there is no guarantee that it will match your own OS/architecture. Do not attempt to mix and match different versions of Hadoop code and native code or you may see unusual linkage errors.
Build the native library yourself. This would guarantee the build matches your actual runtime OS/architecture. In addition to the documentation page linked above, there are more specific instructions on how to build in the BUILDING.txt file. Again, it's important to match the version of the Hadoop code and native code.
If you use a commercial vendor's distro, then they likely have taken care of guaranteeing the native library is already set up and deployed correctly.
If all else fails, you can silence the warning by adding the following to log4j.properties:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
org.apache.hadoop.fs.FsShell.main(FsShell.java:389) ls: Call From ubuntu/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
This is a different error, unrelated to the native code loader warning. It indicates that the client failed to connect to the NameNode during command execution. This can be caused by a lack of network connectivity, or simply because the NameNode daemon is not running. The wiki page linked in the error message has more troubleshooting details.
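A few hedged first checks for a standard single-node setup (these are stock Hadoop/JDK tools; the port comes from fs.defaultFS in core-site.xml and may differ on your machine):
jps                          # is a NameNode process listed?
start-dfs.sh                 # if not, start HDFS (under sbin/ in newer layouts)
netstat -tlnp | grep 9000    # is anything listening on the NameNode port?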
ERROR: JAVA_HOME /opt/jdk1.8.0_91/ does not exist.
I'm not sure exactly where this message came from, but it's self-explanatory: either check that you actually have Java deployed at that path, or change JAVA_HOME to the correct path.
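For example, assuming the JDK actually lives somewhere else, first find it and then point Hadoop at it in etc/hadoop/hadoop-env.sh (the JDK path below is illustrative):
readlink -f "$(which java)"                          # find the real JDK location (strip the trailing /bin/java)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # example path; use yours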
Hi all,
I'm struggling to connect to remote JMX.
I've already found that these additional Java start arguments must be added:
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.ssl=false
-Dcom.sun.management.jmxremote.port=3333
But this creates another problem, where a LogManager must be specified.
So I have also added the following to JAVA_OPTS:
set JAVA_OPTS=%JAVA_OPTS% -Djava.util.logging.manager=org.jboss.logmanager.LogManager
set JAVA_OPTS=%JAVA_OPTS% -Xbootclasspath/p:%JBOSS_HOME%\modules\system\layers\base\org\jboss\logmanager\main\jboss-logmanager-1.5.2.Final-redhat-1.jar
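Putting the fragments above together, the block now looks roughly like this (assuming JAVA_OPTS lives in standalone.conf.bat; the jar version must match your installation):
rem Enable unauthenticated remote JMX and install the JBoss LogManager early.
set JAVA_OPTS=%JAVA_OPTS% -Dcom.sun.management.jmxremote.authenticate=false
set JAVA_OPTS=%JAVA_OPTS% -Dcom.sun.management.jmxremote.ssl=false
set JAVA_OPTS=%JAVA_OPTS% -Dcom.sun.management.jmxremote.port=3333
set JAVA_OPTS=%JAVA_OPTS% -Djava.util.logging.manager=org.jboss.logmanager.LogManager
set JAVA_OPTS=%JAVA_OPTS% -Xbootclasspath/p:%JBOSS_HOME%\modules\system\layers\base\org\jboss\logmanager\main\jboss-logmanager-1.5.2.Final-redhat-1.jar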
But I'm still getting this error:
14:21:24,061 INFO [org.jboss.modules] (main) JBoss Modules version 1.3.5.Final-redhat-1
java.lang.IllegalStateException: The LogManager was not properly installed (you must set the "java.util.logging.manager" system property to "org.jboss.logmanager.LogManager")
at org.jboss.logmanager.Logger.getLogger(Logger.java:61)
at org.jboss.as.server.Main.main(Main.java:84)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.jboss.modules.Module.run(Module.java:312)
at org.jboss.modules.Main.main(Main.java:460)
Press any key to continue . . .
Any ideas what else could be checked or added?
It turned out that, due to JBoss customizations, we were not able to use a native JMX remote connection.
I used the Jolokia add-on along with the hawtio app to monitor server resources remotely.
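For anyone in the same spot: the Jolokia JVM agent attaches with a single flag; a sketch, assuming a downloaded agent jar at a hypothetical path (the port is arbitrary):
set JAVA_OPTS=%JAVA_OPTS% -javaagent:C:\tools\jolokia-jvm-agent.jar=port=8778,host=0.0.0.0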
I have installed Gerrit on a Linux Mint machine, and I'm trying to use a MySQL database connection for HTTP authentication.
But when I try to start Gerrit, I get the following error:
ERROR com.google.gerrit.pgm.Daemon : Unable to start daemon
com.google.gerrit.common.Die: Cannot connect to SQL database
at com.google.gerrit.pgm.util.AbstractProgram.die(AbstractProgram.java:88)
at com.google.gerrit.pgm.util.SiteProgram.createDbInjector(SiteProgram.java:158)
at com.google.gerrit.pgm.Daemon.start(Daemon.java:275)
at com.google.gerrit.pgm.Daemon.run(Daemon.java:204)
at com.google.gerrit.pgm.util.AbstractProgram.main(AbstractProgram.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.google.gerrit.launcher.GerritLauncher.invokeProgram(GerritLauncher.java:166)
at com.google.gerrit.launcher.GerritLauncher.mainImpl(GerritLauncher.java:93)
at com.google.gerrit.launcher.GerritLauncher.main(GerritLauncher.java:50)
at Main.main(Main.java:25)
Caused by: java.sql.SQLException: Driver class com.mysql.jdbc.Driver not available
at com.google.gwtorm.jdbc.SimpleDataSource.loadDriver(SimpleDataSource.java:171)
at com.google.gwtorm.jdbc.SimpleDataSource.<init>(SimpleDataSource.java:85)
at com.google.gerrit.server.schema.DataSourceProvider.open(DataSourceProvider.java:144)
at com.google.gerrit.server.schema.DataSourceProvider.get(DataSourceProvider.java:65)
at com.google.gerrit.pgm.util.SiteLibraryBasedDataSourceProvider.get(SiteLibraryBasedDataSourceProvider.java:52)
at com.google.gerrit.pgm.util.SiteLibraryBasedDataSourceProvider.get(SiteLibraryBasedDataSourceProvider.java:32)
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86)
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:73)
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:66)
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:63)
at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1066)
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
at com.google.inject.Scopes$1$1.get(Scopes.java:65)
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:205)
at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:199)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1059)
at com.google.inject.internal.InternalInjectorCreator.loadEagerSingletons(InternalInjectorCreator.java:199)
at com.google.inject.internal.InternalInjectorCreator.injectDynamically(InternalInjectorCreator.java:180)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:110)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.gerrit.pgm.util.SiteProgram.createDbInjector(SiteProgram.java:152)
... 11 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at com.google.gwtorm.jdbc.SimpleDataSource.loadDriver(SimpleDataSource.java:168)
... 33 more
I copied mysql-connector-java-5.1.28.jar to /usr/share/java, but the error still persists.
Can somebody point me in the right direction?
Better to leave /usr/share/java untouched: anything you put there may be lost the next time you upgrade. Instead, define your own common location for your drivers. If it is just for you, a subdirectory of your home directory might be good: mkdir ~/jdbc.
Then copy the .jar for your JDBC driver into that location.
Then, depending on how you run your Java code, you will have to figure out how to include your jars in the classpath:
For a standalone Java program started with command-line java, use the -cp option.
For application servers like Tomcat or WebLogic, refer to their documentation. Note that many suggest a drop location within the app-server install, but for the same reasons as with /usr/share/java, I would advise against it. Tomcat actually allows you to separate CATALINA_HOME from CATALINA_BASE to solve exactly that issue.
Either way, it appears that the exception was generated from a simple main, not a web app, so I assume you can go with the first option.
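A minimal sketch of the first option (plus one Gerrit-specific assumption: judging by SiteLibraryBasedDataSourceProvider in your stack trace, Gerrit also loads jars from its site's lib/ directory, so dropping the connector there may be the simplest route; the site path and main class below are hypothetical):
# Keep drivers in your own location, untouched by OS upgrades.
mkdir -p ~/jdbc
cp mysql-connector-java-5.1.28.jar ~/jdbc/

# Standalone program: put the driver on the classpath explicitly
# (com.example.Main is a hypothetical entry point).
java -cp ~/jdbc/mysql-connector-java-5.1.28.jar:myapp.jar com.example.Main

# Gerrit (assumed): copy the driver into the site's lib/ directory and restart.
cp ~/jdbc/mysql-connector-java-5.1.28.jar ~/gerrit_site/lib/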
I'm trying to extract a Java heap dump from a native dump taken by WER when the JVM crashed:
jstack -m -l "c:\Program Files\Java\jre6\bin\java.exe" WER.tmp.hdmp
but I get the following exception:
Attaching to core c:\Users\xxx\Desktop\WER.tmp.hdmp from executable c:\Program Files\Java\jre6\bin\java.exe, please wait...
sun.jvm.hotspot.debugger.NoSuchSymbolException: Could not find symbol "gHotSpotVMTypes" in any of the known library names (jvm.dll, jvm_g.dll)
at sun.jvm.hotspot.HotSpotTypeDataBase.lookupInProcess(HotSpotTypeDataBase.java:389)
at sun.jvm.hotspot.HotSpotTypeDataBase.readVMTypes(HotSpotTypeDataBase.java:104)
at sun.jvm.hotspot.HotSpotTypeDataBase.<init>(HotSpotTypeDataBase.java:85)
at sun.jvm.hotspot.bugspot.BugSpotAgent.setupVM(BugSpotAgent.java:565)
at sun.jvm.hotspot.bugspot.BugSpotAgent.go(BugSpotAgent.java:494)
at sun.jvm.hotspot.bugspot.BugSpotAgent.attach(BugSpotAgent.java:348)
at sun.jvm.hotspot.tools.Tool.start(Tool.java:169)
at sun.jvm.hotspot.tools.JStack.main(JStack.java:86)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at sun.tools.jstack.JStack.runJStackTool(JStack.java:118)
at sun.tools.jstack.JStack.main(JStack.java:84)
Debugger attached successfully.
jstack requires a java VM process/core!
I've double checked that:
I'm running the same version of java as the system that the crash occurred on
The bitness of Windows (32) matches the system that the crash occurred on
jvm.dll is in the path
windbg is installed
I get exactly the same error if I try to extract a Java heap dump using jmap.
Does anyone know what could be going wrong?