Starting HBase, java.lang.ClassNotFoundException: org.apache.htrace.SamplerBuilder

I am trying to start HBase with start-hbase.sh, but I get the error: java.lang.ClassNotFoundException: org.apache.htrace.SamplerBuilder.
I have tried adding various .jar files to various folders (as suggested in other threads), but nothing works. I am using Hadoop 3.1.1 and HBase 2.1.0. Here is the (end of the) error log:
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMaster.
at org.apache.hadoop.hbase.master.HMaster.constructMaster(HMaster.java:2972)
at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:236)
at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:140)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:149)
at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:2983)
Caused by: java.lang.NoClassDefFoundError: org/apache/htrace/SamplerBuilder
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:635)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
at org.apache.hadoop.hbase.util.CommonFSUtils.getRootDir(CommonFSUtils.java:358)
at org.apache.hadoop.hbase.util.CommonFSUtils.isValidWALRootDir(CommonFSUtils.java:407)
at org.apache.hadoop.hbase.util.CommonFSUtils.getWALRootDir(CommonFSUtils.java:383)
at org.apache.hadoop.hbase.regionserver.HRegionServer.initializeFileSystem(HRegionServer.java:691)
at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:600)
at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:484)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:488)
at org.apache.hadoop.hbase.master.HMaster.constructMaster(HMaster.java:2965)
... 5 more
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.SamplerBuilder
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:582)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:190)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:499)
... 25 more

The HBase 2.1.0 release uses HTrace, an incubating Apache Foundation project.
HBase ships its third-party libraries in a folder named client-facing-thirdparty inside the HBase lib folder. You need to copy htrace-core-3.1.0-incubating.jar from there to the HBase lib directory. (see reference)
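For example, assuming a default layout under $HBASE_HOME, the copy might look like this:
cp $HBASE_HOME/lib/client-facing-thirdparty/htrace-core-3.1.0-incubating.jar $HBASE_HOME/lib/
Then restart HBase with start-hbase.sh.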
There is also another solution at Cloudera Community that changes a configuration instead of adding the library manually.

There seems to be a compatibility issue between HBase and Hadoop; I reverted to using Hadoop 2.9.1 and HBase 1.2.6 together with JDK 1.8.0.

HBase 2.1.4 has no htrace-core-3.1.0-incubating.jar in client-facing-thirdparty.
You need to copy it from 2.1.3 or download it from https://mvnrepository.com/artifact/org.apache.htrace/htrace-core
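As a sketch, the jar can also be fetched straight from Maven Central into the HBase lib directory (the URL is assumed from the standard repository layout, and $HBASE_HOME stands for your install path):
wget -P $HBASE_HOME/lib/ https://repo1.maven.org/maven2/org/apache/htrace/htrace-core/3.1.0-incubating/htrace-core-3.1.0-incubating.jar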

Related

java.lang.ClassNotFoundException while using docker-java

We are using this library to do 'docker pull' from Docker Hub and to check whether an image with a given name already exists. We need this to work on Linux, Mac, and Windows with the latest version of Docker installed. But in several cases we have hit the error mentioned in the title, and that error comes from jersey-client, which is used internally by this library.
We have tested with docker-java 3.1.5 and the latest 3.2.5, on Docker 19.03.5 and also the latest Docker. The latest Docker version varies by platform, as listed below:
Linux (19.03.12)
Mac (19.03.8)
Windows (19.03.8)
I am sharing a code snippet of what we are trying:
DockerClient dockerClient = DockerClientBuilder.getInstance().build();
String imageName = "SOME_IMAGE_NAME";
// Remove any existing image with this name before pulling a fresh copy
List<Image> images = dockerClient.listImagesCmd().withImageNameFilter(imageName).exec();
if (!images.isEmpty()) {
    dockerClient.removeImageCmd(images.get(0).getId()).exec();
}
dockerClient.pullImageCmd(imageName)
        .exec(new PullImageResultCallback())
        .awaitCompletion(DOCKER_PULL_WAIT_TIME, TimeUnit.SECONDS);
We are using Java 8.
If anyone has faced this kind of issue before and solved it, can you please suggest how we should approach this problem?
Sharing the stacktrace:
2020-07-13 22:14:59,255 ERROR [docker-java-stream--1445483847] ResultCallbackTemplate - Error during callback
java.lang.RuntimeException: java.lang.ClassNotFoundException: com.sun.ws.rs.ext.RuntimeDelegateImpl
at javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:122) ~[jsr311-api-1.1.1.jar:?]
at javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:91) ~[jsr311-api-1.1.1.jar:?]
at javax.ws.rs.core.UriBuilder.newInstance(UriBuilder.java:69) ~[jsr311-api-1.1.1.jar:2.1.6]
at javax.ws.rs.core.UriBuilder.fromUri(UriBuilder.java:80) ~[jsr311-api-1.1.1.jar:2.1.6]
at javax.ws.rs.core.UriBuilder.fromUri(UriBuilder.java:99) ~[jsr311-api-1.1.1.jar:2.1.6]
at org.glassfish.jersey.client.JerseyWebTarget.<init>(JerseyWebTarget.java:48) ~[jersey-client-2.30.1.jar:?]
at org.glassfish.jersey.client.JerseyClient.target(JerseyClient.java:274) ~[jersey-client-2.30.1.jar:?]
at org.glassfish.jersey.client.JerseyClient.target(JerseyClient.java:56) ~[jersey-client-2.30.1.jar:?]
at com.github.dockerjava.jaxrs.JerseyDockerHttpClient.execute(JerseyDockerHttpClient.java:291) ~[docker-java-transport-jersey-3.2.5.jar:?]
at com.github.dockerjava.core.DefaultInvocationBuilder.execute(DefaultInvocationBuilder.java:228) ~[docker-java-core-3.2.5.jar:?]
at com.github.dockerjava.core.DefaultInvocationBuilder.lambda$executeAndStream$1(DefaultInvocationBuilder.java:269) ~[docker-java-core-3.2.5.jar:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]
Caused by: java.lang.ClassNotFoundException: com.sun.ws.rs.ext.RuntimeDelegateImpl
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1358) ~[catalina.jar:8.5.57]
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1180) ~[catalina.jar:8.5.57]
at java.lang.Class.forName0(Native Method) ~[?:1.8.0_252]
at java.lang.Class.forName(Class.java:264) ~[?:1.8.0_252]
at javax.ws.rs.ext.FactoryFinder.newInstance(FactoryFinder.java:62) ~[jsr311-api-1.1.1.jar:?]
at javax.ws.rs.ext.FactoryFinder.find(FactoryFinder.java:155) ~[jsr311-api-1.1.1.jar:?]
at javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:105) ~[jsr311-api-1.1.1.jar:?]
... 11 more
The default transport used by the docker-java library is jersey-client, which has issues on non-Unix platforms. To get around that, I had to explicitly use a different transport: Apache HttpClient 5, which is planned to become the default in future releases of docker-java anyway.
A list of transports for docker-java can be found at https://github.com/docker-java/docker-java/blob/master/docs/transports.md.
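For reference, switching to the HttpClient 5 transport looked roughly like the sketch below; it assumes the docker-java-transport-httpclient5 module is on the classpath:
import com.github.dockerjava.api.DockerClient;
import com.github.dockerjava.core.DefaultDockerClientConfig;
import com.github.dockerjava.core.DockerClientConfig;
import com.github.dockerjava.core.DockerClientImpl;
import com.github.dockerjava.httpclient5.ApacheDockerHttpClient;
import com.github.dockerjava.transport.DockerHttpClient;

DockerClientConfig config = DefaultDockerClientConfig.createDefaultConfigBuilder().build();
// Build the HttpClient5-based transport instead of the default jersey one
DockerHttpClient httpClient = new ApacheDockerHttpClient.Builder()
        .dockerHost(config.getDockerHost())
        .sslConfig(config.getSSLConfig())
        .build();
DockerClient dockerClient = DockerClientImpl.getInstance(config, httpClient);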

java.lang.NoClassDefFoundError: oracle/dms/console/DMSConsole

My OS = CentOS-7
Oracle 18.4 XE
Java 8 JDK + Tomcat 8
I am facing the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: oracle/dms/console/DMSConsole
at oracle.jdbc.driver.DMSFactory.<clinit>(DMSFactory.java:46)
at oracle.jdbc.driver.PhysicalConnection.createDMSSensors(PhysicalConnection.java:1713)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:849)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:443)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:34)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:712)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at Test.main(Test.java:9)
Caused by: java.lang.ClassNotFoundException: oracle.dms.console.DMSConsole
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 9 more
I searched a lot but was unable to find any forum that mentions this error in relation to Oracle 18.4.0.
I searched my whole server and even unzipped all the ojdbc*.jar files, but did not find DMSConsole anywhere.
I removed all previous JDBC driver jars and downloaded the latest ojdbc8-full.tar.gz driver from this LINK, but nothing fixed the problem.
What are you trying to do? If you are planning to use the DMS jar, then you should use
ojdbc8dms.jar. Check out the question "What are the different JAR files on the 19.3 JDBC driver download page for?" in the JDBC FAQ.
You can get these jars from Maven; check out the blog for details. You can also download them as part of ojdbc8-debug.tar.gz from OTN. Make sure to have dms.jar in the classpath.
<dependency>
    <groupId>com.oracle.database.jdbc</groupId>
    <artifactId>ojdbc8dms</artifactId>
    <version>19.6.0.0</version>
</dependency>
This can be solved by adding the correct dms.jar, but when a detailed log is not needed, the following jars are the minimum to include:
ojdbc8.jar
orai18n.jar
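For example, a minimal run of the Test class from the stack trace above might look like this (the jars are assumed to sit in the current directory):
java -cp .:ojdbc8.jar:orai18n.jar Test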

How do I troubleshoot the installation of Apache Accumulo on Linux?

I am trying to install open source Accumulo on RHEL 7.x. I have two GB of swap space. I have installed Java 1.8, Hadoop 3, and Zookeeper. I have run the bootstrap_config.sh script for Accumulo 1.9.2.
I ran this (and expected it to work):
/bin/accumulo-1.9.2/bin/accumulo init
But I get this error:
[start.Main] ERROR: Uncaught exception
java.util.ServiceConfigurationError: org.apache.accumulo.start.spi.KeywordExecutable: Provider org.apache.accumulo.proxy.Proxy could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at org.apache.accumulo.start.Main.checkDuplicates(Main.java:237)
at org.apache.accumulo.start.Main.getExecutables(Main.java:228)
at org.apache.accumulo.start.Main.main(Main.java:84)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
at java.lang.Class.newInstance(Class.java:412)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
... 5 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
I used the Accumulo bootstrap_config.sh script to configure Hadoop version 3. How do I get "/bin/accumulo-1.9.2/bin/accumulo init" to work?
Accumulo 1.9.2 expects Hadoop 2 out of the box, but does have a build profile to rebuild a tarball specifically for use with Hadoop 3. You can build Accumulo with the Hadoop 3 profile by downloading the source tarball and doing:
mvn clean package -Dhadoop.profile=3 -DskipTests
If you're not interested in rebuilding from source, it may be possible to simply fix the classpath issue by reading the error message and adjusting your classpath accordingly. In this case, it seems you're missing a commons-configuration jar.
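As a sketch, assuming the install path from the question and a commons-configuration 1.x jar obtained from Maven Central (the exact version is an assumption; any 1.x jar providing org.apache.commons.configuration.Configuration should do), that could mean:
cp commons-configuration-1.6.jar /bin/accumulo-1.9.2/lib/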

Gerrit JDBC Driver not found

I have installed Gerrit on a Linux Mint machine, and I'm trying to use a MySQL database connection for HTTP authentication.
But when I try to start Gerrit, I get the following error:
ERROR com.google.gerrit.pgm.Daemon : Unable to start daemon
com.google.gerrit.common.Die: Cannot connect to SQL database
at com.google.gerrit.pgm.util.AbstractProgram.die(AbstractProgram.java:88)
at com.google.gerrit.pgm.util.SiteProgram.createDbInjector(SiteProgram.java:158)
at com.google.gerrit.pgm.Daemon.start(Daemon.java:275)
at com.google.gerrit.pgm.Daemon.run(Daemon.java:204)
at com.google.gerrit.pgm.util.AbstractProgram.main(AbstractProgram.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.google.gerrit.launcher.GerritLauncher.invokeProgram(GerritLauncher.java:166)
at com.google.gerrit.launcher.GerritLauncher.mainImpl(GerritLauncher.java:93)
at com.google.gerrit.launcher.GerritLauncher.main(GerritLauncher.java:50)
at Main.main(Main.java:25)
Caused by: java.sql.SQLException: Driver class com.mysql.jdbc.Driver not available
at com.google.gwtorm.jdbc.SimpleDataSource.loadDriver(SimpleDataSource.java:171)
at com.google.gwtorm.jdbc.SimpleDataSource.<init>(SimpleDataSource.java:85)
at com.google.gerrit.server.schema.DataSourceProvider.open(DataSourceProvider.java:144)
at com.google.gerrit.server.schema.DataSourceProvider.get(DataSourceProvider.java:65)
at com.google.gerrit.pgm.util.SiteLibraryBasedDataSourceProvider.get(SiteLibraryBasedDataSourceProvider.java:52)
at com.google.gerrit.pgm.util.SiteLibraryBasedDataSourceProvider.get(SiteLibraryBasedDataSourceProvider.java:32)
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86)
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:73)
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:66)
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:63)
at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1066)
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
at com.google.inject.Scopes$1$1.get(Scopes.java:65)
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:205)
at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:199)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1059)
at com.google.inject.internal.InternalInjectorCreator.loadEagerSingletons(InternalInjectorCreator.java:199)
at com.google.inject.internal.InternalInjectorCreator.injectDynamically(InternalInjectorCreator.java:180)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:110)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.gerrit.pgm.util.SiteProgram.createDbInjector(SiteProgram.java:152)
... 11 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at com.google.gwtorm.jdbc.SimpleDataSource.loadDriver(SimpleDataSource.java:168)
... 33 more
I copied mysql-connector-java-5.1.28.jar to /usr/share/java, but this error still persists.
Can somebody point me in the right direction?
Better to leave /usr/share/java untouched; anything you put there will be lost the next time you upgrade. Instead, define your own common location for your drivers. If it is just for you, a subdirectory of your home directory might be good: mkdir ~/jdbc.
Then copy the .jar for your JDBC driver to that location.
Then, depending on how you run your Java code, you will have to figure out how to include your jars in your classpath.
For a standalone Java program started with command-line java, use the -cp parameter option.
For application servers like Tomcat or WebLogic, refer to their documentation. Note that many suggest a drop location within the app server install, but for the same reasons as with /usr/share/java, I would advise against that. Tomcat actually allows you to separate CATALINA_BASE from CATALINA_HOME to solve this issue.
Either way, it appears that the exception was generated from a simple main, not a web app, so I assume you can go for the first option.
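For instance, with the ~/jdbc location suggested above, the launch might look like the following (the main class name here is hypothetical):
java -cp ~/jdbc/mysql-connector-java-5.1.28.jar:. MyApp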

Kafka + Storm - satisfying dependencies

I'm attempting to deploy my first topology to a Storm cluster as part of an assessment for my company. The topology just gets values from Kafka and puts them into Cassandra and Redis.
After copying over scads of .jar files to try to satisfy the various dependencies, I've run into an issue where Storm claims a dependency is missing, but the startup class list in the logs shows the class as available.
Here's the exception:
java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at kafka.utils.Pool.<init>(Pool.scala:28) ~[kafka_2.10-0.8.1.1.jar:na]
at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(FetchRequestAndResponseStats.scala) ~[kafka_2.10-0.8.1.1.jar:na]
at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(FetchRequestAndResponseStats.scala) ~[kafka_2.10-0.8.1.1.jar:na]
at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39) ~[kafka_2.10-0.8.1.1.jar:na]
at kafka.javaapi.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:34) ~[kafka_2.10-0.8.1.1.jar:na]
at storm.kafka.DynamicPartitionConnections.register(DynamicPartitionConnections.java:60) ~[storm-kafka-0.9.4.jar:0.9.4]
at storm.kafka.PartitionManager.<init>(PartitionManager.java:64) ~[storm-kafka-0.9.4.jar:0.9.4]
at storm.kafka.ZkCoordinator.refresh(ZkCoordinator.java:98) ~[storm-kafka-0.9.4.jar:0.9.4]
at storm.kafka.ZkCoordinator.getMyManagedPartitions(ZkCoordinator.java:69) ~[storm-kafka-0.9.4.jar:0.9.4]
at storm.kafka.KafkaSpout.nextTuple(KafkaSpout.java:135) ~[storm-kafka-0.9.4.jar:0.9.4]
at backtype.storm.daemon.executor$fn__4654$fn__4669$fn__4698.invoke(executor.clj:565) ~[storm-core-0.9.4.jar:0.9.4]
at backtype.storm.util$async_loop$fn__458.invoke(util.clj:463) ~[storm-core-0.9.4.jar:0.9.4]
at clojure.lang.AFn.run(AFn.java:24) [clojure-1.5.1.jar:na]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[na:1.8.0_45]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_45]
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) ~[na:1.8.0_45]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_45]
When I look at the startup info for the supervisor thread I see this:
2015-06-07T07:55:19.941-0700 o.a.z.ZooKeeper [INFO] Client environment:java.class.path= ... /usr/local/src/apache-storm-0.9.4/lib/scala-library-2.11.6.jar: ...
When I open this file I see this entry:
-rwxrwxrwx 0 0 0 0 Mar 18 2014 scala/collection/GenTraversableOnce.class
So something else is amiss. What step(s) have I missed here?
NOTE: I see similar issues with org/jboss/netty/channel/ChannelFactory.
The Kafka version specifies which Scala version it is built against.
Scala 2.10 - kafka_2.10-0.9.0.1.tgz (asc, md5)
Scala 2.11 - kafka_2.11-0.9.0.1.tgz (asc, md5)
I made the mistake of using the Scala 2.10 build of Kafka together with Scala 2.11.
I was able to resolve this by correcting my Maven dependencies to a matching Scala and Kafka combination.
Release Notes
Source download: kafka-0.11.0.1-src.tgz (asc, md5)
Binary downloads:
Scala 2.11 - kafka_2.11-0.11.0.1.tgz (asc, md5)
Scala 2.12 - kafka_2.12-0.11.0.1.tgz (asc, md5)
We build for multiple versions of Scala. This only matters if you are using Scala and you want a version built for the same Scala version you use. Otherwise any version should work (2.11 is recommended).
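As an illustration, a consistent pairing in the POM might look like this (the exact versions are assumptions based on the downloads quoted above; the point is that the kafka artifact suffix must match the scala-library major version):
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.11</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.11.0.1</version>
</dependency>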
