I'm trying to connect to Cassandra with Hector:
import me.prettyprint.cassandra.serializers.StringSerializer;
import me.prettyprint.hector.api.Cluster;
import me.prettyprint.hector.api.Keyspace;
import me.prettyprint.hector.api.factory.HFactory;
import me.prettyprint.hector.api.mutation.Mutator;

public class Main {
    public static void main(String[] args) {
        StringSerializer stringSerializer = StringSerializer.get();
        Cluster cluster = HFactory.getOrCreateCluster("Test Cluster", "localhost:9160");
        Keyspace keyspace = HFactory.createKeyspace("Keyspace1", cluster);
        Mutator<String> mutator = HFactory.createMutator(keyspace, stringSerializer);
        mutator.insert("jsmith", "Standard1", HFactory.createStringColumn("first", "John"));
    }
}
The problem:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
at org.slf4j.LoggerFactory.getSingleton(LoggerFactory.java:230)
at org.slf4j.LoggerFactory.bind(LoggerFactory.java:121)
at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:112)
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:275)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:248)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:261)
at me.prettyprint.cassandra.service.AbstractCluster.<init>(AbstractCluster.java:43)
at me.prettyprint.cassandra.service.AbstractCluster.<init>(AbstractCluster.java:57)
at me.prettyprint.cassandra.service.ThriftCluster.<init>(ThriftCluster.java:17)
at me.prettyprint.hector.api.factory.HFactory.createCluster(HFactory.java:112)
at me.prettyprint.hector.api.factory.HFactory.getOrCreateCluster(HFactory.java:104)
at me.prettyprint.hector.api.factory.HFactory.getOrCreateCluster(HFactory.java:96)
at javaapplication1.Main.main(Main.java:25)
Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
at java.net.URLClassLoader$1.run(URLClassLoader.java:276)
at java.net.URLClassLoader$1.run(URLClassLoader.java:265)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:264)
at java.lang.ClassLoader.loadClass(ClassLoader.java:325)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:270)
... 13 more
Java Result: 1
How can I solve this?
I'm down to my last bit of patience with Hector and Cassandra; I have tried to connect and failed.
I ran the twissjava sample and it works, but when I extend the main class to make my own test, it doesn't run.
This problem" is documented on the wiki on the "SLF4J in Hector"-page
Short version: you need to add a library that implements the SLF4J api.
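For example, with a Maven build, adding an SLF4J binding such as slf4j-log4j12 plus log4j does the trick (a sketch; the versions are illustrative and should match the slf4j-api version already on your classpath):
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.6.1</version>
</dependency>
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.16</version>
</dependency>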
You have to add Hector's dependencies, such as SLF4J and Log4j, to your classpath; then your problem will be solved. I had exactly the same problem as you, and this solved it.
What development environment are you using?
I had the same problem in NetBeans.
Here are the files you need (but be careful about the versions you are using).
log4j download: http://www.findjar.com/jar/log4j/log4j/1.2.11/log4j-1.2.11.jar.html
slf4j download: http://www.findjar.com/jar/de/huxhorn/lilith/de.huxhorn.lilith.slf4j/0.9.35/de.huxhorn.lilith.slf4j-0.9.35.jar.html
P.S. Search the findjar website for the correct versions if these ones don't work, but I'm using these two with Cassandra 1.0.7 and Hector 1.0.2 and they worked for me.
I'm trying to write a Java application for digitally signing documents using a bit4Id miniLector token.
I'm in a Linux development environment.
The token is correctly installed, I can sign my documents also with the app downloaded from the manufacturer, but I have to write a new one for other purposes. The driver used is located at
/usr/lib/x86_64-linux-gnu/engines-1.1/pkcs11.so
I'm stuck with this error:
/usr/lib/jvm/jdk1.8.0_111/bin/java ...
Exception in thread "main" java.security.ProviderException: Initialization failed
at sun.security.pkcs11.SunPKCS11.<init>(SunPKCS11.java:376)
at sun.security.pkcs11.SunPKCS11.<init>(SunPKCS11.java:103)
at com.itextpdf.samples.signatures.chapter02.C2_01_SignHelloWorld.main(C2_01_SignHelloWorld.java:83)
Caused by: java.io.IOException: ERROR: C_GetFunctionList == NULL
at sun.security.pkcs11.wrapper.PKCS11.connect(Native Method)
at sun.security.pkcs11.wrapper.PKCS11.<init>(PKCS11.java:138)
at sun.security.pkcs11.wrapper.PKCS11.getInstance(PKCS11.java:151)
at sun.security.pkcs11.SunPKCS11.<init>(SunPKCS11.java:313)
... 2 more
The provider is listed in $JAVA_HOME/jre/lib/security/java.security file as:
security.provider.10=sun.security.pkcs11.SunPKCS11
The code that behaves this way is:
String configFile = "/opt/bar/cfg/pkcs11.cfg";
Provider provider = new sun.security.pkcs11.SunPKCS11(configFile); // line 83
The needed libraries are all imported by my IDE and I have no compile/link errors.
I didn't find this exact type of error in hours of googling.
If you need any further information let me know, any kind help is very appreciated, thanks.
For visual clarity, I have added all the missing information with respect to the original question below.
Updates
Content of the pkcs11.cfg file:
$ cat /opt/bar/cfg/pkcs11.cfg
name="bit4id miniLector-EVO"
library=/usr/lib/x86_64-linux-gnu/engines-1.1/pkcs11.so
Ok, I got it.
The problem is the driver.
Replacing
/usr/lib/x86_64-linux-gnu/engines-1.1/pkcs11.so
with
/opt/Firma4NG/System/Firma4NG_Linux/Firma4/drivers/mu-x64/libbit4xpki.so
which is one of the manufacturer's drivers, I can now go further and, for example, dump all the info about the card:
Information for provider SunPKCS11-bit4id miniLector-EVO
Library info:
cryptokiVersion: 2.20
manufacturerID: bit4id srl
flags: 0
libraryDescription: bit4id PKCS#11
libraryVersion: 1.02
...
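For reference, here is a minimal sketch of how the provider gets registered and queried on JDK 8 (the sun.security.pkcs11.SunPKCS11(String) constructor used here was removed in JDK 9+; the config path is the one from this question):
import java.security.Provider;
import java.security.Security;

public class TokenInfo {
    public static void main(String[] args) throws Exception {
        // pkcs11.cfg points "library" at the working manufacturer driver (see above)
        Provider provider = new sun.security.pkcs11.SunPKCS11("/opt/bar/cfg/pkcs11.cfg");
        Security.addProvider(provider);
        System.out.println("Information for provider " + provider.getName());
        System.out.println(provider.getInfo());
    }
}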
This question can be closed.
I am trying to connect to Google BigQuery using Spark in Java, but I am unable to find accurate documentation for this.
I tried: https://cloud.google.com/dataproc/docs/tutorials/bigquery-connector-spark-example
and
https://github.com/GoogleCloudPlatform/spark-bigquery-connector#compiling-against-the-connector
My code:
sparkSession.conf().set("credentialsFile", "/path/OfMyProjectJson.json");
Dataset<Row> dataset = sparkSession.read().format("bigquery")
        .option("table", "myProject.myBigQueryDb.myBigQuweryTable")
        .load();
dataset.printSchema();
But this throws an exception:
Exception in thread "main" java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider com.google.cloud.spark.bigquery.BigQueryRelationProvider could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:614)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at com.mySparkConnector.getDataset(BigQueryFetchClass.java:12)
Caused by: java.lang.IllegalArgumentException: A project ID is required for this service but could not be determined from the builder or the environment. Please set a project ID using the builder.
at com.google.cloud.spark.bigquery.repackaged.com.google.common.base.Preconditions.checkArgument(Preconditions.java:142)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.ServiceOptions.<init>(ServiceOptions.java:285)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.<init>(BigQueryOptions.java:91)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.<init>(BigQueryOptions.java:30)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions$Builder.build(BigQueryOptions.java:86)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.getDefaultInstance(BigQueryOptions.java:159)
at com.google.cloud.spark.bigquery.BigQueryRelationProvider$.$lessinit$greater$default$2(BigQueryRelationProvider.scala:29)
at com.google.cloud.spark.bigquery.BigQueryRelationProvider.<init>(BigQueryRelationProvider.scala:40)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
... 15 more
My JSON file contains project_id.
I have searched for possible solutions but am unable to find any, so please help me find a solution to this exception, or point me to any documentation on how to connect to BigQuery with Spark.
I got exactly the same error with the DataProcPySparkOperator operator in Airflow. The fix was to provide
dataproc_pyspark_jars='gs://spark-lib/bigquery/spark-bigquery-latest_2.12.jar'
instead of
dataproc_pyspark_jars='gs://spark-lib/bigquery/spark-bigquery-latest.jar'
I guess in your case it should be passed as a command line argument like
--jars=gs://spark-lib/bigquery/spark-bigquery-latest_2.12.jar
A PR handling this issue has recently been merged into the spark-bigquery-connector; a new version of the connector will be released soon.
A simple solution for now is to add the environment variable GOOGLE_APPLICATION_CREDENTIALS=/path/OfMyProjectJson.json to the Spark runtime.
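If you prefer to keep everything in code rather than in the environment, the connector also accepts the credentials file as a read option; here is a minimal sketch (the credentialsFile and parentProject option names come from the spark-bigquery-connector README, so verify them against your connector version):
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class BigQueryRead {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("bigquery-read").getOrCreate();
        Dataset<Row> dataset = spark.read().format("bigquery")
                // Explicit credentials and billing project, so the connector
                // does not have to guess a default project ID from the environment.
                .option("credentialsFile", "/path/OfMyProjectJson.json")
                .option("parentProject", "myProject")
                .option("table", "myProject.myBigQueryDb.myBigQuweryTable")
                .load();
        dataset.printSchema();
        spark.stop();
    }
}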
I recently upgraded a cluster from 3.7.2 to 3.9.2, shutting down all the boxes.
I'm using Portable Serialization and the current version number in the config file is 6. Yet, after a cold start, the cluster reports an incompatible class definition.
I've restarted the cluster several times since, yet the same error remains.
How is it possible for the system to remain out of sync for some fields in some classes, yet the version for the class was upgraded?
Log:
Caused by: com.hazelcast.nio.serialization.HazelcastSerializationException: Incompatible class-definitions with same class-id:
ClassDefinition{factoryId=1, classId=8, version=6, fieldDefinitions=[
FieldDefinitionImpl{index=0, fieldName='feature', type=UTF, classId=0, factoryId=0, version=6},
FieldDefinitionImpl{index=1, fieldName='value', type=BOOLEAN, classId=0, factoryId=0, version=6}]}
VS
ClassDefinition{factoryId=1, classId=8, version=6, fieldDefinitions=[
FieldDefinitionImpl{index=0, fieldName='feature', type=UTF, classId=0, factoryId=0, version=0},
FieldDefinitionImpl{index=1, fieldName='value', type=BOOLEAN, classId=0, factoryId=0, version=0}]}
Config:
<serialization>
    <portable-version>6</portable-version>
    <portable-factories>
        <portable-factory factory-id="1">
            com.MyPortableFactory
        </portable-factory>
    </portable-factories>
</serialization>
It is a bug in the Hazelcast library that was fixed in 3.9.4 and 3.10+.
Issue:
https://github.com/hazelcast/hazelcast/issues/12733
Fix for 3.10:
https://github.com/hazelcast/hazelcast/pull/12734
Fix for 3.9.4:
https://github.com/hazelcast/hazelcast/pull/12735
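In practice, then, the fix is simply to upgrade. If the project is built with Maven, that is a one-line version bump (a sketch; any release from 3.9.4 on the 3.9 line, or 3.10+, contains the fix):
<dependency>
    <groupId>com.hazelcast</groupId>
    <artifactId>hazelcast</artifactId>
    <version>3.9.4</version>
</dependency>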
Many people have asked for help with this error:
javax.naming.NamingException: Failed to create remoting connection [Root exception is java.lang.NoSuchMethodError: org.jboss.remoting3.Remoting.createEndpoint(Ljava/lang/String;Lorg/xnio/OptionMap;)Lorg/jboss/remoting3/Endpoint;]
    at org.jboss.naming.remote.client.ClientUtil.namingException(ClientUtil.java:51)
    at org.jboss.naming.remote.client.InitialContextFactory.getInitialContext(InitialContextFactory.java:152)
    at javax.naming.spi.NamingManager.getInitialContext(Unknown Source)
    ...
But no post I can find ever provides a conclusive answer, just suggestions to tinker with jars.
I believe that's because there's an inconsistency in the structure of the JBoss interfaces. Can anyone confirm or correct that?
Here’s my code that throws the above error:
final private Properties env = new Properties() {
    {
        put(Context.INITIAL_CONTEXT_FACTORY, "org.jboss.naming.remote.client.InitialContextFactory");
        put(Context.PROVIDER_URL, "http-remoting://localhost:9990");
        put(Context.SECURITY_PRINCIPAL, "myID");
        put(Context.SECURITY_CREDENTIALS, "myPassword");
        put("jboss.naming.client.ejb.context", true);
    }
};
/****************************************************
* myID & myPassword open the Admin GUI for wildfly *
* on localhost:9990 *
****************************************************/
Context ctx = new InitialContext(this.env);
To determine the required jars, I removed all jars from the build path.
Then I ran my program until all ClassNotFoundExceptions were gone.
First Error
java.lang.ClassNotFoundException: org.jboss.naming.remote.client.InitialContextFactory
Added jboss-remote-naming-1.0.7.final.jar to class path
Second Error
java.lang.NoClassDefFoundError: org/jboss/logging/Logger
Added jboss-logging.jar
Third Error
java.lang.NoClassDefFoundError: org/xnio/Options
Added xnio-api-3.0.7.ga.jar
Fourth Error
java.lang.NoClassDefFoundError: org/jboss/remoting3/spi/ConnectionProviderFactory
Added jboss-remoting-3.jar
Fifth Error
java.lang.NoClassDefFoundError: org/jboss/ejb/client/EJBClientContextIdentifier
Added jboss-ejb-client-1.0.19.final.jar
FINAL AND FATAL ERROR
(Note: all NoClassDefFoundErrors have been cleared)
java.lang.NoSuchMethodError: org.jboss.remoting3.Remoting.createEndpoint(Ljava/lang/String;Lorg/xnio/OptionMap;)Lorg/jboss/remoting3/Endpoint;
Using Eclipse's Project Explorer I verified:
a. jboss-remoting3.jar has the org.jboss.remoting3.Remoting class.
b. That Remoting class has this method:
public Endpoint createEndpoint(String, Executor, OptionMap)
Note: it takes 3 parameters.
BUT the FINAL AND FATAL ERROR above calls for
public Endpoint createEndpoint(String, OptionMap)
Note: it takes 2 parameters. Hence the NoSuchMethodError.
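To double-check which jar the JVM actually loads the class from at runtime (Eclipse's view shows what is on the build path, not which jar wins at run time), a small diagnostic sketch helps; nothing jar-specific is assumed here:
public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Prints the code source (jar) the Remoting class was loaded from
        Class<?> remoting = Class.forName("org.jboss.remoting3.Remoting");
        System.out.println(remoting.getProtectionDomain().getCodeSource().getLocation());
    }
}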
Looking at the top lines in the stack trace, I guess
org.jboss.naming.remote.client.InitialContextFactory.getInitialContext() is trying to call org.jboss.remoting3.Remoting.createEndpoint() using 2 parameters, but org.jboss.remoting3.Remoting only defines createEndpoint() with a 3-parameter signature.
Is that supposed to even be possible?
A jar that says it has the org.jboss.remoting3 package whose Remoting class has a single createEndpoint() method with a 3-parameter signature, and another jar that says it has the org.jboss.remoting3 package whose Remoting class has another createEndpoint() method with a 2-parameter signature?
HELP!
I mean, do I need to look through every org.jboss.remoting3 package to find one whose Remoting class has a 2-parameter createEndpoint() method?
Or am I missing something important?
I mean this does explain how many questions have been posted about this error:
javax.naming.NamingException: Failed to create remoting connection [Root exception is java.lang.NoSuchMethodError: org.jboss.remoting3.Remoting.createEndpoint(Ljava/lang/String;Lorg/xnio/OptionMap;)Lorg/jboss/remoting3/Endpoint;]
And explain why there is never a conclusive explanation or solution other than fiddling with jars and build path.
I mean getting an InitialContext from WildFly running on the same PC as the Java program should be a trivial process. But it hasn’t been. Maybe it's because of inconsistencies in the API.
Thanks to Christoph Böhme:
jboss-logging-3.1.4.GA.jar has an org.jboss.remoting package with a Remoting class that has createEndpoint() with 0-, 2- and 3-parameter signatures.
Replacing jboss-remoting-4.0.7.Final.jar with the above jar was all that was required to clear the NoSuchMethodError.
Hope that helps others.
I have a Java program for Neo4j, with Neo4j version 2.3.0-M01. The jar file I was using to connect was neo4j-desktop-2.3.0-M01.jar, and everything works fine. Now I want to load databases from version 2.3.0-M3, which will not open in the current version. I am not able to find any jar files for this new version either.
This is my Java code:
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;

public class Testing {
    public static void main(String[] args) {
        System.out.println("hai");
        GraphDatabaseFactory dbFactory = new GraphDatabaseFactory();
        GraphDatabaseService db = dbFactory.newEmbeddedDatabase("D:\\graph.db");
        try (Transaction tx = db.beginTx()) {
            System.out.println("began transaction");
            tx.success();
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println("Done successfully");
    }
}
This is the error I get if I try to open a higher-version DB (from 2.3.0-M3):
Exception in thread "main" java.lang.RuntimeException: Error starting org.neo4j.kernel.EmbeddedGraphDatabase, D:\data2\graph.db
at org.neo4j.kernel.InternalAbstractGraphDatabase.run(InternalAbstractGraphDatabase.java:314)
at org.neo4j.kernel.EmbeddedGraphDatabase.<init>(EmbeddedGraphDatabase.java:59)
at org.neo4j.graphdb.factory.GraphDatabaseFactory.newDatabase(GraphDatabaseFactory.java:107)
at org.neo4j.graphdb.factory.GraphDatabaseFactory$1.newDatabase(GraphDatabaseFactory.java:94)
at org.neo4j.graphdb.factory.GraphDatabaseBuilder.newGraphDatabase(GraphDatabaseBuilder.java:176)
at org.neo4j.graphdb.factory.GraphDatabaseFactory.newEmbeddedDatabase(GraphDatabaseFactory.java:66)
at Testing.main(Testing.java:19)
Caused by: org.neo4j.kernel.lifecycle.LifecycleException: Component 'org.neo4j.kernel.impl.transaction.state.DataSourceManager#258bb6ba' was successfully initialized, but failed to start. Please see attached cause exception.
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:499)
at org.neo4j.kernel.lifecycle.LifeSupport.start(LifeSupport.java:108)
at org.neo4j.kernel.InternalAbstractGraphDatabase.run(InternalAbstractGraphDatabase.java:309)
... 6 more
Caused by: org.neo4j.kernel.lifecycle.LifecycleException: Component 'org.neo4j.kernel.NeoStoreDataSource#f1cb476' was successfully initialized, but failed to start. Please see attached cause exception.
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:499)
at org.neo4j.kernel.lifecycle.LifeSupport.start(LifeSupport.java:108)
at org.neo4j.kernel.impl.transaction.state.DataSourceManager.start(DataSourceManager.java:117)
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:493)
... 8 more
Caused by: org.neo4j.kernel.impl.storemigration.StoreUpgrader$UpgradingStoreVersionNotFoundException: 'neostore.nodestore.db' does not contain a store version, please ensure that the original database was shut down in a clean state.
at org.neo4j.kernel.impl.storemigration.UpgradableDatabase.checkUpgradeable(UpgradableDatabase.java:86)
at org.neo4j.kernel.impl.storemigration.StoreMigrator.needsMigration(StoreMigrator.java:158)
at org.neo4j.kernel.impl.storemigration.StoreUpgrader.getParticipantsEagerToMigrate(StoreUpgrader.java:259)
at org.neo4j.kernel.impl.storemigration.StoreUpgrader.migrateIfNeeded(StoreUpgrader.java:134)
at org.neo4j.kernel.NeoStoreDataSource.upgradeStore(NeoStoreDataSource.java:532)
at org.neo4j.kernel.NeoStoreDataSource.start(NeoStoreDataSource.java:434)
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:493)
... 11 more
Also, I am trying to download the enterprise version zip (2.3.0-M01) for Windows from the Neo4j website, because 2.3.0-M3 has the import-graphml feature, for which I want to move up to 2.3.0-M3. But the link provided on the Neo4j site does not download it completely.
http://neo4j.com/artifact.php?name=neo4j-enterprise-2.3.0-M01-windows.zip
Is there somewhere else this can be downloaded from?
If you are upgrading Neo4j, you also need to upgrade the database store version.
neo4j.properties
# Enable this to be able to upgrade a store from an older version.
allow_store_upgrade=true
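If you are running embedded, as in the code above, the same setting can also be passed from Java; here is a minimal sketch against the 2.x embedded API (assuming GraphDatabaseSettings.allow_store_upgrade, the 2.x name of this setting):
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;
import org.neo4j.graphdb.factory.GraphDatabaseSettings;

public class UpgradeStore {
    public static void main(String[] args) {
        // allow_store_upgrade lets Neo4j migrate an older store on startup.
        // Take a backup first: the store upgrade is not reversible.
        GraphDatabaseService db = new GraphDatabaseFactory()
                .newEmbeddedDatabaseBuilder("D:\\graph.db")
                .setConfig(GraphDatabaseSettings.allow_store_upgrade, "true")
                .newGraphDatabase();
        db.shutdown();
    }
}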