From some research on Lagom and Cassandra, I found that:
Lagom uses the DataStax Java Driver for Cassandra, and
the DataStax Java Driver only supports Cassandra 3.0.x (link).
So, if I want to use Cassandra 3.11 with Lagom, what should I do?
Should I configure Lagom with another Cassandra driver such as Achilles, PlayORM, ... (link)? Is that possible?
Does DataStax support Cassandra 3.11 in the enterprise edition?
Any help, please?
The DataStax Java Driver 3.2.0 that Lagom uses should work with Cassandra 3.11 out of the box (I just checked it myself using simple queries).
Even if it doesn't work out of the box, you can explicitly override the driver version in Maven or another build system.
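In Maven, overriding the transitive driver version can look roughly like the following; declare the driver directly in your own pom.xml so your version wins over the one Lagom pulls in (the version number here is illustrative, so pick whichever 3.x release you have verified against your cluster):

```xml
<!-- Illustrative: pinning a newer 3.x DataStax driver in your own pom.xml.
     A direct dependency takes precedence over Lagom's transitive one. -->
<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-core</artifactId>
  <version>3.3.2</version>
</dependency>
```

Run `mvn dependency:tree` afterwards to confirm which driver version is actually on the classpath.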
I am getting the error below while trying to call the con.createArrayOf method on Microsoft SQL Server with a DataSource pool connection:
Uncaught Throwable
java.lang.AbstractMethodError: org.apache.commons.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.createArrayOf(Ljava/lang/String;[Ljava/lang/Object;)Ljava/sql/Array;
Any lead would be appreciated.
Basically, you can't use the createArrayOf method with your current library. You'll have to use a different database driver library.
You could use something like https://learn.microsoft.com/en-us/sql/connect/jdbc/microsoft-jdbc-driver-for-sql-server since you're using SQL Server.
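If you go with Microsoft's driver, the Maven coordinates are along these lines (the version shown is illustrative; pick the artifact classifier matching your JDK):

```xml
<!-- Illustrative: Microsoft's official JDBC driver for SQL Server. -->
<dependency>
  <groupId>com.microsoft.sqlserver</groupId>
  <artifactId>mssql-jdbc</artifactId>
  <version>7.4.1.jre8</version>
</dependency>
```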
The error suggests you are using Apache DBCP 1.3 or older, as the method you are trying to call was introduced in JDBC 4 (Java 6), while Apache DBCP 1.3 supports JDBC 3 (Java 1.4/5).
Given the overview at https://commons.apache.org/proper/commons-dbcp/ you need to use at minimum Apache DBCP 1.4, or a newer version (the latest is 2.7.0, for Java 8 and higher).
Note though that upgrading won't help you much, because the Microsoft SQL Server JDBC driver implementation of createArrayOf will throw a SQLFeatureNotSupportedException as SQL Server doesn't support arrays.
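Since SQL Server has no ARRAY type, a common workaround is to skip createArrayOf entirely and expand the values into individual placeholders of an IN clause. A minimal sketch (the InClause class and inClause method are hypothetical helpers, not part of any driver):

```java
// Hypothetical helper: builds an IN clause with one "?" placeholder per value,
// as a substitute for createArrayOf on databases without array support.
import java.util.Collections;

public class InClause {

    // Returns e.g. "id IN (?, ?, ?)" for n = 3; bind each value separately
    // on the PreparedStatement afterwards.
    static String inClause(String column, int n) {
        if (n <= 0) {
            throw new IllegalArgumentException("need at least one value");
        }
        return column + " IN (" + String.join(", ", Collections.nCopies(n, "?")) + ")";
    }

    public static void main(String[] args) {
        String sql = "SELECT * FROM users WHERE " + inClause("id", 3);
        System.out.println(sql); // SELECT * FROM users WHERE id IN (?, ?, ?)
    }
}
```

You would then prepare this SQL and call setObject for each of the n values. Note that very large value lists may hit statement parameter limits, so batch accordingly.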
Is it still possible to configure a Cluster programmatically (as in DataStax Java driver 3.8) with the new 4.0 version? Or is the only solution to use a configuration file, as in the documentation? https://docs.datastax.com/en/developer/java-driver/4.0/manual/core/configuration/
Yes, it's possible to configure the driver programmatically; just follow the section on configuration in the driver documentation. You need to define a config loader using DriverConfigLoader.programmaticBuilder, and then use it when building the CqlSession:
DriverConfigLoader loader =
    DriverConfigLoader.programmaticBuilder()
        .withDuration(DefaultDriverOption.REQUEST_TIMEOUT, Duration.ofSeconds(5))
        .startProfile("slow")
        .withDuration(DefaultDriverOption.REQUEST_TIMEOUT, Duration.ofSeconds(30))
        .endProfile()
        .build();

CqlSession session = CqlSession.builder().withConfigLoader(loader).build();
The driver has a lot of options available, but in practice it's fine to define most defaults in the configuration file and use the loader only for non-standard settings.
P.S. It's better to use driver 4.5, as it works with both OSS and DSE clusters, plus it brings many improvements, such as reactive support.
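For reference, the 4.x driver lives under different Maven coordinates than the 3.x one; something like the following pulls it in (the patch version is illustrative):

```xml
<!-- Illustrative: the unified 4.x driver core artifact. -->
<dependency>
  <groupId>com.datastax.oss</groupId>
  <artifactId>java-driver-core</artifactId>
  <version>4.5.1</version>
</dependency>
```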
We are working on creating an RDBMS adapter (i.e., to connect to Oracle, MySQL, MSSQL, PostgreSQL, etc.). While connecting to Oracle DB, we're facing a JDBC driver issue when connecting to different versions of Oracle Database.
Isn't there any generic driver that can connect to whatever version of Oracle Database it may be?
For example, for connecting:
MySQL -> mysql-connector-java
Oracle -> ?
Technology Stack:
Spring Boot
Java 8+ (Planning to go further also)
Angular JS
No, you are supposed to use the right driver for your Oracle DB version and Java version. There is no generic driver that fits all combinations; e.g., the ojdbc6.jar driver does not implement methods introduced in JDBC 4.1+ (Java 7+).
See the What are the Oracle JDBC releases versus JDBC specifications? and What are the Oracle JDBC releases versus JDK versions? docs to understand which OJDBC driver you should use.
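As a side note, Oracle now publishes its JDBC drivers on Maven Central, so for a Java 8+ stack a dependency roughly like this should work (the version shown is illustrative; match it to your database release per the docs above):

```xml
<!-- Illustrative: Oracle's ojdbc8 driver for JDK 8+, from Maven Central. -->
<dependency>
  <groupId>com.oracle.database.jdbc</groupId>
  <artifactId>ojdbc8</artifactId>
  <version>19.8.0.0</version>
</dependency>
```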
I am using the DataStax Spark Cassandra connector with Spark 1.6.3.
Is it possible to use the Spark Cassandra connector Java API with Spark 2.0+? I see that the latest version of spark-cassandra-connector-java_2.11 is 1.6.0-M1. Does someone know about the future of the connector's Java API?
Thanks,
Shai
If you're running Spark 2.0-2.2, you can use the Cassandra connector from DataStax: https://spark-packages.org/package/datastax/spark-cassandra-connector (it's written in Scala, but you can just consume the jar).
Check out the compatibility table in the official repo: https://github.com/datastax/spark-cassandra-connector#version-compatibility
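A dependency along these lines should pull in the 2.x connector; as far as I know, the separate -java artifact was folded into the main one, with the Java API living under the com.datastax.spark.connector.japi package (the version number here is illustrative, so verify it against the compatibility table above):

```xml
<!-- Illustrative: 2.x connector for Spark 2.0-2.2 / Scala 2.11. -->
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <version>2.0.10</version>
</dependency>
```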
I have configured a Cassandra cluster with 2 datacenters and 3 nodes each. I wanted to use DCAwareRoundRobinPolicy to specify the local datacenter. I tried using both DataStax Java driver 2.0.2 and 3.1; with 2.0.2 there is no compile-time error, but at run time I get a NoSuchMethodError, and with 3.1 I get "the DCAwareRoundRobinPolicy() constructor is not visible".
Can anyone please let me know how to fix this issue?
Thanks in advance.
For the Java Cassandra driver 3.1, you now use a builder to create your DCAwareRoundRobinPolicy:
DCAwareRoundRobinPolicy dcAwareRoundRobinPolicy = DCAwareRoundRobinPolicy.builder()
.withLocalDc("my-dc")
.withUsedHostsPerRemoteDc(1)
.build();
I would also suggest using TokenAwarePolicy as well, so when you are building your cluster, add:

Cluster cluster = Cluster.builder()
    .addContactPoint("127.0.0.1") // your own contact point(s) here
    .withLoadBalancingPolicy(new TokenAwarePolicy(dcAwareRoundRobinPolicy))
    .build();