Problem connecting to MongoDB from Java

I have a problem with my MongoDB connection in Java. I cannot connect to it: I always get an exception and have not been able to resolve it. I use a Maven dependency for the driver and have already tried other versions of it.
I also checked that MongoDB is running; I started it with net start MongoDB.
My code:
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;

public class MongoDBTest {
    public static void main(String[] args) throws Exception {
        MongoClient mongoClient = new MongoClient(new MongoClientURI("mongodb://127.0.0.1:27017"));
    }
}
My dependency:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>3.10.0</version>
</dependency>
And the exception I got:
Exception in thread "main" java.lang.NoSuchMethodError: com.mongodb.ConnectionString.getThreadsAllowedToBlockForConnectionMultiplier()Ljava/lang/Integer;
at com.mongodb.MongoClientURI.getOptions(MongoClientURI.java:351)
at com.mongodb.Mongo.createCluster(Mongo.java:724)
at com.mongodb.Mongo.<init>(Mongo.java:312)
at com.mongodb.Mongo.<init>(Mongo.java:308)
at com.mongodb.MongoClient.<init>(MongoClient.java:326)
at schlueting.arbeiten.MongoDBTest.main(MongoDBTest.java:9)

The solution was to add the following to the Maven dependencies:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongodb-driver-sync</artifactId>
    <version>3.10.1</version>
</dependency>
That fixed the exception and I could connect to MongoDB.
By the way, I was using MongoDB 4.4 and JDK 11.
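With mongodb-driver-sync on the classpath you can also use the newer client factory instead of the legacy MongoClient constructor. A minimal sketch, assuming the same local instance on 127.0.0.1:27017; the class name and the "test" database are only for illustration:

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;

public class MongoDBSyncTest {
    public static void main(String[] args) {
        // The client connects lazily; the first real operation verifies the server is reachable.
        try (MongoClient client = MongoClients.create("mongodb://127.0.0.1:27017")) {
            MongoDatabase db = client.getDatabase("test");
            System.out.println("Connected, first collection: " + db.listCollectionNames().first());
        }
    }
}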

Related

Spark 2.4.0 Avro Java - cannot resolve method from_avro

I'm trying to run a Spark stream from a Kafka queue containing Avro messages.
As per https://spark.apache.org/docs/latest/sql-data-sources-avro.html I should be able to use from_avro to convert the value column to a Dataset<Row>.
However, I'm unable to compile the project because it complains that from_avro cannot be found, even though I can see the method declared in the dependency's package.class.
How can I use the from_avro method from org.apache.spark.sql.avro in my Java code locally?
import java.io.IOException;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.*;
import org.apache.spark.sql.avro.*;

public class AvroStreamTest {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Creating local sparkSession here...

        Dataset<Row> df = sparkSession
                .readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "host:port")
                .option("subscribe", "avro_queue")
                .load();

        // Cannot resolve method 'from_avro'...
        df.select(from_avro(col("value"), jsonFormatSchema)).writeStream()
                .format("console")
                .outputMode("update")
                .start();
    }
}
pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-avro_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <!-- more dependencies below -->
</dependencies>
It seems that Java is unable to import names from sql.avro's package.class.
It's because of the generated class names: import it as import org.apache.spark.sql.avro.package$; and then use package$.MODULE$.from_avro(...), which should work.
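To make the call compile from Java, the function can be reached through the Scala package object's singleton as described above. A minimal sketch, assuming the same spark-avro_2.11:2.4.0 dependency; the class and method names other than from_avro, and the jsonFormatSchema string, are only for illustration:

import org.apache.spark.sql.Column;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.avro.package$; // the Scala package object is compiled to a class named package$

import static org.apache.spark.sql.functions.col;

public class AvroSelect {

    // Decodes the Kafka "value" column using the Avro schema passed as a JSON string.
    static Dataset<Row> decode(Dataset<Row> df, String jsonFormatSchema) {
        Column decoded = package$.MODULE$.from_avro(col("value"), jsonFormatSchema);
        return df.select(decoded.as("data"));
    }
}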
You need to include spark-sql-avro in your pom.xml which is available at
https://mvnrepository.com/artifact/org.apache.spark/spark-sql-avro_2.11/2.4.0-palantir.28-1-gdf34e2d

HiveContext.sql() gives runtime No Such Method Error

Hi, I am trying to run a simple Java program using Apache Hive and Apache Spark. The program compiles without any errors, but at runtime I get the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;
at SparkHiveExample.main(SparkHiveExample.java:13)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Following is my code:
import org.apache.spark.SparkContext;
import org.apache.spark.SparkConf;
import org.apache.spark.sql.hive.HiveContext;
import org.apache.spark.sql.DataFrame;

public class SparkHiveExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SparkHive Example");
        SparkContext sc = new SparkContext(conf);
        HiveContext hiveContext = new HiveContext(sc);
        System.out.println("Hello World");
        DataFrame df = hiveContext.sql("show tables");
        df.show();
    }
}
My pom.xml file looks as follows:
<project>
    <groupId>edu.berkeley</groupId>
    <artifactId>simple-project</artifactId>
    <modelVersion>4.0.0</modelVersion>
    <name>Simple Project</name>
    <packaging>jar</packaging>
    <version>1.0</version>
    <dependencies>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>
    </dependencies>
</project>
What could be the problem?
EDIT: I tried using the SQLContext.sql() method and I still get a similar "method not found" runtime error. This Stack Overflow answer suggests that the problem is caused by a dependency conflict, but I am unable to figure out which one.
Make sure your spark-core and spark-hive dependencies are set to the provided scope, as shown below. These dependencies are provided by the cluster, not by your application.
Also ensure that your Spark installation is version 1.3 or above: prior to 1.3 the sql method returned an RDD (SchemaRDD) instead of a DataFrame. It is most likely that the installed version of Spark is older than 1.3.
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.3.0</version>
    <scope>provided</scope>
</dependency>
It is also recommended to use a SparkSession object to run queries instead of HiveContext. The snippet below shows how SparkSession is used:
val spark = SparkSession.builder
  .master("local")
  .appName("spark session example")
  .enableHiveSupport()
  .getOrCreate()

spark.sql("show tables")
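Since the question is in Java, the equivalent of the Scala snippet above would look roughly like this. A sketch, assuming Spark 2.x and a matching spark-hive dependency on the classpath; the class name is only for illustration:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkHiveSessionExample {
    public static void main(String[] args) {
        // enableHiveSupport() takes the place of the old HiveContext
        SparkSession spark = SparkSession.builder()
                .appName("SparkHive Example")
                .enableHiveSupport()
                .getOrCreate();

        Dataset<Row> tables = spark.sql("show tables");
        tables.show();

        spark.stop();
    }
}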
The error occurs because you are running a show-tables query and assigning its result to a DataFrame.
You can assign the result of a select query (or similar) to a DataFrame, but not the result of a show query.
from pyspark.sql.types import DecimalType, StringType
from pyspark.sql.functions import *
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("Your APPName").enableHiveSupport().getOrCreate()

from pyspark.sql import HiveContext
hive_context = HiveContext(spark)
hive_context.sql("select current_date()").show()

Not able to connect Spark to Cloudant

I am trying to get data from Cloudant using Java code and I am getting an error.
I tried the cloudant-spark connector with the following Spark versions:
Spark 2.0.0
Spark 2.0.1
Spark 2.0.2
I get the same error for every version, as posted below.
If I add Scala dependencies to resolve this error, they conflict with the Spark library.
Below is my Java code:
package spark.cloudant.connecter;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.SQLContext;
import com.cloudant.spark.*;

public class cloudantconnecter {
    public static void main(String[] args) throws Exception {
        try {
            SparkConf sparkConf = new SparkConf().setAppName("spark cloudant connecter").setMaster("local[*]");
            sparkConf.set("spark.streaming.concurrentJobs", "30");

            JavaSparkContext sc = new JavaSparkContext(sparkConf);
            SQLContext sqlContext = new SQLContext(sc);
            System.out.print("initialization successfully");

            Dataset<org.apache.spark.sql.Row> st = sqlContext.read().format("com.cloudant.spark")
                    .option("cloudant.host", "HOSTNAME")
                    .option("cloudant.username", "USERNAME")
                    .option("cloudant.password", "PASSWORD")
                    .load("DATABASENAME");
            st.printSchema();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Maven Dependencies
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>cloudant-labs</groupId>
        <artifactId>spark-cloudant</artifactId>
        <version>2.0.0-s_2.11</version>
    </dependency>
</dependencies>
Error details:
Exception in thread "main" java.lang.NoSuchMethodError: scala/Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object; (loaded from file:/C:/Users/Administrator/.m2/repository/org/scala-lang/scala-library/2.10.6/scala-library-2.10.6.jar by sun.misc.Launcher$AppClassLoader#9f916f97) called from class scalaj.http.HttpConstants$ (loaded from file:/C:/Users/Administrator/.m2/repository/org/scalaj/scalaj-http_2.11/2.3.0/scalaj-http_2.11-2.3.0.jar by sun.misc.Launcher$AppClassLoader#9f916f97).
at scalaj.http.HttpConstants$.liftedTree1$1(Http.scala:637)
at scalaj.http.HttpConstants$.<init>(Http.scala:636)
at scalaj.http.HttpConstants$.<clinit>(Http.scala)
at scalaj.http.BaseHttp$.$lessinit$greater$default$2(Http.scala:754)
at scalaj.http.Http$.<init>(Http.scala:738)
at scalaj.http.Http$.<clinit>(Http.scala)
at com.cloudant.spark.common.JsonStoreDataAccess.getQueryResult(JsonStoreDataAccess.scala:152)
at com.cloudant.spark.common.JsonStoreDataAccess.getTotalRows(JsonStoreDataAccess.scala:99)
at com.cloudant.spark.common.JsonStoreRDD.totalRows$lzycompute(JsonStoreRDD.scala:56)
at com.cloudant.spark.common.JsonStoreRDD.totalRows(JsonStoreRDD.scala:55)
at com.cloudant.spark.common.JsonStoreRDD.totalPartition$lzycompute(JsonStoreRDD.scala:59)
at com.cloudant.spark.common.JsonStoreRDD.totalPartition(JsonStoreRDD.scala:58)
at com.cloudant.spark.common.JsonStoreRDD.getPartitions(JsonStoreRDD.scala:81)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1934)
at org.apache.spark.rdd.RDD$$anonfun$fold$1.apply(RDD.scala:1046)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
at org.apache.spark.rdd.RDD.fold(RDD.scala:1040)
at org.apache.spark.sql.execution.datasources.json.InferSchema$.infer(InferSchema.scala:68)
at org.apache.spark.sql.DataFrameReader$$anonfun$3.apply(DataFrameReader.scala:317)
at org.apache.spark.sql.DataFrameReader$$anonfun$3.apply(DataFrameReader.scala:317)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:316)
at com.cloudant.spark.DefaultSource.create(DefaultSource.scala:127)
at com.cloudant.spark.DefaultSource.createRelation(DefaultSource.scala:105)
at com.cloudant.spark.DefaultSource.createRelation(DefaultSource.scala:100)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:315)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:132)
at spark.cloudant.connecter.cloudantconnecter.main(cloudantconnecter.java:24)
The error appears because the Spark libraries referenced in the question are built against Scala 2.10, while the spark-cloudant package is built against Scala 2.11.
So please change spark-core_2.10 to spark-core_2.11 (and likewise for spark-mllib).
The dependencies then become:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>cloudant-labs</groupId>
    <artifactId>spark-cloudant</artifactId>
    <version>2.0.0-s_2.11</version>
</dependency>

How to resolve error - package com.squareup.okhttp3 doesn't exist?

I am trying to convert JSON to a Java object using GSON in a Maven project, following a YouTube video for guidance - https://www.youtube.com/watch?v=Vqgghm9pWe0 . However, an error occurs in the Main class: package com.squareup.okhttp3 doesn't exist. The code is below:
Java
package com.codebeasty.json;

import com.squareup.okhttp3.OkHttpClient;

public class Main {
    private static OkHttpClient client = new OkHttpClient();

    public static void main(String[] args) {
    }
}
I even put the dependency in pom.xml:
<dependencies>
    <dependency>
        <groupId>com.squareup.okhttp3</groupId>
        <artifactId>okhttp</artifactId>
        <version>3.4.2</version>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.8.0</version>
    </dependency>
</dependencies>
I don't understand why it doesn't recognize com.squareup. Is there something extra I may need to download? I have downloaded the JAR from this website - http://square.github.io/okhttp/ - and also tried building the project with dependencies. Please help :(
The Maven repo has 3.4.1 as its latest version. Try changing the version in the pom, or install the downloaded JAR into your local repository using:
mvn install:install-file -Dfile=path/to/okhttp-3.4.2.jar -DgroupId=com.squareup.okhttp3 -DartifactId=okhttp -Dversion=3.4.2 -Dpackaging=jar
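Once the artifact resolves, a minimal OkHttp + Gson sketch of the JSON-to-object conversion the question aims at might look like the following. Note that the Java package for OkHttp 3.x is okhttp3 (com.squareup.okhttp3 is only the Maven groupId); the URL, class name, and POJO below are placeholders:

package com.codebeasty.json;

import java.io.IOException;

import com.google.gson.Gson;

import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class JsonFetchExample {

    // Placeholder POJO matching the JSON you expect back
    static class Todo {
        int id;
        String title;
    }

    private static final OkHttpClient client = new OkHttpClient();

    public static void main(String[] args) throws IOException {
        Request request = new Request.Builder()
                .url("https://example.com/todo.json") // placeholder URL
                .build();

        try (Response response = client.newCall(request).execute()) {
            String json = response.body().string();
            Todo todo = new Gson().fromJson(json, Todo.class);
            System.out.println(todo.title);
        }
    }
}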

Calling Sonar from my Java program

I have installed the Sonar server on my localhost, and I am able to run it and analyse a Java project. I have even installed the Sonar plugin in Eclipse.
But I want to run Sonar from my Java program (a simple Java class), retrieve the Sonar results, and save them in a database. I searched for a tutorial but was unable to find an answer. Can anyone give sample code or point to a resource where I can gain the knowledge to accomplish this task?
import javax.annotation.Resource;

import org.sonar.wsclient.Host;
import org.sonar.wsclient.Sonar;
import org.sonar.wsclient.connectors.HttpClient4Connector;
import org.sonar.wsclient.services.*;

public class SonarTask1 {
    public static void main(String[] args) {
    //public void Hi() {
        String url = "http://localhost:9000";
        String login = "admin";
        String password = "admin";

        Sonar sonar = new Sonar(new HttpClient4Connector(new Host(url, login, password)));

        String projectKey = "java-sonar-runner-simple";
        String manualMetricKey = "burned_budget";
        sonar.create(ManualMeasureCreateQuery.create(projectKey, manualMetricKey).setValue(50.0));

        for (ManualMeasure manualMeasure : sonar.findAll(ManualMeasureQuery.create(projectKey))) {
            System.out.println("Manual measure on project: " + manualMeasure);
        }
    }
}
There are two things you can do from a Java program:
1. Launch a Sonar analysis: look into the Sonar Ant Task (its #launchAnalysis method) to see how to do that very easily.
2. Retrieve results from the Sonar server: check the Web API for that purpose.
About the " java.lang.NoClassDefFoundError: org/apache/http/client/methods/HttpRequestBase" you need this dependencies (Sonar WS uses them)
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.3.1</version>
</dependency>
<dependency>
    <groupId>commons-logging</groupId>
    <artifactId>commons-logging</artifactId>
    <version>1.1.1</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>jcl-over-slf4j</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <version>1.6.2</version>
</dependency>
Also, when using the Eclipse IDE, I'd recommend running a Maven project update with "force update of snapshots/releases" checked, because somehow org.apache.httpclient was not being picked up on the classpath.
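For the second point, the wsclient already used in the question can also read measures back from the server. A rough sketch against the legacy Sonar web-service client; the project key and metric keys are only examples:

import org.sonar.wsclient.Host;
import org.sonar.wsclient.Sonar;
import org.sonar.wsclient.connectors.HttpClient4Connector;
import org.sonar.wsclient.services.Resource;
import org.sonar.wsclient.services.ResourceQuery;

public class SonarResultsReader {
    public static void main(String[] args) {
        Sonar sonar = new Sonar(new HttpClient4Connector(new Host("http://localhost:9000", "admin", "admin")));

        // Ask the server for selected metrics of one project (keys are examples)
        ResourceQuery query = ResourceQuery.createForMetrics("java-sonar-runner-simple", "ncloc", "violations");
        Resource project = sonar.find(query);

        if (project != null) {
            System.out.println("ncloc = " + project.getMeasureValue("ncloc"));
            System.out.println("violations = " + project.getMeasureValue("violations"));
        }
    }
}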
