The documentation here uses JdbcTemplate:
https://docs.spring.io/spring-batch/docs/4.1.x/reference/html/testing.html#testing
I would like to ask how I could write an integration test with Spring Batch using MongoDB. Preferably, could you provide a concrete example?
For your information, I am using these dependencies:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>
<dependency>
    <groupId>de.flapdoodle.embed</groupId>
    <artifactId>de.flapdoodle.embed.mongo</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-batch</artifactId>
    <version>${spring.version}</version>
</dependency>
<dependency>
    <groupId>org.springframework.batch</groupId>
    <artifactId>spring-batch-test</artifactId>
    <version>${spring.batch.version}</version>
    <scope>test</scope>
</dependency>
If only I could find a Spring Batch example using MongoDB.
You can find an example of a job reading/writing data from/to MongoDB here: https://github.com/spring-projects/spring-batch/tree/master/spring-batch-samples#mongodb-sample.
The code of the example is here: https://github.com/spring-projects/spring-batch/blob/master/spring-batch-samples/src/main/java/org/springframework/batch/sample/mongodb/MongoDBSampleApp.java.
If you are planning to write an integration test against MongoDB, you can use de.flapdoodle.embed.mongo or Testcontainers.
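For illustration, here is a minimal sketch of such a test under the dependencies above. It assumes JUnit 5 on the test classpath, a Spring Boot context with exactly one Job bean, and a hypothetical Person document written to a person collection; @SpringBatchTest (available since Spring Batch 4.1) registers the JobLauncherTestUtils bean, and Spring Boot auto-configures the embedded MongoDB from de.flapdoodle.embed.mongo for tests:

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;

import org.junit.jupiter.api.Test;
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.test.JobLauncherTestUtils;
import org.springframework.batch.test.context.SpringBatchTest;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.data.mongodb.core.MongoTemplate;

@SpringBatchTest
@SpringBootTest
class MongoJobIntegrationTest {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils; // registered by @SpringBatchTest

    @Autowired
    private MongoTemplate mongoTemplate; // backed by the embedded MongoDB during tests

    @Test
    void jobWritesDocumentsToMongo() throws Exception {
        // Launch the single Job bean defined in the application context.
        JobExecution jobExecution = jobLauncherTestUtils.launchJob();

        assertEquals(BatchStatus.COMPLETED, jobExecution.getStatus());
        // Person and the "person" collection are hypothetical; adapt them to your job.
        assertFalse(mongoTemplate.findAll(Person.class, "person").isEmpty());
    }
}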
I want to set a session timeout of 120 minutes for Camunda.
This is the configuration in my pom.xml:
...
...
<dependency>
    <groupId>org.camunda.bpm.springboot</groupId>
    <artifactId>camunda-bpm-spring-boot-starter</artifactId>
    <version>2.3.0</version>
</dependency>
<dependency>
    <groupId>org.camunda.bpm.springboot</groupId>
    <artifactId>camunda-bpm-spring-boot-starter-webapp</artifactId>
    <version>2.3.0</version>
</dependency>
<dependency>
    <groupId>org.camunda.bpm.springboot</groupId>
    <artifactId>camunda-bpm-spring-boot-starter-rest</artifactId>
    <version>2.3.0</version>
</dependency>
<dependency>
    <groupId>org.camunda.bpm</groupId>
    <artifactId>camunda-engine-rest-core</artifactId>
    <version>7.8.0</version>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
...
...
I tried a lot of approaches; the two most often proposed on the forums are the following, one to use if the Spring Boot version is 1.x and the other if Spring Boot is 2.x:
server.connection-timeout=...
server.servlet.session.timeout=...
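(For context, on Spring Boot 2.x that second property takes a duration, so a 120-minute value would look like this in application.properties:)

server.servlet.session.timeout=120m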
These are the dependency versions:
<springboot.version>2.3.0</springboot.version>
<version.camunda>7.8.0</version.camunda>
Are there other ways to set the session timeout?
I achieved something similar using Spring Session when using Spring Boot 2.x.x.
My pom.xml has these dependencies:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.session</groupId>
    <artifactId>spring-session-jdbc</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.session</groupId>
    <artifactId>spring-session-core</artifactId>
</dependency>
Using these, Camunda will rely on Spring Session for session management, so you can control the session using the standard options provided by Spring Session. However, keep in mind that this will use your persistence layer (Postgres/H2/etc.).
There's a spring.session.timeout property available; try setting it to 120m in application.properties / application.yaml.
I have these in my application.yaml config file:
spring:
  session:
    store-type: jdbc
    jdbc.initialize-schema: always
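Combined with the timeout suggestion above, the block would look like this (a sketch targeting the 120 minutes from the question):

spring:
  session:
    store-type: jdbc
    jdbc.initialize-schema: always
    timeout: 120m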
I have a Java Spring Boot GraphQL project.
These are my dependencies in the pom.xml:
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <optional>true</optional>
    </dependency>
    <dependency>
        <groupId>com.graphql-java-kickstart</groupId>
        <artifactId>graphql-spring-boot-starter</artifactId>
        <version>12.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.graphql-java-kickstart</groupId>
        <artifactId>playground-spring-boot-starter</artifactId>
        <version>5.10.0</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>
When I run the application and visit http://localhost:3001/playground, I get an empty page saying
"Loading GraphQL Playground"
What could possibly be my problem here?
You only need this dependency for the GraphQL Playground interface, so you can remove playground-spring-boot-starter:
<dependency>
    <groupId>com.graphql-java-kickstart</groupId>
    <artifactId>graphql-spring-boot-starter</artifactId>
    <version>12.0.0</version>
</dependency>
In your application.yml you need to explicitly set the static path for the interface files (the path is /vendor/playground/):
static-path:
  base: <YOUR-CONTEXT-PATH>/vendor/playground/
Here is an example of some options to enable GraphQL Playground; notice that the context-path here is /api:
graphql:
  playground:
    endpoint: /graphql
    subscriptionEndpoint: /subscriptions
    enabled: true
    pageTitle: Playground
    cdn:
      enabled: false
      version: latest
    static-path:
      base: /api/vendor/playground/
I had the same issue and the settings above worked for me.
Could you please let me know how to use spring-jdbc in Quarkus? I am converting my application from Spring to Quarkus and, for now, I need to keep using JdbcTemplate, but I don't see how to use it.
I am using the dependencies below:
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-spring-data-jpa</artifactId>
</dependency>
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-spring-web</artifactId>
</dependency>
But I didn't find anything for spring-jdbc.
There is no such thing as JdbcTemplate in Quarkus, nor any support for spring-jdbc.
So the answer is that you cannot use them; you need to convert the usage to Spring Data (or Hibernate with Panache), or inject a DataSource object and work with it directly.
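Here is a minimal sketch of the injected-DataSource approach; the repository, table, and column names are made up for illustration, and the DataSource is assumed to come from the usual Quarkus JDBC extension (Agroal):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.sql.DataSource;

@ApplicationScoped
public class UserRepository {

    @Inject
    DataSource dataSource; // provided by Quarkus, e.g. via quarkus-jdbc-postgresql

    public String findNameById(long id) throws SQLException {
        // Plain JDBC replaces what JdbcTemplate would otherwise do.
        try (Connection connection = dataSource.getConnection();
             PreparedStatement statement =
                     connection.prepareStatement("SELECT name FROM users WHERE id = ?")) {
            statement.setLong(1, id);
            try (ResultSet resultSet = statement.executeQuery()) {
                return resultSet.next() ? resultSet.getString("name") : null;
            }
        }
    }
}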
We found that a past version of spring-jdbc works perfectly fine, even with native compilation. Hope that's good enough in your case.
Replacing the logging dependency is also required for native compilation, because of the Class.forName usages in commons-logging:
<spring.jdbc.version>4.3.30.RELEASE</spring.jdbc.version>

<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-jdbc</artifactId>
    <version>${spring.jdbc.version}</version>
    <exclusions>
        <exclusion>
            <groupId>commons-logging</groupId>
            <artifactId>commons-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.jboss.logging</groupId>
    <artifactId>commons-logging-jboss-logging</artifactId>
</dependency>
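With those dependencies in place, a sketch of wiring JdbcTemplate manually over the Quarkus-managed DataSource could look like this (the repository class and query are hypothetical):

import javax.enterprise.context.ApplicationScoped;
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

@ApplicationScoped
public class ReportRepository {

    private final JdbcTemplate jdbcTemplate;

    // Quarkus injects its Agroal DataSource; spring-jdbc only sees a plain JDBC DataSource.
    public ReportRepository(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public int countReports() {
        Integer count = jdbcTemplate.queryForObject("SELECT COUNT(*) FROM reports", Integer.class);
        return count != null ? count : 0;
    }
}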
I'm trying to connect Spark and a Cassandra database using Java. For connecting Spark and Cassandra I'm using the latest version of the Spark Cassandra Connector, i.e. 2.4.0. Currently I can connect Spark and Cassandra using the connector, and I get data in RDD format, but I cannot read data from that data structure. If I pass a row reader factory as the third parameter of cassandraTable(), I get:
> Wrong 3rd argument type. Found:
> 'java.lang.Class<com.journaldev.sparkdemo.JohnnyDeppDetails>',
> required:
> 'com.datastax.spark.connector.rdd.reader.RowReaderFactory<T>'
Can anyone tell me which version I should use, or what the problem is here?
CassandraTableScanJavaRDD pricesRDD2 =
        CassandraJavaUtil.javaFunctions(sc).cassandraTable(keyspace, table, JohnnyDeppDetails.class);
My pom.xml:
<!-- Import Spark -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.11</version>
    <scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector -->
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.5.0-M2</version>
</dependency>
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>2.1.9</version>
</dependency>
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-mapping</artifactId>
    <version>2.1.9</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.0</version>
</dependency>
</dependencies>
Instead of passing the class instance, you need to create a RowReaderFactory using the mapRowTo function, like this (this is from my example):
CassandraJavaRDD<UUIDData> uuids = javaFunctions(spark.sparkContext())
        .cassandraTable("test", "utest", mapRowTo(UUIDData.class));
When you write back, you can convert the class into the corresponding factory via the mapToRow function.
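For example, a write-back sketch for the same hypothetical UUIDData class and test.utest table would look like this:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

// Save the RDD read above back to the same keyspace and table.
javaFunctions(uuids)
        .writerBuilder("test", "utest", mapToRow(UUIDData.class))
        .saveToCassandra();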
I have some Scala utility classes for loading CSV files and manipulating them as DataFrames. They work fine from Scala.
I just tried using the classes by invoking my Scala util from Java, and I got the following exception:
java.lang.NoClassDefFoundError: scala/Product$class
    at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:723)
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:571)
Both my Java and Scala projects are Maven projects.
My Java application's pom.xml has just one dependency: the dependency on my Scala util.
My Scala util initiates a SparkSession, loads the CSV files, and manipulates the data in DataFrames. It has the following dependencies (which work fine when running as standalone Scala):
<!-- spark -->
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.11 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.11 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib_2.11 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-graphx_2.11 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-graphx_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
Can someone offer a hint as to what I am missing?
UPDATE: This is not a duplicate question.
Scala is not invoked directly in Java. Even so, I added a property to the Java pom:
<scala.version>2.11.11</scala.version>
This made no difference; the property is already in the Scala pom.