I tried to replicate the same example given in the following question.
import javax.sql.DataSource;
import org.apache.camel.main.Main;
import org.apache.camel.builder.RouteBuilder;
import org.apache.commons.dbcp.BasicDataSource;

public class JDBCExample {
    private Main main;

    public static void main(String[] args) throws Exception {
        JDBCExample example = new JDBCExample();
        example.boot();
    }

    public void boot() throws Exception {
        // create a Main instance
        main = new Main();
        // enable hangup support so you can press ctrl + c to terminate the JVM
        main.enableHangupSupport();
        String url = "jdbc:oracle:thin:@MYSERVER:1521:myDB";
        DataSource dataSource = setupDataSource(url);
        // bind dataSource into the registry
        main.bind("myDataSource", dataSource);
        // add routes
        main.addRouteBuilder(new MyRouteBuilder());
        // run until you terminate the JVM
        System.out.println("Starting Camel. Use ctrl + c to terminate the JVM.\n");
        main.run();
    }

    class MyRouteBuilder extends RouteBuilder {
        public void configure() {
            String dst = "C:/Local Disk E/TestData/Destination";
            from("direct:myTable")
                .setBody(constant("select * from myTable"))
                .to("jdbc:myDataSource")
                .to("file:" + dst);
        }
    }

    private DataSource setupDataSource(String connectURI) {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
        ds.setUsername("sa");
        ds.setPassword("devon1");
        ds.setUrl(connectURI);
        return ds;
    }
}
I have included camel-jdbc-3.0.1.jar and my DB-specific jar file on my classpath.
When I try to compile the code using the following command:
javac -cp .;D:\Code\bin JDBCExample.java
I get the following errors:
JDBCExample.java:2: error: package org.apache.camel.main does not exist
import org.apache.camel.main.Main;
^
JDBCExample.java:3: error: package org.apache.camel.builder does not exist
import org.apache.camel.builder.RouteBuilder;
^
JDBCExample.java:4: error: package org.apache.commons.dbcp does not exist
import org.apache.commons.dbcp.BasicDataSource;
Where am I going wrong? I tried adding camel-core to the classpath, but it didn't help.
Kindly let me know your thoughts, thanks in advance.
You did well by adding camel-core to your classpath, but camel-core and camel-jdbc do not suffice; you should also add the following dependencies:
JDBCExample.java:2: error: package org.apache.camel.main does not exist
import org.apache.camel.main.Main;
Add camel-main dependency
JDBCExample.java:4: error: package org.apache.commons.dbcp does not exist
import org.apache.commons.dbcp.BasicDataSource;
Add commons-dbcp dependency
JDBCExample.java:3: error: package org.apache.camel.builder does not exist
import org.apache.camel.builder.RouteBuilder;
Add camel-core dependency
With these and the camel-jdbc dependency, you are good to go.
I suggest that you use Maven to handle your dependencies (and much more) if you can. If you have not used it before, this five-minute quickstart will gently introduce you to it.
Here is a sample pom.xml that resolves all these dependencies correctly
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>demo</groupId>
    <artifactId>camel-jdbc-demo</artifactId>
    <packaging>jar</packaging>
    <version>1.0-SNAPSHOT</version>
    <name>camel-jdbc-demo</name>
    <url>http://maven.apache.org</url>

    <dependencies>
        <!-- https://mvnrepository.com/artifact/commons-dbcp/commons-dbcp -->
        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>1.4</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.camel/camel-main -->
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-main</artifactId>
            <version>3.0.1</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.camel/camel-core -->
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-core</artifactId>
            <version>3.0.1</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.camel/camel-jdbc -->
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-jdbc</artifactId>
            <version>3.0.1</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>
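With this pom in place, mvn compile will download these artifacts and their transitive dependencies for you, so you no longer have to hand-assemble the -cp argument for javac.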
I'm trying to run a Spark stream from a Kafka queue containing Avro messages.
As per https://spark.apache.org/docs/latest/sql-data-sources-avro.html I should be able to use from_avro to convert a column's value to a Dataset<Row>.
However, I'm unable to compile the project, as it complains that from_avro cannot be found, even though I can see the method declared in the dependency's package.class.
How can I use the from_avro method from org.apache.spark.sql.avro in my Java code locally?
import java.io.IOException;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.*;
import org.apache.spark.sql.avro.*;

public class AvroStreamTest {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Creating local sparkSession here...

        Dataset<Row> df = sparkSession
            .readStream()
            .format("kafka")
            .option("kafka.bootstrap.servers", "host:port")
            .option("subscribe", "avro_queue")
            .load();

        // Cannot resolve method 'from_avro'...
        df.select(from_avro(col("value"), jsonFormatSchema)).writeStream().format("console")
            .outputMode("update")
            .start();
    }
}
pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-avro_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <!-- more dependencies below -->
</dependencies>
It seems like Java is unable to import names from sql.avro.package.class
It's because of the generated class names. Importing it as import org.apache.spark.sql.avro.package$; and then using package$.MODULE$.from_avro(...) should work.
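To make that concrete, here is a minimal sketch (reusing the df and jsonFormatSchema from the question) of how the method becomes reachable from Java:

import org.apache.spark.sql.Column;
import org.apache.spark.sql.avro.package$;
import static org.apache.spark.sql.functions.col;

// Scala compiles a package object to a class literally named "package$",
// whose singleton instance lives in the static MODULE$ field, so from_avro
// can be called from Java through that instance:
Column decoded = package$.MODULE$.from_avro(col("value"), jsonFormatSchema);
df.select(decoded)
    .writeStream()
    .format("console")
    .outputMode("update")
    .start();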
You need to include spark-sql-avro in your pom.xml, which is available at
https://mvnrepository.com/artifact/org.apache.spark/spark-sql-avro_2.11/2.4.0-palantir.28-1-gdf34e2d
I'm trying to run a junit test with Serenity BDD framework, using IntelliJ IDEA.
I get an error when I try to run the test:
java.lang.Exception: No tests found matching Method ... from org.junit.internal.requests.ClassRequest#71e693fa
This appears to be due to the @RunWith annotation invoking the SerenityParameterizedRunner:
@RunWith(SerenityParameterizedRunner.class)
When the @RunWith annotation is commented out, the test is found and starts executing (though that is not of much use, since we're relying on the parameterized runner for building data).
I'm able to reproduce the problem with a simple project, shown below.
package com.home;

public class Doorbell {
    private int ringCount = 0;

    public Doorbell() {
    }

    public void ring() {
        System.out.println("Ring!");
        ringCount++;
    }

    public int getRings() {
        return ringCount;
    }
}
Test Class:
package com.home;

import net.serenitybdd.junit.runners.SerenityParameterizedRunner;
import net.serenitybdd.junit.runners.SerenityRunner;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(SerenityParameterizedRunner.class)
public class DoorbellTest {

    @Test
    public void testRings() {
        Doorbell db = new Doorbell();
        db.ring();
        db.ring();
        Assert.assertEquals(2, db.getRings());
    }
}
pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>homeproject</groupId>
    <artifactId>mytestproject</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <serenity.version>1.9.31</serenity.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/net.serenity-bdd/serenity-core -->
        <dependency>
            <groupId>net.serenity-bdd</groupId>
            <artifactId>serenity-core</artifactId>
            <version>1.9.31</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/net.serenity-bdd/serenity-junit -->
        <dependency>
            <groupId>net.serenity-bdd</groupId>
            <artifactId>serenity-junit</artifactId>
            <version>1.9.31</version>
        </dependency>
    </dependencies>
</project>
Please try running the single unit test in the project. Any help much appreciated.
Answer 11 from this Stack Overflow question solved my problem
Apparently you have to run the class containing the test, and not the test itself.
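A hedged guess as to why (this is my reading, not from the linked answer): SerenityParameterizedRunner generates its own test variants, so JUnit's method-level filter, which matches on the original method name, finds nothing to run; running the whole class skips that filter.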
While practicing a "Hadoop RPC" sample, I keep getting this error.
Following previous similar questions and answers, I've checked the jar file on my classpath: I have hadoop-common.jar, and it does contain hadoop.conf.Configuration.class.
And here's the code to build RPCServer:
package rpc;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.RPC;
import org.apache.hadoop.ipc.Server;

public class RPCServer implements MyBizable {

    public String doSomething(String str) {
        return str;
    }

    public static void main(String[] args) throws Exception {
        Server server = new RPC.Builder(new Configuration())
            .setProtocol(MyBizable.class)
            .setInstance(new RPCServer())
            .setBindAddress("***.***.***.***")
            .setPort(****)
            .build();
        server.start();
    }
}
And still this error shows up. Does anyone know how to solve it?
Any help will be greatly appreciated! Thanks in advance!
Are you using Maven?
If yes, then add the dependencies below.
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
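A couple of hedged notes on this snippet: ${hadoop.version} is a Maven property you must define yourself in a <properties> block (set it to the version your cluster runs), and <scope>provided</scope> tells Maven the Hadoop jars will be supplied by the runtime environment; for a standalone RPC sample launched from the IDE or the command line, dropping the provided scope keeps the jars on the runtime classpath.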
I am trying to use documents4j to convert between file types, and I have tried several source and target formats.
The code executes successfully, and I can intermittently see files getting converted and produced. But at the end of the execution the converted files are, I think, deleted automatically: I cannot see them in the temp folder that is created.
I printed the Future conversion object, and here is the result:
LocalConversion{pending=false, cancelled=false, done=true, priority=Priority{value=1000, creationTime=1527163966676}, file-system-target=C:\Users\USERNAME\Desktop\New folder\63cabe72-b2cf-4d52-b428-530dfc0fd63d\temp2}.
Is the target file moved to some other location after the conversion?
Or am I missing some lines of code that copy the target file to another location?
I am using version 1.0.3 of the documents4j libraries.
Code:
import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

import com.documents4j.api.DocumentType;
import com.documents4j.api.IConverter;
import com.documents4j.job.LocalConverter;

public class Test {

    public static void main(String[] args) {
        try {
            ByteArrayOutputStream bo = new ByteArrayOutputStream();
            InputStream in = new BufferedInputStream(new FileInputStream("SOME_.TXT_FILE"));

            IConverter converter = LocalConverter.builder()
                .baseFolder(new File("SOME_FOLDER_PATH"))
                .workerPool(20, 25, 2, TimeUnit.SECONDS)
                .processTimeout(5, TimeUnit.SECONDS)
                .build();

            Future<Boolean> conversion = converter
                .convert(in).as(DocumentType.TEXT)
                .to(bo).as(DocumentType.DOCX)
                .prioritizeWith(1000)
                .schedule();
            conversion.get();
            System.out.println(conversion);

            if (conversion.isDone()) {
                System.out.println("Done");
            } else if (conversion.isCancelled()) {
                System.out.println("Cancelled");
            }
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}
I am using Maven for dependency management. My pom.xml:
<properties>
    <documents4j.version>1.0.3</documents4j.version>
</properties>

<dependencies>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-api</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-util-conversion</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-transformer</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-util-all</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-local</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-transformer-msoffice-word</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-transformer-msoffice-base</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
</dependencies>
I figured it out after some debugging.
The converted target file is stored back in the ByteArrayOutputStream object bo.
You just need to save that as a file.
// bo now holds the converted DOCX bytes; write them to disk and close the stream
try (FileOutputStream fos = new FileOutputStream("C:\\Users\\USERNAME\\Desktop\\New folder\\OTGv4.docx")) {
    bo.writeTo(fos);
}
Hope this helps!
I am having some issues connecting to the JCR repository within AEM 6.0. When I get to the point of creating a session on the repository, I get a javax.jcr.lock.LockException: Precondition Failed.
I have been using this tutorial to get started.
Here is my very simple code sample:
import java.io.FileNotFoundException;
import java.io.FileReader;

import javax.jcr.Repository;
import javax.jcr.Session;
import javax.jcr.SimpleCredentials;

import org.apache.jackrabbit.commons.JcrUtils;

import com.opencsv.CSVReader;

public class Main {

    public static void main(String[] args) throws FileNotFoundException {
        Repository repository;
        FileReader fileReader;
        CSVReader csvReader;
        try {
            System.out.println("connecting to repository");
            repository = JcrUtils.getRepository("http://localhost:4502/crx/server");
            Session session = repository.login(new SimpleCredentials("admin", "admin".toCharArray())); // throws javax.jcr.lock.LockException: Precondition Failed
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}
Any guidance would be greatly appreciated.
Inside a JCR repository, content is organized into one or more workspaces, each of which holds a hierarchical structure of nodes and properties. So to create a JCR session and access nodes and properties, you have to pass the workspace along with the credentials; the default AEM workspace is crx.default.
Instead of:
Session session = repository.login(new SimpleCredentials("admin", "admin".toCharArray()));
Use:
Session session = repository.login(new SimpleCredentials("admin", "admin".toCharArray()), "crx.default");
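Once the login succeeds, it is worth proving the session works and releasing it afterwards. A minimal sketch (the root-node read is just an illustrative check, not required):

try {
    // read something trivial to confirm the session is live
    System.out.println(session.getRootNode().getPath());
} finally {
    // a session holds server-side resources until logout
    session.logout();
}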
Please check the link below:
javax.jcr.lock.LockException:Precondition Failed
The obvious first question: is the AEM server running?
Secondly, maybe your build environment is not set up correctly.
I was able to set up a working project using your code and this Maven file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.stackoverflow.test</groupId>
    <artifactId>access_crx_from_outside</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>
        <dependency>
            <groupId>javax.jcr</groupId>
            <artifactId>jcr</artifactId>
            <version>2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.jackrabbit</groupId>
            <artifactId>jackrabbit-jcr-commons</artifactId>
            <version>2.7.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.jackrabbit</groupId>
            <artifactId>jackrabbit-jcr2dav</artifactId>
            <version>2.6.0</version>
        </dependency>
    </dependencies>
</project>