I have just started learning Apache Camel. I understood the basics of Routes and Components. Now I want to give it a try by connecting to an Oracle database, reading records from one particular table and writing those records to a file using the File component. To read from the database I assume I need to use the JDBC component and give the dataSourceName.
However, I couldn't find any info on how to create a dataSource using Camel. All the info I found on this topic uses Spring DSL examples. I don't use Spring and I just need to test this using a simple standalone Java application.
I am using JDK7u25 with Apache Camel 2.12.1.
Can someone please post a sample to read from the Oracle table and write to a file?
[EDIT]
After checking several solutions on the web, I came to know about the following two approaches:
Running Camel as a standalone application. Here is my code:
import javax.sql.DataSource;
import org.apache.camel.main.Main;
import org.apache.camel.builder.RouteBuilder;
import org.apache.commons.dbcp.BasicDataSource;
public class JDBCExample {
private Main main;
public static void main(String[] args) throws Exception {
JDBCExample example = new JDBCExample();
example.boot();
}
public void boot() throws Exception {
// create a Main instance
main = new Main();
// enable hangup support so you can press ctrl + c to terminate the JVM
main.enableHangupSupport();
String url = "jdbc:oracle:thin:@MYSERVER:1521:myDB";
DataSource dataSource = setupDataSource(url);
// bind dataSource into the registry
main.bind("myDataSource", dataSource);
// add routes
main.addRouteBuilder(new MyRouteBuilder());
// run until you terminate the JVM
System.out.println("Starting Camel. Use ctrl + c to terminate the JVM.\n");
main.run();
}
class MyRouteBuilder extends RouteBuilder {
public void configure() {
String dst = "C:/Local Disk E/TestData/Destination";
from("direct:myTable")
.setBody(constant("select * from myTable"))
.to("jdbc:myDataSource")
.to("file:" + dst);
}
}
private DataSource setupDataSource(String connectURI) {
BasicDataSource ds = new BasicDataSource();
ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
ds.setUsername("sa");
ds.setPassword("devon1");
ds.setUrl(connectURI);
return ds;
}
}
Using the approach mentioned by Claus Ibsen. Here is the code again:
import javax.sql.DataSource;
import org.apache.camel.CamelContext;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.camel.impl.SimpleRegistry;
import org.apache.camel.main.Main;
import org.apache.camel.builder.RouteBuilder;
import org.apache.commons.dbcp.BasicDataSource;
public class JDBCExample {
private Main main;
public static void main(String[] args) throws Exception {
String url = "jdbc:oracle:thin:@MYSERVER:1521:myDB";
DataSource dataSource = setupDataSource(url);
SimpleRegistry reg = new SimpleRegistry();
reg.put("myDataSource",dataSource);
CamelContext context = new DefaultCamelContext(reg);
context.addRoutes(new JDBCExample().new MyRouteBuilder());
context.start();
Thread.sleep(5000);
context.stop();
}
class MyRouteBuilder extends RouteBuilder {
public void configure() {
String dst = "C:/Local Disk E/TestData/Destination";
from("direct:myTable")
.setBody(constant("select * from myTable"))
.to("jdbc:myDataSource")
.to("file:" + dst);
}
}
private static DataSource setupDataSource(String connectURI) {
BasicDataSource ds = new BasicDataSource();
ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
ds.setUsername("sa");
ds.setPassword("devon1");
ds.setUrl(connectURI);
return ds;
}
}
But in both cases I am getting the exception below:
Caused by: org.apache.camel.ResolveEndpointFailedException: Failed to resolve endpoint: jdbc://myDataSource due to: No component found with scheme: jdbc
at org.apache.camel.impl.DefaultCamelContext.getEndpoint(DefaultCamelContext.java:534)
at org.apache.camel.util.CamelContextHelper.getMandatoryEndpoint(CamelContextHelper.java:63)
at org.apache.camel.model.RouteDefinition.resolveEndpoint(RouteDefinition.java:192)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:106)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:112)
at org.apache.camel.model.SendDefinition.resolveEndpoint(SendDefinition.java:61)
at org.apache.camel.model.SendDefinition.createProcessor(SendDefinition.java:55)
at org.apache.camel.model.ProcessorDefinition.makeProcessor(ProcessorDefinition.java:500)
at org.apache.camel.model.ProcessorDefinition.addRoutes(ProcessorDefinition.java:213)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:909)
... 12 more
[Thread-0] INFO org.apache.camel.main.MainSupport$HangupInterceptor - Received hang up - stopping the main instance.
There is the SQL example which shows how to set up a DataSource:
http://camel.apache.org/sql-example.html
Yes, that example uses Spring XML. But setting up the DataSource can be done in Java code as well. You then need to register the DataSource in the Camel registry.
For example, you can use a JndiRegistry or the SimpleRegistry. The latter is easier.
Here is some pseudo code showing the principle: create a registry, add your beans to it, and then pass the registry to the constructor of DefaultCamelContext.
SimpleRegistry registry = new SimpleRegistry();
// code to create data source here
DataSource ds = ...
registry.put("myDataSource", ds);
CamelContext camel = new DefaultCamelContext(registry);
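For completeness, the JndiRegistry variant mentioned above looks much the same. A minimal sketch, assuming Camel 2.x, reusing the same placeholder DataSource (ds) and bean name, and using org.apache.camel.util.jndi.JndiContext:
Context jndi = new JndiContext();
jndi.bind("myDataSource", ds);   // same DataSource as above
// in Camel 2.x, DefaultCamelContext(Context) wraps the JNDI context in a JndiRegistry
CamelContext camel = new DefaultCamelContext(jndi);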
So silly me! I had not included camel-jdbc-2.12.1.jar in the CLASSPATH. Now the above examples work.
Spring was mentioned there just because it is a very useful paradigm for working with databases (mainly because of the templates introduced by the Spring Framework). Of course you can hook up a standard JDBC connection and implement the DAO yourself - nothing wrong with that.
I'm trying to migrate from Vert.x to Quarkus. In Vert.x, when I write message consumers for Kafka/AMQP etc., I have to scale the number of verticles to maximize performance across multiple cores. Is this possible in Quarkus? I see a similar question here but it wasn't answered.
For example, with Kafka I might create a consumer inside a verticle and then scale that verticle, say, 10 times (that is, specify the number of instances in the deployment to be 10) after doing performance testing to determine that's the optimal number. My understanding is that by default, 1 verticle = 1 event loop and does not scale across multiple cores.
I know that it's possible to use Vert.x verticles in Quarkus, but is there another way to scale things like the number of Kafka consumers across multiple cores?
I see that this type of scalability is configurable for things like Quarkus HTTP, but I can't find anything about message consumers.
Here's the Vert.x Verticle approach that overall I'm very happy with, but I wish there were better documentation on how to do this.
UPDATE - Field injection doesn't work with this example, but constructor injection does work.
Let's say I want to inject this:
@ApplicationScoped
public class CoffeeRepositoryService {
public CoffeeRepositoryService() {
System.out.println("Injection succeeded!");
}
}
Here's my Verticle
package org.acme;
import io.smallrye.mutiny.Uni;
import io.smallrye.mutiny.vertx.core.AbstractVerticle;
import io.vertx.core.impl.logging.Logger;
import io.vertx.core.impl.logging.LoggerFactory;
import io.vertx.mutiny.core.eventbus.EventBus;
import io.vertx.mutiny.rabbitmq.RabbitMQClient;
import io.vertx.mutiny.rabbitmq.RabbitMQConsumer;
import io.vertx.rabbitmq.QueueOptions;
import io.vertx.rabbitmq.RabbitMQOptions;
public class RQVerticle extends AbstractVerticle {
private final Logger LOGGER = LoggerFactory.getLogger(org.acme.RQVerticle.class);
//This doesn't work - returns null
@Inject
CoffeeRepositoryService coffeeRepositoryService;
RQVerticle() {} // dummy constructor needed
@Inject // constructor injection - this does work
RQVerticle(CoffeeRepositoryService coffeeRepositoryService) {
//Here coffeeRepositoryService is injected properly
}
@Override
public Uni<Void> asyncStart() {
LOGGER.info(
"Creating RabbitMQ Connection after Quarkus successful initialization");
RabbitMQOptions config = new RabbitMQOptions();
config.setUri("amqp://localhost:5672");
RabbitMQClient client = RabbitMQClient.create(vertx, config);
Uni<Void> clientResp = client.start();
clientResp.subscribe()
.with(asyncResult -> {
LOGGER.info("RabbitMQ successfully connected!");
});
return clientResp;
}
}
Main Class - injection doesn't work like this
package org.acme;
import io.quarkus.runtime.Quarkus;
import io.quarkus.runtime.QuarkusApplication;
import io.quarkus.runtime.annotations.QuarkusMain;
import io.vertx.core.DeploymentOptions;
import io.vertx.mutiny.core.Vertx;
@QuarkusMain
public class Main {
public static void main(String... args) {
Quarkus.run(MyApp.class, args);
}
public static class MyApp implements QuarkusApplication {
@Override
public int run(String... args) throws Exception {
var vertx = Vertx.vertx();
System.out.println("Deployment Starting");
DeploymentOptions options = new DeploymentOptions()
.setInstances(2);
vertx.deployVerticleAndAwait(RQVerticle::new, options);
System.out.println("Deployment completed");
Quarkus.waitForExit();
return 0;
}
}
}
Main Class with working injection but cannot deploy more than one instance
package org.acme;
import io.quarkus.runtime.StartupEvent;
import io.vertx.mutiny.core.Vertx;
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.event.Observes;
import org.jboss.logging.Logger;
@ApplicationScoped
public class MainVerticles {
private static final Logger LOGGER = Logger.getLogger(MainVerticles.class);
public void init(@Observes StartupEvent e, Vertx vertx, RQVerticle verticle) {
DeploymentOptions options = new DeploymentOptions()
.setInstances(2);
vertx.deployVerticle(verticle,options).await().indefinitely();
}
}
Std Out - first main class looks good
2021-09-15 15:48:12,052 INFO [org.acm.RQVerticle] (vert.x-eventloop-thread-2) Creating RabbitMQ Connection after Quarkus successful initialization
2021-09-15 15:48:12,053 INFO [org.acm.RQVerticle] (vert.x-eventloop-thread-3) Creating RabbitMQ Connection after Quarkus successful initialization
Std Out - second main class
2021-09-22 15:48:11,986 ERROR [io.qua.run.Application] (Quarkus Main Thread) Failed to start application (with profile dev): java.lang.IllegalArgumentException: Can't specify > 1 instances for already created verticle
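A possible way to combine the two approaches is to keep the Supplier-style deployment from the first main class, but have the supplier ask CDI for a new verticle instance each time, so constructor injection still happens. This is only a sketch: it assumes RQVerticle carries a bean-defining annotation such as @Dependent (so each CDI lookup produces a fresh, injected instance), and the class name below is a placeholder.
package org.acme;
import javax.enterprise.inject.spi.CDI;
import io.quarkus.runtime.Quarkus;
import io.quarkus.runtime.QuarkusApplication;
import io.quarkus.runtime.annotations.QuarkusMain;
import io.vertx.core.DeploymentOptions;
import io.vertx.mutiny.core.Vertx;
@QuarkusMain
public class SupplierMain {
    public static void main(String... args) {
        Quarkus.run(MyApp.class, args);
    }
    public static class MyApp implements QuarkusApplication {
        @Override
        public int run(String... args) throws Exception {
            Vertx vertx = Vertx.vertx();
            DeploymentOptions options = new DeploymentOptions().setInstances(2);
            // The supplier is invoked once per instance, so every deployment gets a new,
            // CDI-created (and therefore injected) verticle. This avoids
            // "Can't specify > 1 instances for already created verticle".
            vertx.deployVerticleAndAwait(
                    () -> CDI.current().select(RQVerticle.class).get(), options);
            Quarkus.waitForExit();
            return 0;
        }
    }
}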
I am using Camel in Karaf using SCR to process messages from ActiveMQ
Versions:
Camel: 2.16.0
Karaf: 4.0.7
ActiveMQ 5.14.1
When I deploy the following Camel route to Karaf all works fine:
package com.test;
import org.apache.camel.builder.RouteBuilder;
public class TestRoute extends RouteBuilder {
@Override
public void configure() throws Exception {
from("activemq:queue:TEST.IN")
.routeId("test-route")
.log("Message picked up from IN queue");
}
}
Here is my SCR Runner class:
package com.test;
import java.util.ArrayList;
import java.util.List;
import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.activemq.camel.component.ActiveMQComponent;
import org.apache.activemq.pool.PooledConnectionFactory;
import org.apache.camel.RoutesBuilder;
import org.apache.camel.component.jms.JmsConfiguration;
import org.apache.camel.scr.AbstractCamelRunner;
import org.apache.camel.spi.ComponentResolver;
import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Properties;
import org.apache.felix.scr.annotations.Property;
import org.apache.felix.scr.annotations.Reference;
import org.apache.felix.scr.annotations.ReferenceCardinality;
import org.apache.felix.scr.annotations.ReferencePolicy;
import org.apache.felix.scr.annotations.ReferencePolicyOption;
import org.apache.felix.scr.annotations.References;
import org.osgi.framework.BundleContext;
@Component(label = TestRunner.COMPONENT_LABEL, description = TestRunner.COMPONENT_DESCRIPTION, immediate = true, metatype = true)
@Properties({
@Property(name = "camelContextId", value = "test-context"),
@Property(name = "active", value = "true"),
})
@References({
@Reference(name = "camelComponent",referenceInterface = ComponentResolver.class,
cardinality = ReferenceCardinality.MANDATORY_MULTIPLE, policy = ReferencePolicy.DYNAMIC,
policyOption = ReferencePolicyOption.GREEDY, bind = "gotCamelComponent", unbind = "lostCamelComponent")
})
public class TestRunner extends AbstractCamelRunner {
public static final String COMPONENT_LABEL = "TestRunner";
public static final String COMPONENT_DESCRIPTION = "This is the description for the test runner";
@Override
protected List<RoutesBuilder> getRouteBuilders() {
List<RoutesBuilder> routesBuilders = new ArrayList<RoutesBuilder>();
routesBuilders.add(new TestRoute());
return routesBuilders;
}
@Override
protected void setupCamelContext(BundleContext bundleContext, String camelContextId)throws Exception{
super.setupCamelContext(bundleContext, camelContextId);
// Add Active MQ connection factory
ActiveMQConnectionFactory amqConnectionFactory = new ActiveMQConnectionFactory("tcp://c3m-activemq:61616");
amqConnectionFactory.setUserName("admin");
amqConnectionFactory.setPassword("admin");
// Create Pooled Connection Factory
PooledConnectionFactory amqPooledConnectionFactory = new PooledConnectionFactory(amqConnectionFactory);
amqPooledConnectionFactory.setMaxConnections(5);
amqPooledConnectionFactory.setMaximumActiveSessionPerConnection(5);
// Create JMS Configuration
JmsConfiguration consumerJmsConfig = new JmsConfiguration(amqPooledConnectionFactory);
consumerJmsConfig.setConcurrentConsumers(5);
// Create the ActiveMQ Component
ActiveMQComponent activemq = ActiveMQComponent.activeMQComponent();
activemq.setConfiguration(consumerJmsConfig);
// Add activeMQ component to the Camel Context
getContext().addComponent("activemq", activemq);
// Use MDC logging
getContext().setUseMDCLogging(true);
// Use breadcrumb logging
getContext().setUseBreadcrumb(true);
}
}
However, if I add an errorHandler to my routeBuilder then things fail.
Here's the same route with the errorHandler added:
public void configure() throws Exception {
errorHandler(deadLetterChannel("activemq:queue:TEST.DLQ").useOriginalMessage());
from("activemq:queue:TEST.IN")
.routeId("test-route")
.log("Message picked up from IN queue");
}
What happens:
- When installing the bundle on Karaf the following error is given:
2016-12-20 09:49:58,248 | ERROR | nsole user karaf | router | 124 - com.test.router - 1.1.0.SNAPSHOT | [com.test.TestRunner(7)] The activate method has thrown an exception
java.lang.IllegalArgumentException: Cannot add component as its already previously added: activemq
at org.apache.camel.impl.DefaultCamelContext.addComponent(DefaultCamelContext.java:369)
at com.test.TestRunner.setupCamelContext(TestRunner.java:75)[124:com.test.router:1.1.0.SNAPSHOT]
at org.apache.camel.scr.AbstractCamelRunner.prepare(AbstractCamelRunner.java:90)[72:org.apache.camel.camel-scr:2.16.0]
at org.apache.camel.scr.AbstractCamelRunner.activate(AbstractCamelRunner.java:79)[72:org.apache.camel.camel-scr:2.16.0]
...
And then the Camel route is NOT deployed in Karaf.
I'll proceed with some more troubleshooting, but perhaps someone understands more fully what's going wrong here.
In your own TestRunner class, only add the component if it's not already registered. You can use:
if (getContext().hasComponent("activemq") == null) {
    ActiveMQComponent activemq = ActiveMQComponent.activeMQComponent();
    activemq.setConfiguration(consumerJmsConfig);
    getContext().addComponent("activemq", activemq);
}
In the end I solved the problem with the following hack: If the component already exists, I first remove it and then add it back.
Here's the code:
// If activemq component already exists, remove it
// Note: This is a bit of a hack, but if we keep the one that is there
// Camel throws a security exception.
if (getContext().hasComponent("activemq") != null) {
getContext().removeComponent("activemq");
}
// Create the ActiveMQ Component
ActiveMQComponent activemq = ActiveMQComponent.activeMQComponent();
activemq.setConfiguration(consumerJmsConfig);
getContext().addComponent("activemq", activemq);
Not pretty, but if I don't remove it and just deploy the route, Camel throws a security exception, almost as if the existing component had "lost" the credentials for the broker.
Thanks for the help Claus!
I have been trying to work around the problem of adding consumers to a Jetty endpoint (it does not allow multiple consumers). The way we do it in our company is to build our own router and a broadcasting endpoint which consumes from Jetty and routes requests to underlying "subscriptions"; only one of them will eventually process the request. It kind of works, but it's not completely OK: recently, when updating to the latest Camel, we found that our custom-built component leaks memory, and in general I prefer using built-in functionality over custom hacks.
I started investigating the Camel REST API and found it very nice, pretty much replacing our home-grown component apart from one thing - you cannot reconfigure it at runtime; you basically have to stop the context for that to work. Below I include my unit test with a happy path and the path that fails. Frankly, I think this is a bug, but if there is a legitimate way to achieve what I want, I'd like to hear sound advice:
package com.anydoby.camel;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.commons.io.IOUtils;
import org.junit.Before;
import org.junit.Test;
/**
* Test tries to add/remove routes at runtime.
*/
public class RoutesTest {
private DefaultCamelContext ctx;
@Before
public void pre() throws Exception {
ctx = new DefaultCamelContext();
new RouteBuilder(ctx) {
@Override
public void configure() throws Exception {
restConfiguration("jetty").host("localhost").port(8080);
rest("/")
.get("/issues/{isin}").route().id("issues")
.process(e -> e.getOut().setBody("Here's your issue " + e.getIn().getHeader("isin"))).endRest()
.get("/listings").route().id("listings").process(e -> e.getOut().setBody("some listings"));
}
}.addRoutesToCamelContext(ctx);
ctx.start();
}
@Test
public void test() throws IOException {
{
InputStream stream = new URL("http://localhost:8080/issues/35").openStream();
assertEquals("Here's your issue 35", IOUtils.toString(stream));
}
{
InputStream stream = new URL("http://localhost:8080/listings").openStream();
assertEquals("some listings", IOUtils.toString(stream));
}
}
@Test
public void disableRoute() throws Exception {
ctx.stopRoute("issues");
ctx.removeRoute("issues");
try (InputStream stream = new URL("http://localhost:8080/issues/35").openStream()) {
fail();
} catch (Exception e) {
}
new RouteBuilder(ctx) {
@Override
public void configure() throws Exception {
rest().get("/issues/{isin}/{sedol}").route().id("issues")
.process(e -> e.getOut()
.setBody("Here's your issue " + e.getIn().getHeader("isin") + ":" + e.getIn().getHeader("sedol")))
.endRest();
}
}.addRoutesToCamelContext(ctx);
{
InputStream stream = new URL("http://localhost:8080/issues/35/65").openStream();
assertEquals("Here's your issue 35:65", IOUtils.toString(stream));
}
}
}
The disableRoute() test fails since I cannot add another consumer to an existing endpoint.
So my question is: is there a way to add a new URL mapping to a RESTful camel-jetty endpoint? If you do it during the first configuration it works fine, but when you later want to reconfigure one of the routes, the error is:
org.apache.camel.FailedToStartRouteException: Failed to start route because of Multiple consumers for the same endpoint is not allowed: jetty:http://localhost:8080/issues/%7Bisin%7D/%7Bsedol%7D?httpMethodRestrict=GET
I want to run unit tests on a database other than the default one. Here is my application.conf:
application.secret="[cut]"
application.langs="en"
db.default.driver=com.mysql.jdbc.Driver
db.default.url="jdbc:mysql://localhost:3306/city_game?characterEncoding=UTF-8"
db.default.user=root
db.default.password=""
db.test.driver=com.mysql.jdbc.Driver
db.test.url="jdbc:mysql://localhost:3306/play_test?characterEncoding=UTF-8"
db.test.user=root
db.test.password=""
ebean.default="models.*"
ebean.test="models.*"
logger.root=ERROR
logger.play=INFO
logger.application=DEBUG
BaseModelTest.java:
package models;
import com.avaje.ebean.Ebean;
import com.avaje.ebean.EbeanServer;
import com.avaje.ebean.config.ServerConfig;
import com.avaje.ebeaninternal.server.ddl.DdlGenerator;
import com.avaje.ebean.config.dbplatform.MySqlPlatform;
import com.avaje.ebeaninternal.api.SpiEbeanServer;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import play.test.FakeApplication;
import play.test.Helpers;
import java.io.IOException;
public class BaseModelTest
{
public static FakeApplication app;
public static DdlGenerator ddl;
@BeforeClass
public static void startApp() throws IOException
{
app = Helpers.fakeApplication();
Helpers.start(app);
String serverName = "test";
EbeanServer server = Ebean.getServer(serverName);
ServerConfig config = new ServerConfig();
ddl = new DdlGenerator();
ddl.setup((SpiEbeanServer) server, new MySqlPlatform(), config);
}
@AfterClass
public static void stopApp()
{
Helpers.stop(app);
}
@Before
public void dropCreateDb() throws IOException
{
// Drop
ddl.runScript(false, ddl.generateDropDdl());
// Create
ddl.runScript(false, ddl.generateCreateDdl());
}
}
However, I get results saved in the default database, and the test one has its tables created but empty. What I expect is to have the results written to the test db and default one untouched.
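If the goal is simply that the tests hit the test database through the default datasource, one option is to override the db.default settings when building the fake application. A minimal sketch of a replacement startApp(), assuming the Helpers.fakeApplication(Map) overload available in Play 2.x and reusing the credentials from the db.test block above:
@BeforeClass
public static void startApp() {
    // java.util.HashMap/Map imports assumed; keys mirror application.conf
    Map<String, String> settings = new HashMap<String, String>();
    settings.put("db.default.driver", "com.mysql.jdbc.Driver");
    settings.put("db.default.url", "jdbc:mysql://localhost:3306/play_test?characterEncoding=UTF-8");
    settings.put("db.default.user", "root");
    settings.put("db.default.password", "");
    app = Helpers.fakeApplication(settings);
    Helpers.start(app);
}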
I somehow ended up with a different approach.
I still created a separate real test database instance (because of stored procedures), but instead I used the Play 1-like approach.
I have separate configuration files beneath my main configuration (e.g. test configuration, prod-specific stuff, stage-specific stuff, etc.).
I load them via Global.scala as shown below (please note the example provided below works in the Play for Java developers version as well):
object Global extends GlobalSettings {
override def onLoadConfig(config: Configuration, path: File, cl: ClassLoader, mode: Mode.Mode): Configuration = {
val modeFile: String = s"application.${mode.toString.toLowerCase}.conf"
Logger.error(s"Loading {${path.toURI}conf/application.conf}")
Logger.error(s"Appending mode specific configuration {${path.toURI}conf/$modeFile}")
val modeConfig = config ++ Configuration(ConfigFactory.load(modeFile))
super.onLoadConfig(modeConfig, path, cl, mode)
}
}
And the application.test.conf config file is as follows:
# test database
db.default.logStatements=false
db.default.jndiName=DefaultDS
db.default.url="jdbc:postgresql://127.0.0.1:5432/db-test"
db.default.user=user
db.default.password="password!##$"
db.default.driver=org.postgresql.Driver
This way I get the following benefits:
I still write my tests the usual way
Play evolutions get tested on CI / Jenkins as well
I have to write my tests in a way that I can safely rerun them on an existing db instance with minimal assumptions about data and userbase. That way I'm 90% certain I will be able to run them against staging / prod environments with much less friction. (Controversial point)
I think you should separate your code as follows:
@BeforeClass
public static void startApp() throws IOException {
app = Helpers.fakeApplication();
Helpers.start(app);
}
@Before
public void dropCreateDb() throws IOException {
String serverName = "test";
EbeanServer server = Ebean.getServer(serverName);
ServerConfig config = new ServerConfig();
DdlGenerator ddl = new DdlGenerator((SpiEbeanServer) server, new MySqlPlatform(), config);
// Drop
ddl.runScript(false, ddl.generateDropDdl());
// Create
ddl.runScript(false, ddl.generateCreateDdl());
}
I want to read from the database and write the records to a file using Camel. Below is my code:
import javax.sql.DataSource;
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.camel.impl.SimpleRegistry;
import org.apache.commons.dbcp.BasicDataSource;
public class JDBCExampleSimpleRegistry {
public static void main(String[] args) throws Exception {
final String url = "jdbc:oracle:thin:@MYSERVER:1521:myDB";
DataSource dataSource = setupDataSource(url);
SimpleRegistry reg = new SimpleRegistry();
reg.put("myDataSource",dataSource);
CamelContext context = new DefaultCamelContext(reg);
context.addRoutes(new JDBCExampleSimpleRegistry().new MyRouteBuilder());
context.start();
Thread.sleep(5000);
context.stop();
}
class MyRouteBuilder extends RouteBuilder {
public void configure() {
String dst = "C:/Local Disk E/TestData/Destination/?fileName=output.txt";
from("direct:myTable")
.setBody(constant("select * from myTable"))
.to("jdbc:myDataSource")
.to("file://" + dst);
}
}
private static DataSource setupDataSource(String connectURI) {
BasicDataSource ds = new BasicDataSource();
ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
ds.setUsername("sa");
ds.setPassword("devon1");
ds.setUrl(connectURI);
return ds;
}
}
The above program runs fine and the CamelContext shuts down cleanly. However, the target file is not created. What am I doing wrong?
I am a newbie to Apache Camel, so I appreciate any help. I am using JDK7 with Apache Camel 2.12.1 and am not using Spring.
You can take a look at the SQL example: http://camel.apache.org/sql-example.html - then, to write to a file, just send the result to a file endpoint instead of calling the bean as the SQL example does.
If you want to use the JDBC component, note that it doesn't have a built-in consumer, so you need to trigger the route with a timer or a Quartz scheduler that runs it at a fixed interval.
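A rough sketch of that suggestion, reusing the myDataSource bean and destination folder from the question. The timer name and its options are placeholders, and the convertBodyTo step is just a naive way to turn the JDBC result list into writable text:
import org.apache.camel.builder.RouteBuilder;
public class TimerToFileRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("timer://pollTable?repeatCount=1")            // fire once; use period=... to poll repeatedly
            .setBody(constant("select * from myTable"))
            .to("jdbc:myDataSource")                       // body becomes a list of row maps
            .convertBodyTo(String.class)                   // naive conversion so the file producer can write it
            .to("file://C:/Local Disk E/TestData/Destination?fileName=output.txt");
    }
}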