Apache Camel 2.18.0 Consumer does not consume - java

We are using Apache Camel 2.17.*. In order to utilize the maxPollRecords parameter I am trying to upgrade to 2.18.0. After the upgrade, the consumer no longer seems to be recognized by the broker. Below is the sample consumer I tried to create. I can produce messages from the CLI to the topic, and a consumer created in the CLI consumes them, but the consumer created through Apache Camel does not.
Also, with the consumer-group describe CLI command, I can see that the consumer-id is blank if I run only the Apache Camel consumer instance. While I was running 2.17.5 the broker used to recognize the consumer and assign it to the partition. I can't find a working example.
Please help.
package com.test;
import org.apache.camel.CamelContext;
import org.apache.camel.Exchange;
import org.apache.camel.Message;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.properties.PropertiesComponent;
import org.apache.camel.impl.DefaultCamelContext;
public class CamelConsumer {
    public static void main(String[] argv) {
        CamelContext camelContext = new DefaultCamelContext();
        // Add route to consume messages from Kafka
        try {
            camelContext.addRoutes(new RouteBuilder() {
                public void configure() {
                    PropertiesComponent pc = getContext().getComponent("properties", PropertiesComponent.class);
                    pc.setLocation("classpath:application.properties");
                    System.out.println("About to start route: Kafka Server -> Log ");
                    from("kafka:{{consumer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
                            + "&maxPollRecords={{consumer.maxPollRecords}}"
                            + "&consumersCount={{consumer.consumersCount}}"
                            + "&groupId={{consumer.group}}").routeId("FromKafka")
                        .process(new Processor() {
                            @Override
                            public void process(Exchange exchange) throws Exception {
                                Message message = exchange.getIn();
                                Object data = message.getBody();
                                System.out.println(data);
                            }
                        });
                }
            });
            camelContext.start();
            Thread.sleep(5 * 60 * 1000);
            camelContext.stop();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I do not get any exception either, and I cannot find any documentation related to this. Please help. Here is my application.properties:
consumer.topic=test
kafka.host=localhost
kafka.port=9092
consumer.maxPollRecords=1
consumer.consumersCount=1
consumer.group=test

I could find working example code for 2.19.* in the following repositories:
https://github.com/Talend/apache-camel/branches (forked branch)
https://github.com/apache/camel (actual Camel repository)
Finally it worked with version 2.21.5, and I had to bump the Apache Kafka Maven version from 0.9.* to 1.0.0.
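For reference, the dependency change was roughly the following (a sketch; camel-kafka and kafka-clients are the standard artifact IDs, but verify the versions your build actually resolves):
<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-kafka</artifactId>
    <version>2.21.5</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>1.0.0</version>
</dependency>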

Related

How to investigate data on Apache Camel Route?

My project is working on getting data from one system to another. We are using Apache Camel routes to send the data between JBoss EAP v7 servers. My question is: is there a way to inspect the content of the messages as they come across different routes?
We have tried upping the logging, but our files/console just get flooded. We have also tried using Hawtio on the server to see the messages coming across the routes, but have had no success identifying where our message is getting "stuck".
Any help is appreciated!
You can use unit tests to test your routes locally and log the contents of the exchange at specific points using adviceWith and the weave methods.
With unit tests you can easily debug your routes in your favourite IDE, even if you normally run Camel in something like Karaf or Red Hat Fuse.
package com.example;
import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.RoutesBuilder;
import org.apache.camel.builder.AdviceWithRouteBuilder;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.mock.MockEndpoint;
import org.apache.camel.model.dataformat.JsonLibrary;
import org.apache.camel.test.junit4.CamelTestSupport;
import org.junit.Test;
public class ExampleRouteTests extends CamelTestSupport {
    @Test
    public void exampleTest() throws Exception {
        ContractDetails testDetails = new ContractDetails(1512, 1215);
        mockJDBCEndpoints();
        context.getRouteDefinition("exampleRoute")
            .adviceWith(context, new AdviceWithRouteBuilder() {
                @Override
                public void configure() throws Exception {
                    replaceFromWith("direct:start");
                    weaveByToUri("direct:getDetailsFromAPI")
                        .replace()
                        .to("log:testLogger?showAll=true")
                        .to("mock:api")
                        .setBody(constant(testDetails));
                    weaveByToUri("direct:saveToDatabase")
                        .replace()
                        .to("log:testLogger?showAll=true")
                        .to("mock:db");
                }
            });
        MockEndpoint apiMockEndpoint = getMockEndpoint("mock:api");
        apiMockEndpoint.expectedMessageCount(1);
        MockEndpoint dbMockEndpoint = getMockEndpoint("mock:db");
        dbMockEndpoint.expectedMessageCount(1);
        context.start();
        String body = "{\"name\":\"Bob\",\"age\":10}";
        template.sendBody("direct:start", body);
        apiMockEndpoint.assertIsSatisfied();
        dbMockEndpoint.assertIsSatisfied();
    }

    @Override
    protected RoutesBuilder createRouteBuilder() throws Exception {
        return new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                from("amqp:queue:example")
                    .routeId("exampleRoute")
                    .unmarshal().json(JsonLibrary.Jackson, Person.class)
                    .to("direct:getDetailsFromAPI")
                    .process(new SomeProcessor())
                    .to("direct:saveToDatabase");
                from("direct:saveToDatabase")
                    .routeId("saveToDatabaseRoute")
                    .to("velocity:sql/insertQueryTemplate.vt")
                    .to("jdbc:exampleDatabase");
                from("direct:getDetailsFromAPI")
                    .removeHeaders("*")
                    .toD("http4:someAPI?name=${body.getName()}")
                    .unmarshal().json(JsonLibrary.Jackson, ContractDetails.class);
            }
        };
    }

    void mockJDBCEndpoints() throws Exception {
        context.getRouteDefinition("saveToDatabaseRoute")
            .adviceWith(context, new AdviceWithRouteBuilder() {
                @Override
                public void configure() throws Exception {
                    weaveByToUri("jdbc:*")
                        .replace()
                        .to("mock:db");
                }
            });
    }

    @Override
    public boolean isUseAdviceWith() {
        return true;
    }
}
Now, for troubleshooting problems that do not occur with unit tests, you can configure generic or route-specific exception handling with onException and use a dead letter channel to process and store information about the failed exchange. Alternatively you can just use the stream or file component to save information about the exception and the failed exchange into a separate file, to avoid flooding the logs.
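As a minimal sketch of the onException idea (this goes inside a RouteBuilder's configure(); the file URIs are illustrative, not prescribed locations):
onException(Exception.class)
    .handled(true)
    // log a short summary instead of flooding the main log
    .log("Failed exchange: ${exception.message}")
    // store the failed message body in a separate file for later inspection
    .to("file:target/errors");
Or, with a dead letter channel applied to the whole route builder:
errorHandler(deadLetterChannel("file:target/deadletter").useOriginalMessage());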

zookeeper.server.ServerCnxn$EndOfStreamException: Unable to read additional data from client

I have installed IntelliJ Community 2018.1 on Windows 7 Professional, running JDK 1.8.0_172.
I am running simple Apache Storm related Java code from Edureka tutorials.
Very simple and basic: just emit a few integers from a spout, and the bolt doubles each number and re-emits it.
I am not running any local or remote ZK/Storm cluster. I am relying on the ZK/Storm instances that the code somehow generates itself. I do not have any Storm directory locally. All I have is IntelliJ and a few dependency lines in pom.xml.
This is what I have in my Windows hosts file.
localhost sandbox.hortonworks.com sandbox-hdp.hortonworks.com sandbox-hdf.hortonworks.com
127.0.0.1 sandbox.hortonworks.com sandbox-hdp.hortonworks.com sandbox-hdf.hortonworks.com
When I run the Java program, I consistently get this error:
org.apache.storm.shade.org.apache.zookeeper.server.ServerCnxn$EndOfStreamException: Unable to read additional data from client sessionid 0x164ebfb3e3e000f, likely client has closed socket
Here are the steps I followed to write my code:
(1) Create new Maven project in IntelliJ.
(2) Add a dependencies section to pom.xml so that the storm-core libraries are imported. Everything validates fine.
<dependencies>
    <dependency>
        <groupId>org.apache.storm</groupId>
        <artifactId>storm-core</artifactId>
        <version>1.1.1</version>
        <scope>compile</scope>
    </dependency>
</dependencies>
(4) Create a Java class for Spout in IntelliJ.
import org.apache.storm.spout.SpoutOutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Values;
import java.util.Map;
public class IntegerSpout extends BaseRichSpout {
    SpoutOutputCollector myspoutOutputCollector;
    private Integer index = 2;

    public void open(Map map, TopologyContext topologyContext, SpoutOutputCollector spoutOutputCollector) {
        this.myspoutOutputCollector = spoutOutputCollector;
    }

    public void nextTuple() {
        // Emit numbers from the spout until index reaches 100.
        if (index < 100) {
            System.out.println("Index is " + Integer.toString(index));
            this.myspoutOutputCollector.emit(new Values(index));
            index++;
        }
    }

    public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
        outputFieldsDeclarer.declare(new Fields("field"));
    }
}
(5) Feed these numbers from the Spout into a (Multiplier?) Bolt that doubles each number and emits it further. Simple and straightforward.
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;
public class MultiplierBolt extends BaseBasicBolt {
    public void execute(Tuple tuple, BasicOutputCollector basicOutputCollector) {
        Integer number = tuple.getInteger(0);
        number *= 2;
        basicOutputCollector.emit(new Values(number));
    }

    public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
        outputFieldsDeclarer.declare(new Fields("field"));
    }
}
(6) Now write a Main class with a main() that defines a topology and connects the spout to the bolt and then submits it for execution.
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.topology.TopologyBuilder;
public class MainTopology {
    public static void main(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("IntegerSpout", new IntegerSpout());
        builder.setBolt("MultiplierBolt", new MultiplierBolt()).shuffleGrouping("IntegerSpout");
        Config config = new Config();
        config.setDebug(true);
        LocalCluster localCluster = new LocalCluster();
        try {
            localCluster.submitTopology("HelloTopology", config, builder.createTopology());
            // The topology only runs for one second before the finally block shuts it down.
            Thread.sleep(1000);
        } catch (Exception e) {
            System.out.println("Exception Raised");
            e.printStackTrace();
        } finally {
            localCluster.shutdown();
        }
    }
}
That's it, folks. Now just compile and run. I get mostly exceptions and errors in my log:
[NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2000] WARN o.a.s.s.o.a.z.s.NIOServerCnxn - caught end of stream exception org.apache.storm.shade.org.apache.zookeeper.server.ServerCnxn$EndOfStreamException: Unable to read additional data from client sessionid 0x164ebf470410009, likely client has closed socket
Any pointers would be appreciated.
TIA.
Amit

Camel SCR component tries to add component to context twice if route builder includes an errorHandler

I am using Camel in Karaf, using SCR to process messages from ActiveMQ.
Versions:
Camel: 2.16.0
Karaf: 4.0.7
ActiveMQ: 5.14.1
When I deploy the following Camel route to Karaf, all works fine:
package com.test;
import org.apache.camel.builder.RouteBuilder;
public class TestRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("activemq:queue:TEST.IN")
            .routeId("test-route")
            .log("Message picked up from IN queue");
    }
}
Here is my SCR Runner class:
package com.test;
import java.util.ArrayList;
import java.util.List;
import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.activemq.camel.component.ActiveMQComponent;
import org.apache.activemq.pool.PooledConnectionFactory;
import org.apache.camel.RoutesBuilder;
import org.apache.camel.component.jms.JmsConfiguration;
import org.apache.camel.scr.AbstractCamelRunner;
import org.apache.camel.spi.ComponentResolver;
import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Properties;
import org.apache.felix.scr.annotations.Property;
import org.apache.felix.scr.annotations.Reference;
import org.apache.felix.scr.annotations.ReferenceCardinality;
import org.apache.felix.scr.annotations.ReferencePolicy;
import org.apache.felix.scr.annotations.ReferencePolicyOption;
import org.apache.felix.scr.annotations.References;
import org.osgi.framework.BundleContext;
@Component(label = TestRunner.COMPONENT_LABEL, description = TestRunner.COMPONENT_DESCRIPTION, immediate = true, metatype = true)
@Properties({
    @Property(name = "camelContextId", value = "test-context"),
    @Property(name = "active", value = "true"),
})
@References({
    @Reference(name = "camelComponent", referenceInterface = ComponentResolver.class,
        cardinality = ReferenceCardinality.MANDATORY_MULTIPLE, policy = ReferencePolicy.DYNAMIC,
        policyOption = ReferencePolicyOption.GREEDY, bind = "gotCamelComponent", unbind = "lostCamelComponent")
})
public class TestRunner extends AbstractCamelRunner {
    public static final String COMPONENT_LABEL = "TestRunner";
    public static final String COMPONENT_DESCRIPTION = "This is the description for the test runner";

    @Override
    protected List<RoutesBuilder> getRouteBuilders() {
        List<RoutesBuilder> routesBuilders = new ArrayList<RoutesBuilder>();
        routesBuilders.add(new TestRoute());
        return routesBuilders;
    }

    @Override
    protected void setupCamelContext(BundleContext bundleContext, String camelContextId) throws Exception {
        super.setupCamelContext(bundleContext, camelContextId);
        // Add ActiveMQ connection factory
        ActiveMQConnectionFactory amqConnectionFactory = new ActiveMQConnectionFactory("tcp://c3m-activemq:61616");
        amqConnectionFactory.setUserName("admin");
        amqConnectionFactory.setPassword("admin");
        // Create pooled connection factory
        PooledConnectionFactory amqPooledConnectionFactory = new PooledConnectionFactory(amqConnectionFactory);
        amqPooledConnectionFactory.setMaxConnections(5);
        amqPooledConnectionFactory.setMaximumActiveSessionPerConnection(5);
        // Create JMS configuration
        JmsConfiguration consumerJmsConfig = new JmsConfiguration(amqPooledConnectionFactory);
        consumerJmsConfig.setConcurrentConsumers(5);
        // Create the ActiveMQ component
        ActiveMQComponent activemq = ActiveMQComponent.activeMQComponent();
        activemq.setConfiguration(consumerJmsConfig);
        // Add the ActiveMQ component to the Camel context
        getContext().addComponent("activemq", activemq);
        // Use MDC logging
        getContext().setUseMDCLogging(true);
        // Use breadcrumb logging
        getContext().setUseBreadcrumb(true);
    }
}
However, if I add an errorHandler to my RouteBuilder, then things fail.
Here's the same route with the errorHandler added:
public void configure() throws Exception {
    errorHandler(deadLetterChannel("activemq:queue:TEST.DLQ").useOriginalMessage());
    from("activemq:queue:TEST.IN")
        .routeId("test-route")
        .log("Message picked up from IN queue");
}
What happens:
- When installing the bundle on Karaf the following error is given:
2016-12-20 09:49:58,248 | ERROR | nsole user karaf | router | 124 - com.test.router - 1.1.0.SNAPSHOT | [com.test.TestRunner(7)] The activate method has thrown an exception
java.lang.IllegalArgumentException: Cannot add component as its already previously added: activemq
at org.apache.camel.impl.DefaultCamelContext.addComponent(DefaultCamelContext.java:369)
at com.test.TestRunner.setupCamelContext(TestRunner.java:75)[124:com.test.router:1.1.0.SNAPSHOT]
at org.apache.camel.scr.AbstractCamelRunner.prepare(AbstractCamelRunner.java:90)[72:org.apache.camel.camel-scr:2.16.0]
at org.apache.camel.scr.AbstractCamelRunner.activate(AbstractCamelRunner.java:79)[72:org.apache.camel.camel-scr:2.16.0]
...
And then the Camel route is NOT deployed in Karaf.
I'll proceed with some more troubleshooting, but perhaps someone understands more fully what's going wrong here.
In your own TestRunner class, only add the component if it's not already registered; you can use:
if (getContext().hasComponent("activemq") == null) {
    // ... add the component here
}
In the end I solved the problem with the following hack: if the component already exists, I first remove it and then add it back.
Here's the code:
// If the activemq component already exists, remove it.
// Note: This is a bit of a hack, but if we keep the one that is there,
// Camel throws a security exception.
if (getContext().hasComponent("activemq") != null) {
    getContext().removeComponent("activemq");
}
// Create the ActiveMQ component
ActiveMQComponent activemq = ActiveMQComponent.activeMQComponent();
activemq.setConfiguration(consumerJmsConfig);
getContext().addComponent("activemq", activemq);
Not pretty, but if I don't remove it and deploy the route, Camel gives a security exception, almost as if the existing component "lost" the credentials of the broker.
Thanks for the help Claus!

Adding REST route to an existing Jetty endpoint in Camel at runtime

I have been inventing a way to work around the problem of adding consumers to a Jetty endpoint (it does not allow multiple consumers). The way we do it in our company is to build our own router and a broadcasting endpoint which consumes from Jetty and routes requests to underlying "subscriptions"; only one of them will eventually process the request. It kind of works, but it's not completely OK: recently, when updating to the latest Camel, we found that our custom-built component leaks memory, and in general I prefer built-in functionality over custom hacks.
I started investigating the Camel REST API and found it very nice, pretty much replacing our home-grown component apart from one thing: you cannot re-configure it at runtime - you basically have to stop the context for this to work. Below I include my unit test with a happy path and the path that fails. Frankly, I think this is a bug, but if there is a legitimate way to achieve what I want, I'd like to hear sound advice:
package com.anydoby.camel;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.commons.io.IOUtils;
import org.junit.Before;
import org.junit.Test;
/**
* Test tries to add/remove routes at runtime.
*/
public class RoutesTest {
    private DefaultCamelContext ctx;

    @Before
    public void pre() throws Exception {
        ctx = new DefaultCamelContext();
        new RouteBuilder(ctx) {
            @Override
            public void configure() throws Exception {
                restConfiguration("jetty").host("localhost").port(8080);
                rest("/")
                    .get("/issues/{isin}").route().id("issues")
                    .process(e -> e.getOut().setBody("Here's your issue " + e.getIn().getHeader("isin"))).endRest()
                    .get("/listings").route().id("listings").process(e -> e.getOut().setBody("some listings"));
            }
        }.addRoutesToCamelContext(ctx);
        ctx.start();
    }

    @Test
    public void test() throws IOException {
        {
            InputStream stream = new URL("http://localhost:8080/issues/35").openStream();
            assertEquals("Here's your issue 35", IOUtils.toString(stream));
        }
        {
            InputStream stream = new URL("http://localhost:8080/listings").openStream();
            assertEquals("some listings", IOUtils.toString(stream));
        }
    }

    @Test
    public void disableRoute() throws Exception {
        ctx.stopRoute("issues");
        ctx.removeRoute("issues");
        try (InputStream stream = new URL("http://localhost:8080/issues/35").openStream()) {
            fail();
        } catch (Exception e) {
        }
        new RouteBuilder(ctx) {
            @Override
            public void configure() throws Exception {
                rest().get("/issues/{isin}/{sedol}").route().id("issues")
                    .process(e -> e.getOut()
                        .setBody("Here's your issue " + e.getIn().getHeader("isin") + ":" + e.getIn().getHeader("sedol")))
                    .endRest();
            }
        }.addRoutesToCamelContext(ctx);
        {
            InputStream stream = new URL("http://localhost:8080/issues/35/65").openStream();
            assertEquals("Here's your issue 35:65", IOUtils.toString(stream));
        }
    }
}
The disableRoute() test fails since I cannot add another consumer to an existing endpoint.
So my question is: is there a way to add a new URL mapping to a RESTful camel-jetty endpoint? If you do it during the first configuration it works fine, but when you later want to reconfigure one of the routes the error is:
org.apache.camel.FailedToStartRouteException: Failed to start route because of Multiple consumers for the same endpoint is not allowed: jetty:http://localhost:8080/issues/%7Bisin%7D/%7Bsedol%7D?httpMethodRestrict=GET

how to create datasource using camel?

I have just started learning Apache Camel. I understand the basics of routes and components. Now I want to give it a try by connecting to an Oracle database, reading records from one particular table and writing those records to a file using the File component. To read from the database I assume I need to use the JDBC component and give the dataSourceName.
However, I couldn't find any info on how to create a dataSource using Camel. All the info I found related to this topic uses Spring DSL examples. I don't use Spring, and I just need to test this using a simple standalone Java application.
I am using JDK 7u25 with Apache Camel 2.12.1.
Can someone please post a sample to read from the Oracle table and write to a file?
[EDIT]
After checking several solutions on the web, I came to know about the following two approaches.
Running Camel as a standalone application. Here is my code:
import javax.sql.DataSource;
import org.apache.camel.main.Main;
import org.apache.camel.builder.RouteBuilder;
import org.apache.commons.dbcp.BasicDataSource;
public class JDBCExample {
    private Main main;

    public static void main(String[] args) throws Exception {
        JDBCExample example = new JDBCExample();
        example.boot();
    }

    public void boot() throws Exception {
        // create a Main instance
        main = new Main();
        // enable hangup support so you can press ctrl + c to terminate the JVM
        main.enableHangupSupport();
        String url = "jdbc:oracle:thin:@MYSERVER:1521:myDB";
        DataSource dataSource = setupDataSource(url);
        // bind dataSource into the registry
        main.bind("myDataSource", dataSource);
        // add routes
        main.addRouteBuilder(new MyRouteBuilder());
        // run until you terminate the JVM
        System.out.println("Starting Camel. Use ctrl + c to terminate the JVM.\n");
        main.run();
    }

    class MyRouteBuilder extends RouteBuilder {
        public void configure() {
            String dst = "C:/Local Disk E/TestData/Destination";
            from("direct:myTable")
                .setBody(constant("select * from myTable"))
                .to("jdbc:myDataSource")
                .to("file:" + dst);
        }
    }

    private DataSource setupDataSource(String connectURI) {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
        ds.setUsername("sa");
        ds.setPassword("devon1");
        ds.setUrl(connectURI);
        return ds;
    }
}
Using the approach mentioned by Claus Ibsen. Here is the code again:
import javax.sql.DataSource;
import org.apache.camel.CamelContext;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.camel.impl.SimpleRegistry;
import org.apache.camel.main.Main;
import org.apache.camel.builder.RouteBuilder;
import org.apache.commons.dbcp.BasicDataSource;
public class JDBCExample {
    private Main main;

    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@MYSERVER:1521:myDB";
        DataSource dataSource = setupDataSource(url);
        SimpleRegistry reg = new SimpleRegistry();
        reg.put("myDataSource", dataSource);
        CamelContext context = new DefaultCamelContext(reg);
        context.addRoutes(new JDBCExample().new MyRouteBuilder());
        context.start();
        Thread.sleep(5000);
        context.stop();
    }

    class MyRouteBuilder extends RouteBuilder {
        public void configure() {
            String dst = "C:/Local Disk E/TestData/Destination";
            from("direct:myTable")
                .setBody(constant("select * from myTable"))
                .to("jdbc:myDataSource")
                .to("file:" + dst);
        }
    }

    private static DataSource setupDataSource(String connectURI) {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
        ds.setUsername("sa");
        ds.setPassword("devon1");
        ds.setUrl(connectURI);
        return ds;
    }
}
But in both cases I get the exception below:
Caused by: org.apache.camel.ResolveEndpointFailedException: Failed to resolve endpoint: jdbc://myDataSource due to: No component found with scheme: jdbc
at org.apache.camel.impl.DefaultCamelContext.getEndpoint(DefaultCamelContext.java:534)
at org.apache.camel.util.CamelContextHelper.getMandatoryEndpoint(CamelContextHelper.java:63)
at org.apache.camel.model.RouteDefinition.resolveEndpoint(RouteDefinition.java:192)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:106)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:112)
at org.apache.camel.model.SendDefinition.resolveEndpoint(SendDefinition.java:61)
at org.apache.camel.model.SendDefinition.createProcessor(SendDefinition.java:55)
at org.apache.camel.model.ProcessorDefinition.makeProcessor(ProcessorDefinition.java:500)
at org.apache.camel.model.ProcessorDefinition.addRoutes(ProcessorDefinition.java:213)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:909)
... 12 more
[Thread-0] INFO org.apache.camel.main.MainSupport$HangupInterceptor - Received hang up - stopping the main instance.
There is the SQL example which shows how to set up a DataSource:
http://camel.apache.org/sql-example.html
Yes, that example uses Spring XML, but setting up the DataSource can be done in Java code as well. You then need to register the DataSource in the Camel registry.
For example, you can use a JndiRegistry or the SimpleRegistry. The latter is easier.
Here is some pseudo code showing the principle: create a registry, add your beans to it, and then provide the registry to the constructor of DefaultCamelContext.
SimpleRegistry registry = new SimpleRegistry();
// code to create the data source here
DataSource ds = ...
registry.put("myDataSource", ds);
CamelContext camel = new DefaultCamelContext(registry);
So silly me! I had not included camel-jdbc-2.12.1.jar in the CLASSPATH. Now the above examples work.
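If you build with Maven, that jar corresponds to a dependency along these lines (a sketch; camel-jdbc is the artifact that provides the jdbc: component):
<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-jdbc</artifactId>
    <version>2.12.1</version>
</dependency>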
Spring was mentioned there just because it is a very useful paradigm for working with databases (mainly because of the templates introduced by the Spring Framework). Of course you can hook up a standard JDBC connection and implement the DAO yourself - nothing wrong with that.
