I want to be able to continuously poll a database to select data from my table using Camel. I have configured Camel in my Spring Boot application. Here are the configurations that I am using:
build.gradle:
implementation 'org.apache.camel:camel-jdbc-starter:2.24.0'
implementation 'org.apache.camel:camel-sql-starter:2.24.0'
RouteBuilder class:
@Component
public class CustomCamelConfig extends RouteBuilder {

    Logger log = LoggerFactory.getLogger(getClass());

    @Autowired
    RouteDataMapper dataMapper;

    @Override
    public void configure() throws Exception {
        from("timer://timer1?period=2s").log("Called every 2 seconds")
            .setBody(constant("select * from tenders"))
            .bean(dataMapper, "generateSalesData")
            .noDelayer();
    }
}
Bean:
@Component
public class RouteDataMapper {

    Logger log = LoggerFactory.getLogger(getClass());

    public void generateSalesData(String payload) {
        log.info("RouteDataMapper - [generateSalesData]");
        log.info("payload : {}", payload);
    }
}
application.properties
spring.datasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.url=jdbc:oracle:thin:@xxx:xxx/zzz
spring.datasource.username=zzz
spring.datasource.password=zzz
The issue I am facing is that when I print the bean method parameter (generateSalesData(String payload)), I get the query string itself ("select * from tenders") and not the values from the table. Also, setBody() in the configure method doesn't accept a sql:select .. statement; it shows "The method setBody(Expression) in the type ProcessorDefinition is not applicable for the arguments (String)".
I am new to Camel. Could anyone please let me know what I am missing?
The route you have, as written, simply sets the body of the message to a string which happens to look like SQL. Camel has no idea it is a query, since you haven't used the right component.
Instead of
.setBody(constant("select * from tenders"))
you need to tell Camel to use the sql component:
.to("sql:select * from tenders")
The result that's passed on to RouteDataMapper will be a List<Map<String, Object>>, as described in the documentation. You'd need to adjust your method parameter accordingly.
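For illustration, a minimal sketch of the adjusted bean (the List<Map<String, Object>> parameter matches what camel-sql returns for a SELECT; the rest follows the code in the question):

@Component
public class RouteDataMapper {

    Logger log = LoggerFactory.getLogger(getClass());

    public void generateSalesData(List<Map<String, Object>> rows) {
        // Each map is one row, keyed by column name.
        for (Map<String, Object> row : rows) {
            log.info("row : {}", row);
        }
    }
}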
I have a streaming processor that processes messages from a Kafka input topic to an output topic. Furthermore, I have multiple tenants for whom this processing shall take place. Let's call them tenant A and tenant B, but there can be more than a dozen tenants that the application should process. The input and output topics follow the naming convention A-input, B-input, ... and A-output, B-output, ...
The function definition is like:
@Configuration
public class StreamProcessorConfig {

    @Bean
    public Function<KStream<String, InputType>, KStream<String, OutputType>> myfunctiondefinition() {
        return inputTypeStream -> inputTypeStream.map((String k, InputType v) -> {
            return KeyValue.pair(k, OutputType.createFrom(v));
        });
    }
}
My application.yaml now configures the streaming application for tenant A:
tenant: A
spring.cloud.function.definition: myfunctiondefinition
spring.cloud.stream.kafka.streams.binder.functions.myfunctiondefinition:
  applicationId: ${spring.application.name}-myfunctiondefinition
spring.cloud.stream.bindings.myfunctiondefinition-in-0:
  destination: ${tenant}-input
spring.cloud.stream.bindings.myfunctiondefinition-out-0:
  destination: ${tenant}-output
How can I modify the configuration to add an instance for tenant B? Of course I could duplicate myfunctiondefinition() as well as all configuration keys, but I'm looking for a way to dynamically add tenants fast and clean solely through configuration. Is this possible?
Note: Running another instance of the application for tenant B and further tenants is sadly not an option.
We found a solution to this problem by manually registering the function beans. Sadly, this was not quite as easy as we thought it would be. FunctionDetectorCondition (https://github.com/spring-cloud/spring-cloud-stream-binder-kafka/blob/main/spring-cloud-stream-binder-kafka-streams/src/main/java/org/springframework/cloud/stream/binder/kafka/streams/function/FunctionDetectorCondition.java) requires an AnnotatedBeanDefinition that is used as a template for the actual stream processing bean. This could be taken as a proposal to Spring Cloud Stream for registering a function definition template that can be used multiple times.
To reach this goal we initialise a factory bean instead of the stream processor function itself:
@Configuration
public class StreamProcessorConfig {

    @Bean
    public MyFunctionDefinitionFactory myFunctionDefinitionFactory() {
        return new MyFunctionDefinitionFactory();
    }
}
The factory creates the stream processor function:
public class MyFunctionDefinitionFactory {

    public Function<KStream<String, InputType>, KStream<String, OutputType>> myfunctiondefinition() {
        return inputTypeStream -> inputTypeStream.map((String k, InputType v) -> {
            return KeyValue.pair(k, OutputType.createFrom(v));
        });
    }
}
Now we need a dummy bean that Spring Cloud Stream requires in order to apply its logic for creating the stream processor:
// Behaves as a dummy bean for Spring Cloud Stream.
// It has to have the same name as the original streaming function in the factory:
// in this case we named the method "myfunctiondefinition",
// so the dummy bean has to get the name "Myfunctiondefinition".
public class Myfunctiondefinition implements Function<KStream<String, InputType>,
        KStream<String, OutputType>> {

    // Note: this may need changes if Spring Cloud Stream changes its logic.
    // The method myfunctiondefinition() is needed, because Spring Cloud Stream
    // searches for a method with the same name as the class in
    // FunctionDetectorCondition#pruneFunctionBeansForKafkaStreams.
    public Function<KStream<String, InputType>, KStream<String, OutputType>> myfunctiondefinition() {
        return null;
    }

    // Needed for the interface implementation. Spring Cloud Stream needs
    // the Function class to identify a stream processor candidate.
    @Override
    public KStream<String, OutputType> apply(KStream<String, InputType> input) {
        return null;
    }
}
Now that we have everything in place, we can register a bean per tenant. We do this within an ApplicationContextInitializer that creates a bean definition with a factory method and iterates over the functions that we define in the configuration file application.yaml.
public class StreamProcessorInitializer
        implements ApplicationContextInitializer<GenericWebApplicationContext> {

    @Override
    public void initialize(GenericWebApplicationContext context) {
        String functionDefinitions = context.getEnvironment()
                .getProperty("spring.cloud.function.definition");
        String splitter = context.getEnvironment()
                .getProperty("spring.cloud.function.definition.splitter");
        String factoryName = CaseFormat.UPPER_CAMEL
                .to(CaseFormat.LOWER_CAMEL, MyFunctionDefinitionFactory.class.getSimpleName());
        String factoryMethodName =
                MyFunctionDefinitionFactory.class.getMethods()[0].getName();

        AnnotatedGenericBeanDefinition def =
                new AnnotatedGenericBeanDefinition(Myfunctiondefinition.class);
        def.setFactoryBeanName(factoryName);
        def.setFactoryMethodName(factoryMethodName);

        Arrays.stream(functionDefinitions.split(splitter))
                .forEach(function -> context.registerBeanDefinition(function, def));
    }
}
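Note that Spring only runs the initializer if it is registered, for example via a META-INF/spring.factories entry (the package com.example is illustrative):

org.springframework.context.ApplicationContextInitializer=\
  com.example.StreamProcessorInitializer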
Finally, we can dynamically define functions within application.yaml. This can be done by Helm or Kustomize to configure the specific tenant environment:
#--------------------------------------------------------------------------------------------------------------------------------------
# streaming processor functions (going to be filled by helm)
#--------------------------------------------------------------------------------------------------------------------------------------
spring.cloud.function.definition: <name1>,<name2>,...
#--Note-- required as spring cloud streams has changed the splitter in the past
spring.cloud.function.definition.splitter: ;
# Properties per function (<name>)
spring.cloud.stream.kafka.streams.binder.functions.<name>.applicationId: ${tenant}-${spring.application.name}-<name>
# configuring dlq (if you have one)
spring.cloud.stream.kafka.streams.bindings.<name>-in-0.consumer.deserializationExceptionHandler: sendToDlq
spring.cloud.stream.kafka.streams.bindings.<name>-in-0.consumer.dlqName: ${tenant}-<name>-dlq
# configuring in- and output topics
spring.cloud.stream.bindings.<name>-in-0.destination: ${tenant}-<inputname>
spring.cloud.stream.bindings.<name>-out-0.destination: ${tenant}-<outputname>
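Rendered for two tenants A and B, the resulting configuration could look like this (function and application names are purely illustrative):

spring.cloud.function.definition: processorA;processorB
spring.cloud.function.definition.splitter: ;
spring.cloud.stream.kafka.streams.binder.functions.processorA.applicationId: A-myapp-processorA
spring.cloud.stream.kafka.streams.binder.functions.processorB.applicationId: B-myapp-processorB
spring.cloud.stream.bindings.processorA-in-0.destination: A-input
spring.cloud.stream.bindings.processorA-out-0.destination: A-output
spring.cloud.stream.bindings.processorB-in-0.destination: B-input
spring.cloud.stream.bindings.processorB-out-0.destination: B-output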
I'm experiencing a problem with the Feign client (spring-boot-starter 2.4.4, spring-cloud-starter-openfeign 3.0.2). When I try to send an empty list inside a @ModelAttribute-annotated object, the Feign client throws a feign.codec.EncodeException with a NullPointerException cause. The problem does not occur when the list has at least one element.
Does anybody know how to properly override the Feign encoder so that an empty list can be passed without errors?
You will want to create a class that implements the feign Encoder (https://github.com/OpenFeign/feign/blob/master/core/src/main/java/feign/codec/Encoder.java)
e.g.
public class EnableEmptyListEncoder implements Encoder {

    @Override
    public void encode(Object object, Type bodyType, RequestTemplate template) {
        // empty list encode logic here
    }
}
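One possible shape for that class is a delegating sketch (the empty-collection guard is illustrative, and wrapping a SpringEncoder is an assumption about your setup):

import java.lang.reflect.Type;
import java.util.Collection;
import feign.RequestTemplate;
import feign.codec.EncodeException;
import feign.codec.Encoder;

public class EnableEmptyListEncoder implements Encoder {

    // The encoder Feign would otherwise use, e.g. a SpringEncoder.
    private final Encoder delegate;

    public EnableEmptyListEncoder(Encoder delegate) {
        this.delegate = delegate;
    }

    @Override
    public void encode(Object object, Type bodyType, RequestTemplate template) throws EncodeException {
        // Illustrative guard: skip the delegate entirely for empty collections
        // instead of letting it fail with a NullPointerException.
        if (object instanceof Collection && ((Collection<?>) object).isEmpty()) {
            return;
        }
        delegate.encode(object, bodyType, template);
    }
}

If you go with a delegate like this, the builder and bean registrations below need to pass the wrapped encoder in, instead of using a no-argument constructor.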
How you point to that encoder depends on your setup:
-- via the application.yml:
feign:
  client:
    config:
      feignName:
        encoder: com.example.EnableEmptyListEncoder
-- via a builder:
Feign.builder()
.encoder( new EnableEmptyListEncoder() )
-- via a bean in the config class:
@Bean
public EnableEmptyListEncoder encoder() {
    return new EnableEmptyListEncoder();
}
I am using Pub/Sub integration with Spring Boot, for which my configuration class looks like this:
@Configuration
public class PubSubConfiguration {

    @Value("${spring.pubsub.topic.name}")
    private String topicName;

    @Bean
    @ServiceActivator(inputChannel = "MyOutputChannel")
    public PubSubMessageHandler messageSender(PubSubTemplate pubsubTemplate) {
        return new PubSubMessageHandler(pubsubTemplate, topicName);
    }

    @MessagingGateway(defaultRequestChannel = "MyOutputChannel")
    public interface PubsubOutboundGateway {
        void sendToPubsub(String attribute);
    }
}
So far I have been calling only the sendToPubsub method, which publishes the payload to the topic, like this:
@Autowired
private PubSubConfiguration.PubsubOutboundGateway outboundGateway;

// used wherever needed in my code
outboundGateway.sendToPubsub(jsonInString);
The above code is meant for just one topic, which I load from the application property file. But now I want the topic name to be passed dynamically to messageSender. How can I do that?
To override the default topic you can use the GcpPubSubHeaders.TOPIC header.
final Message<?> message = MessageBuilder
.withPayload(msg.getPayload())
.setHeader(GcpPubSubHeaders.TOPIC, "newTopic").build();
and modify your sendToPubsub(Message<byte[]> message) to use message as input.
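For illustration, a sketch of the adjusted gateway and call site (names follow the question; topicName stands for whatever value you resolve at runtime):

@MessagingGateway(defaultRequestChannel = "MyOutputChannel")
public interface PubsubOutboundGateway {
    void sendToPubsub(Message<byte[]> message);
}

// at the call site
Message<byte[]> message = MessageBuilder
        .withPayload(jsonInString.getBytes(StandardCharsets.UTF_8))
        .setHeader(GcpPubSubHeaders.TOPIC, topicName)
        .build();
outboundGateway.sendToPubsub(message);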
Refer to the Spring Cloud GCP Pub/Sub documentation for more information.
Consider creating a BeanFactory to generate a PubSubMessageHandler Bean given a topic name. PubSubMessageHandler also has a setTopic() method, which may be of use.
I currently have a REST route builder that looks as follows:
rest("/v1")
.post("/create")
.to("bean:myAssembler?method=assemble(${in.header.content})")
.to("bean:myService?method=create(?)");
The bean myAssembler takes raw JSON and transforms it into MyObject. This object is then returned, and I want it forwarded on to myService as a parameter for its create method.
How can I do this using Camel?
Camel binds specific parameter types like Exchange to your bean methods automatically if you declare them as method parameters (see the complete list under Parameter binding in the documentation).
One solution would be to define your route and beans like this:
restConfiguration()
.component("restlet")
.bindingMode(RestBindingMode.json)
.skipBindingOnErrorCode(false)
.port(port);
rest("/v1")
.post("/create")
.route()
.to("bean:myAssembler?method=assemble")
.to("bean:myService?method=create");
with beans like this
public class MyAssembler {
public void assemble(Exchange exchange) {
String content = exchange.getIn().getHeader("content", String.class);
// Create MyObject here.
MyObject object; // ...transformation here.
exchange.getOut().setBody(object);
}
}
and this
public class MyService {
public void create(MyObject body) {
// Do what ever you want with the content.
// Here it's just log.
LOG.info("MyObject is: " + body.toString());
}
}
The dependencies for shown configuration are
org.apache.camel/camel-core/2.15.3
org.apache.camel/camel-spring/2.15.3
org.apache.camel/camel-restlet/2.15.3
javax.servlet/javax.servlet-api/3.1.0
org.apache.camel/camel-jackson/2.15.3
org.apache.camel/camel-xmljson/2.15.3
xom/xom/1.2.5
Actually, if the last bean returns MyObject, the next bean can accept and bind MyObject as its first argument. You don't need to put it into the Exchange body manually.
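In other words, the assembler can simply return the object; a sketch along those lines (assuming MyObject has a no-argument constructor):

public class MyAssembler {
    // Whatever this method returns becomes the new message body,
    // so Camel can bind it directly to MyService#create(MyObject).
    public MyObject assemble(String content) {
        MyObject object = new MyObject();
        // ...transformation from the raw JSON in 'content' here...
        return object;
    }
}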
I found this approach on Camel's website, which shows how to use the @Produce annotation to create a pseudo-method call for sending messages to a JMS queue:
public interface MyListener {
    String sayHello(String name);
}

public class MyBean {

    @Produce(uri = "activemq:foo")
    protected MyListener producer;

    public void doSomething() {
        // lets send a message
        String response = producer.sayHello("James");
    }
}
However, in my scenario, I need to set a different JMS queue for each environment. Therefore the JMS queue in:
@Produce(uri = "activemq:foo")
needs to come from a property file rather than being hardcoded.
How can I achieve this? Is there another way to do this without using the annotation?
Thank you very much.
Read the documentation about using property placeholders
http://camel.apache.org/using-propertyplaceholder.html
Once you have set this up, you can use placeholders in the uri string you define with the annotation:
@Produce(uri = "activemq:{{myQueue}}")
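Depending on your setup, the properties component has to be configured first; for example, in the XML DSL it can point at a property file (the file location is illustrative):

<camelContext xmlns="http://camel.apache.org/schema/spring" id="camelContext">
  <propertyPlaceholder id="properties" location="classpath:app.properties"/>
</camelContext>

with app.properties containing a line such as myQueue=fooQueue.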
Use the ProducerTemplate described here:
http://camel.apache.org/producertemplate.html
@Component
public class MyBean {

    @Autowired
    ProducerTemplate template;

    public void doSomething() {
        // lets send a message
        template.sendBody("your_mq_address", "James");
    }
}
Remember to define the template in the camel context:
<camelContext xmlns="http://camel.apache.org/schema/spring" id="camelContext">
<contextScan/>
<template id="template"/>
</camelContext>
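To pick the queue name up from a property file with this approach, the endpoint URI can simply be assembled from an injected value; a sketch (the property key my.queue.name is illustrative):

@Component
public class MyBean {

    @Autowired
    ProducerTemplate template;

    @Value("${my.queue.name}")
    String queueName;

    public void doSomething() {
        // send to the environment-specific queue
        template.sendBody("activemq:" + queueName, "James");
    }
}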