I have seen in many cases the .handle("someBean", "someMethod") EIP method playing a significant role in integration flows. I understand that it is just a Service Activator from the former XML config, but I need some clarification on how to create this bean and what someMethod returns. Also, in which cases do I have to use .handle(...)? A complete example using the Java DSL would help me.
As you correctly noticed, .handle("someBean", "someMethod") is fully equivalent to <int:service-activator ref="someBean" method="someMethod"/>: https://docs.spring.io/spring-integration/reference/html/messaging-endpoints-chapter.html#service-activator-namespace.
That means you should have a someBean definition whose someMethod is invoked as the service. For example, suppose you need some simple logic to convert the payload of the incoming message to upper case and return the result:
class MyService {

    public String someMethod(String payload) {
        return payload.toUpperCase();
    }

}
The return value of this method becomes the payload of the outbound message sent to the next EIP endpoint in your IntegrationFlow definition.
Everything you see in the Reference Manual holds for the Java DSL as well. In particular, all the rules for <service-activator> or @ServiceActivator apply to this .handle().
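For completeness, here is a minimal sketch of how such a bean can be wired into a flow with the Java DSL (the channel names are just placeholders):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
@EnableIntegration
public class UpperCaseFlowConfiguration {

    @Bean
    public MyService myService() {
        return new MyService();
    }

    @Bean
    public IntegrationFlow upperCaseFlow() {
        // equivalent to <int:service-activator input-channel="inputChannel"
        //                                      ref="myService" method="someMethod"/>
        return IntegrationFlows.from("inputChannel")
                .handle("myService", "someMethod")
                .channel("resultChannel")
                .get();
    }
}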
Related
My question is about finding a best practice for including data persistence inside an integration flow while still returning the Message object, so that it can be further processed by the flow.
Let's consider the following flow:
@Bean
IntegrationFlow myFlow() {
    return flowDefinition ->
            flowDefinition
                    .filter(filterUnwantedMessages)
                    .transform(messageTransformer)
                    .wireTap(flow -> flow.trigger(messagePayloadPersister)) // <--- here is the interesting part
                    .handle(terminalHandler);
}
In the wide majority of cases, instead of the wireTap, I have seen projects use a Transformer to persist data, which I do not particularly like, as the name implies transformation of a message, and persistence is something else.
My wish is to find alternatives to the wireTap, and a colleague of mine proposed using @ServiceActivator:
@Bean
IntegrationFlow myFlow() {
    return flowDefinition ->
            flowDefinition
                    .filter(filterUnwantedMessages)
                    .transform(messageTransformer)
                    .handle(messagePayloadPersister)
                    .handle(terminalHandler);
}

@Component
class MessagePayloadPersister {

    @ServiceActivator // <--- interesting, but..
    public Message<?> handle(Message<?> msg) {
        // persist the payload somewhere..
        return msg;
    }
}
I like the flow, it looks clean now, but I am also not 100% happy with the solution, as I am mixing the Java DSL with annotation-based Spring configuration.
Note: org.springframework.messaging.MessageHandler is not a good fit because its handleMessage method returns void, so it is a terminal part of the flow. I need a method that returns a Message object.
Is there any way to do this?
You need to understand what you are going to do with that persisted data in the future, and what information from the message you are going to store (or whether you store the whole message at all).
See these parts of the documentation - maybe something there will give you some ideas:
https://docs.spring.io/spring-integration/docs/5.3.2.RELEASE/reference/html/system-management.html#message-store
https://docs.spring.io/spring-integration/docs/5.3.2.RELEASE/reference/html/system-management.html#metadata-store
https://docs.spring.io/spring-integration/docs/5.3.2.RELEASE/reference/html/message-transformation.html#claim-check
https://docs.spring.io/spring-integration/docs/5.3.2.RELEASE/reference/html/core.html#persistent-queuechannel-configuration
https://docs.spring.io/spring-integration/docs/5.3.2.RELEASE/reference/html/jdbc.html#jdbc-outbound-channel-adapter
With the last one you may want to consider using a publishSubscribeChannel() in the Java DSL, so that one subscriber stores the data in the DB and a second subscriber continues the flow.
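For example, a rough sketch of that publish-subscribe variant applied to the flow from the question (jdbcMessageHandler is an assumed bean name, e.g. a JDBC outbound channel adapter handler):

@Bean
IntegrationFlow myFlow() {
    return flowDefinition ->
            flowDefinition
                    .filter(filterUnwantedMessages)
                    .transform(messageTransformer)
                    .publishSubscribeChannel(pubSub -> pubSub
                            // first subscriber: persist the payload (e.g. via the JDBC outbound channel adapter)
                            .subscribe(persist -> persist.handle(jdbcMessageHandler)))
                    // the remainder of the main flow becomes the second subscriber
                    .handle(terminalHandler);
}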
I want to get involved in the reactive programming world with Spring. As I understand it, Spring gives me a choice between two different paradigms: the annotation-based one (with the well-known @Controller and @RequestMapping) and the functional, reactive one (which is intended to resolve the "Annotation Hell").
My problem is a lack of understanding of what a typical reactive controller will look like. There are three conceptual interfaces I can use in my controller class:
HandlerFunction<T> (1) - I define a method for each specific ServerRequest which returns a concrete HandlerFunction<T> instance, then register these methods with a router. Right?
RouterFunction (2) and FilterFunction (3) - Is there a specific place where all RequestPredicates with their corresponding HandlerFunctions should be placed? Or can I do it separately in each controller (as I used to do with the annotation approach)? If so, how do I then notify a global handler (the router, if any?) to apply this router part from this controller?
This is how I see a reactive controller "template" for now:
public class Controller {

    // handlers
    private HandlerFunction<ServerResponse> handleA() {
        return request -> ok().body(fromObject("a"));
    }

    private HandlerFunction<ServerResponse> handleB() {
        return request -> ok().body(fromObject("b"));
    }

    private HandlerFunction<ServerResponse> handleC() {
        return request -> ok().body(fromObject("c"));
    }

    // router
    public RouterFunction<?> getRouter() {
        return route(GET("/a"), handleA()).and(
               route(GET("/b"), handleB()));
    }

    // filter
    public RouterFunction<?> getFilter() {
        return route(GET("/c"), handleC()).filter((request, next) -> next.handle(request));
    }
}
And, finally, how do I declare that this is a controller, without marking it with an annotation?
I've read the Spring reference and all posts related to this issue on the official blog. There are plenty of samples, but all of them are pulled out of context (IMHO) and I can't assemble them into a full picture.
I would appreciate it if you could provide a real-world example and good practices for how to organise interactions between these functions.
This is not a real-world example, but so far this is how I view some kind of organization for this:
https://github.com/LearningByExample/reactive-ms-example
As far as I'm concerned:
RouterFunction is the closest analogue to @Controller (@RequestMapping, to be precise) in terms of the new Spring approach:
Incoming requests are routed to handler functions with a RouterFunction<T> (i.e. Function<ServerRequest, Optional<HandlerFunction<T>>>). A router function evaluates to a handler function if it matches; otherwise it returns an empty result. The RouterFunction has a similar purpose as a @RequestMapping annotation. However, there is an important distinction: with the annotation your route is limited to what can be expressed through the annotation values, and the processing of those is not trivial to override; with router functions the processing code is right in front of you: you can override or replace it quite easily.
Then, instead of Spring Boot's SpringApplication.run in the main method, you run the server manually:
// route is your router function
HttpHandler httpHandler = RouterFunctions.toHttpHandler(route);
HttpServlet servlet = new ServletHttpHandlerAdapter(httpHandler);

Tomcat server = new Tomcat();
Context rootContext = server.addContext("", System.getProperty("java.io.tmpdir"));
Tomcat.addServlet(rootContext, "servlet", servlet);
rootContext.addServletMapping("/", "servlet");
server.start();
There are both reactive and non-reactive approaches; both are illustrated in the Spring examples on GitHub.
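As for marking something as a "controller" without an annotation: with Spring Boot 2 and WebFlux it is enough to expose the RouterFunction as a @Bean and it is picked up automatically. A minimal sketch, with the handlers inlined for brevity:

import static org.springframework.web.reactive.function.BodyInserters.fromObject;
import static org.springframework.web.reactive.function.server.RequestPredicates.GET;
import static org.springframework.web.reactive.function.server.RouterFunctions.route;
import static org.springframework.web.reactive.function.server.ServerResponse.ok;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.function.server.RouterFunction;
import org.springframework.web.reactive.function.server.ServerResponse;

@Configuration
public class RoutingConfiguration {

    @Bean
    public RouterFunction<ServerResponse> routes() {
        // registered automatically by Spring Boot; no @Controller annotation needed on the handler class
        return route(GET("/a"), request -> ok().body(fromObject("a")))
                .and(route(GET("/b"), request -> ok().body(fromObject("b"))));
    }
}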
I have it like this currently:
.route("headers.STATE", new Consumer<RouterSpec<ExpressionEvaluatingRouter>>() {
#Override
public void accept(RouterSpec<ExpressionEvaluatingRouter> spec) {
spec
.channelMapping(ProcStatus.NORMAL_OPERATION.toString(), "primaryChannel")
.channelMapping(ProcStatus.FAILED_OVER.toString(), "secondaryChannel")
.channelMapping(ProcStatus.UNKNOWN.toString(), "stateRetrievalChannel");
}
})
But it's not really a header value router per se, right? I can't seem to set HeaderValueRouter as the routing spec and just give the name of the header as the first param.
Plus, I couldn't find a default channel mapping on the spec. Thanks for the help!
To be honest, the <header-value-router> does not make much sense since the introduction of the SpEL-based router, where you can simply configure it with expression="headers.STATE", as in your Java DSL config.
Everything else is the same for any kind of Router implementation.
See more in the reference manual.
And, yes, you can use HeaderValueRouter directly as well:
.route(new HeaderValueRouter("STATE"), new Consumer<RouterSpec<ExpressionEvaluatingRouter>>() {

    @Override
    public void accept(RouterSpec<ExpressionEvaluatingRouter> spec) {
        spec
            .channelMapping(ProcStatus.NORMAL_OPERATION.toString(), "primaryChannel")
            .channelMapping(ProcStatus.FAILED_OVER.toString(), "secondaryChannel")
            .channelMapping(ProcStatus.UNKNOWN.toString(), "stateRetrievalChannel");
    }

})
But as you can see, the .channelMapping() part remains the same.
As for "default channel mapping". I think you just mean default-output-channel, which we have in the XML configuration.
If you noticed no one component in the SI Java DSL has an output-channel option (the default-output-channel plays the same role). We just propagate the next .channel() definition in the IntegrationFlow to the current outputChannel-aware component. So, to map the default-output-channel for the .route() you should just go ahead in the method-chain with the IntegrationFlow definition. Like this:
.route()
.handle()
So, if the routing condition doesn't match any .channelMapping() and resolutionRequired == false, the message will be sent to the next .handle() through the implicit DirectChannel between them.
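A rough sketch of that pattern, assuming the RouterSpec exposes resolutionRequired() just like the XML attribute, and with placeholder channel and handler names:

@Bean
public IntegrationFlow routingFlow() {
    return f -> f
            .route("headers.STATE", spec -> spec
                    .resolutionRequired(false)
                    .channelMapping(ProcStatus.NORMAL_OPERATION.toString(), "primaryChannel")
                    .channelMapping(ProcStatus.FAILED_OVER.toString(), "secondaryChannel"))
            // plays the role of default-output-channel: messages with an unmatched
            // STATE header land here via the implicit DirectChannel
            .handle(unmatchedStateHandler);
}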
I am implementing a message translator pattern with Apache Camel, to consume messages from a RESTful endpoint and send them onward to an AMQP endpoint.
The enclosing application is based on Spring Boot, so I'm using Camel's "spring-boot" component to integrate the two frameworks. As suggested by the documentation in this spring-boot link, I'm implementing my Camel route inside a @Configuration-annotated class which extends RouteBuilder:
@Component
public class MyRestToAmqpRouter extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("jetty:http://my-restful-url")
            .process(exchange -> {
                // convert the message body from JSON to XML, take some
                // incoming header values and put them in the outgoing
                // body, etc...
            })
            .to("rabbitmq://my-rabbitmq-url");
    }
}
My question is how to go about unit-testing this translation without needing an actual RESTful endpoint or a configured RabbitMQ broker. I've read many online examples, as well as the Camel in Action book... and it seems like the typical approach for unit testing a Camel route is to cut-n-paste the route into your unit test and replace one or more endpoint URLs with "mock:whatever".
I guess that sorta works... but it's awfully brittle, and your test suite won't recognize when someone later changes the real code without updating the unit test.
I've tried to adapt some Spring-based unit testing examples with mocks, like this:
@RunWith(CamelSpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {Application.class})
public class MyRestToAmqpRouterTest extends AbstractJUnit4SpringContextTests {

    @Produce(uri = "jetty:http://my-restful-url")
    private ProducerTemplate fakeRest;

    @EndpointInject(uri = "rabbitmq://my-rabbit-url")
    private MockEndpoint fakeRabbit;

    @Test
    @DirtiesContext
    public void testRouter() throws InterruptedException {
        fakeRabbit.expectedMessageCount(1);

        fakeRest.sendBodyAndHeader("", "header-1", "some value");

        fakeRabbit.assertIsSatisfied();
    }
}
My hope was that Camel would take those endpoint URLs from the unit test, register them as mocks... and then use the mocks rather than the real endpoint when the real code tries to use those URLs.
However, I'm not sure that this is possible. When I use the real URLs in the unit test I get IllegalArgumentExceptions, because you apparently can't inject a "real" endpoint URL into a MockEndpoint instance (only URIs prefixed with "mock:").
When I do use a "mock:..." endpoint URL in my unit test, then it's useless because there's nothing tying it to the real endpoint URL in the class under test. So that real endpoint URL is never overridden. When the real code is executed, it just uses the real endpoint as normal (and the goal is to be able to test without an external dependency on RabbitMQ).
Am I missing something on a really fundamental level here? It seems like there should be a way for unit tests to inject fake routes into a class like this, so that the code under test could switch from real endpoints to mock ones without even realizing it. Alternatively, I suppose I could refactor my code so that the anonymous Processor was promoted to a standalone class... and then I could unit test its translation logic independently of the route. But that just seems like an incomplete test.
Some pointers on what you can do.
You can re-read the Camel book's testing chapter, and pay attention to adviceWith:
http://camel.apache.org/advicewith.html
There is also mockEndpointsAndSkip:
http://camel.apache.org/mock.html
And you can also use the stub component:
http://camel.apache.org/stub
Or use property placeholders in your routes, and then configure the URIs to be mock/stub etc. for testing, and use the real ones for production:
http://camel.apache.org/using-propertyplaceholder.html
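To make the first two pointers concrete, here is a rough sketch (assuming Camel 2.x with the JUnit 4 test support; the exact mock URI may need adjusting) that advises the real route so the Jetty and RabbitMQ endpoints are swapped out without copy-pasting the route into the test:

import org.apache.camel.builder.AdviceWithRouteBuilder;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.mock.MockEndpoint;
import org.apache.camel.test.junit4.CamelTestSupport;
import org.junit.Test;

public class MyRestToAmqpRouterAdviceTest extends CamelTestSupport {

    @Override
    protected RouteBuilder createRouteBuilder() {
        return new MyRestToAmqpRouter(); // the real route under test
    }

    @Override
    public boolean isUseAdviceWith() {
        return true; // don't start the context until the route has been advised
    }

    @Test
    public void translatesJsonAndSendsToRabbit() throws Exception {
        context.getRouteDefinitions().get(0).adviceWith(context, new AdviceWithRouteBuilder() {
            @Override
            public void configure() {
                replaceFromWith("direct:start");      // no real Jetty endpoint needed
                mockEndpointsAndSkip("rabbitmq://*"); // no real RabbitMQ broker needed
            }
        });
        context.start();

        MockEndpoint rabbit = getMockEndpoint("mock:rabbitmq://my-rabbitmq-url");
        rabbit.expectedMessageCount(1);

        template.sendBodyAndHeader("direct:start", "{\"some\":\"json\"}", "header-1", "some value");

        rabbit.assertIsSatisfied();
    }
}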
Trying to figure out how to best unit test an http:outbound-gateway in a Spring Integration workflow.
Here's what our gateway looks like:
<int-http:outbound-gateway id="gateway"
request-channel="registrationQueue"
message-converters="jsonMessageConverter"
url-expression="#urlGenerator.resolve()"
http-method="POST"
expected-response-type="javax.ws.rs.core.Response"
reply-channel="nullChannel"
error-handler="httpResponseErrorHandler"/>
Specifically, we want to:
Assert serialization of the objects being sent; do the message-converters correctly process messages coming from the request-channel?
Verify response handling from the 3rd party service; what is the behavior given various responses (expected & unexpected) and errors (internal & external)?
We've got a number of unit tests that mock out the end points and assert the steps of our integration workflow behave as expected. Something like the following:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:test-config.xml"})
public class FileRegistrationWorkflowTest {

    ...

    @Autowired
    private MessageChannel fileFoundChannel;

    @Autowired
    private QueueChannel testRegistrationQueue;

    ...

    @Test
    public void shouldQueueRegistrationForFileWithEntityId() {
        // Given
        mockFileLookupService(FILE_ID, FILENAME_WITH_ENTITY_ID);

        // When
        fileFoundChannel.send(MessageBuilder.withPayload(FILE_ID).build());

        // Then
        Message<?> message = testRegistrationQueue.receive();
        assertThat(message, hasPayload(expected));
    }
}
This method of testing works great for the steps along the workflow. Our trouble is testing the endpoint gateways:
We can't mock the http:outbound-gateway, because then we aren't testing it.
We don't want to deploy a real HTTP service to interact with; that's more of an integration test.
The 3rd party service is only resolved by the url-expression, so there isn't a Spring bean to mock out.
Perhaps we can intercept the HTTP request Spring tries to send?
In the framework tests we use a DirectFieldAccessor to replace the endpoint's RestTemplate with a mock (actually a stub). However, this doesn't test the converters.
You can get even more sophisticated and test the real RestTemplate; just get a reference to it (with the SI TestUtils.getPropertyValue() or a DirectFieldAccessor) and configure it as discussed in the Spring Framework documentation.
You can get a reference to the handler with the bean name endpointId.handler.
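For example, a rough sketch along those lines, using MockRestServiceServer so the converters and response handling are exercised without a real HTTP service. The bean name gateway.handler follows from the endpoint id above, while the restTemplate field name, the resolved URL and the payload helper are assumptions for illustration; the static request matchers/response creators come from Spring's MockRestRequestMatchers and MockRestResponseCreators (imports elided, matching the tests above):

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:test-config.xml"})
public class OutboundGatewayTest {

    @Autowired
    private ApplicationContext applicationContext;

    @Autowired
    private MessageChannel registrationQueue;

    @Test
    public void shouldPostRegistrationAsJson() {
        // the MessageHandler behind <int-http:outbound-gateway id="gateway" .../>
        Object handler = applicationContext.getBean("gateway.handler");

        // "restTemplate" is the assumed internal field name; a DirectFieldAccessor works too
        RestTemplate restTemplate =
                TestUtils.getPropertyValue(handler, "restTemplate", RestTemplate.class);

        MockRestServiceServer server = MockRestServiceServer.createServer(restTemplate);
        server.expect(requestTo("http://example.org/registrations")) // whatever urlGenerator.resolve() yields
                .andExpect(method(HttpMethod.POST))
                .andRespond(withSuccess());

        // hypothetical payload helper; the converters configured on the gateway do the serialization
        registrationQueue.send(MessageBuilder.withPayload(someRegistrationPayload()).build());

        server.verify();
    }
}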