I'm trying to refactor some Camel routes to use camel-rx instead of the Camel DSL, and I've hit a point where I want to process the events inside an Observable but then use the ReactiveCamel class to send the Observable to an endpoint chosen based on conditions. For example, it would be useful to map my Observable to an object carrying target route information and then use ReactiveCamel to send to that target.
Is something like that possible or perhaps another way to implement this use case?
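Concretely, something like this sketch is what I'd like to write (assuming camel-rx's toObservable()/sendTo() methods; the endpoint names and the isUrgent() predicate are made up):
import org.apache.camel.Message;
import org.apache.camel.rx.ReactiveCamel;
import rx.Observable;

ReactiveCamel rx = new ReactiveCamel(camelContext);
Observable<Message> events = rx.toObservable("direct:events");

// partition the stream by a condition and send each branch to its own route
rx.sendTo(events.filter(m -> isUrgent(m)), "direct:fast");
rx.sendTo(events.filter(m -> !isUrgent(m)), "direct:slow");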
I have a Java interface that I have to implement that looks like this:
public Flow.Publisher<Packet> getLivePublisher();
This interface must return a Flow.Publisher that stays inactive until it is subscribed to and the subscriber calls Subscription.request(n).
So far, my implementation looks like this:
return Source
.fromIterator(() -> new LivePacketIterator())
.async("live-dispatcher")
.runWith(JavaFlowSupport.Sink.asPublisher(AsPublisher.WITHOUT_FANOUT), actorSystem);
Unfortunately, this seems to start pulling elements from my LivePacketIterator immediately, even when no subscribers have subscribed to the returned Flow.Publisher.
I understand that a Source is just a sort of template for a subscribable source of objects (my understanding is that it's like a factory of Publishers), and that it only becomes a concrete, active source once it's materialized. So if I understand correctly, I need to somehow materialize my Source to get a Flow.Publisher, but I want it to be materialized in such a way that it only starts running when it is subscribed to.
I've also tried to use toMat():
return Source
.fromIterator(() -> new LivePacketIterator(maximumPacketSize))
.filter(OrderSnapshotPacket::isNotEmpty)
.async(dbDispatcher)
.toMat(JavaFlowSupport.Sink.asPublisher(AsPublisher.WITHOUT_FANOUT), Keep.right())
.???;
But I'm not sure what to do with the resulting RunnableGraph.
Am I understanding this correctly?
Is there a way to do what I'm trying to do?
Unfortunately, this seems to start pulling elements from my LivePacketIterator immediately, even when no subscribers have subscribed to the returned Flow.Publisher.
What exactly do you observe to state this? I used a very similar snippet to yours:
Flow.Publisher<Integer> integerPublisher =
Source.from(List.of(1,2,3,4,5))
.wireTap(System.out::println)
.async()
.runWith(
JavaFlowSupport.Sink.asPublisher(AsPublisher.WITHOUT_FANOUT),
ActorSystem.create());
This will not start emitting items from the list until the publisher is subscribed to.
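You can check this by subscribing manually; only then does the wireTap start printing (a minimal no-op subscriber, just for illustration):
import java.util.concurrent.Flow;

// nothing is printed until this subscription happens
integerPublisher.subscribe(new Flow.Subscriber<Integer>() {
    @Override public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
    @Override public void onNext(Integer item) { /* consume */ }
    @Override public void onError(Throwable t) { t.printStackTrace(); }
    @Override public void onComplete() { }
});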
I understand that a Source is just a sort of template for a subscribable source of objects (my understanding is that it's like a factory of Publishers), and that it only becomes a concrete, active source once it's materialized
Kind of. All Flow.* interfaces are part of the Reactive Streams specification for the JVM. Akka Streams treats those interfaces as an SPI and doesn't use them directly in its API; it introduces its own abstractions: Source, Flow and Sink. Akka Streams lets you convert a processing stream expressed in its API to the lower-level Flow.* types, just as you did in your snippet. This is useful if, say, you want to plug an Akka Streams processing pipeline into some other Reactive Streams implementation, such as RxJava or Project Reactor.
So Source is Akka Streams' abstraction that is roughly equivalent to Flow.Publisher; that is, it's a source of a potentially infinite number of values. You need to connect a Source to a Sink (potentially via a Flow) to get a RunnableGraph, which you can run. Running it sets everything in motion, and in most cases it causes a chain of subscriptions, and elements start flowing through the stream. But that is not the only option: in the case of the JavaFlowSupport.Sink.asPublisher Sink, running the RunnableGraph converts the whole Akka Stream into an instance of Flow.Publisher. The semantics are that subscription is deferred until something, somewhere, calls subscribe on that instance. Which is exactly what you're trying to achieve, if I understand correctly.
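As for your toMat() attempt: calling run() on the RunnableGraph is what gives you the Flow.Publisher, and with this Sink the work is still deferred until subscribe(). A sketch using the names from your snippet (assuming Akka 2.6's run(ActorSystem) overload):
Flow.Publisher<Packet> publisher =
    Source.fromIterator(() -> new LivePacketIterator(maximumPacketSize))
        .filter(OrderSnapshotPacket::isNotEmpty)
        .async(dbDispatcher)
        .toMat(JavaFlowSupport.Sink.asPublisher(AsPublisher.WITHOUT_FANOUT), Keep.right())
        // Keep.right() keeps the Sink's materialized value: the publisher itself
        .run(actorSystem);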
How can I create a producer in the Spring Cloud Stream functional model?
The following annotation-based version is now deprecated:
@Output(OUTPUT)
MessageChannel outbound();
I know that this can be achieved with a Java Supplier bean, but the framework polls it and sends a message every second. I don't need it to send every second; I am going to replace a REST API with Kafka.
Are there any ways to do that?
Use the StreamBridge - see Sending data to an arbitrary output.
Here we autowire a StreamBridge bean which allows us to send data to an output binding, effectively bridging a non-stream application with spring-cloud-stream. Note that the preceding example does not have any source functions defined (e.g., a Supplier bean), leaving the framework with no trigger to create source bindings, which would be typical for cases where the configuration contains function beans. So to trigger the creation of a source binding, we use the spring.cloud.stream.source property, where you can declare the name of your sources.
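A minimal sketch of that pattern, assuming a source named toKafka declared via spring.cloud.stream.source=toKafka (the controller, Order type and endpoint are illustrative):
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class OrderController {

    private final StreamBridge streamBridge;

    public OrderController(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    @PostMapping("/orders")
    public void publish(@RequestBody Order order) {
        // sends only when this endpoint is hit, not on a timer like Supplier
        streamBridge.send("toKafka-out-0", order);
    }
}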
If you want to trigger a stream from an external Kafka topic, you can also bind a Spring Cloud Stream processor's input to that topic. The StreamBridge provides a layer of abstraction that may be cleaner, i.e., your non-stream application does not use the Kafka API directly.
I have an io.grpc.Context that comes from a Publisher. Can I write that into the Reactor Context?
Mono.just(io.grpc.Context.current())
.contextWrite(grpcContext ->
reactor.util.context.Context.of(MY_KEY, grpcContext))
Is something like that possible?
The reason is that I have an onNext operator hook in which I want to listen for this change in the gRPC Context.
Is something like that possible?
Not if you want to access the context downstream - context injection is part of the subscription signal, which flows upstream from bottom to top. This is in contrast to the other reactive signals (next, error) which flow downstream from top to bottom.
Your only sensible option if you want to read a value like this downstream is to include it as part of the element itself - usually by using zip(), zipWith() or similar.
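For example, a sketch of that approach (payloadFlux, the Payload type and handle() are hypothetical stand-ins):
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.util.function.Tuple2;

// pair every element with the current gRPC context so it travels downstream
Flux<Tuple2<Payload, io.grpc.Context>> tagged =
    payloadFlux.zipWith(Mono.fromSupplier(io.grpc.Context::current).repeat());

tagged.subscribe(t -> handle(t.getT1(), t.getT2()));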
I recently took over some Java code, and there was a method that took in an object and, based on some properties of that object, performed some processing on it.
I was playing around with Apache Camel and was able to define a route that accomplishes the same task. Where I'm struggling is finding the easiest way to pass an object to the route and execute the logic. What I have right now is a
producerTemplate.sendBody("direct:blah", myObject)
and the route itself defines a
from("direct:blah").process(...)
The above is working fine, albeit a little slower than before.
Is this the simplest way to replace the logic of a method? I was hoping to just be able to grab the route itself and pass an object to it for execution, but I don't see any ways to do this.
You don't necessarily need a from().process(). You can also inject an endpoint into your method. For example:
@Consume(uri = "direct:blah")
public void onFileSendToQueue(String body, @Header("CamelFileName") String name) {
    LOG.info("Incoming file: {}", name);
    producer.sendBody(body);
}
You can do the same for producers as well. See the Camel POJO messaging example for more details:
http://camel.apache.org/pojo-messaging-example.html
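A sketch of the producer side, assuming Camel's @Produce injection (OrderService and MyObject are illustrative names):
import org.apache.camel.Produce;
import org.apache.camel.ProducerTemplate;

public class OrderService {

    // Camel injects a ProducerTemplate bound to this endpoint
    @Produce(uri = "direct:blah")
    private ProducerTemplate producer;

    public void handle(MyObject myObject) {
        producer.sendBody(myObject);
    }
}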
I am trying to implement a Camel Component/Processor that takes one input and produces multiple output messages, similar to a Splitter. Like the Splitter, the output should go to the next processor/endpoint in the route.
I have looked at the Splitter and MulticastProcessor classes in the hope that I can reuse them or apply similar logic. The idea, as I understood it, is to create a new Exchange for each output and emit them. To do this, I need to provide the endpoint to which the output is written. This works if I dynamically create the endpoint within the Processor class; my requirement, though, is to send the output to the endpoint configured in the route. That is, in the route below, mycomponent needs to write (multiple times) to file:output.
<route>
<from uri="file:input"/>
<to uri="mycomponent:OrderFlow?multi.output=true"/>
<to uri="file:output" />
</route>
In the case of the Splitter, it is instantiated by the SplitDefinition class, which has access to the output Processor/Endpoint.
a) From within a Processor, is it possible to access the configured output Processor/Endpoint?
b) If not, should I be writing a ProcessorDefinition class for my processor? Any pointers on this would help.
Two solutions suggested below by Petter are:
a) Inject a producer template.
b) Use the Splitter component with a method call instead of writing a new component.
I assume you have read this page.
Yes, you can send multiple exchanges from a custom processor, but not really to the next processor in the flow. As described in the link above, you can decouple the component implementation by injecting a producer template with a specific destination. You can cut your route into several parts using the direct or seda transport and make your component send the messages there. This way, you can reuse the code in several routes.
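A sketch of that decoupling (MultiOutputProcessor, splitIntoParts() and direct:partsOut are illustrative names):
import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.ProducerTemplate;

public class MultiOutputProcessor implements Processor {

    private final ProducerTemplate template;

    public MultiOutputProcessor(ProducerTemplate template) {
        this.template = template;
    }

    @Override
    public void process(Exchange exchange) {
        // splitIntoParts(...) is a stand-in for your own splitting logic
        for (Object part : splitIntoParts(exchange.getIn().getBody())) {
            // each part becomes its own exchange on the decoupled route
            template.sendBody("direct:partsOut", part);
        }
    }
}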
This is, as you point out, done in the Splitter component (among others) in Camel core. Take a look at the MulticastProcessor base class, for example. However, those processors are aware of the processors that follow them in the route, thanks to the route builder. Your custom processor is not that lucky.
You can, nonetheless, extract that information from the CamelContext: get hold of your route, and there you can find the processors in it. However, that seems like overcomplicating things.
UPDATE:
Instead of trying to alter the DSL, make use of the already existing DSL and components.
.split().method("mycomponent", "OrderFlow")
Instead of emitting new exchanges, your OrderFlow method just needs to return a List<...> with the resulting messages.
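A sketch of such a bean, registered as "mycomponent" so the DSL above finds it (splitting on newlines is just an illustration):
import java.util.Arrays;
import java.util.List;

public class MyComponent {

    // each list element becomes its own exchange and continues down the route
    public List<String> OrderFlow(String body) {
        return Arrays.asList(body.split("\n"));
    }
}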