Start Spring Integration route on demand, not during context initialization

I have a Spring Integration route (made via the DSL) that polls files from a specific folder (as shown in Polling from file using Java DSL - compile error when adding Files.inboundAdapter) and sends them to Rabbit.
When I configured the flow as explained in the link above, it starts during the configuration stage already. I would, however, like to start it later, at runtime, since I need to connect to Rabbit first.
How can I configure IntegrationFlow to be started/stopped later on demand?

Add autoStartup(false) to the endpoint spec:
e -> e.poller(Pollers.fixedDelay(5000))
      .autoStartup(false)
then call flow.start() when you are ready.
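For context, here is a minimal sketch of the whole arrangement, assuming a Files inbound adapter feeding a channel that the Rabbit outbound part consumes (the directory, ids, and channel name are illustrative):
@Bean
public IntegrationFlow filePollingFlow() {
    return IntegrationFlows
            .from(Files.inboundAdapter(new File("/tmp/in")),
                    e -> e.poller(Pollers.fixedDelay(5000))
                          .autoStartup(false) // nothing polls until start() is called
                          .id("filePollingEndpoint"))
            .channel("toRabbit") // consumed by the AMQP outbound flow
            .get();
}
Then, once the Rabbit connection is established (this assumes a DSL version where StandardIntegrationFlow implements Lifecycle; on older versions, start the "filePollingEndpoint" bean instead):
@Autowired
private StandardIntegrationFlow filePollingFlow;

public void onRabbitReady() {
    this.filePollingFlow.start();
}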

Related

Spring batch - Keep the server running

I made this Spring Batch project (CSV to database, classic), which works fine: https://github.com/Tyvain/Spring-Batch-Generic-Bulk
On a separate project, I made an app with Vaadin where I can upload a file, which also works fine:
https://github.com/Tyvain/vaadin-simple-upload-file
-> We need to trigger the job when a file is uploaded.
So I regrouped the two projects into one app, with these steps:
disable batch at startup: job.enabled: false (in application.yml)
add the required Vaadin libraries to the pom
add the view (MainView.java).
Sources of these modifications: https://github.com/Tyvain/Spring-Batch-Generic-Bulk/tree/include-vaadin-upload-page
At this point I am still not sure how to launch the job, but I face another problem: when I launch the app, it stops. The server, which stays up in the Vaadin app, does not stay up here.
Edit:
when I remove these annotations from my job configuration, the server stays up:
//@Configuration
//@EnableBatchProcessing
1/ Is it possible to keep my server running with Spring Batch enabled?
2/ Is this bad practice?
Working solution: https://github.com/Tyvain/ProcessUploadedFile-Vaadin_SpringBatch
When you run a Spring Batch job from within a web server, you need to set up a JobLauncher that runs your job asynchronously in the background. This is explained in detail here: https://docs.spring.io/spring-batch/4.0.x/reference/html/job.html#runningJobsFromWebContainer
You would need to use an asynchronous TaskExecutor implementation (such as ThreadPoolTaskExecutor). See the example here: https://docs.spring.io/spring-batch/4.0.x/reference/html/job.html#configuringJobLauncher
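A minimal sketch of such a launcher, assuming Spring Batch 4.x; the bean name and pool size are illustrative:
@Bean
public JobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
    ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
    taskExecutor.setCorePoolSize(2);
    taskExecutor.initialize();

    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    jobLauncher.setTaskExecutor(taskExecutor); // jobs now run in the background
    jobLauncher.afterPropertiesSet();
    return jobLauncher;
}
With this in place, the Vaadin upload handler can call asyncJobLauncher.run(job, jobParameters) and return immediately.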
If you want to process an uploaded file right after it has been uploaded, Spring Batch is not the right approach. I would recommend processing it in a background thread after the upload using @Async and CompletableFuture; see more info here: Spring @Async with CompletableFuture
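A sketch of that alternative, assuming @EnableAsync is present on a configuration class; the class and method names are illustrative:
@Service
public class UploadProcessor {

    @Async
    public CompletableFuture<Long> process(Path uploadedFile) throws IOException {
        // e.g. parse the CSV and write it to the database
        try (Stream<String> lines = Files.lines(uploadedFile)) {
            return CompletableFuture.completedFuture(lines.count());
        }
    }
}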

Dynamic Spring Boot-Integration configuration

I would like to migrate a multi-threaded JSE application to Spring Integration, but I have to clarify some points first. First of all, the application will have the following Spring Integration components:
JMS to Transformer to router to TCPOut
TcpIn (to router) to Transformer to JMS
In this context, I have to load all the TCP connections dynamically from a configuration file. I have seen a couple of examples of this here on Stack Overflow (based on the FTP sample). These samples could be enough for the first part, but I am looking for how to do that in Spring Boot and what the best (and most elegant) way to create this type of configuration is.
Finally, I have to access each of the different contexts (this is maybe the most important part) from a kind of Swing monitor, to manually start/stop these TCP connections. Is this possible? What do you suggest I do?
All my current components use Java-based configuration (not the DSL).
See my answers to this question and its follow-up for examples of how to dynamically create application contexts using Java Configuration.
Also, take a look at the new feature in the Java DSL for dynamically registering/removing integration flows with the context. The 1.2 version of the DSL, containing this feature, will be released shortly.
You can stop/start endpoints using JMX or a control bus, or programmatically.
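For the programmatic option, which also works from a Swing action listener, a minimal sketch; it assumes the TCP adapter endpoint was registered as a bean named "tcpOutEndpoint" (the id is illustrative):
@Autowired
private ApplicationContext applicationContext;

public void setTcpEndpointRunning(boolean running) {
    // every Spring Integration endpoint implements Lifecycle
    Lifecycle endpoint = applicationContext.getBean("tcpOutEndpoint", Lifecycle.class);
    if (running) {
        endpoint.start();
    } else {
        endpoint.stop();
    }
}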

how dynamic create ftp adapter in spring integration?

Thanks for your attention.
I used Spring Integration in my project; I want to retrieve many input files from multiple FTP servers with different addresses, as in the image below:
How can I dynamically create inbound adapters in my project to poll and retrieve files from the servers?
If you're allowed to use non-"General Availability" (GA) versions of 3rd party libraries (e.g. release candidates (RC) or milestones (M)) in your project, then you can utilize version 5.0.0.M2 of Spring Integration. It is the latest published version as of Mar 09 '17.
Starting with 5.0, Spring Integration includes the Java DSL runtime flow registration feature. It allows you to define integration flows (including inbound adapters) just as you do in standard beans, but at any moment at runtime.
All you need are these three steps:
Get the IntegrationFlowContext bean from the Spring context, e.g. by autowiring it:
@Autowired
public MyClass(IntegrationFlowContext flowContext) {
    this.flowContext = flowContext;
}
Build a new flow with it, for example:
IntegrationFlowRegistration registration = flowContext
        .registration(IntegrationFlows // registration() accepts an IntegrationFlow instance
                .from(s -> s.ftp(ftpSessionFactory())
                        .localFilter(localFileFilter()))
                // ... other flow steps ...
                .get()) // standard end of the DSL flow building process
        .autoStartup(true) // not required but can be useful
        .register(); // this makes the flow exist in the context
When it's time to remove the dynamically created flow, just refer to the IntegrationFlowContext again, with the registration id you got in the previous step:
// retrieve registration ID from the object created above
String dynamicFlowRegistrationId = registration.getId();
// the following will also gracefully stop all the processes within the flow
flowContext.remove(dynamicFlowRegistrationId);
There is also a DynamicTcpClient sample available on GitHub.
See the dynamic-ftp sample. While it only covers the outbound side, there are links in the README to discussions about what needs to be done on the inbound side (put each adapter in a child context that sends messages to a channel in the main context).
Also see my answer to a similar question for multiple IMAP mail adapters using Java configuration and then a follow-up question.
You should be able to use the technique used there.

how to make persistent JMS messages with java spring boot application?

I am trying to make a queue with ActiveMQ and Spring Boot using this link, and it looks fine. What I am unable to do is make this queue persistent after the application goes down. I think that the SimpleJmsListenerContainerFactory should be durable to achieve that, but when I set factory.setSubscriptionDurable(true) and factory.setClientId("someid") I am unable to receive messages any more. I would be grateful for any suggestions.
I guess you are embedding the broker in your application. While this is OK for integration tests and proofs of concept, you should consider having a broker somewhere in your infrastructure and connecting to it. If you choose that, refer to the ActiveMQ documentation and you should be fine.
If you insist on embedding it, you need to provide a brokerUrl that enables message persistence.
Having said that, it looks like you are mixing up durable subscriptions and message persistence. The latter can be achieved by having a broker that actually stores the contents of its queues somewhere, so that if the broker is stopped and restarted it can restore them. The former lets a subscriber receive a message even if that subscriber was not active when the message was sent.
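To illustrate the distinction on the persistence side: whether a message survives a broker restart is a producer-side quality-of-service setting, independent of any subscription. A minimal sketch with JmsTemplate (the queue name is illustrative):
@Autowired
private JmsTemplate jmsTemplate;

public void sendPersistent(String payload) {
    jmsTemplate.setExplicitQosEnabled(true); // required for deliveryMode to take effect
    jmsTemplate.setDeliveryMode(DeliveryMode.PERSISTENT);
    jmsTemplate.convertAndSend("some.queue", payload);
}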
You can enable persistence of messages using the ActiveMQConnectionFactory.
As mentioned in the Spring Boot link you provided, this ActiveMQConnectionFactory gets created automatically by Spring Boot, so you can declare this bean manually in your application configuration and set various properties on it as well.
ActiveMQConnectionFactory cf = new ActiveMQConnectionFactory("vm://localhost?broker.persistent=true");
Here is the link http://activemq.apache.org/how-do-i-embed-a-broker-inside-a-connection.html
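For example, a manually declared bean along those lines; the data directory option is an assumption here (broker.* parameters on a vm:// URL are passed through to the embedded BrokerService):
@Bean
public ActiveMQConnectionFactory connectionFactory() {
    // persistent=true makes the embedded broker store messages on disk
    return new ActiveMQConnectionFactory(
            "vm://localhost?broker.persistent=true&broker.dataDirectory=./activemq-data");
}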

CamelContext doesn't startup if one route is misconfigured

We use the Java DSL to configure our routes. All route configurations are in a DB table and can be changed via a GUI.
How can we ensure that the CamelContext starts up even if a route is misconfigured (e.g. .to(invalidurl or a typo) in a route, or simply a bug in a route)?
Is there a possibility to validate the routes before starting, or, maybe better, some parameters/options that can be set on the context itself?
You can configure the routes with .autoStartup(false), and then start them manually once the CamelContext has started up.
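A minimal sketch of that, assuming Camel 2.x; the route id and endpoints are illustrative:
// in RouteBuilder.configure()
from("file:/tmp/in").routeId("dbConfiguredRoute")
    .autoStartup(false)
    .to("jms:queue:out");

// later, after the CamelContext itself is up and the route has been checked:
camelContext.startRoute("dbConfiguredRoute");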
Validation really depends on what kind of component it is. If it's a database component, you can write some code that runs an SQL query to check for a valid user login, or something similar.
Validating that an endpoint URI is misconfigured is harder, as endpoints have a ton of options. This is improving from Camel 2.16 onwards, though: tooling now generates a JSON schema file of each component's options at build time, which could potentially be leveraged while parsing the routes to check for invalid configuration before the endpoints are even created. That would detect errors sooner, and could also back IDE plugins or other third-party tooling.
Could you, just before adding each route to the context, add it to a separate "test" context individually and see whether it spins up or fails, and then, based on that, add it to your real context?
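A sketch of that probe, assuming each DB row can be turned into its own RouteBuilder (error handling trimmed):
public boolean routeStartsCleanly(RouteBuilder candidate) {
    CamelContext testContext = new DefaultCamelContext();
    try {
        testContext.addRoutes(candidate);
        testContext.start(); // fails here if the route is misconfigured
        return true;
    } catch (Exception e) {
        return false;
    } finally {
        try {
            testContext.stop();
        } catch (Exception ignored) {
        }
    }
}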
