I made this Spring Batch project (CSV to database, classic), and it works fine: https://github.com/Tyvain/Spring-Batch-Generic-Bulk
On a separate project, I made a Vaadin app where I can upload a file, which also works fine:
https://github.com/Tyvain/vaadin-simple-upload-file
-> We need to trigger the job when a file is uploaded.
So I merged the two projects into one app, with these steps:
disable the automatic job run: spring.batch.job.enabled: false (in application.yml)
add the required Vaadin libraries to the pom
add the view (MainView.java).
Source of these modifications: https://github.com/Tyvain/Spring-Batch-Generic-Bulk/tree/include-vaadin-upload-page
At this point I am still not sure how to launch the job, but I face another problem: when I launch the app, it stops immediately. The server, which stays up in the Vaadin app, does not stay up here.
Edit:
When I remove these annotations from my job configuration, the server stays up:
//@Configuration
//@EnableBatchProcessing
1/ Is it possible to keep my server running with Spring Batch enabled?
2/ Is this bad practice?
Working solution: https://github.com/Tyvain/ProcessUploadedFile-Vaadin_SpringBatch
When you run a Spring Batch job from within a web server, you need to set up a JobLauncher that runs your job asynchronously in the background. This is explained in detail here: https://docs.spring.io/spring-batch/4.0.x/reference/html/job.html#runningJobsFromWebContainer
You would need to use an asynchronous TaskExecutor implementation (such as ThreadPoolTaskExecutor). See example here: https://docs.spring.io/spring-batch/4.0.x/reference/html/job.html#configuringJobLauncher
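A minimal sketch of such a launcher configuration, assuming Spring Batch 4's SimpleJobLauncher; the JobRepository bean is whatever your @EnableBatchProcessing setup already provides, and the pool size is an arbitrary placeholder:

```java
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class AsyncLauncherConfig {

    // With an asynchronous TaskExecutor, run() returns immediately instead of
    // blocking the upload request thread; the job executes in the background.
    @Bean
    public SimpleJobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(2); // placeholder sizing
        executor.afterPropertiesSet();

        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.setTaskExecutor(executor);
        launcher.afterPropertiesSet();
        return launcher;
    }
}
```

Your Vaadin upload listener could then inject this launcher and call launcher.run(job, jobParameters) without blocking the UI thread.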
If you want to process an uploaded file right after it has been uploaded, Spring Batch is not the right approach. I would recommend processing it in a background thread after the upload using @Async and CompletableFuture; see more info here: Spring @Async with CompletableFuture
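A sketch of the CompletableFuture side of that idea in plain Java (the class and method names are placeholders). In a Spring service the method would carry @Async and the framework would supply the executor; here the common pool is used so the example is self-contained:

```java
import java.util.concurrent.CompletableFuture;

public class UploadProcessor {

    // Kick off processing in the background and return immediately.
    // The caller can poll isDone() or attach callbacks with thenAccept().
    public CompletableFuture<String> process(String fileName) {
        return CompletableFuture.supplyAsync(() -> {
            // ... parse the file, write rows to the database ...
            return "processed:" + fileName;
        });
    }

    public static void main(String[] args) {
        UploadProcessor p = new UploadProcessor();
        CompletableFuture<String> result = p.process("data.csv");
        System.out.println(result.join()); // blocks only here, for the demo
    }
}
```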
Related
I have a regular Java/Spring Batch job that runs every night to get data from one database and insert/update it in my project's database. This works fine in the current setup, where it is deployed on Tomcat.
Now I need to separate it out and run it on an Azure WebJob. What will be a good approach?
Can I use Spring Boot for this purpose?
But I am not sure how it will work. I mean, I can create a JAR of my project (the job written using Spring Boot), copy it to an Azure WebJob, and then have a batch file with "java -jar...", but:
wouldn't that be like running and deploying the Spring Boot app with its embedded web server, which will continue to run once I start it?
secondly, the next time the batch file is executed by the Azure WebJob on the schedule I set, it will try to run the Spring Boot app again, and I will probably get a bind exception since the port is already in use from the first run.
Would appreciate if somebody can help me in doing this.
Thank you.
wouldn't that be like running and deploying the Spring Boot app with its embedded web server, which will continue to run once I start it?
A Spring Boot app can be a non-web app, and a good example is a Spring Boot batch app without a dependency on a servlet container.
You can create a Spring Boot app that runs your Spring Batch job and then stops when the job is finished, without the need to deploy it in an (embedded) Tomcat. You can find an example here: https://spring.io/guides/gs/batch-processing/
secondly, the next time the batch file is executed by the Azure WebJob on the schedule I set, it will try to run the Spring Boot app again, and I will probably get a bind exception since the port is already in use from the first run.
Once you have a script that runs your app with java -jar mybatchapp.jar, you can use the Azure Scheduler to run your job whenever you want. Since your batch app does not contain/start an embedded servlet container, you won't get a port conflict.
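A sketch of such a run-and-exit batch entry point, assuming spring-boot-starter-batch is on the classpath and spring-boot-starter-web is not (so no embedded Tomcat starts and no port is ever bound):

```java
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
@EnableBatchProcessing
public class BatchApplication {

    public static void main(String[] args) {
        // Boot auto-runs the configured Job(s) on startup, then the context
        // closes and the JVM exits with the batch exit code. Repeated
        // scheduled runs therefore cannot collide on a port.
        System.exit(SpringApplication.exit(
                SpringApplication.run(BatchApplication.class, args)));
    }
}
```

The Azure WebJob's batch file then just invokes java -jar against this artifact on each scheduled run.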
I have a Spring Integration route (made via DSL) that polls the files from a specific folder (as shown in Polling from file using Java DSL - compile error when adding Files.inboundAdapter) and sends to Rabbit.
When I configure the flow as explained in the link above, it already starts at configuration time. However, I would like to start it later, at runtime, since I need to connect to Rabbit first.
How can I configure IntegrationFlow to be started/stopped later on demand?
Add autoStartup(false).
e -> e.poller(Pollers.fixedDelay(5000))
.autoStartup(false)
then call flow.start() when you are ready.
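The snippet above in context, as a hedged sketch assuming Spring Integration 5's DSL packages; the directory, fixed delay, and routing key are placeholders:

```java
import java.io.File;

import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.amqp.dsl.Amqp;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.dsl.Files;

@Configuration
public class FileToRabbitFlowConfig {

    @Bean
    public IntegrationFlow filePollingFlow(RabbitTemplate rabbitTemplate) {
        return IntegrationFlows
                .from(Files.inboundAdapter(new File("/tmp/in")),
                      e -> e.poller(Pollers.fixedDelay(5000))
                            .autoStartup(false)) // do not poll until started explicitly
                .handle(Amqp.outboundAdapter(rabbitTemplate).routingKey("files"))
                .get();
    }
}
```

Later, once the Rabbit connection is ready, inject the IntegrationFlow bean (at runtime it is a StandardIntegrationFlow, which implements Lifecycle) and call flow.start() to begin polling.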
I would like Flyway to run whenever I deploy a new war to my server.
Does Flyway automatically run when the server is deployed? Do I always have to automate a script which would then run the Flyway migration command? Or what is the best way to do this?
Server:
The server is a Java Tomcat server running on Elastic Beanstalk (AWS), connected to a MySQL database.
Deployment Process
We run our SQL migration scripts on the database manually. Then we upload a new war of the server to Elastic Beanstalk.
This can be useful:
Auto-migration on startup: https://flywaydb.org/documentation/api/
So for Java, all it takes is to create scripts (e.g. V1__initial_schema.sql, ...), put them under /src/main/resources/db/migration/,
and then:
Flyway flyway = new Flyway();
flyway.setDataSource(...);
flyway.migrate();
As the comments said, there may be multiple ways to do this.
ServletContextListener
One common way is to use the hook defined by the Java Servlet spec for being notified when your web app is launching and shutting-down. That hook is the ServletContextListener interface. Add a class to your project implementing the two methods in this interface, one for launch and one for shutdown. In the launch method, run your Flyway code.
The word “context” is the technical term meaning your web app.
contextInitialized — Your web app is launching. No incoming web request has yet been handled, and none will be handled until your implementation of this method completes. Run your Flyway migrations here.
contextDestroyed — Your web app is shutting down. The last remaining web request has been serviced, and no more will be accepted.
Annotating this class with @WebListener is the easiest of several ways to get your servlet container to register an instance.
Pretty easy.
Your ServletContextListener is guaranteed to be called and run to completion before the first execution of any Servlet (or Filter) in your web app. So this is the perfect place to do setup work that you want finished before your servlets go to work. Flyway seems like a natural fit to me.
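A minimal sketch of such a listener, using the same pre-Flyway-6 API style as the snippet above (new Flyway() plus setDataSource); the JDBC URL and credentials are placeholders you would read from your environment:

```java
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

import org.flywaydb.core.Flyway;

@WebListener
public class FlywayMigrationListener implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        // Runs before any Servlet or Filter handles a request, so the schema
        // is guaranteed to be migrated before the app serves traffic.
        Flyway flyway = new Flyway();
        flyway.setDataSource("jdbc:mysql://localhost/mydb", "user", "password"); // placeholders
        flyway.migrate(); // picks up scripts from classpath:db/migration by default
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // Nothing to clean up for Flyway.
    }
}
```

If migrate() throws, the exception propagates out of contextInitialized, which is exactly the under-specified failure case discussed below.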
Search Stack Overflow for “ServletContextListener” to learn more and see examples, such as my own Question & Answer.
Handling failure
Be aware that stopping a web app’s deployment when something goes wrong (when your ServletContextListener encounters an Exception) is not well-defined in the Servlet spec.
An example might be your Flyway migrations failing for some reason, such as not able to connect to database. At that point you might want to halt deployment of your web app.
See my own Question and Answer and the group of related questions I list in that answer. Tomcat 8.0.33 halts the deployment, and un-deploys the web app, but unfortunately does not report the offending Exception (or at least I could not find any such report in the logs nor in the IDE console while in development mode). The behavior of other Servlet containers may vary.
I have a Spring Boot application, which shows some data from MongoDB on an AngularJS page and allows the user to change it.
Now I need to create a mechanism, which allows me to
run a long (1-3 hours) Java method,
which produces some files and
observe its state via the web (is it running, is it finished, did it crash?).
Can I implement this in scope of the Spring Boot application? If yes, what parts of Spring can I use to do that?
I would argue that it's not a good idea to embed batch processing into the service exposing your MongoDB data.
I would create a separate batch application. Spring Batch would be the natural choice if you are using the Spring stack. You would need to figure out how you want to host the Spring Batch job and how you want to trigger and schedule it. Spring Batch needs SQL storage for its metadata.
The status of the batch processing could be monitored by another application running the Spring Batch Admin module on a servlet container. If you point this application at the SQL storage of your Spring Batch job application, you get status monitoring via a web UI out of the box.
Of course, each app could run with Spring Boot.
If you don't want to handle the operational complexity of hosting three applications, you can still embed all three into one, and it would work fine with Spring Boot. You could even execute jobs with parameters manually, or restart them, via a Spring Batch Admin configured to have access to the Spring Batch job beans.
You could also explore using MongoDB as the storage for Spring Batch metadata. E.g. this project may help. But such a mechanism would need to be used by the Spring Batch application and also by the Spring Batch Admin module visualizing the status of processing.
I'm trying to implement the failover strategy when executing jbpm6 processes. My setup is the following:
I'm using jBPM 6.2.0-Final (the latest stable release) with persistence enabled
I'm constructing an instance of org.kie.spring.factorybeans.RuntimeManagerFactoryBean with type SINGLETON to get a KSession to start/abort processes and complete/abort work items
all beans are wired by Spring 3.2
DB2 is used as the database engine
I use Tomcat 7.0.27
In the positive scenario everything works as I expect. But I would like to know how to resume a process in case of a server crash. To reproduce it, I started my process (described as a BPMN2 file), got to some middle step, and killed the Tomcat process. After that I see an uncompleted process instance in the PROCESS_INSTANCE_INFO table and an uncompleted work item in the WORK_ITEM_INFO table. There is also a session in the SESSION_INFO table.
My question is: could you show me an example of code which would take that remaining process and resume it, starting from the last node (if that is possible)?
Update
I forgot to mention that I'm not using jbpm-console; I'm embedding jBPM into my Java EE application.
If you initialize your RuntimeManager on init of your application server, it should take care of reloading and resuming the processes.
You need not worry about reloading them yourself.