Multifunctional Spring Boot Java application (REST/Batch/Lambda)

I have a Java Spring Boot application that polls a message from SQS, runs a job to upload the data to a database, and also exposes a REST API over that same database.
Now I need to decouple the upload functionality from the REST API.
The upload functionality would be done by an AWS Batch job, triggered by a Lambda.
The REST API would stay as it was before.
The challenge is that I need to do all of this within the same code repository, to avoid having three repositories: one for the REST API, another for the AWS Batch job, and a last one for the AWS Lambda handler.
So I am trying to find out what Spring Boot offers for running the same application in different modes. Please help.

I wouldn't recommend using Spring Boot for the lambda - technically you can, but it's a waste of money. Spring Boot adds overhead for Java: it requires more memory, so it's more expensive.
You need to create a multi-module Maven application. The modules would be:
1. Existing Spring Boot app (the REST API).
2. Batch job.
3. Common code, used by modules 1 and 2.
4. Simple new lambda (see the sketch after this list).
... more modules if you need ...
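For module 4, a minimal sketch of a plain-Java handler (no Spring) that triggers the AWS Batch job could look like the following; the handler class and the job, queue, and definition names are assumptions you would replace with your own:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
import software.amazon.awssdk.services.batch.BatchClient;
import software.amazon.awssdk.services.batch.model.SubmitJobRequest;

// Plain handler, no Spring: receives the SQS event and submits the AWS Batch job.
// "upload-job-queue" and "upload-job-definition" are placeholders for your AWS setup.
public class UploadTriggerHandler implements RequestHandler<SQSEvent, String> {

    private final BatchClient batchClient = BatchClient.create();

    @Override
    public String handleRequest(SQSEvent event, Context context) {
        SubmitJobRequest request = SubmitJobRequest.builder()
                .jobName("upload-job-" + System.currentTimeMillis())
                .jobQueue("upload-job-queue")
                .jobDefinition("upload-job-definition")
                .build();
        // Return the AWS Batch job id so it shows up in the Lambda logs.
        return batchClient.submitJob(request).jobId();
    }
}
```

Keeping the lambda in its own module means only the AWS SDK and Lambda runtime libraries end up in its deployment package, which keeps memory usage and cold starts low.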
But if you are still sure that for some reason you want to wrap the existing Spring Boot app into a lambda, this library will help you:
https://github.com/awslabs/aws-serverless-java-container/wiki/Quick-start---Spring-Boot

Related

Create traceID in a non-Spring application

A small question on how to create a traceID when the app is not Spring-based, please.
My application is the first one, the initiator of an HTTP call. Therefore, the app can be considered the client.
The destinations, the servers, are all Spring Boot / Spring Cloud based web applications. I would like to emphasize that while the servers are Spring-based, I, the client, am not a Spring Boot app.
While my app is a non-Spring app, I do use the Spring WebFlux WebClient to create the HTTP requests to those servers. To emphasize: using the Spring WebFlux WebClient does not make the app a Spring app!
Since I am the first in the call chain, I would like to create some kind of traceID, so that the subsequent services will carry the traceID I created.
I am puzzled as to what should go inside this piece of code that I tried:
final var response = webClient.post()
        .uri("http://some-third-party-api.com/someroute")
        .header("X-B3-TraceId", "How to create a traceID?")
        .body(BodyInserters.fromValue(payload))
        .retrieve()
        .bodyToMono(String.class)
        .block();
Therefore, I would like to ask: being the first, the HTTP call initiator, using a Spring WebFlux WebClient in a non-Spring app, how do I create a traceID so that the subsequent services get the one I created?
By default, Sleuth uses OpenZipkin's tracing library, Brave. If your application is Java-based, you can use Brave directly; if not, you can find official implementations for other platforms in the OpenZipkin org, and both official and non-official ones at https://zipkin.io/pages/tracers_instrumentation.html
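For a Java client like this one, a minimal standalone Brave sketch (no Spring) could look like the following; the service name and payload are placeholders, and the X-B3-* headers are written by Brave's default B3 propagation rather than by hand:

```java
import brave.Span;
import brave.Tracer;
import brave.Tracing;
import brave.propagation.TraceContext;
import org.springframework.http.HttpHeaders;
import org.springframework.web.reactive.function.BodyInserters;
import org.springframework.web.reactive.function.client.WebClient;

public class TracedClient {

    public static void main(String[] args) {
        // Standalone Brave tracer; B3 propagation is the default.
        Tracing tracing = Tracing.newBuilder()
                .localServiceName("my-non-spring-client")
                .build();
        Tracer tracer = tracing.tracer();

        // Injector that writes the trace context into WebClient's HttpHeaders.
        TraceContext.Injector<HttpHeaders> injector =
                tracing.propagation().injector(HttpHeaders::set);

        WebClient webClient = WebClient.create();
        String payload = "{\"hello\":\"world\"}"; // placeholder payload

        // Start a new root trace: we are the first hop in the call chain.
        Span span = tracer.newTrace().name("call-third-party").start();
        try {
            String response = webClient.post()
                    .uri("http://some-third-party-api.com/someroute")
                    .headers(h -> injector.inject(span.context(), h)) // adds the X-B3-* headers
                    .body(BodyInserters.fromValue(payload))
                    .retrieve()
                    .bodyToMono(String.class)
                    .block();
            System.out.println(response);
        } finally {
            span.finish();
        }
        tracing.close();
    }
}
```

The downstream Sleuth-based services will then pick up the X-B3-TraceId you created and carry it through the rest of the call chain.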

Spring Batch processing using Spring Cloud Data Flow

I have a Spring Batch app and I want an admin portal to manage failed jobs and see other job-related activity. I saw there was a Spring Batch Admin portal package in Spring, but it was deprecated in 2017 and I have to use Spring Cloud Data Flow instead, as mentioned here. For Spring Cloud Data Flow, is this a dependency we need to add to the project as an artifact, or is it a separate standalone service that needs to be set up?
My batch app has dozens of cron jobs. Can I just give my jar to Cloud Data Flow and it will take care of the rest, or do I need to configure each and every job there? Any samples are appreciated, as I want to know how big an effort it will be to set all this up.
On a side note: my app is a combination of some REST controllers and some batch jobs. Does it make sense to use Cloud Data Flow? If not, is there a better console manager for batch jobs (a portal to restart or cancel jobs, etc.)?
Spring Cloud Data Flow requires a server to be running, where you deploy your jars (registering tasks as wrappers over Spring Batch jobs). This server is responsible for orchestration and for deploying to a runtime. If you have a massive workload, you will probably need a cluster and Kubernetes, which supports scheduling via cron; but if a single server currently handles everything and you don't have performance issues, you can simplify things by using Local mode. With Local mode, however, you have to manage scheduling yourself, for example with Quartz.
https://dataflow.spring.io/docs/feature-guides/batch/scheduling/
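If you stay with Local mode and handle scheduling yourself, a rough Quartz-based sketch for launching one of your Spring Batch jobs might look like this; the job bean name is an assumption, and you would still have to wire a CronTrigger per job (for example via SchedulerFactoryBean with a SpringBeanJobFactory):

```java
import org.quartz.JobExecutionContext;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.scheduling.quartz.QuartzJobBean;

// Quartz job that launches a Spring Batch job; schedule it with a CronTrigger
// to replace the old in-app cron setup. Dependencies are injected by Spring.
public class LaunchBatchJob extends QuartzJobBean {

    private JobLauncher jobLauncher; // Spring Batch launcher
    private Job uploadJob;           // the batch job to run (assumed bean name)

    public void setJobLauncher(JobLauncher jobLauncher) { this.jobLauncher = jobLauncher; }
    public void setUploadJob(Job uploadJob) { this.uploadJob = uploadJob; }

    @Override
    protected void executeInternal(JobExecutionContext context) {
        try {
            // Unique parameter so every trigger firing creates a new job instance.
            jobLauncher.run(uploadJob, new JobParametersBuilder()
                    .addLong("launchTime", System.currentTimeMillis())
                    .toJobParameters());
        } catch (Exception e) {
            throw new RuntimeException("Failed to launch batch job", e);
        }
    }
}
```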
So using SCDF just for monitoring may be complicated and probably requires re-thinking your application design. Also, as far as I can see, SCDF is most useful when you have dependencies between tasks.
Maybe it would be easier for you to write a couple of REST endpoints to fetch failed jobs and re-run them - everything depends on what you need and how big your app is.
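If you go that route, a rough sketch of such endpoints on top of Spring Batch's JobExplorer and JobOperator could look like this; the paths and the 100-instance page size are arbitrary choices:

```java
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.launch.JobOperator;
import org.springframework.web.bind.annotation.*;

import java.util.ArrayList;
import java.util.List;

// Minimal admin endpoints: list failed executions and restart them by id.
@RestController
@RequestMapping("/batch")
public class BatchAdminController {

    private final JobExplorer jobExplorer;
    private final JobOperator jobOperator;

    public BatchAdminController(JobExplorer jobExplorer, JobOperator jobOperator) {
        this.jobExplorer = jobExplorer;
        this.jobOperator = jobOperator;
    }

    // Execution ids of failed runs across all known jobs (first 100 instances each).
    @GetMapping("/failed")
    public List<Long> failedExecutions() {
        List<Long> failed = new ArrayList<>();
        for (String jobName : jobExplorer.getJobNames()) {
            jobExplorer.getJobInstances(jobName, 0, 100).forEach(instance ->
                    jobExplorer.getJobExecutions(instance).stream()
                            .filter(exec -> exec.getStatus() == BatchStatus.FAILED)
                            .map(JobExecution::getId)
                            .forEach(failed::add));
        }
        return failed;
    }

    // Restart a failed execution; returns the id of the new execution.
    @PostMapping("/restart/{executionId}")
    public Long restart(@PathVariable long executionId) throws Exception {
        return jobOperator.restart(executionId);
    }
}
```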
PS:
I'm currently also weighing a plain Spring Batch monolith against cloud-ready tasks :)

Dependency injection/ORM in Vert.x for Java

I am learning the Vert.x framework with Java, and I was wondering whether there is any framework, such as Spring Core, or a library to perform dependency injection?
Also, I am looking for an ORM to interact with a relational database (e.g. Hibernate, or Spring Data in the Spring world).
Thank you for your recommendations!
You can use an integration between Spring and Vert.x in your project.
You can see examples here:
https://www.baeldung.com/spring-vertx
https://github.com/vert-x3/vertx-examples/tree/master/spring-examples
The general idea is to use Spring for configuring your application, taking advantage of its powerful annotations and dependency injection features, and to use Vert.x for creating the HTTP server that handles your requests with Vert.x's reactive model.
But if you find yourself writing all your request-handling code inside an executeBlocking (for example, because you use Spring Data and all your requests read from the DB), please don't do that. Instead, try to find alternative asynchronous ways of doing things (for example, for the DB you can use the Vert.x async clients).
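As a rough sketch of that idea (class names and the port are placeholders), a Spring context can wire a verticle's dependencies while Vert.x runs the HTTP server:

```java
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Vertx;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

public class Application {

    // Plain Spring bean with the business logic; it could just as well be a
    // Spring Data repository or any other Spring-managed component.
    static class GreetingService {
        String greet() {
            return "hello from Spring + Vert.x";
        }
    }

    // Verticle that receives its collaborators from the Spring context.
    static class HelloVerticle extends AbstractVerticle {
        private final GreetingService greetingService;

        HelloVerticle(GreetingService greetingService) {
            this.greetingService = greetingService;
        }

        @Override
        public void start() {
            // Vert.x handles HTTP on its event loop; keep the handler non-blocking.
            vertx.createHttpServer()
                    .requestHandler(req -> req.response().end(greetingService.greet()))
                    .listen(8080);
        }
    }

    @Configuration
    static class Config {
        @Bean
        GreetingService greetingService() {
            return new GreetingService();
        }

        @Bean
        HelloVerticle helloVerticle(GreetingService greetingService) {
            return new HelloVerticle(greetingService);
        }
    }

    public static void main(String[] args) {
        // Spring wires the beans; Vert.x deploys and runs the verticle.
        AnnotationConfigApplicationContext spring = new AnnotationConfigApplicationContext(Config.class);
        Vertx.vertx().deployVerticle(spring.getBean(HelloVerticle.class));
    }
}
```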

Spring Integration microservices visualization tool

I have an application based on microservices which communicate with each other via queues and topics. Each microservice is built using Spring Integration with XML configurations.
Is there a tool/framework that I could use to automatically generate a diagram of the whole application, which would ideally show the Spring Integration details for each microservice as well as the connections (via queues/topics) between the microservices?
Each application can be visualized using the spring integration runtime graph together with a viewer application.
See the file-split-ftp sample for an example. The viewer mentioned on the README page is in the angular 1.x branch of the referenced spring-flo project.
Also see Tim Ysewyn's blog post here "VISUALIZING YOUR SPRING INTEGRATION COMPONENTS & FLOWS".
If you create your microservices using Spring Cloud Stream and deploy/orchestrate them using Spring Cloud Data Flow, you can visualize them at a higher level; examples are in the reference manual.
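To expose the runtime graph mentioned above over HTTP so a viewer can fetch it, a minimal sketch is a small controller around IntegrationGraphServer; the endpoint path is an assumption, and the IntegrationGraphServer itself has to be declared as a bean in each application's XML configuration (bean class org.springframework.integration.graph.IntegrationGraphServer):

```java
import org.springframework.integration.graph.Graph;
import org.springframework.integration.graph.IntegrationGraphServer;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Serves the Spring Integration runtime graph as JSON for a viewer
// (e.g. the spring-flo based one) to render. Path is illustrative.
@RestController
public class IntegrationGraphController {

    private final IntegrationGraphServer graphServer;

    public IntegrationGraphController(IntegrationGraphServer graphServer) {
        this.graphServer = graphServer;
    }

    @GetMapping("/integration-graph")
    public Graph graph() {
        return graphServer.getGraph();
    }
}
```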

Spring Boot microservices: how to call one microservice from another

I am trying to build a new application with Spring Boot and a microservice architecture. I have tried some demos, but they are too simple and don't show how to call another service from one service. Should calls still go through HTTP, or should they go through RPC? If RPC, which RPC frameworks are supported?
The way of integrating services depends on numerous factors, like synchronicity/asynchronicity, the load that will be generated, etc. The most popular (I guess) way of integrating is a REST-based one. Because you tagged your question with spring, I would recommend using the declarative REST client Feign, which is very well described here. You can use message brokers as well, which are nicely abstracted by Spring Cloud Stream - you can read more here. I think a more in-depth discussion would have to be based on your needs.
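A minimal Feign sketch (the service name, URL, route, and DTO are all assumptions for illustration; you also need @EnableFeignClients on the application class) could be:

```java
import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;

// Declarative REST client: Spring Cloud OpenFeign generates the implementation,
// so calling the other microservice becomes an ordinary method call.
@FeignClient(name = "order-service", url = "http://localhost:8081")
public interface OrderClient {

    @GetMapping("/orders/{id}")
    OrderDto getOrder(@PathVariable("id") long id);
}

// Simple DTO mirroring the remote service's (assumed) response shape.
record OrderDto(long id, String status) {}
```

Inject OrderClient into any Spring bean and call orderClient.getOrder(42L) as if it were local; Feign handles the HTTP call, serialization, and error translation.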
If the other microservices expose a REST API, then you can simply use a Jersey client or HttpClient to call them.
