We are currently planning to migrate our production application to microservices. The application is implemented in Scala with Akka, because it relies on high concurrency to process items delivered via MQ; it is a messaging application (it delivers mails and SMS). We have a few services exposed over REST, but the main processing is done via MQ, with some other tasks handled by batch processing. The question is whether the MQ part can be migrated to a RESTful approach, and how to achieve that degree of high concurrency. If it cannot be created as a REST resource, can it be kept as a microservice that continuously listens to an MQ (JMSListener), as it does now? Thanks.
I have a Lagom service which writes events for persistent entities to Cassandra.
I would like to process those events from another application (not Lagom-based) by embedding a read-side processor which would connect to the same Cassandra.
I did not find any documentation on how to embed Lagom services or read-side processors into existing Java/Scala applications. Is it possible?
It isn't recommended to have multiple services sharing the same database. This causes tight coupling between the services and can make it hard to upgrade in the future.
Instead, you can use the Lagom Message Broker Topic Producer API to publish events to a Kafka topic. Then, you can consume these from another application, either using Lagom with the Message Broker Consumer API, or using any other Kafka client.
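For illustration, here is a rough sketch of what the Topic Producer side might look like with the Lagom Java DSL. The `MessagingService`, `MessagingEvent` (and its `TAG`), and the `messaging-events` topic name are made-up placeholders, not anything from the question:

```java
// MessagingService.java -- the service descriptor declares a Kafka topic
// alongside the service's normal calls.
import static com.lightbend.lagom.javadsl.api.Service.named;
import static com.lightbend.lagom.javadsl.api.Service.topic;

import com.lightbend.lagom.javadsl.api.Descriptor;
import com.lightbend.lagom.javadsl.api.Service;
import com.lightbend.lagom.javadsl.api.broker.Topic;

public interface MessagingService extends Service {

  Topic<MessagingEvent> messagingEvents();

  @Override
  default Descriptor descriptor() {
    return named("messaging-service")
        .withTopics(topic("messaging-events", this::messagingEvents));
  }
}
```

```java
// MessagingServiceImpl.java -- streams the persisted entity events out to Kafka,
// tracking the offset so publishing resumes where it left off after a restart.
import com.lightbend.lagom.javadsl.api.broker.Topic;
import com.lightbend.lagom.javadsl.broker.TopicProducer;
import com.lightbend.lagom.javadsl.persistence.PersistentEntityRegistry;

public class MessagingServiceImpl implements MessagingService {

  private final PersistentEntityRegistry registry;

  public MessagingServiceImpl(PersistentEntityRegistry registry) {
    this.registry = registry;
  }

  @Override
  public Topic<MessagingEvent> messagingEvents() {
    // A real service would usually map the internal event to a public message type here.
    return TopicProducer.singleStreamWithOffset(offset ->
        registry.eventStream(MessagingEvent.TAG, offset));
  }
}
```

On the consuming side, any standard Kafka consumer subscribed to that topic can then process the published events without depending on Lagom at all.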
My current understanding is that both of these projects are under Spring Cloud Data Flow and serve as components of the pipeline. However, both can be made recurring (a stream is by definition recurring, while a task can run at a given time interval). In addition, both can be configured to communicate with the rest of the pipeline through the message broker. Currently there is this unanswered question, so I've yet to find a clear answer.
Please see my response below:
My current understanding is that both of these projects are under Spring Cloud Data Flow and serve as components of the pipeline.
Neither Spring Cloud Stream nor Spring Cloud Task is under Spring Cloud Data Flow; instead, they can be used as standalone projects, and Spring Cloud Data Flow simply uses them.
Spring Cloud Stream lets you bind your event-driven, long-running applications to a messaging middleware or a streaming platform. As a developer, you have to choose your binder (there are binder implementations for RabbitMQ, Apache Kafka, etc.) to stream your events or data to and from the messaging middleware you bind to.
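As a rough illustration, a minimal Spring Cloud Stream processor using the functional binding model might look like the sketch below. The class and function names are arbitrary, and the actual broker is chosen purely by the binder dependency on the classpath (e.g. the RabbitMQ or Kafka starter):

```java
import java.util.function.Function;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class UppercaseProcessorApplication {

  public static void main(String[] args) {
    SpringApplication.run(UppercaseProcessorApplication.class, args);
  }

  // Bound by Spring Cloud Stream to an input and an output destination on the
  // chosen binder (RabbitMQ, Kafka, ...); the application code never touches broker APIs.
  @Bean
  public Function<String, String> uppercase() {
    return String::toUpperCase;
  }
}
```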
Spring Cloud Task doesn't bind your application to a messaging middleware. Instead, it provides abstractions and lifecycle management to run your ephemeral or finite-duration applications (tasks). It also provides the foundation for developing Spring Batch applications.
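A Spring Cloud Task application, by contrast, is a short-lived Spring Boot app. A minimal sketch (class and bean names are arbitrary):

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
public class SampleTaskApplication {

  public static void main(String[] args) {
    SpringApplication.run(SampleTaskApplication.class, args);
  }

  // Runs once; Spring Cloud Task records the start time, end time and exit code
  // in its task repository, and the JVM exits when the runner completes.
  @Bean
  public CommandLineRunner work() {
    return args -> System.out.println("Finite workload executed");
  }
}
```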
However, both can be made recurring (a stream is by definition recurring, while a task can run at a given time interval)
A task application can be triggered or scheduled to make it recurring, whereas a streaming application is long-running, not recurring.
In addition, both can be configured to communicate with the rest of the pipeline through the message broker.
Though a task application can be configured to communicate with a messaging middleware, the concept of a pipeline is different for streams vs. tasks (batch). For streaming applications, the pipeline refers to communication via the messaging middleware, while for task applications, the concept of composed tasks lets you create a conditional workflow of multiple task applications. For more information on composed tasks, you can refer to the documentation.
I have a Spring Boot 1.4.3 project. Recently I have come up with a requirement where I have to send logs from the server to my web application and print the logs on the web page. I am aware of WebSockets, but I was looking for better solutions and I came across reactive programming and gRPC.
Spring supports reactive programming in Spring 5, but I am quite confused between gRPC and reactive programming. gRPC features bi-directional streaming, is built on top of Netty, and, like reactive programming, can push data from the server to clients. So which one should I use? If you can clear up this confusion it would be really great.
Also, if I move to Spring Boot 2, which supports Spring 5, the project will run on Netty. My confusion is: do I have to run my application in different containers, for example the normal REST endpoints on a Jetty server and the reactive API on a Netty server, or will Spring handle this for me out of the box by serving reactive requests on Netty and the remaining general REST API on Jetty? As far as I know, Netty is not a Servlet container.
One key feature of reactive programming is back pressure, and back pressure that's implemented in a non-blocking manner. At the time of writing, gRPC doesn't support this.
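To make the back pressure point concrete, here is a small Project Reactor sketch (the class name and batch size are arbitrary) in which the subscriber, not the publisher, decides how many elements it is ready to receive at a time:

```java
import org.reactivestreams.Subscription;
import reactor.core.publisher.BaseSubscriber;
import reactor.core.publisher.Flux;

public class BackpressureDemo {

  public static void main(String[] args) {
    Flux.range(1, 100)
        .subscribe(new BaseSubscriber<Integer>() {
          @Override
          protected void hookOnSubscribe(Subscription subscription) {
            request(10); // ask the publisher for only 10 elements up front
          }

          @Override
          protected void hookOnNext(Integer value) {
            System.out.println("received " + value);
            if (value % 10 == 0) {
              request(10); // request the next batch only when ready for more
            }
          }
        });
  }
}
```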
There's also much more to reactive programming than the communication between a client and a server. To be truly reactive, you need to be reactive from end to end. This includes reactive access to your data store, etc. As far as I know, this isn't something that's tackled by gRPC.
You shouldn't try to mix the use of a traditional Servlet-based web framework (such as Spring MVC) with use of WebFlux (Spring's reactive web framework). You should either write a 100% reactive web application or a 100% Servlet-based web application.
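Applied to the original requirement (pushing log lines to a web page), a fully reactive approach with WebFlux could stream them as server-sent events. A sketch, with the log source faked by an interval (a real application would bridge its logging appender into the Flux):

```java
import java.time.Duration;

import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import reactor.core.publisher.Flux;

@RestController
public class LogStreamController {

  // Streams one "log line" per second to the browser as server-sent events;
  // WebFlux writes each element as it is produced instead of buffering a full response.
  @GetMapping(value = "/logs", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
  public Flux<String> streamLogs() {
    return Flux.interval(Duration.ofSeconds(1))
               .map(i -> "log line " + i);
  }
}
```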
I am building a Spring Cloud-based microservice ML pipeline.
I have a data ingestion service that (currently) takes in data from SQL; this data needs to be used by the prediction service.
The general consensus is that writes should use async message-based communication using kafka/rabbitmq.
What I am not sure about is how do I orchestrate these services?
Should I use an API gateway that invokes ingestion that starts the pipeline?
Typically you would build a service with REST endpoints (Spring Boot) to ingest the data. This service can then be deployed multiple times behind an API gateway (Zuul, Spring Cloud) that takes care of routing. This is the default Spring Cloud microservices setup. The ingest service can then convert the data and produce it to RabbitMQ or Kafka. I recommend using Spring Cloud Stream for the interaction with the queue; it is an abstraction on top of RabbitMQ and Kafka that can be configured using starters/binders.
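As a sketch of that ingest service, assuming a recent Spring Cloud Stream version with `StreamBridge` (the endpoint path and the `ingest-out-0` binding name are made up):

```java
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class IngestController {

  private final StreamBridge streamBridge;

  public IngestController(StreamBridge streamBridge) {
    this.streamBridge = streamBridge;
  }

  // REST endpoint sitting behind the gateway; forwards the (possibly converted)
  // payload onto the binder destination (RabbitMQ or Kafka) for the prediction
  // service to consume asynchronously.
  @PostMapping("/ingest")
  public ResponseEntity<Void> ingest(@RequestBody String record) {
    streamBridge.send("ingest-out-0", record);
    return ResponseEntity.accepted().build();
  }
}
```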
Spring Cloud Data Flow is a declarative approach to orchestrating your queues, and it also takes care of deployment on several cloud services/platforms. It can be used as well, but might add extra complexity to your use case.
I have two Spring Boot apps running on the same local network and they need to communicate with each other. An obvious answer is to leverage a REST API and make HTTP calls, but I would like to use the Spring Integration project for this purpose.
That said, I have several architectural questions regarding that:
Should I set up a standalone messaging broker (e.g. RabbitMQ), or would an embedded one also work (i.e. messaging embedded in one of the two apps)?
If standalone, what messaging framework should I choose: ActiveMQ, RabbitMQ or something else?
Welcome to the Messaging Microservices world!
You are going the right way, but forget about embedded middleware if you are going to production, especially when your application will be distributed geographically.
So, you are right: you need a message broker, and it should definitely be an external one.
Which one is better for your purpose is really your choice. For example, you can even take Apache Kafka or Redis into account.
Since we are talking about Spring Integration here, it might be better for you to consider using our newer product, Spring Cloud Stream.
With that, you just have your applications as Spring Boot microservices which connect to the external middleware transparently to the application. You just deal with message channels in the application!
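A rough sketch of what that looks like with Spring Cloud Stream's functional model follows; the application, bean, and destination names are made up, and the two classes live in two separate Spring Boot projects that only share the broker and a destination name (e.g. via `spring.cloud.stream.bindings.heartbeat-out-0.destination=app-events` and `spring.cloud.stream.bindings.onMessage-in-0.destination=app-events`):

```java
// App A (its own project): periodically publishes to the shared destination.
import java.util.function.Supplier;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class ProducerApp {

  public static void main(String[] args) {
    SpringApplication.run(ProducerApp.class, args);
  }

  // Polled by the binder and sent to the configured destination.
  @Bean
  public Supplier<String> heartbeat() {
    return () -> "ping from app A";
  }
}
```

```java
// App B (its own project): consumes from the same destination via the binder.
import java.util.function.Consumer;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class ConsumerApp {

  public static void main(String[] args) {
    SpringApplication.run(ConsumerApp.class, args);
  }

  // Invoked for every message arriving on the bound destination.
  @Bean
  public Consumer<String> onMessage() {
    return msg -> System.out.println("App B received: " + msg);
  }
}
```

Neither application contains any RabbitMQ- or Kafka-specific code; swapping the broker is a matter of changing the binder dependency and configuration.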