Controller of microservices in Java on the same system

How do I launch and control multiple microservices on the same system in Java? Is there an existing Java controller that can do this?
I have an application server which consists of multiple microservices that run on the same OS instance/system. Each microservice is Spring Boot based, though there are a few exceptions. I'm looking for an existing controller which will start each service in order and restart any service that fails. I'm not looking for a container-based approach, but rather a controller which runs as a process on Windows.
I don't want to create and maintain Windows service entries for each service, as that is error prone and hard to keep configured correctly. Getting the startup order right is also difficult.
I can write one myself but I'd rather not re-invent the wheel if I can find something that does what I need.
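
For a sense of scale, if nothing off the shelf fits, the core of such a supervisor is not much code. Below is a rough sketch of the idea, assuming each service is described by a command line and started in list order; all class, service, and jar names are made up for illustration, and a real version would add health checks and restart back-off:

    import java.util.ArrayList;
    import java.util.List;

    // Minimal sketch of a service supervisor: starts services in order and
    // restarts any process that exits. Names and structure are illustrative only.
    public class ServiceSupervisor {

        record ServiceSpec(String name, List<String> command) {}

        public static void main(String[] args) throws Exception {
            // Startup order is simply the order of this list (hypothetical services).
            List<ServiceSpec> specs = List.of(
                    new ServiceSpec("config-service", List.of("java", "-jar", "config-service.jar")),
                    new ServiceSpec("orders-service", List.of("java", "-jar", "orders-service.jar")));

            List<Process> running = new ArrayList<>();
            for (ServiceSpec spec : specs) {
                running.add(start(spec));
            }

            // Naive watchdog loop: restart anything that has died.
            while (true) {
                for (int i = 0; i < specs.size(); i++) {
                    if (!running.get(i).isAlive()) {
                        System.out.println(specs.get(i).name() + " exited, restarting");
                        running.set(i, start(specs.get(i)));
                    }
                }
                Thread.sleep(5_000);
            }
        }

        private static Process start(ServiceSpec spec) throws java.io.IOException {
            return new ProcessBuilder(spec.command())
                    .inheritIO()   // forward stdout/stderr to the supervisor's console
                    .start();
        }
    }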

Related

Spring Batch with Spring Boot

Hi folks, I need some opinions here.
I already have a Spring Boot application holding all my REST APIs, running on the Tomcat that ships with spring-boot-starter-web.
I would like to set up jobs using Spring Batch that will be scheduled via Kubernetes. The idea is to share the same business logic instead of creating a standalone batch project where I would have to maintain the business logic twice.
Question: scheduling via Kubernetes means I will be firing java -jar someJar --spring.batch.jobNames=xxx in a container. Doing that will also start up all my REST APIs, right? That seems unnecessary and a waste of resources. Is there any way to mitigate this, or is my understanding wrong?
The way I would implement this is by extracting the common business logic into a separate module and making both the batch app and the webapp depend on that module.
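
As a rough illustration of that split (package and class names below are hypothetical), the shared module holds the business logic and the batch entry point disables the embedded web server entirely, so launching the batch jar never starts Tomcat or the REST controllers:

    import org.springframework.boot.WebApplicationType;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.boot.builder.SpringApplicationBuilder;

    // Entry point for the batch deployment. It reuses the shared business-logic
    // module but runs as a plain (non-web) Spring Boot application, so Tomcat
    // and the REST controllers are never started.
    @SpringBootApplication(scanBasePackages = "com.example.shared") // hypothetical shared module package
    public class BatchApplication {

        public static void main(String[] args) {
            new SpringApplicationBuilder(BatchApplication.class)
                    .web(WebApplicationType.NONE) // no embedded servlet container
                    .run(args);
        }
    }

The same effect can be achieved without code by setting the property spring.main.web-application-type=none when launching the batch jar.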

Microservices - Stubbing/Mocking

I am developing a product using microservices and am running into a bit of an issue. In order to do any work, I need to have all 9 services running on my local development environment. I am using Cloud Foundry to run the applications, but when running locally I am just running the Spring Boot jars themselves. Is there any way to set up a more lightweight environment so that I don't need everything running? Ideally, only the service I am currently working on would need to be real.
I believe this is a matter of your testing strategy. If you have a lot of microservices in your system, it is not wise to always perform end-to-end testing at development time: it costs you productivity, and the setup is usually complex (as you have observed).
You should really think about what it is you want to test. Within one service, it is usually good to decouple the core logic from the integration points with other services. Ideally, you should be able to write simple unit tests for your core logic. If you want to test the integration points with other services, use a mocking library (a quick Google search shows this to be promising: http://spring.io/blog/2007/01/15/unit-testing-with-stubs-and-mocks/).
If you don't already have one, I would highly recommend setting up a separate staging environment with all microservices running. You should perform all your end-to-end testing there, before deploying to production.
This post from Martin Fowler has a more comprehensive take on microservice testing strategy:
https://martinfowler.com/articles/microservice-testing
It boils down to the testing technique that you use. Here is my recent answer in another topic that you may find useful: https://stackoverflow.com/a/44486519/2328781.
In general, I think that WireMock is a good choice for the following reasons:
It has out-of-the-box support in Spring Boot.
It has out-of-the-box support in Spring Cloud Contract, which makes it possible to use a very powerful technique called Consumer Driven Contracts.
It has a recording feature. Set up WireMock as a proxy and make requests through it; this will generate stubs for you automatically based on your requests and responses. A minimal stub example follows below.
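
For reference, stubbing a downstream service with WireMock's Java API takes only a few lines; the port, path, and payload below are arbitrary examples:

    import com.github.tomakehurst.wiremock.WireMockServer;
    import static com.github.tomakehurst.wiremock.client.WireMock.*;

    public class UserServiceStub {

        public static void main(String[] args) {
            // Stand-in for the real user service on a local port.
            WireMockServer server = new WireMockServer(8089);
            server.start();

            // Any GET to /users/42 returns a canned JSON payload.
            server.stubFor(get(urlEqualTo("/users/42"))
                    .willReturn(aResponse()
                            .withStatus(200)
                            .withHeader("Content-Type", "application/json")
                            .withBody("{\"id\": 42, \"name\": \"Test User\"}")));

            // Point the service under development at http://localhost:8089 instead
            // of the real dependency; call server.stop() when done.
        }
    }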
There are multiple tools out there that let you create mocked versions of your microservices.
When I encountered this exact problem myself I decided to create my own tool which is tailored for microservice testing. The goal is to never have to run all microservices at once, only the one that you are working on.
You can read more about the tool and how to use it to mock microservices here: https://mocki.io/mock-api-microservices. If you only want to run them locally, that is possible using the open source CLI tool.
It can be solved if your microservices allow passing metadata along with requests.
A good microservice architecture should use central service discovery, and every service should be able to accept a metadata map along with the request payload. Known fields of this map can be interpreted and modified by a service and then passed on to the next service.
The most popular use of per-request metadata is request tracing (i.e. collecting the tree of nodes used to process a request, along with timings for every node), but it can also be used to tell the entire system which nodes to use.
Thus the plan is (see the sketch after this list):
register your local node in the dev environment's service discovery
send a request to the entry node of your system along with metadata telling everyone to use your local service instance instead of the default one
the metadata will propagate, your local node will be called by the dev environment, and the local node will pass its processed results back to the dev environment
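
What the propagation could look like in a Spring-based service is sketched below; the header name and the way the override value is obtained are invented for illustration, while the interceptor API itself is standard Spring:

    import org.springframework.http.HttpRequest;
    import org.springframework.http.client.ClientHttpRequestExecution;
    import org.springframework.http.client.ClientHttpRequestInterceptor;
    import org.springframework.http.client.ClientHttpResponse;

    import java.io.IOException;

    // Copies a hypothetical routing header onto every outgoing call, so the
    // "use my local instance" hint propagates to downstream services.
    public class RoutingMetadataInterceptor implements ClientHttpRequestInterceptor {

        public static final String ROUTE_HEADER = "X-Route-Override"; // invented header name

        // In a real service this would be read from the incoming request,
        // e.g. stored in a ThreadLocal by a servlet filter.
        private final String routeOverride;

        public RoutingMetadataInterceptor(String routeOverride) {
            this.routeOverride = routeOverride;
        }

        @Override
        public ClientHttpResponse intercept(HttpRequest request, byte[] body,
                                            ClientHttpRequestExecution execution) throws IOException {
            if (routeOverride != null) {
                request.getHeaders().add(ROUTE_HEADER, routeOverride);
            }
            return execution.execute(request, body);
        }
    }

The interceptor would be registered via restTemplate.getInterceptors().add(...), and the service-discovery layer would then consult the header when picking a target instance.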
Alternatively:
use code generation for inter-service communication to reduce the risk of failures caused by mistakes in RPC code
resort to integration tests, mocking all client APIs for the microservice under development
fully automate deployment of your system to your local machine. You will possibly need to run nodes with reduced memory (which is generally OK, as memory is commonly consumed only under load) or buy more RAM.
One approach would be to deploy an app which maps paths/URLs to JSON response files. I personally haven't used it, but I believe http://wiremock.org/ might help you.
For Java microservices, you could try stubby4j. This will mock the JSON responses of other microservices using a Stubby server. If you feel that mocking is not enough to cover all the features of your microservices, you should set up a local Docker environment to deploy the dependent microservices.

Couchbase Cluster and Bucket management

I am developing a server-side app using Java and Couchbase. I am trying to understand the pros and cons of handling cluster and bucket management from the Java code versus using the Couchbase admin web console.
For instance, should I handle creating/removing buckets, indexing, and updating buckets in my Java code?
The reason I want to handle as many Couchbase administration functions as possible is that my app is expected to run on-prem, not as a cloud service. I want to avoid our customers having to learn how to administer Couchbase.
The main reason to use the management APIs programmatically, rather than using the admin console, is exactly as you say: when you need to handle initializing and maintaining yourself, especially if the application needs to be deployed elsewhere. Generally speaking, you'll want to have some sort of database initializer or manager module in your code, which handles bootstrapping the correct buckets and indexes if they don't exist.
If all you need to do is handle preparing the DB environment one time for your application, you can also use the command line utilities that come with Couchbase, or send calls to the REST API. A small deployment script would probably be easier than writing code to do the same thing.
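
As an example of what such an initializer module might look like with the Couchbase Java SDK 3.x (the connection string, credentials, and bucket name are placeholders):

    import com.couchbase.client.java.Cluster;
    import com.couchbase.client.java.manager.bucket.BucketSettings;
    import com.couchbase.client.java.manager.query.CreatePrimaryQueryIndexOptions;

    // One-time database bootstrap: create the bucket and a primary index if they
    // do not already exist. Intended to run on application startup.
    public class CouchbaseInitializer {

        public static void main(String[] args) {
            Cluster cluster = Cluster.connect("couchbase://localhost", "admin", "password");
            String bucketName = "app-data"; // placeholder

            if (!cluster.buckets().getAllBuckets().containsKey(bucketName)) {
                cluster.buckets().createBucket(
                        BucketSettings.create(bucketName).ramQuotaMB(256));
            }

            // Primary index so N1QL queries work out of the box.
            cluster.queryIndexes().createPrimaryIndex(bucketName,
                    CreatePrimaryQueryIndexOptions.createPrimaryQueryIndexOptions()
                            .ignoreIfExists(true));

            cluster.disconnect();
        }
    }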

Programmatically controlling application servers

I'm creating an application that relies heavily on dynamic creation/management of various resources like JMS queues, web service endpoints, JDBC connections... I have a background in Java EE and am currently working on a JBoss 7 server; however, I'm finding it increasingly difficult to control these things programmatically. The hardest thing to control seems to be the web services. I need to be able to generate WSDLs (and XSDs) on the fly, manage the endpoints, SOAP handlers, etc., and the system simply does not seem to be set up to do that.
Other application servers don't seem to offer any groundbreaking solutions either, so I'm wondering whether Java EE is perhaps not the best solution to this particular problem.
Is there an application server that allows you to do just that? Is there another technology that does? Should I just roll a custom solution that integrates all the separate modules (e.g. a jms server, a web server etc...)?
UPDATE
To clarify, most Java EE configuration is accomplished through a mixture of annotations and XML configuration. This, however, assumes that you have a POJO and/or a jar/war/... per resource.
Suppose I have a @WebServiceProvider bean which can be reused for multiple input/output combinations (for example because it dynamically redirects the content). I need to be able to deploy a new "instance" of the provider on the fly. This means I do not want to duplicate the code and redeploy it; I just want to take that one existing bean on the classpath and deploy it multiple times with different configuration settings. This also means I need to manage the WSDL dynamically. The end result should be a web service that works pretty much like a standard web service on the application server, with the necessary integrated security, SOAP handlers, and so on.
I imagine that at some point in the application server code, there must be a class "WebserviceManager" which has a method like "createWebservice(...)" that is actually used by the deployment module whenever it discovers a webservice annotation. I want access to that method and similar methods for creating jdbc connections, jms queues,...
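
For what it's worth, the closest standard API to that hypothetical createWebservice(...) is JAX-WS's Endpoint, which can publish the same provider class several times at different addresses purely from code, though without the container's integrated security and management. A sketch with invented names:

    import javax.xml.transform.Source;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.ws.Endpoint;
    import javax.xml.ws.Provider;
    import javax.xml.ws.Service;
    import javax.xml.ws.ServiceMode;
    import javax.xml.ws.WebServiceProvider;

    import java.io.StringReader;

    // Illustrative only: one @WebServiceProvider class published twice at
    // different addresses, entirely from code and without a redeploy.
    public class DynamicEndpointDeployer {

        @WebServiceProvider
        @ServiceMode(Service.Mode.PAYLOAD)
        public static class DynamicProvider implements Provider<Source> {
            private final String config; // hypothetical per-endpoint configuration

            public DynamicProvider(String config) {
                this.config = config;
            }

            @Override
            public Source invoke(Source request) {
                // Echo a trivial payload; a real provider would dispatch based on 'config'.
                return new StreamSource(new StringReader(
                        "<response config='" + config + "'/>"));
            }
        }

        public static void main(String[] args) {
            Endpoint orders = Endpoint.publish(
                    "http://localhost:8081/ws/orders", new DynamicProvider("orders-config"));
            Endpoint invoices = Endpoint.publish(
                    "http://localhost:8081/ws/invoices", new DynamicProvider("invoices-config"));

            // Endpoints can later be taken down individually, again without a redeploy:
            // orders.stop();
            // invoices.stop();
        }
    }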
You can use OSGi for these kinds of scenarios. It is well suited to hot deployment of various modules.

Fast Multithreaded Online Processing Application Framework Suggestions

I am looking for a pattern and/or framework which can model the following problem in an easily configurable way.
Every 3 minutes or so, I need to have a set of jobs kick off in a web application context that will concurrently hit web services to obtain the latest version of the data and push it off to a database. The problem is that the database will be under heavy read load, as the data is used for tons of complex calculations. We are currently using Spring, so I have been looking at Spring Batch to run this process. Does anyone have any suggestions/patterns/examples of using Spring or other technologies for a similar system?
We have used ServletContextListeners to kick off TimerTasks in our web applications when we needed processes to run repeatedly; a minimal sketch follows after the links below. The ServletContextListener kicks in when the app server starts the application or when the application is restarted. The timer task then acts like a separate thread that repeats your code at the specified interval.
ServletContextListener
http://www.javabeat.net/examples/2009/02/26/servletcontextlistener-example/
TimerTask
http://enos.itcollege.ee/~jpoial/docs/tutorial/essential/threads/timer.html
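
A bare-bones version of that pattern, assuming a Servlet 3.x container (the polling logic itself is just a placeholder):

    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;
    import javax.servlet.annotation.WebListener;
    import java.util.Timer;
    import java.util.TimerTask;

    // Starts a repeating background task when the web application is deployed
    // and cancels it when the application is stopped.
    @WebListener
    public class PollingContextListener implements ServletContextListener {

        private static final long THREE_MINUTES = 3 * 60 * 1000L;
        private Timer timer;

        @Override
        public void contextInitialized(ServletContextEvent sce) {
            timer = new Timer("data-poller", true); // daemon thread
            timer.scheduleAtFixedRate(new TimerTask() {
                @Override
                public void run() {
                    // Placeholder: call the web services and push results to the database.
                    System.out.println("Polling web services for fresh data");
                }
            }, 0, THREE_MINUTES);
        }

        @Override
        public void contextDestroyed(ServletContextEvent sce) {
            if (timer != null) {
                timer.cancel();
            }
        }
    }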
Is refactoring the job out of the web application and into a standalone app a possibility?
That way you could stick the batch job onto a separate batch server (so that the extra load of the batch job wouldn't impact your web application), which then calls the web services and updates the database. The job can then be kicked off using something like cron or Autosys.
We're using Spring-Batch for exactly this purpose.
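
As an illustration of that setup (the job bean and its steps are assumed to be defined elsewhere), a scheduled method can launch the Spring Batch job every three minutes:

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.core.launch.JobLauncher;
    import org.springframework.scheduling.annotation.Scheduled;
    import org.springframework.stereotype.Component;

    // Periodically launches an already-defined Spring Batch job. Requires
    // @EnableScheduling (and the usual batch configuration) elsewhere.
    @Component
    public class DataRefreshScheduler {

        private final JobLauncher jobLauncher;
        private final Job dataRefreshJob; // hypothetical job bean defined elsewhere

        public DataRefreshScheduler(JobLauncher jobLauncher, Job dataRefreshJob) {
            this.jobLauncher = jobLauncher;
            this.dataRefreshJob = dataRefreshJob;
        }

        @Scheduled(fixedRate = 180_000) // every 3 minutes
        public void runJob() throws Exception {
            // A unique parameter so each run gets its own JobInstance.
            jobLauncher.run(dataRefreshJob, new JobParametersBuilder()
                    .addLong("timestamp", System.currentTimeMillis())
                    .toJobParameters());
        }
    }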
The database design would also depend on what the batched data is used for. If it is for reporting purposes, I would recommend separating the operational database from the reporting database, using a database link to pull the required data from the operational database into the reporting database, and then running the complex queries on the reporting database. That way the load is shifted off the operational database.
I think it's also worth looking into integration frameworks like Apache Camel. Also take a look at the so-called Enterprise Integration Patterns; check the catalog, as it might provide you with some useful vocabulary for thinking about the scaling/scheduling problem at hand.
The framework itself integrates really well with Spring.
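
As a taste of the Camel style, a timer-driven route that polls every three minutes could look roughly like this; the endpoint URIs are placeholders, and the http/log steps require the corresponding Camel components on the classpath:

    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.impl.DefaultCamelContext;

    // A minimal Camel route: every three minutes, call a web service and hand
    // the result off for persistence. The URIs here are placeholders.
    public class PollingRoute {

        public static void main(String[] args) throws Exception {
            DefaultCamelContext context = new DefaultCamelContext();
            context.addRoutes(new RouteBuilder() {
                @Override
                public void configure() {
                    from("timer:dataPoll?period=180000")          // fires every 3 minutes
                            .to("http://example.com/api/latest")  // fetch the latest data
                            .to("log:data-poll");                 // placeholder for the database step
                }
            });
            context.start();
            Thread.sleep(Long.MAX_VALUE); // keep the JVM alive for the demo
        }
    }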
