How to best share and store enum-like data between microservices? - java

Situation:
A microservices architecture with:
Microservices are written in Java with Spring Boot and Hibernate
An options-service which provides information about Options via a REST interface
Option data which could be modelled e.g. via an enum as follows (representative)
Each Option has a few attributes associated with it, such as an enabled status. It is unlikely that further attributes which have to be directly tied to an Option will be added in future.
New Options and OptionTypes are added seldom. They will never be deleted (just disabled if need be).
An Option should have a unique identifier which can be referenced
There should be no UNKNOWN option if possible
enum OptionType {
    TYPE_A,
    TYPE_B,
    TYPE_C
}

enum Option {
    TYPE_A_X1(OptionType.TYPE_A),
    TYPE_A_X2(OptionType.TYPE_A),
    TYPE_B_Z1(OptionType.TYPE_B, false),
    TYPE_B_Z2(OptionType.TYPE_B),
    TYPE_C_U1(OptionType.TYPE_C),
    TYPE_C_U2(OptionType.TYPE_C),
    TYPE_C_U3(OptionType.TYPE_C);

    private final OptionType type;
    private final boolean enabled;

    Option(OptionType type) {
        this(type, true);
    }

    Option(OptionType type, boolean enabled) {
        this.type = type;
        this.enabled = enabled;
    }
}
Other microservices (currently 3) need to be able to access Option data. They need to know which Options exist and somehow reference an Option e.g. via its name or identifier
One of those services (example-service) needs to provide the Option data type as filter settings in its own REST interface to the outside world. The filter object in JSON would look something like this:
{
    "typeA": "TYPE_A_X1",
    "typeB": "TYPE_B_Z2",
    "typeC": [ "TYPE_C_U1", "TYPE_C_U2" ]
    // more filter settings
}
Different approaches of storing and sharing this Option data between microservices as I see it:
options-service stores Option data in its own database. If the data is read from the database into Hibernate entities, Option is just a String everywhere from there on.
Pro:
Easy to rename, add and remove Options
Con:
No type safety when working with Option in code e.g. when deserialising a response containing Option
example-service cannot easily offer Option data in its OpenAPI doc (just Strings)
Microservices need to query and cache Option data from options-service
Option data only lives in source code in an enum as e.g. modelled above and is shared between different services via a lib.
Pro:
Type safety everywhere Options are needed. Really useful when deserializing responses containing Option data, but also for generating the OpenAPI doc
Microservices can still reference an Option in its database via its name since it is unique
Con:
Editing the name of an Option is difficult
Removing an Option not possible
If a new Option/OptionType is added, the order in which the services relying on the lib update their lib version matters, since a service on an older lib version cannot deserialize responses containing an Option it does not know.
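One way to soften that update-ordering problem, as a sketch: instead of Enum.valueOf (which throws on unknown names), a tolerant lookup can return an empty Optional for names the local lib version does not know yet. The enum is abbreviated and the helper name is made up:

```java
import java.util.Optional;

enum Option {
    TYPE_A_X1, TYPE_A_X2, TYPE_B_Z1; // abbreviated for the sketch

    // Returns empty instead of throwing when the name is unknown,
    // e.g. because the sending service is already on a newer lib version.
    static Optional<Option> fromName(String name) {
        for (Option o : values()) {
            if (o.name().equals(name)) {
                return Optional.of(o);
            }
        }
        return Optional.empty();
    }
}
```

The caller then decides per call site whether an unknown name is an error or can be skipped, without a global UNKNOWN constant.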
There is also the possibility of a mixed database and enum solution which comes with the big drawback that one has to maintain both sources of truth.
What is the best way to store and share the Option data between microservices? What is best practice?

While enums are very helpful in most programming languages, they can become a nightmare in inter-service communication.
If you use an enum in your microservice endpoints, it becomes part of the API specification. Changing them on one side will result in problems of serialization/deserialization on the other side.
So you should only use enums in your API if
you are definitely sure they will never change (e.g. weekdays) or
you are the owner of the code of all participating services and are able to orchestrate a big bang redeployment.
In these cases, you may create a shared lib between your services.
If this does not apply, make your life easier and treat most enum-like data (e.g. currency codes, country codes) as simple string values that are subject to change. If you want central management for these values, why not create a microservice that acts as the master? That's what microservice architectures are made for. Offer some endpoints that other services can query from time to time. You don't even need a persistence layer for that.
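A sketch of how small such a master service can be: an in-memory registry behind a REST endpoint. Class, method and endpoint names here are illustrative, and the Spring controller wiring is omitted:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// In-memory registry that an options-service could expose through a REST
// endpoint (e.g. GET /options); no persistence layer is needed, since the
// values live only in the master's code.
class OptionRegistry {

    // option name -> enabled flag
    private final Map<String, Boolean> options = new LinkedHashMap<>();

    void register(String name, boolean enabled) {
        options.put(name, enabled);
    }

    // The payload a GET /options/enabled endpoint might serialize.
    List<String> enabledOptions() {
        return options.entrySet().stream()
                .filter(Map.Entry::getValue)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    boolean isKnown(String name) {
        return options.containsKey(name);
    }
}
```

Consumers then only need an HTTP GET plus a locally cached copy of this list.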

Another option: use the first approach, but with a specialised class.
This allows for type safety,
but does not require the possible values to be known at compile time.
Still, if you need to check the validity of an Option instance, you need a service that has cached the list of allowed Options.
Example code:
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonValue;

public class Option {

    @JsonValue
    private final String value;

    @JsonCreator
    Option(String value) {
        this.value = value;
    }
}
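As noted above, validity checks then need the list of allowed Options cached from the options-service; a minimal sketch of that check, with made-up class and method names:

```java
import java.util.Set;

// Validity check for a string-backed Option against the set of allowed
// values cached from the options-service. Names are illustrative.
class OptionValidator {

    private final Set<String> cachedAllowedValues;

    OptionValidator(Set<String> cachedAllowedValues) {
        this.cachedAllowedValues = cachedAllowedValues;
    }

    // In practice the cached set would be refreshed periodically
    // from the options-service endpoint.
    boolean isValid(String optionValue) {
        return cachedAllowedValues.contains(optionValue);
    }
}
```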

First option:
Define a new module/microservice/project that contains only what is common and reusable across your Spring Boot microservices: common messages, String constants, enums, exceptions and so on. Each microservice then pulls in the shared Option enum as a dependency via its pom.xml.
Note that enums defined this way are not parameterisable at runtime, so you cannot create, update or delete values without a new release.
Second option:
Store the Option data in the database as entity objects, so you can manage it.
If other microservices want to use this data, they can either call your service's REST interface, or declare the artifact as a dependency in their pom.xml and use your repository directly.

Related

MVC with JavaFX without client-server-architecture

I've got a pretty general question.
For my work I have to implement a demonstrator application using MongoDB, Java and JavaFX (and yWorks). Since I unfortunately have to work alone on this and there is not much know-how in our company, all I did was study and learn these technologies by myself.
And since we don't want a server application, there is only MongoDB as a service and the client working on the data. (This is okay, since it's only a demonstrator.) But I'm kind of confused.
Implementing POJO classes to store and load from the database, and implementing GUI model classes with the exact same properties but using JavaFX's SimpleStringProperty, led to the, in my opinion, weird fact that I have two semantically identical model classes, and I have to implement some kind of Observer/Observable pattern to propagate changes from the ViewModels to the POJOs and vice versa.
There must be a more elegant way to do this, right?
Any ideas on this?
Analysis
Just the analysis; skip to the proposition if you don't want the details.
Whether or not to duplicate the model class definition is an architectural choice, but both of these solutions are possible and acceptable:
1. Use your model bean in the controller and the screen.
2. Duplicate all your classes, with from/to helper methods for the conversions.
Since the duplication is possible but trivial, I will only describe how to use the models directly. Here there are again two solutions:
1.1 Manually bind your attributes (the simpler but not the more elegant way): create Observable SimpleStringProperty instances on the fly when binding read-only, or use helpers to add a listener on the screen observable that calls the regular setter when a value is modifiable.
1.2 Make your data framework use getters/setters and not fields: in most cases you can configure this (that is the case for Hibernate). So if MongoDB can serialize/deserialize objects using methods and not fields, it will be possible.
I assume here that you use a standard JavaFX bean with three accessors on each attribute: get, set, property.
Proposition
Sorry for the big intro; here is a proposed solution using Jackson.
First, it is apparently possible to use MongoDB with Jackson: Efficient POJO mapping to/from Java Mongo DBObject using Jackson
Then here is example code of a JavaFX bean with Jackson:
import com.fasterxml.jackson.annotation.JsonAutoDetect;
import com.fasterxml.jackson.annotation.JsonAutoDetect.Visibility;
import javafx.beans.property.SimpleStringProperty;
import javafx.beans.property.StringProperty;

@JsonAutoDetect(fieldVisibility = Visibility.NONE, getterVisibility = Visibility.PUBLIC_ONLY, setterVisibility = Visibility.PUBLIC_ONLY)
public class MyBean {

    private final StringProperty label = new SimpleStringProperty();

    public final StringProperty labelProperty() {
        return this.label;
    }

    public final String getLabel() {
        return this.labelProperty().get();
    }

    public final void setLabel(final String label) {
        this.labelProperty().set(label);
    }
}
If you are using Morphia, you will have to check whether the same is possible there.
Side note:
I'm asking a similar question about non-duplication of objects at the link below (I am in a 3-tier rather than a 2-tier setup, but it is still a duplication problem); the usual solution there is also duplication:
https://softwareengineering.stackexchange.com/questions/367768/how-to-avoid-dto-on-a-client-server-application?noredirect=1#comment804724_367768

Spring/Hibernate Same Entity different database structure

I have an app used in 3 different countries. It is almost all the same, with a few country dependent exceptions, for example an order would have salesTax in USA, but not in the UK. Each country has its own database, completely separate to each other, with slight differences in the table structure for these country specific properties. The table names are always identical, as are the columns that all countries share; it is only the extra columns that make them different.
Currently as a result I actually have 3 separate applications, with about 80% identical code. Every time I develop something, I need to copy it into each individual application which is a bit laborious.
My plan is to try and have a global application, and then extend this with the 3 much smaller country specific ones.
For my services and daos this can be handled by my interfaces being global, and any implementations that vary between the apps being local. The local apps will pull in the global as a dependency.
However, my problem is with the business objects. With the order as an example again, many of my services and daos use the Order object, and they would be global since the code is identical across countries. So I need to have some sort of GLOBAL Order object. But I also want to make sure that depending on the country, the extra properties are available and persisted if I go and save them again.
As an example, I have a service that just checks the tracking of an order and marks it as Delivered. This service is global, so would have access to the GLOBAL Order object. it would call orderDao.getOrder(id) thanks to the orderDao interface being global. The orderDaoImpl would be local, so the getOrder(id) method would actually return a localized Order instance, complete with extra fields. When this is passed up through the layers it reaches the tracking service again, which is expecting a GLOBAL Order, meaning the localized fields are not available (this is OK because this trackingService doesn't care about them, if it did, the implementation would be localized).
When I update the status of this GLOBAL Order I need to make sure that when it gets saved the localized properties are not lost.
So my big question is how can I make this work?
Option 1: Is there any way for me to define some "selective transiency"? So I have a single Java Object with all possible properties. When it is used in each application, if any of those properties don't exist in the database table just ignore it and carry on.
OR
Option 2: Is there a way for me to use some abstract/interfaced class that could be used at both global and local level, with local apps automatically casting the object as the implemented subclass?
It is worth pointing out that the GLOBAL app as such would never actually be run by itself; it would only ever be used within the local apps. But it obviously can't have references to the local classes.
Cheers for any assistance.
Might not be the most elegant solution, but you could make a global order class that isn't mapped as an @Entity and include it in your regional orders with @Embedded:
@Embeddable
class GlobalOrder {

    @Column(name = "total")
    private Integer orderTotal;
    ...
}

@Entity
@Table(name = "emeaorder")
class EMEAOrder {

    @Embedded
    private GlobalOrder globalOrder;

    @Column(name = "tax")
    private Integer euSalesTax;
    ...
}
OK. I think I have come up with the least elegant, but workable solution to this.
The problem is letting my global apps know about the local business objects such as Orders, without having them mapped to columns that may not exist in every country.
So, I have included them in my global project, with every possible attribute and its getters/setters.
I then ALSO include them in my local projects, with only the fields needed in that particular country.
Since the global apps are never actually used by themselves, whenever I build the library using Maven I exclude any of the business objects that also exist in my local projects. When a local app then includes the library as a dependency there are no conflicts, only the local version of these objects is included, and Hibernate doesn't complain about missing columns.
Not clean, but it works!

Generating code for converting between classes

In one of the projects I'm working on, we have different systems.
Since those systems should evolve independently, we have a CommunicationLib to handle communication between them.
CommunicationLib objects are not used inside any system, but only in communication between systems.
Since much functionality requires data retrieval, I am often forced to create "local" system objects that are equal to the CommLib objects. I use a converter utility class to convert from such objects to CommLib objects.
The code might look like this:
public static CommLibObjX objXToCommLib(objX p) {
    CommLibObjX b = new CommLibObjX();
    b.setAddressName(p.getAddressName());
    b.setCityId(p.getCityId());
    b.setCountryId(p.getCountryId());
    b.setFieldx(p.getFieldx());
    b.setFieldy(p.getFieldy());
    [...]
    return b;
}
Is there a way to generate such code automatically? Using Eclipse or other tools? Some field might have a different name, but I would like to generate a Converter method draft and edit it manually.
try Apache commons-beanutils:
BeanUtils.copyProperties(b, p);
It copies property values from the origin bean (the second argument) to the destination bean (the first argument) for all cases where the property names are the same.
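For illustration, a rough plain-reflection sketch of what copyProperties does under the hood. It is greatly simplified compared to the real BeanUtils: no type conversion, no boolean isX() getters, no special null handling; the demo bean names are made up:

```java
import java.lang.reflect.Method;

class ShallowCopier {

    // Copies the value of every getX() on the source into a matching
    // setX(value) on the target, pairing them by property name.
    static void copyMatchingProperties(Object source, Object target) throws Exception {
        for (Method getter : source.getClass().getMethods()) {
            if (!getter.getName().startsWith("get")
                    || getter.getParameterCount() != 0
                    || getter.getName().equals("getClass")) {
                continue;
            }
            String setterName = "set" + getter.getName().substring(3);
            for (Method setter : target.getClass().getMethods()) {
                if (setter.getName().equals(setterName) && setter.getParameterCount() == 1) {
                    setter.invoke(target, getter.invoke(source));
                }
            }
        }
    }
}

// Two tiny demo beans, analogous to objX and CommLibObjX above.
class SourceBean {
    public String getAddressName() { return "Main St"; }
}

class TargetBean {
    private String addressName;
    public void setAddressName(String addressName) { this.addressName = addressName; }
    public String getAddressName() { return addressName; }
}
```

Properties that exist on only one side are simply skipped, which is why fields with different names still need the manual converter draft mentioned in the question.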
If you feel the need to have source code automatically generated, you are probably doing something wrong. I think you need to reexamine the design of the communication between your two "systems". How do these "systems" communicate?
If they are on different computers or in different processes, design a wire protocol for them to use, rather than serializing objects.
If they are classes used together, design better entity classes, which are suitable for them both.

Rails application, but all data layer is using a json/xml based web service

I have a web service layer that is written in Java/Jersey, and it serves JSON.
For the front-end of the application, I want to use Rails.
How should I go about building my models?
Should I do something like this?
response = api_client.get_user(123)
User user = User.new(response)
What is the best approach to mapping the JSON to the Ruby object?
What options do I have? Since this is a critical part and performance is a factor, I want to know my options. Mapping JSON to a Ruby object, and going from Ruby object => JSON, is a common occurrence in the application.
Would I still be able to make use of validations? Or wouldn't it make sense since I would have validation duplicated on the front-end and the service layer?
Models in Rails do not have to do database operations; they are just normal classes. Normally they are imbued with ActiveRecord magic when you subclass them from ActiveRecord::Base.
You can use a gem such as Virtus that will give you models with attributes. And for validations you can go with Vanguard. If you want something close to ActiveRecord but without the database and are running Rails 3+ you can also include ActiveModel into your model to get attributes and validations as well as have them working in forms. See Yehuda Katz's post for details on that.
In your case it will depend on the data you will consume. If all the datasources have the same basic format for example you could create your own base class to keep all the logic that you want to share across the individual classes (inheritance).
If you have a few different types of data coming in you could create modules to encapsulate behavior for the different types and include the models you need in the appropriate classes (composition).
Generally though you probably want to end up with one class per resource in the remote API that maps 1-to-1 with whatever domain logic you have. You can do this in many different ways, but following the method naming used by ActiveRecord might be a good idea, both since you learn ActiveRecord while building your class structure and it will help other Rails developers later if your API looks and works like ActiveRecords.
Think about it in terms of what you want to be able to do to an object (this is where TDD comes in). You want to be able to fetch a collection Model.all, a specific element Model.find(identifier), push a changed element to the remote service updated_model.save and so on.
What the actual logic inside these methods will have to be depends on the remote service. But you will probably want each model class to hold a url to its resource endpoint, and you will definitely want to keep the logic in your models. So instead of:
response = api_client.get_user(123)
User user = User.new(response)
you will do
class User
  ...
  def self.find(id)
    @api_client.get_user(id)
  end
  ...
end

User.find(123)
or more probably
class ApiClient
  ...
  def self.uri(resource_uri)
    @uri = resource_uri
  end

  def self.get(id)
    # basically whatever code you envisioned for api_client.get_user
  end
  ...
end

class User < ApiClient
  uri 'http://path.to.remote/resource.json'
  ...
  def self.find(id)
    get(id)
  end
  ...
end

User.find(123)
Basic principles: Collect all the shared logic in a class (ApiClient). Subclass that on a per resource basis (User). Keep all the logic in your models, no other part of your system should have to know if it's a DB backed app or if you are using an external REST API. Best of all is if you can keep the integration logic completely in the base class. That way you have only one place to update if the external datasource changes.
As for going the other way, Rails have several good methods to convert objects to JSON. From the to_json method to using a gem such as RABL to have actual views for your JSON objects.
You can get validations by using part of the ActiveRecord modules. As of Rails 4 this is a module called ActiveModel, but you can do it in Rails 3 and there are several tutorials for it online, not least of all a RailsCast.
Performance will not be a problem except for what you incur when calling a remote service; if the network is slow, you will be too. Some of that can probably be helped with caching (see another answer by me for details), but that is also dependent on the data you are using.
Hope that put you on the right track. And if you want a more thorough grounding in how to design these kind of structures you should pick up a book on the subject, for example Practical Object-Oriented Design in Ruby: An Agile Primer by Sandi Metz.

Dynamically create table and Java classes at runtime

I have a requirement in my application. My tables won't be defined beforehand.
For example, if a user creates a form by name Student, and adds its attributes like name, roll no, subject, class etc, then on runtime, there should be a table created by name student with columns name, roll no, subject, class etc. And also its related class and its Hibernate mapping file.
Is there any way of doing so?
Thanks in advance,
Rima Desai
Hibernate supports dynamic models, that is, entities that are defined at run-time, but you have to write out a mapping file. You should note a couple things about dynamic models:
You may be restricted in how you define these at run-time (viz. you will have to use the Session directly instead of using a helper method from HibernateTemplate or something like that).
Dynamic models are supported using Maps as the container for the fields of an entity, so you will lose typing and a POJO-style API at run-time (without doing something beyond the baked-in dynamic model support).
All of that said, you didn't mention whether it was a requirement that the dynamically defined tables be persistent across application sessions. That could complicate things, potentially.
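For reference, a dynamic model is declared in the mapping file by entity-name instead of a Java class; a minimal sketch, where the entity, table and column names are made up:

```xml
<hibernate-mapping>
    <!-- No Java class: instances are Maps keyed by property name -->
    <class entity-name="Student" table="student">
        <id name="id" type="long" column="id">
            <generator class="native"/>
        </id>
        <property name="name" type="string" column="name"/>
        <property name="rollNo" type="string" column="roll_no"/>
    </class>
</hibernate-mapping>
```

Such a file could be generated at run-time from the user's form definition, which is where the restrictions in the bullet points above come into play.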
It's possible, but it's not clear why you would want to do something like that, so it's hard to suggest any specific solution.
But generally, yes, you can generate database tables, hibernate classes and mappings dynamically based on some input. The easiest approach is to use some template engine. I've used Velocity in the past and it was very good for this task, but there are others too if you want to try them.
EDIT:
Following the OP's clarification, the better approach is to use XML to store the user-defined data.
The above solution is good, but it requires recompiling the application whenever the forms are changed. If you don't want to stop and recompile after each user edit, XML is a much better answer.
To give you some head start:
@Entity
public class UserDefinedFormData {

    @Id
    private long id;

    @ManyToOne
    private FormMetadata formMetadata;

    @Lob
    private String xmlUserData;
}
Given a definition of the form, it would be trivial to save and load data stored as XML.
Add a comment if you would like some more clarifications.
Last week I was looking for the same solution, and I got the idea from the com.sun.tools.javac.Main.compile class: create the entity class using Java IO and compile it using the Java tools. For this, tools.jar needs to be on the CLASS_PATH. I am now looking into run-time Hibernate mapping without a restart.
Someone in a post about this type of requirement said "but it's not clear why you would want to do something like that". The answer is that this requirement comes up in a CMS (Content Management System), and that is what I am building. The code is as below.
public static void createClass() {
    String className = "DynamicTestClass";
    String fileName = className + ".java";
    String methodName = "execute";
    String parameterName = "strParam";
    try {
        // Creates the DynamicTestClass.java file
        FileWriter fileWriter = new FileWriter(fileName, false);
        fileWriter.write("public class " + className + " {\n");
        fileWriter.write("public String " + methodName + "(String " + parameterName + ") {\n");
        fileWriter.write("System.out.println(\"Testing\");\n");
        fileWriter.write("return " + parameterName + " + \" is dumb\";\n}\n}");
        fileWriter.flush();
        fileWriter.close();
        String[] source = { fileName };
        com.sun.tools.javac.Main.compile(source);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
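As a side note, the JDK also exposes a supported compilation API, javax.tools, which avoids depending on the internal com.sun.tools.javac.Main class and on locating tools.jar. A sketch under that assumption, with illustrative class and file names:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

class RuntimeCompiler {

    // Writes a trivial source file and compiles it with the standard
    // javax.tools API; returns true when compilation succeeds.
    static boolean writeAndCompile(Path dir, String className) throws IOException {
        Path source = dir.resolve(className + ".java");
        Files.writeString(source, "public class " + className + " {}");
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler(); // null on a plain JRE
        if (compiler == null) {
            return false;
        }
        return compiler.run(null, null, null, source.toString()) == 0;
    }
}
```

The compiled .class file still has to be loaded (e.g. via a URLClassLoader) before Hibernate can map it, which is the part the answer above is still looking for.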
