Keeping models in sync between a Spring Boot and an Angular application. Alternatives?

In client-server applications built with Spring Boot and Angular, most resources I find explain how to expose a REST endpoint from Spring Boot and consume it from Angular with an HTTP client.
Most of the time, communicating in JSON is recommended, maintaining DTOs (Data Transfer Objects) on both the Angular and the Spring Boot side.
I wonder if people with full-stack experience know of alternatives that avoid maintaining DTOs in both the frontend and the backend, maybe by sharing models between the two ends of the application?

Swagger would be a good tool to use here.
You can take a code-first approach, which generates a Swagger spec from your Java controllers and transfer objects, or a spec-first approach, which generates your Java controllers and transfer objects from a Swagger spec.
Either way, you can then use the Swagger spec to generate a set of TypeScript interfaces for the client side.

As Pace said, Swagger would be a great fit here. Besides giving you good documentation for the API endpoints, it lets you keep object models in sync between frontend and backend. You just use the .json or .yaml Swagger file to generate the services and object models on the frontend side with ng-swagger-gen.
Then put the command for generating services and object models in package.json, so it runs when you build or serve your application, for instance:
...
"scripts": {
  ...
  "start": "ng-swagger-gen && ng serve",
  "build": "ng-swagger-gen && ng build --prod"
  ...
},
...
So after running one of these commands you will have up-to-date object models, and if an object property's name or type changed, or a property was added or removed, you will get a compile error and have to fix it before moving forward.
Note: keep in mind that the services and object models are generated from the Swagger file, so it must always be kept up to date.
PS: I once worked on a project where even the backend code was generated from the Swagger file ;) so changing the Swagger file was all it took.

This is a difficult topic, since we are dealing with two different technology stacks. The only way I see is to generate those objects from a common data model.

Sounds good, doesn't work. It is not just about maintaining DTOs.
Let's say the API changes a String to a List. It is not enough to update your TypeScript DTO from string to string[]: there is logic behind the string manipulation that now needs to handle a list of strings. Personally I do not find it troublesome to maintain DTOs on both sides. It's a small tradeoff for flexibility and cleaner code (you will have different methods in different DTOs).
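The point about the logic behind the type change can be sketched in Java (class and field names are hypothetical). Swapping the field type is the mechanical part; the consumer logic is what actually has to be rethought:

```java
import java.util.List;

public class TagExample {

    // Before the API change: class ArticleDto { String tag; }
    // After: the single value became a collection.
    static class ArticleDto {
        List<String> tags;               // was: String tag;
        ArticleDto(List<String> tags) { this.tags = tags; }
    }

    // Before, the display logic was a simple pass-through:
    //   static String label(ArticleDto dto) { return dto.tag.toUpperCase(); }
    // After the change, the same logic must iterate and join:
    static String label(ArticleDto dto) {
        StringBuilder sb = new StringBuilder();
        for (String tag : dto.tags) {
            if (sb.length() > 0) sb.append(", ");
            sb.append(tag.toUpperCase());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // prints "JAVA, ANGULAR"
        System.out.println(label(new ArticleDto(List.of("java", "angular"))));
    }
}
```

No generator can write the new `label` for you; that is the manual work that remains even with fully synchronized DTOs.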

Related

How to handle DTOs definition between different microservices using Java + Springboot + GraphQL

I know this question has already been asked, but I have a different scenario, with the following structure:
As you can see, the front-end (UI) always calls a GraphQL API Gateway microservice that redirects the request to different microservices depending on the functionality or data required. I simplified the diagram by drawing 3 microservices, but we have more than 30.
When I started working on this project, I noticed that the GraphQL API Gateway microservice repeats many DTOs that are already defined in the other microservices. As a consequence, the gateway is full of Java classes that serve as DTOs (with the same fields and datatypes, and no modifications); basically, developers have been copying a DTO from the user microservice and pasting it into the GraphQL API Gateway microservice. The same pattern was followed for all 30 microservices' DTOs. Additionally, DTOs are copy-pasted across microservices as needed. You might say this should not be an issue, but whenever a microservice requires a DTO structure change, we need to change the DTO in all the other microservices.
Some approaches that my team and I evaluated were to:
Either create a common library exclusively for DTOs that could be reused by many projects. The only concern is that this adds a "dependency" and breaks the "microservices" idea. Additionally, I don't think this fully solves our issue, since we would need to keep that dependency up to date in each microservice.
Or use some utility such as JSON Schema to generate the required DTOs based on our needs.
This is something I have been discussing with the team; most of them would like to go for the JSON Schema approach. However, I would like to know if there is something more efficient from your point of view. I took a look at some approaches like exposing domain objects (https://www.baeldung.com/java-microservices-share-dto), but I do not think they apply to the particular case I am describing in this post.
So, any suggestion or feedback will be highly appreciated. Thanks in advance.
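For concreteness, the shared-library option above could be as small as a plain, framework-free DTO class published in a tiny artifact that the gateway and the owning microservice both depend on (class and field names are hypothetical):

```java
import java.util.Objects;

// Hypothetical DTO living in a small shared artifact (e.g. a "commons-dto"
// jar). It has no framework dependencies, so the shared jar stays
// lightweight and does not couple services to each other's stacks.
public class UserDto {
    private final String id;
    private final String email;

    public UserDto(String id, String email) {
        this.id = id;
        this.email = email;
    }

    public String getId() { return id; }
    public String getEmail() { return email; }

    // Value semantics so gateway-side tests can compare DTOs directly.
    @Override public boolean equals(Object o) {
        if (!(o instanceof UserDto)) return false;
        UserDto other = (UserDto) o;
        return Objects.equals(id, other.id) && Objects.equals(email, other.email);
    }
    @Override public int hashCode() { return Objects.hash(id, email); }

    public static void main(String[] args) {
        // prints "true"
        System.out.println(new UserDto("42", "a@b.c").equals(new UserDto("42", "a@b.c")));
    }
}
```

The versioning concern from the first bullet still applies: each consumer has to bump the shared artifact's version when a DTO changes.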

REST API - Frontend testing

I have a frontend web application developed in Backbone which hits a backend REST API in order to, e.g., download data from a web service into a user-interface table.
In IntelliJ I have set up a Maven project with two modules: one for functional Selenium (WebDriver/Java) tests and a second for REST.
What I am planning to do is to create, under the REST module, a class that can call the relevant REST API JSON method, store what was returned somewhere, and, under the Selenium module, assert that against what the UI table displays. This is a kind of integration test.
But... this is theory. In real life I have doubts whether it could work as I've described, and what I should use to download data from REST. I've been thinking about REST Assured or SoapUI... but maybe you could advise what should be used (and how)?
The best way to deal with this kind of problem is:
1) Generate Java classes for your request and response JSON objects using the link below
2) Populate the request and call the Jersey API to populate the response object
3) Once you get the response object, create one more response object by reading the UI
4) Compare these two objects and assert
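The four steps above can be sketched in plain Java (the `RowDto` name and fields are hypothetical stand-ins for the generated classes):

```java
import java.util.Objects;

public class RowComparisonSketch {

    // Hypothetical DTO standing in for a class generated from the response JSON.
    static class RowDto {
        final String name;
        final int quantity;
        RowDto(String name, int quantity) { this.name = name; this.quantity = quantity; }
        // equals/hashCode give the final assertion something to compare.
        @Override public boolean equals(Object o) {
            if (!(o instanceof RowDto)) return false;
            RowDto other = (RowDto) o;
            return quantity == other.quantity && Objects.equals(name, other.name);
        }
        @Override public int hashCode() { return Objects.hash(name, quantity); }
    }

    public static void main(String[] args) {
        // Step 2: in reality, unmarshal the Jersey/REST Assured response here.
        RowDto fromApi = new RowDto("Widget", 3);
        // Step 3: in reality, build this from the cells Selenium reads in the UI table.
        RowDto fromUi = new RowDto("Widget", 3);
        // Step 4: compare and assert.
        System.out.println(fromApi.equals(fromUi) ? "MATCH" : "MISMATCH");
    }
}
```

The value of the `equals`-based comparison is that a single assertion covers every field, so adding a column to the table only means adding a field to the DTO.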
You may use Jasmine or Dredd for that.
Jasmine is a BDD-oriented framework for JavaScript testing. It's very useful for testing your JavaScript components, which includes calling your API through your web framework.
Dredd offers more than that and a different approach, but it may also be used for this.
You can also get the job done with simple Java + JUnit + Gson unit testing (or even a BDD framework like Concordion), or with some RAML-based tool.
Although they're not the same, they could offer what you need. Other non-framework-based alternatives are Fiddler and SoapUI.

RESTful API with Spring MVC and GWT and overlay types

I have developed a simple Spring MVC RESTful API, and now I have moved on to creating a simple GWT project that performs requests against this API; obviously I chose to communicate by exchanging JSON messages.
When receiving a response I will have to unmarshal it into a POJO.
I am aware that the general approach is to create the so-called 'overlay types', but that looks to me like a mere duplicate of the Java classes I wrote in the API.
So the question is:
why shouldn't I simply create a common API module that contains the common classes used to perform this marshalling/unmarshalling?
I can clearly see that the main benefit is that if any change is needed, you won't also have to change the overlay types.
Assuming that you can define interfaces for your POJOs, you can share those interfaces between client and server (in a common package).
On the server side you have to code the implementations, which are used by the RESTful API.
On the client side, the implementations of those interfaces can be created automatically with generators. For this you can use gwtquery databinding or GWT AutoBeans.
To request your RESTful API, you can use either gwtquery ajax or GWT RequestBuilder.
Each option has its advantages. Normally I use gwtquery because of its simplicity and because its databinding approach is more lightweight; alternatively, with AutoBeans you can create your POJOs using AutoBean factories on both the client and server sides. If you have already developed your backend, though, this is not a goal for you.
A REST response can be consumed by any client, not by one specific client. If I understand your question correctly, you want to build the marshalling and unmarshalling logic inside your REST API. Ideally this violates the Single Responsibility Principle: you might need to change the mapping logic whenever the service changes, so you end up touching two different aspects of an API when only one component requires the change.
Also, the REST API should ideally be designed to be client-agnostic. It is your specific requirement to translate the responses into POJOs, but another client might want to consume them as plain JSON. If you provide an overlay type, your code will be loosely coupled.
If your server-side class (Player, for example) can be serialized/deserialized without any problems, then you can send it to the client side without any overlay type or conversion (serialization to JSON on the server -> transport -> deserialization from JSON on the client). On the client side you can use RestyGWT, for example, to achieve an automatic deserialization process. Overlay types and a conversion process are necessary only in the case where the Player instance cannot be serialized directly (for example, when it is backed by Hibernate).
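A minimal sketch of such a directly serializable Player class (the fields are hypothetical). What typically matters to JSON mappers like Jackson or RestyGWT is a public no-arg constructor, getters/setters for every field, and no proxy-backed state:

```java
public class Player {
    private String name;
    private int score;

    public Player() { }   // no-arg constructor, required by most JSON mappers

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getScore() { return score; }
    public void setScore(int score) { this.score = score; }

    public static void main(String[] args) {
        Player p = new Player();
        p.setName("Alice");
        p.setScore(10);
        // prints "Alice:10"
        System.out.println(p.getName() + ":" + p.getScore());
    }
}
```

If the class instead held a lazily loaded Hibernate collection, this round trip would fail at serialization time, which is exactly the case where an overlay type or conversion step becomes necessary.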

Java server side form validation using DTO and hash map

I am developing an app using the MVC pattern.
Controllers: servlets
Model: I am following the DAO/DTO pattern for accessing the database
View: simple JSP with EL and JSTL
For accessing the database I am using the DAO pattern. I want to put a validation method and a HashMap for error messages inside the DTO classes for validating form data, something similar to Putting validation method and hashmap into DTO.
My question is: is this the right approach? If not, what is the ideal way of doing this?
In summary: I want to know real-world solutions for server-side form validation when using the DAO/DTO pattern. Please help me.
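The approach described in the question could look roughly like this (DTO and field names are hypothetical): the form DTO exposes a validate() method returning a map of field name to error message, the servlet calls it, and if the map is non-empty it forwards back to the JSP, where EL can read the messages.

```java
import java.util.HashMap;
import java.util.Map;

public class RegistrationDto {
    private final String username;
    private final String email;

    public RegistrationDto(String username, String email) {
        this.username = username;
        this.email = email;
    }

    // Returns one error message per invalid field; empty map means valid.
    public Map<String, String> validate() {
        Map<String, String> errors = new HashMap<>();
        if (username == null || username.trim().isEmpty()) {
            errors.put("username", "Username is required");
        }
        if (email == null || !email.contains("@")) {
            errors.put("email", "A valid email address is required");
        }
        return errors;
    }

    public static void main(String[] args) {
        Map<String, String> errors = new RegistrationDto("", "not-an-email").validate();
        System.out.println(errors.size() + " error(s): " + errors.keySet());
    }
}
```

This works, but note that it couples validation rules to the DTO; the Bean Validation annotations mentioned in many real-world projects move those rules into declarative constraints instead.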
I believe you need to treat separately the architecture you're implementing and the frameworks you're using to implement it.
Java has a rich set of tools for working on the three standard tiers of your application, and the choice depends on factors like expected load and server resources; if you have a two- or three-user application, then it is just a matter of taste.
In terms of DAO/DTO you also have options; for example, you can build your data access layer with Hibernate and then use DTOs for your service layer API. In this situation you probably want a tool for mapping between your domain model and your DTOs (for example jDTO Binder).
Another common approach is to use Spring's JdbcTemplate; there you can go a little crazier and use the same domain objects as part of the service layer API.
Finally, the truth is you can do this by the book or do it completely differently; the choice comes down to your scenario, taste and experience.

JAX-RS : Model and Best practices

I have a JAX-RS service (I use Jersey), and now I have to write the client. I was wondering how you usually deal with the model objects.
Do you put your model classes in a separate jar in order to share them between the client and the server? Do you always use DTOs, or do you sometimes (always?) return JPA entities?
The service I have to use (I haven't created it, but I can modify it) often returns entities, so I was wondering if it wouldn't be a bit weird to externalize those classes.
What do you think? What do you usually do?
It depends on the complexity of the project and on the purpose JAX-RS serves in it:
for very simple projects I wouldn't create one more DTO layer whatsoever
for a project like yours, which seems to use JAX-RS just to move data from a Java client to a Java server, I wouldn't create one more layer either. That's because you are in charge of both ends (client and server) and because you reuse the same objects in both places (putting them in a separate jar and Maven module is a good idea)
for projects that use JAX-RS to expose an API to external clients, it's a good idea to separate the model from the API with DTOs so they can evolve independently. For example, you don't always want to expose all fields via an API, or to break your clients when changing something in the model.
LATER EDIT
for a project that transfers only a subset of its model's data fields to the client, a DTO layer is also useful for efficiency reasons
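The third bullet (and the later edit) can be sketched in plain Java: the entity keeps internal fields, while the DTO returned through JAX-RS carries only the subset clients should see. All names here are hypothetical:

```java
public class EntityToDtoSketch {

    // What JPA manages internally on the server.
    static class UserEntity {
        String id = "42";
        String email = "a@example.com";
        String passwordHash = "xxxx";   // must never leave the server
    }

    // What the API returns to clients: a deliberate subset.
    static class UserDto {
        final String id;
        final String email;
        UserDto(String id, String email) { this.id = id; this.email = email; }
    }

    // The mapping step where the two models are allowed to diverge.
    static UserDto toDto(UserEntity e) {
        return new UserDto(e.id, e.email);  // passwordHash deliberately dropped
    }

    public static void main(String[] args) {
        UserDto dto = toDto(new UserEntity());
        // prints "42 a@example.com"
        System.out.println(dto.id + " " + dto.email);
    }
}
```

Once this mapping exists, renaming or adding an internal entity field no longer breaks external clients; only a change to the DTO does.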
