Conditionally rendering an element as a link to implement HAL in Java

I'm trying to implement the HAL standard for JSON in a JAX-RS service. My project consists of Users containing many Projects containing many Nodes containing a variety of data and pointers to other Nodes.
So when an endpoint is hit, I'd like to embed objects one level deep, and link after that:
/user has user data and "_embedded" projects, but those projects only contain "_links" to nodes (and self)
/project/1234 has "_embedded" nodes, but those nodes only have "_links" to further data.
And so on.
Jackson's JsonFilter mechanism looks close, but I'm not quite grasping it. Especially relevant is that a property will sometimes be mapped into a collection under "_embedded" and sometimes under "_links", using different serialization techniques.
Anyone ever try something like this?
There is HalBuilder, but it seems like it requires hand serialization, which I'd like to avoid. Then again, Jackson seems to be almost as much code as hand serializing.

You'll need to implement a custom Jackson JsonSerializer ( http://jackson.codehaus.org/1.7.9/javadoc/org/codehaus/jackson/map/JsonSerializer.html )
Take a look at the Spring HATEOAS project. They have implemented a Jackson extension, HalJacksonModule ( https://github.com/SpringSource/spring-hateoas/commit/61e73107c1213556c025dc8f68a8784daf089796 ), to enable HAL serialization with Jackson. I think you can use it or adapt it to your needs.
Additionally, the "Spring Data REST" project (http://www.springsource.org/spring-data/rest) provides a way to export your JPA model to REST (with HATEOAS) using Spring HATEOAS. You may look at the code for inspiration or simply use the framework in your own code. (Remember to register the HalJacksonModule with the ObjectMapper.)
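Framework questions aside, the embed-one-level-then-link rule itself is simple to express. Below is a minimal, framework-free sketch of that rule (the Resource shape, names, and depth convention are all invented for illustration; in a real service this logic would sit inside a custom Jackson JsonSerializer):

```java
import java.util.*;

// Hypothetical resource tree node: properties plus named child collections.
class Resource {
    final String selfHref;
    final Map<String, Object> properties = new LinkedHashMap<>();
    final Map<String, List<Resource>> children = new LinkedHashMap<>();

    Resource(String selfHref) { this.selfHref = selfHref; }
}

class HalRenderer {
    /** Render a resource: embed children while depth > 0, link after that. */
    static Map<String, Object> render(Resource r, int depth) {
        Map<String, Object> out = new LinkedHashMap<>(r.properties);
        Map<String, Object> links = new LinkedHashMap<>();
        links.put("self", Collections.singletonMap("href", r.selfHref));

        Map<String, Object> embedded = new LinkedHashMap<>();
        for (Map.Entry<String, List<Resource>> e : r.children.entrySet()) {
            if (depth > 0) {
                // Within the depth budget: render the child fully under "_embedded".
                List<Map<String, Object>> rendered = new ArrayList<>();
                for (Resource child : e.getValue()) {
                    rendered.add(render(child, depth - 1));
                }
                embedded.put(e.getKey(), rendered);
            } else {
                // Beyond the budget: emit hrefs under "_links" only.
                List<Map<String, Object>> hrefs = new ArrayList<>();
                for (Resource child : e.getValue()) {
                    hrefs.add(Collections.singletonMap("href", child.selfHref));
                }
                links.put(e.getKey(), hrefs);
            }
        }
        out.put("_links", links);
        if (!embedded.isEmpty()) out.put("_embedded", embedded);
        return out;
    }
}
```

Rendering /user with depth 1 then embeds projects but leaves their nodes as links, which matches the behavior the question describes.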

I have found that the RestExpress library is pretty fantastic, and it includes support for HAL. The author did all the work of building the serialization mechanisms and link building based on one simple configuration.
https://github.com/RestExpress/HyperExpress
https://github.com/RestExpress/HyperExpress/tree/master/hal

Related

Is it possible to use hibernate entity classes as POJOs for GSON?

I'm working on a project where we have a spring boot application that uses spring data and hibernate. Now I want to use the GSON library to work with JSON files. I've read a tutorial where it becomes clear that it is possible to create POJO classes and convert JSON files into objects from those classes. The same thing happens with hibernate.
Now my question: is it possible to design the POJO (or entity) classes in a way that they work both for Hibernate and GSON? Can problems arise if I do it that way?
Thanks in advance!
Edit: Here is the tutorial where I read about POJOs for GSON: tutorials point - GSON
It is possible, but it is not a good design. If you use the entity, for example, to serialize REST interface data, you hard-couple your REST endpoint to the database, and it can lead to security issues (serializing sensitive data), etc. It creates tight coupling that will be harder to undo later. It is always good to create separate models for the database and for other interfaces. You can use a mapping library (for example MapStruct) to map between the models easily.
You actually should not design your Entity or POJO classes based on which library you use (GSON or the Jackson API).
You can refer here for a clear explanation.
https://www.geeksforgeeks.org/convert-java-object-to-json-string-using-jackson-api/
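To make the separate-model advice concrete, here is a minimal sketch (all class and field names are invented; a library like MapStruct would generate the equivalent of UserMapper from an interface). The entity stays private to the persistence layer, and only the DTO is handed to Gson:

```java
// What Hibernate manages (hypothetical example class).
class UserEntity {
    Long id;
    String email;
    String passwordHash; // sensitive: must never reach the JSON layer
}

// What Gson serializes: no sensitive or persistence-only fields.
class UserDto {
    Long id;
    String email;
}

class UserMapper {
    static UserDto toDto(UserEntity e) {
        UserDto dto = new UserDto();
        dto.id = e.id;
        dto.email = e.email; // passwordHash deliberately not copied
        return dto;
    }
}
```

Because the DTO simply has no passwordHash field, no serializer configuration can leak it by accident.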

Keeping models between spring boot and angular application in sync. Alternatives?

In client-server applications built with Spring Boot and Angular, most resources I find explain how to expose a REST endpoint from Spring Boot and consume it from Angular with an HTTP client.
Most of the time, communicating in JSON is recommended, with DTOs (Data Transfer Objects) maintained on both the Angular and Spring Boot sides.
I wonder if people with fullstack experience know of alternatives that avoid maintaining DTOs in both the frontend and backend, perhaps by sharing models between the two ends of the application?
Swagger would be a good tool to use here.
You can take a code-first approach which will generate a swagger spec from your Java controllers & TOs or a spec-first approach which will generate your Java controllers & TOs from a swagger spec.
Either way, you can then use the swagger spec to generate a set of TypeScript interfaces for the client side.
As Pace said, Swagger would be a great fit here. Besides giving you good documentation for the API endpoints, it lets you keep object models in sync between frontend and backend. You just use Swagger's .json or .yaml file to generate the services and object models on the frontend side using ng-swagger-gen.
Then put the command for generating the services and object models in package.json for when you want to build or run your application, for instance:
...
"scripts": {
  ...
  "start": "ng-swagger-gen && ng serve",
  "build": "ng-swagger-gen && ng build --prod"
  ...
},
...
After running one of these commands you will have updated object models, and if an object property's name or type changed, or a property was added or removed, you will get an error that you have to fix before moving forward.
Note: Just keep in mind that the services and object models are generated from the Swagger file, so it should always be kept up to date.
PS: I worked on a project where even the backend code was generated from the Swagger file ;) so they just changed the Swagger file and that was it.
This is a difficult topic, since we are dealing with two different technology stacks. The only way I see is to generate those objects from a common data model.
Sounds good, doesn't work. It is not just about maintaining DTOs.
Let's say the API changes a String to a List. It is not enough to update your TypeScript DTO from string to string[]: there is string-manipulation logic behind it that now needs to handle a list of strings. Personally I don't find it troublesome to maintain DTOs on both sides; it's a small tradeoff for flexibility and cleaner code (you will have different methods in different DTOs).

Configurable Object to Object map Java/Spring

I haven't been able to find any good examples, or a direction to go for this. But essentially I want to be able to create a configurable object to object mapping interface. I don't want to hardcode the fields that should be mapped to one another, but rather give users an interface to be able to say fieldA from objectA maps to fieldB from objectB.
These configuration settings can be persisted in SQL, or an XML file, doesn't really matter to me. This is a Spring Boot application.
I was using Orika for the mapping currently, but I don't know how to make it configurable. Am I on the right track, or does it not have that capability? Would a CustomMapper be what I need to do? Looking for some good examples, or the right direction.
You can give Dozer a try. It provides full automation of the mapping process and allows handling of complicated mapping cases. Usually all this flexibility comes at the price of reduced performance, but that may be good enough in your case.
Mappings are usually set using XML files. In later versions, Dozer also supports mappings via an API and via annotations. Check their website for more info. Dozer also has a Spring framework integration.
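As a rough illustration of what "configurable" means here, this stdlib-only sketch reads the field pairs from data rather than code; Orika's classMap(...).field("fieldA", "fieldB") DSL and Dozer's XML mappings express the same idea. All names below are invented for illustration, and the mapping table could just as well be loaded from SQL or an XML file:

```java
import java.lang.reflect.Field;
import java.util.*;

// A mapper whose field pairs come from configuration data, not code.
class ConfigurableMapper {
    // key: source field name on objectA, value: target field name on objectB
    private final Map<String, String> fieldMap;

    ConfigurableMapper(Map<String, String> fieldMap) { this.fieldMap = fieldMap; }

    void map(Object source, Object target) {
        for (Map.Entry<String, String> e : fieldMap.entrySet()) {
            try {
                // getDeclaredField only looks at the class itself; a fuller
                // version would walk superclasses too.
                Field from = source.getClass().getDeclaredField(e.getKey());
                Field to = target.getClass().getDeclaredField(e.getValue());
                from.setAccessible(true);
                to.setAccessible(true);
                to.set(target, from.get(source)); // copy the value across
            } catch (ReflectiveOperationException ex) {
                throw new IllegalArgumentException("Bad mapping: " + e, ex);
            }
        }
    }
}

// Example classes a user might configure a mapping between.
class SourceBean { String fieldA = "hello"; }
class TargetBean { String fieldB; }
```

A user-facing interface would then only need to write ("fieldA" -> "fieldB") pairs into that map, with no mapping code hardcoded.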

Hibernate objects and GWT-RPC

I want to transfer Hibernate objects with GWT-RPC to the frontend. Of course I cannot transfer the annotated class, because the annotations cannot be compiled to JavaScript, so I did the Hibernate mapping purely in the .hbm.xml file. This worked fine for very simple objects, but as soon as I add more complex things like a one-to-many relationship realized with e.g. a Set, the compiler complains about serialization issues with the Set (even though the objects in the set are serializable).
I guess it doesn't work because Hibernate creates some kind of special Set that cannot be interpreted by GWT?
Is there any way to get around this, or do I need another approach to get my objects to the frontend?
Edit: It seems that my approach is not possible with RPC because Hibernate changes the objects (see the answer from thanos). There is a newer approach from Google to transfer objects to the frontend: RequestFactory. It looks really good and I will try it now.
Edit2: RequestFactory works perfectly and is much more convenient than RPC!
This is a quote from the GWT documentation. It says that Hibernate changes the object from its original form in order to make it persistent.
What this means for GWT RPC is that by the time the object is ready to be transferred over the wire, it actually isn't the same object that the compiler thought was going to be transferred, so when trying to deserialize, the GWT RPC mechanism no longer knows what the type is and refuses to deserialize it.
Unfortunately the only way to implement the solution is by making DTOs and their appropriate converters.
Using Gilead is a cleaner approach (no need for all this DTO code), but DTOs are more lightweight and thus produce less traffic over the wire.
There is also Dozer, which will generate the DTOs for you, so there will not be much need for you to actually write the code.
Either way, as mchq08 said, the link he provided will answer many of your questions.
I would also make another suggestion: separate the projects. Create a new one as the model for your application and include the jar in the GWT project. That way your GWT project is almost entirely the GUI, and the jar library can be re-used for other projects too.
When I created my RPC to Hibernate I used this example as a framework. I would recommend downloading their source code and reading the section called "Integration Strategies", since I felt the "Basic" section did not do DTOs justice. One thing this tutorial does not cover as well is the receiving and sending part from the web page (which converts to JS), so that's why I recommend downloading their source code and looking at how they send/receive the DTOs.
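A minimal sketch of the DTO-and-converter pattern discussed above (class names invented for illustration): the key detail is that the converter copies Hibernate's collection implementations, e.g. PersistentSet, into plain java.util types before the object crosses the GWT-RPC wire.

```java
import java.io.Serializable;
import java.util.*;

// A plain, GWT-serializable DTO with no Hibernate types anywhere.
class ProjectDto implements Serializable {
    String name;
    Set<String> nodeIds = new HashSet<>();
}

class ProjectConverter {
    static ProjectDto toDto(String name, Set<String> hibernateManagedSet) {
        ProjectDto dto = new ProjectDto();
        dto.name = name;
        // Copy into a plain HashSet: passing a Hibernate PersistentSet
        // through as-is is exactly what makes GWT-RPC refuse to serialize.
        dto.nodeIds = new HashSet<>(hibernateManagedSet);
        return dto;
    }
}
```

The converter is also a natural place to cut off lazy-loaded associations that should not travel to the client.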
Post the stack trace and some code that you believe will be useful to solving this error.
Google's GWT & Hibernate
Reading this (and the source code) can take some time but really helps understands their logic.
I used the following approach: for each Hibernate entity class I had a client-side replica without any Hibernate stuff, plus a mechanism for copying data between the client and server classes.
This worked, but I believe current GWT versions should work with Hibernate-annotated classes.
On a client project, I use Moo (which I wrote) to translate Hibernate-enhanced domain objects into DTOs relatively painlessly.

Which web service stack allows binding wsdl first web service to existing classes in Java?

Greetings,
I have a complicated scenario to handle. I have a wsdl file which uses a particular XML schema.
The XML schema is actually a handcrafted implementation of a specification. There is also a Java-based implementation of the same specification. So the XSD used in the WSDL and the Java classes at hand are quite similar, but not exactly the same.
Almost all web service stacks allow creating classes from WSDL or creating WSDL from Java class annotations.
What I want to do is use the WSDL and bind the XSD used in the WSDL to existing Java classes.
Should/can I do this by manually replacing generated Java classes with existing ones? Is it a matter of changing type names in config files and moving binding annotations to existing classes?
If you know any best practices, or Java web service stacks that support this kind of flexibility in a practical way, your response would be much appreciated.
Best Regards
Seref
I suggest Spring's Web Services module, which has no code generation involved but provides a clean separation of concerns. The different concerns are broken out nicely by allowing you to provide your WSDL and existing schema(s) on one side (contract first), your existing Java-based domain model on the other, and a way to plug in your OXM (Object-XML Mapping) technology of choice.
Since you have hand-crafted WSDL/schema and hand-crafted Java classes, the real work will be in configuring your OXM. I prefer JiBX as it keeps the concerns separated (no XML annotation garbage mixed into your domain) with JAXB as a backup if the learning curve looks too steep. Spring Web Services supports several other OXM frameworks, and you can even use several different ones at once.
As far as best-practices, I consider hand-crafted code a best practice, though I may be in the minority. If you generate classes from XML you end up with classes that are simple data containers with no behavior (assuming you want to regenerate them whenever your WSDL/XSD changes). This is bad if you favor the object-oriented paradigm because you end up having to place your "business logic" in utilities/helpers/services etc. instead of in the domain objects where it really belongs. This is one reason I favor JiBX. I can make very nice OO objects with behavior, a nice clean schema that doesn't necessarily match the objects, and can manage changes to either side with a mapping file similar to how hibernate does it for ORM (Object-Relational Mapping). You can do the same with JAXB, but that requires embedding XML structure into your object model, and binds a single XML representation to it (whereas with JiBX you can have many).
MOXy (I'm the tech lead) was designed for instances where you have an existing XML schema and an existing object model. It accomplishes this through XPath-based mapping and can even handle cases where the models are not that similar:
parse google geocode with xstream
MOXy also has an external binding file:
http://wiki.eclipse.org/EclipseLink/Examples/MOXy/EclipseLink-OXM.XML
MOXy is a JAXB implementation with extensions (a couple of which are mentioned above). If you go ahead with Spring, MOXy is configured as a JAXB implementation, and you need to add a jaxb.properties file alongside your model classes with the following entry:
javax.xml.bind.context.factory=org.eclipse.persistence.jaxb.JAXBContextFactory
