Versioning a Java-based REST web service?

There seems to be an ongoing philosophical debate about exactly how to version a REST web service. For me, though, the number one issue is the practical question of how easy or hard it is to implement and maintain in a Java servlet based backend. My company is building a new REST web service, and while we are not concerned with versioning it at this time, I do not want to make an architectural decision that would paint us into a corner.
I guess the primary decision we have to make right now is whether to put the version identifier in our URI or in the media type (or both). In case it's relevant, we will mint only a few new media types. Also, the application has 50+ resource URIs.
What are the pros and cons of each approach relative to implementing them in our Java servlet?
My initial thoughts:
1) I like the idea of versioning the media types (e.g. "application/vnd.mystuff+xml;version=1.0"), because it feels the same as versioning the XML schema for a SOAP/RPC web service. Also, it seems like conneg was designed for just this sort of thing. However, it seems like implementing this would be difficult if not impossible, because our application is built with domain-driven design principles and we can't just create new versions of the Java classes and have them sit alongside the old ones. So we'd have to implement the new Java classes and then hand-code JSON/XML serialization/deserialization to support all the old versions of the media types. I guess this is no different than versioning the XML Schema for SOAP...
2) On the other hand, if we were to version the URIs (e.g. "http://ourapp.com/v1/mystuff"), then our customers could deploy two entire versions of the system and stand them up as two separate servlets with different context mappings. However, we'd have to modify the JPA (ORM) mappings of the v1 Java classes to use the v2 RDBMS schema. This may not always be simple, but I can easily understand how it could be done. This approach bugs me, however, because "application/vnd.mystuff+xml" will return two different XML models corresponding to two different XML schemas. But maybe it doesn't matter, as the client knows what it asked for because it used the v1 or v2 URI... (A rough servlet sketch of both options follows.)
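To make the comparison concrete, here is a purely hypothetical servlet sketch of how we might detect the requested version under either scheme; the servlet class, the media-type handling and the one-line responses are made up, and real serialization is elided:

    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class MyStuffServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // Option 1: version carried in the media type, e.g.
            // Accept: application/vnd.mystuff+xml;version=1.0
            String accept = req.getHeader("Accept");
            String mediaTypeVersion = null;
            if (accept != null && accept.contains("version=")) {
                mediaTypeVersion = accept.substring(accept.indexOf("version=") + 8)
                                         .split("[;,\\s]")[0];
            }

            // Option 2: version carried in the URI, e.g. /v1/mystuff
            boolean v1Uri = req.getRequestURI().contains("/v1/");

            if ("1.0".equals(mediaTypeVersion) || v1Uri) {
                resp.setContentType("application/vnd.mystuff+xml;version=1.0");
                resp.getWriter().write("<mystuff schemaVersion=\"1.0\"/>"); // placeholder for real v1 serialization
            } else {
                resp.setContentType("application/vnd.mystuff+xml;version=2.0");
                resp.getWriter().write("<mystuff schemaVersion=\"2.0\"/>"); // placeholder for real v2 serialization
            }
        }
    }

Either way, the dispatch logic itself is small; the real cost in both cases would be maintaining the old serialization code paths.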
Or is there an easier-to-implement alternative approach we are not considering? Or is it OK to punt on this entire issue and worry about it later?

If the RESTful services are used by mobile applications, it would be a bit difficult to bake in the processing of both responses. In my opinion, keeping the version in the URL is the better way, as it makes it easier to decommission the old endpoints when convenient.


Sharing POJOs between Java backend and an Android application

I am developing an Android application with my friend. I am currently responsible for the backend while she is working on the Android part. The backend is developed in Java using Lambda functions running in the AWS cloud. The frontend and the backend are totally decoupled (the Lambda functions are exposed via REST APIs) except for the POJOs used on both sides. POJOs are serialized by the application into JSON when calling an API and deserialized into the very same POJOs by the backend when handling API requests.
We want to keep POJOs on both sides exactly the same for obvious reasons but we are wondering what the proper way to do it is. We see the following two options:
1) Simply copy the code on both sides. This has the disadvantage that the common code can change independently, which, sooner or later, will lead to a misalignment.
2) Move the POJOs out to a separate library and include it as a dependency on both sides. This seems like a more proper way to solve the issue, but how do we ensure that both my friend and I know that a POJO has been changed? Let's say I remove one field from a POJO and create a new version of the shared library. I push the changes to our repository and then... tell my friend that I made some changes so she should pull them, build the new version and include it in her project?
Is there a different (better) way to address this issue? Currently the backend is built with Maven, but I can switch to Gradle if that would help automate things and keep our code consistent (Android Studio forces Gradle builds).
I found similar questions of other people but they were either a bit different or remained unanswered:
Sharing POJOs between Android project and java backend project
Sharing one java library between Android and Java backend (gradle)
Sharing code between Java backend and Android app
There are certainly lots of other ways of doing this, better or not; I will leave that for you to consider.
But before getting to sharing the POJOs, I ask you to take a step back and look at your architecture. You have essentially got:
a Java backend with REST APIs, supporting JSON payloads
an Android application, capable of making REST calls and deserialising the JSON payloads.
Note that the tech stack above does not involve POJOs at any level.
You see what I mean? POJOs are an implementation detail for you, and it is not wise to share them among your components.
How about looking into the future where you add more components to your architecture, say:
iOS application
Kotlin support for Android application
Will your inclination to share POJO code still be intact? Perhaps not.
From what I see, you should design and develop for a REST backend and a REST-capable client. That's all. That should be the bottom line.
So with that, coming back to your requirement of sharing updates between the backend and the client: you can share the JSON schema between the two instead of sharing the POJOs, and thereafter employ an automated system (say, a simple script) to generate the POJOs in the backend and the client.
This approach can have certain benefits. For instance:
You will be able to share updates now and in the future, as per your requirements.
This also improves your modularity (or decoupling), because the backend and the client are not bound by a requirement to use POJOs. For instance, you can use a data class if you decide to use Kotlin in your client.
You can use a versioned schema in the future, for the times when the client cannot keep up with the backend, or the backend needs to update independently.
and more
Adding to the answer above, I would take advantage of the fact that both languages run on the JVM and can use the same Java APIs. Whether the front end uses Java or Kotlin, you can call any of these API libraries directly from your code.
One API in particular, JSON-B, provides methods for transforming your Java (or Kotlin) objects into JSON for transport, then transforming the JSON response back into Java/Kotlin objects on the other end.
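As a minimal sketch of that round trip (the Message class here is just a made-up example POJO, and a JSON-B implementation such as Eclipse Yasson is assumed to be on the classpath):

    import javax.json.bind.Jsonb;
    import javax.json.bind.JsonbBuilder;

    public class JsonbExample {

        public static class Message {
            public String text;      // public fields keep the example short
            public int priority;
        }

        public static void main(String[] args) throws Exception {
            Message m = new Message();
            m.text = "hello";
            m.priority = 1;

            try (Jsonb jsonb = JsonbBuilder.create()) {
                String json = jsonb.toJson(m);                      // serialize for transport
                Message copy = jsonb.fromJson(json, Message.class); // deserialize on the other end
                System.out.println(json + " -> " + copy.text);
            }
        }
    }

JsonbBuilder.create() simply picks up whichever JSON-B implementation it finds on the classpath.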
One caveat: the javax.* APIs have been migrating to the jakarta.* namespace, and the Java EE modules that used to ship inside the JDK were deprecated in Java 9 and removed in Java 11, so nothing in this space comes for free with a modern JDK. If you are planning on updating Java versions in the future, this is something you will want to consider.
For Java versions 9 or newer, it is worth reading up on this first; it will save you some time.
EDIT: JSON-B is not bundled with the JDK itself; you need the API plus an implementation (such as Eclipse Yasson) on the classpath, but that is a straightforward dependency to add. IMO it is still the preferred option for working with JSON in Java.

Good Design for calling BRMS/Drools Logic over REST

I am working on an application which is built in .NET and Java. The Java component contains the complete rule base, using the Red Hat BRMS suite. The .NET clients (UI and desktop applications) will consult the Java rule engine, sending and receiving JSON data. The decision that has been taken is to expose the rules engine (Red Hat BRMS 6.0.0 using Drools) as a REST-based API. I have come up with the following design approaches:
Write a REST controller in Spring framework and service classes for calling BRMS.
Write a simple REST controller using JBoss's RESTEasy or standard JAX-RS.
Write a Camel adapter, wrap the REST calls behind Camel, and let Camel talk to Drools.
Wrap the REST calls behind SOAP-based web services.
I want to ask which one would be the better approach for designing such a system.
Any other thoughts are welcome.
As might be obvious from https://github.com/gratiartis/sctrcd-payment-validation-web and https://github.com/gratiartis/qzr, my general preference is for exposing my Drools business rules using REST APIs in a Spring application.
The only alternative I would consider from the above list is 4, where the API is exposed through a SOAP web service - albeit definitely not wrapping a JSON REST service! A well-designed Spring application can expose functionality through both REST and SOAP APIs with very little effort.
I have usually exposed functionality via SOAP when working with .NET clients. Firstly, the .NET tooling has excellent support for generating proxies based on a WSDL that you have defined. Secondly, the WSDL forms a well-defined contract which both you and the client developers must obey. Having a strict contract can be very useful in preventing arguments, although if your interface is simple, it may not be so much of a benefit.
The other key reason is that the WSDL does not change unless you change it deliberately. A REST JSON API may seem quick to develop, thanks to Jackson generating everything for you. However, it can expose your internal object model (and dependencies!), meaning that unless you are careful, what seems like a trivial change to an internal model can make private data visible and can break clients.
All that said, if you can keep the API reasonably simple and have a good relationship with the .NET devs (perhaps you're one of them), then going with the Spring REST API would be my recommendation. Feel free to steal code from the github repos if it can help you get started!
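For illustration only, a rough sketch of what option 1 could look like: a Spring REST controller handing a JSON-bound fact to a Drools session and returning it once the rules have fired. The Payment fact class, the URL and the "validationKSession" name (which would need to exist in your kmodule.xml) are invented placeholders:

    import org.kie.api.KieServices;
    import org.kie.api.runtime.KieContainer;
    import org.kie.api.runtime.KieSession;
    import org.springframework.web.bind.annotation.RequestBody;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class RuleController {

        // The KieContainer is expensive to build and thread-safe, so hold one instance.
        private final KieContainer kieContainer =
                KieServices.Factory.get().getKieClasspathContainer();

        @RequestMapping(value = "/rules/payment", method = RequestMethod.POST)
        public Payment validate(@RequestBody Payment payment) {
            KieSession session = kieContainer.newKieSession("validationKSession");
            try {
                session.insert(payment);   // hand the JSON-bound fact to the rule engine
                session.fireAllRules();    // rules may flag or enrich the fact
                return payment;            // serialized back to JSON for the .NET client
            } finally {
                session.dispose();         // stateful sessions must always be disposed
            }
        }

        // Placeholder fact class; in practice this mirrors the agreed JSON contract.
        public static class Payment {
            public String iban;
            public boolean valid;
        }
    }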
btw - If you were to consider Camel, it's worth noting that there is a Drools-Camel component which does quite a bit of the work for you.
In my view:
I would go with option 1. It is the simplest and easiest approach.
Option 2 would be my second choice.
Option 3 looks appropriate only if there are routing rules involved; otherwise it could add complexity.
And definitely not option 4; there is no need to complicate things with SOAP.

How to consume ad hoc web services (non-SOAP, schemaless XML)?

I need to write integrations with multiple external web services. Some of them are SOAP (have a WSDL); some are pretty much ad hoc - HTTP(S), authentication either by basic auth or by parameters in the URL (!), and natural-language-like XML which does not really map nicely onto domain classes.
For now, I've done the spike integrations using Spring Web 3.0 RestTemplate and binding using JAXB2 (Jaxb2Marshaller). Some kind of binding is needed because the domain classes need to be cleaner than the XML.
It works, but it kind of feels bad. Obviously this is partly just because of how the services are built. One minor issue I have is the naming of RestTemplate, as these services have nothing to do with REST. That I can live with. JAXB2 feels a bit heavy, though.
So, I'm looking for some other alternatives. Ideas? I'd like to have a simple solution (so RestTemplate is fine), not too enterprisey..
While some of your services may be schemaless XML, they will still probably have a well-documented API. One of the techniques that the Spring folks seem to be pushing, at least from the web-service server side, is to use XPath/XQuery for retrieving only the information you really need from a request. I know that this may only end up being part of your solution, but I'm not sure that this is a situation where one particular binding framework is going to meet all your needs.
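As a rough sketch of that approach, using plain RestTemplate plus the JDK's built-in XPath support (the URL and the XPath expression are invented for illustration):

    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathFactory;
    import org.springframework.web.client.RestTemplate;
    import org.w3c.dom.Document;
    import org.xml.sax.InputSource;

    public class AdHocClient {

        public static void main(String[] args) throws Exception {
            // Fetch the raw XML as a String; no binding classes involved.
            RestTemplate template = new RestTemplate();
            String xml = template.getForObject(
                    "https://example.com/service?user=foo&pass=bar", String.class);

            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new InputSource(new StringReader(xml)));

            // Pull out only the values you care about and map them onto
            // your own clean domain classes.
            XPath xpath = XPathFactory.newInstance().newXPath();
            String status = xpath.evaluate("/response/status/text()", doc);
            System.out.println("status = " + status);
        }
    }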
If I understand correctly, you have one application that has to make calls to various external (web) services using different technologies. The first thing that comes to mind is to have some intermediate level. While this could be something as elaborate as an ESB solution, my guess is that is not what you're looking for.
You could, for example, achieve this intermediate level by having a class hierarchy with an interface 'Consumer' at its top, a method doConsume() to be implemented, and so on.
If you look into it you'll probably have the opportunity to make use of several design patterns, like Strategy or Template Method. Remember to be proactive and ask 'What if...' a few times (as in: what if they need me to consume yet another service?).
If JAXB feels too heavy, there are other APIs to be found:
Axis
JAX-WS
CXF
other
It'll depend on the situation which one would be better. If you run into trouble with any of them, I'm sure you'll be able to find help here on SO (and from people who have more hands-on experience with them than me ;-)

High level Java security framework

What security framework do you use in your Java projects?
I have used Spring Security and Apache Shiro, and they both look immature.
Spring Security flaws:
no native support for permissions;
no ability to use it explicitly in Java code (sometimes that's necessary);
too focused on classic (non-AJAX) web applications.
Apache Shiro flaws:
bugs in final release (like the problem with Spring integration);
no support for OpenID and some other widely used technologies;
performance issues reported.
There is also a lack of documentation for both of them.
Maybe most real projects develop their own security frameworks?
As for Apache Shiro:
I'm not sure why you've listed the things you did:
Every project in the world has release bugs, without question. The big key here, however, is that Shiro's team is responsive and fixes them ASAP. This is not something to evaluate a framework on; otherwise you'd eliminate every framework, including any you write yourself.
OpenID support will be released shortly in Shiro 1.2 - maybe a month out?
What performance issues? No one has ever reported performance issues to the dev list, especially since the caching support in Shiro is broad and first-class. Without clarification or references, this comes across as FUD.
The documentation is really good now, actually - some of the best in open source that I've seen lately (it was reworked 2 weeks ago). Do you have specific examples of where it falls short for you?
I'd love to help, but your concerns are generalizations that aren't supported by references or concrete examples. Maybe you could present the specific things that your project needs that you haven't been able to accomplish thus far?
Apache Shiro continues to be the most flexible and easiest to understand security framework for Java and JVM languages there is - I doubt you'll find better.
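To give a flavour of that, here is a minimal illustration of Shiro's permission checks used explicitly in Java code; the realm/INI configuration is assumed to be set up elsewhere, and the permission strings are only examples:

    import org.apache.shiro.SecurityUtils;
    import org.apache.shiro.authz.AuthorizationException;
    import org.apache.shiro.subject.Subject;

    public class ReportService {

        public void printQuarterlyReport() {
            Subject currentUser = SecurityUtils.getSubject();

            // Explicit, programmatic permission check - no web container required.
            if (currentUser.isPermitted("report:print:quarterly")) {
                // ... print it ...
            }

            // Or fail fast with an exception.
            try {
                currentUser.checkPermission("report:delete:quarterly");
            } catch (AuthorizationException e) {
                // not allowed - handle or rethrow
            }
        }
    }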
But, above all, and I mean this with all sincerity: please don't write your own security framework unless you plan on putting a ridiculous amount of time into it. Nearly every company I've seen try to do this themselves has failed miserably. It is really hard to get 'right' (and secure). Trust me - after writing one for 8 years, that's one thing I'm absolutely sure of :)
Anyway, feel free to join the Shiro user list and you're sure to find that the community is happy and willing to work through whatever issues you may have. You'll find that we take care of the people that ask questions and do our best to help out.
HTH!
My current projects use Spring Security and involve doing all three things you claim to be flaws in Spring Security:
The projects implement fine-grained access rules that go beyond simple ROLEs, variously involving the state of domain objects, extra request parameters, and so on. These are implemented using custom "access policy objects" that get called within my MVC controllers; access check failures are handed back to Spring Security by throwing the relevant exception. (These could have been implemented as standard Spring Security method-level interceptors, but the checks typically involve examining domain objects.)
The projects support both web and AJAX access, and deal with access failures differently in the two cases. This is done by writing custom authentication entry-point components for Spring Security that choose between different authentication behaviours depending on the request URL, etc.
In other words, it can be done ...
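For illustration, a rough sketch of the "access policy object" idea from the first point; OrderPolicy and Order are invented names, but AccessDeniedException is the real Spring Security exception that its filter chain translates into an access-denied response:

    import org.springframework.security.access.AccessDeniedException;
    import org.springframework.security.core.Authentication;
    import org.springframework.security.core.context.SecurityContextHolder;

    public class OrderPolicy {

        // Minimal stand-in for a real domain object.
        public static class Order {
            public long id;
            public String owner;
            public boolean shipped;
        }

        // Called from an MVC controller before the action is performed.
        public void checkCanCancel(Order order) {
            Authentication auth = SecurityContextHolder.getContext().getAuthentication();
            boolean isOwner = auth != null && auth.getName().equals(order.owner);

            // A domain-state rule that a simple ROLE check cannot express.
            if (!isOwner || order.shipped) {
                // Spring Security's ExceptionTranslationFilter turns this into a 403.
                throw new AccessDeniedException("cannot cancel order " + order.id);
            }
        }
    }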
Having said that, I agree with you on a couple of points:
It is not easy to wire this kind of thing up. I kept running into roadblocks when using the <http> element and its associated configurer. Like... you want it to use a different version of component X, but to do that you have to replace Y, Z, P and Q as well.
The documentation is really sparse, and not helpful if you are trying to do something out of the ordinary.
Andrey, I think this answer comes too late to be helpful to you; it is intended for those who land on this thread later and I hope it helps.
My company recently released OACC, an advanced Java application security framework, as open source. OACC is designed for systems that require up to object-level security granularity.
OACC provides a high-performance API for permission-based authorization services. In a nutshell, OACC allows your application to enforce security by answering the question: is entity ‘A’ allowed to perform action ‘p’ on entity ‘B’?
One of the key abstractions in OACC is a resource. A resource serves as the placeholder in OACC for the object in the application domain that needs to be secured. Both the actors (e.g. users, processes) and the objects being secured (e.g. documents, servers) are represented as resources in OACC. The application domain objects that are actors, or are secured, simply store the resource id to the associated resource.
The resource abstraction allows OACC, unlike other major security frameworks, to provide a rich API that manages permissions between resources. OACC persists security relationships in RDBMS tables (DB2, Oracle, MS-SQLServer and PostgreSQL are currently supported).
For more information please check out the project website: http://oaccframework.org
We use a layered security in one of our projects. The layers are the following:
HTTPS as the protocol (Apache - AJP connectors - Tomcat servlets)
Only binary objects transferred between client and servlet
Individual elements in the passed objects (either way) are encrypted
The encryption key is dynamic, set up during the initial handshake, and valid for one session
Conceptually, the security consists of the encryption key, the encryption algorithm and the data to which they are applied. We make sure that no more than one of the three is ever passed at the same time during a communication. Hope that helps. Regards, M.S.

Selecting a good Framework for web-development

Based on the accepted answer to this question, I've set up a NetBeans/Tomcat environment.
In testing this setup I'm trying to create a Java web application, but I am stumped by the choice of frameworks for this test app.
The choices are:
Spring Web MVC 2.5
JavaServer Faces
Struts 1.3.8
Hibernate 3.2.5
In my reading up (Googling & SO) I fairly quickly got lost in the woods, so I am considering just picking one and, if it doesn't pan out, switching/migrating to a different one later. Would such an approach be feasible?
Background on the project
(Must be Java-based due to legacy code)
It uses a self-signed applet to do client-side rendering & interaction;
Servlets retrieve data-sets requested from the client;
Database may be on some remote server, so I intend to use JDBC for accessing it;
The legacy system was CORBA (ACE/TAO) based, with lots of C++ modules that need to be translated to Java, and the existing Java modules (fortunately few) that make CORBA calls need to be changed to use the newly translated Java modules.
If you can come up with better approach to handle this project, please tell me.
(This project has all the hallmarks of what I like: it is interesting, challenging, and I will learn something new.)
Well, first of all it can't hurt to take a close look at the whole Spring Framework in general. The documentation is quite good, starting at the very basic modules and working its way up to the web MVC layer (where you can decide whether you want to use it; Struts integration is possible too, but I always found Struts to be a hassle anyway). Hibernate is probably the most popular object-relational mapping framework. It is used to store, query and retrieve your domain model objects (everything that you want to store in the database) but doesn't have anything to do with the web layer.
I personally don't like JSF (another specification monster that takes way more time to get into than it should). If you favour a widget-based approach (putting your page together with components instead of outputting plain old HTML), you might want to have a look at Google Web Toolkit.
Another Spring solution is Grails. It is really fun to use, and even though you have to learn another (scripting) language (called Groovy), you can still use all your legacy Java classes in the framework, because Groovy classes are compatible with Java classes (and vice versa).
And by the way, I thought that CORBA is a technology/protocol/standard that specifically allows you to access methods and objects independently of the language. Wikipedia:
The Common Object Request Broker Architecture (CORBA) is a standard defined by the Object Management Group (OMG) that enables software components written in multiple computer languages and running on multiple computers to work together, i.e. it supports multiple platforms.
So why do you have to translate the C++ modules to talk to Java?
First of all, cross Hibernate off your list - while you'd be advised to use it if you've got an ORM requirement, it's not related to the web tier.
Then I think you've two choices:
Spring MVC and JSF
Struts
Heading down either route is going to commit you to that/those API(s) and a switch at a later date is never going to be painless.
My advice would be:
use Spring MVC - you'll likely be using Spring anyway, so it's a natural choice (a minimal controller sketch follows this list).
ignore JSF, write the HTML yourself, using JSTL to render beans.
use JQuery/JavaScript to enrich the user experience.
use Hibernate for object persistence.
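As a bare-bones illustration of the Spring MVC + JSTL advice above (not tied to your project - DatasetController, the /datasets mapping and the "datasets" view name are all made-up placeholders):

    import java.util.Collections;
    import java.util.List;
    import org.springframework.stereotype.Controller;
    import org.springframework.ui.ModelMap;
    import org.springframework.web.bind.annotation.RequestMapping;

    @Controller
    public class DatasetController {

        @RequestMapping("/datasets")
        public String list(ModelMap model) {
            // Stand-in for a call to your data-access/service layer.
            List<String> datasets = Collections.singletonList("example");
            model.addAttribute("datasets", datasets);
            return "datasets";  // resolved by a ViewResolver to e.g. /WEB-INF/jsp/datasets.jsp
        }
    }

The JSP then renders the "datasets" model attribute with a plain JSTL <c:forEach> loop, and JQuery can progressively enhance that page.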
I think it is a good idea to just pick the minimum and add as needed. Chances are that you gain simplicity that way.
One idea would be to start with Spring as your "grand scheme of things" and integration technology, then add complements as needed:
persistence : Hibernate
javaScript : pick a js library that goes well with the Spring MVC module you're using
