REST web services vs JSON services - java

REST web services imply that the server side is (for the most part) stateless, that URLs correspond to resources, and that the HTTP GET/POST/PUT/DELETE methods correspond to operations on those resources.
I am planning to work on a JSON services layer that depends on server-side state. Its URLs will correspond to resources, but operations will be expressed through URLs like /add, /update and /delete, and all of those URLs will use HTTP POST.
What is the right terminology for this? Should this be called JSON web services? Is there any specific term for it?
It looks like Jersey+JAX-RS would be very useful for this purpose. Is it ok?
Would it make sense to use Struts for this kind of application?

JSON is a data format which your REST service may (or may not) use to send its responses. If you use this format to communicate with your web service, you can say that your service is a JSON web service.
Indeed, Jersey is a good JAX-RS implementation. However, it is not the only one.
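For illustration, here is a minimal sketch (not from the question) of what such an RPC-over-POST style could look like with Jersey/JAX-RS. The resource name, paths and the Order bean are invented, and the JSON mapping assumes a JSON provider such as Jackson is on the classpath:

    import javax.ws.rs.Consumes;
    import javax.ws.rs.POST;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    // Hypothetical resource exposing /orders/add, /orders/update, /orders/delete, all via POST.
    @Path("/orders")
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    public class OrderService {

        // Simple JSON-mapped bean, invented for this sketch.
        public static class Order {
            public String id;
            public String description;
        }

        @POST
        @Path("/add")
        public Order add(Order order) {
            // server-side state (e.g. a session or in-memory store) would be updated here
            return order;
        }

        @POST
        @Path("/update")
        public Order update(Order order) {
            return order;
        }

        @POST
        @Path("/delete")
        public Order delete(Order order) {
            return order;
        }
    }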
If you need a web app to interact with the web service, you can use Struts. However, you can use any other tool for the front end of your system, including more recent technologies; Struts is quite old.

Well, it's not REST. However, trying to defeat that tide of general misunderstanding just isn't going to happen.
What you are describing is actually what most of the world views as REST, and therefore you should be mostly OK just calling it that.
In the presence of a true Restifarian you will get some push-back, because they will be correct. However, there isn't much need to add a tremendous amount of confusion to your daily life, as the distinctions (and the MASSIVE benefits they afford...) are largely lost on anyone you work with.
(1) is accurate enough, or simply call it REST. (2) is redundant, but Jersey+JAX-RS is a good framework for the technology. (3) I would say no, but noticing when this question was asked, I'm sure you have already decided.

WSDLs generated in the background by servlet containers?

Recently I read the article http://www.ibm.com/developerworks/library/ws-noide1/ and this prompted me to review my knowledge of just how SOA has evolved in the last decade or so. The review was a nice refresher; however, I soon discovered some gaps in my knowledge.
In particular I want to know, and could not find a definitive answer for, whether WSDL files are necessary regardless of which protocol, paradigm, or API is used in providing the web service. Is the creation and propagation of these files done in the background by servlet containers such as Tomcat, Jetty, etc.?
Essentially the HTTP protocol itself, for example, does not require a WSDL, which leads me to believe that WSDL is a specification closely coupled to SOAP and that perhaps EJB, Spring, etc. do not require it.
I know similar questions to this have been asked, such as JSON, REST, SOAP, WSDL, and SOA: How do they all link together, but I haven't been able to find a definitive answer to this specific question.
If you take a closer look at the Java specifications you'll get a clearer idea of what the WSDL actually is.
When you build a web service in Java you have several ways to do it.
SOAP, spec JAX-WS: this standard is fairly strict. In order to communicate with it you have to respect a contract. This contract, named the WSDL, is an XML file that defines how to reach the WS, which parameters are needed and what their types are. This file is provided by the service, and most modern IDEs can generate it, but you have to hand it to your clients so they can call the WS while respecting the contract.
REST, spec JAX-RS: this standard is far less strict, as you have no contract. It exposes a URL over a specific HTTP method (GET, POST, PUT, DELETE). To make a call to this kind of WS, just call it and you'll see what happens.
JMS queues: this is somewhat different from the other two, but it seems important to me as it provides a way to create reliable, decoupled and asynchronous messaging. It is based on a connection factory that handles the communication.
These standards are implemented in most of the main technologies today. Java EE with its EJBs has implementations for all three of them, as does Spring.
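As a hedged illustration of the SOAP/JAX-WS point above: annotating a class like the one below is enough for the JAX-WS runtime to generate and serve the WSDL contract. The service name, method and URL are invented for this sketch.

    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // Hypothetical SOAP endpoint; the WSDL describing it is generated by the JAX-WS runtime.
    @WebService
    public class QuoteService {

        @WebMethod
        public double quote(String symbol) {
            return 42.0; // placeholder business logic
        }

        public static void main(String[] args) {
            // Publishes the service; the generated WSDL can then be fetched at
            // http://localhost:8080/quote?wsdl and handed to clients as the contract.
            Endpoint.publish("http://localhost:8080/quote", new QuoteService());
        }
    }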
SOA provides many ways to communicate, depending on what your needs are.
I hope I helped, don't hesitate to ask if needed.
EDIT:
To explain the use cases I'll try to set up an example... It's a somewhat hard exercise and not perfect, but I hope it will help you.
Consider that you work for a house seller. You make three different calls to web services: 1. you confirm a sale on your website, 2. you search in your catalog, 3. you inform your boss via a small message on the intranet.
Note that using three different types of WS is not compulsory.
This first action is really important for your workflow. Data that is sent must arrive. You must be sure to respect what is expected by the WS. Client side and server side must match perfectly. You'll use SOAP because there is a specific contract between these two sides.
For the search you don't need a specific and rigid contract. Searching is easy and does not need a structure with defined arguments. Just get the data and print it on the screen. Here REST is probably more suitable because it is easier to set up, and if modifications are needed there is no contract to change on the client side.
For messaging, you want to send the message and that's all. JMS provides queues waiting for messages. These messages are requests that will be consumed asynchronously. The messages are stored until a consumer takes them, in queue order (FIFO).
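For the third case, here is a rough JMS 1.1 sketch of sending such an intranet notification to a queue; the JNDI names and class name are assumptions that depend on your container configuration.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    public class SaleNotifier {

        public void notifyBoss(String text) throws Exception {
            // Hypothetical JNDI names; these are defined in the application server.
            InitialContext ctx = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/NotificationQueue");

            Connection connection = cf.createConnection();
            try {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                TextMessage message = session.createTextMessage(text);
                producer.send(message); // consumed asynchronously, in FIFO order, by a listener
            } finally {
                connection.close();
            }
        }
    }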
Generating the WSDL is your task. The tooling will generate an XML file based on the Java code of your WS. Notice that the opposite is possible too: if you have a WSDL you can generate the Java from it (see this). Most of the time there is a URL corresponding to your WSDL file so that it can be accessed by your client.
You can generate the WSDL from the IDE, but I'm not sure using Maven is the right way. The WSDL is your contract; it may be the one you based your WS on. Generating it with the IDE is just a way to make your life easier, but in the end the WSDL should not change a lot. If it does, then maybe SOAP is not what you need and REST may be more "agile".
Look at these links for manual generation with an IDE (IntelliJ, Eclipse) or with the external tool wsgen.

Good Design for calling BRMS/Drools Logic over REST

I am working on an application which is built in .NET and Java. The Java component contains the complete rule base using the Red Hat BRMS suite. The .NET clients (UI and desktop-based applications) will consult the Java rule engine, sending and receiving JSON data. The decision that has been taken is to expose the rules engine (Red Hat BRMS 6.0.0 using Drools) as a REST-based API. I have come up with the following design approaches:
Write a REST controller in Spring framework and service classes for calling BRMS.
Write a simple REST controller using JBoss's RESTEasy or plain JAX-RS.
Write a Camel adapter and wrap the REST calls behind the Camel and let the Camel talk to Drools.
Wrap REST behind SOAP based webservices.
I want to ask which one would be the better approach for designing such a system.
Any other thoughts are welcome.
As might be obvious from https://github.com/gratiartis/sctrcd-payment-validation-web and https://github.com/gratiartis/qzr my general preference is for exposing my Drools business rules using REST APIs in a Spring application.
The only alternative I consider in the above list is 4, where the API is exposed through a SOAP web service. Albeit definitely not wrapping a JSON REST service! A well-designed Spring application can expose functionality through both REST and SOAP APIs with very little effort.
I have usually exposed via SOAP when working with .NET clients. Firstly, the .NET tooling has excellent support for generating proxies based on WSDL that you have defined. Secondly, the WSDL forms a well-defined contract which both you and the client developers must obey. Having a strict contract can be very useful in preventing arguments. Although if your interface is simple, it may not be so much of a benefit.
The other key reason is that the WSDL does not change unless you change it deliberately. A REST JSON API may seem quick to develop, thanks to Jackson generating everything for you. However, it can expose your internal object model (and dependencies!), meaning that unless you are careful, what seems like a trivial change to an internal model can make private data visible and can break clients.
All that said, if you can keep the API reasonably simple and have a good relationship with the .NET devs (perhaps you're one of them), then going with the Spring REST API would be my recommendation. Feel free to steal code from the github repos if it can help you get started!
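As a starting point for that option, here is a rough sketch (my own, not taken from those repositories) of a Spring REST controller handing a JSON-mapped fact to a Drools 6 KIE session. The PaymentRequest bean, the URL paths and the "ksession-rules" session name are assumptions that would come from your own kmodule configuration.

    import org.kie.api.KieServices;
    import org.kie.api.runtime.KieContainer;
    import org.kie.api.runtime.KieSession;
    import org.springframework.web.bind.annotation.RequestBody;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    @RequestMapping("/rules")
    public class RuleController {

        // Invented fact type; in practice this is whatever JSON payload the .NET client sends.
        public static class PaymentRequest {
            public String cardNumber;
            public boolean valid;
        }

        // The KieContainer is thread-safe and can be shared; sessions are created per request.
        private final KieContainer kieContainer =
                KieServices.Factory.get().getKieClasspathContainer();

        @RequestMapping(value = "/payment", method = RequestMethod.POST,
                        consumes = "application/json", produces = "application/json")
        public PaymentRequest validate(@RequestBody PaymentRequest request) {
            KieSession session = kieContainer.newKieSession("ksession-rules");
            try {
                session.insert(request);  // add the fact to working memory
                session.fireAllRules();   // rules set e.g. the 'valid' flag
                return request;           // Jackson serializes the result back to JSON
            } finally {
                session.dispose();
            }
        }
    }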
btw - If you were to consider Camel, it's worth noting that there is a Drools-Camel component which does quite a bit of the work for you.
In my view, I would go with option 1. This is the simplest and easiest approach.
Option 2 would be my second choice.
Option 3 looks like something you could choose if there are routing rules involved; then again, it could make things more complex.
And definitely not option 4, which would just complicate things with SOAP.

How to consume ad hoc web services (non-SOAP, schemaless XML)?

I need to write integrations to multiple external web services. Some of them are SOAP (have a WSDL), some of them are pretty much ad hoc - HTTP(S), authentication either by basic auth or by parameters in the URL (!), natural-language-like XML which does not really map nicely to domain classes...
For now, I've done the spike integrations using Spring Web 3.0 RestTemplate and binding using JAXB2 (Jaxb2Marshaller). Some kind of binding is needed because domain classes need to be cleaner than the XML.
It works, but it kind of feels bad. Obviously this is partially just because of how the services are built. One minor issue I have is the naming of RestTemplate, as these services have nothing to do with REST. That I can live with. JAXB2 feels a bit heavy, though.
So, I'm looking for some other alternatives. Ideas? I'd like to have a simple solution (so RestTemplate is fine), not too enterprisey...
While some of your services may be schemaless XML, they will still probably have a well-documented API. One of the techniques that the Spring folks seem to be pushing, at least from the web-service server side, is to use XPath/XQuery for retrieving only the information you really need from a request. I know that this may only end up being part of your solution, but I'm not sure that this is a situation where one particular binding framework is going to meet all your needs.
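To illustrate the XPath idea, here is a small self-contained sketch using the JDK's built-in javax.xml.xpath support to pull only the needed values out of an ad hoc XML payload; the XML structure and the expression are invented for the example.

    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathConstants;
    import javax.xml.xpath.XPathFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.NodeList;
    import org.xml.sax.InputSource;

    public class AdHocXmlParser {

        public static void main(String[] args) throws Exception {
            // Stand-in for the response body fetched over HTTP(S).
            String xml = "<response><customer><name>Alice</name></customer></response>";

            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new InputSource(new StringReader(xml)));

            XPath xpath = XPathFactory.newInstance().newXPath();
            NodeList names = (NodeList) xpath.evaluate(
                    "/response/customer/name", doc, XPathConstants.NODESET);

            for (int i = 0; i < names.getLength(); i++) {
                // map the extracted values into clean domain objects here
                System.out.println(names.item(i).getTextContent());
            }
        }
    }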
If I understand correctly, you have one application that has to make calls to various external (web) services using different technologies. The first thing that comes to mind is to have some intermediate level. While this could be something as elaborate as an ESB solution, my guess is that is not what you're looking for.
You could, for example, achieve this intermediate level with a class hierarchy that has an interface 'Consumer' at its top, with a method to be implemented such as doConsume(), and so on.
If you look into it you'll probably have the opportunity to make use of several design patterns such as Strategy or Template Method. Remember to be proactive and ask 'What if...' a few times (as in: what if they need me to consume yet another service?).
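A very rough sketch of that idea, with invented names, where each external service gets its own Consumer implementation (a Strategy):

    // Common entry point for all external-service integrations.
    public interface Consumer {
        ServiceResponse doConsume(ServiceRequest request);
    }

    // One implementation per external service, hiding the transport/binding details.
    class SoapPartnerConsumer implements Consumer {
        @Override
        public ServiceResponse doConsume(ServiceRequest request) {
            // call the generated JAX-WS port here
            return new ServiceResponse("soap-result");
        }
    }

    class AdHocXmlConsumer implements Consumer {
        @Override
        public ServiceResponse doConsume(ServiceRequest request) {
            // issue the HTTP call (e.g. via RestTemplate) and extract data with XPath
            return new ServiceResponse("xml-result");
        }
    }

    // Minimal placeholder types so the sketch compiles.
    class ServiceRequest { }
    class ServiceResponse {
        final String payload;
        ServiceResponse(String payload) { this.payload = payload; }
    }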
If JAXB feels too heavy there are other APIs to be found:
Axis
JAX-WS
CXF
other
It'll depend on the situation which one would be better. If you run into trouble with any of them, I'm sure you'll be able to find help here on SO (and from people who have more hands-on experience with them than me ;-)

Opinions/headers for splitting an application into front-end/back-end (not user/admin, but ui/logic)

I am building a new web application and playing around with the architecture and would like some opinions about splitting UI and business logic and running them on separate servers.
This means that if someone requests a page, the front end will itself request the data from a back-end server and then not actually perform any calculations/logic but just use the data to populate a template and then respond with that.
Back-End: Java + JAX-WS
Front-End: Kohana 3.1 (PHP)
Data Interchange Format: JSON
Advantages:
clear separation of logic and UI
ability to choose language/framework best suited for either end
possibility to add logic/UI servers depending on which one is the bottleneck in case of performance issues
possibility to make the API publicly available without any extra work (pseudo-internal requests will go to the same API as requests from third-party applications)
ability to change (if need be) the framework/language of either side without having to edit the other
ability to specify different server hardware according to the needs of the logic/UI application
better security (if API private) (??)
Disadvantages:
latency (??)
more servers
So what do you think? Is this a good idea? I haven't been able to find much information so far but my guess is that many big sites do it this way, right? How will performance be affected (I am thinking of running it on EC2)? What are further advantages/disadvantages? Any thoughts on the languages/frameworks choices?
A similar architectural pattern is often employed, though the UI part is generally moved to the client. So you have a backend that responds with JSON, a quick HTTP server with full-blown caching (which can use HTML5 app caching as well) and a rich JavaScript client which requests the JSON from the backend and builds the UI.
More on this pattern: http://www.metaskills.net/2008/05/24/the-ajax-head-design-pattern/
The main negative of the approach is that it is generally more work in the beginning - if you don't need an external API then using a simpler architecture will be easier to program.
You also might want to employ the idea of keeping your servers stateless and let the client side handle any state.
This simplifies the whole load-balancing and fail-over stuff and makes you think about a more resource-oriented architecture.
And if you are set on JSON already, you might want to explore the idea of NOT mapping POJOs to your data and instead using a document store like MongoDB or CouchDB to access the JSON data directly.
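A hedged sketch of that last idea using the MongoDB 3.x Java driver, storing and reading documents directly with no intermediate POJO layer; the database and collection names are made up.

    import com.mongodb.MongoClient;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.MongoDatabase;
    import org.bson.Document;

    public class DocumentStoreExample {

        public static void main(String[] args) {
            MongoClient client = new MongoClient("localhost", 27017);
            try {
                MongoDatabase db = client.getDatabase("app");
                MongoCollection<Document> pages = db.getCollection("pages");

                // Store whatever the front end needs, without mapping to domain classes.
                pages.insertOne(new Document("slug", "home")
                        .append("title", "Welcome")
                        .append("widgets", 3));

                Document page = pages.find(new Document("slug", "home")).first();
                System.out.println(page.toJson()); // hand the JSON straight to the UI layer
            } finally {
                client.close();
            }
        }
    }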

Expose webservice directly to webclients or keep a thin server-side script layer in between?

I'm developing a REST webservice (Java, Jersey). The people I'm doing this for want to directly access the webservice via Javascript. Some instinct tells me this is not a good idea, but I cannot really explain that instinct. My natural approach would have been to have the webservice do the real logic and database access, but also have some (relatively thin) server-side script layer (e.g. in PHP). Clients would talk to the PHP layer which in turn would talk to the webservice. (The webservice would be pretty local to the apache/PHP server and implicitly trust calls from the script layer. The script layer would take care of session management.)
(Btw, I am not talking about just hiding the webservice behind an Apache which simply redirects calls.)
But as I find myself at a loss for words/arguments to explain my instinct, I wonder whether my instinct is right - note that while I have been developing all kinds of software in all kinds of languages and frameworks for some 17 years, this is the first time I am developing a webservice.
So my question is basically: what are your opinions? Are there any standard setups? Is my instinct totally wrong? Or partially? ;P
Many thanks,
Max
PS: I might add a few bits of information about the planned usage of the whole application:
will be accessed by different kinds of users, partly general public, partly privileged
thus, all major OS/browser combinations can be expected as clients
however, writing the client is not my responsibility
will potentially have very high load/traffic
logic of webservice will later be massively expanded for another product which is basically a superset of the functionality of the current project
there is a significant likelihood that at some point an API should be exposed which can be used by 3rd party developers - obviously, with some restrictions
at some point, the public view of the product should become accessible via smartphones, too (in other words, maybe a customized version of the site to adapt to the smaller display and different input methods)
I don't think that accessing a REST webservice directly via e.g. JavaScript is generally a bad idea, because that is what the REST architecture is designed for. For your use case you might have some implications to consider:
Your webservice will have to take care of user management. Since the REST architecture does not support server-side session state, you will have to do authentication and authorization on every request. Users will have to maintain their state on the client side.
Your webservice implementation will have to take care of issues like caching and load balancing and all the other things you might otherwise have assigned to e.g. the PHP "proxy" script.
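For the first point, a hedged JAX-RS 2.0-style sketch of authenticating every request instead of relying on server-side sessions; the header check is a deliberately naive placeholder and the class name is invented.

    import java.io.IOException;
    import javax.ws.rs.container.ContainerRequestContext;
    import javax.ws.rs.container.ContainerRequestFilter;
    import javax.ws.rs.core.Response;
    import javax.ws.rs.ext.Provider;

    @Provider
    public class AuthenticationFilter implements ContainerRequestFilter {

        @Override
        public void filter(ContainerRequestContext requestContext) throws IOException {
            String authHeader = requestContext.getHeaderString("Authorization");
            if (authHeader == null || !checkCredentials(authHeader)) {
                // no session to fall back on: every request must carry its own credentials
                requestContext.abortWith(
                        Response.status(Response.Status.UNAUTHORIZED).build());
            }
        }

        private boolean checkCredentials(String authHeader) {
            // hypothetical validation of a Basic auth header or token
            return authHeader.startsWith("Basic ");
        }
    }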
For your requirements:
all major OS/browser combinations can be expected as clients
Since your webservice will only deliver data (e.g. JSON or XML), this should not be a problem. The JavaScript part just has to take care to issue the correct requests.
will potentially have very high load/traffic
If you strictly follow the REST architecture you can make use of http caches. But keep in mind that the stateless nature will always cause more traffic.
logic of webservice will later be massively expanded for another product which is basically a superset of the functionality of the current project
The good thing about open webservices is that you can loosely couple them together.
there is a significant likelihood that at some point an API should be exposed which can be used by 3rd party developers - obviously, with some restrictions
Again, with a RESTful webservice you already have an API exposed for developers. It is up to your clients to decide whether this is a good or a bad thing.
at some point, the public view of the product should become accessible via smartphones
Another pro for making your REST webservice publicly accessible. Most smartphone APIs support HTTP requests, so you will just have to develop the GUI for the specific smartphone platform, making direct calls to the webservice.
Firstly, I am just extending what Daff replied above, from the point of view of my own learning about designing and implementing RESTful web services; please note that I am still learning.
When I started learning RESTful WS with Java and Jersey (0.3 IIRC), I had similar questions, and the primary cause for that was a total misconception about the RESTful architecture. The gravest mistake I made was using JAXB for XML and Jackson for JSON (de)serialization directly from/to the persistence beans. This totally violates REST principles and hence creates some vital issues for building a high-performance, highly available, scalable web service.
My mistake was thinking in terms of an API, a.k.a. a Service; when we think RESTful WS we should forget "API" and think resources. We should take great care in interlinking resources. My understanding of this only came after reading this, and I suggest it to anyone wanting to create their own web service. My conclusion is that a resource is to a RESTful WS/architecture what an API is to a native interface or a SOAP web service. So I would suggest designing your resources with care, and understand that there is no limit on how many resources your web service may have.
So here is how I ended up implementing systems that expose an "API" through a RESTful WS. I create an API layer which deals with business entities, for example PersistentBook, which contains either the id of a PersistentAuthor or the object itself. All business logic concerning persistent entities lives in the API implementation layer.
The web service layer uses the API layer to perform its operations on resources. The web service layer uses persistent entities to generate representations of the beans and vice versa; the key feature here is that PersistentBook's representation carries a URI pointing to the PersistentAuthor. If I want to use automated (de)serialization I create another domain layer, e.g. Book, Author, etc.
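A rough sketch of that layering, with invented class names (BookResource, BookApi, BookRepresentation), where the representation carries a URI to the author resource instead of embedding the persistent object:

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.Context;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.UriInfo;

    @Path("/books")
    @Produces(MediaType.APPLICATION_JSON)
    public class BookResource {

        private final BookApi api = new BookApi(); // business/API layer over the persistent entities

        @GET
        @Path("/{id}")
        public BookRepresentation get(@PathParam("id") long id, @Context UriInfo uriInfo) {
            PersistentBook book = api.findBook(id);
            String authorUri = uriInfo.getBaseUriBuilder()
                    .path("authors").path(String.valueOf(book.getAuthorId()))
                    .build().toString();
            // The representation interlinks resources instead of exposing the persistence model.
            return new BookRepresentation(book.getTitle(), authorUri);
        }
    }

    // Minimal placeholders so the sketch hangs together.
    class BookApi {
        PersistentBook findBook(long id) { return new PersistentBook(); }
    }
    class PersistentBook {
        String getTitle() { return "REST in Practice"; }
        long getAuthorId() { return 7L; }
    }
    class BookRepresentation {
        public final String title;
        public final String author; // URI of the author resource
        BookRepresentation(String title, String author) { this.title = title; this.author = author; }
    }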
Now, as Daff mentioned, caching is inevitable; my checkpoints for it are:
Support for the 'Cache-Control', 'Last-Modified' and 'ETag' response headers and the 'If-Modified-Since' and 'If-None-Match' request headers is key. A note from my more recent learning: use the 'Vary' header in case of varying representations (content negotiation) based on the 'Accept' header.
Use server-side caching such as Squid or Varnish in case clients do not use caching. One thing I learnt is that having all the right header support counts for nothing if clients do not support it, and in fact it increases the cost in terms of computation and bandwidth ;)
Use of Content-Encoding.
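To make the first checkpoint concrete, a hedged JAX-RS sketch of conditional GET handling: evaluatePreconditions() answers a matching If-None-Match with 304 Not Modified. The resource path, body and max-age value are made up for the example.

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.CacheControl;
    import javax.ws.rs.core.Context;
    import javax.ws.rs.core.EntityTag;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Request;
    import javax.ws.rs.core.Response;

    @Path("/books/1")
    public class CachedBookResource {

        @GET
        @Produces(MediaType.APPLICATION_JSON)
        public Response get(@Context Request request) {
            String body = "{\"title\":\"REST in Practice\"}";
            EntityTag etag = new EntityTag(Integer.toHexString(body.hashCode()));

            CacheControl cacheControl = new CacheControl();
            cacheControl.setMaxAge(60); // let clients and intermediaries cache for a minute

            // Returns a non-null (304) builder if the client's If-None-Match still matches.
            Response.ResponseBuilder builder = request.evaluatePreconditions(etag);
            if (builder == null) {
                builder = Response.ok(body).tag(etag);
            }
            return builder.cacheControl(cacheControl).build();
        }
    }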
