My team has been tasked with converting our application's existing SOAP API to REST. My preference is to rewrite the API from scratch, reusing only the actual business logic that each operation performs. Others on my team want REST to be just a wrapper over the existing SOAP: we would expose REST services, but when a request came in, our application would internally call the existing SOAP operations.
Could you please offer suggestions on which of these is the better approach? My way looks cleaner, lighter, and faster, but their way allows some code reuse.
It depends on your priorities and on whether you expect to receive many requests for changes to the API's behavior.
Ample time and more changes expected:
If you have the time, writing from scratch is of course recommended, as it will give you a cleaner, lighter, and faster API. It will also make shipping new features easier.
Less time and fewer changes expected, or the API is too big for full regression testing:
But if you have time constraints, I would suggest going with REST over the SOAP API. You are going to expose only the REST API to clients anyway, so you can refactor internally and phase out SOAP as time permits. Rewriting the whole codebase means regression testing the entire module.
Could you please offer suggestions on which of these is the better approach? My way looks cleaner, lighter, and faster, but their way allows some code reuse.
I wrote a framework that does the SOAP -> REST conversion. It was used internally at one of the companies I used to work for. Given a mapping file, the framework could convert a service in less than 10 minutes, but we did not use it for all services. Here's why...
Not all services (WSDL-based) are designed with REST in mind. Some of them are just remote methods being invoked on a service and nothing more.
The service may not have resources that can be mapped.
Even if there are resources, they may not map cleanly to REST verbs (GET, POST, etc.), and some calls are not easily translatable.
A mapping framework has an overhead of its own. The framework's SLA was quite low (single-digit milliseconds), but even a small overhead may not be suitable for critical services. The time it takes to profile and get this overhead down should not be underestimated.
In summary, the approach works for some services, but it takes some engineering effort to get there. It makes sense if you have, say, 500+ services that need to be converted quickly as a temporary measure.
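For a single service, the wrapper approach looks roughly like the sketch below: a thin JAX-RS resource that delegates to the existing SOAP operation. This is only an illustration; OrderPort, OrderSoapService, and getOrderXml stand in for whatever client classes a tool like wsimport would generate from your WSDL.

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    @Path("/orders")
    public class OrderResource {

        // Hypothetical JAX-WS client: wsimport would generate classes
        // like OrderSoapService/OrderPort from the existing WSDL.
        private final OrderPort soapPort = new OrderSoapService().getOrderPort();

        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_XML)
        public String getOrder(@PathParam("id") String id) {
            // Delegate straight to the existing SOAP operation.
            return soapPort.getOrderXml(id);
        }
    }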
The fact that you would have to convert your REST calls to SOAP calls in order to reuse your current code definitely suggests a rewrite to me!
To make testing, debugging, and profiling easier, you should have a pure POJO-based API that is called by your REST API.
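A minimal sketch of that layering (all names here are made up): the business logic lives in a plain class with no web dependencies, and the JAX-RS resource is only a thin adapter over it.

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;

    // Plain POJO holding the business logic: no HTTP, no XML, no
    // framework types, so it can be unit-tested and profiled directly.
    public class OrderService {
        public String orderStatus(String orderId) {
            return "OPEN"; // stand-in for the real business logic
        }
    }

    // In its own file: the REST layer is only a thin adapter.
    @Path("/orders")
    public class OrderStatusResource {
        private final OrderService service = new OrderService();

        @GET
        @Path("/{id}/status")
        public String status(@PathParam("id") String id) {
            return service.orderStatus(id);
        }
    }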
Which back-end server are you using? Several of the more popular Java web servers have tooling that makes creating REST APIs really easy.
When learning or working with a new API, does anyone have tips for learning it effectively?
I currently make a bunch of GET requests to understand what I can and cannot retrieve based on the API's responses. From there I try to map out what is in the API and see what I can build.
If you could share anything you do, that would be great.
The first thing I do is read the API documentation and search it for examples. As you get used to reading this kind of documentation, you'll find it easier to identify exactly which parts of the functionality you need to learn first.
I also use search engines to look for more working examples, and after that I work on creating a minimal use case of the API (for example, writing a file with Apache's commons-io API). For this, it is a good idea to create a project with multiple JUnit tests, each a minimal use case of the API (in the commons-io example: create a file, delete a file, move a file, copy a file, ...).
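For instance, a couple of such minimal-use-case tests with JUnit 4 and commons-io might look like this (the FileUtils calls are the real commons-io API; the test layout is just one way to organize it):

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import java.io.File;
    import java.nio.charset.StandardCharsets;

    import org.apache.commons.io.FileUtils;
    import org.junit.Rule;
    import org.junit.Test;
    import org.junit.rules.TemporaryFolder;

    public class CommonsIoLearningTest {

        @Rule
        public TemporaryFolder tmp = new TemporaryFolder();

        @Test
        public void writeAndReadFile() throws Exception {
            File f = tmp.newFile("hello.txt");
            FileUtils.writeStringToFile(f, "hello", StandardCharsets.UTF_8);
            assertEquals("hello", FileUtils.readFileToString(f, StandardCharsets.UTF_8));
        }

        @Test
        public void copyFile() throws Exception {
            File src = tmp.newFile("src.txt");
            FileUtils.writeStringToFile(src, "data", StandardCharsets.UTF_8);
            File dest = new File(tmp.getRoot(), "dest.txt");
            FileUtils.copyFile(src, dest);
            assertTrue(dest.exists());
        }
    }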
I must say this is not an exact science; each API is a new world and may require a slightly different approach. (With REST APIs you'll need a tool like curl or Postman to understand how to communicate with them; others will have prerequisites such as a working installation of some system, and so on.)
As with everything in coding, you'll need to do it yourself and struggle through the issues you find (which can take several hours of your free time).
There is no "magic" behind learning something; coding is in some ways like playing a musical instrument: it requires practice.
I don't know whether you are a beginner or have already done some development, but I will start from scratch!
APIs are code that lets you work with content in certain formats.
APIs are organized around the operations you want to perform:
GET => to fetch something.
POST => to save something.
PUT => to update something.
DELETE => to delete something.
People also use PATCH, which is similar to PUT but for partial updates.
You can play around with all of these by constructing objects and databases.
You will need REST services; Spring's RESTful web services are a very good option (a sketch follows below).
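As an illustration only, a minimal Spring controller wiring those verbs to a hypothetical /items resource might look like this:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.atomic.AtomicLong;

    import org.springframework.web.bind.annotation.*;

    @RestController
    @RequestMapping("/items")
    public class ItemController {

        private final Map<Long, String> items = new ConcurrentHashMap<>();
        private final AtomicLong ids = new AtomicLong();

        @GetMapping("/{id}")               // GET: fetch something
        public String get(@PathVariable long id) {
            return items.get(id);
        }

        @PostMapping                       // POST: save something
        public long create(@RequestBody String body) {
            long id = ids.incrementAndGet();
            items.put(id, body);
            return id;
        }

        @PutMapping("/{id}")               // PUT: update something
        public void update(@PathVariable long id, @RequestBody String body) {
            items.put(id, body);
        }

        @DeleteMapping("/{id}")            // DELETE: delete something
        public void delete(@PathVariable long id) {
            items.remove(id);
        }
    }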
Diving deeper, be careful to assign meaningful names: you don't know whether tomorrow you will become famous and need to make your APIs shareable ;)
Some common concerns are:
Meaningful names.
Versioning, so that old APIs keep working while the new ones return new data.
Swagger, a tool that lets you describe your APIs: what each one does, what type of data it returns, and so on (see the sketch after this list).
Security: APIs are also called endpoints, meaning each link is a connection between front end and back end, so you need to keep it secure, e.g. by authentication.
The four points above are considered good practices for writing APIs ;)
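As a small, purely illustrative example of the Swagger point, with OpenAPI/Swagger annotations (shown here in the springdoc-openapi style; the controller and paths are made up) you describe an endpoint right in the code, and the tooling can turn it into browsable documentation:

    import io.swagger.v3.oas.annotations.Operation;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class ItemDocsController {

        @Operation(summary = "Fetch one item",
                   description = "Returns the stored item for the given id.")
        @GetMapping("/items/{id}")
        public String get(@PathVariable long id) {
            return "item-" + id; // placeholder body
        }
    }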
I have a question about how to structure an app with Vaadin and REST to get good performance and scalability. I want to make a REST API the business layer and use Vaadin as the web UI. I made this decision because my app will also be available as a standalone version and on mobile OSes (iOS and Android). This seems good for scalability, but what about performance? Let's take about 5,000 concurrent users, for example.
I want a general sense of the performance and whether it will hold up for many concurrent users. Vaadin alone makes it hard to get good performance with many concurrent users (since almost all the code runs on the server), and if we also add a REST API call behind every Vaadin operation, I am afraid I will end up with a fatal combination.
Of course, Vaadin and the REST API are located on the same server.
What do you think about it? Thanks a lot for any answers.
I don't think it is a good idea. Vaadin is a server-side framework, so every action you take goes to the server, and from there it has to make another REST service call to a server from which it gets data to render. I would rather suggest you try a client-side framework like ExtJS, jQuery, AngularJS, or GWT: you render your UI on the client side, and any action that needs data operations, like fetching data or performing transactions, makes a REST service call to the server. With this approach you avoid the extra hop.
Nowadays the concept of responsive web design (RWD) is getting popular; I think it would be a great choice for your use case. Build once, deploy on any device :)
BTW, this is purely my opinion.
My application needs to work as middleware (MW). It receives orders (as XML) from various customers; each order contains a supplier id. Customers can send the XML through one of these channels:
1) JMS queue
2) File system
3) HTTP
4) Web service request (REST/SOAP)
The MW will first validate the incoming request and send an acknowledgement to the customer who placed the order over their preferred channel. The channel and customer endpoint info are in the incoming XML.
Once it has the order, it needs to send order requests to the various suppliers, as XML, over their preferred channels.
I have the supplier and preferred-channel info in my DB.
So it's an Enterprise Integration use case.
I was planning to do it with core Java technologies. Here is the approach I had in mind.
Have four listener/entry endpoints, one for each type of incoming request (JMS queue, file system, HTTP, web service request (REST/SOAP)).
These listeners will put the XML string on a JMS queue. They act as a receptionist and make the process asynchronous.
Then I will have a JMS consumer listening on the queue. (The consumer can be on the same system as the producer or on a different one, depending on the load on the producer machine.) This consumer will parse the XML string into Java objects and perform the validation. It will send the acknowledgement to customers (based on each customer's preference; I will use an acknowledgement-processor factory that sends the acknowledgement accordingly). Once validation is done, it converts the POJO to another POJO format so that XStream/JAXB can marshal it to XML and send it to the suppliers over their preferred channels (stored in the DB), e.g. via SOAP, JMS, or a file request.
Then I came across this Camel link, http://java.dzone.com/articles/open-source-integration-apache, and it looks like it provides the perfect solution; this seems to be exactly the Enterprise Integration use case.
Experts, please advise: is Camel the right solution for this? Or would another enterprise integration framework, like Spring Integration or an ESB, be more beneficial in this case? If somebody can point me to a resource where an ESB solves this kind of use case, it would be really helpful.
I could not explore every solution because of time constraints, so I am looking for expert suggestions so that I can concentrate on one.
Something like Camel is completely appropriate for this task.
Tools like Camel provide toolsets and components that make stitching together workflows like the one you describe easier, with the caveat that you must first learn the overall tool (Camel, in this case).
For a skilled, experienced developer and a simple use case, you can see why they might take the approach you're taking: provisioning the workflow with the tools at hand, including, perhaps, custom code, rather than taking the time to learn a new tool.
Recall that while tools can be a great benefit (features, testing, quality, documentation), they also bring a burden (support, resources, complexity). A key aspect of bringing tool sets into your environment is that while you may not have written the code, you are still ultimately responsible for its behavior in your environment.
So, all that said, you need to ascertain whether the time investment of incorporating a tool like Camel is worth the benefit to your current project. Odds are that if you intend to do more integrations in the future, investing in such a tool is a good idea, as it will make those integrations easier.
But be conscious that something like Camel, while quite flexible, also brings inherent complexity with it. For simple stuff like what you're describing, though, I think it's a solid fit.
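To make that concrete, here is a rough, hedged sketch of how the flow from the question might look as Camel routes. The endpoint URIs, bean names, and the "supplierChannel" header are all illustrative, and the JMS and Jetty components would still need to be configured:

    import org.apache.camel.builder.RouteBuilder;

    public class OrderRoutes extends RouteBuilder {
        @Override
        public void configure() {
            // "Receptionist" routes: each inbound channel feeds one queue.
            from("file:inbox/orders").to("jms:queue:incomingOrders");
            from("jetty:http://0.0.0.0:8080/orders").to("jms:queue:incomingOrders");

            // Asynchronous consumer: validate, acknowledge, then dispatch.
            from("jms:queue:incomingOrders")
                .unmarshal().jaxb("com.example.orders")  // XML -> POJO
                .to("bean:orderValidator")               // your validation bean
                .to("bean:ackSender")                    // ack on the customer's channel
                .to("bean:supplierLookup")               // sets "supplierChannel" from the DB
                .marshal().jaxb()                        // POJO -> XML
                .choice()
                    .when(header("supplierChannel").isEqualTo("jms"))
                        .to("jms:queue:supplierOrders")
                    .when(header("supplierChannel").isEqualTo("file"))
                        .to("file:outbox/suppliers")
                    .otherwise()
                        .to("http://supplier.example.com/orders");
        }
    }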
We are developing an application integrator which sends requests to various web services simultaneously, aggregates the data returned by each web service, and formats it for display in the UI. Each web service may have a proprietary XML format. Also, we don't want to compromise the user experience.
We identified an ESB (ServiceMix/Mule) and Async Http Client as candidates for this requirement.
Can anyone suggest which would be the better option? Async Http Client seems a good fit, as it is lightweight compared to ServiceMix.
Thanks,
Amit Patel
You can also just use Apache Camel for this.
It supports a wide range of components and messaging patterns, is lightweight, and has a flexible deployment model (standalone, Spring, Maven, webapp, OSGi, etc.).
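For the aggregation use case specifically, a hedged sketch of a Camel route that fans one request out to several services in parallel and collects the replies might look like this (the service URLs and the uiFormatter bean are placeholders):

    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.processor.aggregate.GroupedExchangeAggregationStrategy;

    public class AggregatorRoute extends RouteBuilder {
        @Override
        public void configure() {
            from("jetty:http://0.0.0.0:8080/aggregate")
                // Fan out to all services in parallel and collect every reply.
                .multicast(new GroupedExchangeAggregationStrategy())
                    .parallelProcessing()
                    .to("http://serviceA.example.com/data",
                        "http://serviceB.example.com/data",
                        "http://serviceC.example.com/data")
                .end()
                .to("bean:uiFormatter"); // placeholder bean that formats for the UI
        }
    }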
You have answered your own question. Yes, an ESB is a good option; you can use Mule.
The second option is asynchronous messaging, but it will be more complicated, because you have to orchestrate the services properly.
Eh, we did a productivity test of coding pure Java vs. an ESB (Mule and Spring Integration). We had three developers do the same thing in all three versions (Mule, SI, and pure Java with no ESB). They finished six times faster when not using an ESB, and we put plenty of things into the problem that should have leveraged the ESB, but in the end it did not help: all the XML coding and the confusion of the API usage led to really unproductive development teams. Not only that, it is hard to find ESB developers on the market as well.
NOTE: We even had an advanced Spring Integration guy who had been doing it for a while take the test, and he was also faster completing the code in pure Java. He loved Spring Integration, and after taking my test he changed his mind.
That is, be warned of the huge productivity loss that using the wrong framework can cause. Six times is a huge penalty; I mean, 1 month vs. 6 months is a big difference.
A six-fold productivity loss is worth taking one week out to run your own developer productivity test. Some argued with me that the developers didn't know the framework yet, which is why we had an advanced Spring Integration guy take the test as well.
Also, make sure your test takes at least an hour or so. Just develop some fake but realistic requirements from the application you are going to write anyway, so that you make progress on your application while running the study. I would be interested to see more results posted as well.
As per your requirement, I would suggest you go with WSO2 ESB. It is a 100% free and open-source ESB (Apache License 2), and unlike other ESBs, with WSO2 there are no separate commercial and community versions: what you download for free from WSO2 comprises all the features that are available in the 'commercial versions' of other ESB vendors. Also, WSO2 offers not just an ESB but a complete SOA platform for your SOA needs.
For the requirement you mentioned above, there is a simple sample that you can try out.
Given a series of URLs from a stream, where millions could be bit.ly, Google, or TinyURL shortened links, what is the most scalable way to resolve them to their final URLs?
A multi-threaded crawler doing HEAD requests on each short link while caching the ones you've already resolved? Are there services that already provide this?
Also factor in not getting blocked by the URL-shortening services.
Assume the scale is 20 million shortened URLs per day.
Google provides an API. So does bit.ly (and bit.ly asks to be notified of heavy use, and specifies what they mean by light usage). I am not aware of an appropriate API for TinyURL (for decoding), but there may be one.
Then you have to fetch on the order of 230 URLs per second to keep up with your desired rate. I would measure the typical latency of each service and create one master actor and as many worker actors as you need so that the workers can block on lookups. (I'd use Akka for this, not the default Scala actors, and make sure each worker actor gets its own thread!)
You should also cache the answers locally; it's much faster to look up a known answer than to ask these services for one. (The master actor should take care of that.)
After that, if you still can't keep up because of, for example, throttling by the sites, you had better talk to the sites, or you'll have to do things that are rather questionable (rent a bunch of inexpensive servers at different sites and farm the requests out to them).
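A plain-Java sketch of the cache-plus-workers idea, using an ExecutorService instead of actors (the pool size and all names are illustrative, and lookupRemotely is a stub):

    import java.util.Map;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ShortUrlResolver {
        private final Map<String, String> cache = new ConcurrentHashMap<>();
        // One thread per worker so a slow lookup only blocks its own worker.
        private final ExecutorService workers = Executors.newFixedThreadPool(64);

        public Future<String> resolve(String shortUrl) {
            String hit = cache.get(shortUrl);
            if (hit != null) {
                return CompletableFuture.completedFuture(hit); // cache hit: no network call
            }
            return workers.submit(() -> {
                String resolved = lookupRemotely(shortUrl);
                cache.put(shortUrl, resolved);
                return resolved;
            });
        }

        // Placeholder: a real version would call the shortener's API
        // (Google, bit.ly) or probe the redirect over HTTP.
        private String lookupRemotely(String shortUrl) {
            return shortUrl;
        }
    }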
Using the HEAD method is an interesting idea, but I am afraid it can fail because I am not sure the services you mentioned support HEAD at all. If, for example, a service is implemented as a Java servlet, it may implement doGet() only; in that case doHead() is unsupported.
I'd suggest you use GET but not read the whole response: read only the HTTP status line and headers (the Location header carries the redirect target).
Since you have very serious performance requirements, you cannot make these requests synchronously, i.e. you cannot use HttpUrlConnection. You should use the NIO package directly. Then you will be able to send requests to millions of destinations using only one thread and get the responses very quickly.
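Writing this against raw NIO takes real plumbing; as one hedged alternative on Java 11+, the built-in asynchronous HttpClient gives you many in-flight requests without a thread per request and without buffering the response body. This sketch probes a single hop; you would loop on the Location header to follow chains of shorteners:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.concurrent.CompletableFuture;

    public class RedirectProbe {
        private static final HttpClient client = HttpClient.newBuilder()
                .followRedirects(HttpClient.Redirect.NEVER) // we want the Location header itself
                .build();

        public static CompletableFuture<String> finalUrl(String shortUrl) {
            HttpRequest req = HttpRequest.newBuilder(URI.create(shortUrl)).GET().build();
            // discarding() throws the body away instead of buffering it.
            return client.sendAsync(req, HttpResponse.BodyHandlers.discarding())
                    .thenApply(resp -> resp.headers()
                            .firstValue("Location")
                            .orElse(shortUrl)); // no redirect: already the final URL
        }
    }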