We have a web service that saves data and displays it on the user interface for viewing transactions. My requirement is to validate all input parameters in the web service request to make sure that vulnerable content is not shown on the UI. I am looking for ways to validate the input parameters in the web service request before they are saved to the database.
Some of the solutions I have considered are below:
Use a Java filter along with a parser API (DOM, SAX, etc.) and validate all the input parameters. However, this approach might place a lot of burden on the server.
Before saving the data into our database, validate each parameter on the Java object and, if any of them fails, fail the transaction. This looks fine, but it becomes a maintenance overhead every time we add a new service.
Are there any APIs or JARs that can be integrated with Axis2 or Java to take care of validating the request parameters, rather than doing it manually?
Please suggest what is the best way.
Thanks,
Harika
As you mentioned, approach 2 is the ideal one. You can use the Apache Commons Lang library's StringEscapeUtils, which has methods such as escapeHtml, escapeJavaScript and escapeXml that can neutralize front-end code before it is saved to the database.
This will prevent XSS, but it cannot guarantee protection against SQL injection.
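As a sketch of what the escaping step looks like before persisting user input (hand-rolled here purely for illustration; in a real project you would call StringEscapeUtils.escapeHtml from Commons Lang, which handles far more entities):

```java
// Minimal HTML-escaping sketch. In production, prefer
// org.apache.commons.lang.StringEscapeUtils.escapeHtml; this
// hand-rolled version only handles the characters most
// relevant to stored XSS.
public class InputSanitizer {
    public static String escapeHtml(String input) {
        if (input == null) return null;
        StringBuilder sb = new StringBuilder(input.length());
        for (char c : input.toCharArray()) {
            switch (c) {
                case '<':  sb.append("&lt;");   break;
                case '>':  sb.append("&gt;");   break;
                case '&':  sb.append("&amp;");  break;
                case '"':  sb.append("&quot;"); break;
                case '\'': sb.append("&#39;");  break;
                default:   sb.append(c);
            }
        }
        return sb.toString();
    }
}
```

Running each incoming parameter through a method like this (in a servlet filter, or just before the save) means whatever reaches the database renders as inert text on the UI.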
I need to be able to write a standalone application/function to sign and verify SOAP-based XML messages.
The catch is that this all needs to be done outside of any web services framework. I need to be able to pass my function the SOAP request as a string (XML format) and have it produce an XML string response. The signature must cover several header elements (including custom elements) as well as the SOAP body.
Similarly, I need to be able to validate the signed response by passing in a string of XML and having the function return a boolean.
Most of the documentation I've seen does one of two things:
Code everything by hand, manually altering XML elements (adding ids, namespaces, etc)
Rely on WS frameworks (CXF, JAX-WS, etc.) to do the signing
I think that using a WS framework is significant overkill for my simple needs, but at the same time, I don't want to have to manually alter all the elements by hand.
Is there a fair compromise somewhere? Are there Spring libs that can help me find a middle ground?
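For what it's worth, the JDK ships with the XML Digital Signature API (javax.xml.crypto.dsig, JSR 105), which needs no WS framework at all. Below is a rough string-in/string-out sketch (class and method names are mine) using a single enveloped signature over the whole document; signing specific header elements would additionally require Id attributes and one Reference per element:

```java
import java.io.StringReader;
import java.io.StringWriter;
import java.security.KeyPair;
import java.security.PublicKey;
import java.util.Collections;
import javax.xml.crypto.dsig.CanonicalizationMethod;
import javax.xml.crypto.dsig.DigestMethod;
import javax.xml.crypto.dsig.Reference;
import javax.xml.crypto.dsig.SignedInfo;
import javax.xml.crypto.dsig.Transform;
import javax.xml.crypto.dsig.XMLSignature;
import javax.xml.crypto.dsig.XMLSignatureFactory;
import javax.xml.crypto.dsig.dom.DOMSignContext;
import javax.xml.crypto.dsig.dom.DOMValidateContext;
import javax.xml.crypto.dsig.spec.C14NMethodParameterSpec;
import javax.xml.crypto.dsig.spec.TransformParameterSpec;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class SoapSigner {

    private static Document parse(String xml) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true); // required for XML-DSig
        return dbf.newDocumentBuilder().parse(new InputSource(new StringReader(xml)));
    }

    // Signs the whole document with an enveloped signature and
    // returns the signed XML as a string.
    public static String sign(String soapXml, KeyPair keyPair) throws Exception {
        Document doc = parse(soapXml);
        XMLSignatureFactory fac = XMLSignatureFactory.getInstance("DOM");
        Reference ref = fac.newReference("",
                fac.newDigestMethod(DigestMethod.SHA256, null),
                Collections.singletonList(fac.newTransform(
                        Transform.ENVELOPED, (TransformParameterSpec) null)),
                null, null);
        SignedInfo si = fac.newSignedInfo(
                fac.newCanonicalizationMethod(CanonicalizationMethod.INCLUSIVE,
                        (C14NMethodParameterSpec) null),
                fac.newSignatureMethod(
                        "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256", null),
                Collections.singletonList(ref));
        DOMSignContext ctx =
                new DOMSignContext(keyPair.getPrivate(), doc.getDocumentElement());
        fac.newXMLSignature(si, null).sign(ctx);

        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    // Verifies the first ds:Signature element; returns a boolean.
    public static boolean verify(String signedXml, PublicKey key) throws Exception {
        Document doc = parse(signedXml);
        NodeList nl = doc.getElementsByTagNameNS(XMLSignature.XMLNS, "Signature");
        if (nl.getLength() == 0) return false;
        XMLSignatureFactory fac = XMLSignatureFactory.getInstance("DOM");
        DOMValidateContext ctx = new DOMValidateContext(key, nl.item(0));
        return fac.unmarshalXMLSignature(ctx).validate(ctx);
    }
}
```

In practice the KeyPair would come from a keystore; verification here takes the public key directly rather than resolving it from a KeyInfo element, which is a simplification.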
I have created a web service that accepts certain parameters in the URL, validates them and returns an XML response to the client using the JAXB XML library. Recently I came across the JAX-WS framework, which uses SOAP over HTTP to transfer requests/responses to/from the client. Am I losing something by using a simple servlet + connection pooling and serving an XML response over HTTP? The URL will be hit by client systems that are not known yet.
Is there any advantage to using REST or SOAP when simple HTTP + XML or JSON can solve the problem?
Sounds like you do not have any constraints on what you serve, which leaves you with loads of options. There are quite a few angles from which to approach this question; I'll just list some of the most obvious practical considerations below and leave out any architectural discussion.
SOAP is generally (but not always) used for enterprise integration - but it has a lot of shortcomings - not least in its verbosity. My advice would be not to use SOAP unless you really have no choice in the matter.
XML or JSON? Would your clients prefer to consume JSON or XML? If you are just publishing the service and not consuming it, then I would go with JSON, since it has become a very popular message format for web services.
If you are the only one consuming the response then you need to think about what your client technology/framework would likely prefer to parse and how big the message payload is likely to be. If it is rich (loads of nested objects and attributes) then an XML schema might suit, but if the messages are very large then consider that the JSON footprint is likely to be smaller than the XML one.
You are not missing anything in your approach. Go for the simplest option - which depends on what framework/libraries you are using and the skill-set of your developer(s). If a servlet appears to be the most straightforward to you, go with that. If your data is already in XML then that seems like the way to go, but if not, I would consider publishing JSON format first. If you're not sure how to do that, first have a look at Jackson.
Update:
If you are still not sure which way to go then just serve JSON.
Also, the format of the messages you consume/publish should not dictate your application design. Whether or not to use a servlet does not really factor into which message format to use, unless you intend to use a framework that ties you to a particular approach. For example, the Play Framework will very easily let you serve JSON from a controller (not a servlet), so if you were using that framework you would not use a servlet, and JSON would be the easiest way to go because of the out-of-the-box support the framework already provides.
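To make the "simplest option" concrete, here is a sketch of serving a JSON response with nothing but the JDK's built-in com.sun.net.httpserver (the path and payload are invented for illustration; a servlet or a framework controller would look much the same):

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class JsonEndpoint {
    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/status", (HttpExchange ex) -> {
            // Hand-built JSON; in practice a mapper such as Jackson
            // would serialize your objects instead.
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            ex.getResponseHeaders().set("Content-Type", "application/json");
            ex.sendResponseHeaders(200, body.length);
            try (OutputStream os = ex.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

Passing port 0 lets the OS pick a free port, which is handy for local testing.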
I have a Java client that allows indexing documents on a local ElasticSearch server.
I now want to build a simple Web UI that allows users to query the ES index by typing in some text in a form.
My problem is that, before calling ES APIs to issue the query, I want to preprocess the user input by calling some Java code.
What is the easiest and "cleanest" way to achieve this?
Should I create my own APIs so that the UI can access my Java code?
Should I build the UI with JSP so that I can directly call my Java code?
Can I somehow make Elasticsearch execute my Java code before the query is executed? (Perhaps by creating my own Elasticsearch plugin?)
In the end, I opted for the simple solution of using Json-based RESTful APIs. Time proved this to be quite flexible and effective for my case, so I thought I should share it:
My Java code exposes its ability to query an Elasticsearch index by running an HTTP server and responding to client requests with JSON-formatted ES results. I created the HTTP server with a few lines of code, using the JDK's built-in com.sun.net.httpserver.HttpServer. There are more serious/complex HTTP servers out there (such as Tomcat), but this one was very quick to adopt and required zero configuration headaches.
My Web UI makes HTTP GET requests to the Java server, receives Json-formatted data and consumes it happily. My UI is implemented in PHP, but any web language does the job, as long as you can issue HTTP requests.
This solution works really well in my case because it has no dependencies on ES plugins. I can do any sort of pre-processing before calling ES, and even post-process the ES output before sending the results back to the UI.
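The pre-processing step is whatever suits your domain. As an illustrative sketch (the class name is mine, and the exact escaping rule is my assumption, loosely following the reserved characters of an Elasticsearch query_string query), this normalizes user input before it is placed into the ES request:

```java
public class QueryPreprocessor {
    // Characters reserved in an Elasticsearch query_string query
    // (approximate list, for illustration).
    private static final String RESERVED = "+-=&|><!(){}[]^\"~*?:\\/";

    // Example pre-processing: trim, lower-case, and escape reserved
    // characters so raw user input cannot change the query semantics.
    public static String preprocess(String userInput) {
        String q = userInput.trim().toLowerCase();
        StringBuilder sb = new StringBuilder(q.length());
        for (char c : q.toCharArray()) {
            if (RESERVED.indexOf(c) >= 0) sb.append('\\');
            sb.append(c);
        }
        return sb.toString();
    }
}
```

The HTTP handler would call preprocess() on the form input, build the ES query JSON from the result, forward it to the ES REST endpoint, and relay the response to the UI.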
Depending on the type of pre-processing, you can create an Elasticsearch plugin as a custom analyser or custom filter: you essentially extend the appropriate Lucene class(es) and wrap everything into an Elasticsearch plugin. Once the plugin is loaded, you can configure the custom analyser and apply it to the relevant fields. There are a lot of analysers and filters already available in Elasticsearch, so you might want to have a look at those before writing your own.
Elasticsearch plugins: https://www.elastic.co/guide/en/elasticsearch/reference/1.6/modules-plugins.html (a list of known plugins at the end)
Defining custom analysers: https://www.elastic.co/guide/en/elasticsearch/guide/current/custom-analyzers.html
Question is,
What is the accepted 'elegant' solution for parsing URL strings to select a response function in Jetty?
I've provided some background, but feel free to skip my waffle!
The situation is that I've written a bunch of client/server code in Java, using a socket connection to communicate serialized Java objects. Obviously, this depended on both the client and server being in Java, and while that was fine to start with I wish to make the system more universal and resilient to firewalls.
Therefore, I've decided to create a REST API to communicate the data over HTTP.
I DON'T want to produce any HTML - all responses will be JSON-encoded. Therefore, I'd rather not create individual files for each response - instead, I'd like to redirect each HTTP request, based on its type and URL, to a Java method which will produce a JSON response.
Unfortunately my only experience of programming servers is using Python/Django, and I'd rather not change language at this point.
So my question is very simple - how do I set up an embedded Jetty server to parse URLs? I've tried reading the docs, but had trouble making sense of this. At the moment, it seems there are two possibilities.
One is the target string in the handle method of the AbstractHandler javadoc, which I believe contains the entire URL string? So I could manually parse this in a Java method, to which all web requests would be directed.
The other is to use a chain of ServletContextHandlers? I'm not precisely sure how that would be done, or whether that is the intended use of the class.
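Whichever Jetty mechanism ends up carrying the request, the dispatch itself can be plain Java: a map from path to handler function that the single handle() method consults. A framework-agnostic sketch (the Router class and its JSON payloads are invented for illustration, not Jetty API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class Router {
    private final Map<String, Function<Map<String, String>, String>> routes =
            new HashMap<>();

    // Register a path -> JSON-producing function.
    public void register(String path, Function<Map<String, String>, String> fn) {
        routes.put(path, fn);
    }

    // Inside Jetty's AbstractHandler.handle(target, ...), 'target' holds
    // the request path; look it up here and write the returned JSON
    // to the response.
    public String dispatch(String target, Map<String, String> params) {
        Function<Map<String, String>, String> fn = routes.get(target);
        return fn == null ? "{\"error\":\"not found\"}" : fn.apply(params);
    }
}
```

This keeps all routing in one place, so the Jetty handler reduces to extracting the target and parameters and writing whatever dispatch() returns.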
I work on an application that uses Spring MVC and Hibernate. I am implementing some RESTful web services and am curious how to easily filter collections server side.
As an example, I want to be able to filter a collection of employee entities. I have researched several options, such as RQL, the way Google handles custom searches, eBay's answer, and even Yahoo's YQL. They all seem to be good answers to the filtering question, but I cannot seem to find any libraries that would let me easily implement the concept.
I did find here, that:
Apache CXF introduced FIQL support with its JAX-RS implementation since 2.3.0 release
but we are already using Spring MVC.
I'm surprised there is no library for taking a query string like the one below, for example, and translating it into SQL or something Hibernate can use to filter.
/employees?lastname=john OR jon&hiredate lt 20010201
It is entirely possible that I am thinking of this incorrectly, but I wanted to tap into the community's experience and knowledge. What am I missing?
I know this is old, but for completeness, there are at least two libraries that handle parsing RQL:
https://github.com/jazdw/rql-parser
https://github.com/jirutka/rsql-parser (not quite RQL by default, but configurable)
I'm using jazdw/rql-parser myself and have started working on an SQL mapper, but as Oleksi mentioned, there is a lot of custom code required for validation, field mapping, etc., so I don't know how generic I can make it yet.
A library that directly converts a GET like that into SQL could be very insecure. You need to have an intermediate layer to do some validation to make sure that the user isn't messing with the URL to execute a SQL injection.
As far as I know, the best you can do is use your JAX-RS implementation to cleanly read in those query parameters, validate them, and use something like a prepared SQL statement to securely execute them.
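A sketch of that intermediate layer: whitelist the filterable fields, map them to real column names, and emit a parameterized WHERE clause whose values are only ever bound through a PreparedStatement (the table, columns, and equality-only semantics here are invented for the example; a real RQL/FIQL mapper would also handle operators like lt/gt):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FilterBuilder {
    // Whitelist: query parameter -> actual column name. Anything not
    // listed here is rejected, which blocks injection via field names.
    private static final Map<String, String> ALLOWED = new LinkedHashMap<>();
    static {
        ALLOWED.put("lastname", "last_name");
        ALLOWED.put("hiredate", "hire_date");
    }

    public final String sql;
    public final List<String> params = new ArrayList<>();

    // 'filters' should preserve insertion order (e.g. a LinkedHashMap)
    // so the generated SQL is deterministic.
    public FilterBuilder(Map<String, String> filters) {
        StringBuilder where = new StringBuilder("SELECT * FROM employees");
        String sep = " WHERE ";
        for (Map.Entry<String, String> e : filters.entrySet()) {
            String column = ALLOWED.get(e.getKey());
            if (column == null) {
                throw new IllegalArgumentException("unknown filter: " + e.getKey());
            }
            where.append(sep).append(column).append(" = ?");
            params.add(e.getValue()); // bound via PreparedStatement, never concatenated
            sep = " AND ";
        }
        this.sql = where.toString();
    }
}
```

The caller would pass sql to Connection.prepareStatement and bind each entry of params positionally, so user-supplied values never touch the SQL text.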