Cross Domain Error with self built API - java

The server sending the JSON is a Tomcat server built with Gradle (it is written in Java).
I am having trouble making an API call with Angular. I know my API is working because I can view the response in Postman.
var app = angular.module("todo", []);
app.controller("AppCtrl", function($http) {
    $http.get("192.168.5.100:8080/aggregators/datafile")
        .success(function(data) {
            console.log(data);
        });
});
When I run it I get the following error:
XMLHttpRequest cannot load 192.168.5.100:8080/aggregators/datafile. Cross origin requests are only supported for HTTP.

The problem you're running into is that you can't make cross origin requests from the browser without CORS or using JSONP.
Postman operates outside of the context of the browser (as if you had issued a cURL request, if you're familiar with cURL).
This is for security reasons.
So, how do you implement JSONP? It really depends on the server, but in general, your resource would look for a GET request that had a pre-determined querystring parameter (normally callback for simplicity):
http://192.168.5.100:8080/aggregators/datafile?callback=mycallback
How do you make a JSONP call?
The server wraps the JSON in that callback, causing it to look something like the following:
mycallback({json:object});
This Stack Overflow answer goes into more detail.
The callback is the function the browser should hit when the request is executed, and that's what allows for cross-domain requests.
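On the server side, wrapping the JSON in the callback can be as simple as a small helper; the class, method, and parameter names below are made up for illustration, and the sketch assumes you already have the JSON as a string:

```java
// Minimal JSONP padding sketch: if a "callback" query parameter was
// supplied, wrap the JSON in it; otherwise return the raw JSON.
public class JsonpHelper {
    public static String wrap(String callback, String json) {
        if (callback == null || callback.isEmpty()) {
            return json; // plain JSON response
        }
        return callback + "(" + json + ");"; // e.g. mycallback({"a":1});
    }

    public static void main(String[] args) {
        System.out.println(wrap("mycallback", "{\"a\":1}"));
    }
}
```

In a servlet you would read the `callback` request parameter, pass it through a helper like this, and write the result with a `application/javascript` content type.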
Now, on to CORS.
CORS is a system that lets the browser communicate with the server to determine whether or not it should accept a cross-domain request. It's a bit complicated, but in general it involves setting up certain headers on your API server, and then executing the Ajax request in a particular fashion (for jQuery, use the withCredentials property for $.ajax). The server checks where the request is from, and if it's a valid source, it lets the browser know and the browser allows the request (I'm being simplistic).
MDN has a thorough explanation of CORS that is worth reading.
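Since the API here is served by Tomcat, the simplest route is Tomcat's built-in CorsFilter, which sets those headers for you. A minimal web.xml fragment might look like the following; the allowed origin is an example and should be set to whatever host your Angular page is served from:

```xml
<filter>
  <filter-name>CorsFilter</filter-name>
  <filter-class>org.apache.catalina.filters.CorsFilter</filter-class>
  <init-param>
    <param-name>cors.allowed.origins</param-name>
    <param-value>http://localhost:8000</param-value>
  </init-param>
  <init-param>
    <param-name>cors.allowed.methods</param-name>
    <param-value>GET,POST,OPTIONS</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>CorsFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
```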

Related

Exposing a web site through web services

I know what I am asking is somewhat weird. There is a web application (whose source code we don't have access to), and we want to expose a few of its features as web services.
I was thinking to use something like Selenium WebDriver, so I simulate web clicks on the application according to the web service request.
I want to know whether there is a better solution or pattern for doing this.
I should mention that the application is written using Java, Spring MVC (it is not an SPA) and Spring Security. And there is a CAS server providing SSO.
There are multiple ways to implement this. In my opinion Selenium/PhantomJS is not the best option: if the site is properly designed, you can interact with it using only the HTML it serves, or even an existing API, without needing the CSS or executing the asynchronous JavaScript requests. Since your page is not an SPA, it's quite likely that an "API" already exists in the form of GET/POST requests, and you might be lucky enough that there's no CSRF protection.
First of all, you need to solve authentication against the CAS server. There are multiple grant types in OAuth, but essentially you should obtain an API token that gives you access to the application. This token should be added as an HTTP header or cookie on every single request. Ideally the token shouldn't expire; otherwise you'll need to implement re-authentication logic in your app.
Once the authentication part is resolved, you'll need quite a lot of patience: open the target website with the web inspector of your preferred browser, go to the Network panel, and execute the actions you want to run programmatically. There you'll find each request with all its headers and content, along with the corresponding response.
That's what you need to code. There are plenty of libraries to achieve that in Java. You can have a look at Jsoup if you need to parse HTML, but to run plain GET/POST requests, go for RestTemplate (in Spring) or a JAX-RS/Jersey 2 client.
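If you'd rather avoid a Spring dependency, the JDK's built-in HttpClient (Java 11+) can replay the same requests you captured in the Network panel. A sketch, where the host, cookie, and body are placeholders for your own values:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ReplayRequest {
    // Build a request mirroring what you saw in the Network panel;
    // the host, cookie and body below are placeholders.
    public static HttpRequest build() {
        return HttpRequest.newBuilder()
                .uri(URI.create("https://app.example.com/orders/search"))
                .header("Cookie", "JSESSIONID=placeholder")
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString("query=test"))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest request = build();
        // To actually send it:
        // HttpResponse<String> r = HttpClient.newHttpClient()
        //         .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(request.method() + " " + request.uri());
    }
}
```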
You might consider implementing a cache layer to increase performance if the result of a query is stable over time, or if you can assume that, for say 5 minutes, the response to the same query will be unchanged.
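A 5-minute cache of that sort can be sketched with a ConcurrentHashMap and a timestamp per entry; the class and names here are made up for illustration:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

public class TimedCache {
    private static final long TTL_MILLIS = 5 * 60 * 1000; // 5 minutes

    private static class Entry {
        final String value;
        final long storedAt;
        Entry(String value, long storedAt) {
            this.value = value;
            this.storedAt = storedAt;
        }
    }

    private final Map<String, Entry> entries = new ConcurrentHashMap<>();

    // Return the cached value for the query if it is still fresh,
    // otherwise run the loader (the real scrape) and cache its result.
    public String get(String query, Supplier<String> loader) {
        long now = System.currentTimeMillis();
        Entry e = entries.get(query);
        if (e != null && now - e.storedAt < TTL_MILLIS) {
            return e.value;
        }
        String value = loader.get();
        entries.put(query, new Entry(value, now));
        return value;
    }
}
```

In a Spring Boot app you would more likely lean on a real cache provider (as the next answer suggests), but the principle is the same.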
You can create your app in your favourite language/framework. I'd recommend starting with Spring Boot + MVC + DevTools; that contains all you need, plus Jsoup if you need to parse some HTML. Later on you can add a cache provider if needed.
We do something similar to access web banking on behalf of a user, scrape their account data, and obtain a credit score. In most cases we have managed to reverse-engineer the mobile apps and sniff their traffic to use undocumented APIs. In other cases we have to fall back to web scraping.
There are two other types of applications you may have to scrape:
Data is essentially the same for any user, like product listings on Amazon
Data is specific to each user, like in a banking app
In the first case, you could have your scraper running and populating a local database, and use that local data to provide the web service. In the latter case you cannot do that; you need to scrape the site on the user's request.
I understand from your explanation that you are in this latter case.
When web scraping you can find really difficult web apps:
Some may require you to send data from previous requests to the next
Others render most data on the client with JavaScript
If either of these is your case, Selenium will make your implementation easier, though not performant.
Implementing the first without Selenium will require a lot of trial and error to get things working, because you will be simulating the requests and you will need to know what data the server expects from the client. If you use Selenium, you will be executing the same interactions that you perform in the browser, and hence sending the expected data.
Implementing the second case requires your scraper to support JavaScript. AFAIK the best support is provided by Selenium. HtmlUnit claims to provide fair support, and I think Jsoup provides no JavaScript support at all.
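The "data from previous requests" case usually means lifting a hidden field (for instance a CSRF token) out of one response and replaying it in the next. Without Selenium that can be done with a small regex over the HTML; the field name here is just an example, and a real HTML parser like Jsoup would be more robust:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TokenExtractor {
    // Pull the value of a hidden input out of an HTML page, e.g.
    // <input type="hidden" name="_csrf" value="abc123"/>.
    // This naive regex is only a sketch; use a parser for real pages.
    public static String hiddenValue(String html, String fieldName) {
        Pattern p = Pattern.compile(
                "name=\"" + Pattern.quote(fieldName) + "\"\\s+value=\"([^\"]*)\"");
        Matcher m = p.matcher(html);
        return m.find() ? m.group(1) : null;
    }
}
```

The extracted value would then be added as a parameter (or header) on the next request you replay.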
Finally, if your solution takes too much time, you can mitigate the problem by providing your web service with a notification mechanism, similar to Webhooks or REST Hooks:
A client of your web service would make a request for data, providing a URI at which they would like to be notified when the results are ready.
Your service would respond immediately with an id for the request and start scraping the necessary info in the background.
If you use a skinny-payload model, then when the scraping is done you store the response in your data store under an id identifying the original request. This response will be exposed as a resource.
You would then execute an HTTP POST to the URI provided by the client, adding the URI of the response resource in the body of the request.
The client can now GET the response resource, and because the request and the response share the same id, the client can correlate the two.
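The correlation part of that flow can be sketched as a store keyed by request id; the class and method names are illustrative, and the notification POST itself is elided:

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

public class ScrapeJobs {
    private final Map<String, String> responses = new ConcurrentHashMap<>();

    // Accept a request and hand back an id right away; the actual
    // scraping would run in the background.
    public String submit() {
        return UUID.randomUUID().toString();
    }

    // When the scrape finishes, store the result under the request id;
    // this is the resource the client's notification URI will point at.
    public void complete(String id, String result) {
        responses.put(id, result);
    }

    // The client GETs the response resource by the shared id.
    public String fetch(String id) {
        return responses.get(id); // null while the scrape is still running
    }
}
```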
Selenium isn't the best way to consume web services. Selenium is primarily an automation tool, largely used for testing applications.
Assuming the services are already developed, the first thing we need to do is authenticate the user's request.
This can be done by adding an HTTP header with the key "Authorization" and the value "Basic " + Base64Encode(username + ":" + password).
If the user is valid (the login credentials match the credentials on the server), then generate a unique token, store the token on the server mapped to the user id, and set the same token in the response header or in a cookie.
By doing this we can avoid validating credentials on subsequent requests from the same user, by just looking for the token in the request header or cookie.
If the services are designed to check the login every time, the "Authorization" header needs to be set on every request.
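That header value is just Base64 over "username:password"; a minimal sketch in Java (the class name is illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuth {
    // Build the value for the "Authorization" request header.
    public static String header(String username, String password) {
        String credentials = username + ":" + password;
        return "Basic " + Base64.getEncoder()
                .encodeToString(credentials.getBytes(StandardCharsets.UTF_8));
    }
}
```

Note that Base64 is encoding, not encryption, which is why Basic auth should only ever travel over HTTPS.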
I think using a WebDriver is a lot of overhead, but it depends on what you really want to achieve. With the info you provided I would rather go with a RestTemplate implementation that sends the appropriate HTTP messages to the existing webapp, wrap it with a nice @Service layer, and build your web service (REST or SOAP) on top of it.
The authentication is a matter of configuration; you can pack this in a microservice with @EnableOAuth2Sso, and your RestTemplate bean, thanks to Spring Boot, will handle the underlying auth part for you.
Maybe overkill... but RPA? http://windowsitpro.com/scripting/review-automation-anywhere-enterprise

IntelliJ Static Web Project to Tomcat or Angular CORs

I have a static web angular project in IntelliJ IDEA. The static page gets deployed to http://localhost:63342/Calculator/app/index.html. I have run into a problem where I try to post some data to a server to get a response back but when I try to post I get this error:
XMLHttpRequest cannot load <url>. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:63342' is therefore not allowed access. The response had HTTP status code 401.
Here is my post angular code:
WebIdServer.prototype.getId = function(id) {
  var _this = this;
  var request = {
    method: 'POST',
    url: 'https://<url>',
    headers: {
      'Authorization': 'Bearer QWE234J234JNSDFMNNKWENSN2M3',
      'Content-Type': 'application/json'
    },
    data: {
      id: id
    }
  };
  _this.$log.debug(request);
  return _this.$http(request)
    .success(function(data, status, headers, config) {
      _this.$log.debug("Successful request.");
      /* called for result & error because 200 status */
      _this.uid = data.id;
      _this.$log.debug(_this.uid);
    })
    .error(function(data, status, headers, config) {
      _this.$log.debug("Something went wrong with the request.");
      _this.$log.debug(data);
      /* handle non-200 statuses */
    });
};
I know for a fact that post works because I tried it on a local url of my application that I had running on a different port.
So my question is, since I can't post from localhost I was wondering if maybe deploying this to a tomcat server would fix things. If so, how do you deploy this static web project to a tomcat server? If that's not necessary, then how do I get around this problem I'm having?
There are a few things to know about CORS. It's the web browser telling you that you cannot make a particular call. This is only a front-end problem; a script running on a server can call any API regardless of location. Three different options:
without config; same hosts
Without any configuration on your server, your front end's AJAX requests need to match both the domain and the port of the service you're calling. In your case, your angular app at http://localhost:63342 should be calling a service also hosted on http://localhost:63342 and then you're sweet. No CORS issues.
with server side config; different hosts
If the API is hosted elsewhere, you'll have to configure the API host. Most servers will let you configure access controls, to allow a particular domain to bypass the CORS block. If you have access to the server you're trying to call, see if you can set it up. The enable CORS website has examples for most servers. Usually this is pretty simple.
Create a proxy
This is your Tomcat idea. CORS is only a problem if your front end calls another service. A server script can call anything it likes. So, you could use Tomcat (or Apache, or NGINX, or NodeJS...) to host a script that'll pass on the request. Essentially, all it needs to do is add Access-Control-Allow-Origin: * to the response of the API.
I have never used Tomcat myself, but here's a blog post that might have some useful info on how to do that. Combine it with the info on enable CORS and you should be able to route anything to anywhere.
This process is common. Just look at the popularity of a Node package like CORS Anywhere, which does what your Tomcat proxy would do.
As a disclaimer, how good of an idea this is depends on how you can pass along the credentials and tokens. You don't really want to create a service that'll just blindly call someone else's API with your credentials.
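The essence of such a proxy, whichever server you pick, is a handler that forwards the call and adds the header. A toy sketch with the JDK's built-in HttpServer follows; the upstream call is stubbed out with a fixed body, and `*` should be narrowed to a specific origin in anything real:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;

public class CorsProxy {
    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/proxy", exchange -> {
            // A real proxy would forward the request to the target API
            // here and stream back its response; we stub a fixed body.
            byte[] body = "{\"ok\":true}".getBytes();
            exchange.getResponseHeaders().add("Access-Control-Allow-Origin", "*");
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = start(0); // 0 = pick any free port
        System.out.println("proxy on port " + server.getAddress().getPort());
        server.stop(0);
    }
}
```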

How to get cross-domain JSON from jQuery?

I am trying to get JSON (getJSON()) from server that doesn't have support for jsonp implemented. Namely, when adding callback=? to the URL, the server does return the data, but it returns pure JSON without padding.
I understand this is something that must be corrected server-side - there is no way to resolve it in jQuery. Is this correct?
If CORS is not supported by the server, and neither is JSONP, you might try a proxy approach in such cases. One example is http://www.corsproxy.com/; there are other proxy alternatives too.
What does it do?
CORS Proxy allows javascript code on your site to access resources on other domains that would normally be blocked due to the same-origin policy.
How does it work?
CORS Proxy takes advantage of Cross-Origin Resource Sharing, which is a feature that was added along with HTML 5. Servers can specify that they want browsers to allow other websites to request resources they host. CORS Proxy is simply an HTTP Proxy that adds a header to responses saying "anyone can request this".

JQuery post to a Tomcat Servlet

I am trying to make a jQuery $.post to a Java servlet. I integrated the Tomcat server into
Apache, and if the Tomcat server is on the same machine as Apache, the $.post succeeds
(the Java servlet receives it).
If the Tomcat servlet is on a remote machine and if I make $.post(http://ip:8080/App/MyServlet,...) the servlet doesn't receive anything.
If I make a JQuery $.post on my machine I have like this $.post(Myservlet,.....).
If I try like this : $.post(http://localhost:8080/App/MyServlet,...) it doesn't work.
How should I make a JQuery $.post to a remote uri?
How should the remote uri for a Tomcat Servlet look like?
Thanks,
jQuery runs in the browser (client-side), which means it's subject to the browser's same-origin policy, which is a good thing.
This means GET or POST ajax requests can only be made to the domain of the page making the request.
There are 2 ways to bypass the policy. The first is to have the remote server vouch for the request, the second is to sneak around the browser's same-origin policy.
So if you have control over the remote server, or if the admin who does will take requests to open the server/domain to foreign ajax requests, then the server just needs to send the following header:
Access-Control-Allow-Origin: your-local-domain.org
The browser gets back the response header, sees that the requesting page is in the above list, and allows the response through.
If you have no control over the remote server, here are the sneakier ways to get around same-origin policy:
Make an ajax request to a local URL with the parameters, have it pass them along to the servlet, and have that proxy script return whatever the servlet responds with.
JSONP (which I'm still fuzzy on, honestly, but jquery's ajax documentation goes into it)
Script injection, where you leverage the fact that the script element's src is not limited by the same-origin policy.
Of the 3, I think the first is the safest, least hackish, and most honest (so to speak), but JSONP has become the simple and easy way to pull off a cross-domain request in jQuery.

Determining the referer URL of a JSONP call

I have a jquery plugin and I'm using jsonp for crossdomain call to a jsp file.
I want to restrict the JSP's return values to specific websites in our database.
To achieve this I need to somehow get the IP or URL of the website from which the JSONP call was triggered, not the client/user IP. I've tried the Referer value in the HTTP header, but this won't work with IE, and I guess it's not the best solution either.
How can I securely know who is calling my JSP file with my plugin, from their website?
Thanks in advance.
The simplest answer would be to issue each website a unique key or other identifier that they include in their request. You parse this identifier and flex your response appropriately.
However with a request originating from the client browser, you would have to be careful and would have to evaluate what you mean by how "securely" you need the request to be handled. (since the untrusted client would be making the request it would be a simple task to harvest and reuse such an identifier)...
Referrer (if present) could be used as a double check, but as you pointed out, this is unreliable and coming from an untrusted client computer, this portion of the request could be faked as well.
If we could assume some server side processing by the website owners, you could have them implement a proxy for the jsonp call (which would ensure such a token would never fall into the hands of the browser)... but we'd have to know if such a safeguard would really be worth it or not :)
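A minimal version of that key check on the server side; the key registry and site names are made up for illustration:

```java
import java.util.Map;

public class SiteKeys {
    // Keys you issued, mapped to the website they belong to.
    private static final Map<String, String> ISSUED = Map.of(
            "key-123", "examplesite.com",
            "key-456", "othersite.org");

    // Look up which site (if any) this request's key belongs to,
    // so the JSP can flex its response accordingly.
    public static String siteFor(String apiKey) {
        return ISSUED.get(apiKey);
    }
}
```

As the answer notes, a key sent from the browser can be harvested and reused, so this only identifies the caller; it does not authenticate them.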
