I keep facing this question from my manager: how will SSO work if the client disables cookies? I don't have an answer. We are currently using JOSSO for single sign-on. Is there any open source framework which supports single sign-on without using a cookie mechanism?
In the absence of cookies, you're going to have to embed some parameter in each URL request, e.g. after logging in you assign some arbitrary id to a user and embed it in every link, such as http://mydomain.com/main?sessionid=123422234235235. It can get pretty messy, since every link has to be fixed up before it goes out the door, which slows down your content. It also has security, logging and session history implications which are not such a big deal when the state is in a cookie.
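Incidentally, the Servlet API has built-in support for this fallback: containers can rewrite the session id into any link passed through encodeURL(). A minimal sketch, with illustrative paths:

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch: when the client refuses cookies, the container falls back to
// URL rewriting (";jsessionid=...") for links passed through encodeURL().
public class LinkServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        req.getSession(true); // make sure a session exists
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        // Every outgoing link must be rewritten this way, or the session is lost.
        out.println("<a href=\"" + resp.encodeURL("/main") + "\">Main</a>");
    }
}
```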
It may be simpler to do a simple cookie test on logged-in users and send them off to an error page if they do not have cookies enabled.
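A hypothetical servlet sketch of such a test: drop a probe cookie with the login form, and check that it came back when the form is submitted.

```java
import java.io.IOException;
import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class LoginServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.addCookie(new Cookie("cookie-test", "1")); // probe cookie
        // ... render the login form ...
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        boolean cookiesEnabled = false;
        Cookie[] cookies = req.getCookies();
        if (cookies != null) {
            for (Cookie c : cookies) {
                if ("cookie-test".equals(c.getName())) {
                    cookiesEnabled = true;
                }
            }
        }
        if (!cookiesEnabled) {
            resp.sendRedirect("/cookies-required.html"); // hypothetical error page
            return;
        }
        // ... proceed with authentication ...
    }
}
```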
The CAS project passes a "ticket" from the sign-on server to the consuming application as a URL query parameter; the consuming app then makes a back-channel request to the sign-on server to validate the ticket's authenticity. This removes the need for cookies and therefore works across domains, although it is a bit "chatty".
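For illustration, the back channel boils down to one GET against the CAS server's serviceValidate endpoint. A sketch, assuming Java 11+ and a placeholder server URL; a real client should parse the XML response rather than string-match it:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Sketch of the CAS 2.0 back-channel validation step; the CAS server
// and service URLs are placeholders for your own deployment.
public class CasTicketValidator {
    private static final String CAS = "https://sso.example.com/cas"; // hypothetical

    public static boolean validate(String ticket, String serviceUrl) throws Exception {
        String url = CAS + "/serviceValidate"
                + "?service=" + URLEncoder.encode(serviceUrl, StandardCharsets.UTF_8)
                + "&ticket=" + URLEncoder.encode(ticket, StandardCharsets.UTF_8);
        HttpResponse<String> resp = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        // Checking for the success element is enough to show the shape
        // of the exchange; parse the XML properly in real code.
        return resp.body().contains("<cas:authenticationSuccess");
    }
}
```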
Another, arguably more robust, solution is to use a product based on SAML, which is an industry standard for cross-domain single sign-on. There are a couple of open source products out there which use SAML, and CAS itself has a SAML extension, but they are typically quite complex to set up. Cloudseal is also based on SAML and is much simpler to use. The Cloudseal platform itself is delivered as a managed service, but all the client libraries are open source.
Of course, with all these solutions you are simply passing a security context from one server to another; the consuming application will no doubt create its own local session, so you would then need to use URL rewriting instead of cookies there too.
Disclaimer: I work for Cloudseal :)
I know what I am asking is somewhat weird. There is a web application (to whose source code we don't have access), and we want to expose a few of its features as web services.
I was thinking of using something like Selenium WebDriver, so I can simulate web clicks on the application according to the web service request.
I want to know whether there is a better solution or pattern for doing this.
I should mention that the application is written using Java, Spring MVC (it is not a SPA) and Spring Security, and there is a CAS server providing SSO.
There are multiple ways to implement it. In my opinion Selenium/PhantomJS is not the best option: if the web app is properly designed, you can interact with it using only the provided HTML, or even some API, rather than needing all the CSS and executing the JavaScript async requests yourself. As your page is not a SPA, it's quite likely that an "API" already exists in the form of GET/POST requests, and you might be lucky enough that there's no CSRF protection.
First of all, you need to solve the authentication against the CAS server. There are multiple types of authentication (OAuth defines several, for instance), but essentially you need to get an API token that gives you access to the application. This token should be added as an HTTP header or cookie on every single request. Ideally this token shouldn't expire; otherwise you'll need to implement re-authentication logic in your app.
Once the authentication part is resolved, you'll need quite a lot of patience: open the target website with the web inspector of your preferred web browser, go to the Network panel, and execute the actions that you want to run programmatically. There you'll find each request with all its headers and content, and the corresponding response.
That's what you need to code. There are plenty of libraries to achieve that in Java. You can have a look at Jsoup if you need to parse HTML, but to run plain GET/POST requests, go for RestTemplate (in Spring) or the JAX-RS/Jersey 2 Client.
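As a rough sketch of replaying such a captured request with RestTemplate (the URL, form field and cookie are placeholders for whatever you observe in the Network panel):

```java
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

public class ScrapeClient {
    private final RestTemplate rest = new RestTemplate();

    public String submitForm(String sessionCookie) {
        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.APPLICATION_FORM_URLENCODED);
        // The auth token/cookie obtained earlier goes on every request.
        headers.add(HttpHeaders.COOKIE, "JSESSIONID=" + sessionCookie);

        MultiValueMap<String, String> form = new LinkedMultiValueMap<>();
        form.add("action", "export"); // hypothetical field seen in the inspector

        ResponseEntity<String> resp = rest.postForEntity(
                "https://target.example.com/app/doAction", // hypothetical URL
                new HttpEntity<>(form, headers), String.class);
        return resp.getBody(); // HTML you can then parse with Jsoup
    }
}
```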
You might consider implementing a cache layer to increase performance if the result of a query is stable over time, or if you can assume that, for, say, 5 minutes, the response to the same query will be the same.
You can create your app in your favourite language/framework. I'd recommend starting with Spring Boot + MVC + DevTools. That contains all you need, plus Jsoup if you need to parse some HTML. Later on you can add a cache provider if needed.
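If you do use Spring Boot, the cache layer from two paragraphs above can be little more than an annotation on the scraping method; a minimal sketch (cache and method names are illustrative, and the TTL is configured on whichever cache provider you plug in):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.stereotype.Service;

@SpringBootApplication
@EnableCaching
public class ScraperApp {
    public static void main(String[] args) {
        SpringApplication.run(ScraperApp.class, args);
    }
}

@Service
class QueryService {
    // Identical queries within the cache's lifetime return the stored
    // result instead of hitting the target site again; expiry is set on
    // the provider (e.g. Caffeine's expireAfterWrite).
    @Cacheable("queries")
    public String lookup(String query) {
        return scrape(query); // your RestTemplate/Jsoup call goes here
    }

    private String scrape(String query) {
        return "..."; // placeholder for the slow remote call
    }
}
```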
We do something similar to access web banking on behalf of a user, scrape his account data and obtain a credit score. In most cases, we have managed to reverse-engineer mobile apps and sniff traffic to use undocumented APIs. In others, we have to fall back to web scraping.
Broadly, there are two types of applications you may have to scrape:
Data is essentially the same for any user, like product listings on Amazon
Data is specific to each user, like in a banking app
In the first case, you could have your scraper running and populating a local database, and use your local data to provide the web service. In the latter case, you cannot do that, and you need to scrape the site on the user's request.
I understand from your explanation that you are in this latter case.
When web scraping, you can run into really difficult web apps:
Some may require you to send data from previous requests in the next one
Others render most data on the client with JavaScript
If either of these is your case, Selenium will make your implementation easier, though not performant.
Implementing the first without Selenium will require a lot of trial and error to get things working, because you will be simulating the requests and you will need to know what data the server expects from the client. If you use Selenium, on the other hand, you will be executing the same interactions that you perform with the browser, and hence sending the expected data.
Implementing the second case requires your scraper to support JavaScript. AFAIK the best support is provided by Selenium. HtmlUnit claims to provide fair support, and I think Jsoup provides no JavaScript support at all.
Finally, if your solution takes too much time, you can mitigate the problem by providing your web service with a notification mechanism, similar to Webhooks or Resthooks (a sketch follows the list below):
A client of your web service would make a request for data, providing a URI at which they would like to be notified when the results are ready.
Your service would respond immediately with an id for the request and start scraping the necessary info in the background.
If you use the skinny payload model, then when the scraping is done you store the response in your data store with an id identifying the original request. This response is exposed as a resource.
You would then execute an HTTP POST on the URI provided by the client, adding the URI of the response resource to the body of the request.
The client can now GET the response resource, and because the request and the response share the same id, the client can correlate the two.
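A minimal Spring MVC sketch of that flow, with illustrative endpoint paths and an in-memory map standing in for a real data store:

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@RestController
public class ScrapeRequestController {

    private final Map<String, String> responses = new ConcurrentHashMap<>();
    private final RestTemplate rest = new RestTemplate();

    // 1-2. Client asks for data and provides a callback URI; we answer
    //      immediately with the id of the request.
    @PostMapping("/requests")
    public ResponseEntity<String> submit(@RequestParam String callbackUri) {
        String id = UUID.randomUUID().toString();
        CompletableFuture.runAsync(() -> scrapeAndNotify(id, callbackUri));
        return ResponseEntity.accepted().body(id);
    }

    // 3-4. Scrape in the background, store the result under the request
    //      id, then POST the URI of the response resource (skinny payload).
    private void scrapeAndNotify(String id, String callbackUri) {
        responses.put(id, scrape());
        rest.postForEntity(callbackUri, "/responses/" + id, String.class);
    }

    // 5. The stored response is exposed as a resource the client can GET,
    //    correlating it with the original request by id.
    @GetMapping("/responses/{id}")
    public String response(@PathVariable String id) {
        return responses.get(id);
    }

    private String scrape() {
        return "..."; // the slow scraping work goes here
    }
}
```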
Selenium isn't the best way to consume web services. Selenium is primarily an automation tool, largely used for testing applications.
Assuming the services are already developed, the first thing we need to do is authenticate the user request.
This can be done by adding an HTTP header with key "Authorization" and value "Basic " + Base64Encode(username + ":" + password).
If the user is valid (the login credentials match the credentials on the server), then generate a unique token, store the token on the server mapped to the user id, and set the same token in the response header or create a cookie containing the token.
By doing this we can avoid validating credentials on subsequent requests from the same user, by just looking for the token in the request header or cookie.
If the services are designed to check the login every time, then the "Authorization" header needs to be set on every request.
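A small sketch of both halves of that scheme (storage and names are illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

public class TokenAuth {

    // Client side: build the Basic Authorization header value.
    public static String basicHeader(String username, String password) {
        String raw = username + ":" + password;
        return "Basic " + Base64.getEncoder()
                .encodeToString(raw.getBytes(StandardCharsets.UTF_8));
    }

    // Server side: after validating credentials once, map a random
    // token to the user id and hand it back (header or cookie).
    private final Map<String, String> tokenToUserId = new ConcurrentHashMap<>();

    public String issueToken(String userId) {
        String token = UUID.randomUUID().toString();
        tokenToUserId.put(token, userId);
        return token;
    }

    // Subsequent requests only need the token, not the credentials.
    public boolean isValid(String token) {
        return tokenToUserId.containsKey(token);
    }
}
```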
I think using a WebDriver is a lot of overhead, but it depends on what you really want to achieve. With the info you provided, I would rather go with a RestTemplate implementation sending the appropriate HTTP messages to the existing webapp, wrap it in a nice @Service layer, and build your web service (REST or SOAP) on top of it.
The authentication is a matter of configuration: you can pack this in a microservice with @EnableOAuth2Sso, and your RestTemplate bean, thanks to Spring Boot, will handle the underlying auth part for you.
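A minimal sketch of that wiring, using the (now legacy) Spring Security OAuth2 support; the client registration itself would live in application.yml under security.oauth2.*:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.security.oauth2.client.EnableOAuth2Sso;
import org.springframework.context.annotation.Bean;
import org.springframework.security.oauth2.client.OAuth2ClientContext;
import org.springframework.security.oauth2.client.OAuth2RestTemplate;
import org.springframework.security.oauth2.client.resource.OAuth2ProtectedResourceDetails;

@SpringBootApplication
@EnableOAuth2Sso
public class GatewayApplication {

    public static void main(String[] args) {
        SpringApplication.run(GatewayApplication.class, args);
    }

    // This RestTemplate attaches the user's OAuth2 token to outgoing
    // calls automatically, so the scraping/service code stays auth-free.
    @Bean
    public OAuth2RestTemplate oauth2RestTemplate(OAuth2ClientContext context,
            OAuth2ProtectedResourceDetails details) {
        return new OAuth2RestTemplate(details, context);
    }
}
```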
Maybe overkill... but RPA? http://windowsitpro.com/scripting/review-automation-anywhere-enterprise
I have two Java webapps, potentially on different domains/servers, using Spring Security for authentication. The first handles authentication locally, storing users in the application database. For the second, I would like to authenticate users against the same user accounts as the first webapp, with single sign-on (if a user is authenticated in the first webapp, he shouldn't have to enter his info again in the second).
I identified three potential ways to do this, but none of them seems very straightforward:
Shared cookies: using a shared session cookie and the same database for the two applications. It seems relatively easy to do, but the two webapps need to be on the same domain, which isn't necessarily the case for my applications.
Directory service: using a central directory service (LDAP) which both webapps would use to handle authentication. It seems pretty heavy to implement, and the users could no longer be stored in the first webapp's database. The existing user accounts would need to be migrated into the LDAP, and it would not be possible to create new users from the first webapp.
OAuth: it seems to be possible to make the first webapp handle external authentication requests by providing an OAuth API (like Google's sign-on service). That would allow the second webapp to use this API to authenticate users, but I'm not sure the sign-in process would be transparent enough to count as single sign-on. It doesn't seem very easy to implement either, as it would necessitate the development of a complete OAuth API in the first webapp.
I also looked at the service https://auth0.com, which seems to provide an authentication API that can be interfaced with an external database, but I'm not sure it can be interfaced with Spring Security, and it also mandates the use of an online solution, which isn't ideal. I'm not sure it would handle single sign-on either, rather than just shared accounts.
Is there any other way to handle this use case that would be more straightforward?
CAS is indeed a good candidate as an SSO system for your need, and there are several CAS clients for Spring Security. You can try a CAS server v4.0 for free at CAS in the cloud: http://www.casinthecloud.com...
As you mentioned, a shared cookie won't work across domains.
LDAP would give you shared credentials (a single name/password works for both systems), but not single sign-on, and as you noted, you'll have provisioning issues.
Not knowing anything about Spring Security, odds are high you won't find a painless solution to this. Integrating SSO is fraught with workflow issues (user provisioning, password recovery, user profile maintenance, etc.).
We had a classic DB-managed authentication scheme. Later, when we added LDAP support, we added the capability for "auto-provisioning". This basically consisted of having the application pull down the relevant demographics from the LDAP store during login, and simply update those fields each time the user logged in. If the user didn't exist, we'd create one on the fly.
This worked well, because the rest of the application had no awareness of LDAP. It simply worked with the user profile we already managed, and if it needed something from the DB, the data was there.
Later, when we integrated SSO, we just leveraged the existing LDAP logic to pull from the SSO server and do the same thing.
This workflow helped a lot with provisioning and management. We maintained the authoritative source (LDAP, SSO), and the app just kept up. What it hindered was local editing of the user profile, so we simply disabled that: users could view the profile, but they had to go to the other system's portal for management. Inelegant, but it's a rare use case anyway, so we just muddled through it. We eventually worked out two-way pushing and replication, etc., but it's a real pain if you don't need it.
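A sketch of that auto-provisioning step, with a hypothetical User model and repository standing in for the application's own profile handling, and example LDAP attribute names:

```java
import java.util.Map;

public class AutoProvisioningService {

    // Hypothetical stand-ins for the app's own persistence layer.
    public interface UserRepository {
        User findByLogin(String login);
        User save(User user);
    }

    public static class User {
        final String login;
        String email;
        String displayName;
        User(String login) { this.login = login; }
    }

    private final UserRepository users;

    public AutoProvisioningService(UserRepository users) {
        this.users = users;
    }

    // Called after the external store (LDAP/SSO) has accepted the login.
    public User onExternalLoginSuccess(String login, Map<String, String> attrs) {
        User user = users.findByLogin(login);
        if (user == null) {
            user = new User(login); // first login: create on the fly
        }
        // Refresh demographics from the authoritative source on every
        // login, so the rest of the app never needs to know about LDAP/SSO.
        user.email = attrs.get("mail");
        user.displayName = attrs.get("cn");
        return users.save(user);
    }
}
```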
You can look here if you want an overview of how to do cross domain SSO: Cross Domain Login - How to login a user automatically when transferred from one domain to another
For our SSO, we use the SAML v2 Web Profile, but we ended up writing most of our own code to pull it off.
But, bottom line, no matter what the web sites say, integrating this is non-trivial. The edge cases and workflow/help desk issues that surround it are legion. And it can be a bear to debug.
Let's say I've created a mobile application named 'Foo' (iOS). This app talks to a Java-running backend at 'java.com' and works perfectly. Now, I'm trying to create the website 'Foo.com' to let users enjoy the 'same' service in a browser. So far, I've found that almost all calls needed to the API from the website can be made in JavaScript directly to the backend at 'java.com', including a login function.
On the backend, I've implemented the standard 'doPost' method to handle the login, and I create a cookie to attach to the response.
The problem, I think, is that the users get the JavaScript from 'Foo.com', and the JavaScript tries to log in using an AJAX call to 'java.com'; thus the cookie will be 'stamped' by 'www.java.com', not by 'www.foo.com', and the user will never receive the cookie. (At least, I don't receive a cookie now.)
I've been trying to find a way to accept cookies from 'java.com' into the application, but it doesn't look good. Honestly, I'm not even sure this is the actual problem causing me not to receive a cookie, but I've read in several places that cross-domain cookies aren't allowed. So I ask the general question: how should I proceed?
I've been toying with the idea of adding a .php page to the server side of the website 'foo.com' and handling the requests from client to API there, hopefully causing the cookies to be 'stamped' as 'foo.com' instead of 'java.com'. (In that case, I'd also wonder whether the .php page can forward the information in the cookie or something similar.)
But I really want to avoid as much traffic on the webhost as possible. An all-script website would be optimal, but I don't really see how cookies can work with that.
Is there anything else I can do to handle this? If I simply want a persistent login function for a client of 'foo.com' handled at 'java.com', are there any options, with or without the use of cookies?
I'm building an app to let users export data from a university system. Currently, they can log in and see the data in HTML, but I would like to let people download it as CSV.
I have an app where users supply their username and password. I would like to log in to the university system and HTML scrape the resulting page. How can I do this?
I'm building a GWT app. I could either do this in Java-transliterated-JS on the client, or Java on the server.
Update: Selenium might be nice, but it looks like overkill.
You're going to have to do this from the server unless the domains are the same. You'd need to determine what the POST transaction used by the other server for the login step looks like (parameter names, etc.). Then you'd perform that operation and do whatever you want with what comes back. If you need to see multiple pages, you need to maintain the appropriate session cookie too, so that the server knows you're still logged in on the subsequent HTTP requests.
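A sketch of that server-side flow using Java 11's HttpClient, whose CookieManager holds the session cookie between requests; the URLs and form fields are placeholders for what the other site's login form actually uses:

```java
import java.net.CookieManager;
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class RemoteSession {
    // The CookieManager stores the session cookie set by the login
    // response and replays it on subsequent requests automatically.
    private final HttpClient client = HttpClient.newBuilder()
            .cookieHandler(new CookieManager())
            .build();

    public String loginAndFetch(String user, String pass) throws Exception {
        String body = "username=" + URLEncoder.encode(user, StandardCharsets.UTF_8)
                + "&password=" + URLEncoder.encode(pass, StandardCharsets.UTF_8);
        HttpRequest login = HttpRequest.newBuilder(
                URI.create("https://other.example.edu/login")) // hypothetical
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        client.send(login, HttpResponse.BodyHandlers.discarding());

        // Still "logged in" here thanks to the stored session cookie.
        HttpRequest page = HttpRequest.newBuilder(
                URI.create("https://other.example.edu/grades")) // hypothetical
                .GET().build();
        return client.send(page, HttpResponse.BodyHandlers.ofString()).body();
    }
}
```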
If you have to hit another site to validate the credentials, then I'm not so sure that people should feel comfortable providing those credentials to you. That is, if you don't have rights to check the credentials directly, why are you trustworthy to receive them? I know sometimes people need to integrate with a system they don't own, so this is just a question.
First, this has to be done server-side because of the limitations on client scripting imposed by the same-origin policy.
The typical way of handling the "screen scraping" you mention is to treat the web page as if it were an XML service. First, examine the source code of the page; then, using an HTTP stack, craft a POST to the correct URL and read the response using a standard XML library. It will take some ingenuity to come up with a way of digging into the XML for the piece you need that is as insulated as possible from changes to the page. Keep in mind that your system can break any time the owners of the site change their page.
Sometimes you can't just send the POST: you have to request the blank page initially in order to get hidden form values that need to be returned in the POST. You'll have to experiment to find out what the site requires.
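Jsoup makes that first fetch easy; a small sketch with a hypothetical URL, printing the hidden fields (often anti-forgery tokens) that must be echoed back in the POST:

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class HiddenFieldScraper {
    public static void main(String[] args) throws Exception {
        // Fetch the blank form first and collect the hidden fields that
        // the subsequent POST must send back along with the credentials.
        Document form = Jsoup.connect("https://site.example.edu/login").get(); // hypothetical
        for (Element hidden : form.select("input[type=hidden]")) {
            System.out.println(hidden.attr("name") + "=" + hidden.attr("value"));
        }
    }
}
```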
Additionally, you probably have to handle cookies as well, since they are usually an integral part of the web site's authentication and session management (though you might get lucky and find that the session doesn't matter between the initial POST and the first response).
Last, you may be unlucky enough that the site uses JavaScript to do part of the authentication work, which may require additional digging to understand how the credentials are posted to the site.
There are other potential barriers, such as the site checking that the referrer is their own site, possible use of SSL (HTTPS), and so on.
I'm pretty sure that the same-origin restrictions in web browsers (the protection associated with cross-site scripting) mean that you can't log in to the university's app using JavaScript running in the web browser. So the part of your program that fetches data from the university will need to run on your server. Once you have the data, you can process it either on your server or in JavaScript in the browser, but I think it would be easier to do it on the server.
See http://en.wikipedia.org/wiki/Same_origin_policy
I'm not too sure about GWT, but in general, you would take the form data submitted by the user and check it against a database of usernames and hashed passwords. If the database checks out, set a session cookie that says the user is logged in.
In your pages, check whether the session cookie says the user is logged in. If not, redirect to the login page; otherwise allow them to view the page.
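A sketch of that per-page check as a servlet filter, with hypothetical attribute and path names (Servlet 4.0+, where Filter's init/destroy have default implementations):

```java
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class AuthFilter implements Filter {
    @Override
    public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpSession session = request.getSession(false);
        // Not logged in: send them to the login page instead.
        if (session == null || session.getAttribute("userId") == null) {
            ((HttpServletResponse) resp).sendRedirect("/login"); // hypothetical path
            return;
        }
        chain.doFilter(req, resp); // logged in: let them view the page
    }
}
```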
This question is more about design and architecture, and I want to know what SO readers think of my scenario.
I have a requirement whereby my application should provide another application's interface when the user logs in to my application.
For example, let's say my application is www.gmail.com and the other application is www.stackoverflow.com. What I am trying to accomplish is that when the user logs in to his Gmail account, he should see his Stack Overflow home page and a particular question.
From a technology point of view we have to use Java, and I am not sure what design and architecture considerations would go into implementing this requirement.
One approach I am thinking of is that when the user logs in to Gmail, I populate the request object with all the login credential parameters for the Stack Overflow website, along with a question_id parameter; on the Stack Overflow side, I would parse the request object, authenticate the user credentials, and render the question_id received in the request.
I want to know what the best approach would be and what issues would be encountered in designing such a system.
Edit
After seeing all the answers, I would like to add a little update to my question. What I am looking for is a feel for the issues and challenges I would face while trying to accomplish my task. Also, I am using Java, and I am not sure how I can accomplish my goal with it, as we do not have something like the OLE of the Microsoft technology stack to achieve the task.
Hope I am making some sense here.
I can think of three ways you could solve this.
Implement single sign-on. You log in once, and all enterprise applications then use the same authentication credentials. (I think this is the best option; you don't need full-fledged SSO, since at least for these two applications you could use the same credential validation mechanism.)
You could also do what you are proposing: create the authentication credential for the user (i.e. a cookie) and then do a redirect, as in the sketch after this list. Keep in mind that both applications will need to be in the same sub-domain for this to work.
As mentioned before, you could also expose through your own application the data/services you want to consume from the other application.
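A sketch of the cookie-plus-redirect step from the second option; domain, cookie name and target URL are illustrative, and it only works because both apps share a parent domain:

```java
import java.io.IOException;
import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServletResponse;

public class CrossAppRedirect {
    // Issue a token cookie scoped to the parent domain so both apps
    // (e.g. app1.example.com and app2.example.com) can read it, then
    // redirect the browser to the second application.
    public static void redirectWithSso(HttpServletResponse resp, String token)
            throws IOException {
        Cookie sso = new Cookie("SSO_TOKEN", token);
        sso.setDomain(".example.com"); // shared parent domain (hypothetical)
        sso.setPath("/");
        sso.setSecure(true);   // only send over HTTPS
        sso.setHttpOnly(true); // keep it away from page scripts
        resp.addCookie(sso);
        resp.sendRedirect("https://app2.example.com/questions/123"); // hypothetical
    }
}
```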
In my company we have what we call "graphical services", which are managed by a central server that also does credential validation; if the credentials are right, it displays a user interface to the user (generally in a pop-up or an iframe).
Hope it helps.
You definitely can't do that on the client side with JavaScript, as it will run into cross-site scripting restrictions. Alternatively, you can use iframes (which are deprecated).
The other way of doing it would be to have your own interface/UI for the application and use only the service layer of your back end (Java/J2EE in your case), though you may end up duplicating the entire front end (on the positive side, you will get your own branding of the site).
Regarding credentials, almost all sites now use "OAuth" or similar, and it should not be that difficult to handle authorization.
If both applications are web-based in-house applications, you could write a master login component, independent of either application, that performs the user authentication, loads any useful data it can at login time, and sends the user's browser to the correct URL, making sure to pass any relevant information to the target app (as part of the forwarding request, or behind the scenes in some distributed shared memory). Just a thought.