How to maintain sessions in Java URLConnection?

I am trying to log in to a website and then fetch the source of a page behind the login, using a Java URLConnection. The problem is that I can't maintain the session, so the server shows this warning and refuses the connection:
This system requires the use of HTTP cookies to verify authorization information.
Our system has detected that your browser has disabled HTTP cookies, or does not support them.
Please refer to the Help page in your browser for more information on how to correctly configure your browser for use with this system.
At first I tried sending an empty cookie so the server would see that I handle sessions, but it doesn't give me a session ID either.
This is my source code:
try {
    // Construct data
    String data = URLEncoder.encode("usr", "UTF-8") + "=" + URLEncoder.encode("usr", "UTF-8");
    data += "&" + URLEncoder.encode("password", "UTF-8") + "=" + URLEncoder.encode("pass", "UTF-8");
    // Send data
    URL url = new URL("https://loginsite.com");
    URLConnection conn = url.openConnection();
    conn.setDoOutput(true);
    conn.setRequestProperty("Cookie", "SESSID=");
    OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream());
    wr.write(data);
    wr.flush();
    // Get the response
    BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    String line;
    while ((line = rd.readLine()) != null) {
        System.out.println(line);
    }
    wr.close();
    rd.close();
    String headerName = null;
    for (int i = 1; (headerName = conn.getHeaderFieldKey(i)) != null; i++) {
        if (headerName.equals("Set-Cookie")) {
            String cookie = conn.getHeaderField(i);
            System.out.println(cookie.split(";", 2)[0]);
        }
    }
} catch (Exception e) {
}

You should use an HTTP library that handles session management and the other details of the HTTP protocol for you, i.e. one that supports cookies, keep-alive, proxies, etc. out of the box. Try Apache HttpComponents.
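For example, with Apache HttpClient 4.x a cookie store keeps the session across requests. Here is a minimal sketch reusing the form field names ("usr", "password") and login URL from the snippet above; the follow-up URL is a placeholder:

import java.nio.charset.StandardCharsets;
import java.util.Arrays;

import org.apache.http.NameValuePair;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.BasicCookieStore;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;

public class LoginWithHttpClient {
    public static void main(String[] args) throws Exception {
        // The cookie store keeps the session cookie between requests.
        BasicCookieStore cookieStore = new BasicCookieStore();
        try (CloseableHttpClient client = HttpClients.custom()
                .setDefaultCookieStore(cookieStore)
                .build()) {

            // 1. POST the login form (field names taken from the snippet above).
            HttpPost login = new HttpPost("https://loginsite.com");
            login.setEntity(new UrlEncodedFormEntity(Arrays.<NameValuePair>asList(
                    new BasicNameValuePair("usr", "usr"),
                    new BasicNameValuePair("password", "pass")), StandardCharsets.UTF_8));
            try (CloseableHttpResponse response = client.execute(login)) {
                EntityUtils.consume(response.getEntity()); // drain the body, keep the connection reusable
            }

            // 2. Any further request with the same client reuses the session cookie.
            HttpGet page = new HttpGet("https://loginsite.com/protected-page"); // placeholder URL
            try (CloseableHttpResponse response = client.execute(page)) {
                System.out.println(EntityUtils.toString(response.getEntity()));
            }
        }
    }
}

Because both requests go through the same client, the Set-Cookie returned by the login is sent back automatically on the second request.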

Related

Attempt to get OAuth access token from Neteller produces error: "Server returned HTTP response code: 401 for URL"

I want to send a successful request to Neteller: I am trying to get an access token using the code from the Neteller documentation. However, it consistently fails with the following exception:
java.io.IOException: Server returned HTTP response code: 401 for URL: https://test.api.neteller.com/v1/oauth2/token?grant_type=client_credentials
Here's the code (again, from the Neteller documentation):
String testUrl = " https://test.api.neteller.com";
String secureUrl = "https://api.neteller.com";
String url = testUrl;
if ("live".equals(configBean.get("environment"))) {
    url = secureUrl;
}
url += "/v1/oauth2/token?grant_type=client_credentials";
String xml = "grant_type=client_credentials?grant_type=client_credentials";
xml = "";
String test = Base64.encodeBytes((accountID + ":" + secureID).getBytes());
try {
    URL urls = new URL("https://test.api.neteller.com/v1/oauth2/token?grant_type=client_credentials");
    HttpURLConnection connection = (HttpURLConnection) urls.openConnection();
    connection.setRequestMethod("POST");
    connection.setDoOutput(true);
    connection.setRequestProperty("Authorization", "Bearer " + test);
    connection.setRequestProperty("Content-Type", "application/json");
    connection.setRequestProperty("Cache-Control", "no-cache");
    DataOutputStream wr = new DataOutputStream(connection.getOutputStream());
    wr.flush();
    wr.close();
    BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()));
    String inputLine;
    StringBuffer response = new StringBuffer();
    while ((inputLine = in.readLine()) != null) {
        response.append(inputLine);
    }
    in.close();
    String accessToken = "";
} catch (Exception e) {
    e.printStackTrace();
}
Why is my implementation failing here?
There is nothing wrong with your code. The problem is that you are trying to use a regular member account for the API integration, where you need to be using a merchant account. Below are the steps you will need to complete in order to get it to work:
You need to get a test merchant account (http://www.neteller.com/business/contact-sales/). Registering on www.neteller.com creates a regular member account, which cannot receive payments via the API.
Once you have a test merchant account, you will need to white-list the IP address from which you will be making requests to the API. (pg. 31 of the manual).
Then, you will need to add an application to it (pg. 32 of the manual).
Once you have added the application, use the "client ID" and "client secret" in the Authorization header, just like you do now: base64-encoded values, separated with a colon (:). A minimal sketch of this step follows.
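Here is a small sketch of that last step, using java.util.Base64 (Java 8+) in place of the third-party encoder in the question; the client ID and client secret values are placeholders, and the header is built exactly as in the question's code:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class NetellerAuthHeader {
    /** Builds the base64 "clientId:clientSecret" value used in the Authorization header. */
    static String encodeCredentials(String clientId, String clientSecret) {
        return Base64.getEncoder()
                .encodeToString((clientId + ":" + clientSecret).getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        // Placeholder credentials; use the values from the merchant application instead.
        String header = "Bearer " + encodeCredentials("your-client-id", "your-client-secret");
        System.out.println(header);
        // connection.setRequestProperty("Authorization", header);  // as in the snippet above
    }
}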

POST Requests from a Servlet

I am writing a servlet in Eclipse that receives a POST request from a client; it should do some splitting on the received text, call the Google Geolocation API to get some data, and display the result to the user.
On localhost this works perfectly fine. On an actual server (tried with OpenShift and CloudBees) it doesn't: I can see the splitting reply but not the reply from the Google Geolocation service, and there is always an error from the Google service logged to the console.
After I receive the POST request in the doPost method of the servlet, I am doing the following to access the Google GeoLocation service:
// Attempting to send data to the Google Geolocation Service
URL url;
HttpURLConnection connection = null;
try {
    // Create connection
    url = new URL("https://www.googleapis.com/geolocation/v1/geolocate?key=MyAPI");
    connection = (HttpURLConnection) url.openConnection();
    connection.setRequestMethod("POST");
    connection.setRequestProperty("Content-Type", "application/json");
    connection.setUseCaches(false);
    connection.setDoInput(true);
    connection.setDoOutput(true);
    // Send request with data (the output variable holds the JSON data)
    DataOutputStream wr = new DataOutputStream(connection.getOutputStream());
    wr.writeBytes(output);
    wr.flush();
    wr.close();
    // Get the response
    InputStream is = connection.getInputStream();
    BufferedReader rd = new BufferedReader(new InputStreamReader(is));
    String line;
    StringBuffer response2 = new StringBuffer();
    while ((line = rd.readLine()) != null) {
        response2.append(line);
        response2.append('\r');
    }
    rd.close();
    // Write to screen using out = response.getWriter();
    out.println("Access Point's Location = " + response2.toString());
} catch (Exception e) {
    e.printStackTrace();
} finally {
    if (connection != null) {
        connection.disconnect();
    }
}
Could you tell me why this is happening and how I can make it work? Should I resort to something like AJAX, or is there some other workaround? I am relatively new to coding and am trying to avoid learning AJAX at this stage, so please let me know if there's any other way to get this working.
Your localhost requests go out with your own IP as the sending address, while requests from OpenShift et al. go out with the platform's shared IP. So the Google API effectively says "I have only seen that localhost IP a couple of times before, that's fine!", whereas for the shared hosting IP it says "I have seen this OpenShift IP millions of times before! NO REPLY FOR YOU!"
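To confirm what the API is objecting to, it may help to read the error body instead of letting getInputStream() throw. A diagnostic sketch (not a fix) meant to slot into the snippet above; it reuses the question's connection and out variables:

// If the API rejects the request from the shared hosting IP, the JSON
// error body explaining why is available on the error stream.
int status = connection.getResponseCode();
if (status >= 400) {
    InputStream es = connection.getErrorStream();
    if (es != null) {
        BufferedReader err = new BufferedReader(new InputStreamReader(es));
        StringBuilder body = new StringBuilder();
        String errLine;
        while ((errLine = err.readLine()) != null) {
            body.append(errLine);
        }
        err.close();
        out.println("Geolocation error " + status + ": " + body);  // out = response.getWriter()
    }
}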

Java method to log into ASP.NET Web form

I'm working on a Java program that needs to log into an ASP.NET web form and, once authenticated, download a file. A normal HTTP GET/POST is not a problem, but ASP.NET is not giving me a session ID when I connect from Java, although it does from the browser.
When I look at the header information in Firefox, I see the cookies being set by the initial login, but the page is then immediately redirected to a new URL. I'm not sure if it matters, but the page it redirects to after login contains iframes. I've tried loading both the main page and the iframe src inside it, but neither gives me the cookie in the header.
// Pull up the login page and extract the hidden input variables __VIEWSTATE and __EVENTVALIDATION
URL url = new URL(loginPage);
HttpURLConnection conn = null;
conn = (HttpURLConnection) url.openConnection();
// This reads the page line by line and extracts the values of all hidden input fields
Map<String, String> formFields = getViewstate(conn);
// Now re-open the URL to actually submit the POST data
conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("POST");
conn.setDoOutput(true);
conn.setDoInput(true);
DataOutputStream out = new DataOutputStream(conn.getOutputStream());
String postValues = URLEncoder.encode("txtUsername", "UTF-8") + "=" + URLEncoder.encode(uid, "UTF-8");
postValues += "&" + URLEncoder.encode("txtPassword", "UTF-8") + "=" + URLEncoder.encode(pwd, "UTF-8");
postValues += "&" + URLEncoder.encode("__EVENTTARGET", "UTF-8") + "=" + URLEncoder.encode("", "UTF-8");
postValues += "&" + URLEncoder.encode("__VIEWSTATE", "UTF-8") + "=" + URLEncoder.encode(formFields.get("viewstate"), "UTF-8");
postValues += "&" + URLEncoder.encode("__EVENTVALIDATION", "UTF-8") + "=" + URLEncoder.encode(formFields.get("eventvalidation"), "UTF-8");
out.writeBytes(postValues);
out.flush();
out.close();
// At this point, judging by the Firefox sniffer data, the server should be sending back the cookie.
// However there is no Set-Cookie in the header fields.
for (int i = 1; (key = conn.getHeaderFieldKey(i)) != null; i++) {
    // get ASP.NET_SessionId from the cookie
    if (key.equalsIgnoreCase("set-cookie")) {
        sessionId = conn.getHeaderField(i);
        sessionId = sessionId.substring(0, sessionId.indexOf(";"));
    }
}
BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
while ((line = rd.readLine()) != null) {
    // The page printed here is the page the browser was redirected to after logging in
    System.out.println(line);
}
rd.close();
// At this point the login was successful, but I never got the cookie, so I'm stuck.
HttpClient, which I believe HtmlUnit is based on, has the lower-level functionality I think you're looking for. It handles cookies well; if you need more than that, then Kurt is right that you should look for something with more functionality. If you actually need full browser behaviour, you could try something like Selenium/WebDriver, which automates a real browser under programmatic control.
It looks like the site you are trying to access relies on cookies, which HttpURLConnection does not manage for you by default. A way around this is to use a library like HtmlUnit, which simulates a browser (cookies, JavaScript, etc.).
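A minimal HtmlUnit sketch along those lines, assuming the field names from the snippet above; the login URL and the submit button name ("btnLogin") are placeholders that must match the real page. Because HtmlUnit submits the real form, __VIEWSTATE, __EVENTVALIDATION and the session cookie are all handled for you:

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlForm;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class AspNetFormLogin {
    public static void main(String[] args) throws Exception {
        try (WebClient webClient = new WebClient()) {
            // Don't fail on the page's script errors; HtmlUnit stores the
            // ASP.NET_SessionId cookie for us.
            webClient.getOptions().setThrowExceptionOnScriptError(false);
            HtmlPage loginPage = webClient.getPage("https://example.com/Login.aspx"); // placeholder URL

            // Field names taken from the question's POST data; the submit button
            // name is a guess and must match the real form.
            HtmlForm form = loginPage.getForms().get(0);
            form.getInputByName("txtUsername").setValueAttribute("myUser");
            form.getInputByName("txtPassword").setValueAttribute("myPassword");
            HtmlPage afterLogin = form.getInputByName("btnLogin").click();

            // The same WebClient can now fetch the protected download,
            // because the session cookie set at login is sent automatically.
            System.out.println(afterLogin.getTitleText());
        }
    }
}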

Connect to web that requires user/password

I'm a bit new to Java, and even more so to networking with it. I'm trying to create a program that connects to a website ("www.buybackprofesional.com") from which I would like to download pictures and some text about cars (after the login you have to enter a plate number to access a car's file).
This is what I have right now, but it always says that the session has expired. I think I need a way to log in using the username and password of the main page, am I right? Can someone give me some advice? Thanks.
Note: I want to do it in Java, maybe I was not clear in the question.
// URL web = new URL("http://www.buybackprofesional.com/DetallePeri.asp?mat=9073FCV&fec=27/07/2010&tipo=C&modelo=4582&Foto=0");
URL web = new URL("http://www.buybackprofesional.com/");
HttpURLConnection con = (HttpURLConnection) web.openConnection();
con.setRequestMethod("GET");
con.setRequestProperty("User-Agent", "Mozilla/4.0 (compatible; JVM)");
con.setRequestProperty("Pragma", "no-cache");
con.connect();
BufferedReader reader = new BufferedReader(new InputStreamReader(con.getInputStream()));
String line = null;
while ((line = reader.readLine()) != null) {
    System.out.println(line);
}
A colleague helped me with this, so I'll post the code that works:
public static URLConnection login(String _url, String _username, String _password) throws IOException, MalformedURLException {
    String data = URLEncoder.encode("Usuario", "UTF-8") + "=" + URLEncoder.encode(_username, "UTF-8");
    data += "&" + URLEncoder.encode("Contrase", "UTF-8") + "=" + URLEncoder.encode(_password, "UTF-8");
    // Send data
    URL url = new URL(_url);
    URLConnection conn = url.openConnection();
    conn.setDoOutput(true);
    OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream());
    wr.write(data);
    wr.flush();
    wr.close();
    return conn;
}
This submits the login form on the page I need, and after that, by sending the cookies it sets, I can stay connected. A minimal sketch of the cookie handling follows.
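One built-in way to do that cookie handling (Java 6+) is to install a default CookieManager before calling login(), so every later URLConnection in the JVM resends the session cookie automatically. A small sketch, assuming the login() helper above is in scope; the credentials are placeholders and the second URL is the one from the commented-out line in the question:

import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.CookiePolicy;
import java.net.URL;
import java.net.URLConnection;

public class BuybackSession {
    public static void main(String[] args) throws Exception {
        // Install once: from now on every URLConnection in this JVM stores and
        // resends cookies, so the ASP session survives across requests.
        CookieManager cookieManager = new CookieManager();
        cookieManager.setCookiePolicy(CookiePolicy.ACCEPT_ALL);
        CookieHandler.setDefault(cookieManager);

        // login() is the helper shown above; the credentials are placeholders.
        URLConnection loginConn = login("http://www.buybackprofesional.com/", "user", "password");
        loginConn.getInputStream().close(); // force the request so Set-Cookie is processed

        // Subsequent requests carry the session cookie automatically.
        URLConnection next = new URL(
                "http://www.buybackprofesional.com/DetallePeri.asp?mat=9073FCV&fec=27/07/2010&tipo=C&modelo=4582&Foto=0")
                .openConnection();
        // read next.getInputStream() as usual...
    }
}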
To connect to a website using Java, consider using HttpUnit or HttpCore (offered by Apache). They handle sessions much better than you (or I) could on your own.
Edit: Fixed the location of the link. Thanks for the correction!

Java connecting to HTTP: which method to use?

I have been looking around at different ways to connect to URLs and there seem to be a few.
My requirements are to do POST and GET queries on a URL and retrieve the result.
I have seen
URL class
DefaultHttpClient class
HttpClient - apache commons
Which method is best?
My rule of thumb and recommendation: don't introduce dependencies and third-party libraries if it's fairly easy to get away without them.
In this case I would say: if you need efficiency, such as multiple requests per established connection, session handling, or cookie support, go for HttpClient.
If you only need to perform an HTTP get, this will suffice:
Getting Text from a URL
try {
    // Create a URL for the desired page
    URL url = new URL("http://hostname:80/index.html");
    // Read all the text returned by the server
    BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
    String str;
    while ((str = in.readLine()) != null) {
        // str is one line of text; readLine() strips the newline character(s)
    }
    in.close();
} catch (MalformedURLException e) {
} catch (IOException e) {
}
Sending a POST Request Using a URL
try {
    // Construct data
    String data = URLEncoder.encode("key1", "UTF-8") + "=" + URLEncoder.encode("value1", "UTF-8");
    data += "&" + URLEncoder.encode("key2", "UTF-8") + "=" + URLEncoder.encode("value2", "UTF-8");
    // Send data
    URL url = new URL("http://hostname:80/cgi");
    URLConnection conn = url.openConnection();
    conn.setDoOutput(true);
    OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream());
    wr.write(data);
    wr.flush();
    // Get the response
    BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    String line;
    while ((line = rd.readLine()) != null) {
        // Process line...
    }
    wr.close();
    rd.close();
} catch (Exception e) {
}
Both methods work great. (I've even done manual gets/posts with cookies.)
HttpClient is the way to go if your needs go beyond a trivial URL connection (e.g. proxy authentication such as NTLM). There is at least one comparison of standard HTTP client functionality between the libraries provided by the JRE, Apache HttpClient, and others.
If you are using a JDK version up to and including 1.4 and have fairly large data in your POST requests, such as large file uploads, the default HttpURLConnection that comes with the JRE is bound to run out of memory at some point, since it buffers the entire body before posting. Additionally, it does not support some advanced HTTP features such as chunked transfer encoding.
So I'd recommend it only if your requests are trivial and you are not posting large data, as aioobe did.
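For what it's worth, on JDK 1.5 and later the buffering problem can also be avoided without a third-party library by enabling streaming mode on HttpURLConnection. A minimal sketch; the URL is a placeholder:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class StreamingPost {
    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL("http://hostname:80/upload") // placeholder URL
                .openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);

        // Stream the body in 8 KB chunks instead of buffering it all in memory
        // (use setFixedLengthStreamingMode(length) if the size is known up front).
        conn.setChunkedStreamingMode(8192);

        try (OutputStream out = conn.getOutputStream()) {
            // write the large payload here, e.g. by copying from a FileInputStream
        }
        System.out.println("HTTP " + conn.getResponseCode());
    }
}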
