Use Server cache for 15 minutes - java

I want to use the server cache for 15 minutes, so what do I have to use in setRequestProperty()?
Please help me.
Here is the code I used:
private HttpURLConnection httpCon = null;
httpCon = (HttpURLConnection) httpUrl.openConnection();
httpCon.setRequestMethod("GET");
httpCon.setRequestProperty("Connection", "Keep-Alive");
httpCon.setRequestProperty("Pragma","public");
httpCon.setRequestProperty("Cache-Control","maxage=900");
httpCon.setUseCaches(true);

With those request headers you are telling the server that you are willing to accept cached responses, but there is no guarantee that the server will cache anything or is even configured to do so (unless you also control the server and implement it there).
You can also try setting up an intermediate HTTP cache between the client and the server, such as a caching proxy like Varnish, Pound, or Squid.
Lastly, you can do the caching on your own, which is supported by the Android java.net package but doesn't have a default implementation. To do this:
- Check out HttpURLConnection, whose documentation details (in the "Response Caching" section) that you must implement ResponseCache and call setDefault (a minimal sketch follows after this list).
- Also check out the ResponseCache Example, which has examples of this, and something quirky to watch out for at the end (which may or may not still be true).
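For illustration, here is a minimal sketch of that approach; the anonymous class below is only a skeleton, and a real implementation has to store and look up responses somewhere (e.g. on disk):
import java.io.IOException;
import java.net.CacheRequest;
import java.net.CacheResponse;
import java.net.ResponseCache;
import java.net.URI;
import java.net.URLConnection;
import java.util.List;
import java.util.Map;

// Install a process-wide cache that HttpURLConnection consults when setUseCaches(true) is in effect.
// Returning null from get() means "nothing cached for this URI"; returning null from put() means "do not cache this response".
ResponseCache.setDefault(new ResponseCache() {
    @Override
    public CacheResponse get(URI uri, String requestMethod, Map<String, List<String>> requestHeaders) throws IOException {
        return null; // look up a previously stored response for this URI here
    }

    @Override
    public CacheRequest put(URI uri, URLConnection connection) throws IOException {
        return null; // return a CacheRequest whose OutputStream receives the response body to store
    }
});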
Good luck!

Instead of using HttpURLConnection, use DefaultHttpClient together with CachingHttpClient from Apache HttpClient (the core client is bundled by default with Android; the caching module ships separately as httpclient-cache).
Have a look at http://hc.apache.org/httpcomponents-client-ga/tutorial/html/caching.html to get more details on how to use caching.
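A minimal sketch of that approach, assuming the httpclient-cache jar is on the classpath (the URL is a placeholder); responses the server marks as cacheable, e.g. with Cache-Control: max-age=900, are then served from the local cache for up to 15 minutes:
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.impl.client.cache.CachingHttpClient;

// Wrap the plain client in a caching decorator; cache storage is in-memory by default.
HttpClient cachingClient = new CachingHttpClient(new DefaultHttpClient());
HttpResponse response = cachingClient.execute(new HttpGet("http://example.com/data"));
// Repeating the same GET within the response's max-age is answered from the local cache.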

Related

HTTPS Post request (not) using certificate/hash print?

I admit there is a possibility that I am not well informed about the subject, but I've done loads of reading and I still can't get an answer to my question.
From what I have learnt, to make communication secure over HTTPS I need to be using some sort of public key (it reminds me of PGP encryption).
My goal is to make a secure POST request from my Java application (which I will rewrite as an Android app once it works, if that matters) to a PHP application accessible via an HTTPS address.
Naturally I did some Google research on the topic and got a lot of results on how to make an SSL connection. None of those results used any sort of certificate/hash fingerprints; they just use HttpsURLConnection instead of HttpURLConnection, and everything else is almost identical.
Right now I have this, which is almost a copy-paste of something I found here:
String httpsURL = "https://xx.yyyy.zzz/requestHandler.php?getParam1=value1&getParam2=value2";
String query = "email=" + URLEncoder.encode("abc#xyz.com", "UTF-8");
query+="&";
query+="password="+URLEncoder.encode("tramtarie","UTF-8");
URL myurl = new URL(httpsURL);
HttpsURLConnection con = (HttpsURLConnection) myurl.openConnection();
con.setRequestMethod("POST");
con.setRequestProperty("Content-length",String.valueOf(query.length()));
con.setRequestProperty("Content-Type","application/x-www-form-urlencoded");
con.setRequestProperty("User-Agent","Mozilla/4.0 (compatible; MSIE 5.0;Windows98;DigExt)");
con.setDoOutput(true);
con.setDoInput(true);
DataOutputStream output = new DataOutputStream(con.getOutputStream());
output.writeBytes(query);
output.close();
DataInputStream input = new DataInputStream(con.getInputStream());
for (int c = input.read(); c != -1; c = input.read()) {
    System.out.print((char) c);
}
input.close();
System.out.println("Resp Code:"+con.getResponseCode());
System.out.println("Resp Message:"+con.getResponseMessage());
This sadly does not work and ends up with this exception:
Exception in thread "main" javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching app.elessy.cz found
This probably means that it checks the certificate and finds out that the certificate I am using does not match the domain name it is registered for. (It is a webhosting certificate, registered for the webhosting domain, not the domain I own. The only reason I am using HTTPS is to secure data for internal purposes; I do not want this site to be visited by outside users, so this certificate should be OK.)
There are two things that I just don't get about the code and everything.
No code I have been able to find uses MD5/SHA-1 fingerprints (supposedly the public keys for message encryption?) or a certificate; they just somehow automatically connect to the HTTPS website and it should work. It doesn't work for me, though.
Do I really need those MD5/SHA-1 fingerprints that were provided to me? Or at least, what do those fingerprints mean in this context?
Edit:
Following the given answer and the duplicate mark, I managed to get it working, in the sense that I can communicate with the application behind HTTPS.
But I didn't have to use any sort of MD5/SHA-1 fingerprint. How do I know now that it is safe? Does the protocol handle this on its own? That is, is the communication secured either way when I use the built-in Java classes to connect to the application behind HTTPS?
I am probably not looking for a precise technical explanation, but more for an assurance that yes, the communication is safe even though I do not (knowingly) use the certificate or the server's public key to encrypt my messages, and that the SSL connection is handled for me.

Infinite redirect loop in HTTP request

I'm getting a "too many redirects" error from URLConnection when trying to fetch www.palringo.com:
URL url = new URL("http://www.palringo.com/");
HttpURLConnection.setFollowRedirects(true);
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
System.out.println("Response code = " + connection.getResponseCode());
outputs the dreaded:
Exception in thread "main" java.net.ProtocolException: Server redirected too many times (20)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(Unknown Source)
According to wget there is just one redirect, from www.palringo.com to www.palringo.com/en/gb/
Any ideas why my request using URLConnection for /en/gb results in another 302 response for the same resource?
The problem is exemplified by:
URL url = new URL("http://www.palringo.com/en/gb/");
HttpURLConnection.setFollowRedirects(false);
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
// Just for testing, use Chrome header, to eliminate "anti-crawler" response!
connection.setRequestProperty("User-Agent", "Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Ubuntu/11.04 Chromium/12.0.742.112 Chrome/12.0.742.112 Safari/534.30");
System.out.println("Response code = " + connection.getResponseCode());
This outputs:
Response code = 302
Redirected to /en/gb/
hence an infinite redirect loop.
Interestingly although browsers and wget handle it, curl does not:
joel#bohr:/tmp$ curl http://www.palringo.com/en/gb/
curl: (7) couldn't connect to host
A request for /en/gb/ is redirected to /en/gb/ precisely once.
The problem is that your HttpURLConnection (or whatever code you use -- sorry, I'm NOT familiar with Java) does not use cookies.
Disable cookies in browser and observe exactly the same behaviour -- infinite redirect.
The reason: the server checks whether a cookie is set. If it is not, the server sets it and redirects. Because cookies are not supported (or are disabled), the server-side script redirects over and over again.
Solution: Enable/add cookie support to your code and try again.
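With plain HttpURLConnection that means installing a cookie handler before making the request; a minimal sketch (the accept-all policy is just for this test):
import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.CookiePolicy;

// Install an in-memory cookie store so the cookie set by the first 302 response
// is sent back on the follow-up request to /en/gb/.
CookieHandler.setDefault(new CookieManager(null, CookiePolicy.ACCEPT_ALL));
// ...then open the HttpURLConnection as before.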
I think the redirect is defined with a pattern like /* -> /en/gb.
So when you arrive at /en/gb the redirect rule fires again.
Check your redirect rules. Where are they defined? In the Apache web server or somewhere else? Check them all, verify whether this is (or is not) the case, and fix the rules accordingly.
The problem is on the server side. It might be a broken Apache httpd rewrite rule that is sending redirects that loop back to the same place. It might be something else. Whatever it is, you are unlikely to be able to fix it on the client side.
I'm basically running a crawler and just noticed this issue.
Ah.
It is possible that it is an anti-crawler defence measure. "Hmmm ... looks like one of those pesky crawlers who ignore my robots.txt file, waste all of my bandwidth and steal my precious content. Let's cause him some pain with a redirect loop!!"
Check that your crawler is obeying the "robots.txt" protocol. Check the ToS for the site you are crawling to see if what you are doing is allowed.
You could be right, but if so how come wget and browsers handle this with just the one redirect?
Maybe because the server is looking at the request headers, or at your pattern of requests.
The Terms of Service (that I see) say this:
"You agree to not use the Service to: ... xiii - Run any automated systems, processes, scripts or bots for any purpose without the express written permission of Palringo."
Arguably, crawling their site is in violation of that.
You will also get this error if you're trying to connect to a service that requires authentication and you provide wrong username and password.

Alternative to java.net.URL for custom timeout setting

I need a timeout setting for remote data requests made using the java.net.URL class. After some googling I found out that there are two system properties which can be used to set timeouts for the URL class, as follows:
sun.net.client.defaultConnectTimeout
sun.net.client.defaultReadTimeout
I don't have control over all the systems and don't want everybody to have to keep setting the system properties. Is there any other alternative for making remote requests which will allow me to set timeouts?
Something available in Java itself, without any extra library, is preferable.
If you're opening a URLConnection from URL you can set the timeouts this way:
URL url = new URL(urlPath);
URLConnection con = url.openConnection();
con.setConnectTimeout(connectTimeout);
con.setReadTimeout(readTimeout);
InputStream in = con.getInputStream();
How are you using the URL or what are you passing it to?
A common replacement is the Apache Commons HttpClient; it gives much more control over the entire process of fetching HTTP URLs.
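For example, with the newer Apache HttpClient 4.x API the timeouts are per-client parameters rather than global system properties; a minimal sketch (the URL and timeout values are placeholders):
import org.apache.http.HttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.params.HttpConnectionParams;

// Configure per-client timeouts instead of the JVM-wide sun.net.* properties.
DefaultHttpClient client = new DefaultHttpClient();
HttpConnectionParams.setConnectionTimeout(client.getParams(), 5000); // connect timeout in ms
HttpConnectionParams.setSoTimeout(client.getParams(), 10000);        // read (socket) timeout in ms
HttpResponse response = client.execute(new HttpGet("http://example.com/data"));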

Trying to GET a Google Spreadsheet in Java is returning HTTP error 405

I've been working on this all day and have gotten nowhere with it.
My Java code looks like this:
final URL url = new URL(String.format("https://spreadsheets.google.com/feeds/download/spreadsheets/Export?key=%s&exportFormat=tsv&gid=0", spreadsheetId));
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestProperty("Authorization", "GoogleLogin auth=" + wiseAuth.getAuthToken());
conn.setRequestProperty("GData-Version", "3.0");
conn.setRequestMethod("GET");
conn.setDoOutput(true); // trouble here, see below
conn.setInstanceFollowRedirects(true);
conn.connect();
I always get a FileNotFoundException when attempting conn.getInputStream(). I narrowed it down to the response code being 405 Method Not Allowed. The exception message contains my URL, and I can access the page just fine in my browser.
It was then that I discovered that setDoOutput(true) makes the connection execute a POST internally. But if I remove that line, conn.getInputStream() is null, and conn.getOutputStream() appears to return nothing, though maybe I am setting it up wrong?
I don't recommend doing it like this; even if you get it working now, you cannot be sure it will keep working in the future if Google changes something.
Instead, consider using the Google Spreadsheets API. The provided Java examples are pretty straightforward and you should be able to accomplish what you want.
I would recommend using a web debugger like Fiddler to see exactly what your application is sending in the GET request and compare it to your browser's request. You might be missing an important header or something, and Fiddler makes it really easy to slowly strip your browser's request down to the essential elements (just drag a request to clone it, then take out headers).

RequestDispatcher for remote server?

I am trying to create an HttpServlet that forwards all incoming requests, as is, to another servlet running on a different domain.
How can this be accomplished? The RequestDispatcher's forward() only operates on the same server.
Edit: I can't introduce any dependencies.
You can't, unless it runs in the same ServletContext or on the same/clustered web server where the webapps are configured to share the ServletContext (in the case of Tomcat, check the crossContext option).
You have to send a redirect using HttpServletResponse.sendRedirect(). If your actual concern is reusing the query parameters on the new URL, just resend them along.
response.sendRedirect(newURL + "?" + request.getQueryString());
Or, when it's a POST, send an HTTP 307 redirect; the client will reapply the same POST parameters to the new URL.
response.setStatus(HttpServletResponse.SC_TEMPORARY_REDIRECT);
response.setHeader("Location", newURL);
Update: as per the comments, that's apparently not an option either, since you want to hide the URL. In that case, you have to let the servlet act as a proxy. You can do this with an HTTP client, e.g. the Java SE provided java.net.URLConnection (mini tutorial here) or the more convenient Apache Commons HttpClient.
If it's a GET, just do:
InputStream input = new URL(newURL + "?" + request.getQueryString()).openStream();
OutputStream output = response.getOutputStream();
// Copy.
Or if it's POST:
URLConnection connection = new URL(newURL).openConnection();
connection.setDoOutput(true);
// Set and/or copy request headers here based on current request?
InputStream input1 = request.getInputStream();
OutputStream output1 = connection.getOutputStream();
// Copy.
InputStream input2 = connection.getInputStream();
OutputStream output2 = response.getOutputStream();
// Copy.
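The // Copy. placeholders above stand for an ordinary stream-copy loop; a minimal sketch of such a helper (the buffer size is an arbitrary choice, and closing the streams is left to the caller):
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Reads the input stream to exhaustion and writes every chunk to the output stream.
private static void copy(InputStream input, OutputStream output) throws IOException {
    byte[] buffer = new byte[8192];
    for (int length; (length = input.read(buffer)) != -1; ) {
        output.write(buffer, 0, length);
    }
    output.flush();
}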
Note that you possibly need to capture/replace/update the relative links in the HTML response, if any. Jsoup may be extremely helpful in this.
As others have pointed out, what you want is a proxy. Your options:
Find an open-source Java library that does this. There are a few out there, but I haven't used any of them, so I can't recommend any.
Write it yourself. Shouldn't be too hard, just remember to deal with stuff like passing along all headers and response codes.
Use the proxy module in Apache 2.2. This is the one I'd pick, because I already know that it works reliably.
Jetty has a sample ProxyServlet implementation that uses URL.openConnection() under the hood. Feel free to use as-is or to use as inspiration for your own implementation. ;-)
Or you can use Apache HttpClient; see the tutorial.
