Getting 301 status but data is there [duplicate] - java

I'm trying to fetch HTML by URL in Java, but 301 Moved Permanently is all I get. Other URLs work. What's wrong? This is my code:
URL hh = new URL("http://hh.ru");
BufferedReader in = new BufferedReader(
        new InputStreamReader(hh.openStream()));
StringBuilder sb = new StringBuilder();
String inputLine;
String str;
while ((inputLine = in.readLine()) != null) {
    sb.append(inputLine).append("\n");
    str = sb.toString(); // ends up containing the 301 redirect page, not the site HTML
}

You're being redirected to another URL. That's quite normal; a website may have many reasons to redirect you. Just follow the redirect using the "Location" HTTP header, like this:
URL hh = new URL("http://hh.ru");
URLConnection connection = hh.openConnection();

// If the server answered with a redirect, the target URL is in the Location header
String redirect = connection.getHeaderField("Location");
if (redirect != null) {
    connection = new URL(redirect).openConnection();
}

BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()));
String inputLine;
while ((inputLine = in.readLine()) != null) {
    System.out.println(inputLine);
}
Your browser follows redirects automatically, but with URLConnection you have to do it on your own. If that bothers you, take a look at other Java HTTP client implementations, such as Apache HttpClient; most of them can follow redirects automatically.
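For what it's worth, HttpURLConnection does follow same-protocol redirects by default, but it will not cross from http to https, which appears to be what hh.ru does here (the lynx answer below hints at an https target). If you are on Java 11 or later, a minimal sketch using the built-in java.net.http.HttpClient, which can follow that hop for you (the URL is just the one from the question):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FetchWithRedirects {
    public static void main(String[] args) throws Exception {
        // Redirect.NORMAL follows redirects, including http -> https (but never https -> http)
        HttpClient client = HttpClient.newBuilder()
                .followRedirects(HttpClient.Redirect.NORMAL)
                .build();

        HttpRequest request = HttpRequest.newBuilder(URI.create("http://hh.ru")).build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode()); // 200 once the redirect has been followed
        System.out.println(response.body());
    }
}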

I found this answer useful and improved it a little to handle the possibility of multiple redirections (e.g. a 307 followed by a 301).
URLConnection urlConnection = url.openConnection();
String redirect = urlConnection.getHeaderField("Location");
// MAX_REDIRECTS is a constant you define (e.g. 5) to avoid endless redirect loops
for (int i = 0; i < MAX_REDIRECTS; i++) {
    if (redirect != null) {
        urlConnection = new URL(redirect).openConnection();
        redirect = urlConnection.getHeaderField("Location");
    } else {
        break;
    }
}

There's nothing wrong with your code. The message means that hh.ru has been moved permanently to another address.

I tested your code and it is fine, but with "hh.ru" I get the same problem as you. When I use lynx (a command-line browser) to connect to "hh.ru", it shows that the site redirects to another URL, then reports that it has moved permanently, and finally prints this alert:
"Alert!: This client does not contain support for HTTPS URLs"

Check whether the URL you provided is HTTP or HTTPS, and consider adding the protocol if you are only using a domain name, e.g. http(s)://domainname.com/resource-name (as sketched below).
Read: https://en.wikipedia.org/wiki/HTTP_301
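Building on that advice: since the 301 here is the server pushing you from http:// to https://, the simplest fix is to request the HTTPS address directly. A minimal sketch, assuming https://hh.ru is indeed the redirect target:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class DirectHttpsFetch {
    public static void main(String[] args) throws Exception {
        // Request the https:// address directly, so no 301 redirect is involved
        URL hh = new URL("https://hh.ru");
        try (BufferedReader in = new BufferedReader(new InputStreamReader(hh.openStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}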

I resolved mine by pointing at the specific file running on the server.
Instead of http://hh.ru, I used http://hh.ru/index.php. It worked for me.

Related

Check if my server is up via Java

I am starting a Tomcat server locally for a web application, and it takes around 20 minutes to be up and running. I want to check, via Java, whether the web app is up and accepting requests. Any help?
My server is at, say, localhost:8001/myapp.
Thanks in advance.
You can check it in many ways. For example, register a servlet as load-on-startup and have it write log messages with exact timestamps.
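A minimal sketch of that idea, assuming the javax.servlet API is on the classpath; the servlet name and URL pattern are made up for illustration:

import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;

// Hypothetical marker servlet: loadOnStartup makes the container initialize it
// during deployment, so the log line below tells you when the app became ready.
@WebServlet(urlPatterns = "/startup-marker", loadOnStartup = 1)
public class StartupMarkerServlet extends HttpServlet {
    @Override
    public void init() {
        System.out.println("myapp initialized at " + new java.util.Date());
    }
}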
You can add something like localhost:8001/myapp/status to the app that returns information about the current status. Then you can just send an HTTP request from Java and check the response:
public String execute(String uri) throws Exception {
    URL url = new URL(uri);
    URLConnection connection = url.openConnection();
    connection.setReadTimeout(1000);

    BufferedReader in = new BufferedReader(
            new InputStreamReader(connection.getInputStream()));
    String inputLine;
    StringBuilder outputLine = new StringBuilder();
    while ((inputLine = in.readLine()) != null) {
        outputLine.append(inputLine);
    }
    in.close();
    return outputLine.toString();
}
I guess I will call this method periodically to see whether I get a timeout exception or the raw HTML.
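Building on that, a minimal polling sketch; the /status path, the retry interval, and the attempt limit are all assumptions, and checking the HTTP response code is usually enough, so there is no need to parse the body:

import java.net.HttpURLConnection;
import java.net.URL;

public class WaitForServer {
    public static void main(String[] args) throws Exception {
        URL statusUrl = new URL("http://localhost:8001/myapp/status"); // assumed status endpoint
        for (int attempt = 1; attempt <= 60; attempt++) {              // up to roughly 30 minutes
            try {
                HttpURLConnection conn = (HttpURLConnection) statusUrl.openConnection();
                conn.setConnectTimeout(2000);
                conn.setReadTimeout(2000);
                if (conn.getResponseCode() == 200) {
                    System.out.println("Server is up (attempt " + attempt + ")");
                    return;
                }
            } catch (java.io.IOException e) {
                // Not reachable yet; fall through and retry
            }
            Thread.sleep(30_000); // wait 30 seconds between attempts
        }
        System.out.println("Server did not come up in time");
    }
}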

Accessing a URL using a loop

I have created a web application and hosted it on a server. Now I want to write a Java program that will access (or "hit") the URL of my application in a loop, so I can check how much load my web application can handle. The program should also tell me when the URL was accessed successfully and when it wasn't.
I tried executing it without using a loop:
try {
    URL url = new URL("https://example.com/");
    BufferedReader in = new BufferedReader(
            new InputStreamReader(url.openStream()));
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        System.out.println(inputLine);
    }
    in.close();
} catch (Exception e) {
    System.out.println("e: " + e.toString());
}
But, I got this error:
e: javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching example.com found.
Use javax.net.ssl.HttpsURLConnection to connect to https:// URLs.
Something like this (note: resource closing, exception handling, etc. are left to you):
final URL url = new URL("https://example.com");
final HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
final BufferedReader br = new BufferedReader(new InputStreamReader(con.getInputStream()));
String input;
while ((input = br.readLine()) != null) {
    System.out.println(input);
}
However, there are lots of tools available to load test it.
You are just getting a stream from the URL, but this URL uses HTTPS, so you need the server's certificate ("public key") imported into your client application's truststore; otherwise the client and server won't be able to communicate with each other.
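As for the loop itself, here is a minimal sketch of the hit-in-a-loop check the question asks for; the target URL, the iteration count, and the timeout values are placeholders:

import java.net.HttpURLConnection;
import java.net.URL;

public class SimpleLoadLoop {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/");   // placeholder target
        int ok = 0, failed = 0;

        for (int i = 1; i <= 100; i++) {             // placeholder iteration count
            try {
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setConnectTimeout(5000);
                conn.setReadTimeout(5000);
                int code = conn.getResponseCode();
                if (code >= 200 && code < 400) {
                    ok++;
                    System.out.println("Hit " + i + " succeeded (HTTP " + code + ")");
                } else {
                    failed++;
                    System.out.println("Hit " + i + " failed (HTTP " + code + ")");
                }
                conn.disconnect();
            } catch (java.io.IOException e) {
                failed++;
                System.out.println("Hit " + i + " failed: " + e);
            }
        }
        System.out.println("Succeeded: " + ok + ", failed: " + failed);
    }
}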

HTTPURLConnection.getContent java.io.FileNotFoundException

I use an HttpURLConnection to connect to a website and receive ResponseCode=404 (HTTP_NOT_FOUND). However, I have no problem opening the website in my browser (IE).
Why the difference, and what can I do about it?
This is my Program:
String responseMsg = "";
String cgsUrl = "http://localhost:9081/ntes/";
URL url = new URL(cgsUrl);
System.out.println("ouuuuuuu-->" + url.getContent());
InputStream in = url.openConnection().getInputStream();
StringBuffer respDataBuf = new StringBuffer();
respDataBuf.setLength(0);
int b = -1;
while ((b = in.read()) != -1) {
    respDataBuf.append((char) b);
}
responseMsg = respDataBuf.toString();
If this is a 404 error, it is almost certainly a matter of server configuration.
Maybe your user agent is banned, or you're not sending some expected headers, and so on. I recommend copying the headers from your browser (all of them) and using them to make the request in your Java program.
Then remove them one by one to find the one that is mandatory.
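A minimal sketch of sending browser-like headers with HttpURLConnection; the header values are placeholders you would copy from your browser's developer tools:

import java.net.HttpURLConnection;
import java.net.URL;

public class RequestWithBrowserHeaders {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost:9081/ntes/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // Placeholder values: replace with the real headers your browser sends
        conn.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)");
        conn.setRequestProperty("Accept", "text/html,application/xhtml+xml");
        conn.setRequestProperty("Accept-Language", "en-US,en;q=0.9");

        System.out.println("Response code: " + conn.getResponseCode());
    }
}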

URLConnection Error - java.io.IOException: Server returned HTTP response code: 400 for URL

I am trying to connect to a URL from a desktop app and I get the error indicated in the title of my question, but when I connect to the same URL from a servlet, everything works fine. When I load the URL in a browser, it also works fine. I am using the same code in the servlet. The code was in a library; when it didn't work, I pulled it out into a class in the current project, yet it still didn't work.
The URL is https://graph.facebook.com/me.
The code fragment:
public static String post(String urlSpec, String data) throws Exception {
    URL url = new URL(urlSpec);
    URLConnection connection = url.openConnection();
    connection.setDoOutput(true);

    OutputStreamWriter writer = new OutputStreamWriter(connection.getOutputStream());
    writer.write(data);
    writer.flush();

    BufferedReader reader = new BufferedReader(new InputStreamReader(connection.getInputStream()));
    String line = "";
    StringBuilder builder = new StringBuilder();
    while ((line = reader.readLine()) != null) {
        builder.append(line);
    }
    return builder.toString();
}
I'm a little bit confused here: is there something present in a servlet that is not in a normal desktop app?
Thanks.
FULL STACK TRACE
Feb 8, 2011 9:54:14 AM com.trinisoftinc.jiraffe.objects.FacebookAlbum create
SEVERE: null
java.io.IOException: Server returned HTTP response code: 400 for URL: https://graph.facebook.com/me
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1313)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:234)
at com.jiraffe.helpers.Util.post(Util.java:49)
at com.trinisoftinc.jiraffe.objects.FacebookAlbum.create(FacebookAlbum.java:211)
at com.trinisoftinc.jiraffe.objects.FacebookAlbum.main(FacebookAlbum.java:261)
EDIT: You need to find the exact error message that Facebook is sending in the response.
You can modify your code to read the message from the error stream, like so:
HttpURLConnection httpConn = (HttpURLConnection) connection;
InputStream is;
if (httpConn.getResponseCode() >= 400) {
    is = httpConn.getErrorStream();
} else {
    is = httpConn.getInputStream();
}
Take a look at how you are passing the user context
Here's some information that could help you out:
Look at the error message behind the 400 response code:
"Facebook Platform" "invalid_request" "An active access token must be used to query information about the current user*
You'll find the solution here
HTTP/1.1 400 Bad Request
...
WWW-Authenticate: OAuth "Facebook Platform" "invalid_request" "An active access token must be used to query information about the current user."
...
I finally found the problem, and of course it's my code. One part of the code I didn't post is the value of data: data must contain only name and description, but I was passing more than name and description.

Reading https web page data behind proxy java

I want to read data from a secure web page, say https://www.paypal.com, and I am behind a proxy. I tried this:
System.setProperty("java.net.useSystemProxies","true");
System.setProperty("htttps.proxyHost","myproxyhost");
System.setProperty("https.proxyPort","443");
URL u = new URL("https://www.paypal.com");
URLConnection uc = u.openConnection();
uc.setDoOutput(true);
StringBuffer sbuf=new StringBuffer();
BufferedReader in = new BufferedReader(
new InputStreamReader(uc.getInputStream()));
String res = in.readLine();
System.out.println(" Response from paypal "+res);
while ((res = in.readLine()) != null){
sbuf.append(res).append(",");
}
in.close();
System.out.println(" Total Data received "+sbuf);
I am getting UnknownHostException all the time, though I can fetch data from HTTP websites successfully. Am I missing something?
Thanks,
Rohit
You've got 3 T's in your proxyHost setting, i.e. you are using htttps rather than https.
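For reference, a minimal sketch with the property name corrected; the proxy host and port are placeholders for your own proxy settings:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class FetchThroughProxy {
    public static void main(String[] args) throws Exception {
        // Correct property names: https.proxyHost / https.proxyPort
        System.setProperty("https.proxyHost", "myproxyhost"); // placeholder proxy host
        System.setProperty("https.proxyPort", "8080");        // placeholder proxy port

        URL u = new URL("https://www.paypal.com");
        URLConnection uc = u.openConnection();
        try (BufferedReader in = new BufferedReader(new InputStreamReader(uc.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}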
