I'm getting this error: java.io.IOException: Server returned HTTP response code: 502 for URL:
It ends my program whenever the website returns a bad gateway, and it's inconsistent about when that happens. Is there any way I can force it to keep retrying the website until it gets a response?
This is my current code, if it matters:
URL webpage = null;
URLConnection conn = null;
try {
    webpage = new URL(websiteurl);
    conn = webpage.openConnection();
    InputStreamReader reader = new InputStreamReader(conn.getInputStream(), "UTF8");
    BufferedReader buffer = new BufferedReader(reader);
    String line = "";
    while (true) {
        line = buffer.readLine();
        if (line != null) {
            System.out.println(line);
        } else {
            break;
        }
    }
} catch (Exception e) {
    e.printStackTrace();
}
Never mind, I solved it by calling my method again in the catch block and adding a pause between each call. This is what it is now:
URL webpage = null;
URLConnection conn = null;
try {
    webpage = new URL(website);
    conn = webpage.openConnection();
    InputStreamReader reader = new InputStreamReader(conn.getInputStream(), "UTF8");
    BufferedReader buffer = new BufferedReader(reader);
    String line = "";
    while (true) {
        line = buffer.readLine();
        if (line != null) {
            System.out.println(line);
        } else {
            break;
        }
    }
} catch (Exception e) {
    e.printStackTrace();
    try {
        Thread.sleep(5000);
    } catch (InterruptedException ex) {
        Thread.currentThread().interrupt();
    }
    findCreationDate(name);
}
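Retrying by calling the method again from inside the catch block works, but every failed attempt adds another stack frame, so a long outage can eventually end in a StackOverflowError. Here is a minimal sketch of a bounded retry loop instead; loadPage() is a hypothetical helper wrapping the URL/read code above and declared to throw IOException, and the five-attempt cap is an arbitrary choice:

// Sketch: retry in a loop with a pause instead of recursing from the catch block.
int attempts = 0;
boolean done = false;
while (!done && attempts < 5) {
    try {
        loadPage(website);      // hypothetical helper containing the connection/read code above
        done = true;            // stop once a response was read successfully
    } catch (IOException e) {
        attempts++;
        e.printStackTrace();
        try {
            Thread.sleep(5000); // wait before the next attempt
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt();
            break;              // give up if the thread is interrupted
        }
    }
}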
At the moment I'm trying to save a response to the phone's internal storage. Everything works fine up until I try to retrieve the data again: when I log the retrieved data, only one small section of the response comes out and the rest isn't there. I've tried deleting the file and calling the save again, just in case it was using an old one.
Saving Code
try {
    String response = apiResponse.getRawResponse();
    Log.e("Response", response);
    FileOutputStream userInfo = openFileOutput("personal_profile", MODE_PRIVATE);
    userInfo.write(response.getBytes());
    userInfo.close();
} catch (Exception e) {
    e.printStackTrace();
}
Retrieving Code
String response = "";
try {
    FileInputStream fis = getActivity().openFileInput("personal_profile");
    DataInputStream isr = new DataInputStream(fis);
    BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(isr));
    StringBuilder sb = new StringBuilder();
    String line;
    while ((line = bufferedReader.readLine()) != null) {
        sb.append(line);
    }
    line = response;
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
Log.e("Saved File", response);
Any kind of suggestions would be great!
REASON
The problem is that the text built up in the StringBuilder is never copied back into response: the statement line = response; assigns in the wrong direction, so the final Log.e call still prints the old value of response.
Try this:
StringBuilder sb = new StringBuilder();
try {
    FileInputStream fis = getActivity().openFileInput("personal_profile");
    DataInputStream isr = new DataInputStream(fis);
    BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(isr));
    String line;
    while ((line = bufferedReader.readLine()) != null) {
        sb.append(line);
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
CHANGE THE LAST LINE TO
Log.e("Saved File", sb.toString());
Have you got this in your AndroidManifest.xml file?
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
Also, this link has everything you need to know about reading and writing files:
http://www.anddev.org/working_with_files-t115.html
Code:
StringBuilder sb = new StringBuilder();
try {
    FileInputStream fis = getActivity().openFileInput("personal_profile");
    DataInputStream isr = new DataInputStream(fis);
    BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(isr));
    String line;
    while ((line = bufferedReader.readLine()) != null) {
        sb.append(line);
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
Log.e("Saved File", sb.toString());
I've created a small scraping class and the method below reads in the text from a page.
However, I've found that the method fails to close the connection properly. This results in a ton of open connections, which causes my hosting company to suspend my account. Is the code below correct?
private String getPageText(String urlString) {
    String pageText = "";
    BufferedReader reader = null;
    try {
        URL url = new URL(urlString);
        reader = new BufferedReader(new InputStreamReader(url.openStream()));
        StringBuilder builder = new StringBuilder();
        int read;
        char[] chars = new char[1024];
        while ((read = reader.read(chars)) != -1) {
            builder.append(chars, 0, read);
        }
        pageText = builder.toString();
    } catch (MalformedURLException e) {
        Log.e(CLASS_NAME, "getPageText.MalformedUrlException", e);
    } catch (IOException e) {
        Log.e(CLASS_NAME, "getPageText.IOException", e);
    } finally {
        if (reader != null) {
            try {
                reader.close();
            } catch (IOException e) {
                Log.e(CLASS_NAME, "getPageText.IOException", e);
            }
        }
    }
    return pageText;
}
Your code is fine in the success case but will potentially leak connections in the failure cases (when the http server returns a 4xx or 5xx status code). In these cases HttpURLConnection provides the response body via .getErrorStream() rather than .getInputStream() and you should make sure to drain and close that stream as well.
URLConnection conn = null;
BufferedReader reader = null;
try {
    conn = url.openConnection();
    reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    // ...
} finally {
    if (reader != null) {
        // ...
    }
    if (conn instanceof HttpURLConnection) {
        InputStream err = ((HttpURLConnection) conn).getErrorStream();
        if (err != null) {
            byte[] buf = new byte[2048];
            while (err.read(buf) >= 0) {}
            err.close();
        }
    }
}
There probably needs to be another layer of try/catch inside that finally, but you get the idea. You should not explicitly .disconnect() the connection unless you're sure there won't be any more requests for URLs on that host in the near future; disconnect() prevents subsequent requests from reusing the existing connection, which for HTTPS in particular will slow things down considerably.
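Here is a minimal sketch of that pattern as a complete method, with the extra try/catch layer filled in; the method name fetchPage and the UTF-8 charset are assumptions, not part of the original answer (uses java.io.* and java.net.*):

// Sketch: read a page body and always drain/close both the input and the error stream.
private static String fetchPage(URL url) throws IOException {
    URLConnection conn = url.openConnection();
    StringBuilder body = new StringBuilder();
    BufferedReader reader = null;
    try {
        reader = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        char[] chars = new char[1024];
        int read;
        while ((read = reader.read(chars)) != -1) {
            body.append(chars, 0, read);
        }
    } finally {
        if (reader != null) {
            try {
                reader.close();
            } catch (IOException e) {
                // the read already finished or failed; nothing useful to do here
            }
        }
        if (conn instanceof HttpURLConnection) {
            // On 4xx/5xx responses the body is on the error stream; drain and close it
            // so the underlying connection can go back to the keep-alive pool.
            InputStream err = ((HttpURLConnection) conn).getErrorStream();
            if (err != null) {
                try {
                    byte[] buf = new byte[2048];
                    while (err.read(buf) >= 0) {
                        // discard the error body
                    }
                } catch (IOException ignored) {
                    // draining is best-effort
                } finally {
                    err.close();
                }
            }
        }
    }
    return body.toString();
}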
You are only closing the stream, not the connection. Use the following structure:
URL u = new URL(url);
HttpURLConnection conn = (HttpURLConnection) u.openConnection();
conn.connect();
reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
and then:
} finally {
    if (reader != null) {
        try {
            reader.close();
        } catch (IOException e) {
            Log.e(CLASS_NAME, "getPageText.IOException", e);
        }
    }
    try {
        if (conn != null) {
            conn.disconnect();
        }
    } catch (Exception ex) {
        // ignore
    }
}
I have looked around for how to do this and I keep finding different solutions, none of which has worked for me, and I don't understand why. Does FileReader only work for local files? I tried a combination of scripts found on the site and it still doesn't quite work: it just throws an exception and leaves me with ERROR as the content variable. Here's the code I've been using unsuccessfully:
public String downloadfile(String link) {
    String content = "";
    try {
        URL url = new URL(link);
        URLConnection conexion = url.openConnection();
        conexion.connect();
        InputStream is = url.openStream();
        BufferedReader br = new BufferedReader(new InputStreamReader(is));
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = br.readLine()) != null) {
            sb.append(line);
        }
        content = sb.toString();
        br.close();
        is.close();
    } catch (Exception e) {
        content = "ERROR";
        Log.e("ERROR DOWNLOADING", "File not Found" + e.getMessage());
    }
    return content;
}
Use this as a downloader (provide a path to save your file, including the extension, and the exact link to the text file):
public static void downloader(String fileName, String url) throws IOException {
    File file = new File(fileName);
    url = url.replace(" ", "%20");
    URL website = new URL(url);
    if (file.exists()) {
        file.delete();
    }
    if (!file.exists()) {
        ReadableByteChannel rbc = Channels.newChannel(website.openStream());
        FileOutputStream fos = new FileOutputStream(fileName);
        fos.getChannel().transferFrom(rbc, 0, Long.MAX_VALUE);
        fos.close();
    }
}
Then call this function to read the text file
public static String[] read(String fileName) {
    String[] result = null;
    Vector<String> v = new Vector<String>(10, 2);
    BufferedReader br = null;
    try {
        br = new BufferedReader(new FileReader(fileName));
        String tmp;
        while ((tmp = br.readLine()) != null) {
            v.add(tmp);
        }
        result = new String[v.size()];
        int count = 0;
        Iterator<String> i = v.iterator();
        while (i.hasNext()) {
            result[count++] = i.next();
        }
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        if (br != null) {
            try {
                br.close();
            } catch (IOException e) {
                // ignore
            }
        }
    }
    return result;
}
And then finally the main method
public static void main(String[] args) throws IOException {
    downloader("D:\\file.txt", "http://www.abcd.com/textFile.txt");
    String[] data = read("D:\\file.txt");
}
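If a temporary file is acceptable anyway, here is a more compact sketch of the same download-then-read flow using java.nio.file (Java 8+); the path and URL are just the placeholders from the example above:

// Sketch: download to a file and read it back with java.nio.file utilities.
// Uses java.net.URL, java.io.InputStream, java.nio.charset.StandardCharsets and java.nio.file.*.
public static String downloadAndRead(String fileName, String url) throws IOException {
    Path target = Paths.get(fileName);
    try (InputStream in = new URL(url).openStream()) {
        Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING); // overwrite any old copy
    }
    return String.join(System.lineSeparator(), Files.readAllLines(target, StandardCharsets.UTF_8));
}

For example, downloadAndRead("D:\\file.txt", "http://www.abcd.com/textFile.txt") returns the whole file as one String.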
Try this:
try {
    // Create a URL for the desired page
    URL url = new URL("http://mysite.com/thefile.txt");
    // Read all the text returned by the server
    BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
    String str;
    StringBuilder sb = new StringBuilder();
    while ((str = in.readLine()) != null) {
        // str is one line of text; readLine() strips the newline character(s)
        sb.append(str);
    }
    in.close();
    String serverTextAsString = sb.toString();
} catch (MalformedURLException e) {
    // handle the malformed URL
} catch (IOException e) {
    // handle the read failure
}
I have 2 programs: a client and a server.
The server creates a ServerSocket and the client connects using:
address = InetAddress.getByName(host);
conn = new Socket(address, port);
This works, but here is the problem: mousePressed() is called once the mouse is clicked and executes this on the client side:
void mousePressed() {
    try {
        BufferedOutputStream os = new BufferedOutputStream(conn.getOutputStream());
        OutputStreamWriter osw = new OutputStreamWriter(os, "US-ASCII");
        osw.write("123");
        osw.flush();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
The server should receive the input using:
BufferedReader reader = new BufferedReader(
        new InputStreamReader(new BufferedInputStream(conn.getInputStream())));
StringBuilder result = new StringBuilder();
for (String line = null; (line = reader.readLine()) != null;) {
    result.append(line);
}
reader.close();
println(result.toString());
The server only receives the input after the socket has been closed with conn.close(); on the client side, or after the client quits. As I want to be able to click the mouse multiple times, I can't close the socket.
What can I do to send input without closing the socket?
Edit: connection code:
Server:
// init
ServerSocket socket1;
int main_port = 5204;

// in main
try {
    socket1 = new ServerSocket(main_port);
    Socket conn = socket1.accept();
} catch (Exception e) {
    e.printStackTrace();
}
Client:
// init
String host = "localhost";
int port = 5204;
Socket conn;
InetAddress address;

// in main
try {
    address = InetAddress.getByName(host);
    conn = new Socket(address, port);
} catch (Exception e) {
    e.printStackTrace();
}
My solution (based on other answers and comments):
1) Changing osw.write("123"); to osw.write("123\n"); in the client.
2) Replacing
BufferedReader reader = new BufferedReader(
        new InputStreamReader(new BufferedInputStream(thread_cnn.getInputStream())));
StringBuilder result = new StringBuilder();
for (String line = null; (line = reader.readLine()) != null;) {
    result.append(line);
}
println(result);
reader.close();
with
BufferedReader reader = new BufferedReader(
        new InputStreamReader(new BufferedInputStream(conn.getInputStream())));
String result = reader.readLine();
println(result);
reader = null;
result = null;
on the server.
You are writing an incomplete line, and trying to read complete lines. Terminate the text you send with a line break so it can be read when it arrives.
Also, do not catch and ignore exceptions. If something goes wrong you will want to know about it.
try {
    BufferedOutputStream os = new BufferedOutputStream(conn.getOutputStream());
    OutputStreamWriter osw = new OutputStreamWriter(os, "US-ASCII");
    osw.write("123\n");
    osw.flush();
} catch (Exception e) {
    e.printStackTrace();
}
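On the server side the matching change is to keep reading line by line instead of waiting for end-of-stream, so each click arrives as its own message. Here is a minimal sketch, assuming the conn socket from the question and one reader kept open for the lifetime of the connection:

// Sketch: one readLine() call per client message; the loop only ends when the
// client actually closes the socket (readLine() then returns null).
try {
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(new BufferedInputStream(conn.getInputStream())));
    String message;
    while ((message = reader.readLine()) != null) {
        println(message);   // handle each click's payload as it arrives
    }
    reader.close();         // only reached once the client has closed its socket
} catch (IOException e) {
    e.printStackTrace();
}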
I can pull the user's statuses with no problem with cURL, but when I connect with Java, the XML comes out truncated and my parser wants to cry. I'm testing with small users, so it's not enough data to choke on or anything.
public void getRuserHx() {
    System.out.println("Getting user status history...");
    String https_url = "https://twitter.com/statuses/user_timeline/" + idS.rootUser + ".xml?count=100&page=[1-32]";
    URL url;
    try {
        url = new URL(https_url);
        HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
        con.setRequestMethod("GET");
        con.setReadTimeout(15 * 1000);
        // dump all the content into an xml file
        print_content(con);
    } catch (MalformedURLException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    System.out.println("Finished downloading user status history.");
}
private void print_content(HttpsURLConnection con) {
    if (con != null) {
        try {
            BufferedReader br = new BufferedReader(new InputStreamReader(con.getInputStream()));
            File userHx = new File("/" + idS.rootUser + "Hx.xml");
            PrintWriter out = new PrintWriter(idS.hoopoeData + userHx);
            String input;
            while ((input = br.readLine()) != null) {
                out.println(input);
            }
            br.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
This request doesn't need auth. Sorry about my ugly code. My professor says input doesn't matter so my I/O is a trainwreck.
You have to flush the output stream when you write the content out. Did you flush or close the output stream?
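For example, here is a minimal sketch of print_content with the writer closed in a finally block; it keeps the file naming and fields from the question, and only the flush/close handling is new:

private void print_content(HttpsURLConnection con) {
    if (con == null) {
        return;
    }
    BufferedReader br = null;
    PrintWriter out = null;
    try {
        br = new BufferedReader(new InputStreamReader(con.getInputStream()));
        File userHx = new File("/" + idS.rootUser + "Hx.xml");
        out = new PrintWriter(idS.hoopoeData + userHx);
        String input;
        while ((input = br.readLine()) != null) {
            out.println(input);
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (out != null) {
            out.close();   // close() flushes buffered output, so the last part of the XML gets written
        }
        if (br != null) {
            try {
                br.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}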