public static boolean sendRequest(String request) {
    InputStream inputStream = null;
    try {
        URL url = new URL(request);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setReadTimeout(TIMEOUT);
        connection.setConnectTimeout(TIMEOUT);
        connection.setRequestMethod("POST");
        connection.connect();
        inputStream = connection.getInputStream();
        // Drain the response body; we only care whether the request succeeded.
        while (inputStream.read() != -1);
        return true;
    } catch (IOException error) {
        return false;
    } finally {
        try {
            if (inputStream != null) {
                inputStream.close();
            }
        } catch (IOException secondError) {
            Log.w(RequestManager.class.getSimpleName(), secondError);
        }
    }
}
How do I read data from inputStream.read()? I want to read the data that the server sends back.
int iReadSize;
byte[] buffer = new byte[4096];
try (InputStream inputStream = connection.getInputStream()) {
    while ((iReadSize = inputStream.read(buffer)) != -1) {
        System.out.println(new String(buffer, 0, iReadSize));
    }
} catch (IOException error) {
    error.printStackTrace(); // don't swallow the exception silently
}
This might be useful.
import java.net.*;
import java.io.*;

public class URLConnectionReader {
    public static void main(String[] args) throws Exception {
        URL oracle = new URL("http://www.oracle.com/");
        URLConnection yc = oracle.openConnection();
        BufferedReader in = new BufferedReader(new InputStreamReader(
                yc.getInputStream()));
        String inputLine;
        while ((inputLine = in.readLine()) != null)
            System.out.println(inputLine);
        in.close();
    }
}
For reference, see https://docs.oracle.com/javase/tutorial/networking/urls/readingWriting.html.
Related
I've created a small scraping class and the method below reads in the text from a page.
However, I've found that the method fails to close the connection properly, which results in a ton of open connections and has caused my hosting company to suspend my account. Is the code below correct?
private String getPageText(String urlString) {
    String pageText = "";
    BufferedReader reader = null;
    try {
        URL url = new URL(urlString);
        reader = new BufferedReader(new InputStreamReader(url.openStream()));
        StringBuilder builder = new StringBuilder();
        int read;
        char[] chars = new char[1024];
        while ((read = reader.read(chars)) != -1)
            builder.append(chars, 0, read);
        pageText = builder.toString();
    } catch (MalformedURLException e) {
        Log.e(CLASS_NAME, "getPageText.MalformedUrlException", e);
    } catch (IOException e) {
        Log.e(CLASS_NAME, "getPageText.IOException", e);
    } finally {
        if (reader != null)
            try {
                reader.close();
            } catch (IOException e) {
                Log.e(CLASS_NAME, "getPageText.IOException", e);
            }
    }
    return pageText;
}
Your code is fine in the success case but will potentially leak connections in the failure cases (when the HTTP server returns a 4xx or 5xx status code). In those cases HttpURLConnection provides the response body via .getErrorStream() rather than .getInputStream(), and you should make sure to drain and close that stream as well.
URLConnection conn = null;
BufferedReader reader = null;
try {
    conn = url.openConnection();
    reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    // ...
} finally {
    if (reader != null) {
        // ...
    }
    if (conn instanceof HttpURLConnection) {
        InputStream err = ((HttpURLConnection) conn).getErrorStream();
        if (err != null) {
            byte[] buf = new byte[2048];
            while (err.read(buf) >= 0) {}
            err.close();
        }
    }
}
There probably needs to be another layer of try/catch inside that finally, but you get the idea. You should not explicitly .disconnect() the connection unless you're sure there won't be any more requests to URLs on that host in the near future: disconnect() prevents subsequent requests from being pipelined over the existing connection, which for HTTPS in particular slows things down considerably.
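If it helps, the same idea can be folded into a single helper using try-with-resources (Java 7+). This is only a sketch under my own assumptions: the class name PageFetcher, the 10-second timeouts, and the UTF-8 charset are not part of the answer above.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PageFetcher {

    // Reads the response body on success, drains the error stream on 4xx/5xx,
    // and never calls disconnect(), so the socket can be reused for keep-alive.
    public static String fetch(String urlString) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
        conn.setConnectTimeout(10_000);
        conn.setReadTimeout(10_000);

        int status = conn.getResponseCode();
        if (status >= 200 && status < 400) {
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                StringBuilder sb = new StringBuilder();
                char[] chars = new char[1024];
                int read;
                while ((read = reader.read(chars)) != -1) {
                    sb.append(chars, 0, read);
                }
                return sb.toString();
            }
        }

        // Error responses deliver their body via getErrorStream(); drain and close it
        // so the underlying connection can go back into the keep-alive pool.
        try (InputStream err = conn.getErrorStream()) {
            if (err != null) {
                byte[] buf = new byte[2048];
                while (err.read(buf) != -1) {
                    // discard
                }
            }
        }
        throw new IOException("HTTP " + status + " from " + urlString);
    }
}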
You are just closing the stream and not the connection. Use the following structure:
URL u = new URL(url);
HttpURLConnection conn = (HttpURLConnection) u.openConnection();
conn.connect();
reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
and then:
} finally {
    if (reader != null) {
        try {
            reader.close();
        } catch (IOException e) {
            Log.e(CLASS_NAME, "getPageText.IOException", e);
        }
    }
    try {
        if (conn != null) {
            conn.disconnect();
        }
    } catch (Exception ex) {
        // ignore
    }
}
I have looked around for how to do this and keep finding different solutions, none of which has worked for me, and I don't understand why. Does FileReader only work for local files? I tried a combination of snippets found on the site and it still doesn't quite work: it just throws an exception and leaves the content variable set to ERROR. Here's the code I've been using unsuccessfully:
public String downloadfile(String link) {
    String content = "";
    try {
        URL url = new URL(link);
        URLConnection conexion = url.openConnection();
        conexion.connect();
        InputStream is = url.openStream();
        BufferedReader br = new BufferedReader(new InputStreamReader(is));
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = br.readLine()) != null) {
            sb.append(line);
        }
        content = sb.toString();
        br.close();
        is.close();
    } catch (Exception e) {
        content = "ERROR";
        Log.e("ERROR DOWNLOADING",
                "File not Found" + e.getMessage());
    }
    return content;
}
Use this as a downloader (provide a path to save your file, along with the extension, and the exact link of the text file):
public static void downloader(String fileName, String url) throws IOException {
    File file = new File(fileName);
    url = url.replace(" ", "%20");
    URL website = new URL(url);
    if (file.exists()) {
        file.delete();
    }
    ReadableByteChannel rbc = Channels.newChannel(website.openStream());
    FileOutputStream fos = new FileOutputStream(fileName);
    fos.getChannel().transferFrom(rbc, 0, Long.MAX_VALUE);
    fos.close();
    rbc.close();
}
Then call this function to read the text file:
public static String[] read(String fileName) {
    Vector<String> v = new Vector<>(10, 2);
    BufferedReader br = null;
    try {
        br = new BufferedReader(new FileReader(fileName));
        String tmp;
        while ((tmp = br.readLine()) != null) {
            v.add(tmp);
        }
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        if (br != null) {
            try {
                br.close();
            } catch (IOException ignored) {
            }
        }
    }
    String[] result = new String[v.size()];
    Iterator<String> i = v.iterator();
    int count = 0;
    while (i.hasNext()) {
        result[count++] = i.next();
    }
    return result;
}
And then finally the main method:
public static void main(String[] args) throws IOException {
    downloader("D:\\file.txt", "http://www.abcd.com/textFile.txt");
    String[] data = read("D:\\file.txt");
}
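As a side note, on Java 7+ the download-then-read pair above can also be written with java.nio.file. This is just a sketch under that assumption, reusing the same example path and URL; it is not how the answer above does it.

import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.List;

public class NioDownloader {

    public static void main(String[] args) throws IOException {
        Path target = Paths.get("D:\\file.txt");
        URL source = new URL("http://www.abcd.com/textFile.txt");

        // Stream the remote file straight to disk, overwriting any old copy.
        try (InputStream in = source.openStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }

        // Read it back line by line.
        List<String> lines = Files.readAllLines(target, StandardCharsets.UTF_8);
        for (String line : lines) {
            System.out.println(line);
        }
    }
}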
Try this:
try {
    // Create a URL for the desired page (the scheme is required, or new URL() throws MalformedURLException)
    URL url = new URL("http://mysite.com/thefile.txt");
    // Read all the text returned by the server
    BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
    String str;
    StringBuilder sb = new StringBuilder();
    while ((str = in.readLine()) != null) {
        // str is one line of text; readLine() strips the newline character(s)
        sb.append(str);
    }
    in.close();
    String serverTextAsString = sb.toString();
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
I'm experiencing some issues with this code depending on the browser I use: some URLs are displayed correctly in IE but rendered as plain text in Firefox (for instance, www.microsoft.es looks fine in IE but not in Firefox).
I don't know what I'm doing wrong here; I think there's a problem with the headers I'm sending, but I'm not sure...
This is the code:
import java.io.*;
import java.net.*;
import java.util.concurrent.*;

public class Server {

    public void startServer() {
        final ExecutorService clientProcessingPool = Executors.newFixedThreadPool(10);

        Runnable serverTask = new Runnable() {
            @Override
            public void run() {
                try {
                    @SuppressWarnings("resource")
                    ServerSocket serverSocket = new ServerSocket(8080);
                    while (true) {
                        Socket clientSocket = serverSocket.accept();
                        clientProcessingPool.submit(new ClientTask(clientSocket));
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        };

        Thread serverThread = new Thread(serverTask);
        serverThread.start();
    }

    private class ClientTask implements Runnable {
        private Socket clientSocket;

        private ClientTask(Socket clientSocket) {
            this.clientSocket = clientSocket;
        }

        @Override
        public void run() {
            try {
                BufferedReader in = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
                BufferedOutputStream out = new BufferedOutputStream(clientSocket.getOutputStream());

                String url = null;
                int i = 0;
                String[] headers = new String[100];
                String buffer;
                while ((buffer = in.readLine()) != null) {
                    headers[i] = buffer;
                    i++;
                    if (buffer.contains("GET")) {
                        String[] splitText = buffer.split(" ");
                        url = splitText[1];
                    }
                    if (buffer.contains("POST")) {
                        String[] splitText = buffer.split(" ");
                        url = splitText[1];
                    }
                    if (buffer.contains("CONNECT")) {
                        String[] splitText = buffer.split(" ");
                        url = "https://" + splitText[1];
                    }
                    if (buffer.isEmpty()) break;
                }

                URL u = new URL(url);
                URLConnection connection = u.openConnection();
                // Copy the client's request headers (except Accept-Encoding) onto the outgoing request.
                for (int x = 1; x < i - 1; x++) {
                    if (!headers[x].contains("Accept-Encoding:")) {
                        String name = headers[x].substring(0, headers[x].indexOf(":"));
                        String value = headers[x].substring(headers[x].indexOf(":") + 2);
                        connection.setRequestProperty(name, value);
                    }
                }

                boolean redirect = false;
                int status = ((HttpURLConnection) connection).getResponseCode();
                if (status != HttpURLConnection.HTTP_OK) {
                    if (status == HttpURLConnection.HTTP_MOVED_TEMP
                            || status == HttpURLConnection.HTTP_MOVED_PERM
                            || status == HttpURLConnection.HTTP_SEE_OTHER)
                        redirect = true;
                }

                if (redirect) {
                    String location = connection.getHeaderField("Location");
                    URL urlloc = new URL(location);
                    connection = urlloc.openConnection();
                    for (int x = 1; x < i - 1; x++) {
                        if (!headers[x].contains("Accept-Encoding:")) {
                            String name = headers[x].substring(0, headers[x].indexOf(":"));
                            String value = headers[x].substring(headers[x].indexOf(":") + 2);
                            connection.setRequestProperty(name, value);
                        }
                    }
                }

                byte[] chunk = new byte[1024];
                int bytesRead;
                InputStream stream = connection.getInputStream();
                while ((bytesRead = stream.read(chunk)) > 0) {
                    out.write(chunk, 0, bytesRead);
                    out.flush();
                }

                out.close();
                in.close();
                clientSocket.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
Any help would be appreciated.
Thanks.
I'm not sure what the problem is, but you should use a tool like Wireshark to examine the actual network traffic between the browser and your proxy, and compare it to the network traffic between the browser and the site when you connect to the site directly.
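One thing worth looking for in that capture (an assumption on my part, not something diagnosed above): the proxy writes only the upstream response body back to the browser and never a status line or Content-Type header, and a browser that receives headerless output may fall back to showing it as plain text. A rough sketch of echoing the upstream status line and headers before the body, written as a standalone helper rather than as a fix to the code above, might look like this; hop-by-hop headers are skipped because HttpURLConnection has already de-chunked the body.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URLConnection;
import java.nio.charset.StandardCharsets;

public class ResponseForwarder {

    // Writes the upstream status line and headers, then streams the body,
    // instead of sending the body alone as the proxy above does.
    public static void forward(URLConnection connection, OutputStream out) throws IOException {
        StringBuilder head = new StringBuilder();

        // For HttpURLConnection, header index 0 is typically the status line
        // (its key is null); the remaining indices are "Key: value" pairs.
        for (int i = 0; ; i++) {
            String key = connection.getHeaderFieldKey(i);
            String value = connection.getHeaderField(i);
            if (key == null && value == null) {
                break;                                               // no more headers
            }
            if (key != null && (key.equalsIgnoreCase("Transfer-Encoding")
                    || key.equalsIgnoreCase("Connection"))) {
                continue;                                            // hop-by-hop; body below is already de-chunked
            }
            if (key == null) {
                head.append(value).append("\r\n");                   // status line
            } else {
                head.append(key).append(": ").append(value).append("\r\n");
            }
        }
        head.append("\r\n");                                         // end of header block
        out.write(head.toString().getBytes(StandardCharsets.ISO_8859_1));

        // Then stream the body exactly as the original code already does.
        InputStream body = connection.getInputStream();
        byte[] chunk = new byte[1024];
        int bytesRead;
        while ((bytesRead = body.read(chunk)) > 0) {
            out.write(chunk, 0, bytesRead);
        }
        out.flush();
    }
}

If this is the cause, the Wireshark comparison suggested above would show the direct response starting with an HTTP/1.1 status line and headers, while the proxied one starts with raw HTML.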
I have this method that downloads .csv files from Yahoo Finance and saves them locally. It is called in a loop, so it downloads many files from a list. However, sometimes a symbol is entered incorrectly or no longer exists, or the connection times out. How can I amend this method so that connection timeouts are retried and incorrect symbols (meaning the URL does not work) are just skipped over without ending the program?
public static void get_file(String symbol) {
    OutputStream outStream = null;
    URLConnection uCon = null;
    InputStream is = null;
    String finance_url = "http://ichart.finance.yahoo.com/table.csv?s=" + symbol;
    String destination = "C:/" + symbol + "_table.csv";
    try {
        URL Url;
        byte[] buf;
        int ByteRead, ByteWritten = 0;
        Url = new URL(finance_url);
        outStream = new BufferedOutputStream(new FileOutputStream(destination));
        uCon = Url.openConnection();
        is = uCon.getInputStream();
        buf = new byte[size];
        while ((ByteRead = is.read(buf)) != -1) {
            outStream.write(buf, 0, ByteRead);
            ByteWritten += ByteRead;
        }
    } catch (Exception e) {
        System.out.println("Error while downloading " + symbol);
        e.printStackTrace();
    } finally {
        try {
            is.close();
            outStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Why not call the method again when an exception is thrown? You can narrow down the exception type to indicate when a retry should be initiated (a bounded version of this idea is sketched after the code below).
public static void get_file(String symbol) {
    OutputStream outStream = null;
    URLConnection uCon = null;
    InputStream is = null;
    String finance_url = "http://ichart.finance.yahoo.com/table.csv?s=" + symbol;
    String destination = "C:/" + symbol + "_table.csv";
    try {
        URL Url;
        byte[] buf;
        int ByteRead, ByteWritten = 0;
        Url = new URL(finance_url);
        outStream = new BufferedOutputStream(new FileOutputStream(destination));
        uCon = Url.openConnection();
        is = uCon.getInputStream();
        buf = new byte[size];
        while ((ByteRead = is.read(buf)) != -1) {
            outStream.write(buf, 0, ByteRead);
            ByteWritten += ByteRead;
        }
    } catch (Exception e) {
        get_file(symbol); // retry by calling the method again
    } finally {
        try {
            is.close();
            outStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
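To make "narrow down the exception type" concrete, here is one bounded sketch: retry only on timeouts, up to an assumed limit of three attempts, and skip symbols whose URL comes back as a 404, which HttpURLConnection surfaces as FileNotFoundException. The retry limit, the timeouts, the class name, and the 4096-byte buffer standing in for the undeclared size field are my own placeholders, not part of the answer above.

import java.io.BufferedOutputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.SocketTimeoutException;
import java.net.URL;
import java.net.URLConnection;

public class QuoteDownloader {

    private static final int MAX_RETRIES = 3;    // assumed retry limit
    private static final int BUFFER_SIZE = 4096; // stands in for the undeclared "size"

    public static void get_file(String symbol) {
        String finance_url = "http://ichart.finance.yahoo.com/table.csv?s=" + symbol;
        String destination = "C:/" + symbol + "_table.csv";

        for (int attempt = 1; attempt <= MAX_RETRIES; attempt++) {
            try {
                URLConnection uCon = new URL(finance_url).openConnection();
                uCon.setConnectTimeout(10_000);
                uCon.setReadTimeout(10_000);
                try (InputStream is = uCon.getInputStream();
                     OutputStream outStream = new BufferedOutputStream(new FileOutputStream(destination))) {
                    byte[] buf = new byte[BUFFER_SIZE];
                    int bytesRead;
                    while ((bytesRead = is.read(buf)) != -1) {
                        outStream.write(buf, 0, bytesRead);
                    }
                }
                return;                                              // success: stop retrying
            } catch (SocketTimeoutException e) {
                System.out.println("Timeout for " + symbol + " on attempt " + attempt + ", retrying");
            } catch (FileNotFoundException e) {
                System.out.println("Symbol not found, skipping: " + symbol);
                return;                                              // bad symbol: skip, don't retry
            } catch (IOException e) {
                System.out.println("Error while downloading " + symbol);
                e.printStackTrace();
                return;
            }
        }
        System.out.println("Giving up on " + symbol + " after " + MAX_RETRIES + " timeouts");
    }
}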
Hey, I have the following code:
import java.net.*;
import java.io.*;

class OpenStreamTest {
    public static void main(String args[]) {
        try {
            URL yahoo = new URL("http://www.yahoo.com/");
            DataInputStream dis;
            String inputLine;
            dis = new DataInputStream(yahoo.openStream());
            while ((inputLine = dis.readLine()) != null) {
                System.out.println(inputLine);
            }
            dis.close();
        } catch (MalformedURLException me) {
            System.out.println("MalformedURLException: " + me);
        } catch (IOException ioe) {
            System.out.println("IOException: " + ioe);
        }
    }
}
How can I save the source code I get from this to an XML file? Please help.
Create a connection:
DefaultHttpClient httpclient = new DefaultHttpClient();
HttpGet httpget = new HttpGet("http://www.google.com");
HttpResponse response = httpclient.execute(httpget);
HttpEntity ht = response.getEntity();
BufferedHttpEntity buf = new BufferedHttpEntity(ht);
InputStream is = buf.getContent();
Put the InputStream in a buffer and read it:
BufferedReader r = new BufferedReader(new InputStreamReader(is));
StringBuilder total = new StringBuilder();
String line;
while ((line = r.readLine()) != null) {
    total.append(line);
}
Then put it in the file:
File file = new File("/sdcard", "report.xml");
if (!file.exists()) {
    file.createNewFile();
}
FileWriter fw = new FileWriter(file);
fw.write(total.toString());
fw.flush();
fw.close();
Hope this helps.
Here is an example, where "iso" is your InputStream:
try {
    final File file = new File("/sdcard/filename.xml");
    final OutputStream output = new FileOutputStream(file);
    try {
        try {
            final byte[] buffer = new byte[1024];
            int read;
            while ((read = iso.read(buffer)) != -1)
                output.write(buffer, 0, read);
            output.flush();
        } finally {
            output.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
} finally {
    try {
        iso.close();
        System.out.println("saved");
    } catch (IOException e) {
        e.printStackTrace();
    }
}
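On Java 7+ (API level 26+ on Android) the same save-to-file step can be done in a single call with java.nio.file. A small sketch, assuming iso is the InputStream and keeping the same /sdcard/filename.xml target:

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class StreamSaver {

    // Copies the entire stream to the target file, replacing any existing copy,
    // and closes the stream when done.
    public static void save(InputStream iso) throws IOException {
        try (InputStream in = iso) {
            Files.copy(in, Paths.get("/sdcard/filename.xml"), StandardCopyOption.REPLACE_EXISTING);
        }
        System.out.println("saved");
    }
}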