I can pull the user's statuses with no problem using cURL, but when I connect with Java, the XML comes out truncated and my parser wants to cry. I'm testing with small accounts, so it isn't a data-volume problem.
public void getRuserHx() {
    System.out.println("Getting user status history...");
    String https_url = "https://twitter.com/statuses/user_timeline/" + idS.rootUser + ".xml?count=100&page=[1-32]";
    URL url;
    try {
        url = new URL(https_url);
        HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
        con.setRequestMethod("GET");
        con.setReadTimeout(15 * 1000);
        // dump all the content into an xml file
        print_content(con);
    }
    catch (MalformedURLException e) {
        e.printStackTrace();
    }
    catch (IOException e) {
        e.printStackTrace();
    }
    System.out.println("Finished downloading user status history.");
}
private void print_content(HttpsURLConnection con) {
    if (con != null) {
        try {
            BufferedReader br = new BufferedReader(new InputStreamReader(con.getInputStream()));
            File userHx = new File("/" + idS.rootUser + "Hx.xml");
            PrintWriter out = new PrintWriter(idS.hoopoeData + userHx);
            String input;
            while ((input = br.readLine()) != null) {
                out.println(input);
            }
            br.close();
        }
        catch (IOException e) {
            e.printStackTrace();
        }
    }
}
This request doesn't need auth. Sorry about the ugly code; my professor says I/O doesn't matter for the assignment, so mine is a trainwreck.
You have to flush the output stream when you write the content out. Did you flush or close the output stream? In print_content the PrintWriter out is never flushed or closed, so whatever is still sitting in its buffer when the method returns is lost. That is exactly the truncation you're seeing.
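For reference, a minimal sketch of print_content using try-with-resources (Java 7+), which flushes and closes both streams automatically even when an exception is thrown; it assumes idS and its fields exist as in your code, and builds the same file path:

private void print_content(HttpsURLConnection con) {
    if (con == null) return;
    // Declaring both streams in the try header means they are closed
    // (and the PrintWriter flushed) automatically, even if the copy fails.
    try (BufferedReader br = new BufferedReader(new InputStreamReader(con.getInputStream()));
         PrintWriter out = new PrintWriter(idS.hoopoeData + "/" + idS.rootUser + "Hx.xml")) {
        String input;
        while ((input = br.readLine()) != null) {
            out.println(input);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}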
I have a Java application that runs constantly. This application makes HTTP requests to a cloud server. The problem is that with each request the memory consumption increases until the machine completely freezes. I isolated parts of the code and I'm sure the problem is the code block making these HTTP requests. Analyzing the JVM numbers via Prometheus / Grafana, I see that non-heap memory usage (code cache and metaspace) is constantly increasing.
In the monitoring graph, every drop in the line is the point where memory consumption reaches 98% and Monit kills the app.
The method causing this memory consumption is below (it runs approx. 300 times before exhausting the roughly 1.5 GB of memory available at startup).
public AbstractRestReponse send(RestRequest request) {
    BufferedReader in = null;
    OutputStream fout = null;
    URLConnection conn = null;
    InputStreamReader inputStreamReader = null;
    String result = "";
    try {
        MultipartEntityBuilder mb = MultipartEntityBuilder.create(); // org.apache.http.entity.mime
        for (String key : request.getParams().keySet()) {
            String value = (String) request.getParams().get(key);
            // System.out.println(key + " = " + value);
            mb.addTextBody(key, value);
        }
        if (request.getFile() != null) {
            mb.addBinaryBody("file", request.getFile());
        }
        org.apache.http.HttpEntity e = mb.build();
        conn = new URL(request.getUrl()).openConnection();
        conn.setDoOutput(true);
        conn.addRequestProperty(e.getContentType().getName(), e.getContentType().getValue()); // header "Content-Type"...
        conn.addRequestProperty("Content-Length", String.valueOf(e.getContentLength()));
        fout = conn.getOutputStream();
        e.writeTo(fout); // write multipart data...
        inputStreamReader = new InputStreamReader(conn.getInputStream());
        in = new BufferedReader(inputStreamReader);
        String line;
        while ((line = in.readLine()) != null) {
            result += line;
        }
        String text = result.toString();
        return objectMapper.readValue(text, FacialApiResult.class);
    } catch (Exception e) {
        e.printStackTrace();
        return null;
    } finally {
        try {
            inputStreamReader.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        try {
            conn.getInputStream().close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        try {
            fout.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        try {
            in.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
((HttpURLConnection) conn).disconnect() comes to mind. Also, repeated String concatenation in a loop is expensive in both time and memory; use a StringBuilder. And there was a minor bug: readLine() drops the newlines, so they need to be re-added.
NullPointerExceptions may also arise in the finally block when an exception prevented one of the streams from ever being opened, though you should check for that.
public AbstractRestReponse send(RestRequest request) {
    URLConnection conn = null;
    try {
        MultipartEntityBuilder mb = MultipartEntityBuilder.create(); // org.apache.http.entity.mime
        for (String key : request.getParams().keySet()) {
            String value = (String) request.getParams().get(key);
            mb.addTextBody(key, value);
        }
        if (request.getFile() != null) {
            mb.addBinaryBody("file", request.getFile());
        }
        org.apache.http.HttpEntity e = mb.build();
        conn = new URL(request.getUrl()).openConnection();
        conn.setDoOutput(true);
        conn.addRequestProperty(e.getContentType().getName(), e.getContentType().getValue()); // header "Content-Type"...
        conn.addRequestProperty("Content-Length", String.valueOf(e.getContentLength()));
        try (OutputStream fout = conn.getOutputStream()) {
            e.writeTo(fout); // write multipart data...
        }
        StringBuilder result = new StringBuilder(2048);
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(),
                StandardCharsets.UTF_8))) { // explicit charset
            String line;
            while ((line = in.readLine()) != null) {
                result.append(line).append('\n'); // re-add the newline readLine() strips
            }
        }
        String text = result.toString();
        return objectMapper.readValue(text, FacialApiResult.class);
    } catch (Exception e) {
        e.printStackTrace(); // better: a real logger
        return null;
    } finally {
        if (conn instanceof HttpURLConnection) {
            ((HttpURLConnection) conn).disconnect(); // disconnect() does not throw IOException
        }
    }
}
I explicitly defined the charset (UTF-8 might be wrong; for the moment it stands in for the server's default).
Used a StringBuilder, and added the missing newline, which might have led to wrong parsing.
Try-with-resources for auto-closing, applied as early as possible. Hopefully this does not break anything.
Disconnecting the connection when it is an HttpURLConnection. Mind the instanceof, which may matter when unit tests mock the connection.
You seem to have handled all the closing in the finally block. Even so, it's better to use try-with-resources to safely close all Closeable objects if your application runs on Java 7+. That may isolate the problem further even if it doesn't fix it.
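For illustration, a minimal self-contained sketch of the pattern (the URL and the response handling are placeholders): every resource declared in the try header is closed automatically in reverse order, even when an exception is thrown.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class TryWithResourcesDemo {
    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://example.com/api").openConnection(); // placeholder URL
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // placeholder response handling
            }
        } finally {
            // HttpURLConnection is not Closeable, so it cannot go in the try
            // header and must be disconnected by hand if desired
            conn.disconnect();
        }
    }
}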
I'm trying to make a basic client <-> server connection in Java. When writing to the server, the client sends the details correctly, but the server stalls on reading them until the client output stream is closed. However, once the output stream is closed, it apparently closes the whole socket, and because of that the server can't reply to the client. Here's the main snippet of code that handles this interaction.
Client:
private void sendCmd(String cmd) {
    String infoToSend = cmd;
    try {
        socket = new Socket(hostname, port);
        System.out.println("Trying to send: " + com.sun.org.apache.xml.internal.security.utils.Base64.encode(infoToSend.getBytes()));
        out = new DataOutputStream(socket.getOutputStream());
        out.writeBytes(com.sun.org.apache.xml.internal.security.utils.Base64.encode(infoToSend.getBytes()));
        out.flush();
        System.out.println("Socket is flushed");
        System.out.println("Waiting for Data");
        InputStream is = socket.getInputStream();
        System.out.println("Trying to get data");
        BufferedReader input = new BufferedReader(
                new InputStreamReader(is)
        );
        String line;
        while ((line = input.readLine()) != null) {
            System.out.println(line);
        }
        socket.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Server:
public void run() {
    System.out.println("Got Connection");
    try {
        BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
        out = new DataOutputStream(socket.getOutputStream());
        String response;
        System.out.println("Response:");
        String decode = "";
        while ((response = in.readLine()) != null) {
            try {
                decode = new String(Base64.decode(response));
            } catch (Base64DecodingException e) {
                e.printStackTrace();
            }
        }
        System.out.println("Decoded: " + decode);
        out.writeBytes("We got your message!");
        out.flush();
        out.close();
    } catch (IOException e) {
        System.out.println("Fail");
        e.printStackTrace();
    }
}
Would anyone be able to guide me on how to fix this? Sorry if it's super easy and I'm just unable to see it.
Calling
socket.shutdownOutput();
on the client after writing solved the issue. shutdownOutput() sends the TCP FIN, so the server's readLine() loop sees end-of-stream, while the socket's input side stays open for the client to read the reply.
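A minimal, self-contained sketch of the client side (host and port are placeholders), showing where shutdownOutput() fits between the write and the read loop:

import java.io.BufferedReader;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.Socket;

public class ShutdownOutputClient {
    public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket("localhost", 4444)) { // placeholder host/port
            DataOutputStream out = new DataOutputStream(socket.getOutputStream());
            out.writeBytes("hello server\n");
            out.flush();
            socket.shutdownOutput(); // sends FIN: the server's readLine() sees end-of-stream

            // the input side of the socket is still open, so the reply can be read
            BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        } // try-with-resources fully closes the socket here
    }
}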
I am trying to send data to the server when the user's mobile internet becomes active. In my application, whenever the internet connection comes up, my broadcast receiver calls a service method.
Below is the method. I have tried both the POST and GET methods, but it never makes the request to the site. I am not able to print "i am in service5" (printed below), and it does not update my database.
public class LocalService extends Service {
    ....
    ....
    public void sendfeedback() {
        System.out.println("i am in service4");
        String filename = MainScreenActivity.UserName;
        HttpURLConnection urlConnection = null;
        final String target_uri = "http://readoline.com/feedback.php";
        try {
            BufferedReader mReader = new BufferedReader(new InputStreamReader(getApplicationContext().openFileInput(filename)));
            String line;
            StringBuffer buffer = new StringBuffer();
            while ((line = mReader.readLine()) != null) {
                buffer.append(line + "\n");
            }
            System.out.println(buffer.toString());
            try {
                System.out.println("i am in service41" + buffer.toString());
                Uri buildUri = Uri.parse("http://readoline.com/feedback.php" + "?").buildUpon()
                        .appendQueryParameter("user_id", filename)
                        .appendQueryParameter("feedback", buffer.toString())
                        .build();
                URL url = new URL(buildUri.toString());
                urlConnection = (HttpURLConnection) url.openConnection();
                urlConnection.setRequestMethod("GET");
                urlConnection.connect();
                System.out.println("i am in service44");
                /* PrintWriter out = new PrintWriter(urlConnection.getOutputStream());
                out.write("user_id=" + filename);
                out.write("&");
                out.write("feedback=" + buffer.toString());
                */
                System.out.println("i am in service5");
                // Log.e(LOG_TAG, dataToSend.toString());
                // out.close();
                // Read the input stream into a String
            } catch (IOException e) {
                System.out.println("i am in service6");
                e.printStackTrace();
            } finally {
                if (urlConnection != null) {
                    urlConnection.disconnect();
                }
            }
        } catch (Exception e) {
        }
    }
}
I've created a small scraping class and the method below reads in the text from a page.
However, I've found that the method fails to close the connection properly. This results in a ton of open connections, which caused my hosting company to suspend my account. Is the code below correct?
private String getPageText(String urlString) {
    String pageText = "";
    BufferedReader reader = null;
    try {
        URL url = new URL(urlString);
        reader = new BufferedReader(new InputStreamReader(url.openStream()));
        StringBuilder builder = new StringBuilder();
        int read;
        char[] chars = new char[1024];
        while ((read = reader.read(chars)) != -1) {
            builder.append(chars, 0, read);
        }
        pageText = builder.toString();
    } catch (MalformedURLException e) {
        Log.e(CLASS_NAME, "getPageText.MalformedUrlException", e);
    } catch (IOException e) {
        Log.e(CLASS_NAME, "getPageText.IOException", e);
    } finally {
        if (reader != null) {
            try {
                reader.close();
            } catch (IOException e) {
                Log.e(CLASS_NAME, "getPageText.IOException", e);
            }
        }
    }
    return pageText;
}
Your code is fine in the success case but will potentially leak connections in the failure cases (when the http server returns a 4xx or 5xx status code). In these cases HttpURLConnection provides the response body via .getErrorStream() rather than .getInputStream() and you should make sure to drain and close that stream as well.
URLConnection conn = null;
BufferedReader reader = null;
try {
    conn = url.openConnection();
    reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    // ...
} finally {
    if (reader != null) {
        // ...
    }
    if (conn instanceof HttpURLConnection) {
        InputStream err = ((HttpURLConnection) conn).getErrorStream();
        if (err != null) {
            byte[] buf = new byte[2048];
            while (err.read(buf) >= 0) {
                // drain the error body so the connection can be reused
            }
            err.close();
        }
    }
}
There probably needs to be another layer of try/catch inside that finally, but you get the idea. You should not explicitly .disconnect() the connection unless you're sure there won't be any more requests to URLs on that host in the near future, since disconnect() prevents subsequent requests from being pipelined over the existing connection, which for HTTPS in particular slows things down considerably.
You are just closing the stream and not the connection; use the following structure:
URL u = new URL(url);
HttpURLConnection conn = (HttpURLConnection) u.openConnection();
conn.connect();
reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
and then:
} finally {
    if (reader != null) {
        try {
            reader.close();
        } catch (IOException e) {
            Log.e(CLASS_NAME, "getPageText.IOException", e);
        }
    }
    try {
        if (conn != null) {
            conn.disconnect();
        }
    } catch (Exception ex) {
        // ignore
    }
}
I am trying to read the first line of a URL.
Then I want to use that as a string later in the code.
Can anyone help me?
I already tried this:
public static String main(String[] args) {
    try {
        URL url = new URL("myurlhere");
        // read text returned by server
        BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
        String line;
        while ((line = in.readLine()) != null) {
            return line;
        }
        in.close();
    }
    catch (MalformedURLException e) {
        System.out.println("Malformed URL: " + e.getMessage());
    }
    catch (IOException e) {
        System.out.println("I/O Error: " + e.getMessage());
    }
    return null;
}
I just can't get a string out of it.
You can consider using jsoup for your purpose:
try {
    Document doc = Jsoup.connect("http://popofibo.com/pop/swaying-views-of-our-past/").get();
    Elements paragraphs = doc.select("p");
    for (Element p : paragraphs) {
        System.out.println(p.text());
    }
} catch (IOException e) {
    e.printStackTrace();
}
Output:
It is indeed difficult to argue over the mainstream ideas of evolution of human civilizations...
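Since you only need the first line, you can take just the first matching element instead of looping (this reuses the doc from the snippet above; the "p" selector is an assumption about the page's markup):

// first() returns the first matched element, or null when nothing matched
Element firstParagraph = doc.select("p").first();
if (firstParagraph != null) {
    String firstLine = firstParagraph.text(); // the string to use later in your code
}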
If you want to read from a file on the internet using a URL, you should use URLConnection. Here is a simple example:
String string = "";
try {
URLConnection connection = new URL(
"http://myurl.org/mypath/myfile")
.openConnection();
Scanner scanner = new Scanner(connection.getInputStream());
while (scanner.hasNext()) {
string += scanner.next() + " ";
}
scanner.close();
} catch (IOException e) {
e.printStackTrace();
}
// Do something with the string.
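Note that this reads the whole response token by token. If you really only want the first line, a single readLine() call is simpler; a sketch using the same placeholder URL:

String firstLine = null;
try (BufferedReader in = new BufferedReader(new InputStreamReader(
        new URL("http://myurl.org/mypath/myfile").openStream()))) {
    firstLine = in.readLine(); // null if the response body is empty
} catch (IOException e) {
    e.printStackTrace();
}
// firstLine now holds the first line of the page, ready for later use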