I added the FindBugs plugin to my project and suddenly started getting the following bug: Dereference of the result of readLine() without null check
I have the following code, which reads the HTTP request line by line:
InputStream input = clientSocket.getInputStream();
BufferedReader in = new BufferedReader(new InputStreamReader(input));
String line;
while (!(line = in.readLine()).equals("")) {
    ...
}
I tried rewriting this into some other form with a null check:
String line = "";
while (line != null) {
line = in.readLine();
if (line.equals("")) return;
}
But this gets stuck forever (so it is not rewritten correctly). I am sorry for such a basic question but I can't seem to get it right...
Another thing that is marked as a bug is Found reliance on default encoding in ..InputStream...
How can I specify the encoding in InputStreamReader?
The fixed loop looks like so:
InputStream input = clientSocket.getInputStream();
BufferedReader in = new BufferedReader(new InputStreamReader(input));
String line;
while (null != (line = in.readLine())) {
    if ("".equals(line)) break;
    ...
}
Why? First of all, if the remote side (the client) closes the connection, readLine() will return null. That's what the outer check guards against.
readLine() simply blocks and won't return at all if the client just stops sending data. So as long as the client keeps the connection open without sending anything, your "fixed" loop hangs.
When comparing against string literals, I always put the literal first:
"".equals(line)
This never throws, even when line is null. It's also often more readable, since you usually want to know what you're comparing against; the variable you're checking is less "informative".
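As for the second FindBugs warning (reliance on the default encoding): pass an explicit charset when you construct the InputStreamReader. A minimal sketch, assuming UTF-8 is the encoding you actually want (use whatever your protocol specifies):
import java.nio.charset.StandardCharsets;
// same reader as above, but with an explicit charset so the behaviour
// no longer depends on the platform default encoding
BufferedReader in = new BufferedReader(
        new InputStreamReader(clientSocket.getInputStream(), StandardCharsets.UTF_8));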
Apparently readLine() can return null, so you have to check for that after line = in.readLine();
Your updated code could still throw a NullPointerException, if readLine returned null.
I doubt that your change will work, since the check is being made on the previous value of line. Thus, if your previous line was valid (but you were reading the last line), any subsequent call can potentially yield a NullPointerException.
To get around this, the following pattern is usually applied:
InputStream input = clientSocket.getInputStream();
BufferedReader in = new BufferedReader(new InputStreamReader(input));
String line;
while ((line = in.readLine()) != null) {
    ...
}
Related
I am trying to parse HTML from a website to get very specific data. The following method reads the source and outputs it as a string to be processed by other methods.
StringBuilder source = new StringBuilder();
URL url = new URL(urlIn);
URLConnection spoof = url.openConnection();
spoof.setRequestProperty("User-Agent", "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0; H010818)");
BufferedReader in = new BufferedReader(new InputStreamReader(spoof.getInputStream()));
String strLine = "";
while ((strLine = in.readLine()) != null) {
    source.append(strLine);
}
return source.toString();
The problem I'm having is that, since I call this method multiple times with a different urlIn argument each time, sometimes the method gets stuck at the readLine call. I read that this is because readLine looks for a line break, and if the BufferedReader does not contain one for whatever reason, it will block indefinitely.
Is there a way to check whether my BufferedReader object contains a line break before I run the readLine call? I tried using if (in.toString().contains("\n")) but that always returns false. Alternatively, could I add a "\n" at the end of my BufferedReader "in" object every time, just so that the while loop would break and not hang indefinitely?
Any help would be appreciated.
Okay, this here should be what you are looking for.
FileInputStream fis = new FileInputStream("C:/sample.txt");
BufferedReader reader = new BufferedReader(new InputStreamReader(fis));
System.out.println("Reading File line by line using BufferedReader");
String line = reader.readLine();
while (line != null) {
    System.out.println(line);
    line = reader.readLine();
}
Read more: http://javarevisited.blogspot.com/2012/07/read-file-line-by-line-java-example-scanner.html#ixzz3g4RHvy6V
Edit: in your case, since it seems like you are doing webapp testing, I believe WebDriverWait may work for your needs.
This is not true. BufferedReader.readLine() will not block if the underlying stream has reached the end of input. It will return null. See http://docs.oracle.com/javase/7/docs/api/java/io/BufferedReader.html#readLine().
If your method is getting stuck there is another explanation.
Carefully check all of your exception handling and stream closing logic.
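If it turns out the server is keeping the connection open without ever closing the stream, you can also put explicit timeouts on the connection so a blocked read fails with a SocketTimeoutException instead of hanging. A rough sketch of the fetch logic along those lines (the 10-second timeouts and the re-appended '\n' are just illustrative choices):
URLConnection spoof = new URL(urlIn).openConnection();
spoof.setConnectTimeout(10000); // give up if the connection cannot be established in 10 s
spoof.setReadTimeout(10000);    // a blocked read throws SocketTimeoutException after 10 s
spoof.setRequestProperty("User-Agent", "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0; H010818)");
StringBuilder source = new StringBuilder();
// try-with-resources closes the reader (and the underlying stream) even if an exception is thrown
try (BufferedReader in = new BufferedReader(new InputStreamReader(spoof.getInputStream()))) {
    String strLine;
    while ((strLine = in.readLine()) != null) {
        source.append(strLine).append('\n');
    }
}
return source.toString();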
I have the following code for compressing and decompressing a string.
public static byte[] compress(String str)
{
try
{
ByteArrayOutputStream obj = new ByteArrayOutputStream();
GZIPOutputStream gzip = new GZIPOutputStream(obj);
gzip.write(str.getBytes("UTF-8"));
gzip.close();
return obj.toByteArray();
}
catch (IOException e)
{
e.printStackTrace();
}
return null;
}
public static String decompress(byte[] bytes)
{
try
{
GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream(bytes));
BufferedReader bf = new BufferedReader(new InputStreamReader(gis, "UTF-8"));
StringBuilder outStr = new StringBuilder();
String line;
while ((line = bf.readLine()) != null)
{
outStr.append(line);
}
return outStr.toString();
}
catch (IOException e)
{
return e.getMessage();
}
}
I compress into a byte array on Windows, then send the byte array through a socket to Linux and decompress it there. However, upon decompression it seems that all my newline characters are gone.
So I thought the problem was the Linux-to-Windows relationship. However, I tried writing a simple program on Windows that uses the same code, and found that the newlines are still gone.
Can anyone shed any light on what causes this? I can't figure out any explanation.
I think the problem is here:
while ((line = bf.readLine()) != null)
{
outStr.append(line);
}
readLine() sees the newline character but doesn't include it in the returned value for line.
The problem is worse than you think, perhaps.
readLine() gets all the characters up to, but not including, a newline (or some variety of returns and linefeed characters) OR the end of file. So you don't know if the last line you get had a newline on the end or not.
This might not matter, and if so, you can just add this following the other append:
outStr.append('\n');
Some files might end up with an extra line ending at the end of file.
If it does matter, you will need to use read() and then output all the characters you receive. In that case, you might end up with the infamous "What's at the end of the line?" problem you allude to between Windows, Linux and the MacOS and the way they use different combinations of return and new-line characters to end lines.
It is not GZIP that is "eating" newlines.
It is this code:
while ((line = bf.readLine()) != null)
{
outStr.append(line);
}
The readLine() method reads a line (up to and including a line termination sequence) and then returns it without a newline. You then append it to outStr ... without replacing the line termination that was stripped.
But even if you replaced the line termination, you can't guarantee to preserve the actual line termination sequence that was used ... if you do it that way.
I recommend that you replace the readLine() calls with read() calls; i.e. read and then buffer the data one character at a time. It solves two problems at once. It may even be faster, because you are avoiding the unnecessary overhead of assembling line Strings.
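For example, a rough sketch of decompress along those lines (using a char[] buffer rather than single-character read() calls, purely for efficiency; it declares throws IOException instead of swallowing the exception):
public static String decompress(byte[] bytes) throws IOException
{
    GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream(bytes));
    InputStreamReader reader = new InputStreamReader(gis, "UTF-8");
    StringBuilder outStr = new StringBuilder();
    char[] buf = new char[4096];
    int n;
    // read() hands back the raw characters, including '\r' and '\n', so line endings survive
    while ((n = reader.read(buf)) != -1)
    {
        outStr.append(buf, 0, n);
    }
    reader.close();
    return outStr.toString();
}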
I have the following code. What I would like to do is read each line from the BufferedReader directly into a StringBuffer to reduce memory overhead. Once it gets to the end of the data stream I would like it to exit the while loop.
StringBuffer line = new StringBuffer();
URL url = new URL("a url");
BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream()));
int count = 0;
while(line.append(reader.readLine()) != null){
System.out.println(line.toString());
line.delete(0,line.length());
}
It reads the stream fine, but when I get to the end of the stream it returns null and keeps printing null without exiting the loop. Any ideas?
This while(line.append(reader.readLine()) != null) is checking the value returned by append(), which is the StringBuffer itself and is never null, so the condition can never become false.
The other problem you have is that null is being converted to the literal String "null" when it is appended. That's why it's printing "null"; the value isn't actually null - confused yet?
Instead, try something like...
String text = null;
while ((text = reader.readLine()) != null) {
    line.append(text);
    System.out.println(line.toString());
    line.delete(0, line.length());
}
Updated
While I'm here, I might point out that you are actually not saving yourself anything.
readLine will create a String object, which you're then putting into a StringBuffer. You're not actually saving any memory, just complicating the process.
If you're really worried about creating lots of String objects in memory, then use BufferedReader#read(char[]) instead. Append the resulting character array to the StringBuffer.
Also, unless you need synchronized access to the StringBuffer, use StringBuilder instead, it's faster.
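A rough sketch of that approach, reusing the variable names from the question (the URL is a placeholder, as before):
StringBuilder line = new StringBuilder();
URL url = new URL("a url");
BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream()));
char[] buf = new char[8192];               // one reusable buffer instead of a new String per line
int count;
while ((count = reader.read(buf)) != -1) { // read() returns -1 at end of stream
    line.append(buf, 0, count);
}
System.out.println(line.toString());
reader.close();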
This works perfectly. You just have to catch the NullPointerException:
while(line.append(reader.readLine().toString()) != null){
You could try the same with this for-loop:
for (String line; (line = reader.readLine()) != null;) {
System.out.println(line); // Or whatever
}
I have the following piece of code:
FileInputStream fis = new FileInputStream(new File(st[0]));
BufferedReader br = new BufferedReader(new InputStreamReader(fis));
while(fis.available()!=-1)
{
System.out.println(br.readLine());
System.out.println(fis.available());
}
The first println statement prints the whole of my file, but the second println statement always shows 0. Why, when there is actual content to read, does it show 0?
And what should I put as the end condition here?
You want to stop when readLine() returns null, something like this:
String sCurrentLine;
br = new BufferedReader(new FileReader("C:\\testing.txt"));
while ((sCurrentLine = br.readLine()) != null) {
System.out.println(sCurrentLine);
}
The first println statement prints the whole of my file, but the second println statement always shows 0.
You're checking available() twice. After you've read some data, it's no longer available to read, so the available() value printed is different from the one used in the loop condition above.
Secondly, you're reading from the BufferedReader, which does its own buffering of the data from the input stream. That means it's wrong to then go behind the reader's back and call the available method of the underlying input stream!
Try this:
for (;;) {
String line = br.readLine();
if (line == null) break;
System.out.println(line);
}
available() returns the number of bytes that can be read from that InputStream without blocking. Your readLine() calls go through the BufferedReader, which has already pulled that data out of the underlying InputStream into its own buffer.
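To see this in action, a small sketch (assuming a C:/sample.txt that is smaller than the reader's internal buffer, which defaults to 8192 chars):
FileInputStream fis = new FileInputStream("C:/sample.txt");
BufferedReader br = new BufferedReader(new InputStreamReader(fis));
System.out.println(fis.available()); // roughly the file size in bytes
br.readLine();                       // the BufferedReader fills its buffer from fis
System.out.println(fis.available()); // typically 0 now - the reader already holds the rest
br.close();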
I am trying to read text from a web document using a BufferedReader over an InputStreamReader on a URL (to a file on some Apache server).
String result = "";
URL url = new URL("http://someserver.domain/somefile");
BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream(), "iso-8859-1"));
result += in.readLine();
Now this works just fine. But obviously I'd like the reader not to read just one line, but as many as there are in the file.
Looking at the BufferedReader API the following code should do just that:
while (in.ready()) {
result += in.readLine();
}
I.e. read lines while there are more lines, stop when there are no more. However, this code does not work - the reader just never reports ready() == true!
I can even print the ready() value right before reading a line (which reads the correct string from the file) but the reader will report 'false'.
Am I doing something wrong? Why does the BufferedReader return 'false' on ready when there is actually stuff to read?
ready() != has more
ready() does not indicate that there is more data to be read. It only tells you whether a read could block the thread. It is likely to return false before you have read all the data.
To find out if there is no more data, check whether readLine() returns null.
String line = in.readLine();
while(line != null){
...
line = in.readLine();
}
Another way you can do this that bypasses the in.ready() is something like:
while ((nextLine = in.readLine()) != null) {
result += nextLine;
}
You will just continue reading until you are done. This way you do not need to worry about the problem with in.ready().
I think the standard way to write this is to just attempt to read the line and verify that it returned something. Something like this:
String nextLine;
while ((nextLine = in.readLine()) != null) {
    //System.out.println(nextLine);
    result += nextLine;
}
So you just continue to go until you get null returned from the stream. See here for extra information:
http://download.oracle.com/javase/1.5.0/docs/api/java/io/BufferedReader.html#readLine()
The BufferedReader.ready() method is behaving as specified:
The Reader.ready() javadoc says the following:
[Returns] true if the next read() is guaranteed not to block for input, false otherwise. Note that returning false does not guarantee that the next read will block.
Then the BufferedReader.ready() javadoc says the following:
Tells whether this stream is ready to be read. A buffered character stream is ready if the buffer is not empty, or if the underlying character stream is ready.
If you put these two together, it is clear that BufferedReader.ready() can return false in situations where there are characters available. In short, you shouldn't rely on ready() to test for logical end-of-file or end-of-stream.
This is what we have been using consistently for years - not sure if it is the "standard" method. I'd like to hear comments about the pros and cons of using URL.openStream() directly, and whether that is causing the OP's problems. This code works for both HTTP and HTTPS connections.
URL getURL = new URL(servletURL.toString() + identifier + "?" + key + "=" + value);
URLConnection uConn = getURL.openConnection();
BufferedReader br = new BufferedReader(new InputStreamReader(uConn.getInputStream()));
for (String s = br.readLine(); s != null; s = br.readLine()) {
    System.out.println("[ServletOut] " + s);
    // do stuff with s
}
br.close();
Basically the BufferedReader.ready() method can be used to check whether the underlying stream is ready to provide data to the caller; otherwise we can make the thread wait for some time until it becomes ready.
But the real problem is that after we have completely read the data stream, it returns false, so we can't tell whether the stream has been fully read or the underlying stream is just busy.
If you want to use in.ready(), the following worked well for me:
for (int i = 0; i < 10; i++) {
System.out.println("is InputStreamReader ready: " + in.ready());
if (!in.ready()) {
Thread.sleep(1000);
} else {
break;
}
}