FileReader null declarations and appending best practice - Java

I want to optimise my file reader function but am not sure if it is best practice to declare the nulls outside of the try block. Also, is looping and appending chars to a StringBuffer considered bad practice? I would like to use the exception handling here, but maybe it is better to use another structure? Any advice is most welcome, thanks.
public String readFile() {
    File f = null;
    FileReader fr = null;
    StringBuffer content = null;
    try {
        f = new File("c:/test.txt");
        fr = new FileReader(f);
        int c;
        while ((c = fr.read()) != -1) {
            if (content == null) {
                content = new StringBuffer();
            }
            content.append((char) c);
        }
        fr.close();
    } catch (Exception e) {
        throw new RuntimeException("An error occurred reading your file");
    }
    return content.toString();
}

Advice:
1. Indent your code properly. The stuff in your question looks like a dog's breakfast.
2. You don't need to initialize f inside the try / catch block. The constructor can't throw an Exception the way you are using it.
3. In fact, you don't need to declare it at all. Just inline the new File(...).
4. In fact, you don't even need to do that. Use the FileReader(String) constructor.
5. There's no point initializing the StringBuffer inside the loop. The potential performance benefit is small and only applies in the edge case where the file is empty or doesn't exist. In all other cases, this is an anti-optimization.
6. Don't catch Exception. Catch the exceptions that you expect to be thrown and allow all other exceptions to propagate. The unexpected exceptions are going to be due to bugs in your program, and need to be handled differently from others.
7. When you catch an exception, don't throw away the evidence. For an unexpected exception, either print / log the exception, its message and its stacktrace, or pass it as the 'cause' of the exception that you throw (see the short example after this list).
8. The FileReader should be closed in a finally clause. In your version of the code, the FileReader won't be closed if there is an exception after the object has been created and before the close() call. That will result in a leaked file descriptor and could cause problems later in your application.
9. Better yet, use the new Java 7 "try with resource" syntax, which takes care of closing the "resource" automatically (see below).
10. You are reading from the file one character at a time. This is very inefficient. You need to either wrap the Reader in a BufferedReader, or read a large number of characters at a time using (for example) read(char[], int, int).
11. Use StringBuilder rather than StringBuffer ... unless you need a thread-safe string assembler.
12. Wrapping exceptions in RuntimeException is bad practice. It makes it difficult for the caller to handle specific exceptions ... if it needs to ... and even makes printing a decent diagnostic more difficult. (And that assumes that you didn't throw away the original exception like your code does.)
Note: if you follow the advice of point 8 and not 9, you will find that the initialization of fr to null is necessary if you open the file in the try block.
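If you do keep the wrap-and-rethrow approach, points 7 and 12 amount to preserving the original exception as the cause. A minimal sketch (readFileOrFail is an illustrative name; it delegates to the IOException-throwing version shown below):
public String readFileOrFail() {
    try {
        return readFile(); // the version declared with "throws IOException" below
    } catch (IOException e) {
        // Keep the original exception as the cause so callers can still diagnose it.
        throw new RuntimeException("An error occurred reading the file", e);
    }
}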
Here's how I'd write this:
public String readFile() throws IOException {
    // Using the Java 7 "try with resource" syntax.
    try (FileReader fr = new FileReader("c:/test.txt")) {
        BufferedReader br = new BufferedReader(fr);
        StringBuilder content = new StringBuilder();
        int c;
        while ((c = br.read()) != -1) {
            content.append((char) c);
        }
        return content.toString();
    }
}
A further optimization would be to use File.length() to find out the file size (in bytes) and use that as the initial capacity of the StringBuilder. However, if the files are typically small, this is likely to make the application slower.
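A rough sketch of that idea (same hard-coded path as above; File.length() returns bytes, so this is only a heuristic for multi-byte encodings, and the int cast assumes the file fits in an int):
public String readFile() throws IOException {
    File f = new File("c:/test.txt");
    // Pre-size the builder from the file length to avoid repeated resizing.
    StringBuilder content = new StringBuilder((int) f.length());
    try (BufferedReader br = new BufferedReader(new FileReader(f))) {
        int c;
        while ((c = br.read()) != -1) {
            content.append((char) c);
        }
    }
    return content.toString();
}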

public String readFile() {
    File f = new File("/Users/Guest/Documents/workspace/Project/src/test.txt");
    FileReader fr = null;
    BufferedReader br = null;
    StringBuilder content = new StringBuilder();
    try {
        fr = new FileReader(f);
        br = new BufferedReader(fr);
        //int c;
        //while ((c = fr.read()) != -1) {
        //    content.append((char) c);
        //}
        String line = null;
        while ((line = br.readLine()) != null) {
            content.append(line);
        }
        fr.close();
        br.close();
    } catch (Exception e) {
        // do something
    }
    return content.toString();
}
Use a BufferedReader and you'll get a 70%+ improvement; use StringBuilder instead of StringBuffer unless you need synchronization.
I ran it on a 10MB file 50 times and averaged the results.
There is no need to put anything that does not need exception handling inside the try block.
There is no need for that if clause: it will be true only once, so you are wasting time checking it for every character.
There are no runtime exceptions to throw.
Results, from fastest combination to slowest:
StringBuilder and BufferedReader line by line: 211 ms
StringBuffer and BufferedReader line by line: 213 ms
StringBuilder and BufferedReader char by char: 348 ms
StringBuffer and BufferedReader char by char: 372 ms
StringBuilder and FileReader char by char: 878 ms
StringBuffer and FileReader char by char: 935 ms
String: extremely slow
So use StringBuilder + BufferedReader and read line by line for best results.

Related

How to use a regular expression to parse a text file and write the result on another file in Java

I used a regular expression to parse a text file so that I can use the resulting groups one and two as follows:
write group two to another file
make its name be group one
Unfortunately, no data is written to the file!
I could not figure out where the problem is; here is my code:
package javaapplication5;

import java.io.*;
import java.util.regex.*;

public class JavaApplication5 {
    public static void main(String[] args) {
        // TODO code application logic here
        try {
            FileInputStream fstream = new FileInputStream("C:/Users/Welcome/Desktop/End-End-Delay.txt");
            DataInputStream in = new DataInputStream(fstream);
            BufferedReader br = new BufferedReader(new InputStreamReader(in));

            File newFile1 = new File("C:/Users/Welcome/Desktop/AUV1.txt");
            FileOutputStream fos1 = new FileOutputStream(newFile1);
            BufferedWriter bw1 = new BufferedWriter(new OutputStreamWriter(fos1));

            String strLine;
            while ((strLine = br.readLine()) != null) {
                Pattern p = Pattern.compile("sender\\sid:\\s(\\d+).*?End-End\\sDelay:(\\d+(?:\\.\\d+)?)");
                Matcher m = p.matcher(strLine);
                while (m.find()) {
                    String b = m.group(1);
                    String c = m.group(2);
                    int i = Integer.valueOf(b);
                    if (i == 0) {
                        System.out.println(b);
                        bw1.write(c);
                        bw1.newLine();
                    }
                    System.out.println(b);
                    // System.out.println(c);
                }
            }
        } catch (Exception e) {
            System.err.println("Error: " + e.getMessage());
        }
    }
}
Can anyone here help me solve this problem and identify it?
You are using a BufferedWriter and never flush it (flushing a writer pushes its contents to disk) or even close it at the end of your program.
Because of this, the program exits before the contents held in the BufferedWriter are written to the actual file on disk, so the contents get lost.
To avoid this, you can either call flush() just after writing the contents to bw1:
bw1.write(c);
bw1.newLine();
bw1.flush();
Or, before your program ends, call:
bw1.close(); // this ensures all content in the buffered writer gets pushed to disk before the JVM exits
Calling flush every time you write data is not really recommended, as it defeats the purpose of buffered writing.
So it is best to close the BufferedWriter object. You can do that in two ways:
Try-with-resources (see the sketch after this list)
Manually close the BufferedWriter object at the end, ideally in a finally block, so as to ensure it always gets called.
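A rough sketch of the try-with-resources route for this program (paths are the ones from the question; FileReader/FileWriter are used here as a simplification of the stream chains, and the regex logic from the question goes inside the loop):
public static void main(String[] args) {
    try (BufferedReader br = new BufferedReader(
             new FileReader("C:/Users/Welcome/Desktop/End-End-Delay.txt"));
         BufferedWriter bw1 = new BufferedWriter(
             new FileWriter("C:/Users/Welcome/Desktop/AUV1.txt"))) {
        String strLine;
        while ((strLine = br.readLine()) != null) {
            // ... regex matching and bw1.write(...) / bw1.newLine() as in the question ...
        }
    } catch (IOException e) {
        System.err.println("Error: " + e.getMessage());
    }
    // Both br and bw1 are flushed and closed automatically at the end of the try block.
}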
Besides all this, you need to ensure that your regex actually matches and that your condition,
if (i == 0) {
gets executed; otherwise the code that writes data to the file won't run, and of course in that case nothing will be written to the file.
Also, it is strongly recommended to close any resources you open: file resources, database resources (Connection, Statement, ResultSet), etc.
Hope that helps.

Releasing resources acquired by BufferedReader, InputStreamReader and InputStream

I've got the following piece of code:
public ArrayList<Crime> loadCrimes() throws IOException, JSONException {
    ArrayList<Crime> crimes = new ArrayList<Crime>();
    BufferedReader reader = null;
    try {
        // Open and read the file into a StringBuilder
        InputStream in = mContext.openFileInput(mFilename);
        // what if an exception gets thrown in the line below?
        reader = new BufferedReader(new InputStreamReader(in));
        StringBuilder jsonString = new StringBuilder();
        String line = null;
        while ((line = reader.readLine()) != null) {
            // Line breaks are omitted and irrelevant
            jsonString.append(line);
        }
        // Parse the JSON using JSONTokener
        JSONArray array = (JSONArray) new JSONTokener(jsonString.toString()).nextValue();
        // Build the array of crimes from JSONObjects
        for (int i = 0; i < array.length(); i++) {
            crimes.add(new Crime(array.getJSONObject(i)));
        }
    } catch (FileNotFoundException e) {
        // Ignore this one; it happens when starting fresh
    } finally {
        if (reader != null)
            reader.close();
    }
    return crimes;
}
First, I wondered why we call .close() just on the BufferedReader object and not on the InputStream and InputStreamReader objects. I checked the official Oracle documentation and skimmed through a couple of questions on Stack Overflow, and according to what I've read, BufferedReader.close() takes care of releasing the resources acquired by the InputStreamReader and the InputStream, so I don't have to call .close() on them.
Is this correct?
Secondly, I wondered what would happen if an exception got thrown after creating the InputStream object and before creating the BufferedReader object. That is, either the InputStreamReader(InputStream in) constructor or the BufferedReader(Reader in) constructor throws an exception. In that case we have acquired a file resource with the InputStream, but the BufferedReader reference is still null, so in the finally block the .close() method will not be invoked and the InputStream will not release the resources it has acquired.
Then I read the Oracle documentation, and according to it neither of those two constructors throws an exception. So it's not possible for an exception to occur between those two lines, right?
My final question is "Does this piece of code make sure it releases all resources it acquires?"
First, I wondered why we call .close() just on the BufferedReader object and not on the InputStream and InputStreamReader objects. I checked the official Oracle documentation and skimmed through a couple of questions on Stack Overflow, and according to what I've read, BufferedReader.close() takes care of releasing the resources acquired by the InputStreamReader and the InputStream, so I don't have to call .close() on them.
Is this correct?
Yes, it is.
My final question is "Does this piece of code make sure it releases all resources it acquires?"
Most probably, yes. Even if some odd exception occurs when creating the InputStreamReader or the BufferedReader, they will eventually be closed automatically after you exit the block where they are defined, once the GC collects them. Of course, it's usually not a good idea to count on the GC to clean up for you, so if you want to be 100% sure, you might want to do it yourself in the finally block.
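If you do want that 100% guarantee without a hand-written finally block, a try-with-resources sketch of the same method could look roughly like this (assuming Java 7+ / an Android API level that supports it; declaring both resources ensures in is closed even if constructing a wrapper fails):
public ArrayList<Crime> loadCrimes() throws IOException, JSONException {
    ArrayList<Crime> crimes = new ArrayList<Crime>();
    // Declaring both resources ensures 'in' is closed even if wrapping it fails.
    try (InputStream in = mContext.openFileInput(mFilename);
         BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
        StringBuilder jsonString = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            jsonString.append(line);
        }
        JSONArray array = (JSONArray) new JSONTokener(jsonString.toString()).nextValue();
        for (int i = 0; i < array.length(); i++) {
            crimes.add(new Crime(array.getJSONObject(i)));
        }
    } catch (FileNotFoundException e) {
        // Ignore this one; it happens when starting fresh, as in the original code.
    }
    return crimes;
}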
Another easier way is to use a FileReader which will simplify the code by only using two objects:
BufferedReader reader = new BufferedReader(new FileReader(fileName));
(I am assuming you want to read from a file).

Is closing a BufferedReader compulsory?

I am trying an example from
http://www.roseindia.net/java/beginners/java-read-file-line-by-line.shtml
In the example the BufferedReader is not closed. Is it necessary to close the BufferedReader or not? Please explain.
FileInputStream fstream = new FileInputStream("textfile.txt");
DataInputStream in = new DataInputStream(fstream);
BufferedReader br = new BufferedReader(new InputStreamReader(in));
String strLine;
// Read File Line By Line
while ((strLine = br.readLine()) != null) {
    // Print the content on the console
    System.out.println(strLine);
}
// Close the input stream
in.close();
Always close streams. It's a good habit which helps you avoid some odd behaviour. Calling close() also calls flush(), so you don't have to do this manually.
The best place to close streams is probably a finally block. If you have it like in your example and an exception occurs before the in.close() line, the stream won't be closed.
And if you have chained streams, you only need to close the last one; all the streams before it are closed too. This means br.close() in your example - not in.close().
Example
try {
    // do something with streams
} catch (IOException e) {
    // process exception - log, wrap into your runtime, whatever you want to...
} finally {
    try {
        stream.close();
    } catch (IOException e) {
        // error - log it at least
    }
}
Alternatively, you can use closeQuietly(java.io.InputStream) from the Apache Commons IO library.
From the perspective of resource leak prevention, it is not strictly necessary to close a wrapper stream if you've also closed the stream that it wraps. However, closing only the wrapped stream may result in stuff getting lost (specifically in the output case), so it is better to close (just) the wrapper, and rely on the documented behavior that closing the wrapper closes the wrapped stream too. (That is certainly true for the standard I/O wrapper classes!)
Like Peter Lawrey, I question the wisdom of relying on "Rose India" examples. For instance, this one has two more obvious mistakes in it that no half-decent Java programmer should make:
The stream is not closed in a finally block. If any exception is thrown between opening and closing, the in.close() statement won't be executed, and the application will leak an open file descriptor. Do that too often and your application will start throwing unexpected IOExceptions.
The DataInputStream in the chain serves no useful purpose. Instead, they should use fstream as the parameter for the InputStreamReader. Or better still, use FileReader.
Finally, here is a corrected version of the example:
BufferedReader br = new BufferedReader(new FileReader("textfile.txt"));
try {
    String line;
    while ((line = br.readLine()) != null) {
        // Print the content on the console
        System.out.println(line);
    }
} finally {
    // Close the reader stack.
    br.close();
}
or using Java 7's "try with resource":
try (BufferedReader br = new BufferedReader(new FileReader("textfile.txt"))) {
    String line;
    while ((line = br.readLine()) != null) {
        // Print the content on the console
        System.out.println(line);
    }
}
Since the underlying stream is closed, it is not absolutely necessary to close the BufferedReader as well, even though it is good practice to close ALL Closeables in reverse order (relative to the order they were opened in).

Read multiple lines from InputStreamReader (JAVA)

I have an InputStreamReader object. I want to read multiple lines into a buffer/array using one function call (without creating a mass of String objects). Is there a simple way to do so?
First of all, mind that InputStreamReader on its own is not very efficient; you should wrap it in a BufferedReader for maximum performance.
Taking this into account, you can do something like this:
public String readLines(InputStreamReader in) {
    BufferedReader br = new BufferedReader(in);
    // you should estimate buffer size
    StringBuffer sb = new StringBuffer(5000);
    try {
        int linesPerRead = 100;
        for (int i = 0; i < linesPerRead; ++i) {
            String line = br.readLine();
            if (line == null) {
                break; // readLine() returns null once EOF is reached
            }
            sb.append(line);
            // placing newlines back because readLine() removes them
            sb.append('\n');
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return sb.toString();
}
Mind that readLine() returns null if EOF is reached, so you should check for it and take care of it (as the loop above does).
If you have some delimiter for multiple lines, you can read that many characters using the read method that takes an offset and a length. Otherwise, using a StringBuilder to append each line read by the BufferedReader should work well for you without eating up too much temporary memory.
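As a rough illustration of the read(char[], int, int) approach (readChunk and maxChars are illustrative names, not part of any standard API):
public String readChunk(InputStreamReader in, int maxChars) throws IOException {
    BufferedReader br = new BufferedReader(in);
    char[] buf = new char[maxChars];
    int filled = 0;
    // Keep reading until the buffer is full or the stream ends; read() may return fewer chars than asked for.
    while (filled < maxChars) {
        int n = br.read(buf, filled, maxChars - filled);
        if (n == -1) {
            break; // end of stream
        }
        filled += n;
    }
    return new String(buf, 0, filled);
}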

Most concise way to read the contents of a file/input stream in Java?

What is the most concise way to read the contents of a file or input stream in Java? Do I always have to create a buffer, read (at most) line by line, and so on, or is there a more concise way? I wish I could just do
String content = new File("test.txt").readFully();
Use the Apache Commons IOUtils package. In particular the IOUtils class provides a set of methods to read from streams, readers etc. and handle all the exceptions etc.
e.g.
InputStream is = ...
String contents = IOUtils.toString(is);
// or
List<String> lines = IOUtils.readLines(is);
I think using a Scanner is quite OK with regard to the conciseness of Java's on-board tools:
Scanner s = new Scanner(new File("file"));
StringBuilder builder = new StringBuilder();
while(s.hasNextLine()) builder.append(s.nextLine());
Also, it's quite flexible, too (e.g. regular expressions support, number parsing).
Helper functions. I basically use a few of them, depending on the situation:
a cat method that pipes an InputStream to an OutputStream
a method that calls cat with a ByteArrayOutputStream and extracts the byte array, enabling a quick read of an entire file into a byte array
an implementation of Iterator<String> that is constructed from a Reader; it wraps it in a BufferedReader and calls readLine() on next() (sketched below)
...
Either roll your own or use something out of commons-io or your preferred utility library.
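The Iterator<String> helper mentioned above might be sketched roughly like this (a minimal, illustrative version; imports from java.io and java.util are omitted, and the IOException handling is deliberately crude):
class LineIterator implements Iterator<String> {
    private final BufferedReader br;
    private String next;

    LineIterator(Reader reader) {
        this.br = new BufferedReader(reader);
        advance();
    }

    private void advance() {
        try {
            next = br.readLine(); // null signals end of stream
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public boolean hasNext() {
        return next != null;
    }

    public String next() {
        if (next == null) {
            throw new NoSuchElementException();
        }
        String line = next;
        advance();
        return line;
    }

    public void remove() {
        throw new UnsupportedOperationException();
    }
}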
To give an example of such a helper function:
String[] lines = NioUtils.readInFile(componentxml);
The key is to try to close the BufferedReader even if an IOException is thrown.
/**
 * Read lines in a file. <br />
 * File must exist.
 * @param f file to be read
 * @return array of lines, empty if the file is empty
 * @throws IOException if a problem occurs while accessing or closing the file
 */
public static String[] readInFile(final File f) throws IOException
{
    final ArrayList lines = new ArrayList();
    IOException anioe = null;
    BufferedReader br = null;
    try
    {
        br = new BufferedReader(new FileReader(f));
        String line;
        line = br.readLine();
        while (line != null)
        {
            lines.add(line);
            line = br.readLine();
        }
        br.close();
        br = null;
    }
    catch (final IOException e)
    {
        anioe = e;
    }
    finally
    {
        if (br != null)
        {
            try {
                br.close();
            } catch (final IOException e) {
                anioe = e;
            }
        }
        if (anioe != null)
        {
            throw anioe;
        }
    }
    final String[] myStrings = new String[lines.size()];
    //myStrings = lines.toArray(myStrings);
    System.arraycopy(lines.toArray(), 0, myStrings, 0, lines.size());
    return myStrings;
}
(If you just want a String, change the function to append each line to a StringBuffer (or StringBuilder in Java 5 or 6).)
String content = new RandomAccessFile(new File("test.txt"), "r").readUTF();
Unfortunately, Java is very picky about the source file being valid UTF-8 here, or you will get an EOFException or UTFDataFormatException.
You have to create your own function, I suppose. The problem is that Java's read routines (those I know of, at least) usually take a buffer argument with a given length.
A solution I saw is to get the size of the file, create a buffer of that size, and read the whole file at once. Hoping the file isn't a gigabyte log or XML file...
The usual way is to have a fixed-size buffer, or to use readLine, and to concatenate the results in a StringBuffer/StringBuilder.
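A minimal sketch of the fixed-size-buffer variant mentioned above (the 8192-character buffer is an arbitrary choice):
public static String readFully(Reader reader) throws IOException {
    StringBuilder sb = new StringBuilder();
    char[] buf = new char[8192];
    int n;
    // Append whatever each read() call returns until the stream is exhausted.
    while ((n = reader.read(buf)) != -1) {
        sb.append(buf, 0, n);
    }
    return sb.toString();
}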
I don't think reading using BufferedReader is a good idea, because BufferedReader returns just the content of a line without the delimiter. When the line contains nothing but the newline character, BufferedReader will return a null although it still hasn't reached the end of the stream.
String org.apache.commons.io.FileUtils.readFileToString(File file)
Pick one from here.
How do I create a Java string from the contents of a file?
The favorite was:
private static String readFile(String path) throws IOException {
    FileInputStream stream = new FileInputStream(new File(path));
    try {
        FileChannel fc = stream.getChannel();
        MappedByteBuffer bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, fc.size());
        /* Instead of using default, pass in a decoder. */
        return Charset.defaultCharset().decode(bb).toString();
    }
    finally {
        stream.close();
    }
}
Posted by erickson
Or the Java 8 way:
try {
    String str = new String(Files.readAllBytes(Paths.get("myfile.txt")));
    ...
} catch (IOException ex) {
    Logger.getLogger(getClass().getName()).log(Level.SEVERE, null, ex);
}
One may pass an appropriate Charset to the String constructor.
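For example, assuming the file is UTF-8 encoded:
String str = new String(Files.readAllBytes(Paths.get("myfile.txt")), StandardCharsets.UTF_8);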
