Efficient Way to Read a File Java [closed] - java

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
What is the most efficient way to read input from a file?
I have a very large file which contains a list of words separated by newlines, e.g.:
computer
science
is
fun
really
I was thinking about using a BufferedReader object, but I was confused by this passage in the documentation:
"In general, each read request made of a Reader causes a corresponding read request to be made of the underlying character or byte stream. It is therefore advisable to wrap a BufferedReader around any Reader whose read() operations may be costly, such as FileReaders and InputStreamReaders. For example,
BufferedReader in = new BufferedReader(new FileReader("foo.in"));
will buffer the input from the specified file. Without buffering, each invocation of read() or readLine() could cause bytes to be read from the file, converted into characters, and then returned, which can be very inefficient."
Can someone please explain this to me?
On a second read, I'm starting to believe BufferedReader is my best bet. Is there a better way?

This post may help.
BufferedReader is a good choice; since Java 8 you can also turn a BufferedReader into a java.util.stream.Stream of lines. For example, parsing a large CSV file with the java.util.stream package:
InputStream is = new FileInputStream(new File("persons.csv"));
BufferedReader br = new BufferedReader(new InputStreamReader(is));
List<Person> persons = br.lines()
    .skip(1)                                // skip the header row
    .map(mapToPerson)                       // a Function<String, Person>
    .filter(person -> person.getAge() > 17)
    .limit(50)
    .collect(toList());
(Note: .substream(1) only existed in early Java 8 builds and was removed before release; the final API uses .skip(1).)
Unlike collections, which are in-memory data structures holding all of their elements, streams compute elements on demand and allow parallel processing. Streams also support pipelining and internal iteration.
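For the original word-per-line file, a minimal sketch of the buffered approach (a StringReader stands in for new FileReader("words.txt") so it runs without a file on disk; the class and method names are illustrative):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class WordListReader {

    // Reads one word per line from any Reader; the BufferedReader batches the
    // underlying read() calls so each readLine() does not hit the file directly.
    static List<String> readWords(BufferedReader in) throws IOException {
        List<String> words = new ArrayList<>();
        String line;
        while ((line = in.readLine()) != null) {
            if (!line.isEmpty()) {
                words.add(line);
            }
        }
        return words;
    }

    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(
                new StringReader("computer\nscience\nis\nfun\nreally\n"));
        System.out.println(readWords(in)); // [computer, science, is, fun, really]
    }
}
```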

Related

How to write to a dummy DataOutputStream without using files? [closed]

Closed. This question needs debugging details. It is not currently accepting answers.
Closed 7 months ago.
I'm using a specific library (unfortunately can't be avoided) that writes some information from a class to a file using a utility function that receives a DataOutputStream as the input.
I would like to get the resulting file's content as a String without actually creating a file and writing into it as the writing can be pretty "taxing" (1000+ lines).
Is this possible to do by using a dummy DataOutputStream or some other method and without resorting to creating a temporary file and reading the result from there?
P.S.: the final method that actually writes to the DataOutputStream changes from time to time, so I would prefer not to copy-paste and rework it every time.
Since java.io.DataOutputStream wraps any other java.io.OutputStream (you pass the target stream to its constructor), I would recommend using a java.io.ByteArrayOutputStream to collect the data in memory and then obtaining the String from it with the toString() method.
Example:
ByteArrayOutputStream inMemoryOutput = new ByteArrayOutputStream();
DataOutputStream dataOutputStream = new DataOutputStream(inMemoryOutput);
// use dataOutputStream here as intended
// and then get the String data
System.out.println(inMemoryOutput.toString());
If the encoding of the collected bytes does not match the system default encoding, you may have to pass a Charset (or charset name) to toString().
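A runnable sketch of the idea (the writeBytes calls are stand-ins for whatever the library's utility function writes, and the class name is illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class InMemoryDataOutput {

    // Collects everything the DataOutputStream writes into an in-memory
    // byte array instead of a file, then decodes it as UTF-8 text.
    static String captureAsString() throws IOException {
        ByteArrayOutputStream inMemoryOutput = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(inMemoryOutput)) {
            // stand-in for the library call that takes the DataOutputStream
            out.writeBytes("line 1\n");
            out.writeBytes("line 2\n");
        }
        return inMemoryOutput.toString(StandardCharsets.UTF_8.name());
    }

    public static void main(String[] args) throws IOException {
        System.out.print(captureAsString()); // prints the two lines
    }
}
```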

What would be a good program that includes writing data to a text or csv file and/or reading from a text or csv file? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 2 years ago.
I am still a beginner in Java. We were tasked to create a program that writes data to a text or CSV file and/or reads from one, ideally something that helps with a modern problem. Any simple ideas?
If you want to perform this task as fast as possible, I suggest looking at this library, which handles CSV file I/O operations:
Apache Commons CSV
Other resources at this link: https://github.com/akullpp/awesome-java#CSV
For a homemade solution, try implementing the logic yourself with core Java: a BufferedReader to read the file line by line, and a FileWriter to append strings to the output file.
BufferedReader example:
BufferedReader csvReader = new BufferedReader(new FileReader(pathInputFile));
String row;
while ((row = csvReader.readLine()) != null) {
    // do something with each row
    // for a CSV, split the row into its values:
    // String[] data = row.split(",");
}
csvReader.close();
FileWriter example:
FileWriter csvWriter = new FileWriter(pathOutputFile);
// use csvWriter.append(data) to write strings
// for a CSV structure:
// csvWriter.append("Value1");
// csvWriter.append(",");
// csvWriter.append("Value2");
// end each CSV line with a newline:
// csvWriter.append("\n");
// when the write operation is finished:
csvWriter.flush();
csvWriter.close();
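Putting the two halves together, a minimal runnable sketch (the class and method names are illustrative, and the split-on-comma approach does not handle quoted fields; that is exactly what Apache Commons CSV is for):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class CsvRoundTrip {

    // Writes each row (an array of values) as one comma-separated line.
    static void writeCsv(String path, List<String[]> rows) throws IOException {
        try (FileWriter csvWriter = new FileWriter(path)) {
            for (String[] row : rows) {
                csvWriter.append(String.join(",", row)).append("\n");
            }
        }
    }

    // Reads the file back, splitting each line on commas.
    static List<String[]> readCsv(String path) throws IOException {
        List<String[]> rows = new ArrayList<>();
        try (BufferedReader csvReader = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = csvReader.readLine()) != null) {
                rows.add(line.split(","));
            }
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("people", ".csv");
        writeCsv(tmp.toString(), List.of(
                new String[] {"name", "age"},
                new String[] {"Ada", "36"}));
        for (String[] row : readCsv(tmp.toString())) {
            System.out.println(String.join(" | ", row));
        }
        Files.delete(tmp);
    }
}
```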

BufferedReader.readLine() vs FileReader.read(charArray) performance [duplicate]

This question already has answers here:
Why does the buffered writer does not write immediately when the write() method is called
(1 answer)
What is buffer? What are buffered reads and writes?
(1 answer)
Closed 4 years ago.
BufferedReader.readLine() vs FileReader.read(charArray): if we pass a large charArray, FileReader's performance improves dramatically, and it looks like we can achieve what BufferedReader does. So why do we have BufferedReader instead of just using FileReader with a big char array?
How is BufferedReader more efficient than FileReader when it is just a decorator around FileReader (or any other Reader implementation) and depends on that Reader to read data from the file on disk?
Does BufferedReader reduce the number of I/O trips to disk compared to FileReader?
Because it is buffered: reads are served from an in-memory buffer that is refilled in large chunks, which cuts down the number of system calls by a factor of thousands.
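The effect is easy to measure: wrap a Reader that counts how many read requests reach it (a StringReader stands in for a file, and the counter stands in for system calls), then compare consuming it with and without a BufferedReader:

```java
import java.io.BufferedReader;
import java.io.FilterReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class BufferCounting {

    // Wraps a Reader and counts how many read requests reach the
    // underlying stream, standing in for counting system calls.
    static class CountingReader extends FilterReader {
        int calls = 0;
        CountingReader(Reader in) { super(in); }
        @Override
        public int read(char[] cbuf, int off, int len) throws IOException {
            calls++;
            return super.read(cbuf, off, len);
        }
        @Override
        public int read() throws IOException {
            calls++;
            return super.read();
        }
    }

    // Consumes ~8000 characters one char at a time and reports how many
    // read requests reached the underlying reader.
    static int countCalls(boolean buffered) throws IOException {
        StringBuilder data = new StringBuilder();
        for (int i = 0; i < 1000; i++) data.append("word").append(i).append('\n');
        CountingReader counting = new CountingReader(new StringReader(data.toString()));
        Reader source = buffered ? new BufferedReader(counting) : counting;
        while (source.read() != -1) { /* consume everything */ }
        return counting.calls;
    }

    public static void main(String[] args) throws IOException {
        // unbuffered: one request per character; buffered: a handful of
        // large refills of BufferedReader's internal 8192-char buffer
        System.out.println("unbuffered calls: " + countCalls(false));
        System.out.println("buffered calls:   " + countCalls(true));
    }
}
```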

New line in different languages [closed]

Closed. This question is not reproducible or was caused by typos. It is not currently accepting answers.
Closed 8 years ago.
I've encountered a problem while working on my testing system. I'm creating a file in Java and writing to it in Java, but reading from it with Pascal compilers.
This may sound unclear, so: when I do something like this in Java (Eclipse)
File file = new File("D:/i.txt");
BufferedReader bf = new BufferedReader(new FileReader(file));
PrintWriter pw = new PrintWriter(new FileWriter(file));
pw.print("hey\n");
pw.print("you");
bf.close();
pw.close();
It gives me a file that looks like
hey
you
And when I run this Pascal code
begin
assign(input,'D:/i.txt'); reset(input);
while not eoln(input) do write(1);
end.
Which means: Write "1" until you find a new line separator.
It won't stop to write ones.
But this is okay. Here is another strange thing: Pascal must have a line break or a line separator, or new line indicator and I found this to be Char number 10 on ASCII table (LF, new line).
So, I decided to try another way.
File file = new File("D:/i.txt");
BufferedReader bf = new BufferedReader(new FileReader(file));
PrintWriter pw = new PrintWriter(new FileWriter(file));
pw.print("hey"+(char)10);
pw.print("you");
bf.close();
pw.close();
This one would give me the same output file as the first snippet (at least apparently).
But all my Pascal compilers still keep printing hundreds of ones.
How can I solve the problem with new lines?
Thank you.
I think the infinite loop (printing hundreds of ones) happens because you never read anything from the input, so the read position never reaches the end of the line. Try putting a read(input, ch); inside the loop.
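On the Java side, a sketch of writing with an explicit line separator rather than a hard-coded "\n" (the choice of "\r\n" is an assumption: classic DOS/Windows Pascal runtimes typically expect CR+LF; a StringWriter stands in for the FileWriter):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class LineEndings {

    // Writes each word followed by an explicit separator; println() would
    // instead use the platform default, System.lineSeparator().
    static String join(String separator, String... words) {
        StringWriter buffer = new StringWriter(); // stands in for a FileWriter
        try (PrintWriter pw = new PrintWriter(buffer)) {
            for (String word : words) {
                pw.print(word);
                pw.print(separator);
            }
        }
        return buffer.toString();
    }

    public static void main(String[] args) {
        // "\r\n" is CR+LF (chars 13 and 10)
        String content = join("\r\n", "hey", "you");
        System.out.println(content.replace("\r", "\\r").replace("\n", "\\n"));
        // prints hey\r\nyou\r\n
    }
}
```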

Reading serialized file - java [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
So I am having problems reading from a serialized file.
More specifically, I have serialized an object to a file written in a hexadecimal format. The problem occurs when I want to read one line at a time from this file. For example, the file can look like this:
aced 0005 7372 0005 5465 7374 41f2 13c1
215c 9734 6b02 0000 7870
However, the code underneath reads the whole file (instead of just the first line). Also, it automatically converts the hexadecimal data into something more readable: ¬ísrTestAòÁ
....
try (BufferedReader file = new BufferedReader(new FileReader(fileName))) {
read(file);
} catch ...
....
public static void read(BufferedReader in) throws IOException{
String line = in.readLine();
System.out.println(line); // PROBLEM: This prints every line
}
}
This code works perfectly fine if I have a normal text file with some random words, it only prints the first line. My guess is the problems lies in the serialization format. I read somewhere (probably the API) that the file is supposed to be in binary (even though my file is in hexadecimal??).
What should I do to be able to read one line at a time from this file?
EDIT: I have gotten quite a few of answers, which I am thankful for. I never wanted to deserialize the object - only be able to read every hexadecimal line (one at a time) so I could analyze the serialized object. I am sorry if the question was unclear.
Now I have realized that the file is actually not written in hexadecimal but in binary. Further, it is not even divided into lines. The problem I am facing now is to read every byte and convert it into hexadecimal, so the data ends up looking like the hexadecimal dump above.
UPDATE:
immibis's comment helped me solve this:
"Use FileInputStream (or a BufferedInputStream wrapping one) and call read() repeatedly - each call returns one byte (from 0 to 255) or -1 if there are no more bytes in the file. This is the simplest, but not the most efficient, way (reading an array is usually faster)"
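That approach can be sketched as follows (a ByteArrayInputStream stands in for the FileInputStream so the sketch runs without the file; the class name is illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class HexDump {

    // Reads every byte from the stream and formats it as two hex digits,
    // grouped in pairs of bytes, 16 bytes per line, like the dump above.
    static String toHex(InputStream in) throws IOException {
        StringBuilder out = new StringBuilder();
        int b;
        int count = 0;
        while ((b = in.read()) != -1) { // -1 means end of stream
            if (count > 0 && count % 16 == 0) {
                out.append('\n');
            } else if (count > 0 && count % 2 == 0) {
                out.append(' ');
            }
            out.append(String.format("%02x", b));
            count++;
        }
        return out.toString();
    }

    public static void main(String[] args) throws IOException {
        // first bytes of any serialized stream: the magic 0xACED, version 5
        byte[] header = {(byte) 0xac, (byte) 0xed, 0x00, 0x05};
        System.out.println(toHex(new ByteArrayInputStream(header)));
        // prints: aced 0005
    }
}
```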
The file does not contain hexadecimal text and is not separated into lines.
Whatever program you are using to edit the file is "helpfully" converting it into hexadecimal for you, since it would be gibberish if displayed directly.
If you are writing the file using ObjectOutputStream and FileOutputStream, then you need to read it using ObjectInputStream and FileInputStream.
Your question doesn't make any sense. Serialized data is binary. It doesn't contain lines. You can't read lines from it. You should either read bytes, with an InputStream, or objects, with an ObjectInputStream.
