Read A Large File Using Java NIO [closed]

I need to read the contents of a large file. I Googled it and found many methods and resources, but I'm still confused about which method to use for reading large files (the factors I need to consider are memory allocation, performance, and file size):
Using FileChannel
Using Files.readAllLines
using BufferedReader
Can anyone guide me on this?

Your best option is to read the file lazily: fetch one line at a time and process it.
Example:
Stream<String> lines = Files.lines(Paths.get("C:/files", "yourfile.txt"));
Then process the lines afterwards.
From the official documentation:
public static Stream<String> lines(Path path, Charset cs) throws IOException
Read all lines from a file as a Stream. Unlike readAllLines, this
method does not read all lines into a List, but instead populates
lazily as the stream is consumed.
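A minimal sketch of consuming that lazy stream, assuming the same path used in the snippet above; the try-with-resources block closes the underlying file handle, and the filter step is purely illustrative:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class LazyFileRead {
    public static void main(String[] args) throws IOException {
        // try-with-resources closes the underlying file handle when the stream is closed
        try (Stream<String> lines = Files.lines(Paths.get("C:/files", "yourfile.txt"),
                                                StandardCharsets.UTF_8)) {
            lines.filter(line -> !line.isEmpty())   // each line is fetched only when needed
                 .forEach(System.out::println);     // only the current line is held in memory
        }
    }
}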

Related

Bulk Processing with Java Stream [closed]

I have a definition of a Stream such as: "They are wrappers around a data source, allowing us to operate with that data source and making bulk processing convenient and fast."
Can someone give an example and just a basic explanation of how it works such that Stream makes "bulk processing convenient and fast"?
Thank you!
Files.newBufferedReader(Paths.get("/tmp/foo")).lines().map(...)...collect(...);
// or
BufferedReader reader = Files.newBufferedReader(Paths.get("/tmp/foo"));
Stream<String> stream = reader.lines();
Collection<String> result = stream.map(...)...collect(...);
This is a convenient way to process a text file using a Stream.
But the work of making it fast/efficient is being done by the BufferedReader, not the Stream.
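A hedged sketch of what the elided map/collect steps could look like; the file path comes from the snippet above, while the trim and filter operations are hypothetical placeholders:
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

public class BulkProcessingSketch {
    public static void main(String[] args) throws IOException {
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("/tmp/foo"))) {
            // The Stream only describes the pipeline; the buffered I/O is done by the BufferedReader
            List<String> result = reader.lines()
                    .map(String::trim)                  // hypothetical per-line transformation
                    .filter(line -> !line.isEmpty())    // drop blank lines
                    .collect(Collectors.toList());
            System.out.println(result.size() + " non-empty lines");
        }
    }
}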

Convert image to binary to apply Image Steganography [closed]

I was trying to convert a ".jpg" image to binary and then change its binary value to hide some data, but I couldn't find anything. Any ideas?
If I understand the question correctly, you want to get the individual bytes of the .jpg file, which can be read with a DataInputStream:
File imageFile = new File("image.jpg"); // placeholder path
DataInputStream dis = new DataInputStream(new FileInputStream(imageFile));
int input = dis.read(); // first byte of the file (0-255), or -1 at end of stream
dis.close();
input then holds the first byte of the file; if you invoke read() again (before dis.close()), you can read the subsequent bytes. Next, you would have to manipulate them, and finally you can write them to this or another file with a DataOutputStream, which works just like the corresponding input stream. Just do NOT forget to close the streams after you are done reading or writing, so that system resources are freed and the files are closed; otherwise the written data could be lost.
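A sketch of that read-modify-write loop, assuming placeholder file names and a trivial modification (setting the least significant bit of the first byte). Keep in mind that flipping arbitrary bytes of a compressed JPEG will usually corrupt it, so LSB steganography is normally applied to decoded pixel values (e.g. via BufferedImage) rather than the raw file bytes:
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class ByteTweakSketch {
    public static void main(String[] args) throws IOException {
        // File names are placeholders for this sketch
        try (DataInputStream dis = new DataInputStream(new FileInputStream("input.jpg"));
             DataOutputStream dos = new DataOutputStream(new FileOutputStream("output.jpg"))) {
            int b;
            boolean first = true;
            while ((b = dis.read()) != -1) {   // read one byte at a time until end of stream
                if (first) {
                    b = (b & 0xFE) | 1;        // example manipulation: set the least significant bit
                    first = false;
                }
                dos.write(b);                  // write the (possibly modified) byte
            }
        }                                      // try-with-resources closes both streams
    }
}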

What is the relation between InputStream, BufferedInputStream, InputStreamReader and BufferedReader? [closed]

I always get confused about how to process my input data and which class to use; at different times I find different solutions. I am also not clear about their hierarchy.
InputStream is the parent class of all byte input streams, and Reader is the parent class of all character readers. Classes with Stream in their name work with bytes, whereas classes with Reader in their name work with characters.
A buffer is a wrapper around these streams that reduces the number of system calls and improves reading performance. An unbuffered stream goes back to the underlying source for every single read, whereas a buffered stream fills an internal buffer in one large read and serves subsequent reads from that buffer. For example, with a BufferedReader you can read a whole line using readLine(), but with an unbuffered Reader you must read a single character at a time using read().
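A minimal sketch of that wrapping chain, assuming a placeholder file name: the raw InputStream delivers bytes, InputStreamReader decodes those bytes into characters, and BufferedReader adds buffering plus the readLine() convenience method:
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class WrappingSketch {
    public static void main(String[] args) throws IOException {
        // "data.txt" is a placeholder file name for this sketch
        try (InputStream in = new FileInputStream("data.txt");                          // raw bytes
             InputStreamReader isr = new InputStreamReader(in, StandardCharsets.UTF_8); // bytes -> characters
             BufferedReader br = new BufferedReader(isr)) {                             // buffering + readLine()
            String line;
            while ((line = br.readLine()) != null) {   // one buffered read serves many readLine() calls
                System.out.println(line);
            }
        }
    }
}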

Java - What is the fastest way for reading a file [closed]

I want to read a file as a string. Every line of this file contains information that I use to construct new objects.
I wonder what is the fastest way of doing that. Obviously, there are two options. The first is to read the whole file and create the objects from the resulting string. The second is to read the data line by line and create a new object after every readLine(). Is there a real performance difference, or can I go either way?
If there is an object every couple of lines and it won't affect anything before or after it, I would go for the line-by-line approach. However, if parsing involves nested objects or a more complicated file structure, read it all first.
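A rough sketch of the two options, with a placeholder file name and a string transformation standing in for real object construction:
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class LineByLineVsWholeFile {
    public static void main(String[] args) throws IOException {
        Path file = Paths.get("records.txt");   // placeholder file name

        // Option 1: line by line - build one object per line; only one line is in memory at a time
        List<String> objects = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            while ((line = reader.readLine()) != null) {
                objects.add(line.toUpperCase());   // stand-in for "new MyObject(line)"
            }
        }

        // Option 2: read the whole file first, then parse - useful when records span several lines
        String whole = new String(Files.readAllBytes(file), StandardCharsets.UTF_8);
        String[] records = whole.split("\\R");     // split on any line terminator
        System.out.println(objects.size() + " objects, " + records.length + " raw lines");
    }
}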

How to log JSON deserialization in Jackson [closed]

I have a piece of code that is reading JSON data from an input stream and converting it to POJOs (using Jackson). Sometimes, the data will fail to deserialize and it's difficult to troubleshoot. What would be a good mechanism to see the line-by-line input stream in log4j? Are there other tools/techniques that can help the troubleshooting?
I assume that you are already logging the exceptions and stacktraces that you are getting, and that they are not providing enough context to track down the problems.
Jackson doesn't have any internal logging that you could "tap into".
So to log the JSON input line by line, you will need to write (or find) an InputStream or Reader that wraps your source InputStream or Reader and logs each line that is read. Here are a couple of examples that you may be able to use:
https://stackoverflow.com/a/34955334/139985
Logging everything that goes through an InputStream
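One possible approach, sketched here on the assumption that you control the stream you hand to Jackson: a small FilterInputStream that copies everything the parser reads into a side buffer, so the consumed JSON can be logged (e.g. via log4j) when deserialization fails. The class name and capturedInput() method are illustrative, not an existing library API:
import java.io.ByteArrayOutputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// A "tee" stream: everything read from the wrapped source is also copied into a side buffer
public class LoggingInputStream extends FilterInputStream {
    private final ByteArrayOutputStream copy = new ByteArrayOutputStream();

    public LoggingInputStream(InputStream in) {
        super(in);
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) {
            copy.write(b);             // remember the byte that was just consumed
        }
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        if (n > 0) {
            copy.write(buf, off, n);   // remember the chunk that was just consumed
        }
        return n;
    }

    public String capturedInput() {
        // everything the parser has read so far, decoded as UTF-8
        return new String(copy.toByteArray(), StandardCharsets.UTF_8);
    }
}
You would wrap your source stream in this class before passing it to ObjectMapper.readValue(...), and when deserialization throws, log capturedInput() alongside the exception and stacktrace.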
