Java: Outputting text file to Console

I'm attempting to output a text file to the console with Java. I was wondering what is the most efficient way of doing so?
I've researched several methods; however, it's difficult to discern which one has the least performance impact.
Outputting a text file to the console would involve reading in each line in the file, then writing it to the console.
Is it better to use:
A BufferedReader with a FileReader, reading in lines and doing a bunch of System.out.println calls?
BufferedReader in = new BufferedReader(new FileReader("C:\\logs\\"));
while (in.readLine() != null) {
    System.out.println(blah blah blah);
}
in.close();
A Scanner reading each line in the file and doing System.out.print calls?
while (scanner.hasNextLine()) {
    System.out.println(blah blah blah);
}
Thanks.

If all you want to do is print the contents of a file (and don't want to read the next int/double/etc.) to the console then a BufferedReader is fine.
Your code as it is won't produce the result you're after, though. Try this instead:
BufferedReader in = new BufferedReader(new FileReader("C:\\logs\\log001.txt"));
String line = in.readLine();
while (line != null)
{
    System.out.println(line);
    line = in.readLine();
}
in.close();
I wouldn't get too hung up about it, though, because it's more likely that the main bottleneck will be the ability of your console to print the information that Java is sending it.

If you're not interested in the character-based data the text file contains, just stream it "raw" as bytes.
InputStream input = new BufferedInputStream(new FileInputStream("C:/logs.txt"));
byte[] buffer = new byte[8192];
try {
    for (int length = 0; (length = input.read(buffer)) != -1;) {
        System.out.write(buffer, 0, length);
    }
} finally {
    input.close();
}
This saves the cost of needlessly converting between bytes and characters, and of scanning and splitting on newlines only to append them again.
As to performance, you may find this article interesting. According to the article, a FileChannel with a 256 KB byte array, read through a wrapping ByteBuffer and written directly from the byte array, is the fastest way.
FileInputStream input = new FileInputStream("C:/logs.txt");
FileChannel channel = input.getChannel();
byte[] buffer = new byte[256 * 1024];
ByteBuffer byteBuffer = ByteBuffer.wrap(buffer);
try {
    for (int length = 0; (length = channel.read(byteBuffer)) != -1;) {
        System.out.write(buffer, 0, length);
        byteBuffer.clear();
    }
} finally {
    input.close();
}

If it's a relatively small file, a one-line Java 7+ way to do this is:
System.out.println(new String(Files.readAllBytes(Paths.get("logs.txt"))));
See https://docs.oracle.com/javase/7/docs/api/java/nio/file/package-summary.html for more details.
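If the file might be larger and you are on Java 8 or later, a streaming variant avoids holding the whole file in memory. A minimal sketch (not part of the original answer), assuming the java.nio.file and java.util.stream imports:
try (Stream<String> lines = Files.lines(Paths.get("logs.txt"))) {
    lines.forEach(System.out::println); // prints each line as it is read instead of loading the whole file
}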
Cheers!

If all you want is to dump the file contents to the console as efficiently as possible, with no processing in between, then converting the data into characters and finding line breaks is unnecessary overhead. Instead, you can just read blocks of bytes from the file and write them straight out to System.out:
package toconsole;

import java.io.BufferedInputStream;
import java.io.FileInputStream;

public class Main {

    public static void main(String[] args) {
        BufferedInputStream bis = null;
        byte[] buffer = new byte[8192];
        int bytesRead = 0;
        try {
            bis = new BufferedInputStream(new FileInputStream(args[0]));
            while ((bytesRead = bis.read(buffer)) != -1) {
                System.out.write(buffer, /* start */ 0, /* length */ bytesRead);
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try { bis.close(); } catch (Exception e) { /* meh */ }
        }
    }
}
In case you haven't come across this kind of idiom before, the statement in the while condition both assigns the result of bis.read to bytesRead and then compares it to -1. So we keep reading bytes into the buffer until we are told that we're at the end of the file. And we use bytesRead in System.out.write to make sure we write only the bytes we've just read, as we can't assume all files are a multiple of 8 kB long!

FileInputStream input = new FileInputStream("D:\\Java\\output.txt");
FileChannel channel = input.getChannel();
byte[] buffer = new byte[256 * 1024];
ByteBuffer byteBuffer = ByteBuffer.wrap(buffer);
try {
    for (int length = 0; (length = channel.read(byteBuffer)) != -1;) {
        System.out.write(buffer, 0, length);
        byteBuffer.clear();
    }
} finally {
    input.close();
}
Path temp = Files.move(Paths.get("D:\\Java\\output.txt"),
        Paths.get("E:\\find\\output.txt"));
if (temp != null)
{
    System.out.println("File renamed and moved successfully");
}
else
{
    System.out.println("Failed to move the file");
}

For Java 11 you could use a more convenient approach:
Files.copy(Path.of("file.txt"), System.out);
Or, for faster output:
var out = new BufferedOutputStream(System.out);
Files.copy(Path.of("file.txt"), out);
out.flush();

Related

Is it possible to read images without ImageIO?

I am trying to read an image and deliver it through a Java socket. But there are some bits that do not fit. When viewing the result in a diff tool I realized that all numbers bigger than 127 were truncated.
So I wanted to just convert it to a char[] array and return it instead. Now I'm getting a completely different image, perhaps due to char's size.
try (PrintWriter out = new PrintWriter(this.socket.getOutputStream(), true);
     BufferedInputStream in = new BufferedInputStream(new FileInputStream(filename), BUFSIZ)) {
    byte[] buffer = new byte[BUFSIZ];
    while (in.read(buffer) != -1) {
        response.append(new String(buffer));
        out.print(response.toString());
        response.setLength(0);
    }
} catch (IOException e) {
    System.err.println(e.getMessage());
}
This is my reading and delivering code.
I've read many times to use ImageIO but I want to do it without, since I don't know whether it's an image or not. (And what about other file types like executables?)
So, is there any way to convert it to something like an unsigned byte that'll be delivered correctly on the client? Do I have to use something different than read() to achieve that?
Writers are for character data. Use the OutputStream. And you're making the usual mistake of assuming that read() filled the buffer.
The following loop will copy anything correctly. Memorize it.
int count;
byte[] buffer = new byte[8192];
while ((count = in.read(buffer)) > 0)
{
    out.write(buffer, 0, count);
}
Repeat after me: a char is not a byte and it's not a code point.
Repeat after me: a Writer is not an OutputStream.
try (OutputStream out = this.socket.getOutputStream();
     BufferedInputStream in = new BufferedInputStream(new FileInputStream(filename), BUFSIZ)) {
    byte[] buffer = new byte[BUFSIZ];
    int len;
    while ((len = in.read(buffer)) != -1) {
        out.write(buffer, 0, len);
    }
} catch (IOException e) {
    System.err.println(e.getMessage());
}
(this is from memory, check the args for write()).

Corrupted file text while reading

I have the following code:
BlobDomain blobDomain = null;
OutputStream out = null;
try {
    blobDomain = new BlobDomain();
    out = blobDomain.getBinaryOutputStream();
    byte[] buffer = new byte[8192];
    int bytesRead = 0;
    while ((bytesRead = in.read(buffer, 0, 8192)) != -1) {
        out.write(buffer, 0, bytesRead);
        String line = (new String(buffer));
        fullText += line;
    }
} catch (Exception e) {
    // do nothing
} finally {
    if (out != null) {
        try {
            out.close();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}
When I print the fullText, what I see for larger files is that the end part of the text is added again to fullText, so the full text has some lines repeated at the end. Any suggestions on what is wrong here?
The reason that you are getting this is that you are writing the entire buffer every time to your String. Thus, when you reach the end of the file you may not have read exactly the amount of bytes that your buffer is sized at. The old data is still in the buffer and will also be written to your String.
One option to solve this may be to write your data to a String first and then to write your String to the output stream. This should also be faster than adding to a String after each read.
Save inputStream to String:
java.util.Scanner s = new java.util.Scanner(in).useDelimiter("\\A");
fullText = s.hasNext() ? s.next() : "";
Write String to output stream:
out.write(fullText.getBytes());
If you want to keep your code as-is, then build the String from only the bytes that were actually read. For example:
String line = new String(buffer, 0, bytesRead);

Reading binary stream until "\r\n" is encountered

I'm working on a Java application which will stream video from an IP Camera. The video streams from the IP Camera in MJPEG format. The protocol is the following...
--ipcamera (\r\n)
Content-Type: image/jpeg (\r\n)
Content-Length: {length of frame} (\r\n)
(\r\n)
{frame}
(\r\n)
--ipcamera (\r\n)
etc.
I've tried using classes such as BufferedReader and Scanner to read until the "\r\n", however those are meant for text and not binary data, so it becomes corrupt. Is there any way to read the binary stream until it encounters a "\r\n"? Here is my current (broken) code.
EDIT: I've gotten it to work. I updated the code below. However, it's really slow in doing so. I'm not sure if it has anything to do with the ArrayList or not, but it could be the culprit. Any pointers to speed up the code? It's currently taking 500ms to 900ms for a single frame.
public void run() {
    long startTime = System.currentTimeMillis();
    try {
        URLConnection urlConn = url.openConnection();
        urlConn.setReadTimeout(15000);
        urlConn.connect();
        urlStream = urlConn.getInputStream();
        DataInputStream dis = new DataInputStream(urlStream);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ArrayList<Byte> bytes = new ArrayList<Byte>();
        byte cur;
        int curi;
        byte[] curBytes;
        int length = 0;
        while ((curi = dis.read()) != -1) {
            cur = (byte) curi;
            bytes.add(cur);
            curBytes = getPrimativeArray(bytes);
            String curBytesString = new String(curBytes, "UTF-8");
            if (curBytesString.equals("--ipcamera\r\n")) {
                bytes.clear();
                continue;
            } else if (curBytesString.equals("Content-Type: image/jpeg\r\n")) {
                bytes.clear();
                continue;
            } else if (curBytesString.matches("^Content-Length: ([0-9]+)\r\n$")) {
                length = Integer.parseInt(curBytesString.replace("Content-Length: ", "").trim());
                bytes.clear();
                continue;
            } else if (curBytesString.equals("\r\n")) {
                if (length == 0) {
                    continue;
                }
                byte[] frame = new byte[length];
                dis.readFully(frame, 0, length);
                writeFrame(frame);
                bytes.clear();
                break;
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    long curTime = System.currentTimeMillis() - startTime;
    System.out.println(curTime);
}

private byte[] getPrimativeArray(ArrayList<Byte> array) {
    byte[] bytes = new byte[array.size()];
    for (int i = 0; i < array.size(); i++) {
        bytes[i] = array.get(i).byteValue();
    }
    return bytes;
}

private void writeFrame(byte[] bytes) throws IOException {
    File file = new File("C:\\test.jpg");
    FileOutputStream fos = new FileOutputStream(file);
    fos.write(bytes);
    fos.close();
    System.out.println("done");
}
Currently you do not handle the case where data is read in the frame part.
A rough assumption is:
Current version:
else if (line.equals("") && length != 0)
Probably more correct version:
else if (!line.equals("") && length != 0)
You cannot use a BufferedReader to read binary data; it will corrupt it. If you want to keep things simple, use DataInputStream.readLine(). Though not ideal, it may be the simplest approach in your case.
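Roughly, and only as a sketch that reuses the dis stream and the writeFrame(byte[]) method from the question, that approach could look like this. The deprecated readLine() maps each byte to one char, which is good enough for the ASCII header lines:
int length = 0;
String line;
while ((line = dis.readLine()) != null) { // deprecated, but tolerable for ASCII headers
    if (line.startsWith("Content-Length:")) {
        length = Integer.parseInt(line.substring("Content-Length:".length()).trim());
    } else if (line.isEmpty() && length > 0) { // the blank line ends one part's headers
        byte[] frame = new byte[length];
        dis.readFully(frame); // read exactly 'length' bytes of binary JPEG data
        writeFrame(frame);
        length = 0; // reset for the next part
    }
}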
Other than using some bad practices and assuming that your URLConnection correctly delivers the data, the example you posted seems to work if you reset the length to zero after reading the frame data.
} else if (line.equals("") && length != 0) {
    char[] buf = new char[length];
    reader.read(buf, 0, length);
    baos.write(new String(buf).getBytes());
    //break;
    length = 0; // <-- reset length
}
Please note that this way all the frame data is written consecutively into the same ByteArrayOutputStream. If you don't want that, you should create a new ByteArrayOutputStream for every new frame you encounter.
You can't use a BufferedReader for part of the transmission and then some other stream for the rest of it. The BufferedReader will fill its buffer and steal some of the data you want to read with the other stream. Use DataInputStream.readLine(), noting that it's deprecated, or else roll your own line-reading code, using the input stream provided by the URLConnection.
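Rolling your own only takes a few lines; here is one possible sketch (the helper name and charset are illustrative, not from this answer). It collects bytes until it sees \r\n and leaves the stream positioned at whatever follows, e.g. the binary frame data:
// Reads bytes from 'in' until "\r\n" and returns the line without the terminator.
private static String readBinaryLine(InputStream in) throws IOException {
    ByteArrayOutputStream line = new ByteArrayOutputStream();
    int prev = -1, cur;
    while ((cur = in.read()) != -1) {
        if (prev == '\r' && cur == '\n') {
            byte[] bytes = line.toByteArray();
            return new String(bytes, 0, bytes.length - 1, "ISO-8859-1"); // drop the trailing '\r'
        }
        line.write(cur);
        prev = cur;
    }
    return null; // end of stream
}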
Surely you don't have to? URLConnection reads the headers for you. If you want the content-length, use the API to get it. The stuff you get to read starts at the body of the transmission.
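Reading the top-level headers through the URLConnection API would look roughly like this (a sketch; note that the per-frame Content-Length headers of a multipart MJPEG stream are part of the body, so those still have to be parsed from the stream itself):
URLConnection urlConn = url.openConnection();
urlConn.connect();
int contentLength = urlConn.getContentLength(); // -1 if the server sent no Content-Length header
String contentType = urlConn.getContentType(); // e.g. "multipart/x-mixed-replace;boundary=ipcamera"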

Regarding reading a file and optimizing the performance

I was doing some research on IO and read the following article, which talks about buffering techniques: to minimize disk accesses and work by the underlying operating system, buffering techniques use a temporary buffer and read data in chunks, instead of reading directly from the disk with every read operation.
Examples were given without and with buffering.
without buffering:
try
{
    File f = new File("Test.txt");
    FileInputStream fis = new FileInputStream(f);
    int b;
    int ctr = 0;
    while ((b = fis.read()) != -1)
    {
        if ((char) b == '\t')
        {
            ctr++;
        }
    }
    fis.close();
    // not the ideal way
} catch (Exception e)
{}
With buffering:
try
{
    File f = new File("Test.txt");
    FileInputStream fis = new FileInputStream(f);
    BufferedInputStream bs = new BufferedInputStream(fis);
    int b;
    int ctr = 0;
    while ((b = bs.read()) != -1)
    {
        if ((char) b == '\t')
        {
            ctr++;
        }
    }
    bs.close(); // not the ideal way
}
catch (Exception e) {}
The conclusion was:
Test.txt was a 3.5MB file
Scenario 1 executed between 5200 to 5950 milliseconds for 10 test runs
Scenario 2 executed between 40 to 62 milliseconds for 10 test runs.
Is there any other way to do this in Java that is better? Or any other method/technique that gives better performance? Please advise!
Is there any other way to do this in Java that is better? Or any other method / technique to give better performance?
In terms of IO performance, that probably is going to be the best without a lot of other code. You are going to be IO bound most likely anyway.
while ((b = bs.read()) != -1)
Reading byte-by-byte like this is very inefficient. If you are reading a text file then you should be using a BufferedReader instead, which converts the bytes into Strings for you.
BufferedReader reader = new BufferedReader(new InputStreamReader(fis));
...
String line;
while ((line = reader.readLine()) != null) {
    ...
}
Also, with any IO, you should always do it in a try/finally block to make sure you close it:
FileInputStream fis = new FileInputStream(f);
BufferedReader reader = null;
try {
    reader = new BufferedReader(new InputStreamReader(fis));
    // once we wrap the fis in a reader, we just close the reader
} finally {
    if (reader != null) {
        reader.close();
    }
    if (fis != null) {
        fis.close();
    }
}
The problem with your code is that you're reading the file byte by byte (one byte per read call). Read it into an array chunk by chunk instead, and the performance will match the buffered version.
You may want to try out NIO and memory-mapped files as well; see http://www.linuxtopia.org/online_books/programming_books/thinking_in_java/TIJ314_029.htm
You can read blocks of data at a time which can still be faster than using a buffered input.
FileInputStream fis = new FileInputStream(new File("Test.txt"));
int len, ctr = 0;
byte[] bytes = new byte[8192];
while ((len = fis.read(bytes)) > 0)
    for (int i = 0; i < len; i++)
        if (bytes[i] == '\t')
            ctr++;
fis.close();
You can also try memory mapping.
FileChannel fc = new FileInputStream(new File("Test.txt")).getChannel();
ByteBuffer bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, fc.size());
int ctr = 0;
for (int i = 0; i < bb.limit(); i++)
    if (bb.get(i) == '\t')
        ctr++;
fc.close();
I would expect both of these options to be about twice as fast.

InputStream read doesn't read the data

I'm having an issue reading from a java input stream. I have a buffer of size 1024, and an input stream of size 29k-31k. I read the inputStream in a loop, but I only get 29 bytes for the first read, 39 for the second read, and nothing after that. The same behavior repeats for different InputStreams. (I'm writing the data to an output stream but I don't see how this can affect the first read)
int bytesRead = 0;
byte[] byteBuf = new byte[1024];
OutputStream fileStream = FileUtil.openFileForWrite(saveTo);
bytesRead = reader.read(byteBuf);
while (bytesRead != -1) {
    fileStream.write(byteBuf, 0, bytesRead);
    bytesRead = reader.read(byteBuf);
}
What am I missing?
Any help is appreciated :)
Where are you getting the input stream from? How do you know that it's 29K-31K?
Your code looks reasonable to me, although I generally structure the loop slightly differently to avoid duplicating the read call.
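The restructured loop presumably looks something like this (a sketch reusing the question's own names, including the FileUtil helper):
int bytesRead;
byte[] byteBuf = new byte[1024];
OutputStream fileStream = FileUtil.openFileForWrite(saveTo);
while ((bytesRead = reader.read(byteBuf)) != -1) { // read and test in one place
    fileStream.write(byteBuf, 0, bytesRead);
}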
Have you tried using readLine() instead of read()?
Path file = ...;
InputStream in = null;
try {
    in = Files.newInputStream(file);
    BufferedReader reader = new BufferedReader(new InputStreamReader(in));
    String line = null;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);
    }
} catch (IOException x) {
    System.err.println(x);
} finally {
    if (in != null) in.close();
}
