Buffered output streams: appending - Java

I know how to create/write/close a buffered output stream. How do I reopen the stream and append data to the end of the file?

You can't reopen a stream... but you can create a new stream which will append:
FileOutputStream output = new FileOutputStream(file, true);
See the constructor list for FileOutputStream for the various options.
(Of course, you can then wrap that FileOutputStream with a BufferedOutputStream.)
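For example, a minimal sketch of appending through a BufferedOutputStream (the file name and data here are placeholders):
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class AppendExample {
    public static void main(String[] args) throws IOException {
        File file = new File("data.bin");   // hypothetical file name

        // The second constructor argument 'true' opens the file in append mode,
        // so new bytes are written after the existing content.
        try (OutputStream out = new BufferedOutputStream(
                new FileOutputStream(file, true))) {
            out.write("more data".getBytes());
        } // try-with-resources flushes and closes both streams
    }
}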

Related

ImageIO.read closes input stream

I write images and other data to a binary file. When I read an image from that file via ImageIO.read(InputStream), the image is read correctly, but the method seems to close the given input stream and I can't proceed to read the other data.
Why is it made this way?
How can I read the image without closing the stream?
EDIT: Here is the simple code that writes an image and then a string into the file:
File f = new File("test.bin");
if (f.exists())
    f.delete();
f.createNewFile();
DataOutputStream os = new DataOutputStream(new FileOutputStream(f));
BufferedImage img = ImageIO.read(new File("test.jpg"));
ImageIO.write(img, "jpg", os);
os.writeUTF("test string after image");
os.close();
And the code that reads it all back:
DataInputStream is = new DataInputStream(new FileInputStream(f));
BufferedImage img = ImageIO.read(is);
String s = is.readUTF(); // on this line EOFException occurs
System.out.println(s);
NetBeans output:
Exception in thread "main" java.io.EOFException
at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:340)
at java.io.DataInputStream.readUTF(DataInputStream.java:589)
at java.io.DataInputStream.readUTF(DataInputStream.java:564)
at mediamanager.Main.test(Main.java:105)
at mediamanager.Main.main(Main.java:44)
Maybe I'm doing something wrong?
Quote from the documentation of ImageIO.read(InputStream):
This method does not close the provided InputStream after the read operation has completed; it is the responsibility of the caller to close the stream, if desired.
So, according to the documentation, the stream is not closed.
The problem is elsewhere. Probably in your code.
I can see two possible causes of this behaviour:
The image reader uses a buffer to read data from the stream to improve performance, so it reads more data from the stream than the image itself occupies.
The image reader could also try to read EXIF metadata for the already parsed image. Such information is usually appended at the end of the file, so that the whole file does not have to be rewritten when you are just adding a couple of pieces of information about the image.
Try ImageIO.setUseCache(false); it could help.
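One common workaround (a sketch, not from the original answer; it reuses the os, is and img variables from the snippets above and assumes ByteArrayOutputStream/ByteArrayInputStream are imported) is to keep ImageIO away from the shared stream entirely: serialize the image into a byte array and length-prefix it, so the reader knows exactly where the image ends.
// Writing: render the image into a byte array, then length-prefix it.
ByteArrayOutputStream imgBytes = new ByteArrayOutputStream();
ImageIO.write(img, "jpg", imgBytes);
os.writeInt(imgBytes.size());           // number of image bytes that follow
imgBytes.writeTo(os);                   // the image data itself
os.writeUTF("test string after image");
os.close();

// Reading: consume exactly that many bytes before parsing the image.
int len = is.readInt();
byte[] buf = new byte[len];
is.readFully(buf);
BufferedImage img2 = ImageIO.read(new ByteArrayInputStream(buf));
String s = is.readUTF();                // stream is now positioned right after the image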

How to write new line in Java FileOutputStream

I want to write a new line using a FileOutputStream; I have tried the following approaches, but none of them are working:
encfileout.write('\n');
encfileout.write("\n".getBytes());
encfileout.write(System.getProperty("line.separator").getBytes());
This should work. Probably you forgot to call encfileout.flush().
However, this is not the preferred way to write text. You should wrap your output stream in a PrintWriter and use its println() methods:
PrintWriter writer = new PrintWriter(new OutputStreamWriter(encfileout, charset));
Alternatively you can use FileWriter instead of FileOutputStream from the beginning:
FileWriter fw = new FileWriter("myfile");
PrintWriter writer = new PrintWriter(fw);
Now just call
writer.println();
And do not forget to call flush() and close() when you are done.
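Put together, a minimal sketch of the PrintWriter approach (the file name and charset here are placeholders):
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;

public class NewlineExample {
    public static void main(String[] args) throws IOException {
        try (PrintWriter writer = new PrintWriter(new OutputStreamWriter(
                new FileOutputStream("myfile.txt"), StandardCharsets.UTF_8))) {
            writer.println("first line");   // println appends the platform line separator
            writer.println("second line");
        } // try-with-resources flushes and closes the writer
    }
}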
It could be a viewer problem... Try opening the file in EditPlus or Notepad++. Windows Notepad may not recognize the line feed of another operating system. In which program are you viewing the file now?
String lineSeparator = System.getProperty("line.separator");
fos.write(lineSeparator.getBytes());
To add a line break, use
fileOutputStream.write(10);
where the decimal value 10 is the ASCII code for the newline character ('\n').

Write with ObjectOutputStream into multiple ZipEntrys in a single ZipOutputStream

I want to create a zip archive in Java where each contained file is produced by serializing some objects. I have a problem with correctly closing the streams.
The code looks like this:
try (OutputStream os = new FileOutputStream(file);
     ZipOutputStream zos = new ZipOutputStream(os)) {
    ZipEntry ze;
    ObjectOutputStream oos;

    ze = new ZipEntry("file1");
    zos.putNextEntry(ze);        // start first file in zip archive
    oos = new ObjectOutputStream(zos);
    oos.writeObject(obj1a);
    oos.writeObject(obj1b);
    // I want to close oos here without closing zos
    zos.closeEntry();            // end first file in zip archive

    ze = new ZipEntry("file2");
    zos.putNextEntry(ze);        // start second file in zip archive
    oos = new ObjectOutputStream(zos);
    oos.writeObject(obj2a);
    oos.writeObject(obj2b);
    // And here again
    zos.closeEntry();            // end second file in zip archive
}
I know of course that I should close each stream after finishing using it, so I should close the ObjectOutputStreams in the indicated positions. However, closing the ObjectOutputStreams would also close the ZipOutputStream that I still need.
I do not want to omit the call to ObjectOutputStream.close() because I do not want to rely on the fact that it currently does nothing more than flush() and reset().
I also cannot use a single ObjectOutputStream instance, because then I would be missing the stream header that is written by the constructor (each individual file in the zip archive would not be a complete object serialization file, and I could not deserialize them independently).
The same problem occurs when reading the file again.
The only way I see would be to wrap the ZipOutputStream in some kind of "CloseProtectionOutputStream" that would forward all methods except close() before giving it to the ObjectOutputStream. However, this seems rather hacky and I wonder if I missed a nicer solution in the API.
If your OutputStream wrapper throws an exception when closed more than once, it is not a hack. You can create a wrapper for each zip entry.
From an architectural point of view, I think the ObjectOutputStream author should have provided an option to disable close() cascading. You are just working around a gap in the API.
In this case, and for all the reasons you mentioned, I would simply not pipe my ObjectOutputStream into the ZipOutputStream. Instead, serialize to a byte[] and then write that straight into the ZipOutputStream. This way, you are free to close the ObjectOutputStream, and each byte[] you produce will have the proper header from the serializer. One downside is that you wind up with a byte[] in memory that you didn't have before, but if you get rid of it right away, and assuming we're not talking about millions of objects, the garbage collector shouldn't have a hard time cleaning up.
Just my two cents...
It at least sounds less hacky than a stream subclass that changes the close() behavior.
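A rough sketch of that byte[] approach (reusing the zos, obj1a and obj1b names from the question, with ByteArrayOutputStream assumed to be imported):
// Serialize one entry's objects into memory first.
ByteArrayOutputStream bos = new ByteArrayOutputStream();
try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
    oos.writeObject(obj1a);
    oos.writeObject(obj1b);
} // closing oos only closes the in-memory buffer, zos is untouched

// Then copy the complete serialization stream (header included) into the entry.
zos.putNextEntry(new ZipEntry("file1"));
zos.write(bos.toByteArray());
zos.closeEntry();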
If you're intending to throw the ObjectOutputStream away anyway, then it should be sufficient to call flush() rather than close(), but as you say in the question the safest approach is probably to use a wrapper around the underlying ZipOutputStream that blocks the close() call. Apache commons-io has CloseShieldOutputStream for this purpose.
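For reference, a sketch of the CloseShieldOutputStream variant (org.apache.commons.io.output.CloseShieldOutputStream; newer commons-io versions also offer a CloseShieldOutputStream.wrap(...) factory), again reusing the names from the question:
zos.putNextEntry(new ZipEntry("file1"));
ObjectOutputStream oos = new ObjectOutputStream(new CloseShieldOutputStream(zos));
oos.writeObject(obj1a);
oos.writeObject(obj1b);
oos.close();        // flushes, then closes only the shield; zos stays open
zos.closeEntry();
For the reading side, commons-io provides the analogous CloseShieldInputStream.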

Optimized solution for reading from a file, manipulating its data and writing to another file in JAVA

I have a file "File 1" containing approximately 100,000 lines.
I have to read this file line by line, add some text to each line, and write it to another file "File 2".
I don't want to perform a write operation on "File 2" for each line, as that is expensive.
Which is the better option for holding each modified line of "File 1" in memory?
ArrayList or StringBuilder or something else?
At the end I will write the complete data held in memory to "File 2".
Writing through an appropriately sized buffer might not be as expensive as you think. Keeping all of the content in memory, when your problem looks more like a stream-processing one, sounds like a bad idea.
Reading File 1 from a BufferedReader, changing something and passing the result to a BufferedWriter might be your best bet.
Just use a BufferedOutputStream to reduce the number of physical writes:
BufferedReader reader = new BufferedReader(
        new InputStreamReader(new FileInputStream("input.txt")));
Writer writer = new OutputStreamWriter(
        new BufferedOutputStream(new FileOutputStream("output.txt")));
String line;
while ((line = reader.readLine()) != null)
{
    writer.write(line + " foo \n");
}
reader.close();
writer.close();
Well, I don't think reading it line by line will have a huge performance impact. If you are using Java 5 or above, you can make use of java.nio, which takes advantage of native OS facilities.
Check this link for more info:
http://docs.oracle.com/javase/tutorial/essential/io/file.html
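If the newer file API is available (java.nio.file requires Java 7 or later), a stream-processing sketch might look like this; the file names are placeholders:
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class TransformLines {
    public static void main(String[] args) throws IOException {
        Path in = Paths.get("input.txt");    // hypothetical file names
        Path out = Paths.get("output.txt");

        try (BufferedReader reader = Files.newBufferedReader(in, StandardCharsets.UTF_8);
             BufferedWriter writer = Files.newBufferedWriter(out, StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                writer.write(line + " foo"); // same transformation as in the snippet above
                writer.newLine();
            }
        }
    }
}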

Different ways of creating files

There are methods for creating files in java.io.File (like createNewFile() or mkdir()). Are there other ways of creating files in Java SE using "standard" API?
When you create a FileOutputStream, the file is created if it does not exist, although this is not guaranteed:
A file output stream is an output stream for writing data to a File or to a FileDescriptor. Whether or not a file is available or may be created depends upon the underlying platform.
FileOutputStream can be used to create a file as shown below
FileOutputStream fos = new FileOutputStream("myfile");
FileOutputStream fos = new FileOutputStream(new File("myfile"));
You can use PrintWriter in conjunction with FileWriter, for example PrintWriter writer = new PrintWriter(new FileWriter("FileName", false)); (the false argument means the file is truncated rather than appended to). BufferedWriter works with FileWriter as well, for example BufferedWriter writer = new BufferedWriter(new FileWriter("FileName"));
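A short sketch combining the approaches mentioned in these answers (file names are placeholders):
import java.io.File;
import java.io.FileOutputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class CreateFiles {
    public static void main(String[] args) throws IOException {
        // createNewFile() creates an empty file and reports whether it was newly created
        boolean created = new File("a.txt").createNewFile();

        // Opening a FileOutputStream creates the file if it does not exist
        try (FileOutputStream fos = new FileOutputStream("b.txt")) {
            fos.write("hello".getBytes());
        }

        // FileWriter (wrapped in a PrintWriter) also creates the file;
        // the 'false' argument truncates instead of appending
        try (PrintWriter writer = new PrintWriter(new FileWriter("c.txt", false))) {
            writer.println("hello");
        }
    }
}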
