I may be overthinking this, but I just wrote the code:
try (InputStream in = ModelCodeGenerator.class.getClassLoader().getResourceAsStream("/model.java.txt"))
{
modelTemplate = new SimpleTemplate(CharStreams.toString(new InputStreamReader(in, "ascii")));
}
Which means the InputStreamReader is never closed (but in this case we know its close method just closes the underlying InputStream.)
One could write it as:
try (InputStreamReader reader = new InputStreamReader(...))
But this seems worse. If InputStreamReader throws for some reason, the InputStream won't ever be closed, right? This is a common problem in C++ with constructors that call other constructors. Exceptions can cause memory/resource leaks.
Is there a best practice here?
Which means the InputStreamReader is never closed
Eh? In your code it is... And it will certainly handle the .close() of your resource stream as well. See below for more details...
As @SotiriosDelimanolis mentions, however, you can declare more than one resource in the "resource block" of a try-with-resources statement.
You have another problem here: .getResourceAsStream() can return null; you may therefore have an NPE.
I'd do this if I were you:
final URL url = ModelCodeGenerator.class.getClassLoader()
.getResource("/model.java.txt");
if (url == null)
throw new IOException("resource not found");
try (
final InputStream in = url.openStream();
final Reader reader = new InputStreamReader(in, someCharsetOrDecoder);
) {
// manipulate resources
}
There is a very important point to consider however...
Closeable does extend AutoCloseable, yes; in fact it only differs, signature-wise, by the exception thrown (IOException vs Exception). But there is a fundamental difference in behavior.
From the javadoc of AutoCloseable's .close() (emphasis mine):
Note that unlike the close method of Closeable, this close method is not required to be idempotent. In other words, calling this close method more than once may have some visible side effect, unlike Closeable.close which is required to have no effect if called more than once. However, implementers of this interface are strongly encouraged to make their close methods idempotent.
And indeed, the javadoc of Closeable is clear about this:
Closes this stream and releases any system resources associated with it. If the stream is already closed then invoking this method has no effect.
You have two very important points:
by contract, a Closeable also takes care of all resources associated with it; so, if you close a BufferedReader which wraps a Reader which wraps an InputStream, all three are closed;
should you call .close() more than once, there is no further side effect.
This also means, of course, that you can choose the paranoid option and keep a reference to all Closeable resources and close them all; beware however if you have AutoCloseable resources into the mix which are not Closeable!
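To illustrate that paranoid option, here is a minimal sketch (reusing the url from the snippet above, with error handling kept to a bare minimum): keep a reference to every Closeable you create and close them all in a finally block, in reverse order, relying on the idempotence of Closeable.close().
InputStream in = null;
Reader reader = null;
try {
    in = url.openStream();
    reader = new InputStreamReader(in, someCharsetOrDecoder);
    // manipulate resources
} finally {
    // Close in reverse order of creation; closing the reader also closes
    // the stream, and the second close() on the stream is a harmless no-op
    // because Closeable.close() is required to be idempotent.
    if (reader != null) reader.close();
    if (in != null) in.close();
}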
But this seems worse. If InputStreamReader throws for some reason, the
InputStream won't ever be closed, right?
That's right (although unlikely, the InputStreamReader constructor doesn't really do much).
The try-with-resources statement lets you declare as many resources as you'd like. Declare one for the wrapped resource, and another for the InputStreamReader.
try (InputStream in = ModelCodeGenerator.class
.getClassLoader()
.getResourceAsStream("/model.java.txt");
InputStreamReader reader = new InputStreamReader(in)) {...}
Note that getResourceAsStream can potentially return null, which would cause the InputStreamReader constructor to throw a NullPointerException. If you want to deal with that differently, adapt how you retrieve the resource that's meant to be wrapped.
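For example, here is one minimal sketch of handling the null case explicitly before wrapping (the choice of exception is just one reasonable option; this variant uses Class.getResourceAsStream, which does accept a leading slash):
InputStream raw = ModelCodeGenerator.class.getResourceAsStream("/model.java.txt");
if (raw == null) {
    throw new FileNotFoundException("/model.java.txt not found on classpath");
}
try (InputStream in = raw;
     Reader reader = new InputStreamReader(in, StandardCharsets.US_ASCII)) {
    modelTemplate = new SimpleTemplate(CharStreams.toString(reader));
}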
The Oracle try-with-resources tutorial presents this example:
try (
java.util.zip.ZipFile zf =
new java.util.zip.ZipFile(zipFileName);
java.io.BufferedWriter writer =
java.nio.file.Files.newBufferedWriter(outputFilePath, charset)
) {
with the explanation
In this example, the try-with-resources statement contains two
declarations that are separated by a semicolon: ZipFile and
BufferedWriter. When the block of code that directly follows it
terminates, either normally or because of an exception, the close
methods of the BufferedWriter and ZipFile objects are automatically
called in this order. Note that the close methods of resources are
called in the opposite order of their creation.
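The reverse close order is easy to observe with a couple of throwaway AutoCloseable classes; the following is purely an illustrative sketch (all names here are made up):
class Tracked implements AutoCloseable {
    private final String name;
    Tracked(String name) { this.name = name; System.out.println("open " + name); }
    @Override public void close() { System.out.println("close " + name); }
}

public class CloseOrderDemo {
    public static void main(String[] args) {
        try (Tracked first = new Tracked("first resource");
             Tracked second = new Tracked("second resource")) {
            System.out.println("body runs");
        }
        // Prints: open first resource, open second resource, body runs,
        // close second resource, close first resource
    }
}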
Related
I saw this example on the web and in the Book "Effective Java" (by Joshua Bloch).
try(BufferedWriter writer = new BufferedWriter(new FileWriter(fileName))){
writer.write(str); // do something with the file we've opened
}
catch(IOException e){
// handle the exception
}
There is no problem with this example: the BufferedWriter will automatically get closed and that, in turn, will close the FileWriter. However, if we instead declare two nested resources this way:
try (AutoCloseable res = new Impl2(new Impl1())) { ... }
I guess it might happen that new Impl1() performs fine, but new Impl2() crashes, and in this case Java would have no reference to the Impl1, in order to close it.
Shouldn't it be a better practice to always declare multiple resources independently (even if not required in this case), like this?
try(FileWriter fw = new FileWriter(fileName);
BufferedWriter writer = new BufferedWriter(fw)){ ... }
After some searching I was able to find this article: https://dzone.com/articles/carefully-specify-multiple-resources-in-single-try
By the definition in JLS 14.20.3, a ResourceList consists of Resources separated by ;. Based on that, we can conclude that a nested initialization like AutoCloseable res = new Impl2(new Impl1()) is a single resource. Because of that, the rules defined for try-with-resources with multiple resources won't apply here, the important ones being:
Resources are initialized in left-to-right order. If a resource fails to initialize (that is, its initializer expression throws an exception), then all resources initialized so far by the try-with-resources statement are closed. If all resources initialize successfully, the try block executes as normal and then all non-null resources of the try-with-resources statement are closed.
Resources are closed in the reverse order from that in which they were initialized. A resource is closed only if it initialized to a non-null value. An exception from the closing of one resource does not prevent the closing of other resources. Such an exception is suppressed if an exception was thrown previously by an initializer, the try block, or the closing of a resource.
What is more, Impl1#close() won't be called unless it is explicitly called inside Impl2#close().
In short, it is better to declare multiple resources in separate statements separated with ;, like so:
try(Impl1 impl1 = new Impl1();
Impl2 impl2 = new Impl2(impl1))
If the exception happens in the try block, there is no resource leakage (assuming Impl2 is written well, i.e. its close also closes the wrapped Impl1). Notice that an exception raised in Impl1() will not reach Impl2, as the constructor argument is evaluated before the Impl2 constructor is called.
try (AutoCloseable res = new Impl2(new Impl1())) {
So it is fine to nest such wrapping constructors; it is the better style, as long as the code does not become too long.
One remark: FileWriter and FileReader are old utility classes using the platform encoding, which will differ per application installation.
Path path = Paths.get(fileName);
try (BufferedWriter writer =
        Files.newBufferedWriter(path, StandardCharsets.UTF_8)) {
    writer.write(str); // do something with the file we've opened
}
First of all, for the sake of a little sanity check: I could not find the example you provided in Joshua Bloch's Effective Java (3rd Edition, 2018). If you are reading a previous edition, it would be better to get the latest one. If I have simply missed it, please point me to the particular page number.
Now with respect to the question itself.
Let's begin with JLS §14.20.3, which says that Resources are separated by ;. This means that regardless of how long our decoration chain of object creations is (e.g. new Obj1(new Obj2(...new ObjK(...)))), it will be treated as one single resource, as it is one definition/statement.
Now that we know what constitutes a single resource, let me shed some light from my observations.
Reasons why it is better to define resources separately
JLS §14.20.3 also states that:
If a resource fails to initialize (that is, its initializer expression throws an exception), then all resources initialized so far by the try-with-resources statement are closed. If all resources initialize successfully, the try block executes as normal and then all non-null resources of the try-with-resources statement are closed.
Q: What does that mean for us?
A: If that single resource is a chain of objects passed into wrapping constructors, an exception thrown during initialization will not close the embedded resources; the .close() method will only be invoked on the parent (wrapper, enclosing) object, and therefore you might easily end up with a resource leak.
On the other hand, you can rest assured that all resources will be closed if you have defined them separately.
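To make this concrete, here is a hypothetical sketch using toy Impl1/Impl2 classes (the failing Impl2 constructor is an assumption made purely for the demonstration):
class Impl1 implements AutoCloseable {
    Impl1() { System.out.println("Impl1 opened"); }
    @Override public void close() { System.out.println("Impl1 closed"); }
}

class Impl2 implements AutoCloseable {
    Impl2(Impl1 inner) { throw new IllegalStateException("Impl2 failed to initialize"); }
    @Override public void close() { System.out.println("Impl2 closed"); }
}

public class NestedResourceDemo {
    public static void main(String[] args) {
        // Nested form: Impl1 is constructed, Impl2's constructor throws, and no
        // reference to the Impl1 instance is held -> "Impl1 closed" never prints.
        try (Impl2 res = new Impl2(new Impl1())) {
        } catch (IllegalStateException e) {
            System.out.println("nested form leaked Impl1");
        }

        // Separate form: Impl1 is its own resource of the try statement, so when
        // Impl2's initializer throws, Impl1.close() is still invoked.
        try (Impl1 impl1 = new Impl1();
             Impl2 impl2 = new Impl2(impl1)) {
        } catch (IllegalStateException e) {
            System.out.println("separate form: Impl1 was closed");
        }
    }
}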
JLS §14.20.3 also states that:
An exception from the closing of one resource does not prevent the closing of other resources. Such an exception is suppressed if an exception was thrown previously by an initializer, the try block, or the closing of a resource.
Q: What does that mean for us?
A: If you have declared your resources separately, then it does not matter which one throws an exception and which does not; close() will still be attempted on every resource that was initialized.
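If you want to see what happened to the exceptions that "lose", they are attached to the primary exception as suppressed exceptions and can be retrieved; a small hypothetical sketch (FailingResource is a made-up class):
class FailingResource implements AutoCloseable {
    private final String name;
    FailingResource(String name) { this.name = name; }
    @Override public void close() {
        throw new IllegalStateException(name + ".close() failed");
    }
}

public class SuppressedDemo {
    public static void main(String[] args) {
        try (FailingResource a = new FailingResource("a");
             FailingResource b = new FailingResource("b")) {
            throw new RuntimeException("failure in the try block");
        } catch (RuntimeException e) {
            // The exception from the block "wins"; the exceptions thrown by
            // b.close() and a.close() are recorded on it as suppressed.
            System.out.println("primary: " + e.getMessage());
            for (Throwable suppressed : e.getSuppressed()) {
                System.out.println("suppressed: " + suppressed.getMessage());
            }
        }
    }
}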
Unfortunately, the book you mentioned (at least the 3rd edition of it) does not cover the question you brought up here; however, I did some more research and found that Core Java (10th Edition), by Cay S. Horstmann, confirms the point I referred to above, in its §7.2.5:
When the block exits normally, or when there was an exception, the in.close()
method is called, exactly as if you had used a finally block.
and
No matter how the block exits, both in and out are closed.
in and out are AutoCloseable objects in the examples given in the book.
Q: What does this mean?
A: It means that an exception thrown at any phase by one of the resources does not in any way affect how the other resources are closed.
Based on all of the above, I think it also depends on how the resources are implemented. E.g. if the enclosing resource, upon an invocation of its .close(), implements the logic to close the enclosed resource, then your concern:
I guess it might happen that new Impl1() performs fine, but new Impl2() crashes, and in this case Java would have no reference to the Impl1, in order to close it.
will not really be a problem in that particular case, as the container resource being closed will hold a reference to the contained resource and will eventually close it (or will otherwise take care of its closure);
However, it is still a bad idea to construct resources by chaining them into wrapper constructors, for the reasons brought up above.
P. S.
In any case, regardless of how the resources are implemented or how you chain them, everything (the JLS, the Core Java book, the OCP Java SE 8 book, and the points brought up here) indicates that it's always best to declare your resources separately.
The javadoc for Stream states:
Streams have a BaseStream.close() method and implement AutoCloseable, but nearly all stream instances do not actually need to be closed after use. Generally, only streams whose source is an IO channel (such as those returned by Files.lines(Path, Charset)) will require closing. Most streams are backed by collections, arrays, or generating functions, which require no special resource management. (If a stream does require closing, it can be declared as a resource in a try-with-resources statement.)
Therefore, the vast majority of the time one can use Streams in a one-liner, like collection.stream().forEach(System.out::println); but for Files.lines and other resource-backed streams, one must use a try-with-resources statement or else leak resources.
This strikes me as error-prone and unnecessary. As Streams can only be iterated once, it seems to me that there is no situation in which the output of Files.lines should not be closed as soon as it has been iterated, and therefore the implementation should simply call close implicitly at the end of any terminal operation. Am I mistaken?
Yes, this was a deliberate decision. We considered both alternatives.
The operating design principle here is "whoever acquires the resource should release the resource". Files don't auto-close when you read to EOF; we expect files to be closed explicitly by whoever opened them. Streams that are backed by IO resources are the same.
Fortunately, the language provides a mechanism for automating this for you: try-with-resources. Because Stream implements AutoCloseable, you can do:
try (Stream<String> s = Files.lines(...)) {
s.forEach(...);
}
The argument that "it would be really convenient to auto-close so I could write it as a one-liner" is nice, but would mostly be the tail wagging the dog. If you opened a file or other resource, you should also be prepared to close it. Effective and consistent resource management trumps "I want to write this in one line", and we chose not to distort the design just to preserve the one-line-ness.
I have a more specific example in addition to @BrianGoetz's answer. Don't forget that Stream has escape-hatch methods like iterator(). Suppose you are doing this:
Iterator<String> iterator = Files.lines(path).iterator();
After that you may call hasNext() and next() several times, then just abandon this iterator: the Iterator interface perfectly supports such use. There's no way to explicitly close the Iterator; the only object you can close here is the Stream. So this way it would work perfectly fine:
try(Stream<String> stream = Files.lines(path)) {
Iterator<String> iterator = stream.iterator();
// use iterator in any way you want and abandon it at any moment
} // file is correctly closed here.
In addition, if you want a "one-line write", you can just do this:
Files.readAllLines(source).stream().forEach(...);
You can use it if you are sure that you need the entire file and the file is small, because it isn't a lazy read.
If you're lazy like me and don't mind the "if an exception is raised, it will leave the file handle open" you could wrap the stream in an autoclosing stream, something like this (there may be other ways):
static Stream<String> allLinesCloseAtEnd(String filename) throws IOException {
    Stream<String> lines = Files.lines(Paths.get(filename));
    Iterator<String> linesIter = lines.iterator();
    Iterator<String> it = new Iterator<String>() {
        @Override
        public boolean hasNext() {
            if (!linesIter.hasNext()) {
                lines.close(); // auto-close when we reach the end
                return false;
            }
            return true;
        }

        @Override
        public String next() {
            return linesIter.next();
        }
    };
    // ORDERED (not DISTINCT): file lines keep their order but need not be unique
    return StreamSupport.stream(
            Spliterators.spliteratorUnknownSize(it, Spliterator.ORDERED), false);
}
What is the difference between creating an object and then passing it as a parameter, versus declaring the object inline as an argument in the constructor call? That is, what is the difference between 1) and 2)?
1)
InputStreamReader isr=new InputStreamReader(System.in);
BufferedReader br=new BufferedReader(isr);
2)
BufferedReader br=new BufferedReader(new InputStreamReader(System.in));
Two differences come to mind:
In theory, with the all-in-one construction, if the outer constructor threw an exception, you'd have an instance of the inner object lying around that you never call close on (because you don't have a reference to it that you can use to do that), so it won't get closed until finalization (if then). In practice, I don't think BufferedReader's constructor can throw an exception, but...
This is part of what Java's new try-with-resources statement is designed to help with:
try (
InputStreamReader isr=new InputStreamReader(System.in);
BufferedReader br=new BufferedReader(isr);
)
{
// do your stuff here
}
Using try-with-resources, you can be sure that even if the second constructor throws, the first object gets closed correctly.
Note that when you use try-with-resources with multiple declarations as above, close is called in the reverse order in which they appear, which is usually exactly what you want (in this case, br.close() is called before isr.close()).
With the all-in-one construction, you're assuming that the outer object will close the object you pass it when you close it (because, again, you have no reference to the inner object to use to close it). This is true of BufferedReader, but may not be universally true. Again try-with-resources helps there, by ensuring that close is called.
If you're dealing with objects that don't need close or similar, there's not much difference between the two at all. It's easier to single-step over the separate statements when debugging than it is the all-in-one construction, but other than that, there's not much in it.
I know it has been the point of many previous questions to close or not to close a ServletOutputStream like here: Should I close the servlet outputstream? or here: Should one call .close() on HttpServletResponse.getOutputStream()/.getWriter()? or with another focus here: Do I need to flush the servlet outputstream?
The general consensus seems to be not to close it because you are not owning it in a more strict sense. (The HttpServletResponse owns it.)
But what about e.g. these constructs:
PrintWriter out = new PrintWriter( new OutputStreamWriter( resp.getOutputStream(), MY.ENCODING ) );
Now I'm clearly the owner of the PrintWriter, which has some additional buffers that at least need to get flushed (and which are flushed e.g. by closing it).
What is the general consensus here? Do I need to close the PrintWriter (or any other such construct as for this matter.)?
EDIT: There are valid arguments for closing the stream, too. Notably, e.g., not wanting something else to write to the stream. And meanwhile we have try-with-resources constructs, which might change the picture. See my other question here: Eclipse complaining over resource leak when not closing ServletOutputStream
This might change nothing (and my general feeling is still not to close the stream), but try-with-resources above all literally screams for code like:
try( Something out = new Something( resp.getOutputStream() ) ){
out.print( "Foo" );
}
instead of
Something out = null;
try {
    out = new Something( resp.getOutputStream() );
    out.print( "Foo" );
} finally {
    if( out != null && out.isUnFlushedWhatever() ) out.flush();
}
The OutputStream is something you're not creating, you just query a reference to it with ServletResponse.getOutputStream(). Therefore if you put something around it (e.g. OutputStreamWriter or ZipOutputStream) the wrapper stream or writer will just write to it.
It is implementation-dependent whether closing a wrapper stream or writer closes the underlying stream, so you should not close the wrapper. But since in most cases the wrappers only use the underlying stream to write bytes, it is more than enough to flush the wrapper.
In cases where the wrapper needs some finalizing, it should be (and generally is) the wrapper's responsibility to provide this finalizing functionality in a separate method. For example ZipOutputStream provides a finish() method which finishes writing the contents of the ZIP output stream without closing the underlying stream.
Summarizing:
You should not close the wrapper; instead, check whether it provides some finalizing method that does not close the underlying stream, and if so, obviously call that.
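Applied to the construct from the question, that could look roughly like the following sketch (assuming flushing, not closing, is all the wrapper needs; MY.ENCODING is the constant from the question):
PrintWriter out = new PrintWriter(
        new OutputStreamWriter( resp.getOutputStream(), MY.ENCODING ) );
try {
    out.print( "Foo" );
} finally {
    // Push the wrapper's buffered data down into the ServletOutputStream,
    // but leave the container-owned stream itself open.
    out.flush();
}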
I need to design an API method which takes an OutputStream as a parameter.
Is it a good practice to close the stream inside the API method or let the caller close it?
void test(OutputStream os) throws IOException {
    os.close(); // ???
}
I think it should be symmetric.
If you do not open that stream (which is likely to be your case), you should not close it, either, in general.
Unless the purpose of the API is to "finish up the stream", you should let the caller close. He had it first, he was responsible for it, and he may decide that he wants to write some stuff to the stream that your API didn't originally envision. Keep your functionality separated; it's more composable.
Let the caller close it. Since you take the OutputStream as an argument, we can assume the caller has already created and opened it, so it would not be good to close it inside your method. If, on the other hand, your method were the one that creates and opens the stream, there would be no need to take it as an argument, and your method could also close it.
Different use-cases require different patterns, for example, depending on whether the caller needs to read from or write to the stream after the call has completed.
The key API design rule is that the API should specify whether it is the caller or called method's responsibility to close the stream.
Having said that, it is generally simpler and safer if the code that opens a stream is also responsible for closing it.
Consider the case where methodA is supposed to open a stream and pass it to methodB, but an exception is thrown between the stream being opened and methodB entering the try / finally statement that is ultimately responsible for closing it. You need to code it something like the following to ensure that streams don't leak:
public void methodA() throws IOException {
InputStream myStream = new FileInputStream(...);
try {
// do stuff with stream
methodB(myStream);
} finally {
myStream.close();
}
}
/**
* @param myStream this method is responsible for closing myStream.
*/
public void methodB(InputStream myStream) throws IOException {
try {
// do more stuff with myStream
} finally {
myStream.close();
}
}
This won't leak an open stream as a result of exceptions (or errors!) thrown in either methodA or methodB. (It works for the standard stream types because the Closeable API specifies that close has no effect when called on a stream that is already closed.)
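For what it's worth, the same contract can also be written more compactly with try-with-resources; here is a sketch of the same two methods (again relying on Closeable.close() being a no-op when the stream is already closed):
public void methodA() throws IOException {
    try (InputStream myStream = new FileInputStream(...)) {
        // do stuff with stream
        methodB(myStream);
    }
    // myStream.close() is called again here even though methodB already
    // closed it; the second call has no effect for Closeable streams.
}

/**
 * @param myStream this method is responsible for closing myStream.
 */
public void methodB(InputStream myStream) throws IOException {
    try (InputStream closing = myStream) { // on Java 9+ this can simply be try (myStream)
        // do more stuff with myStream
    }
}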