My test method always throws a RuntimeException, and I can catch that RuntimeException:
void test() {
    throw new RuntimeException();
}
System.out.println("start");
try {
test();
}
catch(RuntimeException e) {
System.out.println(e);
}
System.out.println("end");
This code produces the following output:
start
java.lang.RuntimeException
end
But in a stream, I can't catch the RuntimeException:
System.out.println("start");
try {
nums.stream().map((num) -> {
test();
return null;
});
}
catch(RuntimeException e) {
System.out.println(e);
}
System.out.println("end");
This code produces the following output:
start
end
Why can't I catch the RuntimeException in the stream?
As it says in the Javadoc of the java.util.stream package (emphasis added):
Intermediate operations return a new stream. They are always lazy; executing an intermediate operation such as filter() does not actually perform any filtering, but instead creates a new stream that, when traversed, contains the elements of the initial stream that match the given predicate. Traversal of the pipeline source does not begin until the terminal operation of the pipeline is executed.
map is an intermediate operation, and you don't have a terminal operation. As such, the pipeline source is not traversed.
Change map to forEach (and remove the return):
nums.stream().forEach((num) -> {
    test();
});
(I also assume that nums is non-empty.)
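For completeness, here is a minimal sketch (assuming nums is a non-empty List<Integer>): keeping map also works, as long as it is followed by a terminal operation that actually traverses the elements, such as toArray():

System.out.println("start");
try {
    nums.stream().map((num) -> {
        test();
        return num;
    }).toArray(); // terminal operation: traversal begins, the exception is thrown
} catch (RuntimeException e) {
    System.out.println(e); // now prints java.lang.RuntimeException
}
System.out.println("end");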
In the first snippet below, I would expect "ex" in the handle() method to be of type NoSuchElementException. To my surprise, at runtime the NoSuchElementException was wrapped in a CompletionException instead.
CompletableFuture.failedFuture(new NoSuchElementException())
    .thenApply(s -> s) // transform the result
    .handle((response, ex) -> {
        if (ex instanceof NoSuchElementException) {
            // translate the exception into another exception
            return null; // placeholder so that both branches return a value
        } else {
            return response;
        }
    });
If I remove the call to thenApply(), as below, then ex is of type NoSuchElementException. I would like to understand what happens in thenApply() that leads to ex being wrapped inside a CompletionException.
CompletableFuture.failedFuture(new NoSuchElementException())
    .handle((response, ex) -> {
        if (ex instanceof NoSuchElementException) {
            // translate the exception into another exception
            return null; // placeholder so that both branches return a value
        } else {
            return response;
        }
    });
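A minimal self-contained sketch of the difference (an illustration, not the original code; failedFuture requires JDK 9+). The CompletionStage documentation notes that dependent stages of an exceptionally completed stage complete with a CompletionException holding the original exception as its cause, whereas handle attached directly to the failed future receives the original exception itself:

import java.util.NoSuchElementException;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;

class WrappingDemo {
    public static void main(String[] args) {
        CompletableFuture.failedFuture(new NoSuchElementException())
            .thenApply(s -> s) // a dependent stage: failures reach it wrapped
            .handle((response, ex) -> {
                // common unwrapping idiom
                Throwable cause = (ex instanceof CompletionException) ? ex.getCause() : ex;
                System.out.println(ex.getClass().getSimpleName());    // CompletionException
                System.out.println(cause.getClass().getSimpleName()); // NoSuchElementException
                return response;
            });
    }
}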
Here is my simple test code:
class Scratch {
    public static void main(String[] args) {
        try {
            System.out.println("Line 1");
            throw new RuntimeException();
        } catch (RuntimeException e) {
            e.printStackTrace();
        } finally {
            System.out.println("Line 2");
        }
    }
}
After running it, I get this:
Line 1
Line 2
java.lang.RuntimeException
at Scratch.main(scratch_4.java:5)
Process finished with exit code 0
I thought that the finally block must be executed last, but it isn't.
What is the reason?
By default, printStackTrace prints to System.err, whereas you're writing to System.out. So you're writing to two different streams, and in your particular case it looks like the buffering involved has switched the output order from the actual execution order.
If you either write to a single stream (e.g. using System.err.println or calling e.printStackTrace(System.out)) or change your catch block to just write to System.out like your other lines do, you'll see the order of try => catch => finally.
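For example, here is a sketch of the single-stream variant; with everything going to System.out, the output order is deterministically Line 1, then the stack trace, then Line 2:

class Scratch {
    public static void main(String[] args) {
        try {
            System.out.println("Line 1");
            throw new RuntimeException();
        } catch (RuntimeException e) {
            e.printStackTrace(System.out); // same stream as the println calls
        } finally {
            System.out.println("Line 2");
        }
    }
}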
A Stream is an AutoCloseable and, if I/O-based, should be used in a try-with-resources block. What about intermediate I/O-based streams that are inserted via flatMap()? Example:
try (var foos = foos()) {
    return foos.flatMap(Foo::bars).toArray(Bar[]::new);
}
vs.
try (var foos = foos()) {
    return foos.flatMap(foo -> {
        try (var bars = foo.bars()) {
            return bars;
        }
    }).toArray(Bar[]::new);
}
The flatMap() documentation says:
Each mapped stream is closed after its contents have been placed into this stream.
Well, that's the happy path. What if an exception occurs in between? Would that stream then stay unclosed and potentially leak resources? Should I then always use a try-with-resources statement for intermediate streams, too?
There is no sense in a construct like
return foos.flatMap(foo -> {
    try (var bars = foo.bars()) {
        return bars;
    }
}).toArray(Bar[]::new);
as that would close the stream before it is returned to the caller, which makes the sub-stream entirely unusable.
In fact, it is impossible for the function’s code to ensure that the closing will happen at the appropriate place, which is outside the function. That’s surely the reason why the API designers decided that you don’t have to, and the Stream implementation will take care of it.
This also applies to the exceptional case. The Stream implementation still ensures that the sub-stream gets closed once the function has returned it:
try {
    IntStream.range(1, 3)
        .flatMap(i -> {
            System.out.println("creating " + i);
            return IntStream.range('a', 'a' + i)
                .peek(j -> {
                    System.out.println("processing sub " + i + " - " + (char) j);
                    if (j == 'b') throw new IllegalStateException();
                })
                .onClose(() -> System.out.println("closing " + i));
        })
        .forEach(i -> System.out.println("consuming " + (char) i));
} catch (IllegalStateException ex) {
    System.out.println("caught " + ex);
}
creating 1
processing sub 1 - a
consuming a
closing 1
creating 2
processing sub 2 - a
consuming a
processing sub 2 - b
closing 2
caught java.lang.IllegalStateException
You may play with the conditions to see that a constructed Stream is always closed. For elements of the outer Stream that do not get processed, no sub-stream is created at all.
For a Stream operation like .flatMap(Foo::bars) or .flatMap(foo -> foo.bars()), you can assume that once bars() has successfully created and returned a Stream, it will be passed to the caller and properly closed.
A different scenario would be mapping functions that perform additional, potentially failing operations after the Stream creation, e.g.
.flatMap(foo -> {
    Stream<Type> s = foo.bar();
    anotherOperation(); // Stream is not closed if this throws
    return s;
})
In this case, it would be necessary to ensure the closing in the exceptional case, but only in the exceptional case:
.flatMap(foo -> {
    Stream<Type> s = foo.bar();
    try {
        anotherOperation();
    } catch (Throwable t) {
        try (s) { throw t; } // close and do addSuppressed if follow-up error
    }
    return s;
})
but obviously, you should follow the general rule to keep lambdas simple, in which case you don’t need such protection.
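A brief sketch of that simpler shape (assuming the order of operations permits the swap): doing the extra work before creating the stream means any failure happens while no stream exists yet, so nothing can leak:

.flatMap(foo -> {
    anotherOperation(); // may fail, but no stream has been created yet
    return foo.bar();   // from here on, flatMap guarantees closing
})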
In Stream or not, you have to close the IO resources at the relevant place.
The flatMap() method is a general stream method, so it is not aware of the IO resources you opened inside it.
But why would flatMap() behave differently from any other method that manipulates IO resources?
For example, if you manipulate IO in map(), you could get the same issue (resources not released) if an exception occurs.
Closing a stream (as flatMap() does) will not release all resources opened inside the stream operation.
Some stream factories do that, Files.lines(Path) for example. But if you open resources yourself inside flatMap(), they will not be closed automatically when the stream is closed.
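For instance, a brief sketch (paths here is a hypothetical List<Path>): Files.lines(Path) returns a stream holding a reference to the open file, and closing the stream closes the file, so flatMap()'s automatic closing of each mapped stream also releases the file handle:

List<String> allLines = paths.stream()
    .flatMap(path -> {
        try {
            return Files.lines(path); // closed by flatMap after its contents are consumed
        } catch (IOException e) {
            return Stream.empty(); // opening failed, so there is nothing to close
        }
    })
    .collect(Collectors.toList());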
By contrast, here the flatMap() processing doesn't close the FileInputStreams it opens:
...
.stream()
.flatMap(foo -> {
    try {
        FileInputStream fileInputStream = new FileInputStream("...");
        //...
    } catch (IOException e) {
        // handle
    }
})
You have to close them explicitly:
...
.stream()
.flatMap(foo -> {
    try (FileInputStream fileInputStream = new FileInputStream("...")) {
        //...
    } catch (IOException e) {
        // handle
    }
    // return
})
So yes, if the statements inside flatMap() (or any other method) manipulate IO resources, you should close them in every case by wrapping them in a try-with-resources statement so they are freed.
I've been learning about concurrency and the Streams API and came across this. The offerLast() method can throw InterruptedException, so I get that I must handle it. What I don't get is why I can't throw it at the method level by adding throws Exception. As it is, this code does not compile:
static BlockingDeque<Integer> queue = new LinkedBlockingDeque<>();

public static void testing() throws Exception {
    IntStream.iterate(1, i -> i + 1).limit(5)
        .parallel()
        .forEach(s -> queue.offerLast(s, 10000, TimeUnit.MILLISECONDS));
}
I know it can be solved by surrounding it in a try/catch, or by creating a wrapper method that handles the error, but I'm still trying to understand why it can't be thrown at the method level.
Because lambda expressions are not always evaluated immediately.
Let's say you have this:
public Supplier<String> giveMeASupplier() throws Exception {
    return () -> someMethodThatThrowsCheckedException();
}
According to you, the above would work. Right?
Now in another method, I can do this:
Supplier<String> supplier = null;
try {
    supplier = giveMeASupplier(); // no exception is thrown here
} catch (Exception ex) {
    ex.printStackTrace();
}
if (supplier != null) {
    System.out.println(supplier.get()); // this might throw an exception, yet it's not in a try...catch!
}
Now what do you think would happen if supplier.get() threw an exception? Is there anything to catch it? No. And if the catch block a few lines earlier somehow ran instead, that would be really weird.
The simple answer is that the "method" you're referring to is Consumer.accept, not YourClass.testing.
The lambda s -> queue.offerLast(s, 10000, TimeUnit.MILLISECONDS) is an implementation of java.util.function.Consumer.accept(T), which doesn't declare that it can throw InterruptedException.
And this behavior is not particular to streams: wherever a lambda expression is defined, it must comply with the signature of the abstract method of the functional interface it implements.
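A brief sketch of the try/catch approach the asker mentions, using the question's own queue (a common convention is to restore the interrupt status after catching InterruptedException):

IntStream.iterate(1, i -> i + 1).limit(5)
    .parallel()
    .forEach(s -> {
        try {
            queue.offerLast(s, 10000, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the interrupt status
        }
    });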
I don't know why I should use aggregate operations.
I mean, I assumed that an aggregate operation would parallelize the execution if that improves performance.
https://docs.oracle.com/javase/tutorial/collections/streams/parallelism.html
But that is not true: according to the documentation, the code won't be parallel unless you use parallelStream() instead of stream(). So:
Why should I use stream() if nothing goes better?
Shouldn't these two snippets behave the same?
// it is not parallel
listOfIntegers.stream()
    .forEach(e -> System.out.print(e + " "));
And
// it is parallel
listOfIntegers.parallelStream()
    .forEach(e -> System.out.print(e + " "));
If you use stream(), all data in your list will be processed in order, while if you use parallelStream(), your data might not be processed in order.
Consider this method:
static void test(Integer i) {
    try {
        Thread.sleep((long) (1000 * Math.random()));
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    System.out.println(i);
}
and compare the output of this method when invoked via parallelStream() versus stream().
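A minimal sketch of that comparison (Demo is a hypothetical wrapper class): the sequential run prints 1 through 5 in order, while the parallel run may print them in a different order on each execution:

import java.util.List;

class Demo {
    static void test(Integer i) {
        try {
            Thread.sleep((long) (1000 * Math.random())); // random delay per element
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        System.out.println(i);
    }

    public static void main(String[] args) {
        List<Integer> listOfIntegers = List.of(1, 2, 3, 4, 5);
        listOfIntegers.stream().forEach(Demo::test);         // sequential: 1 2 3 4 5
        listOfIntegers.parallelStream().forEach(Demo::test); // order varies across runs
    }
}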