How to return void in stream? - java

I have a List of sending orders. An order's counter is increased when its method name is the same as the parameter.
But it is not working, because the pipeline has no terminal operation:
List<SendingOrdres> sendingOrders = new ArrayList<SendingOrdres>();

private void countUpOrResetSendingOrders(String method) {
    sendingOrders.stream()
        .filter((e) -> {
            System.out.println("filter:" + e);
            return e.getMethod().equals(method);
        })
        .peek((e) -> System.out.println("peek:" + e)) // for checking
        .map((e) -> {
            int nextNowSendingOrder = e.getNowSendingOrder() + 1;
            if (nextNowSendingOrder > e.getMaxSendingOrder()) {
                e.setNowSendingOrder(0);
            } else {
                e.setNowSendingOrder(nextNowSendingOrder);
            }
            return e;
        });
    // no terminal operation
}
When I added a terminal operation to the code above, it worked well:
.collect(Collectors.toList());
I have a question. I don't need a return value, so I want the pipeline to return void.
But without a terminal operation, the stream does not run at all.
How do I return void in a stream?

A stream consists of two mandatory parts (sourcing and terminal) and one optional part (intermediate).
A stream:
is generated with a sourcing operation (something that creates the Stream<T> instance);
is then optionally continued with one or more chained intermediate operation(s);
is finally terminated with a terminal operation.
void can only be the return type of the terminal operation (and hence of its lambda or method reference expression), because every intermediate operation has to return a stream upon which the subsequent intermediate (or terminal) operation operates.
For example:
List.of(1, 2, 3, 4)
    .stream()                      // sourcing the stream
    .forEach(System.out::println); // terminating the stream
is OK, because println just consumes the elements and doesn't have to return another stream.
List.of(1, 2, 3, 4)
    .stream()                     // sourcing the stream
    .filter(System.out::println); // ouch..
however, does not compile.
Additionally, beware that the Stream API in Java is lazy: intermediate operations are not evaluated until the terminal operation is executed.
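Putting this together for the original code: since the goal is a side effect (updating each matching order), not a value, the map plus collect can be replaced by a forEach terminal operation. A minimal sketch, reusing the asker's SendingOrdres type and its getters/setters:

private void countUpOrResetSendingOrders(String method) {
    sendingOrders.stream()
        .filter(e -> e.getMethod().equals(method))
        .forEach(e -> { // terminal operation: triggers the pipeline and returns void
            int next = e.getNowSendingOrder() + 1;
            e.setNowSendingOrder(next > e.getMaxSendingOrder() ? 0 : next);
        });
}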

Related

Flux last() operation when empty

I am trying to solve a problem where I need to get the last element (via the last() method) of a Flux, but in some cases the Flux can be empty, and then the following error appears:
Flux#last() didn't observe any onNext signal
This is the chain I have:
return apiService.getAll(entry)
    .flatMap(response -> {
        if (response.getId() != null) {
            // do some logic
            return Mono.just("some Mono");
        } else {
            return Mono.empty();
        }
    })
    .last()
    // more flatMap operators
I have already tried switchIfEmpty() as well, but it doesn't fix the problem.
What is the correct way to verify whether last() can be called, or else skip it and return empty to terminate the chain?
Thanks,
According to the Flux.last() API doc:
emit NoSuchElementException error if the source was empty. For a passive version use takeLast(int)
It means that, for an empty upstream Flux:
last() will emit an error
takeLast(1) will return an empty flux
Now, takeLast(1) returns a Flux, not a Mono as last() does. You can then chain it with Flux.next(), which will return the only retained value (if any) or propagate the empty signal.
Note: another solution would be to use last().onErrorResume(NoSuchElementException.class, err -> Mono.empty()).
This would catch the error sent by last() internally, and then return an empty mono.
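A minimal sketch of that variant (apiFlux is an illustrative Flux<String> that may be empty):

Mono<String> lastOrEmpty = apiFlux
    .last()
    // turn the NoSuchElementException raised by last() into an empty Mono
    .onErrorResume(NoSuchElementException.class, err -> Mono.empty());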
However, if you've got some code other than last() that can throw a NoSuchElementException, you might miss a problem. For this, my personal choice for your case would be to use takeLast(1).next().
The following code example shows behavior of last() vs takeLast(1).next():
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class FluxLast {

    static void subscribe(Mono<?> publisher) {
        publisher.subscribe(value -> {},
                err -> System.out.println("Failed: " + err.getMessage()),
                () -> System.out.println("Completed empty"));
    }

    public static void main(String[] args) {
        subscribe(Flux.empty().last());
        subscribe(Flux.empty().takeLast(1).next());

        // When not empty, takeLast(1).next() will return the last value
        Integer v = Flux.just(1, 2, 3)
                .takeLast(1)
                .next()
                .block();
        System.out.println("Last value: " + v);
    }
}
Program output:
Failed: Flux#last() didn't observe any onNext signal from Callable flux
Completed empty
3

Ignore exception in stream operations

Assume an exception (checked or unchecked) is thrown in a stream operation
and you want to ignore the offending element from then on.
The stream must not be aborted; elements throwing exceptions should simply be ignored.
I explicitly avoid saying skip, because skip is a stream operation.
The example uses the map() operation for demonstration.
Here I have a division by zero (for example), so the map should skip this element.
As an example:
@Test
public void ignoreException() {
    assertThat(Stream.of(1, 2, 1, 3).map(i -> 10 / i).reduce(0, Integer::sum), is(28));
    // the zero will break the next stream
    assertThat(Stream.of(1, 2, 0, 3).map(i -> 10 / i).reduce(0, Integer::sum), is(18));
}
So the division by zero can break the whole stream.
I found a lot of articles that wrap a checked exception in a runtime exception (throw new RuntimeException(ex)),
or that discuss partial vs. total functions.
I also made a wrapper returning a java.util.function.Function
(e.g. ....map(wrapper(i -> 10/i))...)
that returns null in the case of an exception. But a downstream operation may then fail,
as reduce does in my example.
The only useful approach seems to be an "Either" concept (a stream of Either),
so that the division by zero in my example
becomes a "left" and can be handled in a different way.
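For illustration, here is a minimal sketch of such a stream of Either, assuming a hand-rolled Either record (hypothetical; Java has no built-in Either type, and records require Java 16+):

// Hypothetical minimal Either: exactly one of left/right is non-null.
record Either<L, R>(L left, R right) {
    static <L, R> Either<L, R> ofLeft(L l)  { return new Either<>(l, null); }
    static <L, R> Either<L, R> ofRight(R r) { return new Either<>(null, r); }
}

int sum = Stream.of(1, 2, 0, 3)
    .map(i -> {
        try { return Either.<Exception, Integer>ofRight(10 / i); }
        catch (ArithmeticException ex) { return Either.<Exception, Integer>ofLeft(ex); }
    })
    .filter(e -> e.right() != null)  // the "lefts" could be logged or handled here instead
    .map(Either::right)
    .reduce(0, Integer::sum);        // 18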
There are relatively few operations on streams that can achieve a transformation of elements and result in elements being dropped -- in fact, there's really only one, flatMap.
So your wrapper more or less has to look like
interface CanThrow<F, T> { T apply(F from) throws Exception; }

<T, R> Function<T, Stream<R>> wrapper(CanThrow<T, R> fn) {
    return t -> {
        try {
            return Stream.of(fn.apply(t));
        } catch (Exception ignored) {
            return Stream.empty();
        }
    };
}

assertThat(Stream.of(1, 2, 0, 3).flatMap(wrapper(i -> 10 / i)).reduce(0, Integer::sum))
    .isEqualTo(18);
Try this:
@Test
public void ignoreException() {
    assertThat(Stream.of(1, 2, 1, 3).map(i -> i == 0 ? 0 : 10 / i).reduce(0, Integer::sum), is(28));
    // the zero no longer breaks the stream
    assertThat(Stream.of(1, 2, 0, 3).map(i -> i == 0 ? 0 : 10 / i).reduce(0, Integer::sum), is(18));
}

Stream mysteriously consumed twice

The following code ends up with a java.lang.IllegalStateException: stream has already been operated upon or closed.
public static void main(String[] args) {
    Stream.concat(Stream.of("FOOBAR"),
        reverse(StreamSupport.stream(new File("FOO/BAR").toPath().spliterator(), true)
            .map(Path::toString)));
}

static <T> Stream<T> reverse(Stream<T> stream) {
    return stream.reduce(Stream.empty(),
        (Stream<T> a, T b) -> Stream.concat(Stream.of(b), a),
        (a, b) -> Stream.concat(b, a));
}
The obvious solution is to generate a non-parallel stream with StreamSupport.stream(…, false), but I can’t see why it can’t run in parallel.
Stream.empty() is not a constant. This method returns a new stream instance on each invocation that will get consumed like any other stream, e.g. when you pass it into Stream.concat.
Therefore, Stream.empty() is not suitable as the identity value for reduce, as the identity value may get passed as input to the reduction function an arbitrary, intentionally unspecified number of times. It’s an implementation detail that it happens to be used only a single time for sequential reduction and potentially multiple times for parallel reduction.
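This is easy to demonstrate; every call hands out a fresh, single-use instance:

Stream<Object> e = Stream.empty();
e.count(); // consumes this instance
e.count(); // java.lang.IllegalStateException: stream has already been operated upon or closed
System.out.println(Stream.empty() == Stream.empty()); // false: two distinct instances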
You can use
static <T> Stream<T> reverse(Stream<T> stream) {
    return stream.map(Stream::of)
        .reduce((a, b) -> Stream.concat(b, a))
        .orElseGet(Stream::empty);
}
instead.
However, I only provide the solution as an academic exercise. As soon as the stream gets large, it leads to an excessive number of concat calls and the note in the documentation applies:
Use caution when constructing streams from repeated concatenation. Accessing an element of a deeply concatenated stream can result in deep call chains, or even StackOverflowError.
Generally, the resulting underlying data structure will be far more expensive than a flat list, when using the Stream API this way.
You can use something like
Stream<String> s = Stream.concat(Stream.of("FOOBAR"),
    reverse(new File("FOO/BAR").toPath()).map(Path::toString));

static Stream<Path> reverse(Path p) {
    ArrayDeque<Path> d = new ArrayDeque<>();
    p.forEach(d::addFirst);
    return d.stream();
}
or
static Stream<Path> reverse(Path p) {
    Stream.Builder<Path> b = Stream.builder();
    for(; p != null; p = p.getParent()) b.add(p.getFileName());
    return b.build();
}
With Java 9+ you can use a stream that truly has no additional storage (which does not necessarily imply that it will be more efficient):
static Stream<Path> reverse(Path p) {
    return Stream.iterate(p, Objects::nonNull, Path::getParent).map(Path::getFileName);
}
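A hypothetical usage example of this last variant (the path literal is illustrative), printing a path's elements from the file name upward:

reverse(Paths.get("FOO/BAR/BAZ")).forEach(System.out::println);
// prints BAZ, BAR, FOO (one per line)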

Java Stream `generate()` how to "include" the first "excluded" element

Assume this usage scenario for a Java stream, where data is added from a data source. The data source can be a list of values, like in the example below, or a paginated REST API. It doesn't matter at the moment.
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Stream;

public class Main {
    public static void main(String[] args) {
        final List<Boolean> dataSource = List.of(true, true, true, false, false, false, false);
        final AtomicInteger index = new AtomicInteger();

        Stream
            .generate(() -> {
                boolean value = dataSource.get(index.getAndIncrement());
                System.out.format("--> Executed expensive operation to retrieve data: %b\n", value);
                return value;
            })
            .takeWhile(value -> value == true)
            .forEach(data -> System.out.printf("--> Using: %b\n", data));
    }
}
If you run this code, your output will be:
--> Executed expensive operation to retrieve data: true
--> Using: true
--> Executed expensive operation to retrieve data: true
--> Using: true
--> Executed expensive operation to retrieve data: true
--> Using: true
--> Executed expensive operation to retrieve data: false
As you can see the last element, the one that evaluated to false, did not get added to the stream, as expected.
Now assume that the generate() method loads pages of data from a REST api. In that case the value true/false is a value on page N indicating if page N + 1 exists, something like a has_more field. Now, I want the last page returned by the API to be added to the stream, but I do not want to perform another expensive operation to read an empty page, because I already know that there are no more pages.
What is the most idiomatic way to do this using the Java Stream API? Every workaround I can think of requires a call to the API to be executed.
UPDATE
In addition to the approaches listed in Inclusive takeWhile() for Streams there is another ugly way to achieve this.
public static void main(String[] args) {
    final List<Boolean> dataSource = List.of(true, true, true, false, false, false, false);
    final AtomicInteger index = new AtomicInteger();
    final AtomicBoolean hasMore = new AtomicBoolean(true);

    Stream
        .generate(() -> {
            if (!hasMore.get()) {
                return null;
            }
            boolean value = dataSource.get(index.getAndIncrement());
            hasMore.set(value);
            System.out.format("--> Executed expensive operation to retrieve data: %b\n", value);
            return value;
        })
        .takeWhile(Objects::nonNull)
        .forEach(data -> System.out.printf("--> Using: %b\n", data));
}
You are using the wrong tool for your job. As is already noticeable in your code example, the Supplier passed to Stream.generate has to go to great lengths to maintain the index it needs for fetching pages.
To make matters worse, Stream.generate creates an unordered stream:
Returns an infinite sequential unordered stream where each element is generated by the provided Supplier.
This is suitable for generating constant streams, streams of random elements, etc.
You’re not returning constant or random values nor anything else that would be independent of the order.
This has a significant impact on the semantics of takeWhile:
Otherwise returns, if this stream is unordered, a stream consisting of a subset of elements taken from this stream that match the given predicate.
This makes sense if you think about it. If there is at least one element rejected by the predicate, it could be encountered at an arbitrary position for an unordered stream, so an arbitrary subset of elements encountered before it, including the empty set, would be a valid prefix.
But since there is no “before” or “after” for an unordered stream, even elements produced by the generator after the rejected one could be included in the result.
In practice, you are unlikely to encounter such effects for a sequential stream, but it doesn’t change the fact that Stream.generate(…) .takeWhile(…) is semantically wrong for your task.
From your example code, I conclude that pages contain neither their own number nor a "getNext" method, so we have to maintain the number and the "hasNext" state while creating a stream.
Assuming an example setup like
class Page {
    private String data;
    private boolean hasNext;

    public Page(String data, boolean hasNext) {
        this.data = data;
        this.hasNext = hasNext;
    }

    public String getData() {
        return data;
    }

    public boolean hasNext() {
        return hasNext;
    }
}

private static String[] SAMPLE_PAGES = { "foo", "bar", "baz" };

public static Page getPage(int index) {
    Objects.checkIndex(index, SAMPLE_PAGES.length);
    return new Page(SAMPLE_PAGES[index], index + 1 < SAMPLE_PAGES.length);
}
You can implement a correct stream like
Stream.iterate(Map.entry(0, getPage(0)), Objects::nonNull,
        e -> e.getValue().hasNext()? Map.entry(e.getKey()+1, getPage(e.getKey()+1)): null)
    .map(Map.Entry::getValue)
    .forEach(page -> System.out.println(page.getData()));
Note that Stream.iterate creates an ordered stream:
Returns a sequential ordered Stream produced by iterative application of the given next function to an initial element,
conditioned on satisfying the given hasNext predicate.
Of course, things would be much easier if the page knew its own number, e.g.
Stream.iterate(getPage(0), Objects::nonNull,
        p -> p.hasNext()? getPage(p.getPageNumber()+1): null)
    .forEach(page -> System.out.println(page.getData()));
or if there was a method to get from an existing Page to the next Page, e.g.
Stream.iterate(getPage(0), Objects::nonNull, p -> p.hasNext()? p.getNextPage(): null)
    .forEach(page -> System.out.println(page.getData()));

takeWhile() working differently with flatmap

I am creating snippets with takeWhile to explore its possibilities. When used in conjunction with flatMap, the behaviour is not in line with expectations. Please find the code snippet below.
String[][] strArray = {{"Sample1", "Sample2"}, {"Sample3", "Sample4", "Sample5"}};

Arrays.stream(strArray)
    .flatMap(indStream -> Arrays.stream(indStream))
    .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
    .forEach(ele -> System.out.println(ele));
Actual Output:
Sample1
Sample2
Sample3
Sample5
Expected Output:
Sample1
Sample2
Sample3
The reason for the expectation is that takeWhile should execute until the condition inside turns false. I have also added print statements inside flatMap for debugging. The streams are returned just twice, which is in line with the expectation.
However, this works just fine without flatMap in the chain.
String[] strArraySingle = {"Sample3", "Sample4", "Sample5"};

Arrays.stream(strArraySingle)
    .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
    .forEach(ele -> System.out.println(ele));
Actual Output:
Sample3
Here the actual output matches with the expected output.
Disclaimer: These snippets are just for code practice and do not serve any valid use case.
Update:
Bug JDK-8193856: the fix will be available as part of JDK 10. The change corrects the WhileOps Sink::accept implementation:
@Override
public void accept(T t) {
    if (take = predicate.test(t)) {
        downstream.accept(t);
    }
}
Changed Implementation:
@Override
public void accept(T t) {
    if (take && (take = predicate.test(t))) {
        downstream.accept(t);
    }
}
This is a bug in JDK 9 - from issue #8193856:
takeWhile is incorrectly assuming that an upstream operation supports and honors cancellation, which unfortunately is not the case for flatMap.
Explanation
If the stream is ordered, takeWhile should show the expected behavior. This is not entirely the case in your code because you use forEach, which waives order. If you care about it, which you do in this example, you should use forEachOrdered instead. Funny thing: That doesn't change anything. 🤔
So maybe the stream isn't ordered in the first place? (In that case the behavior is ok.) If you create a temporary variable for the stream created from strArray and check whether it is ordered by executing the expression ((StatefulOp) stream).isOrdered(); at the breakpoint, you will find that it is indeed ordered:
String[][] strArray = {{"Sample1", "Sample2"}, {"Sample3", "Sample4", "Sample5"}};
Stream<String> stream = Arrays.stream(strArray)
    .flatMap(indStream -> Arrays.stream(indStream))
    .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"));
// breakpoint here
System.out.println(stream);
That means that this is very likely an implementation error.
Into The Code
As others have suspected, I now also think that this might be connected to flatMap being eager. More precisely, both problems might have the same root cause.
Looking into the source of WhileOps, we can see these methods:
@Override
public void accept(T t) {
    if (take = predicate.test(t)) {
        downstream.accept(t);
    }
}

@Override
public boolean cancellationRequested() {
    return !take || downstream.cancellationRequested();
}
This code is used by takeWhile to check for a given stream element t whether the predicate is fulfilled:
If so, it passes the element on to the downstream operation, in this case System.out::println.
If not, it sets take to false, so when it is asked next time whether the pipeline should be canceled (i.e. it is done), it returns true.
This covers the takeWhile operation. The other thing you need to know is that forEachOrdered leads to the terminal operation executing the method ReferencePipeline::forEachWithCancel:
@Override
final boolean forEachWithCancel(Spliterator<P_OUT> spliterator, Sink<P_OUT> sink) {
    boolean cancelled;
    do { } while (
        !(cancelled = sink.cancellationRequested())
        && spliterator.tryAdvance(sink));
    return cancelled;
}
All this does is:
check whether pipeline was canceled
if not, advance the sink by one element
stop if this was the last element
Looks promising, right?
Without flatMap
In the "good case" (without flatMap; your second example) forEachWithCancel directly operates on the WhileOp as sink and you can see how this plays out:
ReferencePipeline::forEachWithCancel does its loop:
WhileOps::accept is given each stream element
WhileOps::cancellationRequested is queried after each element
at some point "Sample4" fails the predicate and the stream is canceled
Yay!
With flatMap
In the "bad case" (with flatMap; your first example), forEachWithCancel operates on the flatMap operation, though, , which simply calls forEachRemaining on the ArraySpliterator for {"Sample3", "Sample4", "Sample5"}, which does this:
if ((a = array).length >= (hi = fence) &&
        (i = index) >= 0 && i < (index = hi)) {
    do { action.accept((T)a[i]); } while (++i < hi);
}
Ignoring all that hi and fence stuff, which is only used if the array processing is split for a parallel stream, this is a simple for loop that passes each element to the takeWhile operation but never checks whether it is cancelled. It will hence eagerly plow through all elements in that "substream" before stopping, likely even through the rest of the stream.
This is a bug no matter how I look at it - and thank you Holger for your comments. I did not want to put this answer in here (seriously!), but none of the answers clearly states that this is a bug.
People are saying that this has to do with ordered/unordered, and this is not true, as this will report true 3 times:
Stream<String[]> s1 = Arrays.stream(strArray);
System.out.println(s1.spliterator().hasCharacteristics(Spliterator.ORDERED));

Stream<String> s2 = Arrays.stream(strArray)
    .flatMap(indStream -> Arrays.stream(indStream));
System.out.println(s2.spliterator().hasCharacteristics(Spliterator.ORDERED));

Stream<String> s3 = Arrays.stream(strArray)
    .flatMap(indStream -> Arrays.stream(indStream))
    .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"));
System.out.println(s3.spliterator().hasCharacteristics(Spliterator.ORDERED));
It's also very interesting that if you change it to:
String[][] strArray = {
{ "Sample1", "Sample2" },
{ "Sample3", "Sample5", "Sample4" }, // Sample4 is the last one here
{ "Sample7", "Sample8" }
};
then Sample7 and Sample8 will not be part of the output; otherwise they will. It seems that flatMap ignores the cancel flag that is introduced by takeWhile.
If you look at the documentation for takeWhile:
if this stream is ordered, [returns] a stream consisting of the
longest prefix of elements taken from this stream that match the given
predicate.
if this stream is unordered, [returns] a stream consisting of a subset
of elements taken from this stream that match the given predicate.
Your stream is coincidentally ordered, but takeWhile doesn't know that it is. As such, it falls under the second condition - the subset. Your takeWhile is just acting like a filter.
If you add a call to sorted before takeWhile, you'll see the result you expect:
Arrays.stream(strArray)
    .flatMap(indStream -> Arrays.stream(indStream))
    .sorted()
    .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
    .forEach(ele -> System.out.println(ele));
The reason is that flatMap is also an intermediate operation, with which the stateful short-circuiting intermediate operation takeWhile is used.
The behavior of flatMap, as pointed out by Holger in this answer, is certainly a reference one shouldn't miss in order to understand the unexpected output of such short-circuiting operations.
Your expected result can be achieved by splitting these two intermediate operations with a terminal operation, so that an ordered stream is deterministically used for the second step:
List<String> sampleList = Arrays.stream(strArray)
    .flatMap(Arrays::stream)
    .collect(Collectors.toList());

sampleList.stream()
    .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
    .forEach(System.out::println);
Also, there seems to be a related bug, JDK-8075939, already registered to trace this behavior.
Edit: this can be tracked further at JDK-8193856, which was accepted as a bug.
