Can someone help me understand what is going on here with Flux's takeUntil operator?
Flux.just(1, 2, 3, 4, 5)
    .takeUntil { it < 4 }
    .map { println("Flux:$it") }
    .subscribe()
In the console, the only thing that is printed is:
Flux:1
I expected to see
Flux:1
Flux:2
Flux:3
Why do I only see one element?
Please note that you are using the takeUntil() operator:
Relay values from this Flux until the given Predicate matches. This includes the matching data (unlike takeWhile(java.util.function.Predicate<? super T>)).
— Flux (reactor-core 3.4.22).
Please note the word «until»:
until the given Predicate matches
To achieve the desired behavior, please consider using the takeWhile() operator instead:
Relay values from this Flux while a predicate returns TRUE for the values (checked before each value is delivered). This only includes the matching data (unlike takeUntil(java.util.function.Predicate<? super T>)).
— Flux (reactor-core 3.4.22).
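For example, here is a minimal Java sketch of the same pipeline using takeWhile (the question's snippet is Kotlin; this version uses doOnNext for the printing side effect):
import reactor.core.publisher.Flux;

public class TakeWhileDemo {
    public static void main(String[] args) {
        // takeWhile relays values while the predicate is true and stops at the
        // first non-matching value, so 4 and 5 are never emitted.
        Flux.just(1, 2, 3, 4, 5)
            .takeWhile(i -> i < 4)
            .doOnNext(i -> System.out.println("Flux:" + i))
            .subscribe();
        // Prints:
        // Flux:1
        // Flux:2
        // Flux:3
    }
}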
List<Boolean> results = new ArrayList<>();
results.add(true);
results.add(true);
results.add(true);
results.add(false);
if (results.contains(false)) {
    System.out.println(false);
} else {
    System.out.println(true);
}
System.out.println(results.stream().reduce((a, b) -> a && b).get());
//System.out.println(!results.stream().anyMatch(a -> a == false));
System.out.println(!results.stream().anyMatch(a -> !a));
OUTPUT:
false
false
false
FYI, the results list comes from a map+collect operation:
List<Job> jobs;
List<Boolean> results = jobs.stream().map(j -> j.ready()).collect(Collectors.toList());
If I choose either reduce or anyMatch, I don't have to collect the results from the map operation.
From results, which is a list of booleans, I just want to return false if there is at least one false.
I can do it via reduce or anyMatch. I kinda don't like the Optional from reduce, and I kinda don't like that I have to negate anyMatch.
Are there any pros/cons for using either?
It appears that the only reason you are collecting the booleans into the list is so you can check if some are false:
If I choose either reduce or anyMatch, I don't have to collect the results from map operation [...] I just want to return false if there is at least one false.
If this is the case, then you definitely should consider the straightforward stream-based approach:
return jobs.stream().allMatch(Job::ready);
You ask about pros/cons. contains is the fastest and simplest; reduce is the most cumbersome/complicated here. But your task is very simple, so does it really matter? Maybe the better question for choosing an approach would be: "Which one is the most readable, understandable and maintainable?" This clean-code consideration is usually more important in practical software development than hunting for microseconds of runtime or lines of source code. But then again, I would say contains is the best here.
System.out.println(!results.contains(false));
Then your anyMatch(a -> !a) is effectively the same as contains, and I would definitely prefer it over reduce for this concrete task. But again, the real difference is very small, and I would be more concerned with readability and understandability for a future maintainer of this software.
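One more point worth weighing when comparing the approaches: allMatch and anyMatch are short-circuiting terminal operations, while reduce has to fold every element before it can produce a result. A small, self-contained sketch (with an illustrative results list) makes this visible:
import java.util.Arrays;
import java.util.List;

public class ShortCircuitDemo {
    public static void main(String[] args) {
        List<Boolean> results = Arrays.asList(true, false, true, true);

        // allMatch stops at the first false, so only two elements are inspected.
        boolean all = results.stream()
                .peek(b -> System.out.println("allMatch saw " + b))
                .allMatch(b -> b);
        System.out.println("allMatch -> " + all);

        // reduce visits all four elements before it can produce a result.
        boolean reduced = results.stream()
                .peek(b -> System.out.println("reduce saw " + b))
                .reduce((a, b) -> a && b)
                .get();
        System.out.println("reduce -> " + reduced);
    }
}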
System.out.println(results.stream().reduce((a, b) -> a && b).get());
This will always return false, as the list (results) has at least one false.
&& requires both values to be true for the result to be true.
System.out.println(!results.stream().anyMatch(a -> !a));
Stream.anyMatch(Predicate predicate) returns whether any element of this stream (results) matches the provided predicate (a -> !a). Since the list contains a false, anyMatch returns true, and the leading ! negates it, so the final result printed is false.
I started to try out lambda expressions for implementing boolean gates for a list of boolean input parameters.
For "or" and "and" I wrote the following statements:
OR: expressions.stream().anyMatch(e -> e.evaluate(input));
AND: expressions.stream().allMatch(e -> e.evaluate(input));
e.evaluate(input) returns true or false.
But since there is no onceMatch method already implemented, I am stuck with the XOR.
My first idea was to filter all true values and check whether there is exactly one:
return expressions.stream().filter(e -> e.evaluate(input) == true).collect(Collectors.counting()) == 1;
But I would like to see it in one lambda expression.
If you want to know whether there is exactly one match, you can use
expressions.stream().filter(e -> e.evaluate(input)).limit(2).count() == 1
The limit(2) avoids unnecessary processing: once you have encountered two matches, you already know that the result can't be == 1, without needing to count the remaining matches.
However, that’s not “XOR” logic, not even remotely. If you want an XOR operation, you may use
expressions.stream().map(e -> e.evaluate(input)).reduce((a,b) -> a^b).orElse(Boolean.FALSE)
Unlike AND or OR, there is no way to short-circuit an XOR operation.
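To make that difference concrete, here is a small sketch with plain booleans standing in for the evaluated expressions: with three true inputs, the chained XOR is true (odd number of trues), while the "exactly one match" check is false.
import java.util.Arrays;
import java.util.List;

public class XorVsExactlyOne {
    public static void main(String[] args) {
        // Three true inputs, standing in for e.evaluate(input) results.
        List<Boolean> inputs = Arrays.asList(true, true, true);

        // Chained XOR: true for an odd number of true values.
        boolean xor = inputs.stream().reduce((a, b) -> a ^ b).orElse(Boolean.FALSE);

        // "Exactly one match": counts matches, bailing out after the second.
        boolean exactlyOne = inputs.stream().filter(b -> b).limit(2).count() == 1;

        System.out.println("xor        -> " + xor);        // true
        System.out.println("exactlyOne -> " + exactlyOne); // false
    }
}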
I can't come up with a lambda expression that would suit your needs, but a slight refactoring of your first idea looks fine to me:
return expressions.stream().filter(e -> e.evaluate(input)).count() == 1;
I need to perform an add operation on two BigDecimals that are wrapped in Optionals:
Optional<BigDecimal> ordersTotal;
Optional<BigDecimal> newOrder;
I want to achieve ordersTotal += newOrder
It's important to note that if both values are empty the result should likewise be empty (i.e. not zero).
Here is what I came up with:
ordersTotal = ordersTotal.flatMap( b -> Optional.of(b.add(newOrder.orElse(BigDecimal.ZERO))));
but I'm wondering if there's a more elegant solution.
I think the suggested answers of using streams or chains of methods on optionals are very clever, but perhaps so clever as to be obscure. The OP has modeled this as ordersTotal += newOrder with the exception that if both are empty, the result should be empty instead of zero. Maybe it would be reasonable to write the code so that it says that:
if (!ordersTotal.isPresent() && !newOrder.isPresent()) {
    result = Optional.empty();
} else {
    result = Optional.of(ordersTotal.orElse(ZERO).add(newOrder.orElse(ZERO)));
}
While this isn't the shortest, it clearly expresses exactly what the OP asked for.
Now I've assigned the computed value to result, but the OP actually wanted to assign it back to ordersTotal. When both are empty, ordersTotal is already empty, so the then-clause that assigns empty to it can be skipped. Doing that, and then inverting the condition, gives something simpler:
if (ordersTotal.isPresent() || newOrder.isPresent()) {
    ordersTotal = Optional.of(ordersTotal.orElse(ZERO).add(newOrder.orElse(ZERO)));
}
Now, this tends to obscure the both-empty special case, which might not be a good idea. On the other hand, this says "add the values if either is non-empty" which might make a lot of sense for the application.
Not sure if you'll consider it more elegant, but here's one alternative:
ordersTotal = Optional.of(ordersTotal.orElse(BigDecimal.ZERO).add(newOrder.orElse(BigDecimal.ZERO)));
Another, based on #user140547's suggestion:
ordersTotal = Stream.of(ordersTotal, newOrder)
        .filter(Optional::isPresent)
        .map(Optional::get)
        .reduce(BigDecimal::add);
Note that the first version returns Optional.of(BigDecimal.ZERO) even when both optionals are empty, whereas the second will return Optional.empty() in such a case.
You could use a stream of Optionals. Then you can make a stream of BigDecimals, reduce those BigDecimals, or else return 0.
This has the advantage that you don't have to change the code if you want to do that with more than two Optionals.
(Code can be added later if needed; currently I don't have access to a computer.)
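In the meantime, here is a rough sketch of what that stream-of-Optionals approach might look like, generalized to any number of inputs (the class and helper names are illustrative):
import java.math.BigDecimal;
import java.util.Optional;
import java.util.stream.Stream;

public class OptionalSum {

    // Illustrative helper: sums any number of optional amounts.
    // reduce(BigDecimal::add) yields Optional.empty() when every input is
    // empty (the OP's requirement); append .orElse(BigDecimal.ZERO) if a
    // zero default is preferred instead.
    static Optional<BigDecimal> sum(Stream<Optional<BigDecimal>> amounts) {
        return amounts
                .filter(Optional::isPresent)
                .map(Optional::get)
                .reduce(BigDecimal::add);
    }

    public static void main(String[] args) {
        System.out.println(sum(Stream.of(
                Optional.of(BigDecimal.ONE),
                Optional.<BigDecimal>empty(),
                Optional.of(BigDecimal.TEN))));   // Optional[11]

        System.out.println(sum(Stream.of(
                Optional.<BigDecimal>empty(),
                Optional.<BigDecimal>empty())));  // Optional.empty
    }
}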
Note that your solution
ordersTotal = ordersTotal.flatMap(b -> Optional.of(b.add(newOrder.orElse(BigDecimal.ZERO))));
will produce an empty Optional, if ordersTotal is empty, even if newOrder is not.
This could be fixed by changing it to
ordersTotal = ordersTotal
        .map(b -> Optional.of(b.add(newOrder.orElse(BigDecimal.ZERO))))
        .orElse(newOrder);
but I’d prefer
ordersTotal = ordersTotal
        .map(b -> newOrder.map(b::add).orElse(b))
        .map(Optional::of).orElse(newOrder);
I know this is an old thread, but how about this?
orderTotal = !newOrder.isPresent()
        ? orderTotal
        : newOrder.flatMap(v -> Optional.of(v.add(orderTotal.orElse(BigDecimal.ZERO))));
My thinking behind this approach is like this:
Behind all the shiny Optional machinery, the basic logic here is still
orderTotal += newOrder
Before the first newOrder exists, orderTotal does not exist either, which is represented by an empty Optional in the code.
If a newOrder does not exist yet (another empty Optional), no operation is necessary at all, i.e. there is no need to modify orderTotal.
If there is a newOrder, invoke its flatMap(..) as presented in maxTrialfire's original post.
Optional and Stream do not fit together elegantly here.
The best you can do in Java 8 is:
ordersTotal = !ordersTotal.isPresent() ? newOrder
        : !newOrder.isPresent() ? ordersTotal
        : Optional.of(ordersTotal.get().add(newOrder.get()));
However, the good news is that Java 9 will add some nice (and also ugly) functionality to Optional.
What is wrong is your requirement, not your solution. An empty Optional is not zero but a missing value. You're basically asking that 5 + NaN equal 5. Optional's flatMap guides you to the happy path: 5 + NaN is NaN, and this is exactly what flatMap does.
Considering that you want to reassign to ordersTotal, you'll notice that ordersTotal only changes if newOrder is present.
You can thus start with that check, and write it as:
if (newOrder.isPresent()) {
    ordersTotal = newOrder.map(ordersTotal.orElse(ZERO)::add);
}
(This could be considered a simplification of Stuart Marks' second solution. It is also a case where the method reference cannot be converted back to a lambda, because ordersTotal is not effectively final.)
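Here is a self-contained sketch of that last point, with illustrative starting values: the receiver of the bound method reference is evaluated eagerly, before the reassignment, so only the resulting BigDecimal is captured rather than the ordersTotal variable itself.
import java.math.BigDecimal;
import java.util.Optional;
import static java.math.BigDecimal.ZERO;

public class BoundReceiverDemo {
    public static void main(String[] args) {
        Optional<BigDecimal> ordersTotal = Optional.of(BigDecimal.ONE);
        Optional<BigDecimal> newOrder = Optional.of(BigDecimal.TEN);

        if (newOrder.isPresent()) {
            // ordersTotal.orElse(ZERO) is evaluated here, eagerly, to produce
            // the receiver of ::add; only that BigDecimal is captured.
            ordersTotal = newOrder.map(ordersTotal.orElse(ZERO)::add);

            // The equivalent lambda would not compile, because ordersTotal is
            // reassigned and therefore not effectively final:
            // ordersTotal = newOrder.map(b -> ordersTotal.orElse(ZERO).add(b));
        }

        System.out.println(ordersTotal); // Optional[11]
    }
}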
If you start by this check, there is also another possible “clever” approach:
if (newOrder.isPresent()) {
    ordersTotal = ordersTotal.map(o -> newOrder.map(o::add)).orElse(newOrder);
}
where the intermediate map() returns an Optional<Optional<BigDecimal>> whose inner Optional cannot be empty. I wouldn't consider it a good solution due to its bad readability, and I would thus recommend the first option or one of Stuart's solutions.
I have a problem statement here:
What I need to do is iterate over a list, find the first integer which is greater than 3 and even, then double it and return it.
These are some methods to check how many operations are performed:
public static boolean isGreaterThan3(int number) {
    System.out.println("WhyFunctional.isGreaterThan3 " + number);
    return number > 3;
}

public static boolean isEven(int number) {
    System.out.println("WhyFunctional.isEven " + number);
    return number % 2 == 0;
}

public static int doubleIt(int number) {
    System.out.println("WhyFunctional.doubleIt " + number);
    return number << 1;
}
With Java 8 streams I could do it like this:
List<Integer> integerList = Arrays.asList(1, 2, 3, 5, 4, 6, 7, 8, 9, 10);
integerList.stream()
        .filter(WhyFunctional::isGreaterThan3)
        .filter(WhyFunctional::isEven)
        .map(WhyFunctional::doubleIt)
        .findFirst();
and the output is
WhyFunctional.isGreaterThan3 1
WhyFunctional.isGreaterThan3 2
WhyFunctional.isGreaterThan3 3
WhyFunctional.isGreaterThan3 5
WhyFunctional.isEven 5
WhyFunctional.isGreaterThan3 4
WhyFunctional.isEven 4
WhyFunctional.doubleIt 4
Optional[8]
so total 8 operations.
And in an imperative style (before Java 8) I could code it like this:
for (Integer integer : integerList) {
    if (isGreaterThan3(integer)) {
        if (isEven(integer)) {
            System.out.println(doubleIt(integer));
            break;
        }
    }
}
and the output is
WhyFunctional.isGreaterThan3 1
WhyFunctional.isGreaterThan3 2
WhyFunctional.isGreaterThan3 3
WhyFunctional.isGreaterThan3 5
WhyFunctional.isEven 5
WhyFunctional.isGreaterThan3 4
WhyFunctional.isEven 4
WhyFunctional.doubleIt 4
8
and the operations are the same. So my question is: what difference does it make if I use streams rather than a traditional for loop?
The Stream API introduces the new idea of streams, which allows you to decouple the task in a new way. For example, based on your task it's possible that you want to do different things with the doubled even numbers greater than three. In one place you want to find the first one, in another you need 10 such numbers, in a third you want to apply more filtering. You can encapsulate the algorithm of finding such numbers like this:
static IntStream numbers() {
    return IntStream.range(1, Integer.MAX_VALUE)
            .filter(WhyFunctional::isGreaterThan3)
            .filter(WhyFunctional::isEven)
            .map(WhyFunctional::doubleIt);
}
Here it is. You've just created an algorithm to generate such numbers (without generating them) and you don't care how they will be used. One user might call:
int num = numbers().findFirst().get();
Other user might need to get 10 such numbers:
int[] tenNumbers = numbers().limit(10).toArray();
Third user might want to find the first matching number which is also divisible by 7:
int result = numbers().filter(n -> n % 7 == 0).findFirst().get();
It would be more difficult to encapsulate the algorithm in a traditional imperative style.
In general the Stream API is not about performance (though parallel streams may work faster than a traditional solution). It's about the expressive power of your code.
The imperative style complects the computational logic with the mechanism used to achieve it (iteration). The functional style, on the other hand, decomplects the two. You code against an API to which you supply your logic and the API has the freedom to choose how and when to apply it.
In particular, the Streams API has two ways how to apply the logic: either sequentially or in parallel. The latter is actually the driving force behind the introduction of both lambdas and the Streams API itself into Java.
The freedom to choose when to perform computation gives rise to laziness: whereas in the imperative style you have a concrete collection of data, in the functional style you can have a collection paired with logic to transform it. The logic can be applied "just in time", when you actually consume the data. This further allows you to spread the building up of computation: each method can receive a stream and apply a further step of computation on it, or it can consume it in different ways (by collecting into a list, by finding just the first item and never applying computation to the rest, but calculating an aggregate value, etc.).
As a particular example of the new opportunities offered by laziness, I was able to write a Spring MVC controller which returned a Stream whose data source was a database—and at the time I return the stream, the data is still in the database. Only the View layer will pull the data, implicitly applying the transformation logic it has no knowledge of, never having to retain more than a single stream element in memory. This converted a solution which classically had O(n) space complexity into O(1), thus becoming insensitive to the size of the result set.
Using the Stream API you are describing an operation instead of implementing it. One commonly known advantage of letting the Stream API implement the operation is the option of using different execution strategies like parallel execution (as already said by others).
Another feature which seems to be a bit underestimated is the possibility to alter the operation itself in a way that is impossible to do in an imperative programming style as that would imply modifying the code:
IntStream is = IntStream.rangeClosed(1, 10).filter(i -> i > 4);
if (evenOnly) is = is.filter(i -> (i & 1) == 0);
if (doubleIt) is = is.map(i -> i << 1);
is.findFirst().ifPresent(System.out::println);
Here, the decision whether to filter out odd numbers or double the result is made before the terminal operation is commenced. In imperative programming you either have to recheck the flags within the loop or code multiple alternative loops. It should be mentioned that checking such conditions within a loop isn't that bad on today's JVM, as the optimizer is capable of moving them out of the loop at runtime, so coding multiple loops is usually unnecessary.
But consider the following example:
Stream<String> s = Stream.of("java8 streams", "are cool");
if (singleWords) s = s.flatMap(Pattern.compile("\\s")::splitAsStream);
s.collect(Collectors.groupingBy(str -> str.charAt(0)))
        .forEach((k, v) -> System.out.println(k + " => " + v));
Since flatMap is the equivalent of a nested loop, coding the same in an imperative style isn't that simple any more, as we end up with either a simple loop or a nested loop depending on a runtime value. Usually, you have to resort to splitting the code into multiple methods if you want to share it between both kinds of loops.
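For comparison, here is a rough imperative sketch of that same conditional grouping, wrapped in an illustrative helper class so it is runnable on its own; the runtime decision leaks straight into the loop structure:
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ImperativeGrouping {

    // Rough imperative counterpart of the conditional flatMap example: the
    // runtime flag forces either a nested loop over the split words or a
    // duplicated grouping statement.
    static void group(boolean singleWords) {
        Map<Character, List<String>> groups = new HashMap<>();
        for (String str : Arrays.asList("java8 streams", "are cool")) {
            if (singleWords) {
                for (String word : str.split("\\s")) {
                    groups.computeIfAbsent(word.charAt(0), k -> new ArrayList<>()).add(word);
                }
            } else {
                groups.computeIfAbsent(str.charAt(0), k -> new ArrayList<>()).add(str);
            }
        }
        groups.forEach((k, v) -> System.out.println(k + " => " + v));
    }

    public static void main(String[] args) {
        group(true);   // groups individual words by first character
        group(false);  // groups whole phrases by first character
    }
}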
I already encountered a real-life example where the composition of a complex operation had multiple conditional flatMap steps. The equivalent imperative code is insane…
1) The functional approach allows a more declarative way of programming: you just provide a list of functions to apply and don't need to write the iteration manually, so your code is sometimes more concise.
2) If you switch to a parallel stream (https://docs.oracle.com/javase/tutorial/collections/streams/parallelism.html) it is possible to automatically parallelize your program and execute it faster. This is possible because you don't explicitly code the iteration, you just list the functions to apply, so the compiler/runtime may parallelize it.
In this simple example, there is little difference, and the JVM will try to do the same amount of work in each case.
Where you start to see a difference is in more complicated examples like
integerList.parallelStream()
making the code concurrent for a loop is much harder. Note: you wouldn't actually do this here, as the overhead would be too high and you only want the first element.
BTW, the first example returns the result and the second prints it.
I have boiled my problem down into the following snippet:
Observable<Integer> numbers = Observable.just(1, 2, 3);
Observable<GroupedObservable<Integer,Integer>> outer = numbers.groupBy(i->i%3);
System.out.println(outer.count().toBlocking().single());
which blocks indefinitely. I've been reading several posts and believe I understand the problem: GroupedObservables will not call onComplete until their inner Observables have completed. Unfortunately, though, I still can't get the above snippet to print!
For example, the following:
Observable<Integer> just = Observable.just(1, 2, 3);
Observable<GroupedObservable<Integer,Integer>> groupBy = just.groupBy(i->i%3);
groupBy.subscribe(inner -> inner.ignoreElements());
System.out.println(groupBy.count().toBlocking().single());
still does nothing. Have I misunderstood the problem? Is there another problem? In short, how can I get the above snippets to work?
Many thanks in advance,
Dan.
Yes, you have to consume the groups in some fashion. Your second example doesn't work because you have two independent subscriptions to the grouping operation.
Usually, the solution is flatMap, but not with ignoreElements, because that will just complete and count won't get any elements. Instead, you can use takeLast(1):
Observable.just(1, 2, 3)
    .groupBy(k -> k % 3)
    .flatMap(g -> g.takeLast(1))
    .count()
    .toBlocking()
    .forEach(System.out::println);