java: reduce vs anyMatch vs contains

List<Boolean> results = new ArrayList<>();
results.add(true);
results.add(true);
results.add(true);
results.add(false);
if (results.contains(false)) {
    System.out.println(false);
} else {
    System.out.println(true);
}
System.out.println(results.stream().reduce((a, b) -> a && b).get());
// System.out.println(!results.stream().anyMatch(a -> a == false));
System.out.println(!results.stream().anyMatch(a -> !a));
OUTPUT:
false
false
false
FYI, results is itself the result of a map+collect operation:
List<Job> jobs;
List<Boolean> results = jobs.stream().map(j -> j.ready()).collect(Collectors.toList());
If I choose either reduce or anyMatch, I don't have to collect the results of the map operation.
From results, which is a list of booleans, I just want to return false if there is at least one false.
I can do it via reduce or anyMatch. I kinda don't like the Optional from reduce, and I kinda don't like that I have to negate anyMatch.
Are there any pros/cons for using either?

It appears that the only reason you are collecting the booleans into the list is so you can check if some are false:
If I choose either reduce or anyMatch, I don't have to collect the results from map operation [...] I just want to return false if there is at least one false.
If this is the case, then you definitely should consider the straightforward stream-based approach:
return jobs.stream().allMatch(Job::ready);

You ask about pros/cons. contains is the fastest and simplest; reduce is the most cumbersome/complicated here. But your task is very simple, so does it really matter? A better criterion for choosing an approach would be: "Which one is the most readable, understandable and maintainable?" This clean-code consideration is usually more important in practical software development than hunting for microseconds of runtime or lines of source code. With that in mind, I would still say contains is the best here:
System.out.println(!results.contains(false));
Your anyMatch(a -> !a) is effectively the same as contains, and I would definitely prefer it over reduce for this concrete task. But again, the real difference is very small, and I would be more concerned with readability and understandability for a future maintainer of this software.

System.out.println(results.stream().reduce((a, b) -> a && b).get());
This will always print false, as the list (results) contains at least one false.
&& only yields true when both operands are true.
System.out.println(!results.stream().anyMatch(a -> !a));
Stream.anyMatch(Predicate predicate) returns whether any element of the stream (results) matches the provided predicate (a -> !a). Because the list contains a false, anyMatch returns true, and the leading ! negates it, so false is printed.
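If the negation is the part you dislike, note that the Stream API also offers noneMatch, which expresses the same check without the leading !. A minimal sketch using the same results list as above:
// noneMatch is true exactly when no element satisfies the predicate,
// so this prints false here because results contains a false
System.out.println(results.stream().noneMatch(a -> !a));
// allMatch reads even more directly: are all elements true?
System.out.println(results.stream().allMatch(a -> a));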

Related

Infinite loop JavaRDD<String> Spark 1.6

I'm trying to iterate over a JavaRDD, find an element by applying a method that uses this RDD, and then delete it.
Here is my code:
items=input.map(x->{
min=getMin(input);
return min;
})
.filter(x -> ! Domine(x, min));
But there is no result; it seems to be an infinite loop.
How can I fix it?
Thanks
Implementations like this one (the same as Java 8 streams or Kotlin sequences) are lazy, so you need to perform a terminal operation; only then will the work actually be done.
So if you do a filter and stop there, nothing happens, since you didn't perform any terminal operation. Use, for example, first(), take(1), forEach(...) or any other terminal operation; you can find the full list of actions in the Spark documentation.
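To illustrate the laziness with plain Java 8 streams (just a sketch, not Spark code), nothing is evaluated until a terminal operation such as count() runs:
List<Integer> numbers = Arrays.asList(1, 2, 3);
Stream<Integer> filtered = numbers.stream()
        .filter(n -> { System.out.println("filtering " + n); return n > 1; });
// Nothing has been printed yet: filter is an intermediate (lazy) operation.
long count = filtered.count(); // terminal operation: only now is "filtering ..." printed for each element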
From the very vague description, I believe what you require would be something like the following, assuming that input is of type JavaRDD<Row>:
final Row min = input.min((row1, row2) -> {
    // TODO: replace by some real comparator implementation
    Integer row1value = row1.getInt(row1.fieldIndex("fieldName"));
    Integer row2value = row2.getInt(row2.fieldIndex("fieldName"));
    return row1value.compareTo(row2value);
});
items = input.filter(row -> !Domine(row, min));
Since Apache Spark transformations like filter are inherently lazy, to actually retrieve the values you would then have to write List<Row> collectedValues = items.collect();. I would, however, strongly recommend that .collect() never actually go into production, since it pulls the entire RDD onto the driver and can be very dangerous indeed.
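If you only need a count or a handful of elements, Spark actions such as count() and take(n) trigger the computation without bringing the whole dataset back to the driver. A minimal sketch, assuming items as above:
long matching = items.count();        // action: runs the job, returns only a number
List<Row> firstFew = items.take(10);  // action: returns at most 10 rows to the driver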

Is it possible to replace all loop constructs in Java with stream-based constructs?

I am exploring the possibilities of the Java Stream API in order to find out if one could possibly replace all loop-based constructs with stream-based constructs.
As an example that would probably hint to the assumption that this is actually possible, consider this:
Is it possible to use the Stream API to split a string containing words delimited by spaces into an array of strings, like the following call of String.split(String) would do?
String[] tokens = "Peter Paul Mary".split(" ");
I am aware of the fact that a stream-based solution could make itself use of the String.split(String) method like so:
Stream.of("Peter Paul Mary".split(" "))
.collect(Collectors.toList());
or make use of Pattern.splitAsStream(CharSequence) (the implementation of which certainly uses a loop-based approach) but I am looking for a Stream-only solution, meaning something along the lines of this Haskell snippet:
words :: String -> [String]
words s = case dropWhile Char.isSpace s of
"" -> []
s' -> w : words s''
where (w, s'') = break Char.isSpace s'
I am asking this because I am still wondering if the introduction of the Stream API will lead to a profound change in the way we handle object collections in Java or just add another option to it, thus making it more challenging to maintain a larger codebase rather than to simplify this task in the long run.
EDIT: Although there is an accepted answer (see below), it only shows that it's possible in this special case. I am still interested in any hints for the general case as required in the question.
A distinct non-answer here: you are asking the wrong question!
It doesn't matter whether all "loop-related" lines of Java code can be converted into something streamish.
Because: good programming is the ability to write code that humans can easily read and understand.
So when somebody puts up a rule that says "we only use streams from hereon for everything we do" then that rule significantly reduces your options when writing code. Instead of being able to carefully decide "should I use streams" versus "should I go with a simple old-school loop construct" you are down to "how do I get this working with streams"!
From that point of view, you should focus on coming up with "rules" that work for all people in your development team. That could mean to emphasize the use of stream constructs. But you definitely want to avoid absolutism, and leave it to each developer to write the code that implements a given requirement in the "most readable" way. If that is possible with streams, fine. If not, don't force people to use them.
And beyond that: depending on what exactly you are doing, using streams comes also with performance cost. So even when you can find a stream solution for a problem - you have to understand its runtime cost. You surely want to avoid using streams in places where they cost too much (especially when that code is already on your "critical path" regarding performance, like: called zillions of times per second).
Finally: to a certain degree, this is a question of skills. When you are trained in using streams, it is much easier for you to A) read "streamish" code that others wrote and B) come up with "streamish" solutions that are in fact easy to read. In other words: this again depends on the context you are working in. The other week I was educating another team on "clean code", and my last slide was "Clean Code, Java 8 streams/lambdas". One guy asked me "what are streams?" There is no point in forcing such a community to do anything with streams tomorrow.
Just for fun (this is one horrible way to do it), and I don't know whether it fits your needs:
List<String> result = ",,,abc,def".codePoints()
.boxed()
// .parallel()
.collect(Collector.of(
() -> {
List<StringBuilder> inner = new ArrayList<>();
inner.add(new StringBuilder());
return inner;
},
(List<StringBuilder> list, Integer character) -> {
StringBuilder last = list.get(list.size() - 1);
if (character == ',') {
list.add(new StringBuilder());
} else {
last.appendCodePoint(character);
}
},
(left, right) -> {
left.get(left.size() - 1).append(right.remove(0));
left.addAll(right);
return left;
},
list -> list.stream()
.map(StringBuilder::toString)
.filter(x -> !x.equals(""))
.collect(Collectors.toList())
));
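For what it's worth, printing the collected result of the snippet above should give only the two non-empty tokens, since the empty strings produced by the leading commas are dropped by the finisher's filter:
System.out.println(result); // expected: [abc, def]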

XOR in Java Lambda Expression for a List of boolean

I started to try out lambda expressions for implementing Boolean gates for a list of boolean input parameters.
For "or" and "and" I wrote the following statements:
OR: expressions.stream().anyMatch(e -> e.evaluate(input));
AND: expressions.stream().allMatch(e -> e.evaluate(input));
e.evaluate(input) returns true or false.
But since there is no onceMatch method already implemented, I am stuck with the XOR.
My first idea was to filter all true values and check whether there is exactly one:
return expressions.stream().filter(e -> e.evaluate(input) == true).collect(Collectors.counting()) == 1;
But I would like to see it in one lambda expression.
If you want to know whether there is exactly one match, you can use
expressions.stream().filter(e -> e.evaluate(input)).limit(2).count() == 1
the limit(2) avoids unnecessary processing: once you have encountered two matches, you already know that the result can't be == 1, without needing to count the remaining matches.
However, that’s not “XOR” logic, not even remotely. If you want an XOR operation, you may use
expressions.stream().map(e -> e.evaluate(input)).reduce((a,b) -> a^b).orElse(Boolean.FALSE)
Unlike AND or OR, there is no way to short-circuit an XOR operation.
I can't come up with a lambda expression that would suit your needs, but a slight refactoring of your first idea looks fine to me:
return expressions.stream().filter(e -> e.evaluate(input)).count() == 1;
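To make the difference between the two interpretations concrete, here is a small sketch with hypothetical plain booleans instead of the OP's expressions:
List<Boolean> values = Arrays.asList(true, true, true);
// "exactly one match": false, because three elements are true
boolean exactlyOne = values.stream().filter(b -> b).limit(2).count() == 1;
// XOR-folding all values: true, because an odd number of elements are true
boolean xorAll = values.stream().reduce(false, (a, b) -> a ^ b);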

What's the most elegant way to add two numbers that are Optional<BigDecimal>

I need to perform an add operation on two BigDecimals that are wrapped in Optionals:
Optional<BigDecimal> ordersTotal;
Optional<BigDecimal> newOrder;
I want to achieve ordersTotal += newOrder
It's important to note that if both values are empty, the result should likewise be empty (i.e. not zero).
Here is what I came up with:
ordersTotal = ordersTotal.flatMap( b -> Optional.of(b.add(newOrder.orElse(BigDecimal.ZERO))));
but I'm wondering if there's a more elegant solution.
I think the suggested answers of using streams or chains of methods on optionals are very clever, but perhaps so clever as to be obscure. The OP has modeled this as ordersTotal += newOrder with the exception that if both are empty, the result should be empty instead of zero. Maybe it would be reasonable to write the code so that it says that:
if (!ordersTotal.isPresent() && !newOrder.isPresent()) {
result = Optional.empty();
} else {
result = Optional.of(ordersTotal.orElse(ZERO).add(newOrder.orElse(ZERO)));
}
While this isn't the shortest, it clearly expresses exactly what the OP asked for.
Now I've assigned the computed value to result but the OP actually wanted to assign it back to ordersTotal. If we know both are empty, we can then skip the then-clause that assigns empty to ordersTotal. Doing that, and then inverting the condition gives something simpler:
if (ordersTotal.isPresent() || newOrder.isPresent()) {
ordersTotal = Optional.of(ordersTotal.orElse(ZERO).add(newOrder.orElse(ZERO)));
}
Now, this tends to obscure the both-empty special case, which might not be a good idea. On the other hand, this says "add the values if either is non-empty" which might make a lot of sense for the application.
Not sure if you'll consider it more elegant, but here's one alternative:
ordersTotal = Optional.of(ordersTotal.orElse(BigDecimal.ZERO).add(newOrder.orElse(BigDecimal.ZERO)));
Another, based on #user140547's suggestion:
ordersTotal = Stream.of(ordersTotal, newOrder)
.filter(Optional::isPresent)
.map(Optional::get)
.reduce(BigDecimal::add);
Note that the first version returns Optional.of(BigDecimal.ZERO) even when both optionals are empty, whereas the second will return Optional.empty() in such a case.
You could use a stream of Optionals. Then you can make a stream of BigDecimals, and reduce those BigDecimals, or else return 0.
This has the advantage that you don't have to change the code if you want to do this with more than two Optionals.
(code can be added later if needed, currently I don't have access to a computer)
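A minimal sketch of what this answer describes (essentially the same as the stream-based alternative shown above), assuming the two Optional<BigDecimal> fields from the question:
Optional<BigDecimal> total = Stream.of(ordersTotal, newOrder)
        .filter(Optional::isPresent)   // drop the empty optionals
        .map(Optional::get)            // unwrap the remaining values
        .reduce(BigDecimal::add);      // both empty -> empty stream -> Optional.empty()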
Note that your solution
ordersTotal=ordersTotal.flatMap(b -> Optional.of(b.add(newOrder.orElse(BigDecimal.ZERO))));
will produce an empty Optional if ordersTotal is empty, even if newOrder is not.
This could be fixed by changing it to
ordersTotal=ordersTotal
.map(b -> Optional.of(b.add(newOrder.orElse(BigDecimal.ZERO))))
.orElse(newOrder);
but I’d prefer
ordersTotal=ordersTotal
.map(b -> newOrder.map(b::add).orElse(b))
.map(Optional::of).orElse(newOrder);
I know this is an old thread, but how about this?
orderTotal = !newOrder.isPresent()
        ? orderTotal
        : newOrder.flatMap(v -> Optional.of(v.add(orderTotal.orElse(BigDecimal.ZERO))));
My thinking behind this approach is like this:
Behind all the shiny Optional etc. the basic logic here is still
orderTotal += newOrder
Before the first newOrder exists orderTotal does not exist, which is represented by an empty Optional in the code.
If a newOrder does not exist yet (another empty Optional), no operation is necessary at all, i.e. there is no need to modify orderTotal
If there is a newOrder, invoke its flatMap(..) as presented in maxTrialfire's original post.
Optional and Stream here do not fit together elegantly.
The best in Java 8 is:
ordersTotal = !ordersTotal.isPresent() ? newOrder
        : !newOrder.isPresent() ? ordersTotal
        : Optional.of(ordersTotal.get().add(newOrder.get()));
However, the good news is that Java 9 adds some nice (and also ugly) functionality to Optional.
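For reference, a minimal Java 9 sketch of what that newer functionality enables (Optional.stream() is the Java 9 addition used here):
// Each optional becomes a 0- or 1-element stream, so when both are empty
// the reduce naturally yields Optional.empty().
ordersTotal = Stream.concat(ordersTotal.stream(), newOrder.stream())
        .reduce(BigDecimal::add);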
What is wrong is your requirement, not your solution. An empty Optional is not zero but a missing value. You're basically asking that 5 + NaN be equal to 5. Optional's flatMap guides you to the happy path: 5 + NaN is NaN, and this is exactly what flatMap does.
Considering that you want to reassign to ordersTotal, you'll notice that ordersTotal only changes if newOrder is present.
You can thus start with that check, and write it as:
if (newOrder.isPresent()) {
ordersTotal = newOrder.map(ordersTotal.orElse(ZERO)::add);
}
(This could be considered as a simplification of Stuart Marks' second solution. This is also a case where the method reference cannot be converted back to a lambda, due to ordersTotal not being effectively final)
If you start by this check, there is also another possible “clever” approach:
if (newOrder.isPresent()) {
ordersTotal = ordersTotal.map(o -> newOrder.map(o::add)).orElse(newOrder);
}
where the intermediate map() returns an Optional<Optional<BigDecimal>> whose inner Optional cannot be empty. I wouldn't consider it a good solution due to its bad readability, and I would thus recommend the first option or one of Stuart's solutions.

What is the Java 8 reduce BinaryOperator used for?

I am currently reading O'Reilly's Java 8 Lambdas; it is a really good book. I came across an example like this.
I have a
private final BiFunction<StringBuilder, String, StringBuilder> accumulator =
        (builder, name) -> {
            if (builder.length() > 0) builder.append(",");
            builder.append("Mister:").append(name);
            return builder;
        };
final Stream<String> stringStream = Stream.of("John Lennon", "Paul Mccartney",
        "George Harrison", "Ringo Starr");
final StringBuilder reduce = stringStream
        .filter(a -> a != null)
        .reduce(new StringBuilder(), accumulator, (left, right) -> left.append(right));
System.out.println(reduce);
System.out.println(reduce.length());
This produces the right output:
Mister:John Lennon,Mister:Paul Mccartney,Mister:George Harrison,Mister:Ringo Starr
My question is about the last parameter of the reduce method, which is a BinaryOperator.
What is this parameter used for? If I change it to
.reduce(new StringBuilder(),accumulator,(left,right)->new StringBuilder());
the output is the same; if I pass null, a NullPointerException is thrown.
What is this parameter used for?
Update
Why do I receive different results if I run it on a parallel stream?
First run:
returned StringBuilder length = 420
Second run:
returned StringBuilder length = 546
Third run:
returned StringBuilder length = 348
and so on. Why is this - should it not return all the values at each iteration?
The method reduce in the interface Stream is overloaded. The parameters for the method with three arguments are:
identity
accumulator
combiner
The combiner supports parallel execution. Apparently, it is not used for sequential streams. However, there is no such guarantee. If you change your stream into a parallel stream, I guess you will see a difference:
Stream<String> stringStream = Stream.of(
        "John Lennon", "Paul Mccartney", "George Harrison", "Ringo Starr")
        .parallel();
Here is an example of how the combiner can be used to transform a sequential reduction into a reduction that supports parallel execution. Suppose there is a stream with four Strings, and acc is used as an abbreviation for accumulator.apply. Then the result of the reduction can be computed as follows:
acc(acc(acc(acc(identity, "one"), "two"), "three"), "four");
With a compatible combiner, the above expression can be transformed into the following expression. Now it is possible to execute the two sub-expressions in different threads.
combiner.apply(
acc(acc(identity, "one"), "two"),
acc(acc(identity, "three"), "four"));
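As a concrete sketch of the same idea (a hedged example with immutable Strings rather than the OP's StringBuilder), the combiner can safely merge partial results computed on different threads:
// Both the sequential and the parallel run yield "onetwothreefour":
// "" is a true identity and String concatenation is associative,
// so the combiner may merge partial results from different threads.
String joined = Stream.of("one", "two", "three", "four")
        .parallel()
        .reduce("", (acc, s) -> acc + s, (left, right) -> left + right);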
Regarding your second question, I use a simplified accumulator to explain the problem:
BiFunction<StringBuilder,String,StringBuilder> accumulator =
(builder,name) -> builder.append(name);
According to the Javadoc for Stream::reduce, the accumulator has to be associative. In this case, that would imply that the following two expressions return the same result:
acc(acc(acc(identity, "one"), "two"), "three")
acc(acc(identity, "one"), acc(acc(identity, "two"), "three"))
That's not true for the above accumulator. The problem is that you are mutating the object referenced by identity. That's a bad idea for the reduce operation. Here are two alternative implementations which should work:
// identity = ""
BiFunction<String,String,String> accumulator = String::concat;
// identity = null
BiFunction<StringBuilder,String,StringBuilder> accumulator =
(builder,name) -> builder == null
? new StringBuilder(name) : builder.append(name);
nosid's answer got it mostly right (+1) but I wanted to amplify a particular point.
The identity parameter to reduce must be an identity value. It's OK if it's an object, but if it is, it should be immutable. If the "identity" object is mutated, it's no longer an identity! For more discussion of this point, see my answer to a related question.
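If you do want to accumulate into a mutable container such as a StringBuilder, the mutable-reduction form collect (or a ready-made collector like Collectors.joining) is the intended tool. A minimal sketch of the original example rewritten that way:
// Mutable reduction: collect() is designed for containers like StringBuilder,
// and Collectors.joining handles the separator.
String result = Stream.of("John Lennon", "Paul Mccartney", "George Harrison", "Ringo Starr")
        .filter(Objects::nonNull)
        .map(name -> "Mister:" + name)
        .collect(Collectors.joining(","));
// Mister:John Lennon,Mister:Paul Mccartney,Mister:George Harrison,Mister:Ringo Starr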
It looks like this example originated from Example 5-19 of Richard Warburton, Java 8 Lambdas, O'Reilly 2014. If so, I shall have to have a word about this with the good Dr. Warburton.
