Why it works: BigDecimal Sum with Reduce and BigDecimal::add - java

I can understand how total1 is calculated, but I have no idea how total2 is calculated!
How can BigDecimal::add be used as a BiFunction? The signatures are not the same!
package br.com.jorge.java8.streams.bigdecimal;

import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;

public class BigDecimalSumTest {
    public static void main(String[] args) {
        List<BigDecimal> list = new ArrayList<>();
        list.add(new BigDecimal("1"));
        list.add(new BigDecimal("2"));

        BigDecimal total1 = list.stream().reduce(BigDecimal.ZERO, (t, v) -> t.add(v));
        BigDecimal total2 = list.stream().reduce(BigDecimal.ZERO, BigDecimal::add);

        System.out.println("Total 1: " + total1);
        System.out.println("Total 2: " + total2);
    }
}

It's used as a BinaryOperator<T> in your current context.
Its equivalent lambda representation is:
(bigDecimal, augend) -> bigDecimal.add(augend) // same as in your previous line of code
and anonymous class representation:
new BinaryOperator<BigDecimal>() {
    @Override
    public BigDecimal apply(BigDecimal bigDecimal, BigDecimal augend) {
        return bigDecimal.add(augend);
    }
}
where BinaryOperator<T> extends BiFunction<T,T,T>, meaning it's a specialization of BiFunction for the case where the operands and the result are all of the same type.
Added to that, your code is actually using one of the overloads of the reduce method, i.e. Stream.reduce(T identity, BinaryOperator<T> accumulator).
How can a BigDecimal::add be used in a BiFunction
Just a step further, and only for explanation: there is also an overloaded implementation that takes a combiner, Stream.reduce(U identity, BiFunction<U, ? super T, U> accumulator, BinaryOperator<U> combiner), which would look like:
BigDecimal total = list.stream()
        .reduce(BigDecimal.ZERO, BigDecimal::add, BigDecimal::add);
        //                       ^^               ^^
        //                       BiFunction here  BinaryOperator here

Given BigDecimal::add is being used as a BiFunction<BigDecimal, BigDecimal, BigDecimal>, the compiler will look for one of two eligible signatures.
The first possible signature, as you have picked up on, would be a two-argument static method. The relevant lambda would be (a, b) -> BigDecimal.add(a, b). Of course, you are correct to recognise that this doesn't exist.
The second possible signature would be a one-argument instance method. The equivalent lambda here would be (a, b) -> a.add(b). As this one exists and the other does not, this is how the compiler would interpret it.
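To see this concretely, here is a minimal, self-contained sketch (class and variable names are just for illustration) showing that the same method reference satisfies both target types, because the compiler binds the first parameter to the receiver of add:

import java.math.BigDecimal;
import java.util.function.BiFunction;
import java.util.function.BinaryOperator;

public class MethodRefDemo {
    public static void main(String[] args) {
        // Both target types accept BigDecimal::add; the compiler binds the
        // first argument to the receiver and the second to add's parameter,
        // i.e. (a, b) -> a.add(b).
        BiFunction<BigDecimal, BigDecimal, BigDecimal> asBiFunction = BigDecimal::add;
        BinaryOperator<BigDecimal> asOperator = BigDecimal::add;

        System.out.println(asBiFunction.apply(new BigDecimal("1"), new BigDecimal("2"))); // 3
        System.out.println(asOperator.apply(new BigDecimal("1"), new BigDecimal("2")));   // 3
    }
}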

Related

Understanding Java types for stream operations

When using streams, two very similar pieces of code behave differently.
I cannot understand what the compiler error is trying to convey.
import java.util.Arrays;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

import static java.util.stream.Collectors.toList;

public class Main {
    public static void main(String[] args) {
        // Pythagorean stream
        List<List<Integer>> ret = IntStream.range(1, 10).boxed()
                .flatMap(a -> IntStream.range(a, 10)
                        // This works fine...
                        // .mapToObj(b -> Arrays.asList(a, b, a*a + b*b))
                        .mapToObj(b -> Arrays.asList(a, b, Math.sqrt(a*a + b*b)))
                        // .filter(t -> Math.sqrt(t.get(2)) % 1 == 0))
                        .filter(t -> t.get(2) % 1 == 0))
                .collect(toList());
        System.out.println(ret);
    }
}
If Math.sqrt is used in mapToObj, I get a compiler error. If not, it works fine. The error message is:
Error:(21, 62) java: bad operand types for binary operator '%'
first type: java.lang.Number&java.lang.Comparable<? extends java.lang.Number&java.lang.Comparable<?>>
second type: int
Error:(22, 25) java: incompatible types: inference variable T has incompatible bounds
equality constraints: java.util.List<java.lang.Integer>
lower bounds: java.util.List<java.lang.Number&java.lang.Comparable<? extends java.lang.Number&java.lang.Comparable<?>>>
Can someone please tell me what this error message is saying? I am aware that my limited knowledge of Java is the main concern. Can you please guide me on where to look to understand this?
The reason is that you are trying to collect to List<Integer>, but you should use List<? extends Number> instead. Moreover, your filter can be removed because its condition is always true for integers, i.e. any integer divided by one leaves no remainder.
Working code can look like this:
List<List<? extends Number>> ret = IntStream.range(1, 10)
        .boxed()
        .flatMap(a -> IntStream.range(a, 10)
                .mapToObj(b -> Arrays.asList(a, b, Math.sqrt(a * a + b * b))))
        .collect(toList());
The reason is that Math.sqrt() returns a double while a and b are integers, so you cannot collect all three into a List<Integer>. The second approach is to collect to List<Double> and use an explicit cast from int to double:
List<List<Double>> ret = IntStream.range(1, 10)
        .boxed()
        .flatMap(a -> IntStream.range(a, 10)
                .mapToObj(b -> Arrays.asList((double) a, (double) b, Math.sqrt(a * a + b * b))))
        .collect(toList());
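If the intent was to keep only Pythagorean triples, as the commented-out filter suggests, one possible sketch (not part of the original answer, variable name is illustrative) is to filter on the int values before mapping, so the result can stay a List<List<Integer>>:

List<List<Integer>> triples = IntStream.range(1, 10)
        .boxed()
        .flatMap(a -> IntStream.range(a, 10)
                .filter(b -> Math.sqrt(a * a + b * b) % 1 == 0)   // keep only exact hypotenuses
                .mapToObj(b -> Arrays.asList(a, b, (int) Math.sqrt(a * a + b * b))))
        .collect(toList());
System.out.println(triples);   // [[3, 4, 5], [6, 8, 10]]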

Which FunctionalInterface should I use?

I was learning to write lambda expressions against functional interfaces.
So, to add two integers I used:
BiFunction<Integer, Integer, Integer> biFunction = (a, b) -> a + b;
System.out.println(biFunction.apply(10, 60));
This gives me the output 70. But if I write it like this:
BinaryOperator<Integer, Integer, Integer> binaryOperator = (a, b) -> a + b;
I get an error saying
Wrong number of type arguments: 3; required: 1
Isn't BinaryOperator a child of BiFunction? How do I improve it?
BinaryOperator
This doesn't compile because BinaryOperator works on a single type for both operands and result, i.e. BinaryOperator<T> takes only one type argument.
Isn't BinaryOperator a child of BiFunction?
Yes. BinaryOperator does extend BiFunction.
But do note that the documentation states (formatting mine):
This is a specialization of BiFunction for the case where the
operands and the result are all of the same type.
The complete declaration is:
BinaryOperator<T> extends BiFunction<T,T,T>
hence your code will work with:
BinaryOperator<Integer> binaryOperator = (a, b) -> a + b;
System.out.println(binaryOperator.apply(10, 60));
IntBinaryOperator
If you're dealing with two primitive ints, as in your example ("to add two integers I used"), you can make use of the IntBinaryOperator functional interface:
IntBinaryOperator intBinaryOperator = (a, b) -> a + b;
System.out.println(intBinaryOperator.applyAsInt(10, 60));
Represents an operation upon two int-valued operands and producing
an int-valued result. This is the primitive type specialization of
BinaryOperator for int.
I am using Integer, can I still use IntBinaryOperator?
Yes, you can still use it, but notice what the IntBinaryOperator looks like:
Integer first = 10;
Integer second = 60;

IntBinaryOperator intBinaryOperator = new IntBinaryOperator() {
    @Override
    public int applyAsInt(int a, int b) {
        return Integer.sum(a, b);
    }
};

Integer result = intBinaryOperator.applyAsInt(first, second);
This would incur the overhead of unboxing first and second to primitives and then autoboxing the sum into the Integer result.
Note: make sure the Integer values are non-null, or else you will end up with a NullPointerException when they are unboxed.
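A minimal sketch of that failure mode (variable names are just for illustration):

Integer first = null;                               // e.g. a value missing from a lookup
IntBinaryOperator sum = Integer::sum;
// The call below throws NullPointerException, because first must be unboxed to int:
// sum.applyAsInt(first, 60);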
BiFunction<Integer, Integer, Integer> biFunction = (a, b) -> a + b;
can be represented by
BinaryOperator<Integer> binaryOperator = (a, b) -> a + b;
But generally you want to perform arithmetic on int and not Integer, in order to avoid unboxing to compute (Integer to int) and boxing again to return the result (int to Integer):
IntBinaryOperator intBinaryOperator = (a, b) -> a + b;
As a side note, you could also use a method reference instead of a lambda to compute a sum between two ints.
Integer.sum(int a, int b) is what you are looking for:
IntBinaryOperator biFunction = Integer::sum;
Isn't BinaryOperator a child of BiFunction?
Yes, it is. If you look at the source code of BinaryOperator, you see:
public interface BinaryOperator<T> extends BiFunction<T,T,T> {
// ...
}
So you just have to fix your syntax:
BinaryOperator<Integer> binaryOperator = (a, b) -> a + b;
System.out.println(binaryOperator.apply(10, 60));
How do I improve it?
You can use IntBinaryOperator. It simplifies the syntax even more:
IntBinaryOperator binaryOperator = (a, b) -> a + b;
System.out.println(binaryOperator.applyAsInt(10, 60));

Difference between BiFunction<X, X> and BinaryOperator<X>

I am not able to understand why BinaryOperator<Integer> can be used in place of A in the code below, but BiFunction<Integer, Integer> cannot.
A foo = (a, b) -> { return a * a + b * b; };
int bar = foo.apply(2, 3);
System.out.println(bar);
Could someone please help me understand this?
BinaryOperator is a special BiFunction. So you can assign the same expression to both of them. Check this out.
BinaryOperator<Integer> foo = (a, b) -> {
    return a * a + b * b;
};
BiFunction<Integer, Integer, Integer> barFn = (a, b) -> {
    return a * a + b * b;
};
If you look at the source code, it would be
public interface BinaryOperator<T> extends BiFunction<T,T,T> {
// Remainder omitted.
}
BiFunction and BinaryOperator are essentially the same; the only difference is in how the argument types and the return type of the interface are declared.
Consider a case where you want to concatenate two strings and return the result. In this case you can choose either of them, but BinaryOperator is the better choice, because the arguments and the return type are all the same:
BinaryOperator<String> c = (str, str1) -> str + str1;
You can do the same with BiFunction, but now see the difference:
BiFunction<String, String, String> c = (str, str1) -> str + str1;
Now consider a case in which we want to add two integers and return a string. Here we can only choose the BiFunction and not the BinaryOperator:
BiFunction<Integer, Integer, String> c = (a, b) -> "Answer=" + (a + b);
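A quick usage check of that last variant:

System.out.println(c.apply(2, 3));   // prints "Answer=5"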

Equivalent of Scala's foldLeft in Java 8

What is the equivalent of Scala's great foldLeft in Java 8?
I was tempted to think it was reduce, but reduce has to return something of the same type as what it reduces over.
Example:
import java.util.List;

public class Foo {

    // this method works pretty well
    public int sum(List<Integer> numbers) {
        return numbers.stream()
                .reduce(0, (acc, n) -> (acc + n));
    }

    // this method makes the file not compile
    public String concatenate(List<Character> chars) {
        return chars.stream()
                .reduce(new StringBuilder(""), (acc, c) -> acc.append(c)).toString();
    }
}
The problem in the code above is the accumulator: new StringBuilder("").
Thus, could anyone point me to the proper equivalent of foldLeft, or fix my code?
There is no equivalent of foldLeft in Java 8's Stream API. As others noted, reduce(identity, accumulator, combiner) comes close, but it's not equivalent to foldLeft because it requires the resulting type B to combine with itself and be associative (in other words, be monoid-like), a property that not every type has.
There is also an enhancement request for this: add Stream.foldLeft() terminal operation
To see why reduce won't work, consider the following code, where you intend to execute a series of arithmetic operations starting with a given number:
val arithOps = List(('+', 1), ('*', 4), ('-', 2), ('/', 5))
val fun: (Int, (Char, Int)) => Int = {
case (x, ('+', y)) => x + y
case (x, ('-', y)) => x - y
case (x, ('*', y)) => x * y
case (x, ('/', y)) => x / y
}
val number = 2
arithOps.foldLeft(number)(fun) // ((2 + 1) * 4 - 2) / 5
If you tried writing reduce(2, fun, combine), what combiner function could you pass that combines two numbers? Adding the two numbers together clearly does not solve it. Also, the value 2 is clearly not an identity element.
Note that no operation that requires a sequential execution can be expressed in terms of reduce. foldLeft is actually more generic than reduce: you can implement reduce with foldLeft but you cannot implement foldLeft with reduce.
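As a minimal sketch of that relationship (helper names are illustrative, assuming the usual java.util.function imports): a same-typed reduce is just a special case of foldLeft, while foldLeft may change the accumulator type at every step, which reduce's combiner contract cannot express.

static <U, T> U foldLeft(Iterable<T> xs, U identity, BiFunction<U, ? super T, U> f) {
    U acc = identity;
    for (T x : xs)
        acc = f.apply(acc, x);          // left-to-right, possibly type-changing
    return acc;
}

// reduce(identity, op) on same-typed elements is just foldLeft with op as the accumulator:
static <T> T reduceViaFoldLeft(Iterable<T> xs, T identity, BinaryOperator<T> op) {
    return foldLeft(xs, identity, op);
}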
Update:
Here is an initial attempt to get your code fixed:
public static String concatenate(List<Character> chars) {
    return chars
            .stream()
            .reduce(new StringBuilder(),
                    StringBuilder::append,
                    StringBuilder::append).toString();
}
It uses the following reduce method:
<U> U reduce(U identity,
             BiFunction<U, ? super T, U> accumulator,
             BinaryOperator<U> combiner);
It may sound confusing but if you look at the javadocs there is a nice explanation that may help you quickly grasp the details. The reduction is equivalent to the following code:
U result = identity;
for (T element : this stream)
    result = accumulator.apply(result, element)
return result;
For a more in-depth explanation please check this source.
This usage is not correct though, because it violates the contract of reduce, which states that the accumulator should be an associative, non-interfering, stateless function for incorporating an additional element into a result. In other words, since the identity is mutable, the result will be broken in case of parallel execution.
As pointed out in the comments, a correct option is to use a mutable reduction via collect instead:
return chars.stream().collect(
        StringBuilder::new,
        StringBuilder::append,
        StringBuilder::append).toString();
The supplier StringBuilder::new will be used to create reusable containers which will be later combined.
The method you are looking for is java.util.stream.Stream.reduce, particularly the overload with three parameters: identity, accumulator, and combiner. That is the correct equivalent to Scala's foldLeft.
However, you are not allowed to use Java's reduce that way, and also not Scala's foldLeft for that matter. Use collect instead.
It can be done by using Collectors:
public static <A, B> Collector<A, ?, B> foldLeft(final B init, final BiFunction<? super B, ? super A, ? extends B> f) {
    return Collectors.collectingAndThen(
            Collectors.reducing(Function.<B>identity(), a -> b -> f.apply(b, a), Function::andThen),
            endo -> endo.apply(init)
    );
}
Usage example:
IntStream.rangeClosed(1, 100).boxed().collect(foldLeft(50, (a, b) -> a - b)); // Output = -5000
For your question, this does what you wanted:
public String concatenate(List<Character> chars) {
    return chars.stream()
            .collect(foldLeft(new StringBuilder(), StringBuilder::append)).toString();
}
Others are correct that there's no equivalent, though. Here's a util method that comes close:
<U, T> U foldLeft(Collection<T> sequence, U identity, BiFunction<U, ? super T, U> accumulator) {
    U result = identity;
    for (T element : sequence)
        result = accumulator.apply(result, element);
    return result;
}
Your case, using the above method, would look like:
public String concatenate(List<Character> chars) {
    return foldLeft(chars, new StringBuilder(""), StringBuilder::append).toString();
}
Or, without the method reference sugar:
public String concatenate(List<Character> chars) {
    return foldLeft(chars, new StringBuilder(""), (stringBuilder, character) -> stringBuilder.append(character)).toString();
}

Reduce for a parallel stream without a combiner executes correctly on several threads. When should I use a combiner in this case?

I have read the following answer:
https://stackoverflow.com/a/22814174/2674303
and I concluded that the combiner is used only in parallel streams, to correctly merge the accumulator results (one accumulator instance per thread).
Thus I concluded that reduce without a combiner would not work correctly.
To check this I wrote the following example:
Person reduce = Person.getPersons().stream()
        .parallel()
        .reduce(new Person(), (intermediateResult, p2) -> {
            System.out.println(Thread.currentThread().getName());
            return new Person("default", intermediateResult.getAge() + p2.getAge());
        });
System.out.println(reduce);
model:
public class Person {
    String name;
    Integer age;
    ///...

    public static Collection<Person> getPersons() {
        List<Person> persons = new ArrayList<>();
        persons.add(new Person("Vasya", 12));
        persons.add(new Person("Petya", 32));
        persons.add(new Person("Serj", 10));
        persons.add(new Person("Onotole", 18));
        return persons;
    }
}
As you can see, I don't provide a combiner.
Sample output:
ForkJoinPool.commonPool-worker-3
ForkJoinPool.commonPool-worker-2
ForkJoinPool.commonPool-worker-1
ForkJoinPool.commonPool-worker-2
ForkJoinPool.commonPool-worker-1
ForkJoinPool.commonPool-worker-1
Person{name='default', age=72}
I have executed the application several times and I always see the correct result.
Please explain how reduce works for a parallel stream if no combiner is provided.
In this case your accumulator works as the combiner as well. This is a shorthand for when the reduction type is the same as the stream element type. Thus
myStream.reduce(identity, accumulator);
is fully equivalent to
myStream.reduce(identity, accumulator, accumulator);
You can even check the source code of these methods in OpenJDK:
@Override
public final <R> R reduce(R identity, BiFunction<R, ? super P_OUT, R> accumulator,
                          BinaryOperator<R> combiner) {
    return evaluate(ReduceOps.makeRef(identity, accumulator, combiner));
}

@Override
public final P_OUT reduce(final P_OUT identity, final BinaryOperator<P_OUT> accumulator) {
    return evaluate(ReduceOps.makeRef(identity, accumulator, accumulator));
}
The three-argument version is more flexible, as the reduction may produce a result of another type. In that case you cannot use the two-argument reduction, because it provides no rule for combining two elements of the resulting type. However, when the resulting type is the same, the accumulator and combiner work on the same object type, so if the operation is associative, the same operation can serve as both.
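For instance, a minimal sketch (reusing the Person class from the question) of a reduction whose result type differs from the element type, where a combiner becomes mandatory:

// Sum the ages into an Integer: the accumulator folds a Person into the running
// sum, and the combiner merges the partial sums produced by different threads.
Integer totalAge = Person.getPersons().stream()
        .parallel()
        .reduce(0,
                (sum, p) -> sum + p.getAge(),   // BiFunction<Integer, Person, Integer>
                Integer::sum);                  // BinaryOperator<Integer>
System.out.println(totalAge);                   // 72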
You have specified a combiner. In this case, the combiner function is identical to your accumulator function.
This is always possible if the result type is identical to your stream element type.
Compare with reduction by summing up the values: a+b+c+d might be evaluated in parallel by calculating (a+b)+(c+d). Here the accumulator is the addition, which is the same operation as the combiner function that processes the intermediate results (a+b) and (c+d).
This example also shows that, unless there is a type conversion involved, it would be strange to need a different combiner function, as the associativity constraint on the accumulator implies that it is normally sufficient as a combiner. Keep in mind that it should be irrelevant whether the stream calculates a+b+c+d, (a+b+c)+d, (a+b)+(c+d), or a+(b+c+d).
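A minimal sketch of that point (assuming the usual java.util.stream imports): with an associative operation like addition and matching element/result types, the same method reference serves as both accumulator and combiner, so sequential and parallel evaluation agree.

// Integer::sum is associative, so it can play both roles.
int sequential = Stream.of(1, 2, 3, 4).reduce(0, Integer::sum);
int parallel   = Stream.of(1, 2, 3, 4).parallel().reduce(0, Integer::sum, Integer::sum);
System.out.println(sequential + " " + parallel);   // 10 10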
3-argument reduce exists for a somewhat uncommon situation that looks like this:
Imagine that instead of reducing a stream of Persons into a Person, you had a different intermediate value like PopulationStats.
class PopulationStats {
    // make new stats that includes this person
    PopulationStats addPerson(Person p) {
        return new PopulationStats(........);
    }

    // make new stats that combines this and other stats
    PopulationStats addStats(PopulationStats other) {
        return new PopulationStats(........);
    }
}
In a case like that, the 3-argument reduce serves to avoid the intermediate step of making a PopulationStats for each Person before reducing.
PopulationStats stats = people.stream()
        .reduce(new PopulationStats(), PopulationStats::addPerson, PopulationStats::addStats);
