Suppose I have a simple class with a method eval(). Is it possible to convert this method to stream.reduce or something similar, instead of using a for loop? Operation is an interface with many possible implementations of the execute method, each computing a different arithmetical operation.
public class Expression {
    private final List<Operation> operations;

    public Expression(List<Operation> operations) {
        this.operations = operations;
    }

    int eval() {
        int result = 0;
        for (Operation operation : operations) {
            result = operation.execute(result);
        }
        return result;
    }
}
Try this.
int eval() {
    int[] r = {0};
    operations.stream()
              .forEach(op -> r[0] = op.execute(r[0]));
    return r[0];
}
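Since the question asks about reduce: a sketch using the three-argument reduce is also possible, but note that the accumulation here is not associative, so it only behaves like the original loop on a sequential stream (where the combiner is never called):

int eval() {
    return operations.stream()
            .reduce(0,
                    (result, op) -> op.execute(result), // thread the running result through each operation
                    (a, b) -> { throw new UnsupportedOperationException("not combinable in parallel"); });
}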
Why not try forEach(), the simplest and most common terminal operation? It loops over the stream elements, calling the supplied function on each element.
public void eval() {
    // Operation.execute takes an int, so some starting value has to be supplied here
    operations.stream().forEach(op -> op.execute(0));
}
This effectively calls execute() on each element of operations.
Also, note that in your current code result only holds the value returned by the last operation's execute(), not every intermediate result.
I'm trying to use Java 8 streams to create a single CarData object, which consists of the average of all the CarData fields in the list coming from getCars().
CarData has two getters, getBodyWeight() and getShellWeight(), both returning Integer.
List<CarData> carData = carResults.getCars();
IntSummaryStatistics averageBodyWeight = carData.stream()
        .mapToInt(x -> x.getBodyWeight())
        .summaryStatistics();
averageBodyWeight.getAverage();

IntSummaryStatistics averageShellWeight = carData.stream()
        .mapToInt(x -> x.getShellWeight())
        .summaryStatistics();
averageShellWeight.getAverage();
I don't want to have to put each of these back together in my final returned result.
Visually, this is my list
getCars() : [
{CarData: { getBodyWeight=10, getShellWeight=3 } }
{CarData: { getBodyWeight=6, getShellWeight=5 } }
{CarData: { getBodyWeight=8, getShellWeight=19 } }
]
and the output I'm trying to achieve is a single object that has the average of each of the fields I specify. I'm not sure if I need to use Collectors.averagingInt or some combination of IntSummaryStatistics to achieve this. It's easy to do across one field with either of these techniques; I'm just not sure what I'm missing when using multiple integer fields.
{CarData: { getBodyWeight=8, getShellWeight=9 } }
Starting with JDK 12, you can use the following solution:
CarData average = carData.stream().collect(Collectors.teeing(
        Collectors.averagingInt(CarData::getBodyWeight),
        Collectors.averagingInt(CarData::getShellWeight),
        (avgBody, avgShell) -> new CarData(avgBody.intValue(), avgShell.intValue())));
For older Java versions, you can either add the teeing implementation from this answer to your code base and use it exactly as above, or create a custom collector tailored to your task, as shown in Andreas' answer.
Or consider that streaming twice over a List in memory is not necessarily worse than doing both operations in one stream, in terms of either readability or performance.
Note that calling intValue() on Double objects has the same behavior as the (int) casts in Andreas’ answer. So in either case, you have to adjust the code if other rounding behavior is intended.
Or consider using a different result object, capable of holding two floating-point values for the averages.
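For illustration, a two-pass variant along those lines (assuming CarData has the same (int, int) constructor used in the teeing example) could look like:

double avgBody = carData.stream().mapToInt(CarData::getBodyWeight).average().orElse(0);
double avgShell = carData.stream().mapToInt(CarData::getShellWeight).average().orElse(0);
CarData average = new CarData((int) avgBody, (int) avgShell); // same truncating cast as above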
You need to write your own Collector, something like this:
class CarDataAverage {

    public static Collector<CarData, CarDataAverage, Optional<CarData>> get() {
        return Collector.of(CarDataAverage::new, CarDataAverage::add,
                CarDataAverage::combine, CarDataAverage::finish);
    }

    private long sumBodyWeight;
    private long sumShellWeight;
    private int count;

    private void add(CarData carData) {
        this.sumBodyWeight += carData.getBodyWeight();
        this.sumShellWeight += carData.getShellWeight();
        this.count++;
    }

    private CarDataAverage combine(CarDataAverage that) {
        this.sumBodyWeight += that.sumBodyWeight;
        this.sumShellWeight += that.sumShellWeight;
        this.count += that.count;
        return this;
    }

    private Optional<CarData> finish() {
        if (this.count == 0)
            return Optional.empty();
        // adjust as needed if averages should be rounded
        return Optional.of(new CarData((int) (this.sumBodyWeight / this.count),
                (int) (this.sumShellWeight / this.count)));
    }
}
You then use it like this:
List<CarData> list = ...
Optional<CarData> averageCarData = list.stream().collect(CarDataAverage.get());
I am learning the usage of
java.util.function.Function
I wrote some code which uses a Java Function to add 4 to itself; the code is as follows:
public class Test01 {
    public static void main(String[] args) {
        Function<Integer, Integer> addFunction = new Function<Integer, Integer>() {
            private int total = 0;

            public Integer apply(Integer value) {
                this.total += value;
                return this.total;
            }
        };
        int finalTotal = addFunction.andThen(addFunction)
                .andThen(addFunction)
                .andThen(addFunction)
                .apply(4);
        System.out.println(finalTotal);
    }
}
When I run the above code, the output I get is
32
How can I achieve something like what I did in JavaScript, which is as follows:
var fn19 = function() {
    var addNum = function(num) {
        var fn = function(num2) {
            fn.sum += num2;
            return fn;
        };
        fn.sum = num;
        return fn;
    };
    print("addNum(3)(4)(3)(10) ==> " + addNum(3)(4)(3)(10).sum);
};
fn19();
The output of the above code is
addNum(3)(4)(3)(10) ==> 20
Can I have the same kind of function call in Java, where I can pass as many number arguments as I like and the addFunction adds them all up?
An example, as close as possible to your JavaScript code, would be
class QuestionableConstruct {
    int sum;

    QuestionableConstruct add(int num2) {
        sum += num2;
        return this;
    }
}

Runnable fn19 = () -> {
    IntFunction<QuestionableConstruct> addNum = num -> {
        QuestionableConstruct fn = new QuestionableConstruct();
        fn.sum = num;
        return fn;
    };
    System.out.println("addNum(3)(4)(3)(10) ==> " + addNum.apply(3).add(4).add(3).add(10).sum);
};
fn19.run();
A more Java-like solution would be
interface Add {
    int sum();

    default Add add(int num) {
        int sum = sum() + num;
        return () -> sum;
    }

    static Add num(int num) {
        return () -> num;
    }
}
usable as
System.out.println("addNum(3)(4)(3)(10) ==> "+Add.num(3).add(4).add(3).add(10).sum());
Unlike the JavaScript construct, this uses real immutable functions. Consider
Add a = Add.num(1).add(2).add(3);
System.out.println("1+2+3+4+5 = "+a.add(4).add(5).sum());
System.out.println("1+2+3+10+20 = "+a.add(10).add(20).sum());
which works smoothly without interference.
But of course, if you just want to sum a variable number of items, use
System.out.println("addNum(3)(4)(3)(10) ==> "+IntStream.of(3, 4, 3, 10).sum());
or if you want a mutable accumulator, use
System.out.println("addNum(3)(4)(3)(10) ==> "+
IntStream.builder().add(3).add(4).add(3).add(10).build().sum());
which allows to keep the builder and pass it around.
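For instance, a sketch of holding on to the builder and filling it in several steps before summing:

IntStream.Builder acc = IntStream.builder();
acc.add(3).add(4);   // accumulate some values here...
acc.add(3).add(10);  // ...and more values somewhere else
System.out.println(acc.build().sum()); // prints 20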
You can't do exactly that in Java; what you are looking for is reducing a stream of values.
In other words: the "real" functional solution here isn't to call one method with multiple arguments. Rather, you put the values in some list, and then you define a function that "accumulates" over the content of that list by applying a function to its elements.
See here for some examples.
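As a small illustration of that reduction idea:

List<Integer> values = List.of(3, 4, 3, 10);
int sum = values.stream().reduce(0, Integer::sum); // accumulates over the list -> 20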
The technique you're describing in the JavaScript world takes advantage of a closure.
This is a nice side effect of functions in JavaScript being first-class citizens. It's a way of associating a function with data in the enclosing scope and then being able to pass this association around without losing the inner context. The most common and simplest use of this is caching (its formal name being memoisation).
You would need functions (methods) in Java to be first-class to do the same, but classes by design are already an entity that associates data and methods, which makes the whole concept of a closure somewhat redundant in this context.
I know Java streams, and I have tried to implement map, filter and fold (with a custom function as argument), in both strict and lazy evaluation styles.
However, I could not come up with a lazy implementation of flatMap in Java.
Plain map, filter and fold are just composed functions which run on the main iterator (if it's a list), and the application of a function is skipped if the incoming value is null.
However, the flatMap input function produces another list (stream) which needs to be flattened.
How is the lazy flatMap implemented in Java 10? Is there any document on the algorithm?
Thanks.
If you want to implement lazy flatMap, the most important part is to provide a correct implementation of Iterator. This implementation can look like this:
final class FlatMappedIterator<A, B> implements Iterator<B> {

    private final Iterator<A> iterator;
    private final Function<A, Iterable<B>> f;
    private Iterator<B> targetIterator; // iterator after applying `f` to an element of type A

    FlatMappedIterator(Iterator<A> iterator, Function<A, Iterable<B>> f) {
        this.iterator = iterator;
        this.f = f;
    }

    @Override
    public boolean hasNext() {
        // keep pulling from the source until we find a non-empty target iterator
        // (an element mapped to an empty Iterable must not end the iteration prematurely)
        while ((targetIterator == null || !targetIterator.hasNext()) && iterator.hasNext()) {
            A next = iterator.next();
            Iterable<B> targetIterable = f.apply(next);
            targetIterator = targetIterable.iterator();
        }
        return targetIterator != null && targetIterator.hasNext();
    }

    @Override
    public B next() {
        if (hasNext()) {
            return targetIterator.next();
        } else {
            throw new NoSuchElementException();
        }
    }
}
So the retrieval of the next element is postponed to the moment when hasNext or next is called.
Then you need to implement the flatMap function itself. But this is easy. I'm leaving it as an exercise for the reader :)
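For completeness, a minimal sketch of that wrapper, assuming the lazy "stream" is represented simply as an Iterable:

static <A, B> Iterable<B> flatMap(Iterable<A> source, Function<A, Iterable<B>> f) {
    // each call to iterator() builds a fresh lazy iterator over the flattened elements
    return () -> new FlatMappedIterator<>(source.iterator(), f);
}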
I have an async API that essentially returns results through pagination
public CompletableFuture<Response> getNext(int startFrom);
Each Response object contains a list of offsets from startFrom and a flag indicating whether there are more elements remaining and, therefore, another getNext() request to make.
I'd like to write a method that goes through all the pages and retrieves all the offsets. I can write it in a synchronous manner like so
int startFrom = 0;
List<Integer> offsets = new ArrayList<>();
for (;;) {
    CompletableFuture<Response> future = getNext(startFrom);
    Response response = future.get(); // an exception stops everything
    if (response.getOffsets().isEmpty()) {
        break; // we're done
    }
    offsets.addAll(response.getOffsets());
    if (!response.hasMore()) {
        break; // we're done
    }
    startFrom = getLast(response.getOffsets());
}
In other words, we call getNext() with startFrom at 0. If an exception is thrown, we short-circuit the entire process. Otherwise, if there are no offsets, we complete. If there are offsets, we add them to the master list. If there are no more left to fetch, we complete. Otherwise, we reset the startFrom to the last offset we fetched and repeat.
Ideally, I want to do this without blocking with CompletableFuture::get() and returning a CompletableFuture<List<Integer>> containing all the offsets.
How can I do this? How can I compose the futures to collect their results?
I'm thinking of a "recursive" solution (not actually recursive in execution, but in code):
private CompletableFuture<List<Integer>> recur(int startFrom, List<Integer> offsets) {
    CompletableFuture<Response> future = getNext(startFrom);
    return future.thenCompose(response -> {
        if (response.getOffsets().isEmpty()) {
            return CompletableFuture.completedFuture(offsets);
        }
        offsets.addAll(response.getOffsets());
        if (!response.hasMore()) {
            return CompletableFuture.completedFuture(offsets);
        }
        return recur(getLast(response.getOffsets()), offsets);
    });
}

public CompletableFuture<List<Integer>> getAll() {
    List<Integer> offsets = new ArrayList<>();
    return recur(0, offsets);
}
I don't love this, from a complexity point of view. Can we do better?
I also wanted to give EA Async a shot on this one, as it implements Java support for async/await (inspired by C#). So I just took your initial code and converted it:
public CompletableFuture<List<Integer>> getAllEaAsync() {
    int startFrom = 0;
    List<Integer> offsets = new ArrayList<>();
    for (;;) {
        // this is the only thing I changed!
        Response response = Async.await(getNext(startFrom));
        if (response.getOffsets().isEmpty()) {
            break; // we're done
        }
        offsets.addAll(response.getOffsets());
        if (!response.hasMore()) {
            break; // we're done
        }
        startFrom = getLast(response.getOffsets());
    }
    // well, you also have to wrap your result in a future to make it compilable
    return CompletableFuture.completedFuture(offsets);
}
You then have to instrument your code, for example by adding
Async.init();
at the beginning of your main() method.
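For example, a minimal sketch of where that call goes (the rest of main() is elided):

public static void main(String[] args) {
    Async.init(); // instrument the code before any async method runs
    // ... start the application and eventually call getAllEaAsync() ...
}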
I must say: this really looks like magic!
Behind the scenes, EA Async notices there is an Async.await() call within the method, and rewrites it to handle all the thenCompose()/thenApply()/recursion for you. The only requirement is that your method must return a CompletionStage or CompletableFuture.
That's really async code made easy!
For the exercise, I made a generic version of this algorithm, but it is rather complex because you need:
an initial value to call the service (the startFrom)
the service call itself (getNext())
a result container to accumulate the intermediate values (the offsets)
an accumulator (offsets.addAll(response.getOffsets()))
a condition to perform the "recursion" (response.hasMore())
a function to compute the next input (getLast(response.getOffsets()))
so this gives:
public <T, I, R> CompletableFuture<R> recur(T initialInput, R resultContainer,
        Function<T, CompletableFuture<I>> service,
        BiConsumer<R, I> accumulator,
        Predicate<I> continueRecursion,
        Function<I, T> nextInput) {
    return service.apply(initialInput)
            .thenCompose(response -> {
                accumulator.accept(resultContainer, response);
                if (continueRecursion.test(response)) {
                    return recur(nextInput.apply(response),
                            resultContainer, service, accumulator,
                            continueRecursion, nextInput);
                } else {
                    return CompletableFuture.completedFuture(resultContainer);
                }
            });
}

public CompletableFuture<List<Integer>> getAll() {
    return recur(0, new ArrayList<>(), this::getNext,
            (list, response) -> list.addAll(response.getOffsets()),
            Response::hasMore,
            r -> getLast(r.getOffsets()));
}
A small simplification of recur() is possible: the initialInput can be replaced by the CompletableFuture returned by the first call, the resultContainer and the accumulator can be merged into a single Consumer, and the service can then be merged with the nextInput function.
But this gives a slightly more complex getAll():
private <I> CompletableFuture<Void> recur(CompletableFuture<I> future,
        Consumer<I> accumulator,
        Predicate<I> continueRecursion,
        Function<I, CompletableFuture<I>> service) {
    return future.thenCompose(result -> {
        accumulator.accept(result);
        if (continueRecursion.test(result)) {
            return recur(service.apply(result), accumulator, continueRecursion, service);
        } else {
            return CompletableFuture.completedFuture(null);
        }
    });
}

public CompletableFuture<List<Integer>> getAll() {
    ArrayList<Integer> resultContainer = new ArrayList<>();
    return recur(getNext(0),
            result -> resultContainer.addAll(result.getOffsets()),
            Response::hasMore,
            r -> getNext(getLast(r.getOffsets())))
            .thenApply(unused -> resultContainer);
}
In Java, one can easily generate an infinite stream with Stream.generate(supplier). However, I would need to generate a stream that will eventually finish.
Imagine, for example, I want a stream of all files in a directory. The number of files can be huge, therefore I cannot gather all the data upfront and create a stream from them (via collection.stream()). I need to generate the sequence piece by piece. But the stream will obviously finish at some point, and terminal operations like collect() or findAny() need to work on it, so Stream.generate(supplier) is not suitable here.
Is there any reasonable easy way to do this in Java, without implementing the entire Stream interface on my own?
I can think of a simple hack - doing it with infinite Stream.generate(supplier), and providing null or throwing an exception when all the actual values are taken. But it would break the standard stream operators, I could use it only with my own operators that are aware of this behaviour.
CLARIFICATION
People in the comments are proposing the takeWhile() operator. This is not what I meant. How to phrase the question better... I am not asking how to filter (or limit) an existing stream; I am asking how to create (generate) the stream dynamically, without loading all the elements upfront, so that the stream has a finite size (unknown in advance).
SOLUTION
The code I was looking for is
Iterator it = myCustomIteratorThatGeneratesTheSequence();
StreamSupport.stream(Spliterators.spliteratorUnknownSize(it, Spliterator.DISTINCT), false);
I just looked into java.nio.file.Files to see how the list(path) method is implemented.
Is there any reasonable easy way to do this in Java, without implementing the entire Stream interface on my own?
A simple .limit() guarantees that it will terminate. But that's not always powerful enough.
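For example, a sketch of turning an otherwise infinite generated stream into a finite one:

List<Integer> firstHundred = Stream.generate(() -> ThreadLocalRandom.current().nextInt())
        .limit(100)                 // caps the infinite stream at 100 elements
        .collect(Collectors.toList());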
After the Stream factory methods, the simplest approach for creating custom stream sources without reimplementing the stream processing pipeline is subclassing java.util.Spliterators.AbstractSpliterator<T> and passing it to java.util.stream.StreamSupport.stream(Spliterator<T>, boolean).
If you're intending to use parallel streams, note that AbstractSpliterator only yields suboptimal splitting. If you have more control over your source, fully implementing the Spliterator interface can be better.
For example, the following snippet would create a Stream providing an infinite sequence 1,2,3...
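A sketch of such a spliterator, assuming it simply keeps yielding increasing integers:

Stream<Integer> naturals = StreamSupport.stream(
        new Spliterators.AbstractSpliterator<Integer>(Long.MAX_VALUE, Spliterator.ORDERED) {
            private int next = 1;

            @Override
            public boolean tryAdvance(Consumer<? super Integer> action) {
                action.accept(next++); // there is always a next element
                return true;
            }
        }, false);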
In that particular example you could use IntStream.range() instead.
But the stream will obviously finish at some point, and terminal operations like collect() or findAny() need to work on it.
Short-circuiting operations like findAny() can actually finish on an infinite stream, as long as there is any element that matches.
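For instance, this terminates even though the source stream is infinite:

Optional<Integer> firstBig = Stream.iterate(1, i -> i + 1)
        .filter(i -> i > 1_000)
        .findAny(); // short-circuits as soon as a matching element appears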
Java 9 introduces an overload of Stream.iterate with a hasNext predicate, which generates finite streams for some simple cases.
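For example, a finite ascending sequence:

Stream<Integer> oneToTen = Stream.iterate(1, i -> i <= 10, i -> i + 1); // 1, 2, ..., 10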
Here is Kotlin code to create a Stream of JsonNode from an InputStream:
private fun InputStream.toJsonNodeStream(): Stream<JsonNode> {
    return StreamSupport.stream(
        Spliterators.spliteratorUnknownSize(this.toJsonNodeIterator(), Spliterator.ORDERED),
        false
    )
}
private fun InputStream.toJsonNodeIterator(): Iterator<JsonNode> {
    val jsonParser = objectMapper.factory.createParser(this)
    return object : Iterator<JsonNode> {

        override fun hasNext(): Boolean {
            var token = jsonParser.nextToken()
            while (token != null) {
                if (token == JsonToken.START_OBJECT) {
                    return true
                }
                token = jsonParser.nextToken()
            }
            return false
        }

        override fun next(): JsonNode {
            return jsonParser.readValueAsTree()
        }
    }
}
Here is a custom, finite stream:
package org.tom.stream;

import java.util.*;
import java.util.function.*;
import java.util.stream.*;

public class GoldenStreams {

    private static final String IDENTITY = "";

    public static void main(String[] args) {
        Stream<String> stream = java.util.stream.StreamSupport.stream(new Spliterator<String>() {

            private static final int LIMIT = 25;
            private int integer = Integer.MAX_VALUE;

            {
                integer = 0;
            }

            @Override
            public int characteristics() {
                return Spliterator.DISTINCT;
            }

            @Override
            public long estimateSize() {
                return LIMIT - integer;
            }

            @Override
            public boolean tryAdvance(Consumer<? super String> arg0) {
                arg0.accept(IDENTITY + integer++);
                return integer < 25;
            }

            @Override
            public Spliterator<String> trySplit() {
                System.out.println("trySplit");
                return null;
            }
        }, false);

        List<String> peeks = new ArrayList<String>();
        List<String> reds = new ArrayList<String>();
        stream.peek(data -> {
            peeks.add(data);
        }).filter(data -> {
            return Integer.parseInt(data) % 2 > 0;
        }).peek(data -> {
            System.out.println("peekDeux:" + data);
        }).reduce(IDENTITY, (accumulation, input) -> {
            reds.add(input);
            String concat = accumulation + (accumulation.isEmpty() ? IDENTITY : ":") + input;
            System.out.println("reduce:" + concat);
            return concat;
        });
        System.out.println("Peeks:" + peeks.toString());
        System.out.println("Reduction:" + reds.toString());
    }
}
While the author has discarded the takeWhile option, I find it adequate for certain use cases and worth an explanation.
The method takeWhile can be used on any stream and will terminate the stream when the predicate provided to the method returns false. The object which results in a false is not appended to the stream; only the objects which resulted in true are passed downstream.
So one method for generating a finite stream could be to use the Stream.generate method and return a value which signals the end of the stream by being evaluated to false by the predicate provided to takeWhile.
Here's an example, generating all the permutations of an array:
public static Stream<int[]> permutations(int[] original) {
    int dim = original.length;
    var permutation = original.clone();
    int[] controller = new int[dim];
    var low = new AtomicInteger(0);
    var up = new AtomicInteger(1);
    var permutationsStream = Stream.generate(() -> {
        while (up.get() < dim) {
            if (controller[up.get()] < up.get()) {
                low.set(up.get() % 2 * controller[up.get()]);
                var tmp = permutation[low.get()];
                permutation[low.get()] = permutation[up.get()];
                permutation[up.get()] = tmp;
                controller[up.get()]++;
                up.set(1);
                return permutation.clone();
            } else {
                controller[up.get()] = 0;
                up.incrementAndGet();
            }
        }
        return null;
    }).takeWhile(Objects::nonNull);
    return Stream.concat(
            Stream.ofNullable(original.clone()),
            permutationsStream
    );
}
In this example, I used the null value to signal the end of the stream.
The caller of the method won't receive the null value!
OP could use a similar strategy and combine it with a visitor pattern.
If it's a flat directory, OP would be better off using Stream.iterate with the seed being the index of the file to yield and Stream.limit on the number of files (which can be known without browsing the directory).
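A sketch of that idea; fileCount() and fileAt(int) are hypothetical helpers standing in for however the directory exposes its size and its i-th entry:

// hypothetical helpers: fileCount() and fileAt(int) are not part of any real API here
Stream<File> files = Stream.iterate(0, i -> i + 1)
        .limit(fileCount())        // number of files, known without browsing the directory
        .map(i -> fileAt(i));      // look up the i-th file lazily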