Does the JDK provide a dummy consumer? - java

I have a need in a block of code to consume 'n' items from a stream then finish, in essence:
public static <T> void eat(Stream<T> stream, int n) {
    // consume n items of the stream (and throw them away)
}
In my situation, I can't change the signature to return Stream<T> and simply return stream.skip(n); I have to actually throw away some elements from the stream (not simple logic) - to be ready for a downstream consumer, which doesn't need to know how, or even that, this has happened.
The simplest way to do this is to use limit(n), but I have to call a stream-terminating method to activate the stream, so in essence I have:
public static <T> void skip(Stream<T> stream, int n) {
stream.limit(n).forEach(t -> {});
}
Note: This code is a gross oversimplification of the actual code and is for illustrative purposes only. Actually, limit won't work, because there is logic around what/how to consume elements. Think of it as consuming "header" elements from a stream, then having a consumer consume the "body" elements.
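The header-then-body idea can be sketched using the stream's iterator, which lets one piece of code consume a prefix while the rest stays available to a downstream consumer. This is a sketch of mine with hypothetical data, not the asker's actual code:

```java
import java.util.Iterator;
import java.util.stream.Stream;

public class HeaderBody {
    public static void main(String[] args) {
        Stream<String> stream = Stream.of("h1", "h2", "body1", "body2");
        Iterator<String> it = stream.iterator();
        // consume the "header" elements (here: a fixed count of 2, thrown away)
        for (int i = 0; i < 2 && it.hasNext(); i++) {
            it.next();
        }
        // the downstream consumer only ever sees the "body" elements
        it.forEachRemaining(System.out::println);
    }
}
```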
This question is about the "do nothing" lambda t -> {}.
Is there a "do nothing" consumer somewhere in the JDK, like the "do nothing" function Function.identity()?

No, the JDK does not provide a dummy consumer, nor other trivial predefined functions such as a dummy runnable, an always-true predicate, or a supplier that always returns zero. Just write t -> {}; it's shorter anyway than calling any ready-made method that would do the same.

Introducing the dummy (empty) consumer was considered in the scope of the ticket:
[JDK-8182978] Add default empty consumer - Java Bug System.
According to the ticket, it was decided not to introduce it.
Therefore, there is no dummy (empty) consumer in the JDK.

Yes. Well, more or less yes...
Strictly speaking, a Function is not a Consumer, but a method reference to Function.identity()'s apply method has a compatible shape, so it can stand in as a "do nothing" Consumer.
However, the compiler needs a little help to make the leap:
someStream.forEach(identity()::apply);

Related

next() vs Single() Which one is effective?

I want to convert a Flux to a Mono, but I can't decide when to use single() or next(), and I don't know which one is more efficient.
Flux<String> optionalIdsFlux = Flux
.fromIterable(result.getPersonalizationEntity())
.filter(i -> i.getKey().equals(PERSONALIZATION_KEY))
.next() // or single() ??
.map(DataEntity::getValue)
.flatMapMany(Flux::fromIterable);
They are actually quite different. next() takes the first value that is emitted and cancels the subscription afterwards.
single(), on the other hand, expects that exactly one element is emitted in the first place. If that's not the case, and zero or more than one element is emitted, then an error signal is emitted.
Which one to choose depends on your use case. If the source is guaranteed to emit exactly one element, then you can use single(). Otherwise use next().
In addition to what a better oliver said: if there is a super strong guarantee that the Publisher you have only ever emits at most one onNext, you can turn it into a Mono - without the safety belt on - by using Mono.fromDirect(Publisher).

How to perform a completable and then return a processed result with single?

I hope my question is clear enough.
Let's say your have an API that performs requests over a network backed by IO-streams (input and output).
Using RxJava (which I am new to), I would think it could be possible to do the following:
public Single<MyData> getMyDataFromApi() {
return requestMyData()
.map/flat/then()->waitAndprocessData() // here is the missing link for me. What can I use ?
.andThen()->returnData()
As you will understand, the method requestMyData returns a Completable whose sole responsibility is to perform said request (an IO-type operation).
Then, upon receiving the request, the remote entity processes it and returns the requested MyData object, again via an IO-type operation.
The key point here is that I work with streams (both input and output) whose reading and writing operations are obviously performed on separate IO threads (using Schedulers.io()).
So in the end, is there a way so that my getMyDataFromApi() method does the following:
Perform the request -> it's a Completable
Wait for the result -> something like a subscribe? but without splitting the chain
Process the result -> it's a Single, or can be a lambda in a map method
Return the result -> final element, obviously a Single
To conclude, I strongly believe that requestMyData's signature should be that of a Single, because it's a getter and I am expecting a result or an error.
Without the implementation of the methods, it is quite hard to understand the real problem.
If requestMyData returns a Completable and waitAndprocessData a Single, you can do the following:
return requestMyData().andThen(waitAndprocessData());
Anyway, remember that a Completable is a computation without any value - only an indication of completion (or an exception).

Invoking .map() on an infinite stream?

According to the Javadocs for SE 8, Stream.map() does the following:
Returns a stream consisting of the results of applying the given function to the elements of this stream.
However, a book I'm reading (Learning Network Programming with Java, Richard M. Reese) on networking implements roughly the following code snippet in an echo server.
Supplier<String> inputLine = () -> {
try {
return br.readLine();
} catch(IOException e) {
e.printStackTrace();
return null;
}
};
Stream.generate(inputLine).map((msg) -> {
System.out.println("Received: " + (msg == null ? "end of stream" : msg));
out.println("echo: " + msg);
return msg;
}).allMatch((msg) -> msg != null);
This is supposed to be a functional way to accomplish getting user input and printing it to the socket's output stream. It works as intended, but I don't quite understand how. Is it because map knows the stream is infinite, so it lazily executes as new stream tokens become available? It seems like adding something to a collection currently being iterated over by map is a little black magic. Could someone please help me understand what is going on behind the scenes?
Here is how I restated this in order to avoid the confusing map usage. I believe the author was trying to avoid an infinite loop since you can't break out of a forEach.
Stream.generate(inputLine).allMatch((msg) -> {
boolean alive = msg != null;
System.out.println("Received: " + (alive ? msg : "end of stream"));
out.println("echo: " + msg);
return alive;
});
Streams are lazy. Think of them as workers in a chain that pass buckets to each other. The laziness is in the fact that they will only ask the worker behind them for the next bucket if the worker in front of them asks them for it.
So it's best to think about this as allMatch - being a final action, thus eager - asking the map stream for the next item, and the map stream asking the generate stream for the next item, and the generate stream going to its supplier, and providing that item as soon as it arrives.
It stops when allMatch stops asking for items. And it does so when it knows the answer. Are all items in this stream not null? As soon as the allMatch receives an item that is null, it knows the answer is false, and will finish and not ask for any more items. Because the stream is infinite, it will not stop otherwise.
So you have two factors causing this to work the way it works - one is allMatch eagerly asking for the next item (as long as the previous ones weren't null), and the other is the generate stream, which - in order to supply that next item - may need to wait for its supplier, which in turn waits for the user to send more input.
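To see the short-circuiting concretely, here is a minimal self-contained sketch, with a finite iterator standing in for user input (a substitution of mine - the real code blocks on a BufferedReader instead):

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.stream.Stream;

public class ShortCircuit {
    public static void main(String[] args) {
        // Simulated input: two lines, then null as the "end of stream" marker.
        Iterator<String> input = Arrays.asList("a", "b", null).iterator();
        boolean all = Stream.generate(input::next)
                .allMatch(msg -> msg != null); // stops pulling once it sees null
        System.out.println(all); // prints false
    }
}
```

Even though Stream.generate produces an infinite stream, allMatch stops requesting elements the moment the null arrives, so the supplier is never asked for a fourth item.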
But it should be said that map shouldn't have been used here. There should not be side effects in map - it should be used for mapping an item of one type to an item of another type. I think this example was used only as a study aid. The much simpler and straightforward way would be to use BufferedReader's method lines() which gives you a finite Stream of the lines coming from the buffered reader.
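A minimal sketch of the lines() approach, with a StringReader standing in for the socket's reader (my substitution, to keep the example self-contained):

```java
import java.io.BufferedReader;
import java.io.StringReader;

public class EchoLines {
    public static void main(String[] args) {
        // StringReader stands in for the socket's input stream here.
        BufferedReader br = new BufferedReader(new StringReader("hello\nworld"));
        // lines() yields a finite Stream that simply ends at EOF -
        // no null sentinel and no allMatch trick required.
        br.lines().forEach(msg -> System.out.println("echo: " + msg));
    }
}
```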
Yes - Streams are set up lazily unless and until you perform a terminal operation (final action) on the Stream. Or, put more simply:
For as long as the operations on your stream return another Stream, you do not have a terminal operation, and you keep on chaining until you have something returning anything other than a Stream, including void.
This makes sense, as to be able to return anything other than a Stream, the operations earlier in your stream will need to be evaluated to actually be able to provide the data.
In this case, and as per the documentation, allMatch returns a boolean, and thus execution of your stream is required to calculate that boolean. This is also the point where you provide a Predicate limiting your resulting Stream.
Also note that in the documentation it states:
This is a short-circuiting terminal operation.
Follow that link for more information on those terminal operations, but a terminal operation basically means that it will actually execute the operation. Additionally, the limiting of your infinite stream is the 'short-circuiting' aspect of that method.
Here are the two most relevant parts of the java-stream documentation. The snippet you provided is a perfect example of them working together:
The documentation of Stream::generate(Supplier<T> s) says that it:
Returns an infinite sequential unordered stream where each element is generated by the provided Supplier.
The 3rd bullet of the Stream package documentation:
Laziness-seeking. Many stream operations, such as filtering, mapping, or duplicate removal, can be implemented lazily, exposing opportunities for optimization. For example, "find the first String with three consecutive vowels" need not examine all the input strings. Stream operations are divided into intermediate (Stream-producing) operations and terminal (value- or side-effect-producing) operations. Intermediate operations are always lazy.
In short, the generated stream awaits further elements until the terminal operation is reached. As long as the supplied Supplier<T> keeps producing values, the stream pipeline continues.
As an example, if you provide the following Supplier, the execution has no chance to stop and will continue infinitely:
Supplier<String> inputLine = () -> {
return "Hello world";
};

Mono<T> and Flux<T> as a parameter in function

What is the use case for Mono<T> and Flux<T> as a parameter of a function?
Code
Flux<String> findByLastName(Mono<String> lastname) {
//implementation
}
When I invoke the above method from REST, what will be the difference compared to just using String as the parameter?
To answer your first comment question:
@ErwinBolwidt I know the use case for Mono/Flux in computation, but I don't understand specifically using it as a method parameter – Bibek Shakya
When you use it as a parameter, you have to deal with it as a stream (meaning you don't have the value yet); for example, you should never call lastname.block(), because that blocks the thread until the value is available.
Disclaimer / extra information:
If you're asking whether you should wrap everything from now on in a Mono or a Flux, then of course not, because it adds unnecessary complexity for the method and the caller.
And from a design perspective, the answer is simple; just ask the basic questions:
When to use a Mono in general?
Well, when you don't have the value yet.
When to use a Flux in general?
Well, when you have a stream of data coming (or not).
So we should not think about who is using the method and try to make the method convenient for them; instead, we should think about what the method needs.
A use case for this is when the method actually needs its argument in this form, i.e. you actually do stream processing inside. For example, your method accepts an infinite stream of sensor data, and inside it does something like:
Flux<Point> processSensor(Flux<Double> data){
return data.filter(blabla).zipWith(blabla).map(...);
}
The only use cases I can think of where a method parameter is Mono<String> lastname:
It was retrieved from a WebClient/Router type function
@Secured("ROLE_EVERYONE") was used in a previous method to retrieve the lastname
For this to work, the return type of the method must be an org.reactivestreams.Publisher (i.e. Mono/Flux).

Java 8 apply function to all elements of Stream without breaking stream chain

Is there a way in Java to apply a function to all the elements of a Stream without breaking the Stream chain? I know I can call forEach, but that method returns a void, not a Stream.
There are (at least) 3 ways. For the sake of example code, I've assumed you want to call 2 consumer methods methodA and methodB:
A. Use peek():
list.stream().peek(x -> methodA(x)).forEach(x -> methodB(x));
Although the docs say to use it only for debugging, it works (and it's in production right now).
B. Use map() to call methodA, then return the element back to the stream:
list.stream().map(x -> {methodA(x); return x;}).forEach(x -> methodB(x));
This is probably the most "acceptable" approach.
C. Do two things in the forEach():
list.stream().forEach(x -> {methodA(x); methodB(x);});
This is the least flexible and may not suit your need.
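The three approaches above can be run side by side; methodA and methodB here are stand-in println methods of mine:

```java
import java.util.List;

public class ThreeWays {
    static void methodA(String s) { System.out.println("A: " + s); }
    static void methodB(String s) { System.out.println("B: " + s); }

    public static void main(String[] args) {
        List<String> list = List.of("x", "y");
        // A. peek() for the first consumer, forEach for the second
        list.stream().peek(ThreeWays::methodA).forEach(ThreeWays::methodB);
        // B. map() that calls the consumer and returns the element unchanged
        list.stream().map(x -> { methodA(x); return x; }).forEach(ThreeWays::methodB);
        // C. both calls inside a single forEach
        list.stream().forEach(x -> { methodA(x); methodB(x); });
    }
}
```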
You are looking for the Stream's map() function.
example:
List<String> strings = stream
.map(Object::toString)
.collect(ArrayList::new, ArrayList::add, ArrayList::addAll);
The best option you have is to apply map to your stream, which returns a stream consisting of the results of applying the given function to the elements of the stream.
For example:
IntStream.range(1, 100)
.boxed()
.map(item->item+3)
.map(item->item*2)...
We are applying several modifications to the stream, but in some cases we don't want to modify the stream - we just want to visit every element and then pass it down the stream without modification (like the peek() method in the Streams API). In such cases, we can write:
StreamItem peekyMethod(StreamItem streamItemX) {
// .... visit the streamItemX
//Then pass it down the stream
return streamItemX;
}
Not entirely sure what you mean by breaking the stream chain, but any operation on a Stream that returns a Stream will not break or consume your Stream. Streams are consumed by terminal operations, and, as you noted, forEach does not return a Stream<T> and as such ends the stream by executing all the intermediate operations before the forEach, plus the forEach itself.
In the example that you provided in the comments:
myStream.map(obj -> {obj.foo(); return obj;})
You can't really do this with a one-liner. Of course you could use a method reference, but then your returned Stream would be of a different type (assuming foo returns a type):
myStream.map(Obj::foo) // this will turn into Stream<T>, where T is
// the return type of foo, instead of Stream<Obj>
Besides that, your map operation is stateful, which is strongly discouraged. Your code will compile and might even work as you want it to - but it might fail later. map operations should be stateless.
You can use the map method, but you have to create a helper method which returns its input. For example:
public class Fluent {
public static <T> Function<T, T> of(Consumer<T> consumer) {
return t -> {
consumer.accept(t);
return t;
};
}
}
And use it when you want to call a void method (note that map is lazy, so a terminal operation is still needed for the consumer to actually run):
list.stream().map(Fluent.of(SomeClass::method));
or, if you want to use it with a method taking some argument:
list.stream().map(Fluent.of(x -> x.method("hello")))
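A complete runnable version of the Fluent idea, with a terminal operation added so the consumer actually executes:

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.stream.Collectors;

public class FluentDemo {
    // same helper as in the answer above
    static <T> Function<T, T> of(Consumer<T> consumer) {
        return t -> {
            consumer.accept(t);
            return t;
        };
    }

    public static void main(String[] args) {
        List<String> result = List.of("a", "b").stream()
                .map(of(s -> System.out.println("visited: " + s)))
                .collect(Collectors.toList()); // terminal op makes map actually run
        System.out.println(result);
    }
}
```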
I think you are looking for Stream.peek. But read the docs carefully, as it was designed mainly as a debugging aid. From the docs:
This method exists mainly to support debugging, where you want to see the elements as they flow past a certain point in a pipeline
The action passed to peek must also be non-interfering.
I think the cleanest way is to add a mutator to the objects in the stream.
For example,
class Victim {
    private String tag;

    Victim withTag(String t) {
        this.tag = t;
        return this;
    }
}
List<Victim> base = List.of(new Victim());
Stream<Victim> transformed = base.stream().map(v -> v.withTag("myTag"));
If you prefer (and many will), you can have the withTag method create and return a new Victim; this allows you to make Victim immutable.
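If you take the immutable route, a sketch might look like this (ImmutableVictim is a hypothetical name of mine):

```java
public class ImmutableVictim {
    private final String tag;

    ImmutableVictim(String tag) {
        this.tag = tag;
    }

    // returns a fresh instance instead of mutating, keeping the class immutable
    ImmutableVictim withTag(String t) {
        return new ImmutableVictim(t);
    }

    String tag() {
        return tag;
    }

    public static void main(String[] args) {
        ImmutableVictim v = new ImmutableVictim(null).withTag("myTag");
        System.out.println(v.tag()); // prints myTag
    }
}
```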
