In Project Reactor there is a Flux#sample method (see its Javadoc). It transforms a flux so that it emits only the latest element at the end of each specified period.
Is it possible to tweak this behaviour and achieve the following: emit the first element instantly, then sample from the second element onwards. Basically I want to exclude the first (and only the first) element from sampling so that it is emitted without the initial wait.
Would it be possible to achieve this using built-in operators? If not, does anybody have an idea how to approach the problem?
Here is the simplest example of what I want to achieve:
Flux<String> inputFlux = Flux.just("first", "second", "third").delayElements(Duration.ofMillis(400));
Flux<String> transformed = /*do some magic with input flux*/;
StepVerifier.create(transformed)
    .expectNext("first") // first should always be emitted instantly
    // second arrives 400ms after first
    // third arrives 400ms after second
    .expectNoEvent(Duration.ofSeconds(1))
    .expectNext("third") // after the sample period the last received element should be emitted
    .verifyComplete();
By turning the source flux myFlux into a hot flux, you can easily achieve this:
Flux<T> myFlux;
Flux<T> sharedFlux = myFlux.publish().refCount(2);
Flux<T> first = sharedFlux.take(1);
Flux<T> sampledRest = sharedFlux.skip(1).sample(Duration.ofMillis(whatever));
return Flux.merge(first, sampledRest);
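For reference, a minimal runnable sketch of that approach applied to the question's inputFlux (the 1-second sample period, the class name, and the console printing are illustrative choices, not part of the original answer):

import java.time.Duration;
import reactor.core.publisher.Flux;

public class SampleAfterFirstExample {
    public static void main(String[] args) {
        Flux<String> inputFlux = Flux.just("first", "second", "third")
                .delayElements(Duration.ofMillis(400));

        // Share the source so that take(1) and skip(1) both see the same live elements.
        Flux<String> sharedFlux = inputFlux.publish().refCount(2);
        Flux<String> first = sharedFlux.take(1);
        Flux<String> sampledRest = sharedFlux.skip(1).sample(Duration.ofSeconds(1));

        // "first" is emitted as soon as it arrives; the remaining elements are sampled.
        Flux.merge(first, sampledRest)
                .doOnNext(System.out::println)
                .blockLast();
    }
}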
You could also achieve it with the Flux#sample(org.reactivestreams.Publisher<U>) method:
yourFlux.take(1)
    .mergeWith(yourFlux.sample(Flux.interval(yourInterval)
        .delaySubscription(yourFlux.take(1))))
Is there a difference between:
flux2 = flux1.repeat().map(x -> x + 1).repeat();
and
flux2 = flux1.repeat().map(x -> x + 1);
As the Flux never completes because of the first repeat(), the second repeat() will never have a chance to resubscribe so it's basically a no-op.
However, note that this is not always the case for any two repeats within a Flux chain. If between the first and the second repeat there is an operator which would complete the publisher based on some condition (e.g. take, takeUntil), then the second repeat would make the otherwise finite Flux infinite.
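For illustration, a small sketch of that caveat (the concrete fluxes here are just hypothetical stand-ins for flux1):

import reactor.core.publisher.Flux;

public class RepeatExample {
    public static void main(String[] args) {
        // No completing operator between the repeats: the upstream never completes,
        // so the trailing repeat() is a no-op and this is just an infinite stream of 2s.
        Flux<Integer> noOpSecondRepeat = Flux.just(1).repeat().map(x -> x + 1).repeat();
        noOpSecondRepeat.take(3).subscribe(System.out::println); // 2, 2, 2

        // take(2) completes the otherwise infinite first repeat(), so the second repeat()
        // gets to resubscribe and makes the whole chain infinite again.
        Flux<Integer> infiniteAgain = Flux.just(1).repeat().take(2).map(x -> x + 1).repeat();
        infiniteAgain.take(6).subscribe(System.out::println); // 2 printed six times
    }
}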
Context:
I have a storage/DB where I store a <String, List<Integer>> mapping, i.e. <uniqueName, numbersPresentForTheName>.
I am writing an API that does the following:
How can we make sure this does not happen?
The usual answer is that you use a compare-and-update operation, rather than just an update operation.
t = 1 -> call readStorage
t = 10 -> get response from readStorage: <ABCD, [1,2,3]>
t = 11 -> compareAndUpdateStorage ABCD, [1,2,3] -> [1,2,3,4]

t = 5 -> Request 1 to the API: <ABCD, [5]>
t = 6 -> call readStorage
t = 11 -> get response from readStorage: <ABCD, [1,2,3]> (note that we didn't get 4, since the first call hadn't written it yet)
t = 13 -> compareAndUpdateStorage ABCD, [1,2,3] -> [1,2,3,5] (this call will fail, because the current value is [1,2,3,4])
In other words, we're trying to address the lost edit problem by ensuring that the first edit always wins.
That's the first piece; the rest of the work is choosing an appropriate retry strategy.
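A minimal in-memory sketch of that compare-and-update-plus-retry idea (the ConcurrentHashMap here is a hypothetical stand-in for the real storage, which would expose its own conditional-update primitive):

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CompareAndUpdateExample {

    // Stand-in for the real storage: name -> immutable list of numbers.
    static final Map<String, List<Integer>> storage = new ConcurrentHashMap<>();

    // Adds a number for the given name, retrying if another writer
    // changed the value between our read and our update.
    static void addNumber(String name, int number) {
        while (true) {
            List<Integer> current = storage.get(name);
            if (current == null) {
                // putIfAbsent is the "compare" for the missing-key case.
                if (storage.putIfAbsent(name, List.of(number)) == null) {
                    return;
                }
            } else {
                List<Integer> updated = new ArrayList<>(current);
                updated.add(number);
                // replace(key, expected, new) only succeeds if the stored value
                // is still the one we read; otherwise we loop and retry.
                if (storage.replace(name, current, List.copyOf(updated))) {
                    return;
                }
            }
        }
    }

    public static void main(String[] args) {
        addNumber("ABCD", 4);
        addNumber("ABCD", 5);
        System.out.println(storage.get("ABCD")); // [4, 5]
    }
}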
If your two calls are on separate threads, and the steps of the two calls are interleaved over time, then you are seeing correct behavior.
If both threads retrieved a set of three values, and both threads replaced the existing set in storage with a new set of four values, then whichever thread saves their set last wins.
The key idea to understand here is atomic actions. The retrieval of the data and the update of the data in your system are separate actions. To achieve your aim, you must combine those two actions into one. This combining is the purpose of transactions and locks in databases.
See also the correct answer by VoiceOfUnreason along the same lines.
Recently I started using Project Reactor 3.3 and I don't know the best way to handle a flux of lines where the first line contains the column names, which are then used to process/convert all other lines. Right now I'm doing it this way:
Flux<String> lines = ....;
Mono<String[]> columns = Mono.from(lines.take(1).map(header -> header.split(";"))); //getting first line
Flux<SomeDto> objectFlux = lines.skip(1) //skip first line
.flatMapIterable(row -> //iterating over lines
columns.map(cols -> convert(cols, row))); //convert line into SomeDto object
So is it the right way?
There's always more than one way to cook an egg - but the code you have there seems odd / suboptimal for two main reasons:
I'd assume it's one line per record / DTO you want to extract, so it's a bit odd you're using flatMapIterable() rather than flatMap()
You're going to resubscribe to lines once for each line, when you re-evaluate that Mono. That's almost certainly not what you want to do. (Caching the Mono helps, but you'd still resubscribe at least twice.)
Instead you may want to look at using switchOnFirst(), which will enable you to dynamically transform the Flux based on the first element (the header, in your case). This means you can do something like this:
lines
    .switchOnFirst((signal, flux) -> flux
        .skip(1) // drop the header line itself so it isn't converted as a data row
        .zipWith(Flux.<String[]>just(signal.get().split(";")).repeat()))
    .map(row -> convert(row.getT1(), row.getT2()))
Note this is a bare-bones example; in real-world use you'll need to check whether the signal actually has a value, as per the docs:
Note that the source might complete or error immediately instead of emitting, in which case the Signal would be onComplete or onError. It is NOT necessarily an onNext Signal, and must be checked accordingly.
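Putting that together, here is a self-contained sketch with the hasValue check in place (SomeDto and convert() are hypothetical stand-ins for the question's own types):

import reactor.core.publisher.Flux;

public class CsvHeaderExample {

    // Hypothetical stand-in for the question's DTO.
    static class SomeDto {
        final String description;
        SomeDto(String description) { this.description = description; }
        @Override public String toString() { return description; }
    }

    // Hypothetical convert(): pairs each cell of a row with its column name.
    static SomeDto convert(String[] cols, String row) {
        String[] cells = row.split(";");
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < cols.length && i < cells.length; i++) {
            sb.append(cols[i]).append('=').append(cells[i]).append(' ');
        }
        return new SomeDto(sb.toString().trim());
    }

    public static void main(String[] args) {
        Flux<String> lines = Flux.just("name;value", "a;1", "b;2");

        Flux<SomeDto> objectFlux = lines.switchOnFirst((signal, flux) -> {
            if (!signal.hasValue()) {
                // Source completed or errored before emitting a header: nothing to convert.
                return flux.ofType(SomeDto.class);
            }
            String[] cols = signal.get().split(";");
            return flux.skip(1)                                  // drop the header line itself
                    .zipWith(Flux.<String[]>just(cols).repeat()) // pair every row with the header
                    .map(t -> convert(t.getT2(), t.getT1()));
        });

        objectFlux.subscribe(System.out::println); // prints name=a value=1, then name=b value=2
    }
}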
I am learning Reactor with Java 11. I have seen this example:
StepVerifier.withVirtualTime(() -> Flux.interval(Duration.ofSeconds(1)).take(3600))
    .expectSubscription()
    .expectNextCount(3600);
This example just checks that, with a Flux<Long> that emits an incrementing value every second for an hour, the final count is 3600.
But is there any way to check the counter repeatedly, after every second?
I know this:
.expectNoEvent(Duration.ofSeconds(1))
.expectNext(0L)
.thenAwait(Duration.ofSeconds(1))
But I have seen no way to repeatedly check this after every second, like:
.expectNoEvent(Duration.ofSeconds(1))
.expectNext(i)
.thenAwait(Duration.ofSeconds(1))
where i increments up to 3600. Is there a way?
PS:
I tried adding verifyComplete() at the end of a long-running test and it never ends. Do I have to include it, or can I just ignore it?
You can achieve what you want by using expectNextSequence. You have to pass an Iterable containing every element you expect to arrive. See my example below:
var longRange = LongStream.range(0, 3600)
        .boxed()
        .collect(Collectors.toList());

StepVerifier
        .withVirtualTime(() -> Flux.interval(Duration.ofSeconds(1)).take(3600))
        .expectSubscription()
        .thenAwait(Duration.ofHours(1))
        .expectNextSequence(longRange)
        .expectComplete().verify();
If you don't add verifyComplete() or expectComplete().verify(), then the JUnit test won't wait for elements to arrive from the flux and will just terminate.
For further reference see the JavaDoc of verify():
this method will block until the stream has been terminated
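If you really do want a per-second assertion for each value, one option (a sketch under the same virtual-time setup) is to build the expectations in a loop, since each expect* call returns the Step to continue from:

import java.time.Duration;
import reactor.core.publisher.Flux;
import reactor.test.StepVerifier;

public class PerSecondAssertionExample {
    public static void main(String[] args) {
        StepVerifier.Step<Long> step = StepVerifier
                .withVirtualTime(() -> Flux.interval(Duration.ofSeconds(1)).take(3600))
                .expectSubscription();

        // Advance virtual time second by second and assert each value in turn.
        for (long i = 0; i < 3600; i++) {
            step = step.expectNoEvent(Duration.ofSeconds(1))
                    .expectNext(i);
        }

        step.verifyComplete();
    }
}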
The idea is that when I call publishSubject.onNext(someValue) multiple times, I need to get only one value, like the debounce operator does, but debounce delivers the last value; I need to skip all values except the first in a burst, until I stop calling onNext() for 1 second.
I've tried using something like throttleFirst(1000, TimeUnit.MILLISECONDS), but it doesn't work like debounce: it just opens a window after each delivery and, after 1 second, immediately delivers the next value.
Try this:
// Observable<T> stream = ...;
stream.window(stream.debounce(1, TimeUnit.SECONDS))
      .flatMap(w -> w.take(1));
Explanation: if I understand you correctly, you want to emit an item only if none has been emitted for the preceding 1 second. This is equivalent to getting the first element following an item debounced by 1 second.
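For completeness, a small runnable sketch of that trick (assuming RxJava 2; the subject, values, and sleeps are just there to simulate bursts):

import java.util.concurrent.TimeUnit;
import io.reactivex.Observable;
import io.reactivex.subjects.PublishSubject;

public class FirstInBurstExample {
    public static void main(String[] args) throws InterruptedException {
        PublishSubject<String> subject = PublishSubject.create();

        // The debounced stream fires once no value has arrived for 1 second and is used
        // only as a window boundary; we then keep just the first element of each window.
        Observable<String> firstOfEachBurst = subject
                .window(subject.debounce(1, TimeUnit.SECONDS))
                .flatMap(window -> window.take(1));

        firstOfEachBurst.subscribe(v -> System.out.println("emitted: " + v));

        // First burst: only "a" should be emitted.
        subject.onNext("a");
        subject.onNext("b");
        subject.onNext("c");
        Thread.sleep(1500);

        // Second burst: only "d" should be emitted.
        subject.onNext("d");
        subject.onNext("e");
        Thread.sleep(1500);
    }
}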
You can use the first operator, like:
Observable.first()
It will take only the first value.