What's the difference between RxJava 2's Maybe and Optional?

The doc says
Conceptually, it is a union of Single and Completable providing the
means to capture an emission pattern where there could be 0 or 1 item
or an error signalled by some reactive source.
But I am not sure what it truly means. It seems to be Java 8's Optional.
The following two tests produce the same result, but I don't know what Maybe can do that Optional cannot do (or can only do awkwardly).
@Test
public void testMaybe1() {
    Observable.just(3, 2, 1, 0, -1)
            .map(i -> {
                try {
                    int result = 6 / i;
                    return Maybe.just(result);
                } catch (Exception e) {
                    return Maybe.empty();
                }
            })
            .blockingForEach(maybe -> {
                logger.info("result = {}", maybe.blockingGet());
            });
}
@Test
public void testMaybe2() {
    Observable.just(3, 2, 1, 0, -1)
            .map(i -> {
                try {
                    int result = 6 / i;
                    return Optional.of(result);
                } catch (Exception e) {
                    return Optional.empty();
                }
            })
            .blockingForEach(opt -> {
                logger.info("result = {}", opt.orElse(null));
            });
}
The results are the same:
result = 2
result = 3
result = 6
result = null
result = -6
In RxJava 1 my API used to return Observable<Optional<T>>. Is that a bad smell? Should I change it to Observable<Maybe<T>>?

Maybe is a wrapper around an operation/event that may have either
A single result
No result
Error result
However Optional is a wrapper around a value that may either be
Present
Absent
In your example, the computation inside the map operation is synchronous (i.e. 6 / i is synchronous and can produce a value immediately) and you want to propagate either a value (if division is possible) or an empty value (if division is not possible). Hence using Optional makes more sense.
There are, however, other options:
If you want to propagate why division is not possible, then you would want to report the exception that occurred. In that case Maybe makes more sense (see the sketch after the code below).
If you are not interested in either the empty value or the reason for the error, then you simply want to skip propagating those results. In that scenario I would use flatMap instead of map, and then neither Optional nor Maybe is needed.
.flatMap(i -> {
    try {
        int result = 6 / i;
        return Observable.just(result);
    } catch (Exception e) {
        return Observable.empty();
    }
})
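For the first option, a hedged sketch of the same chain that keeps the failure cause by returning Maybe.error instead of Maybe.empty:
.map(i -> {
    try {
        return Maybe.just(6 / i);
    } catch (ArithmeticException e) {
        return Maybe.<Integer>error(e); // the reason for the missing value travels downstream
    }
})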
Maybe is also useful when you have an Observable that can emit multiple values but you are interested in, say, only the first one, so you use the firstElement() operator on the Observable. This returns a Maybe because either there is a single value, there is no value (if the source Observable completes without emitting anything), or there is an error (if the source Observable errors before emitting any value).
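A small sketch of that case (the values are illustrative):
Maybe<Integer> first = Observable.just(1, 2, 3)
        .filter(i -> i > 1)
        .firstElement(); // succeeds with 2; would be empty if nothing passed the filter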

Maybe is a lazy stream of zero or one items (and, being a stream, it can also terminate with an error). Optional is not lazy; it is simply either present or absent. There is no sense of deferred computation with an Optional, whereas there is with Maybe.

The difference relevant to your question is that Maybe can propagate an error while Optional cannot - in your example you cannot distinguish between an error and an empty result. If error handling is important, Optional is useless, while Maybe has Maybe.error(Throwable). API-wise, for your use case I would prefer Single to Maybe, because it yields either an error or exactly one result, so the return type would be Observable<Single<T>>.
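To make the distinction concrete, consuming a Maybe lets you react to all three outcomes separately, which a plain Optional cannot express (a sketch; maybe and logger are placeholders):
maybe.subscribe(
        value -> logger.info("result = {}", value),      // onSuccess: exactly one item
        error -> logger.warn("division failed", error),  // onError: the cause is preserved
        () -> logger.info("no result"));                 // onComplete: empty, no error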

RxJava 2 targets Java 6. This means built-in Optional support is not guaranteed, so the library has to bring its own equivalent, similar to how it has to bring its own Function types.
If your application/library only supports Java >= 8 you can use whatever suits you better.

Related

Avoid isPresent / get dance in Java

Say I have a piece of code that will try a few ways to find some value and, if unsuccessful, log all the ways that it tried.
Example:
public Optional<Integer> getFooOpt() {
    Optional<Integer> fooOpt = Optional.empty();
    List<String> needles = new ArrayList<>();
    Optional<Integer> barOpt = getBarOpt();
    if (barOpt.isPresent()) {
        Integer bar = barOpt.get();
        fooOpt = getFooOptByBar(bar);
        if (!fooOpt.isPresent()) {
            needles.add("bar " + bar);
        }
    }
    if (!fooOpt.isPresent()) {
        Optional<Integer> quxOpt = getQuxOpt();
        if (quxOpt.isPresent()) {
            Integer qux = quxOpt.get();
            fooOpt = getFooOptByQux(qux);
            if (!fooOpt.isPresent()) {
                needles.add("qux " + qux);
            }
        }
    }
    if (!fooOpt.isPresent()) {
        log.error("Not found by {}", needles);
    }
    return fooOpt;
}
Is there a way to restructure this code to avoid all the isPresent / get noise with Optionals here to make the code easier to read / follow?
If you're heavily using Optional, you can try nesting Optional#orElse.
Assuming you have several methods try{something}ToComputeValue that all return Optional<Integer>:
Integer value = tryFooToComputeValue()
        .orElse(tryBarToComputeValue()
                .orElse(tryBazToComputeValue()
                        .orElseThrow(() -> new CannotComputeValue())));
That takes care of daisy chaining those. For logging, you could have each method log when it returns Optional.empty() (has failed to find the value). This doesn't fully match your desired behavior of logging only one statement with all methods. If you really need this, you can probably play with Optional#map or similar so that you add method names to needles when they fail to return a value.
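One hedged variation, assuming Java 9 or later is available: Optional#or takes a supplier, so the fallback methods are only invoked when the previous lookups came up empty (method and exception names reused from the answer above):
Integer value = tryFooToComputeValue()
        .or(() -> tryBarToComputeValue())   // only invoked if the previous Optional was empty
        .or(() -> tryBazToComputeValue())
        .orElseThrow(() -> new CannotComputeValue());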
You can simplify this by moving some of the logic into support methods. You don't care about the barOpt and quxOpt references here - all you care about is whether they can get you that sweet, sweet fooOpt goodness. I think this is a candidate for OptionalInt rather than Optional. Without the error log message construction you have something like:
OptionalInt fooOpt = getFooOptByBarOpt();
if (fooOpt.isPresent()) {
    return fooOpt;
}
fooOpt = getFooOptByQuxOpt();
if (fooOpt.isPresent()) {
    return fooOpt;
}
return OptionalInt.empty();
Looking at the error logging there doesn't seem to be a reason to compile a List of failed attempts. You only log if you didn't find a value and if you didn't find a value you know exactly what you attempted.
if (!fooOpt.isPresent()) {
    logger.error("Foo not found by bar or qux");
}
return fooOpt;
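Putting the two fragments together, the whole method might read like this (a sketch; getFooOptByBarOpt and getFooOptByQuxOpt are the hypothetical support methods suggested above):
public OptionalInt getFooOpt() {
    OptionalInt fooOpt = getFooOptByBarOpt();   // looks up bar and maps it to foo, or returns empty
    if (!fooOpt.isPresent()) {
        fooOpt = getFooOptByQuxOpt();           // second attempt via qux
    }
    if (!fooOpt.isPresent()) {
        logger.error("Foo not found by bar or qux");
    }
    return fooOpt;
}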

Ignore exception in stream operations

Assume you have an exception (checked or unchecked) in a stream operation
and you want to ignore that element from then on.
The stream must not be aborted; elements that throw exceptions should simply be ignored.
I explicitly avoid saying "skip", because skip is itself a stream operation.
The example uses the map() operation for demonstration.
Here I have a division by zero (for example), so the map should drop this
element.
As an example:
@Test
public void ignoreException() {
    assertThat(Stream.of(1, 2, 1, 3).map(i -> 10 / i).reduce(0, Integer::sum), is(28));
    // the zero will break the next stream
    assertThat(Stream.of(1, 2, 0, 3).map(i -> 10 / i).reduce(0, Integer::sum), is(18));
}
So the division by zero can break the whole stream.
I found a lot of articles that wrap a checked exception in a runtime exception (throw new RuntimeException(ex)).
Or partial vs. total functions.
Or I made a wrapper returning a java.util.function.Function
(e.g. ....map(wrapper(i -> 10 / i))...)
that returns null in case of an exception. But then a downstream operation may fail,
as in my example (reduce).
The only useful approach I have found is an "Either" concept (a stream of Eithers),
where the division by zero in my example
becomes a "left" and can be handled differently.
There are relatively few operations on streams that can achieve a transformation of elements and result in elements being dropped -- in fact, there's really only one, flatMap.
So your wrapper more or less has to look like
interface CanThrow<F, T> { T apply(F from) throws Exception; }

<T, R> Function<T, Stream<R>> wrapper(CanThrow<T, R> fn) {
    return t -> {
        try {
            return Stream.of(fn.apply(t));
        } catch (Exception ignored) {
            return Stream.empty();
        }
    };
}
assertThat(Stream.of(1, 2, 0, 3).flatMap(wrapper(i -> 10 / i)).reduce(0, Integer::sum))
        .isEqualTo(18);
Try this:
@Test
public void ignoreException() {
    assertThat(Stream.of(1, 2, 1, 3).map(i -> i == 0 ? 0 : 10 / i).reduce(0, Integer::sum), is(28));
    // the zero no longer breaks the stream
    assertThat(Stream.of(1, 2, 0, 3).map(i -> i == 0 ? 0 : 10 / i).reduce(0, Integer::sum), is(18));
}

java reactor filterAndMap?

I want to use Reactor to achieve:
for (val worker : getWorkers(request)) {
    val response = worker.tryDo(work);
    if (response != null) {
        return response;
    }
}
return null;
getWorkers can be changed to return Flux<Worker>, and tryDo can also be changed to return a Mono.
The key point is that I want exactly zero or one Response, and the next worker should only be tried if the current worker.tryDo fails.
Which operator do I need? I cannot find an answer in the documentation.
Assuming you can rework tryDo to return a Mono that is empty when there is no result, instead of returning null, you could use getWorkers(request).flatMap(worker -> worker.tryDo(work), 1).next().
The 1 parameter to flatMap instructs it to only consider workers one by one.
Workers that return an empty Mono effectively do not influence the output of flatMap.
Flux.next() converts to a Mono by discarding the elements past the first one and cancelling the source.
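Formatted out, that suggestion might look like the following sketch (assuming tryDo has been reworked to return Mono<Response>):
Mono<Response> result = getWorkers(request)
        .flatMap(worker -> worker.tryDo(work), 1) // concurrency 1: workers are tried one at a time
        .next();                                  // completes with the first response, cancelling the rest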
I found an answer on Gitter, which comes from @OlegDokuka:
Mono.fromDirect(workersFlux.concatMap(worker ->
Mono.justOrEmpty(worker.tryDo(work))).take(1))
Update:
Thanks to @Simon Baslé: use singleOrEmpty instead of fromDirect.
workersFlux.concatMap(worker -> Mono.justOrEmpty(worker.tryDo(work))).take(1).singleOrEmpty()

takeWhile() working differently with flatMap

I am creating snippets with takeWhile to explore its possibilities. When used in conjunction with flatMap, the behaviour is not in line with the expectation. Please find the code snippet below.
String[][] strArray = {{"Sample1", "Sample2"}, {"Sample3", "Sample4", "Sample5"}};
Arrays.stream(strArray)
        .flatMap(indStream -> Arrays.stream(indStream))
        .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
        .forEach(ele -> System.out.println(ele));
Actual Output:
Sample1
Sample2
Sample3
Sample5
Expected Output:
Sample1
Sample2
Sample3
The reason for the expectation is that takeWhile should keep taking elements only until the predicate first fails (here, once "Sample4" is reached). I have also added print statements inside flatMap for debugging. The inner streams are created just twice, which is in line with the expectation.
However, this works just fine without flatMap in the chain.
String[] strArraySingle = {"Sample3", "Sample4", "Sample5"};
Arrays.stream(strArraySingle)
        .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
        .forEach(ele -> System.out.println(ele));
Actual Output:
Sample3
Here the actual output matches the expected output.
Disclaimer: these snippets are just for practice and do not serve any particular use case.
Update:
Bug JDK-8193856: the fix will be available as part of JDK 10.
The change corrects WhileOps Sink::accept:
@Override
public void accept(T t) {
    if (take = predicate.test(t)) {
        downstream.accept(t);
    }
}
Changed Implementation:
@Override
public void accept(T t) {
    if (take && (take = predicate.test(t))) {
        downstream.accept(t);
    }
}
This is a bug in JDK 9 - from issue #8193856:
takeWhile is incorrectly assuming that an upstream operation supports and honors cancellation, which unfortunately is not the case for flatMap.
Explanation
If the stream is ordered, takeWhile should show the expected behavior. This is not entirely the case in your code because you use forEach, which waives order. If you care about order, which you do in this example, you should use forEachOrdered instead. Funny thing: that doesn't change anything. 🤔
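For reference, that variant only swaps the terminal operation (and, as noted above, it still prints Sample5 on the affected JDK 9 builds):
Arrays.stream(strArray)
        .flatMap(indStream -> Arrays.stream(indStream))
        .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
        .forEachOrdered(System.out::println); // order is honored, yet Sample5 still shows up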
So maybe the stream isn't ordered in the first place? (In that case the behavior is ok.) If you create a temporary variable for the stream created from strArray and check whether it is ordered by executing the expression ((StatefulOp) stream).isOrdered(); at the breakpoint, you will find that it is indeed ordered:
String[][] strArray = {{"Sample1", "Sample2"}, {"Sample3", "Sample4", "Sample5"}};
Stream<String> stream = Arrays.stream(strArray)
        .flatMap(indStream -> Arrays.stream(indStream))
        .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"));
// breakpoint here
System.out.println(stream);
That means that this is very likely an implementation error.
Into The Code
As others have suspected, I now also think that this might be connected to flatMap being eager. More precisely, both problems might have the same root cause.
Looking into the source of WhileOps, we can see these methods:
@Override
public void accept(T t) {
    if (take = predicate.test(t)) {
        downstream.accept(t);
    }
}

@Override
public boolean cancellationRequested() {
    return !take || downstream.cancellationRequested();
}
This code is used by takeWhile to check for a given stream element t whether the predicate is fulfilled:
If so, it passes the element on to the downstream operation, in this case System.out::println.
If not, it sets take to false, so when it is asked next time whether the pipeline should be canceled (i.e. it is done), it returns true.
This covers the takeWhile operation. The other thing you need to know is that forEachOrdered leads to the terminal operation executing the method ReferencePipeline::forEachWithCancel:
@Override
final boolean forEachWithCancel(Spliterator<P_OUT> spliterator, Sink<P_OUT> sink) {
    boolean cancelled;
    do { } while (
            !(cancelled = sink.cancellationRequested())
            && spliterator.tryAdvance(sink));
    return cancelled;
}
All this does is:
check whether pipeline was canceled
if not, advance the sink by one element
stop if this was the last element
Looks promising, right?
Without flatMap
In the "good case" (without flatMap; your second example) forEachWithCancel directly operates on the WhileOp as sink and you can see how this plays out:
ReferencePipeline::forEachWithCancel does its loop:
WhileOps::accept is given each stream element
WhileOps::cancellationRequested is queried after each element
at some point "Sample4" fails the predicate and the stream is canceled
Yay!
With flatMap
In the "bad case" (with flatMap; your first example), forEachWithCancel operates on the flatMap operation, though, , which simply calls forEachRemaining on the ArraySpliterator for {"Sample3", "Sample4", "Sample5"}, which does this:
if ((a = array).length >= (hi = fence) &&
        (i = index) >= 0 && i < (index = hi)) {
    do { action.accept((T) a[i]); } while (++i < hi);
}
Ignoring all that hi and fence stuff, which is only used if the array processing is split for a parallel stream, this is a simple loop that passes each element to the takeWhile operation but never checks whether the pipeline was cancelled. It will hence eagerly plow through all elements in that "substream" before stopping, likely even through the rest of the stream.
This is a bug no matter how I look at it - and thank you Holger for your comments. I did not want to put this answer in here (seriously!), but none of the answers clearly states that this is a bug.
People are saying that this has to do with ordered/unordered streams, and that is not true, as this will report true three times:
Stream<String[]> s1 = Arrays.stream(strArray);
System.out.println(s1.spliterator().hasCharacteristics(Spliterator.ORDERED));

Stream<String> s2 = Arrays.stream(strArray)
        .flatMap(indStream -> Arrays.stream(indStream));
System.out.println(s2.spliterator().hasCharacteristics(Spliterator.ORDERED));

Stream<String> s3 = Arrays.stream(strArray)
        .flatMap(indStream -> Arrays.stream(indStream))
        .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"));
System.out.println(s3.spliterator().hasCharacteristics(Spliterator.ORDERED));
It's very interesting also that if you change it to:
String[][] strArray = {
{ "Sample1", "Sample2" },
{ "Sample3", "Sample5", "Sample4" }, // Sample4 is the last one here
{ "Sample7", "Sample8" }
};
then Sample7 and Sample8 will not be part of the output, otherwise they will. It seems that flatMap ignores the cancellation flag that takeWhile introduces.
If you look at the documentation for takeWhile:
if this stream is ordered, [returns] a stream consisting of the
longest prefix of elements taken from this stream that match the given
predicate.
if this stream is unordered, [returns] a stream consisting of a subset
of elements taken from this stream that match the given predicate.
Your stream is coincidentally ordered, but takeWhile doesn't know that it is. As such, it falls under the second case - the subset. Your takeWhile is just acting like a filter.
If you add a call to sorted before takeWhile, you'll see the result you expect:
Arrays.stream(strArray)
        .flatMap(indStream -> Arrays.stream(indStream))
        .sorted()
        .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
        .forEach(ele -> System.out.println(ele));
The reason for this is that flatMap is also an intermediate operation, and it is combined here with the stateful short-circuiting intermediate operation takeWhile.
The behavior of flatMap, as pointed out by Holger in this answer, is certainly a reference one shouldn't miss in order to understand the unexpected output for such short-circuiting operations.
Your expected result can be achieved by splitting these two intermediate operations with a terminal operation, so that an ordered stream is then used deterministically, for example:
List<String> sampleList = Arrays.stream(strArray)
        .flatMap(Arrays::stream)
        .collect(Collectors.toList());

sampleList.stream()
        .takeWhile(ele -> !ele.equalsIgnoreCase("Sample4"))
        .forEach(System.out::println);
Also, there seems to be a related bug, JDK-8075939, already registered that tracks this behavior.
Edit: This can be tracked further at JDK-8193856 accepted as a bug.

RxJava: How to conditionally apply Operators to an Observable without breaking the chain

I have a chain of operators on an RxJava observable. I'd like to be able to apply one of two operators depending on a boolean value without "breaking the chain".
I'm relatively new to Rx(Java) and I feel like there's probably a more idiomatic and readable way of doing this than my current approach of introducing a temporary variable.
Here's a concrete example, buffering items from an observable if a batch size field is non-null, otherwise emitting a single batch of unbounded size with toList():
Observable<Item> source = Observable.from(newItems);
Observable<List<Item>> batchedSource = batchSize == null ?
        source.toList() :
        source.buffer(batchSize);
return batchedSource.flatMap(...).map(...)
Is something like this possible? (pseudo-lambdas because Java):
Observable.from(newItems)
        .applyIf(batchSize == null,
                { o.toList() },
                { o.buffer(batchSize) })
        .flatMap(...).map(...)
You can use compose(Func1) to stay in-sequence but apply custom behavior:
source
        .compose(o -> condition ? o.map(v -> v + 1) : o.map(v -> v * v))
        .filter(...)
        .subscribe(...)
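Applied to the buffering example from the question, the compose approach might look like this sketch (assuming RxJava 1, where compose takes an Observable.Transformer, and a nullable Integer batchSize as in the question):
Observable<List<Item>> batchedSource = Observable.from(newItems)
        .compose(o -> batchSize == null
                ? o.toList()            // one unbounded batch
                : o.buffer(batchSize)); // fixed-size batches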
You can also use the filter operator together with defaultIfEmpty if you wish to emit a single default value, or switchIfEmpty if you wish to emit multiple values from another Observable.
val item = Observable.just("ABC")
item.filter { s -> s.startsWith("Z") }
        .defaultIfEmpty("None")
        .subscribe { println(it) }
