Supply value to Observable - Java

I am new to rxjava and I have the following problem:
Objects are irregularly dropped into a FIFO queue by an outside system. I need an Observable which runs every second, takes an item from the queue (if there is one) and emits it to subscribers.
Two problems:
The queue items are produced while the Observable is alive, so it is not possible to provide all items upfront. The queue may run empty, in which case the Observable must stand by and not emit anything. (It would be nice if the Observable jump-started immediately when an item becomes available in the queue after a pause, but then the queue itself would probably need to be an Observable as well if we don't want to poll more frequently; I have no idea how to do that.)
It must be possible for the outside system to complete the Observable. I could set a variable and read it from within the Observable, but I'd like to know if there is a more elegant way to do that.
LinkedList<Layer> queue = new LinkedList<Layer>(); // the queue
boolean stopObservable = false; // the variable to stop the observable

Observable.create(new Observable.OnSubscribe<Layer>() {
    @Override
    public void call(Subscriber<? super Layer> subscriber) {
        try {
            if (!queue.isEmpty()) {
                Layer layer = queue.poll();
                subscriber.onNext(layer);
            } else {
                if (stopObservable) { subscriber.onCompleted(); }
            }
        } catch (Exception e) {
            subscriber.onError(e);
        }
    }
}).somethingThatCreatesTheInterval().subscribeOnEtc.
For the interval, I cannot use .sample(), because it drops items, and it is important that all items are emitted.
.throttleWithTimeout() looks better, but it also seems to drop items.
rx is very cool, but tough to get into. Any input appreciated.

I did something similar when I needed to poll external web services at a regular time interval.
For the time interval you could use a timer; upon each tick, with a granularity of 1 s, the observable chain polls the queue and maybe picks one layer. If that layer is null, nothing is emitted:
Observable.timer(0, 1, TimeUnit.SECONDS)
    .flatMap(tick -> Observable.just(queue.poll()).filter(layer -> layer != null))
    .subscribe(layer -> System.out.format("The layer is : %s", layer));
Now, if you want to abort the whole chain, you can add takeUntil. When your external system wants to stop, it submits something to stopNotifier, which stops the subsequent subscription:
// somewhere before
PublishSubject<String> stopNotifier = PublishSubject.create();

// somewhere, process the queue
Observable.timer(0, 1, TimeUnit.SECONDS)
    .takeUntil(stopNotifier)
    .flatMap(tick -> Observable.just(queue.poll()).filter(layer -> layer != null))
    .subscribe(layer -> System.out.format("The layer is : %s", layer));

// when no longer interested (calling onCompleted works too)
stopNotifier.onNext("cancel everything about the queue");
I'm writing this response from a tablet, so you may assume I have misspelled some words or made naive programming errors ;)

If possible, you should use a PublishSubject<Layer> instead of a LinkedList<Layer>. Then the outside system can provide new items by calling publishSubject.onNext, and since PublishSubject is a subclass of Observable, your system can treat it as an Observable and, depending on the timing semantics you want, apply one of these operators to it (a small sketch follows the list):
sample
debounce
throttleFirst/throttleLast/throttleWithTimeout
.zipWith(Observable.timer(1, TimeUnit.SECONDS), (value, tick) -> value) (might do a lot of buffering!)
no timing modification at all (consider this as well)
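A minimal sketch of that idea, assuming RxJava 1.x (Layer is stood in for by String here, and the pacing uses interval rather than the single-shot timer, since no item should be dropped):

import java.util.concurrent.TimeUnit;
import rx.Observable;
import rx.subjects.PublishSubject;

public class PacedQueueSketch {
    public static void main(String[] args) throws InterruptedException {
        // The outside system pushes layers here instead of into a LinkedList.
        PublishSubject<String> layers = PublishSubject.create();

        // Pair each pushed item with an interval tick, so at most one item is
        // emitted per second; pushed items wait in the zip buffer meanwhile.
        layers.onBackpressureBuffer()
                .zipWith(Observable.interval(1, TimeUnit.SECONDS), (layer, tick) -> layer)
                .subscribe(
                        layer -> System.out.println("emitted: " + layer),
                        Throwable::printStackTrace,
                        () -> System.out.println("done"));

        // Simulated outside system: an irregular burst, then completion.
        layers.onNext("a");
        layers.onNext("b");
        layers.onNext("c");
        Thread.sleep(3500);
        layers.onCompleted(); // the outside system ends the stream itself
        Thread.sleep(500);
    }
}

Completion comes from the subject itself here, so no separate stop flag or takeUntil notifier is needed.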

Related

RxJava emit items manually

I'm a newbie in the RxJava world. I started a few days ago and I'm trying to do something very concrete.
I would like to use RxJava to communicate some events from one class to other classes. My emitter class does some work, and when it finishes I want to notify every subscribed class of that event.
Also, I would not like to lose any event, so if the emitter finishes an action and notifies subscribers but no one consumes that event, it should be kept.
I have been looking at Subjects because I read they are a good approach for emitting on demand, but I can't find one that solves my problem. Maybe I could try PublishSubject in the way they describe in that link:
Note that a PublishSubject may begin emitting items immediately upon creation (unless you have taken steps to prevent this), and so there is a risk that one or more items may be lost between the time the Subject is created and the observer subscribes to it. If you need to guarantee delivery of all items from the source Observable, you’ll need either to form that Observable with Create so that you can manually reintroduce “cold” Observable behavior (checking to see that all observers have subscribed before beginning to emit items), or switch to using a ReplaySubject instead.
At the moment I have an Observable that emits the complete list of events to every new Subscription:
protected List<T> commands = new ArrayList<>();

Observable<T> observe = Observable.defer(() -> Observable.fromIterable(commands));

Disposable subscribe(Consumer<T> onNext, Consumer<Throwable> onError) {
    return observe.subscribe(onNext/*, onError*/);
}
But this is far from what I want. As I said, I'm pretty new to Rx; if I can't find a way I will end up using ReplaySubject (the complete list of commands would be repeated on every subscription, but at least I would never lose an event).
EDIT:
Finally, I split my problem into two parts. To emit items manually I use PublishSubject:
PublishSubject<T> actionSubject = PublishSubject.create();
actionSubject.subscribe(onceAction);
actionSubject.onNext(action);
You are probably looking for a BehaviorSubject, which replays the last emitted value to every new subscriber.
Alternatively, you can create a ReplaySubject with the factory method createWithSize(capacity) to define a size-bounded buffer, which discards the oldest item on overflow. There is also a factory method for a time-bounded buffer with similar behavior.
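For example, a small sketch of a size-bounded replay buffer (RxJava 2.x style is assumed, matching the snippet below; the names are illustrative):

import io.reactivex.subjects.ReplaySubject;

public class ReplayBufferSketch {
    public static void main(String[] args) {
        // Keep only the last 3 items for late subscribers; older ones are discarded.
        ReplaySubject<String> commands = ReplaySubject.createWithSize(3);

        commands.onNext("a");
        commands.onNext("b");
        commands.onNext("c");
        commands.onNext("d"); // "a" falls out of the buffer

        // A late subscriber first receives the buffered items (b, c, d), then live ones.
        commands.subscribe(cmd -> System.out.println("got: " + cmd));

        commands.onNext("e"); // delivered live to the subscriber above
    }
}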
Isn't this what you are looking for?
Observable<Integer> observable = Observable.create((ObservableEmitter<Integer> e) -> {
    for (int i = 0; i < 10; i++) {
        e.onNext(i);
    }
    e.onComplete();
});

System.out.println("FIRST");
observable
    .doOnComplete(() -> System.out.println("FIRST COMPLETE"))
    .subscribe(System.out::println);

System.out.println("SECOND");
observable
    .doOnComplete(() -> System.out.println("SECOND COMPLETE"))
    .subscribe(System.out::println);
Output:
FIRST
0
1
2
3
4
5
6
7
8
9
FIRST COMPLETE
SECOND
0
1
2
3
4
5
6
7
8
9
SECOND COMPLETE

Am I misusing rxJava by converting an observable into a blocking observable?

My API makes about 100 downstream calls, in pairs, to two separate services. All responses need to be aggregated before I can return my response to the client. I use hystrix-feign to make the HTTP calls.
I came up with what I believed was an elegant solution, until I found the following in the RxJava docs:
BlockingObservable is a variety of Observable that provides blocking operators. It can be useful for testing and demo purposes, but is generally inappropriate for production applications (if you think you need to use a BlockingObservable this is usually a sign that you should rethink your design).
My code looks roughly as follows
List<Observable<C>> observables = new ArrayList<>();
for (RequestPair request : requests) {
    Observable<C> zipped = Observable.zip(
            feignClientA.sendRequest(request.A()),
            feignClientB.sendRequest(request.B()),
            (a, b) -> new C(a, b));
    observables.add(zipped);
}

Collection<D> apiResponse = new ConcurrentLinkedQueue<>();
Observable
        .merge(observables)
        .toBlocking()
        .forEach(combinedResponse -> apiResponse.add(doSomeWork(combinedResponse)));
return apiResponse;
A few questions based on this setup:
Is toBlocking() justified given my use case?
Am I correct in understanding that the actual HTTP calls do not get made until the main thread gets to the forEach()?
I've seen that the code in the forEach() block is executed by different threads, but I was not able to verify if there can be more than one thread in the forEach() block. Is the execution there concurrent?
A better option is to return the Observable to be consumed by other operators, but you may get away with blocking code (it should, however, run on a background thread):
public Observable<D> getAll(Iterable<RequestPair> requests) {
    return Observable.from(requests)
            .flatMap(request ->
                    Observable.zip(
                            feignClientA.sendRequest(request.A()),
                            feignClientB.sendRequest(request.B()),
                            (a, b) -> new C(a, b)
                    ),
                    8) // maximum concurrent HTTP requests
            .map(both -> doSomeWork(both));
}

// for legacy users of the API
public Collection<D> getAllBlocking(Iterable<RequestPair> requests) {
    return getAll(requests)
            .toList()
            .toBlocking()
            .first();
}
Am I correct in understanding that the actual HTTP calls do not get made until the main thread gets to the forEach()?
Yes, the forEach triggers the whole sequence of operations.
I've seen that the code in the forEach() block is executed by different threads, but I was not able to verify if there can be more than one thread in the forEach() block. Is the execution there concurrent?
Only one thread at a time is allowed to execute the lambda in forEach, but you may indeed see different threads entering it.
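A small sketch that makes this visible (a made-up example, with fake requests standing in for the zipped feign calls; the exact thread names will vary):

import rx.Observable;
import rx.schedulers.Schedulers;

public class MergeThreadSketch {
    // Stand-in for one zipped pair of HTTP calls, subscribed on an io() thread.
    static Observable<String> fakeCombinedCall(int id) {
        return Observable.fromCallable(() -> "response " + id)
                .subscribeOn(Schedulers.io());
    }

    public static void main(String[] args) {
        Observable.merge(fakeCombinedCall(1), fakeCombinedCall(2), fakeCombinedCall(3))
                .toBlocking()
                .forEach(response ->
                        // Different io threads may show up here across calls, but merge
                        // serializes them: the lambda never runs on two threads at once.
                        System.out.println(Thread.currentThread().getName() + " -> " + response));
    }
}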

Get the latest value of an Observable and emit it immediately

I'm trying to get the latest value of a given Observable and have it emit immediately when called.
Take the code below as an example:
return Observable.just(myObservable.last())
        .flatMap(myObservable1 -> {
            return myObservable1;
        })
        .map(o -> o.x) // Here I want to end up with a T object instead of an Observable<T> object
This does not work, because the flatMap will emit myObservable1, which in turn will have to emit an item before the map is reached.
I don't know if doing such a thing is even possible. Does anyone have any clue on how to achieve this goal? Thank you
The last() method will not be of any help here, as it waits for the Observable to terminate before giving you the last item emitted.
Assuming that you do not have control over the emitting observable, you could simply create a BehaviorSubject, subscribe it to the observable that emits the data you want to listen to, and then subscribe to the created subject. Since a Subject is both an Observable and a Subscriber, you will get what you want.
I think (I do not have time to check it now) that you may have to manually unsubscribe from the original observable, as the BehaviorSubject will not unsubscribe automatically once all of its subscribers unsubscribe.
Something like this:
BehaviorSubject subject = BehaviorSubject.create();
hotObservable.subscribe(subject);
subject.subscribe(thing -> {
// Here just after subscribing
// you will receive the last emitted item, if there was any.
// You can also always supply the first item to the behavior subject
});
http://reactivex.io/RxJava/javadoc/rx/subjects/BehaviorSubject.html
In RxJava, subscriber.onXXX is called asynchronously. This means that if your Observable emits items on a new thread, you can never get the last item before returning, unless you block the thread and wait for the item. But if the Observable emits items synchronously and you don't change its thread with subscribeOn and observeOn,
such as in this code:
Observable.just(1,2,3).subscribe();
In this case, you can get the last item like this:
Integer getLast(Observable<Integer> o) {
    final int[] ret = new int[1];
    o.last().subscribe(i -> ret[0] = i);
    return ret[0];
}
Doing this is a bad idea, though. RxJava prefers that you do such work asynchronously with it.
What you actually want to achieve here is to take an asynchronous task and transform it into a synchronous one.
There are several ways to achieve it, each one with its pros and cons:
Use toBlocking() - it means that the calling thread will be BLOCKED until the stream finishes; to get only one item, simply use first(), as it completes once an item is delivered.
Let's say your entire stream is Observable<T> getData();
then a method that gets the latest value immediately will look like this:
public T getLastItem() {
    return getData().toBlocking().first();
}
Please don't use last(), as it will wait for the stream to complete and only then emit the last item.
If your stream is a network request and no item has arrived yet, this will block your thread! So only use it when you are sure an item is available immediately (or if you really want to block...).
Another option is to simply cache the last result, something like this:
// somewhere in the code; this keeps saving the last item delivered
getData().subscribe(t -> cachedT = t);

public T getLastItem() {
    return cachedT;
}
If no item has been sent by the time you request it, you will get null or whatever initial value you have set.
The problem with this approach is that the subscribe phase might happen after the get, which can cause a race condition if used from two different threads.

Block before draining ArrayBlockingQueue

I find myself repeating this pattern and have often wondered whether it is idiomatic in Java or there is a better way of achieving this behaviour.
Problem: Given a producer/consumer setup, the consumer wants to process batches of items, so it uses drainTo(). However, drainTo() only polls for existing items and may fail to get any; to avoid this, I prefix the drain with a take() to ensure it blocks until at least one item is available.
One problem I see with a particular dataset is that the batch size is often irregular, alternating between 1 and N (1, N, 1, N). In general, is this a common way to solve this problem?
Example:
ArrayBlockingQueue<Foo> queue;

void produce() throws InterruptedException {
    while (true) {
        queue.put(createFoo());
    }
}

void consumeBatchSpin() {
    while (true) {
        List<Foo> batch = Lists.newLinkedList();
        queue.drainTo(batch);
        doSomething(batch);
        // the problem here is that if nothing is being produced, this loop will spin
    }
}

void consumeBatchTake() throws InterruptedException {
    while (true) {
        List<Foo> batch = Lists.newLinkedList();
        batch.add(queue.take()); // force at least one item to be there
        queue.drainTo(batch);
        doSomething(batch);
    }
}
Have you considered adding to a list and taking the whole list on get?
I have posted one here recently. It is undergoing code review here, but my tests suggest it is robust.
Essentially, when you do a put, you add your new element to the current list. When you do a get, you take the whole list and atomically replace it with a new, empty one.
No need to use drainTo and no spinning at all.
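A minimal sketch of that swap-the-list idea (a hypothetical class, not the implementation under review; it blocks with wait/notify instead of spinning):

import java.util.ArrayList;
import java.util.List;

public class SwappingBuffer<T> {
    private List<T> items = new ArrayList<>();

    // put: append the new element to the current list and wake any waiting consumer
    public synchronized void put(T item) {
        items.add(item);
        notifyAll();
    }

    // takeAll: block until at least one item exists, then hand over the whole list
    // and atomically replace it with a new empty one (all under the same lock)
    public synchronized List<T> takeAll() throws InterruptedException {
        while (items.isEmpty()) {
            wait();
        }
        List<T> batch = items;
        items = new ArrayList<>();
        return batch;
    }
}

The consumer loop then becomes doSomething(buffer.takeAll()), with no drainTo and no spinning.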

Incremental Future of list extensions

I essentially have a Future<List<T>> that is fetched in batches from the server. For some clients I'd like to provide incremental results while it loads, in addition to the whole collection when the future is fulfilled.
Is there a common Future extension defined somewhere for this? What typical patterns/combinators exist for such futures?
I assume that, given an IncrementalListFuture<T>, I can easily define a map operation. What else comes to mind?
Is there a common Future extension defined somewhere for this?
I assume you are talking about incremental results from an ExecutorService. You should consider using an ExecutorCompletionService which allows you to be informed as soon as one of the Future objects is get-able.
To quote from the javadocs:
CompletionService<Result> ecs = new ExecutorCompletionService<Result>(e);
for (Callable<Result> s : solvers) {
    ecs.submit(s);
}
int n = solvers.size();
for (int i = 0; i < n; ++i) {
    // this waits for one of the futures to finish and provide a result
    Future<Result> future = ecs.take();
    Result result = future.get();
    if (result != null) {
        // do something with the result
    }
}
Sorry, I initially misread the question and thought that you were asking about a List<Future<?>>. It may be that you could refactor your code to actually return a number of Futures, so I'll leave this here for posterity.
In this case, I would not pass the list back in a Future. You aren't going to be able to get the result until the job finishes.
If possible, I would pass in some sort of BlockingQueue so both the caller and the thread can access it:
final BlockingQueue<T> queue = new LinkedBlockingQueue<T>();
// build out the job with the queue
threadPool.submit(new SomeJob(queue));
threadPool.shutdown();
// now we can consume from the queue as it is built:
while (true) {
    T result = queue.take();
    // you could use some constant result object to mean that the job has finished
    if (result == SOME_END_OBJECT) {
        break;
    }
    // provide intermediate results
}
You could also have some sort of SomeJob.take() method which calls through to a BlockingQueue defined inside of your job class.
// the blocking queue in this case is hidden inside your job object
T result = someJob.take();
...
Here's what I would do:
In the thread that populates the List, make it thread-safe by wrapping the list using Collections.synchronizedList.
Make the list publicly available, but not modifiable, by adding a public method to the thread which returns the list wrapped by Collections.unmodifiableList.
Instead of giving clients a Future<List<T>>, give them a handle to the thread, or some kind of wrapper of it, so that they can call the public method above (a rough sketch follows this list).
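A rough sketch of those three steps (the class and method names here are made up; the worker appends results as batches arrive, and clients read an unmodifiable snapshot):

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class IncrementalLoader<T> extends Thread {
    private final List<T> results = Collections.synchronizedList(new ArrayList<>());

    @Override
    public void run() {
        // fetch batches from the server and append them as they arrive, e.g.:
        // results.addAll(nextBatch);
    }

    // Clients call this for incremental results; the returned view cannot be modified.
    public List<T> partialResults() {
        synchronized (results) {
            // copy under the lock so callers can iterate safely without holding it
            return Collections.unmodifiableList(new ArrayList<>(results));
        }
    }
}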
Alternatively, as Gray has suggested, BlockingQueues are great for thread coordination like this. This may require more changes to your client code, however.
To answer my own question: there has been lots of development in this area recently. Among the most used are Play iteratees (http://www.playframework.org/documentation/2.0/Iteratees) and Rx for .NET (http://msdn.microsoft.com/en-us/data/gg577609.aspx).
Instead of a Future, they define something like:
interface Observable<T> {
    Disposable subscribe(Observer<T> observer);
}

interface Observer<T> {
    void onCompleted();
    void onError(Exception error);
    void onNext(T value);
}
and lots of combinators.
As an alternative to Observables, you can take a look at Twitter's approach.
They use Spool, which is an asynchronous version of a Stream.
Basically, it is a simple trait similar to List:
trait Spool[+A] {
  def head: A

  /**
   * The (deferred) tail of the spool. Invalid for empty spools.
   */
  def tail: Future[Spool[A]]
}
that allows you to do functional stuff like map, filter and foreach on top of it.
Future is really designed to return a single (atomic) result, not for communicating intermediate results in this manner. What you will really want to do is to use multiple futures, one per batch.
We have a similar requirement where we have a bunch of things that we need to get from different remote servers, and each will return at a different time. We don't want to wait until the last one has returned, but rather process them in the order they return. For this we created the AsyncCompleter, which takes an Iterable<Callable<T>> and returns an Iterable<T> that blocks on iteration, completely abstracting away usage of the Future interface.
If you look at how that class is implemented, you'll see how to use a CompletionService to receive results from an Executor in the order in which they become available, if you need to build this for yourself.
Edit: I just saw that the second half of Gray's answer is similar, basically using an ExecutorCompletionService.
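In case the linked class isn't handy, here is a rough sketch of the same idea (not the actual AsyncCompleter source): submit everything to an ExecutorCompletionService and hand back an Iterable whose iterator blocks on take() for each result, so results come back in completion order:

import java.util.Iterator;
import java.util.concurrent.Callable;
import java.util.concurrent.CompletionService;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorCompletionService;
import java.util.concurrent.ExecutorService;

public final class CompletionOrderIterable {
    // Submits all tasks and returns their results in the order they complete.
    public static <T> Iterable<T> submitAll(ExecutorService executor,
                                            Iterable<? extends Callable<T>> tasks) {
        CompletionService<T> completion = new ExecutorCompletionService<>(executor);
        int submitted = 0;
        for (Callable<T> task : tasks) {
            completion.submit(task);
            submitted++;
        }
        final int total = submitted;
        return () -> new Iterator<T>() {
            private int delivered = 0;

            @Override
            public boolean hasNext() {
                return delivered < total;
            }

            @Override
            public T next() {
                try {
                    delivered++;
                    return completion.take().get(); // blocks until the next result is ready
                } catch (InterruptedException | ExecutionException e) {
                    throw new RuntimeException(e);
                }
            }
        };
    }
}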
