Are JavaFX Property objects threadsafe for multiple asynchronous writes? - java

Is it dangerous to call the set methods of the standard Property implementations in JavaFX from multiple threads? I don't really care about race conditions on the client side from read-and-then-set operations. I want to know whether the Property itself can be corrupted internally if multiple threads call its set() or setValue() method.
Is the code below threadsafe?
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javafx.beans.property.ObjectProperty;
import javafx.beans.property.SimpleObjectProperty;

public class ThreadSafeProperty {
    public static void main(String[] args) {
        ObjectProperty<Integer> property = new SimpleObjectProperty<>(5);
        ExecutorService exec = Executors.newFixedThreadPool(5);
        property.addListener((obs, o, n) -> System.out.println("OLD: " + o + " NEW: " + n));
        exec.execute(() -> property.set(4));
        exec.execute(() -> property.set(6));
        exec.execute(() -> property.set(11));
        try {
            Thread.sleep(5000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        exec.shutdown();
    }
}

SimpleObjectProperty is not thread-safe.
You can see this in the source: javafx/beans/property/ObjectPropertyBase.set is not synchronized. Alternatively, you can use a tool like http://vmlens.com, which looks for such issues for you. :-)
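If writes can originate on background threads, the usual remedy is to hand them over to the JavaFX Application Thread via javafx.application.Platform.runLater rather than to synchronize the property itself. A minimal sketch of that pattern applied to the question's code, assuming the JavaFX toolkit is initialized:
// Marshal every write onto the FX Application Thread, so the property
// is only ever mutated from a single thread.
exec.execute(() -> Platform.runLater(() -> property.set(4)));
exec.execute(() -> Platform.runLater(() -> property.set(6)));
exec.execute(() -> Platform.runLater(() -> property.set(11)));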

Related

How to use supplyAsync of CompletableFuture to run the same method with multiple inputs each time?

I have the following code, where I create a supplier and use CompletableFuture's supplyAsync method to invoke another method after the async execution.
public void runParallelFunctions(MyInput myInput) {
    Supplier<Map<String, String>> taskSupplier = () -> {
        try {
            return invokeLambda("input1");
        } catch (Exception e) {
            System.out.println(e);
        }
        return new HashMap<>();
    };
    for (int i = 0; i < 5; i++) {
        CompletableFuture.supplyAsync(taskSupplier::get, executorService)
            .thenAccept(this::printResultsFromParallelInvocations);
    }
    System.out.println("Doing other work....");
}
Below is the method I call after the execution completes.
private void printResultsFromParallelInvocations(Map<String, String> result) {
    result.forEach((key, value) -> System.out.println(key + ": " + value));
}
In the above code, how can I call the method invokeLambda with multiple arguments like "input1", "input2", and so on? I can generate the inputs through a loop, but how can I combine a list of inputs with the supplier so that each supplyAsync call receives its own input? I cannot use the runAsync method, because I have a return value that I need to pass to printResultsFromParallelInvocations. I'm new to futures and async callbacks and would appreciate any help. Thanks in advance.
You cannot create a single Supplier<Map<String, String>> and expect it to behave differently for the five evaluations. That would require external mutable state to make it detectable that an evaluation is the n’th evaluation, which at the same time contradicts the idea of performing five concurrent evaluations that have no order.
Simply create five different suppliers, e.g.
for (int i = 0; i < 5; i++) {
    String input = "input" + i;
    CompletableFuture.supplyAsync(() -> invokeLambda(input), executorService)
        .thenAccept(this::printResultsFromParallelInvocations);
}
In each loop iteration, the lambda expression () -> invokeLambda(input) captures the current value of input and creates an appropriate Supplier instance.
Side notes:
Don’t name methods after technical aspects like invokeLambda but rather try to express their purpose.
The taskSupplier::get in your original code was an unnecessary method reference, as it produced a Supplier invoking the method on an object that was already a Supplier. So taskSupplier could have been passed to supplyAsync directly if getting the same behavior for every evaluation was intended.
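That is, if the same behavior for every evaluation had actually been intended, the supplier could have been passed as-is:
CompletableFuture.supplyAsync(taskSupplier, executorService)
    .thenAccept(this::printResultsFromParallelInvocations);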
You can create a new Supplier on the fly inside the loop.
public static Supplier<Map<String, String>> supplierFunc(Object... args) {
    return () -> {
        try {
            return invokeLambda(args); // assumes invokeLambda can accept these arguments
        } catch (Exception e) {
            System.out.println(e);
        }
        return new HashMap<>();
    };
}

public void runParallelFunctions(Object myInput) {
    for (int i = 0; i < 5; i++) {
        // The arguments could also vary per iteration, e.g. supplierFunc("input" + i)
        CompletableFuture.supplyAsync(supplierFunc("input1", "input2"), executorService)
            .thenAccept(this::printResultsFromParallelInvocations);
    }
    System.out.println("Doing other work....");
}

Understanding RxJava observable when underlying data source has new values

I am trying to experiment with RxJava observable and observer code. My objective is to check how things work when the underlying source receives new data values. My code is:
List<Integer> numbers = new ArrayList<>();
Runnable r = new Runnable() {
    @Override
    public void run() {
        int i = 100;
        while (i < 110) {
            numbers.add(i);
            try {
                Thread.sleep(10);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            i++;
        }
    }
};
numbers.add(0);
numbers.add(1);
numbers.add(2);
Observable.fromIterable(numbers)
    .observeOn(Schedulers.io())
    .subscribe(i -> System.out.println("Received " + i + " on " + Thread.currentThread().getName()),
               e -> e.printStackTrace());
try {
    Thread.sleep(1000);
} catch (InterruptedException e) {
    e.printStackTrace();
}
Thread t = new Thread(r);
t.start();
try {
    Thread.sleep(10000);
} catch (InterruptedException e) {
    e.printStackTrace();
}
So I have a list of numbers. I then have a runnable which adds new numbers to this list with a time gap between the additions. I don't start the thread yet. I add 0, 1, 2 to the list and then create an observable with it, scheduling the observer on a thread from the pool, and finally subscribing to the observable. As the subscription happens, the observable emits the values 0, 1, 2 and the observer is invoked (the lambda passed to subscribe is executed). Then I introduce a delay of 1 sec on the main thread, spawn a new thread using the runnable I created earlier, and add a final delay so that the application doesn't exit immediately.
What I expect is that, as new numbers are added to the list, the observer is invoked and prints the message. But that doesn't happen. Surely I have got something wrong in my understanding. Do I need to also put the observable on a scheduler?
The Observable.fromIterable() method performs a "one time" load of the values into the observable each time a subscription is built. What happens "after" building the subscription has no effect anymore. If you use the subscribe(onNext, onError, onComplete) method with the onComplete argument, you will see that the subscription has been fully consumed once the three initial values have been printed.
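For example, adding the onComplete callback to the code from the question makes this visible (a sketch; the callback fires as soon as the iterable is exhausted):
Observable.fromIterable(numbers)
    .observeOn(Schedulers.io())
    .subscribe(i -> System.out.println("Received " + i + " on " + Thread.currentThread().getName()),
               Throwable::printStackTrace,
               () -> System.out.println("Completed")); // fires right after 0, 1, 2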
You can use a Subject (something like a PublishSubject) and use its onNext() method to push "new values" while the subscriptions that were built earlier are still active (and not completed). That way you can build the subscriptions first and keep calling onNext() with new values on the subject until you are done and call onComplete().
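A minimal, self-contained sketch of that approach, assuming RxJava 2 (io.reactivex); values pushed via onNext() after the subscription was built are still delivered:
import io.reactivex.schedulers.Schedulers;
import io.reactivex.subjects.PublishSubject;

public class SubjectDemo {
    public static void main(String[] args) throws InterruptedException {
        PublishSubject<Integer> subject = PublishSubject.create();
        subject.observeOn(Schedulers.io())
               .subscribe(i -> System.out.println("Received " + i + " on " + Thread.currentThread().getName()),
                          Throwable::printStackTrace,
                          () -> System.out.println("Completed"));

        // These values arrive after the subscription exists, yet are still delivered:
        subject.onNext(0);
        subject.onNext(1);
        subject.onNext(2);
        subject.onComplete();

        Thread.sleep(1000); // give the io scheduler time to print before the JVM exits
    }
}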

Multiple Consumer Threads Consume Queue FIFO Overall

As I am trying to learn the multi-threading part of Java programming, I have the following issue when dealing with one-producer, multiple-consumer coding.
What I'm trying to achieve is: multiple consumer threads taking items out of the queue in the order in which they were put into the queue; in other words, the consumer threads should maintain a FIFO manner overall.
final BlockingDeque<String> deque = new LinkedBlockingDeque<String>();
Runnable rb = new Runnable() {
    public void run() {
        try {
            System.out.println(deque.takeLast());
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
};
deque.putFirst("a");
deque.putFirst("b");
deque.putFirst("c");
deque.putFirst("d");
ExecutorService pool = Executors.newFixedThreadPool(4);
pool.submit(rb);
pool.submit(rb);
pool.submit(rb);
pool.submit(rb);
WHAT I AM LOOKING FOR:
a
b
c
d
WHAT IT ACTUALLY OUTPUTS:
b
c
a
d
or some other random order.
Any simple solutions to this? Thank you!
In your case the problem is that
System.out.println(deque.takeLast());
is actually two instructions, which together are not atomic. Imagine the following scenario:
Thread 1 takes a string from the queue.
Thread 2 takes a string from the queue.
Thread 2 prints its value.
Thread 1 prints its value.
So it all depends on how the operating system schedules the threads.
In your case, one possible solution would be to add the synchronized keyword to the run method:
Runnable rb = new Runnable() {
    public synchronized void run() {
        try {
            String s = deque.takeLast();
            System.out.println(s);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
};
This will synchronize on the instance of the anonymous class which you created here. Since you are passing the same runnable to the ExecutorService, it should work.
Or you can synchronize on your queue object, since your runnable, which has access to the queue object, will be executed in many threads after you pass it to the ExecutorService:
Runnable rb = new Runnable() {
    public void run() {
        synchronized (deque) {
            try {
                String s = deque.takeLast();
                System.out.println(s);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
};
Also remember to shut down your thread pool, because as it stands your application will never exit.
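For example (a sketch; awaitTermination throws InterruptedException, so declare or handle it):
pool.shutdown();                            // stop accepting new tasks, let the submitted ones finish
pool.awaitTermination(5, TimeUnit.SECONDS); // wait for the four tasks to complete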

thenApply in CompletableFuture

In the following code
public CompletableFuture<String> getMyFuture(String input)
{
    CompletableFuture<String> future = new CompletableFuture<String>().thenApply((result) -> result + "::");
    ExecutorService service = Executors.newFixedThreadPool(6);
    service.submit(() -> {
        try {
            future.complete(getResult(input));
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    });
    return future;
}

public String getResult(String input) throws InterruptedException
{
    Thread.sleep(3000);
    return "hello " + input + " :" + LocalTime.now();
}
I am expecting the output to contain a trailing "::", but the program prints "hello first :16:49:30.231" without it. Is my use of thenApply correct?
You're invoking the complete() method on the CompletionStage that you got in the first line (where you call the thenApply method).
If your intention is to complete the CompletableFuture with some string value (future.complete(getResult(input))) and then apply some function, you'd better place thenApply() at the end (where you return the future):
public CompletableFuture<String> getMyFuture(String input)
{
    CompletableFuture<String> future = new CompletableFuture<String>();
    ExecutorService service = Executors.newFixedThreadPool(6);
    service.submit(() -> {
        try {
            future.complete(getResult(input));
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    });
    return future.thenApply(result -> result + "::");
}
I don't know how to explain it in a more understandable way, but in short: you're calling the complete() method on the wrong object reference inside your Runnable.
You are creating two CompletableFuture instances. The first, created via new CompletableFuture<String>(), will never get completed; you don’t even keep a reference to it that would make completing it possible.
The second, created by calling .thenApply((result) -> result+ "::") on the first one, could get completed by evaluating the specified function once the first one completed, using the first’s result as an argument to the function. However, since the first never completes, the function becomes irrelevant.
But CompletableFuture instances can get completed by anyone, not just a function passed to a chaining method. The possibility to get completed is even prominently displayed in its class name. In case of multiple completion attempts, one would turn out to be the first one, winning the race and all subsequent completion attempts will be ignored. In your code, you have only one completion attempt, which will successfully complete it with the value returned by getResult, without any adaptations.
You could change your code to keep a reference to the first CompletableFuture instance and complete it manually, so that the second gets completed using the function passed to thenApply. A sketch of that variant:
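CompletableFuture<String> base = new CompletableFuture<>();
CompletableFuture<String> withSuffix = base.thenApply(result -> result + "::");
service.submit(() -> {
    try {
        base.complete(getResult(input)); // completing base triggers the thenApply function
    } catch (InterruptedException e) {
        base.completeExceptionally(e);   // propagate the failure to both stages
    }
});
return withSuffix;

But on the other hand, there is no need for manual completion here: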
public CompletableFuture<String> getMyFuture(String input) {
    ExecutorService service = Executors.newFixedThreadPool(6);
    return CompletableFuture.supplyAsync(() -> getResult(input), service)
        .thenApply(result -> result + "::");
}

public String getResult(String input) {
    LockSupport.parkNanos(TimeUnit.SECONDS.toNanos(3));
    return "hello " + input + " :" + LocalTime.now();
}
When specifying the executor to supplyAsync, the function will be evaluated using that executor. More is not needed.
Needless to say, that’s just an example. You should never create a temporary thread pool executor, as the whole point of a thread pool executor is to allow reusing the threads (and you’re using only one of these six threads at all), and it should get shut down after use.
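A sketch of that idea (the field name is hypothetical): create the executor once, reuse it across calls, and shut it down when the application ends.
// One shared pool for the whole application, shut down at application exit
private static final ExecutorService SERVICE = Executors.newFixedThreadPool(6);

public CompletableFuture<String> getMyFuture(String input) {
    return CompletableFuture.supplyAsync(() -> getResult(input), SERVICE)
        .thenApply(result -> result + "::");
}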

Mix explicit and implicit parallelism with java-8 streams

In the past I have written some Java programs using two threads.
The first thread (producer) read data from an API (a C library), created a Java object, and sent the object to the other thread.
The C API delivers an infinite event stream.
The threads use a LinkedBlockingQueue as a pipeline to exchange the objects (put, poll).
The second thread (consumer) deals with the objects.
(I also found the code more readable this way: the first thread deals with the C API stuff and produces proper Java objects; the second thread is free from C API handling and deals with the data.)
Now I'm interested in how I can realize the scenario above with the new stream API in Java 8,
assuming I want to keep the two threads (producer/consumer).
The first thread writes into the stream; the second thread reads from it.
I also hope that with this technique I can have explicit parallelism (producer/consumer)
and, within the stream, use some implicit parallelism (e.g. stream.parallel()).
I don't have much experience with the new stream API,
so I experimented with the code below to implement the idea above.
I use generate to access the C API and feed it into the stream.
In the consumer thread I used .parallel() to test implicit parallelism. That looks fine, but see below.
Questions:
1. Is generate the best way in this scenario for the producer?
2. I have an understanding problem: how do I terminate/close the stream in the producer
if the API reports an error AND I want to shut down the whole pipeline?
Do I use stream.close() or throw an exception?
2.1 I used stream.close(), but generate was still running after closing.
I found that only throwing an exception terminates the generate part.
This exception goes into the stream, and the consumer receives it
(this is fine for me; the consumer can recognize it and terminate).
But in this case, the producer has produced more than the consumer has processed by the time the exception arrives.
2.2 If the consumer uses implicit parallelism via stream.parallel(), the producer processes even more items.
So I don't see any solution for this problem (accessing the C API, checking errors, making a decision).
2.3 The exception thrown in the producer arrives at the consumer stream, but not all inserted objects are processed.
Once more: the idea is to have explicit parallelism with the threads,
but internally use the new features and parallel processing where possible.
Thanks for thinking about this problem.
package sandbox.test;

import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.LongStream;

public class MyStream {
    private volatile LongStream stream = null;
    private AtomicInteger producerCount = new AtomicInteger(0);
    private AtomicInteger consumerCount = new AtomicInteger(0);
    private AtomicInteger apiError = new AtomicInteger(0);

    public static void main(String[] args) throws InterruptedException {
        MyStream appl = new MyStream();
        appl.create();
    }

    private static void sleep(long sleep) {
        try {
            Thread.sleep(sleep);
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
    }

    private static void apiError(final String pos, final int iteration) {
        RuntimeException apiException = new RuntimeException("API error pos=" + pos + " iteration=" + iteration);
        System.out.println(apiException.getMessage());
        throw apiException;
    }

    private final int simulateErrorAfter = 10;

    private Thread produce() {
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                System.out.println("Producer started");
                stream = LongStream.generate(() -> {
                    int localCount;
                    // Detect error, while using stream.parallel() processing
                    int error = apiError.get();
                    if (error > 0)
                        apiError("1", error);
                    // ----- Accessing the C API here -----
                    localCount = producerCount.incrementAndGet(); // delegate for accessing the C API
                    // ----- Accessing the C API here -----
                    // Checking error code from C API
                    if (localCount > simulateErrorAfter) { // Simulate an API error
                        producerCount.decrementAndGet();
                        stream.close();
                        apiError("2", apiError.incrementAndGet());
                    }
                    System.out.println("P: " + localCount);
                    sleep(200L);
                    return localCount;
                });
                System.out.println("Producer terminated");
            }
        });
        thread.start();
        return thread;
    }

    private Thread consume() {
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    stream.onClose(new Runnable() {
                        @Override
                        public void run() {
                            System.out.println("Close detected");
                        }
                    }).parallel().forEach(l -> {
                        sleep(1000);
                        System.out.println("C: " + l);
                        consumerCount.incrementAndGet();
                    });
                } catch (Exception e) {
                    // Capturing the stream end
                    System.out.println(e);
                }
                System.out.println("Consumer terminated");
            }
        });
        thread.start();
        return thread;
    }

    private void create() throws InterruptedException {
        Thread producer = produce();
        while (stream == null)
            sleep(10);
        Thread consumer = consume();
        producer.join();
        consumer.join();
        System.out.println("Produced: " + producerCount);
        System.out.println("Consumed: " + consumerCount);
    }
}
You need to understand some fundamental points about the Stream API:
All operations applied on a stream are lazy and won’t do anything before the terminal operation is applied. There is no sense in creating the stream using a “producer” thread, as this thread won’t do anything. All actions are performed within your “consumer” thread and the background threads started by the Stream implementation itself. The thread that created the Stream instance is completely irrelevant.
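For instance, a small demonstration of this laziness (a sketch):
LongStream s = LongStream.generate(() -> {
    System.out.println("supplier called");
    return 42L;
});
// Nothing has been printed yet; the supplier only runs when a terminal operation executes:
System.out.println(s.limit(2).sum()); // prints "supplier called" twice, then 84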
Closing a stream has no relevance for the Stream operation itself, i.e. does not shut down threads. It is meant to release additional resources, e.g. closing the file associated with the stream returned by Files.lines(…). You can schedule such cleanup actions using onClose and the Stream will invoke them when you call close but that’s it. For the Stream class itself it has no meaning.
Streams do not model a scenario like “one thread is writing and another one is reading”. Their model is “one thread is calling your Supplier, followed by calling your Consumer and another thread does the same, and x other threads too…”
If you want to implement a producer/consumer scheme with distinct producer and consumer threads, you are better off using Threads or an ExecutorService and a thread-safe queue.
But you can still use Java 8 features. E.g., there is no need to implement Runnables using inner classes; you can use lambda expressions for them, as in the sketch below.
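A minimal sketch of such a producer/consumer setup (the names and the poison-pill convention are illustrative, not prescribed):
import java.util.concurrent.*;

public class ProducerConsumer {
    // Sentinel that tells the consumer to stop, e.g. after an API error
    private static final String POISON_PILL = "EOF";

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // Producer: builds objects (here: strings standing in for C API events)
        pool.execute(() -> {
            try {
                for (int i = 0; i < 10; i++) {
                    queue.put("event-" + i);
                }
                queue.put(POISON_PILL); // signal the end of the stream
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Consumer: takes objects from the queue until the sentinel arrives
        pool.execute(() -> {
            try {
                for (String event; !(event = queue.take()).equals(POISON_PILL); ) {
                    System.out.println("Consumed " + event);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}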
