I have the following case to model:
The program starts by querying the DB with the received parameters to determine the number of tasks to be run.
A thread pool with some fixed maximum number of threads is defined to execute the tasks. Each task starts a flow that can have a different configuration and can take a different amount of time. Once a task completes, it has a configurable sleep time.
While a task sleeps, it must not block a spot in the execution pool; the pool must continue with tasks that are ready to execute.
I find it hard to code for some reason (mainly because of the last requirement).
Any help will be appreciated
Thanks
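The crux is that the sleep must happen off the worker threads. Here is a minimal, untested sketch of that idea with illustrative names, before a fuller version: when a task finishes, it hands its configured sleep to a scheduler instead of sleeping inside a worker thread, so the pool slot frees up immediately for tasks that are ready.
import java.util.concurrent.*;
class SleepyTaskRunner {
    private final ExecutorService workers = Executors.newFixedThreadPool(4);
    private final ScheduledExecutorService wakeUps = Executors.newSingleThreadScheduledExecutor();
    void submit(Runnable flow, long sleepMillis) {
        workers.submit(() -> {
            flow.run(); // the actual task flow
            // "sleep" without holding a worker slot: schedule the next run instead
            // (only if the flow should run again after its sleep)
            wakeUps.schedule(() -> submit(flow, sleepMillis), sleepMillis, TimeUnit.MILLISECONDS);
        });
    }
}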
This is lengthy but straightforward code illustrating a scheduled resubmitter, which I haven't tested :)
import java.util.Deque;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.concurrent.*;
interface Repeatable {
boolean shouldBeRepeated();
/**
* @return how long to sleep
*/
long delayBeforeRepeat();
/**
* #return "initial" state of this task instance, so this state can be resubmitted for repeated execution
*/
BusinessTask reset();
}
/**
* Whatever suits your business logic
*/
interface BusinessTask extends Callable<Repeatable> {
}
class BusinessTaskCompletionData {
final BusinessTask businessTask;
/**
* Timestamp when this task should be resubmitted
*/
final long nextSubmitTime;
BusinessTaskCompletionData(BusinessTask businessTask, long nextSubmitTime) {
this.businessTask = businessTask;
this.nextSubmitTime = nextSubmitTime;
}
}
class TaskResultsConsumer implements Runnable {
private final CompletionService<Repeatable> completionService;
private final Deque<BusinessTaskCompletionData> completedTasks;
TaskResultsConsumer(ExecutorService executor, Deque<BusinessTaskCompletionData> completedTasks) {
this.completedTasks = completedTasks;
completionService = new ExecutorCompletionService<>(executor);
}
@Override
public void run() {
while (true) {
try {
Future<Repeatable> completedBusinessTask = completionService.take();
Repeatable repeatable = completedBusinessTask.get();
if (repeatable.shouldBeRepeated()) {
completedTasks.add(new BusinessTaskCompletionData(repeatable.reset(),
System.currentTimeMillis() + repeatable.delayBeforeRepeat()));
}
} catch (ExecutionException | InterruptedException ie) {
// handle somehow
}
}
}
}
class TasksSupplier implements Runnable {
private final Deque<BusinessTaskCompletionData> completedTasks;
private final ExecutorService executor;
TasksSupplier(Deque<BusinessTaskCompletionData> completedTasks, ExecutorService executor) {
this.completedTasks = completedTasks;
this.executor = executor;
}
@Override
public void run() {
while (true) {
BusinessTask t = getTaskSomehow();
executor.submit(t);
}
}
private BusinessTask getTaskSomehow() {
// implement
return null;
}
}
/**
* Actual implementation of logic to obtain 'initial state' of task to repeat and repeat schedule
*/
class BusinessData implements Repeatable {
// whatever
}
public class SOTest {
public static void main(String[] args) {
// must be a thread-safe deque: it is written by the results consumer thread and drained by the resubmitter thread
final Deque<BusinessTaskCompletionData> tasksToRepeat = new ConcurrentLinkedDeque<>();
// workers pool
final ExecutorService workersPool = Executors.newFixedThreadPool(10);
// controllers pool: 1 thread for supplier, the other for results consumer
final ExecutorService controllersPool = Executors.newFixedThreadPool(2);
controllersPool.submit(new TasksSupplier(tasksToRepeat, workersPool));
controllersPool.submit(new TaskResultsConsumer(workersPool, tasksToRepeat));
// resubmitter scheduled pool
ScheduledExecutorService scheduledExecutor = Executors.newSingleThreadScheduledExecutor();
scheduledExecutor.scheduleWithFixedDelay(new Runnable() {
@Override
public void run() {
long now = System.currentTimeMillis();
Iterator<BusinessTaskCompletionData> it = tasksToRepeat.iterator();
while (it.hasNext()) {
BusinessTaskCompletionData data = it.next();
if (data.nextSubmitTime <= now) {
workersPool.submit(data.businessTask);
it.remove();
}
}
}
},
// initial delay of 1 sec
1000,
// periodic delay of 1 sec
1000,
TimeUnit.MILLISECONDS
);
}
}
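For illustration, a hypothetical BusinessData-style implementation might look like this: repeat a fixed number of times with a 30-second pause between runs.
class ExampleBusinessData implements Repeatable {
    private final int remainingRuns;
    ExampleBusinessData(int remainingRuns) {
        this.remainingRuns = remainingRuns;
    }
    @Override
    public boolean shouldBeRepeated() {
        return remainingRuns > 0;
    }
    @Override
    public long delayBeforeRepeat() {
        return 30_000L; // configurable sleep before the next run, in milliseconds
    }
    @Override
    public BusinessTask reset() {
        // a fresh task whose result reports one remaining run less
        return () -> new ExampleBusinessData(remainingRuns - 1);
    }
}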
I am using Spring Boot.
public interface StringConsume extends Consumer<String> {
default public void strHandel(String str) {
accept(str);
}
}
Impl
#Component("StrImpl")
public class StringConsumeImpl implements StringConsume {
BlockingQueue<String> queue = new ArrayBlockingQueue<>(500);
final ExecutorService exService = Executors.newSingleThreadExecutor();
Future<?> future = CompletableFuture.completedFuture(true);
@Override
public void accept(String t) {
try {
queue.put(t);
} catch (InterruptedException e) {
e.printStackTrace();
}
while (null != queue.peek()) {
if (future.isDone()) {
future = exService.submit(() -> queue.take());
}
}
}
}
Class
#Component
public class Test {
@Resource(name="StrImpl")
private @Autowired StringConsume handler;
public void insertIntoQueue(String str) {
handler.accept(str);
}
}
In StringConsumeImpl, do I need to synchronize the while loop? If StringConsumeImpl is called five times, will the while loop create 5 processes or only 1? And what is the best replacement for the while loop in StringConsumeImpl, if any?
There are some problems with that code.
First of all, the consumer doesn't really "consume" anything, it just adds the string to the queue then takes it back out. Let's say for the sake of the argument that it also "consumes" it by printing it to console or something.
Secondly, because of the loop, the consumer will only get called once unless it is running in a thread of its own. E.g. if you do:
public static void main(String[]args) {
StringConsume consumer = new StringConsumeImpl();
consumer.accept("hello");
}
The consumer will put "hello" into the queue, take it out immediately and then stay in the loop, waiting for more elements to take out; however, no one is there to actually add any.
The usual concept of doing what it looks like you're trying to do is "producer/consumer". This means that there is a "producer" that puts items into a queue and a "consumer" taking them out and doing stuff with them.
So in your case your class "consumes" the string by putting it into the queue, making it a "producer", then "consumes" the string by taking it back out of the queue. Of course, there's also the "actual" producer of the string, i.e. the class calling this.
So in general you'd do something like this:
/** Produces random Strings */
class RandomStringProducer {
Random random = new Random();
public String produceString() {
return Double.toString(random.nextDouble());
}
}
/** Prints a String */
class PrintConsumer implements StringConsume {
public void accept(String s) { System.out.println(s); }
}
/** Consumes String by putting it into a queue */
class QueueProducer implements StringConsume {
BlockingQueue<String> queue;
public QueueProducer(BlockingQueue<String> q) { queue = q; }
public void accept(String s) {
try {
queue.put(s);
} catch (InterruptedException e) {
Thread.currentThread().interrupt(); // put() can be interrupted while blocking
}
}
}
public static void main(String[] args) {
// the producer
RandomStringProducer producer = new RandomStringProducer();
// the end consumer
StringConsume printConsumer = new PrintConsumer();
// the queue that links producer and consumer
BlockingQueue<String> queue = new ArrayBlockingQueue<>(100); // ArrayBlockingQueue needs a capacity
// the consumer putting strings into the queue
QueueProducer queuePutter = new QueueProducer(queue);
// now, let's tie them together
// one thread to produce strings and put them into the queue
ScheduledExecutorService producerService = Executors.newScheduledThreadPool(1);
Runnable createStringAndPutIntoQueue = () -> {
String created = producer.produceString();
queuePutter.accept(created);
};
// put string into queue every 100ms
producerService.scheduleAtFixedRate(createStringAndPutIntoQueue, 0, 100, TimeUnit.MILLISECONDS);
// one thread to consume strings
Runnable takeStringFromQueueAndPrint = () -> {
try {
while (true) {
String takenFromQueue = queue.take(); // this will block until a string is available
printConsumer.accept(takenFromQueue);
}
} catch (InterruptedException e) {
Thread.currentThread().interrupt(); // take() is interruptible, which lets the consumer shut down
}
};
// let it run in a different thread
ExecutorService consumerService = Executors.newSingleThreadExecutor();
consumerService.submit(takeStringFromQueueAndPrint);
// this will be printed; we are in the main thread and code is still being executed
System.out.println("the produce/consume has started");
}
So when you run this, there will be three threads: the main thread, the producer thread and the consumer thread. The producer and consumer will be doing their thing concurrently, and the main thread will also continue to run (as exemplified by the System.out.println in the last line).
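If you need to stop this cleanly, one rough sketch is to shut the producer down and interrupt the consumer; it works because queue.take() responds to interruption, for example at the end of main():
// shut down the producer first, then interrupt the blocked consumer
producerService.shutdown();        // stop scheduling new strings
consumerService.shutdownNow();     // interrupts queue.take() in the consumer loop
try {
    producerService.awaitTermination(1, TimeUnit.SECONDS);
    consumerService.awaitTermination(1, TimeUnit.SECONDS);
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
}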
I'm developing a Spring MVC web application. One of its functionalities is file conversion (upload file -> convert -> store on the server).
Some files could be too big to convert on the fly, so I decided to put them in a shared queue after upload.
Files will be converted with priority based on upload time, i.e. FIFO.
My idea is to add the task to the queue in the controller after upload.
There would also be a service executing all tasks in the queue and, if it is empty, waiting until a new task is added. I don't need scheduling; tasks should always be executing when the queue is not empty.
I've read about ExecutorService but I didn't find any example that fits my case.
I'd appreciate any suggestions.
EDIT
Thanks for the answers. I need to clarify my problem:
Basically, I know how to execute tasks; I need to manage the queue of tasks. The user should be able to view the queue and pause, resume or remove a task from the queue.
My task class:
public class ConvertTask implements Callable<String> {
private Converter converter;
private File source;
private File target;
private State state;
private User user;
public ConvertTask(Converter converter, File source, File target, User user) {
this.converter = converter;
this.source = source;
this.target = target;
this.user = user;
this.state = State.READY;
}
@Override
public String call() throws Exception {
if (this.state == State.READY) {
BaseConverterService converterService = ConverterUtils.getConverterService(this.converter);
converterService.convert(this.source, this.target);
MailSendServiceUtil.send(user.getEmail(), target.getName());
return "success";
}
return "task not ready";
}
}
I also created a class responsible for managing the queue/tasks, following your suggestions:
@Component
public class MyExecutorService {
private LinkedBlockingQueue<ConvertTask> converterQueue = new LinkedBlockingQueue<>();
private ExecutorService executorService = Executors.newSingleThreadExecutor();
public void add(ConvertTask task) throws InterruptedException {
converterQueue.put(task);
}
public void execute() throws InterruptedException, ExecutionException {
while (!converterQueue.isEmpty()) {
ConvertTask task = converterQueue.peek();
Future<String> statusFuture = executorService.submit(task);
String status = statusFuture.get();
converterQueue.take();
}
}
}
My point is how to execute tasks while the queue is not empty, and how to resume when a new task is added after the queue was previously empty. I think some code belongs in the add(ConvertTask task) method.
Edited after question updates
You don't need to create any queue for the tasks since the ThreadPoolExecutor implementation has its own queue. Here's the source code of Oracle's Java 8 implementation of newSingleThreadExecutor() method:
public static ExecutorService newSingleThreadExecutor() {
return new FinalizableDelegatedExecutorService
(new ThreadPoolExecutor(1, 1,
0L, TimeUnit.MILLISECONDS,
new LinkedBlockingQueue<Runnable>()));
}
So you just submit a new task directly, and it gets queued by the ThreadPoolExecutor:
@Component
public class MyExecutorService {
private ExecutorService executorService = Executors.newSingleThreadExecutor();
public void add(ConvertTask task) throws InterruptedException {
Future<String> statusFuture = executorService.submit(task);
}
}
If you're worried about the bounds of your queue, you can create a queue instance explicitly and supply it to a ThreadPoolExecutor constructor.
private ExecutorService executorService = new ThreadPoolExecutor(1, 1,
0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>(MAX_SIZE));
Please note that I have removed the line
String status = statusFuture.get();
because the get() call is blocking. If you have this line in the same thread where you submit, your code is not asynchronous anymore. You should store the Future objects and check them asynchronously in a different thread, or you can consider using CompletableFuture, introduced in Java 8. Check out this post.
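For instance, a minimal sketch of that callback style (the handler bodies are just placeholders): submit the conversion and react to completion without blocking the submitting thread.
public void add(ConvertTask task) {
    CompletableFuture
            .supplyAsync(() -> {
                try {
                    return task.call();            // runs on the single-threaded executor
                } catch (Exception e) {
                    throw new CompletionException(e);
                }
            }, executorService)
            .whenComplete((status, error) -> {
                // called when the conversion finishes; status is the String returned by call()
                if (error != null) {
                    // log or notify about the failure
                }
            });
}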
After the upload you should return a response immediately; the client can't wait for the resource too long (though you can change that in the client settings). Anyway, if you are running a background task, you can do it without interacting with the client, or notify the client while execution is in progress. This is an example of a Callable used with an executor service:
/**
* Created by Roma on 17.02.2015.
*/
class SumTask implements Callable<Integer> {
private int num = 0;
public SumTask(int num){
this.num = num;
}
@Override
public Integer call() throws Exception {
int result = 0;
for(int i=1;i<=num;i++){
result+=i;
}
return result;
}
}
public class CallableDemo {
Integer result;
Integer num;
public Integer getNumValue() {
return 123;
}
public Integer getNum() {
return num;
}
public void setNum(Integer num) {
this.num = num;
}
public Integer getResult() {
return result;
}
public void setResult(Integer result) {
this.result = result;
}
ExecutorService service = Executors.newSingleThreadExecutor();
public String execute() {
try{
Future<Integer> future = service.submit(new SumTask(num));
result = future.get();
//System.out.println(result);
service.shutdown();
}
catch(Exception e)
{
e.printStackTrace();
}
return "showlinks";
}
}
My tasks contain some identifier. I need a ScheduledExecutorService which will execute given tasks after a specified time interval (as all standard implementations do), but with one restriction: it must not start a task until the previous task with the same identifier completes (other tasks, of course, must execute concurrently).
In other words, for a given ID all tasks must be executed serially.
Is there a ready-to-use implementation in the standard library or a 3rd-party library, or an easy way to implement it?
Also, the returned ScheduledFuture.cancel must work correctly, because a scheduled task may be cancelled by the currently executing task.
Here is a rough idea of how I would do it. (Not tested)
public class Solution {
private static final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1000);
//a map of taskQueues. We queue all tasks of same ID.
private static final ConcurrentHashMap<Long, BlockingDeque<MyTask>> mapOfTasks = new ConcurrentHashMap<>(1000);
public boolean submitWithChecks(Runnable task, long ID, long interval) {
final BlockingDeque<MyTask> queue;
if(mapOfTasks.containsKey(ID)) queue = mapOfTasks.get(ID);
else queue = new LinkedBlockingDeque<>(1000);
//At this point we have a valid queue object
try {
//insert the task into the queue
queue.putLast(new MyTask(task, ID, interval));
} catch (InterruptedException e) {
e.printStackTrace();
return false;
}
//If the queue was already present, it will have been updated by the previous queue.putLast operation
//If the queue was not present, we put it in the map and start a new queueEater thread.
if(!mapOfTasks.containsKey(ID)) {
mapOfTasks.put(ID, queue);
scheduler.submit(new QueueEater(ID)); //start a new task execution queue
}
return true;
}
private class QueueEater implements Runnable {
//This queueEater will consume the queue with this taskID
private final Long ID;
private QueueEater(Long id) {
ID = id;
}
@Override
public void run() {
//QueueEater will poll the mapOfTasks for the queue Object with its associated ID
while (mapOfTasks.containsKey(ID)) {
final BlockingDeque<MyTask> tasks = mapOfTasks.get(ID);
if(tasks.size() == 0) {
mapOfTasks.remove(ID);
return; //if task queue empty kill this thread;
}
final MyTask myTask;
try {
myTask = tasks.takeFirst();
//schedule the task with given interval
final Future future = scheduler.schedule(myTask.getTask(), myTask.getInterval(), TimeUnit.SECONDS);
future.get(); //wait till this task gets executed before scheduling new task
} catch (InterruptedException | ExecutionException e) {
e.printStackTrace();
}
}
}
}
private class MyTask {
private final Runnable task;
private final long ID;
private final long interval;
public long getInterval() {
return interval;
}
public long getID() {
return ID;
}
public Runnable getTask() {
return task;
}
private MyTask(Runnable task, long id, long interval) {
this.task = task;
ID = id;
this.interval = interval;
}
}
}
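An alternative, untested sketch with illustrative names: instead of one queue-eater per ID, keep the tail of each ID's chain in a map and append new tasks with CompletableFuture, so tasks for the same ID run strictly one after another on a shared scheduler.
import java.util.concurrent.*;
class PerIdScheduler {
    private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(8);
    private final ConcurrentHashMap<Long, CompletableFuture<Void>> tails = new ConcurrentHashMap<>();
    public CompletableFuture<Void> schedule(long id, Runnable task, long delaySeconds) {
        // the remapping function only builds the chain; the task itself runs on the scheduler
        return tails.compute(id, (key, tail) -> {
            CompletableFuture<Void> previous =
                    (tail == null) ? CompletableFuture.completedFuture(null) : tail;
            return previous
                    .handle((ok, error) -> (Void) null)   // a failed task must not stall its successors
                    .thenCompose(ignored -> runLater(task, delaySeconds));
        });
    }
    private CompletableFuture<Void> runLater(Runnable task, long delaySeconds) {
        CompletableFuture<Void> done = new CompletableFuture<>();
        scheduler.schedule(() -> {
            try {
                task.run();
                done.complete(null);
            } catch (Throwable t) {
                done.completeExceptionally(t);
            }
        }, delaySeconds, TimeUnit.SECONDS);
        return done;
    }
}
This keeps a single shared scheduler, but it does not cover the cancellation requirement and never removes finished chains from the map; both would need extra bookkeeping.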
I have a JobService that processes larger jobs. Jobs are dynamically subdivided into multiple tasks, tasks might also generate sub-tasks, etc., so it's not possible to predict the total number of tasks for a job. Each task queues itself to run via ExecutorService.submit(...). The problem is that it seems like I have to create a separate ExecutorService for each job, since the only way to tell when the 'job queue' is complete is to use ExecutorService.awaitTermination(...). This seems inefficient, though, because I can't share a single thread pool between the jobs and their ExecutorServices.
I'm looking for some alternatives. I was thinking of using an AtomicInteger for each job: incrementing it when I submit a new task, decrementing it when a task finishes. But then I have to poll for when it reaches zero, which seems messy, and there is some exception-handling mess as well.
It seems like there must be a better solution?
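As a side note on that counter idea, java.util.concurrent.Phaser does this kind of dynamic register/arrive bookkeeping without polling; a rough, untested sketch with made-up names:
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Phaser;
class Job {
    private final ExecutorService sharedPool;
    private final Phaser tracker = new Phaser(1); // the "1" registers the job itself
    Job(ExecutorService sharedPool) {
        this.sharedPool = sharedPool;
    }
    void submitTask(Runnable work) {
        tracker.register();                        // count the new task
        sharedPool.submit(() -> {
            try {
                work.run();                        // may call submitTask(...) again for sub-tasks
            } finally {
                tracker.arriveAndDeregister();     // counts as finished even if it threw
            }
        });
    }
    void awaitCompletion() {
        // arrive for the job's own registration and wait until every task has arrived
        tracker.arriveAndAwaitAdvance();
    }
}
Because a task registers its sub-tasks before it arrives, the phase cannot advance while work is still being generated.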
submit() returns a Future object that can be used to wait for the completion of a task. You could keep track of these and add a method that recursively blocks until all subtasks are done. This way you can reuse the executor wherever you need it.
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.atomic.AtomicBoolean;
public class JobExecutor {
ExecutorService executorService = Executors.newFixedThreadPool(1);
private class Task implements Runnable {
private final String name;
private final Task[] subtasks;
private final ExecutorService executorService;
private volatile boolean started = false;
private Future<?> taskFuture;
// Separate list from subtasks because this is what you'll probably
// actually use as you may not be passing subtasks as constructor args
private final List<Task> subtasksToWaitOn = new ArrayList<Task>();
public Task(String name, ExecutorService executorService,
Task... subtasks) {
this.name = name;
this.executorService = executorService;
this.subtasks = subtasks;
}
public synchronized void start() {
if (!started) {
started = true;
taskFuture = executorService.submit(this);
}
}
public synchronized void blockTillDone() {
if (started) {
try {
taskFuture.get();
} catch (InterruptedException e) {
// TODO Handle
} catch (ExecutionException e) {
// TODO Handle
}
for (Task subtaskToWaitOn : subtasksToWaitOn) {
subtaskToWaitOn.blockTillDone();
}
} else {
// TODO throw exception
}
}
@Override
public void run() {
for (Task subtask : subtasks) {
subtask.start();
subtasksToWaitOn.add(subtask);
}
System.out.println("My name is: " + name);
}
}
void testSubmit() {
Task subsubTask1 = new Task("Subsubtask1", executorService);
Task subtask1 = new Task("Subtask1", executorService, subsubTask1);
Task subtask2 = new Task("Subtask2", executorService);
Task subtask3 = new Task("Subtask3", executorService);
Task job = new Task("Job", executorService, subtask1, subtask2,
subtask3);
job.start();
job.blockTillDone();
System.out.println("Job done!");
}
public static void main(String[] args) {
new JobExecutor().testSubmit();
}
}
Prints out:
My name is: Job
My name is: Subtask1
My name is: Subtask2
My name is: Subtask3
My name is: Subsubtask1
Job done!
If you're on Java 7 (or Java 6 with the backport library http://www.cs.washington.edu/homes/djg/teachingMaterials/grossmanSPAC_forkJoinFramework.html), you might want to consider a fork/join pool for this sort of thing:
class MainTask extends RecursiveTask<Long> {
@Override
protected Long compute() {
SubTask subtask0 = new SubTask(0L);
SubTask subtask1 = new SubTask(1L);
SubTask subtask2 = new SubTask(2L);
SubTask subtask3 = new SubTask(3L);
SubTask subtask4 = new SubTask(4L);
SubTask subtask5 = new SubTask(5L);
subtask1.fork();
subtask2.fork();
subtask3.fork();
subtask4.fork();
subtask5.fork();
return subtask0.compute() +
subtask1.join() +
subtask2.join() +
subtask3.join() +
subtask4.join() +
subtask5.join();
}
}
class SubTask extends RecursiveTask<Long> {
private Long rawResult = null;
private Long expected = null;
public SubTask(long expected) {
this.expected = expected;
}
@Override
protected Long compute() {
return expected;
}
}
public static void main( String[] args )
{
ForkJoinPool forkJoinPool = new ForkJoinPool();
Long result = forkJoinPool.invoke(new MainTask());
System.out.println(result);
}
Obviously this has hardcoded subtasks, but there's no reason you can't pass parameters to your main task and use them to generate subtasks. The subtasks themselves don't all have to be of the same type, but they should all extend RecursiveTask. Realistically, if a task generates subtasks (like MainTask above), at least one of the subtasks should have compute() called directly on it (rather than a fork and a join), so that the current thread can execute some of the computation and let other threads do the rest.
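To illustrate that point, here is a rough sketch of a parameterized task that generates its own subtasks by splitting a range, forking one half and computing the other directly in the current thread:
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;
// Sums the range [from, to) by splitting until chunks are small enough.
class RangeSumTask extends RecursiveTask<Long> {
    private static final long THRESHOLD = 1_000;
    private final long from;
    private final long to;
    RangeSumTask(long from, long to) {
        this.from = from;
        this.to = to;
    }
    @Override
    protected Long compute() {
        if (to - from <= THRESHOLD) {
            long sum = 0;
            for (long i = from; i < to; i++) {
                sum += i;
            }
            return sum;
        }
        long mid = from + (to - from) / 2;
        RangeSumTask left = new RangeSumTask(from, mid);
        RangeSumTask right = new RangeSumTask(mid, to);
        right.fork();                 // let another worker pick up one half
        return left.compute()         // compute the other half in the current thread
                + right.join();
    }
}
// Usage: new ForkJoinPool().invoke(new RangeSumTask(0, 1_000_000)) returns the sum of 0..999999.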
I want to write a reusable piece of code that allows waiting conditions while submitting tasks to an executor service. There are a lot of implementations of neat ways of blocking if too many tasks are queued, e.g. here.
I need an executor that evaluates all waiting threads every time a task is finished. To decide whether a task is allowed to be submitted at the moment, the current state of all active tasks must be considered. I came up with the following solution, which doesn't have to scale to multiple submitters or a high degree of simultaneously executing tasks.
Question: Is the following code safe to use, or is there some flaw that I'm missing? The person implementing the aquireAccess method of the ConditionEvaluator<T> must ensure that the way the state of the threads is queried is thread-safe, but the implementer needn't safeguard the iteration over the activeTasks collection. Here is the code:
public class BlockingExecutor<T extends Runnable> {
private final Executor executor;
private final ConditionEvaluator<T> evaluator;
final ReentrantLock lock = new ReentrantLock();
final Condition condition = this.lock.newCondition();
final LinkedList<T> active = new LinkedList<T>();
private final long reevaluateTime;
private final TimeUnit reevaluateTimeUnit;
public BlockingExecutor(Executor executor, ConditionEvaluator<T> evaluator) {
this.evaluator = evaluator;
this.executor = executor;
this.reevaluateTimeUnit = null;
this.reevaluateTime = 0;
}
public BlockingExecutor(Executor executor, ConditionEvaluator<T> evaluator, long reevaluateTime, TimeUnit reevaluateTimeUnit) {
this.evaluator = evaluator;
this.executor = executor;
this.reevaluateTime = reevaluateTime;
this.reevaluateTimeUnit = reevaluateTimeUnit;
}
public void submitTask(final T task) throws InterruptedException {
this.lock.lock();
try {
do{
if (this.reevaluateTimeUnit != null) {
this.condition.await(this.reevaluateTime, this.reevaluateTimeUnit);
} else {
this.condition.await();
}
}while(!this.evaluator.aquireAccess(this.active, task));
this.active.add(task);
this.executor.execute(new Runnable() {
@Override
public void run() {
try {
task.run();
} finally {
BlockingExecutor.this.lock.lock();
try{
BlockingExecutor.this.active.remove(task);
BlockingExecutor.this.condition.signalAll();
}finally{
BlockingExecutor.this.lock.unlock();
}
}
}
});
} finally {
this.lock.unlock();
}
}
}
public interface ConditionEvaluator<T extends Runnable> {
public boolean aquireAccess(Collection<T> activeList,T task);
}
Question: Can the code be improved?