Threads Executor non blocking list - java

I have a class that iterates through a list of links. For each link I want to do some processing, so I want to run a task for each link.
Here is the code (Main):
ThreadProcessing tp = new ThreadProcessing();
for(int i = 0; i < listUrl.size(); i++)
{
tp.add(listUrl.get(i));
}
For the ThreadProcessing class, I have to use the Executor interface.
The point is: I have to create a pool of 30 threads. The ThreadProcessing class holds a list of non-blocking tasks (it can contain more than 30 of them, of course). You can add as many tasks as you want and the class is responsible for running all of them.
So that's what I tried to do (it does not work). The ThreadProcessing class:
public class ThreadProcessing {
List<Runnable> runnables = new ArrayList<Runnable>();
ExecutorService pool;
@PostConstruct
public void init()
{
pool = Executors.newFixedThreadPool(30);
}
public void add(String url)
{
runnables.add(createRunnable(url));
executeRunnables(pool, runnables);
}
public static void executeRunnables(final ExecutorService service, List<Runnable> runnables){
for(Runnable r : runnables){
service.execute(r);
}
service.shutdown();
}
private Runnable createRunnable(final String url){
Runnable getContentFromURL = new Runnable(){
public void run(){
//My treatment with url
}
};
return getContentFromURL;
}
}
I hope I have not been too vague in my explanation, thank you.

public void add(String url) {
Runnable job = createRunnable(url);
runnables.add(job);
pool.execute(job);
}
Also, do not shut the pool down unless you are finished submitting/adding jobs. Of course, in this example, you don't really need the runnables List.
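Putting those two points together, a minimal corrected sketch of ThreadProcessing could look like the following (the shutdownAndWait method name is an assumption added here; the caller would invoke it once after the last URL has been added):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ThreadProcessing {
    private final ExecutorService pool = Executors.newFixedThreadPool(30);

    // Submit one task per URL; the pool runs at most 30 at a time and queues the rest.
    public void add(final String url) {
        pool.execute(new Runnable() {
            public void run() {
                // process the URL here
            }
        });
    }

    // Call once, after the last add(), to stop accepting work and wait for completion.
    public void shutdownAndWait() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }
}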

Try something like:
public void main() throws Exception {
ExecutorService es = Executors.newFixedThreadPool(30);
BlockingQueue<String> urls =
new ArrayBlockingQueue<String>(listUrl.size(), false, listUrl);
LinkedList<Future<?>> futures = new LinkedList<Future<?>>();
for(int i = 0 ; i < 30 ; ++i) {
futures.add(es.submit(new URLRunnable(urls)));
}
// Wait for all the futures to return
for(Future<?> f : futures) {
f.get();
}
}
public class URLRunnable implements Runnable {
private final BlockingQueue<String> urls;
URLRunnable(BlockingQueue<String> urls) { this.urls = urls; }
@Override
public void run() {
String url = null;
while((url = urls.poll()) != null) {
// do something with url
}
}
}


Java. Consumer - Producer with BlockingQueue. Search tool

I was trying to implement a Consumer-Producer problem with a BlockingQueue. To give it some purpose, I decided to write a file searching tool.
The search mechanism works recursively, and every new directory gets a new thread pool to increase the speed of searching.
My problem is that I have no idea how to implement a mechanism that stops the printing threads (consumers) at the end, once the searching threads have finished their job.
I tried ideas like POISON PILLS, but it doesn't work well (the threads stop before printing any results). Any ideas how I can do that?
Here is some code:
Searching mechanism:
public class SearchingAlgorithm implements Runnable {
private final File file;
private BlockingQueue<File> queue;
private ExecutorService executor;
public SearchingAlgorithm(File fileName, BlockingQueue<File> queue) {
this.file = fileName;
this.queue = queue;
this.executor = Executors.newWorkStealingPool();
}
@Override
public void run() {
try {
searchDeep();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
private void searchDeep() throws InterruptedException {
File[] files = file.listFiles();
if (files != null) {
for (File fil : files) {
if (fil.isDirectory()) {
executor.submit(new SearchingAlgorithm(fil, this.queue));
} else {
this.queue.add(fil);
}
}
}
}
}
Printer:
public class ContainingCheckAlgorithm implements Runnable {
private BlockingQueue<File> queue;
// private ExecutorService executor;
private String keyWord;
public ContainingCheckAlgorithm(BlockingQueue<File> queue, String keyWord) {
this.queue = queue;
this.keyWord = keyWord;
// executor = Executors.newFixedThreadPool(2);
}
@Override
public void run() {
try {
printFile();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
private void printFile() throws InterruptedException {
while (true) {
File takeFile = queue.take();
String fileName = takeFile.getAbsolutePath()
.toLowerCase();
boolean isContainingKeyWord = fileName.contains(keyWord.toLowerCase());
if (isContainingKeyWord) {
System.out.println(takeFile.getAbsolutePath());
}
}
}
}
Main test class:
public class MainClass {
public static void main(String[] args) throws InterruptedException {
ExecutorService executor = Executors.newFixedThreadPool(2);
BlockingQueue<File> queue = new LinkedBlockingQueue<>();
File fileName = new File("C:/");
SearchingAlgorithm sa = new SearchingAlgorithm(fileName, queue);
executor.submit(sa);
ContainingCheckAlgorithm ca = new ContainingCheckAlgorithm(queue, "Slipknot");
executor.submit(ca);
executor.shutdown();
}
}
Split the whole work into 2 stages. In the first stage, the SearchingAlgorithm instances do their work and ContainingCheckAlgorithm waits for new jobs whenever the queue is empty. In the second stage, all SearchingAlgorithm instances have finished, and ContainingCheckAlgorithm quits when it finds the queue empty. To detect that the queue is empty, ContainingCheckAlgorithm uses queue.poll(timeout) instead of queue.take().
And you do not need to create a new thread pool for each SearchingAlgorithm.
As you said, I tried to do it this way:
SearchingAlgorithm now shares a thread pool with the other SearchingAlgorithm instances.
SEARCHING:
public class SearchingAlgorithm implements Runnable {
private final File file;
private BlockingQueue<File> queue;
private ExecutorService executor;
public SearchingAlgorithm(File fileName, BlockingQueue<File> queue, ExecutorService executor) {
this.file = fileName;
this.queue = queue;
this.executor = executor;
}
@Override
public void run() {
try {
searchDeep();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
private void searchDeep() throws InterruptedException {
File[] files = file.listFiles();
if (files != null) {
for (File fil : files) {
if (fil.isDirectory()) {
executor.submit(new SearchingAlgorithm(fil, this.queue, executor));
} else {
this.queue.add(fil);
}
}
}
}
}
Now ContainingCheckAlgorithm needs to share a CountDownLatch with the main class, because I need some mechanism to shut down the thread pool in the main class. It also uses poll(timeout) as you said, and my threads finally finish their job.
CHECKING
public class ContainingCheckAlgorithm implements Runnable {
private BlockingQueue<File> queue;
private String keyWord;
private CountDownLatch latch;
public ContainingCheckAlgorithm(BlockingQueue<File> queue, String keyWord, CountDownLatch latch) {
this.queue = queue;
this.keyWord = keyWord;
this.latch = latch;
}
@Override
public void run() {
try {
printFile();
latch.countDown();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
private void printFile() throws InterruptedException {
File takeFile;
while ((takeFile = queue.poll(1, TimeUnit.SECONDS)) != null) {
String fileName = takeFile.getName()
.toLowerCase();
boolean isContainingKeyWord = fileName.contains(keyWord.toLowerCase());
if (isContainingKeyWord) {
System.out.println(takeFile.getAbsolutePath());
}
}
}
}
MAIN:
public class MainClass {
public static void main(String[] args) throws InterruptedException {
ExecutorService executor = Executors.newCachedThreadPool();
BlockingQueue<File> queue = new LinkedBlockingQueue<>();
CountDownLatch latch = new CountDownLatch(1);
File fileName = new File("C:/");
SearchingAlgorithm sa = new SearchingAlgorithm(fileName, queue, executor);
executor.submit(sa);
ContainingCheckAlgorithm ca = new ContainingCheckAlgorithm(queue, "Slipknot", latch);
executor.submit(ca);
latch.await();
executor.shutdown();
}
}
It looks weird, but I wonder what would happen if:
More than one thread ran as ContainingCheckAlgorithm?
A SearchingAlgorithm took more than one second to find a file, so ContainingCheckAlgorithm finished its work too early? Obviously I can raise the timeout to two seconds or more, but we always try to optimize our programs.
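One way to avoid guessing a timeout at all is to count the directory tasks that are still running and, when the last one finishes, put one poison pill per consumer into the queue so each ContainingCheckAlgorithm can exit deterministically. The sketch below is only an illustration of that idea; the pending counter, the POISON sentinel and the consumerCount parameter are additions, not part of the code above:

import java.io.File;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.atomic.AtomicInteger;

public class SearchingAlgorithm implements Runnable {
    static final File POISON = new File(""); // sentinel that tells consumers to stop

    private final File file;
    private final BlockingQueue<File> queue;
    private final ExecutorService executor;
    private final AtomicInteger pending;    // directory tasks submitted but not yet finished
    private final int consumerCount;

    public SearchingAlgorithm(File file, BlockingQueue<File> queue, ExecutorService executor,
                              AtomicInteger pending, int consumerCount) {
        this.file = file;
        this.queue = queue;
        this.executor = executor;
        this.pending = pending;
        this.consumerCount = consumerCount;
    }

    @Override
    public void run() {
        try {
            File[] files = file.listFiles();
            if (files != null) {
                for (File f : files) {
                    if (f.isDirectory()) {
                        pending.incrementAndGet(); // count the child task before submitting it
                        executor.submit(new SearchingAlgorithm(f, queue, executor, pending, consumerCount));
                    } else {
                        queue.add(f);
                    }
                }
            }
        } finally {
            // The task that brings the counter to zero is the last one; it shuts the consumers down.
            if (pending.decrementAndGet() == 0) {
                for (int i = 0; i < consumerCount; i++) {
                    queue.add(POISON);
                }
            }
        }
    }
}

The main class would create the counter as new AtomicInteger(1) to account for the root task, and each consumer would loop with while ((takeFile = queue.take()) != SearchingAlgorithm.POISON) { ... }. Completion then no longer depends on how long a single search step takes, and any number of consumer threads can be used.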

Android how to group async tasks together like in iOS

I have a function in an iOS app that uses dispatch_group to group multiple REST requests:
static func fetchCommentsAndTheirReplies(articleId: String, failure: ((NSError)->Void)?, success: (comments: [[String: AnyObject]], replies: [[[String: AnyObject]]], userIds: Set<String>)->Void) {
var retComments = [[String: AnyObject]]()
var retReplies = [[[String: AnyObject]]]()
var retUserIds = Set<String>()
let queue = dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0)
Alamofire.request(.GET, API.baseUrl + API.article.listCreateComment, parameters: [API.article.articleId: articleId]).responseJSON {
response in
dispatch_async(queue) {
guard let comments = response.result.value as? [[String: AnyObject]] else {
failure?(Helper.error())
return
}
print(comments)
retComments = comments
let group = dispatch_group_create()
for (commentIndex, comment) in comments.enumerate() {
guard let id = comment["_id"] as? String else {continue}
let relevantUserIds = helperParseRelaventUserIdsFromEntity(comment)
for userId in relevantUserIds {
retUserIds.insert(userId)
}
retReplies.append([[String: AnyObject]]())
dispatch_group_enter(group)
Alamofire.request(.GET, API.baseUrl + API.article.listCreateReply, parameters: [API.article.commentId: id]).responseJSON {
response in
dispatch_async(queue) {
if let replies = response.result.value as? [[String: AnyObject]] {
for (_, reply) in replies.enumerate() {
let relevantUserIds = helperParseRelaventUserIdsFromEntity(reply)
for userId in relevantUserIds {
retUserIds.insert(userId)
}
}
retReplies[commentIndex] = replies
}
dispatch_group_leave(group)
}
}
}
dispatch_group_wait(group, DISPATCH_TIME_FOREVER)
success(comments: retComments, replies: retReplies, userIds: retUserIds)
}
}
}
As you can see from my code, I fetch all the comments under the same article, then fetch the corresponding replies under each comment. After all requests are done, I invoke my success callback. This can be achieved using GCD's dispatch_group.
Now I am migrating the same functionality to android.
public static void fetchCommentsAndTheirReplies(Context context, String articleId, final StringBuffer outErrorMessage, final Runnable failure, final ArrayList<JSONObject> outComments, final ArrayList<ArrayList<JSONObject>> outReplies, final HashSet<String> outUserIds, final Runnable success) {
final RequestQueue queue = Volley.newRequestQueue(context);
HashMap<String, String> commentParams = new HashMap<>();
commentParams.put(API.article.articleId, articleId);
JsonArrayRequest commentRequest = new JsonArrayRequest(Request.Method.GET, API.baseUrl + API.article.listCreateComment, new JSONObject(commentParams), new Response.Listener<JSONArray>() {
@Override
public void onResponse(JSONArray response) {
try {
for (int i = 0; i < response.length(); i++) {
JSONObject comment = response.getJSONObject(i);
outComments.add(comment);
outUserIds.addAll(helperParseRelaventUserIdsFromEntity(comment));
outReplies.add(new ArrayList<JSONObject>());
//TODO: DISPATCH_GROUP?
String id = comment.getString("_id");
HashMap<String, String> replyParams = new HashMap<>();
replyParams.put(API.article.commentId, id);
final int finalI = i;
JsonArrayRequest replyRequest = new JsonArrayRequest(Request.Method.GET, API.baseUrl + API.article.listCreateReply, new JSONObject(replyParams), new Response.Listener<JSONArray>() {
@Override
public void onResponse(JSONArray response) {
try {
for (int j = 0; j < response.length(); j++) {
JSONObject reply = response.getJSONObject(j);
outUserIds.addAll(helperParseRelaventUserIdsFromEntity(reply));
outReplies.get(finalI).add(reply);
}
} catch (JSONException ex) {}
}
}, new Response.ErrorListener() {
@Override
public void onErrorResponse(VolleyError error) {}
});
queue.add(replyRequest);
}
success.run();
} catch (JSONException ex) {}
}
}, new Response.ErrorListener() {
@Override
public void onErrorResponse(VolleyError error) {
outErrorMessage.append(error.getMessage());
failure.run();
}
});
queue.add(commentRequest);
}
Note that with this code, success is executed right after I get all the comments, but before all the replies have been fetched.
So how can I group the requests and delay the callback?
I am working on a hairy implementation like
taskCount++;
if (taskCount == totalCount) {
success.run();
}
in the reply callback, but it seems very tedious.
You can simply do it with this class I made to mimic the iOS behavior. Call enter() and leave() the same way you did in iOS with dispatch_group_enter and dispatch_group_leave, and call notify() just after the requests you want to group, just like dispatch_group_notify. It also uses a Runnable the same way iOS uses blocks:
public class DispatchGroup {
private int count = 0;
private Runnable runnable;
public DispatchGroup()
{
super();
count = 0;
}
public synchronized void enter(){
count++;
}
public synchronized void leave(){
count--;
notifyGroup();
}
public void notify(Runnable r) {
runnable = r;
notifyGroup();
}
private void notifyGroup(){
if (count <=0 && runnable!=null) {
runnable.run();
}
}
}
Hope it helps ;)
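For example, wired into the Volley code from the question, the grouping could look roughly like this sketch (only the changed lines are shown; queue, response, replyRequest and success are the variables from the question):

final DispatchGroup group = new DispatchGroup();

for (int i = 0; i < response.length(); i++) {
    // ... build replyRequest exactly as in the question ...
    group.enter();            // one enter() per reply request, before it is queued
    queue.add(replyRequest);
}

// Registered after the loop so the callback fires only once every enter()
// has been matched by a leave(); with zero comments it fires immediately,
// mirroring dispatch_group_notify.
group.notify(new Runnable() {
    @Override
    public void run() {
        success.run();
    }
});

Each reply listener (both onResponse and onErrorResponse) would then call group.leave() as its last statement, and the unconditional success.run() that currently sits after the comment loop goes away.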
Here is the Kotlin version of Damien Praca's answer. It allows you to use Kotlin lambdas like this:
val dispatchGroup = DispatchGroup()
dispatchGroup.enter()
// Some long running task
dispatchGroup.leave()
dispatchGroup.notify {
// Some code to run after all dispatch groups complete
}
class DispatchGroup {
private var count = 0
private var runnable: (() -> Unit)? = null
init {
count = 0
}
@Synchronized
fun enter() {
count++
}
@Synchronized
fun leave() {
count--
notifyGroup()
}
fun notify(r: () -> Unit) {
runnable = r
notifyGroup()
}
private fun notifyGroup() {
if (count <= 0 && runnable != null) {
runnable!!()
}
}
}
There is no direct analogue of dispatch_group in plain Java or Android. I can recommend a few rather sophisticated techniques to produce a really clean and elegant solution if you're ready to invest some extra time in it. It's not gonna be one or two lines of code, unfortunately.
Use RxJava with parallelization. RxJava provides a clean way to dispatch multiple tasks, but it works sequentially by default. See this article to make it execute tasks concurrently.
Although this is not exactly the intended use case, you can try the ForkJoinPool to execute your group of tasks and receive a single result afterwards.
You may use Threads and Thread.join() with Handlers as an option.
Quote from: https://docs.oracle.com/javase/tutorial/essential/concurrency/join.html
The join method allows one thread to wait for the completion of another. If t is a Thread object whose thread is currently executing, t.join(); causes the current thread to pause execution until t's thread terminates. Overloads of join allow the programmer to specify a waiting period. However, as with sleep, join is dependent on the OS for timing, so you should not assume that join will wait exactly as long as you specify.
Like sleep, join responds to an interrupt by exiting with an InterruptedException.
EDIT:
You should also check my event dispatcher gist. You may like it.
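As a minimal, self-contained sketch of the Thread.join() option mentioned above (the URL list and the empty task body are placeholders, not the gist):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class JoinExample {
    public static void main(String[] args) throws InterruptedException {
        List<String> urls = Arrays.asList("url1", "url2", "url3"); // placeholder inputs
        List<Thread> workers = new ArrayList<Thread>();
        for (final String url : urls) {
            Thread t = new Thread(new Runnable() {
                @Override
                public void run() {
                    // perform the request for this url
                }
            });
            workers.add(t);
            t.start();
        }
        for (Thread t : workers) {
            t.join(); // blocks until that worker terminates
        }
        // every request has completed at this point; on Android, hand the
        // results back to the main thread via a Handler here
    }
}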
I use java.util.concurrent.CountDownLatch to achieve the goal.
First of all I made an interface for each task.
interface GroupTask {
void onProcessing(final CountDownLatch latch);
}
Then I create a class to handle grouping tasks.
interface MyDisptchGroupObserver {
void onAllGroupTaskFinish();
}
class MyDisptchGroup {
private static final int MSG_ALLTASKCOMPLETED = 300;
private CountDownLatch latch;
private MyDisptchGroupObserver observer;
private MsgHandler msgHandler;
private class MsgHandler extends Handler {
MsgHandler(Looper looper) {
super(looper);
}
@Override
public void handleMessage(Message msg) {
switch(msg.what) {
case MSG_ALLTASKCOMPLETED:
observer.onAllGroupTaskFinish();
break;
default:
break;
}
}
}
MyDisptchGroup(List<GroupTask> tasks, MyDisptchGroupObserver obj) {
latch = new CountDownLatch(tasks.size());
observer = obj;
msgHandler = new MsgHandler(getActivity().getMainLooper());
new Thread( new Runnable() {
@Override
public void run() {
try {
latch.await();
Log.d(TAG, "========= All Tasks Completed =========");
msgHandler.sendEmptyMessage(MSG_ALLTASKCOMPLETED);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}).start();
for( GroupTask task : tasks ) {
task.onProcessing(latch);
}
}
}
Of course I have more than one task implementation, as shown below.
The Task1
class Task1 implements GroupTask {
@Override
public void onProcessing(final CountDownLatch latch) {
new Thread( new Runnable() {
@Override
public void run() {
// Just implement my task1 stuff here
// The end of the Task1 remember to countDown
latch.countDown();
}
}).start();
}
}
And Task2
class Task2 implements GroupTask {
@Override
public void onProcessing(final CountDownLatch latch) {
new Thread( new Runnable() {
@Override
public void run() {
// Just implement my task2 stuff here
// The end of the Task2 remember to countDown
latch.countDown();
}
}).start();
}
}
Now everything is ready to fire.
ArrayList<GroupTask> allTasks = new ArrayList<GroupTask>();
allTasks.add(new Task1());
allTasks.add(new Task2());
new MyDisptchGroup(allTasks, this);

Waiting for executor Service thread

I have a class which uses the executor service to run a task concurrently.
Code:
class SomeClass{
private static ExecutorService taskThread = Executors.newFixedThreadPool(1, new ThreadFactory() {
private int threadCount = 0;
@Override
public Thread newThread(Runnable r) {
Thread t = new Thread(r);
t.setDaemon(true);
return t;
}
});
static {
Runtime.getRuntime().addShutdownHook(new Thread() {
@Override
public void run() {
// TODO Auto-generated method stub
taskThread.shutdown();
}
});
}
void doSomeTask()
{
DocumentUploader callable = new DocumentUploader(randomID,fileLoc);
FutureTask<String> task1 = new FutureTask<String>(callable);
taskThread.execute(task1);
}
void someFunctionforWait(){
//what here..???
}
}
I have another class named SomeOtherClass which will access the modifications/calculations done by the task1 thread. So I need to wait for task1 to complete. What I intend to do is call someFunctionforWait() from SomeOtherClass to check whether the task has completed and then start its own work.
How do I do that?
You could use Futures, or if you want to stick with the ExecutorService, just call taskThread.shutdown() and then taskThread.awaitTermination(); you could put those statements into your someFunctionforWait() and continue execution after that. Here is a simplified example based on your code:
public class SomeClass {
private ExecutorService taskThread = Executors.newFixedThreadPool(1);
private List<Future<String>> futures = new ArrayList<Future<String>>();
void doSomeTask() {
FutureTask<String> task1 = new FutureTask<String>(new Callable<String>() {
public String call() throws Exception {
System.out.println("thread executing");
Thread.sleep(1000);
return Thread.currentThread().toString();
}
});
taskThread.execute(task1);
futures.add(task1);
};
public void someFunctionforWait() throws InterruptedException, ExecutionException{
taskThread.shutdown();
taskThread.awaitTermination(5, TimeUnit.SECONDS);
System.out.println("joined");
}
public void someFunctionforWaitAlternative() throws InterruptedException, ExecutionException{
for(Future<String> future : futures) {
System.out.println("future val: " + future.get());
}
System.out.println("joined");
}
public static void main(String[] args) throws Exception {
SomeClass c = new SomeClass();
c.doSomeTask();
c.someFunctionforWait();
//c.someFunctionforWaitAlternative();
}
}
Create a Future through taskThread and call get() on it. It will block until the Future is completed:
Future<String> f = taskThread.submit(callable); // concurrent operation
String result = f.get(); // blocks until f completes
// use result
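Applied to the classes in the question, one way to let SomeOtherClass wait is to keep the Future returned by submit() and expose it through a getter. The sketch below replaces DocumentUploader with an inline Callable so it compiles on its own; getTaskResult and doItsTask are illustrative names, not part of the original code:

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

class SomeClass {
    private static final ExecutorService taskThread = Executors.newFixedThreadPool(1);
    private Future<String> taskResult; // kept so other classes can wait on it

    void doSomeTask() {
        // stands in for: taskThread.submit(new DocumentUploader(randomID, fileLoc));
        taskResult = taskThread.submit(new Callable<String>() {
            public String call() {
                return "uploaded"; // placeholder result
            }
        });
    }

    Future<String> getTaskResult() {
        return taskResult;
    }
}

class SomeOtherClass {
    void doItsTask(SomeClass someClass) throws InterruptedException, ExecutionException {
        String result = someClass.getTaskResult().get(); // blocks until task1 completes
        System.out.println("task1 finished with: " + result);
        // continue with the work that depends on task1 here
    }
}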

ExecutorService fixed threads rejecting

I have a problem with some threads.
My script:
1 - loads over 10 million lines from a text file into an array
2 - creates an executor pool of 5 fixed threads
3 - then iterates that list and submits a task for each line to the queue
executor.submit(new MyCustomThread(line,threadTimeout,"[THREAD "+Integer.toString(increment)+"]"));
Now the number of active threads never exceeds the 5 fixed threads, which is good, but I observed that my processor goes to 100% load. I debugged a little and saw that the MyCustomThread constructor is being called for every line, which means that even though I declare 5 fixed threads, the ExecutorService will still create and queue 10 million task objects.
The main question is:
How do I prevent this? I just want tasks to be rejected (or to block) when there is no room, not to create 10 million objects and run them one by one.
Second question:
How do I get the number of currently active threads? I tried threadGroup.activeCount() but it always gives me 5 5 5 5 ....
THE CALLER CLASS :
System.out.println("Starting threads ...");
final ThreadGroup threadGroup = new ThreadGroup("workers");
//ExecutorService executor = Executors.newFixedThreadPool(howManyThreads);
ExecutorService executor = Executors.newFixedThreadPool(5,new ThreadFactory() {
public Thread newThread(Runnable r) {
return new Thread(threadGroup, r);
}
});
int increment = 0;
for(String line : arrayOfLines)
{
if(increment > 10000)
{
//System.out.println("TOO MANY!!");
//System.exit(0);
}
System.out.println(line);
System.out.println(threadGroup.activeCount());
if(threadGroup.activeCount() >= 5)
{
for(int i = 0; i < 10; i++)
{
System.out.println(threadGroup.activeCount());
System.out.println(threadGroup.activeGroupCount());
Thread.sleep(1000);
}
}
try
{
executor.submit(new MyCustomThread(line,threadTimeout,"[THREAD "+Integer.toString(increment)+"]"));
}
catch(Exception ex)
{
continue;
//System.exit(0);
}
increment++;
}
executor.awaitTermination(10, TimeUnit.MILLISECONDS);
executor.shutdown();
THREAD CLASS :
public class MyCustomThread extends Thread
{
private String ip;
private String threadName;
private int threadTimeout = 10;
public MyCustomThread(String ip)
{
this.ip = ip;
}
public MyCustomThread(String ip,int threadTimeout,String threadName)
{
this.ip = ip;
this.threadTimeout = threadTimeout;
this.threadName = threadName;
System.out.println("MyCustomThread constructor has been called!");
}
@Override
public void run()
{
// do some stuff that takes time ....
}
}
Thank you.
You are doing it a bit wrong. The philosophy with executors is that you implement the work unit as a Runnable or a Callable (instead of a Thread). Each Runnable or Callable should do one atomic piece of work which is mutually exclusive of other Runnables or Callables.
Executor services internally use a pool of threads, so creating a thread group and extending Thread yourself is not doing any good.
Try this simple piece:
ExecutorService executor = Executors.newFixedThreadPool(5);
executor.execute(new MyRunnableWorker());
public class MyRunnableWorker implements Runnable{
private String ip;
private String threadName;
private int threadTimeout = 10;
public MyRunnableWorker(String ip){
this.ip = ip;
}
public MyRunnableWorker(String ip,int threadTimeout,String threadName){
this.ip = ip;
this.threadTimeout = threadTimeout;
this.threadName = threadName;
System.out.println("MyRunnableWorker constructor has been called!");
}
@Override
public void run() {
// do some stuff that takes time ....
}
}
This would give you what you want. Also try to test your thread code execution using VisualVM to see how the threads are running and what the load distribution is.
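Regarding the second question (reading the number of currently active threads), a small sketch of one approach is to keep the pool typed as ThreadPoolExecutor and use its getActiveCount() method instead of a ThreadGroup; newFixedThreadPool actually returns a ThreadPoolExecutor, so the cast below is safe:

import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ActiveCountExample {
    public static void main(String[] args) throws InterruptedException {
        ThreadPoolExecutor executor = (ThreadPoolExecutor) Executors.newFixedThreadPool(5);
        for (int i = 0; i < 20; i++) {
            executor.execute(new Runnable() {
                @Override
                public void run() {
                    try {
                        Thread.sleep(500); // simulate work
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
        }
        System.out.println("active threads: " + executor.getActiveCount());
        System.out.println("queued tasks: " + executor.getQueue().size());
        executor.shutdown();
        executor.awaitTermination(1, TimeUnit.MINUTES);
    }
}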
I think your biggest problem here is that MyCustomThread should implement Runnable, not extend Thread. When you use an ExecutorService you let it handle the Thread management (i.e. you don't need to create them.)
Here's an approximation of what I think you're trying to do. Hope this helps.
public class FileProcessor
{
public static void main(String[] args)
{
List<String> lines = readFile();
System.out.println("Starting threads ...");
ExecutorService executor = Executors.newFixedThreadPool(5);
for(String line : lines)
{
try
{
executor.submit(new MyCustomThread(line));
}
catch(Exception ex)
{
ex.printStackTrace();
}
}
try
{
executor.shutdown();
executor.awaitTermination(10, TimeUnit.SECONDS);
}
catch (InterruptedException e)
{
System.out.println("A processor took longer than the await time to complete.");
}
executor.shutdownNow();
}
protected static List<String> readFile()
{
List<String> lines = new ArrayList<String>();
try
{
String filename = "/temp/data.dat";
FileReader fileReader = new FileReader(filename );
BufferedReader bufferedReader = new BufferedReader(fileReader);
String line = null;
while ((line = bufferedReader.readLine()) != null) {
lines.add(line);
}
bufferedReader.close();
}
catch (Exception e)
{
e.printStackTrace();
}
return lines;
}
}
public class MyCustomThread implements Runnable
{
String line;
MyCustomThread(String line)
{
this.line = line;
}
@Override
public void run()
{
System.out.println(Thread.currentThread().getName() + " processed line:" + line);
}
}
EDIT:
This implementation does NOT block on the ExecutorService submit. What I mean by this is that a new instance of MyCustomThread is created for every line in the file regardless of whether any previously submitted MyCustomThreads have completed. You could add a blocking / limiting worker queue to prevent this.
ExecutorService executor = new ThreadPoolExecutor(5, 5, 0L, TimeUnit.MILLISECONDS, new LimitedQueue<Runnable>(10));
An example of a blocking / limiting queue implementation can be found here:
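One common way to write such a LimitedQueue (a sketch, not necessarily the linked implementation) is to extend LinkedBlockingQueue and turn offer() into a blocking put(), so the executor's submit() blocks when the queue is full:

import java.util.concurrent.LinkedBlockingQueue;

// Bounded queue whose offer() blocks instead of failing, so a
// ThreadPoolExecutor using it blocks callers when the queue is full.
public class LimitedQueue<E> extends LinkedBlockingQueue<E> {
    public LimitedQueue(int maxSize) {
        super(maxSize);
    }

    @Override
    public boolean offer(E e) {
        try {
            put(e); // blocks until space is available
            return true;
        } catch (InterruptedException ie) {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}

Alternatively, a ThreadPoolExecutor configured with a bounded ArrayBlockingQueue and the built-in ThreadPoolExecutor.CallerRunsPolicy rejection handler throttles submission without a custom queue.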

Java ThreadPool reporting

I have a worker threadpool set up that executes a bit of work which I want to log in a central place.
To be more precise, I've extended the Thread class into a worker class, which checks the status of a concurrent queue. If it's empty, then it waits. As elements are added by another thread, notify() wakes the workers. Once they've completed the task, they wait for the next element in the queue.
What's the best practice to have each of the threads report their status at the end of each of their tasks?
public class PoolWorker extends Thread {
public ConcurrentLinkedQueue<Device> q;
public PoolWorker(ConcurrentLinkedQueue<Device> q, String type){
this.q = q;
this.type = type;
}
@Override
public void run(){
while (true)
{
Device d = null;
try{
synchronized(q){
while(q.isEmpty())
{
q.wait(); // wait for a notify()
}
d = q.remove();
}
// do some work
// report status of work completed
}
catch (InterruptedException e) {
Thread.currentThread().interrupt(); // stop the worker if it is interrupted while waiting
return;
}
}
}
}
Try to do something like this
ExecutorService exec = Executors.newFixedThreadPool(10);
Runnable runn = new Runnable()
{
@Override
public void run()
{
System.out.println("");
}
};
exec.execute(runn);
As mentioned best way is to use BlockingQueue. Below is the sample code:
public class PoolWorker extends Thread {
public ArrayBlockingQueue<String> q;
public String type;
public PoolWorker(ArrayBlockingQueue<String> q, String type) {
this.q = q;
this.type = type;
}
@Override
public void run() {
while(true){
String work = null;
try {
System.out.println("PoolWorker.run:waiting .............");
work = q.take();
} catch (InterruptedException e) {
e.printStackTrace();
}
System.out.println("PoolWorker.run..work: " + work);
}
}
public static void main(String[] args) throws InterruptedException {
ArrayBlockingQueue<String> pool = new ArrayBlockingQueue<String>(100);
PoolWorker worker = new PoolWorker(pool, "Something");
worker.start();
addWork(pool, "work1");
addWork(pool, "work2");
addWork(pool, "work3");
addWork(pool, "work4");
addWork(pool, "work5");
//Just give enough time to run
Thread.sleep(5000);
}
private static void addWork(ArrayBlockingQueue<String> pool, String work) throws InterruptedException {
System.out.println("PoolWorker.addWork: " + work);
pool.put(work);
}
}
There is nice sample code available in the Java documentation as well:
http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/BlockingQueue.html
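To bring this back to the original question about reporting each task's status in one central place, one option (a sketch; the TaskResult class and the results queue are assumptions added here, not part of the question's code) is to have every worker push a small result object onto a second BlockingQueue that a single reporter thread drains:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ReportingExample {
    // Simple immutable record of one completed unit of work.
    static class TaskResult {
        final String workerName;
        final String item;
        final boolean success;
        TaskResult(String workerName, String item, boolean success) {
            this.workerName = workerName;
            this.item = item;
            this.success = success;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> work = new ArrayBlockingQueue<>(100);
        BlockingQueue<TaskResult> results = new ArrayBlockingQueue<>(100);

        // Worker: takes items, processes them, reports to the results queue.
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    String item = work.take();
                    boolean ok = !item.isEmpty(); // stand-in for the real work
                    results.put(new TaskResult(Thread.currentThread().getName(), item, ok));
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "worker-1");

        // Reporter: the single central place where statuses are logged.
        Thread reporter = new Thread(() -> {
            try {
                while (true) {
                    TaskResult r = results.take();
                    System.out.println(r.workerName + " finished '" + r.item + "' success=" + r.success);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "reporter");

        worker.start();
        reporter.start();
        work.put("work1");
        work.put("work2");
        Thread.sleep(1000); // give the demo time to run, as in the sample above
        worker.interrupt();
        reporter.interrupt();
    }
}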
