Here is a simplified version of my application showing what I'm doing.
/*
in my app's main():
Runner run = new Runner();
run.dowork();
*/
class Runner
{
private int totalWorkers = 2;
private int workersDone = 0;
public synchronized void workerDone()
{
workersDone++;
notifyAll();
}
public synchronized void dowork()
{
workersDone = 0;
//<code for opening a file here, other setup here, etc>
Worker a = new Worker(this);
Worker b = new Worker(this);
while ((line = reader.readLine()) != null)
{
//<a large amount of processing on 'line'>
a.setData(line);
b.setData(line);
while (workersDone < totalWorkers)
{
wait();
}
}
}
}
class Worker implements Runnable
{
private Runner runner;
private String data;
public Worker(Runner r)
{
this.runner = r;
Thread t = new Thread(this);
t.start();
}
public synchronized void setData(String s)
{
this.data = s;
notifyAll();
}
public void run()
{
while (true)
{
synchronized(this)
{
wait();
//<do work with this.data here>
this.runner.workerDone();
}
}
}
}
The basic concept here is that I have a bunch of workers which all do some processing on an incoming line of data, all independently, and write out the data wherever they like - they do not need to report any data back to the main thread or share data with each other.
The problem that I'm having is that this code deadlocks. I'm reading a file of over 1 million lines and I'm lucky to get 100 lines into it before my app stops responding.
The workers, in reality, all do differing amounts of work so I want to wait until they all complete before moving to the next line.
I cannot let the workers process at different speeds and queue the data internally because the files I am processing are too large for this and won't fit in memory.
I cannot give each worker its own FileReader to independently get 'line', because I do a ton of processing on the line before the workers see it, and do not want to have to re-do the processing in each worker.
I know I'm missing some fairly simple aspect of synchronization in Java, but I'm out of ideas for fixing it. If someone could explain what I'm doing wrong here, I would appreciate it.
Working directly with synchronized, wait(), and notify() is definitely tricky.
Fortunately the Java Concurrency API provides some excellent control objects for this sort of thing that are much more intuitive. In particular, look at CyclicBarrier and CountDownLatch; one of them almost certainly will be what you're looking for.
You may also find a ThreadPoolExecutor to be handy for this situation.
Here's a simple example / conversion of your snippet that produces the following output (without deadlock, of course):
Read line: Line 1
Waiting for work to be complete on line: Line 1
Working on line: Line 1
Working on line: Line 1
Read line: Line 2
Waiting for work to be complete on line: Line 2
Working on line: Line 2
Working on line: Line 2
Read line: Line 3
Waiting for work to be complete on line: Line 3
Working on line: Line 3
Working on line: Line 3
All work complete!
import java.io.IOException;
import java.util.concurrent.BrokenBarrierException;
import java.util.concurrent.CyclicBarrier;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class Runner
{
public static void main(String args[]) {
Runner r = new Runner();
try {
r.dowork();
} catch (IOException e) {
// handle
e.printStackTrace();
}
}
CyclicBarrier barrier;
ExecutorService executor;
private int totalWorkers = 2;
public Runner() {
this.barrier = new CyclicBarrier(this.totalWorkers + 1);
this.executor = Executors.newFixedThreadPool(this.totalWorkers);
}
public synchronized void dowork() throws IOException
{
//<code for opening a file here, other setup here, etc>
//BufferedReader reader = null;
//String line;
final Worker worker = new Worker();
for(String line : new String[]{"Line 1", "Line 2", "Line 3"})
//while ((line = reader.readLine()) != null)
{
System.out.println("Read line: " + line);
//<a large amount of processing on 'line'>
for(int c = 0; c < this.totalWorkers; c++) {
final String curLine = line;
this.executor.submit(new Runnable() {
public void run() {
worker.doWork(curLine);
}
});
}
try {
System.out.println("Waiting for work to be complete on line: " + line);
this.barrier.await();
} catch (InterruptedException e) {
// handle
e.printStackTrace();
} catch (BrokenBarrierException e) {
// handle
e.printStackTrace();
}
}
System.out.println("All work complete!");
}
class Worker
{
public void doWork(String line)
{
//<do work with this.data here>
System.out.println("Working on line: " + line);
try {
Runner.this.barrier.await();
} catch (InterruptedException e) {
// handle
e.printStackTrace();
} catch (BrokenBarrierException e) {
// handle
e.printStackTrace();
}
}
}
}
IMHO you have misplaced "workersDone = 0"; it needs to be reset for every line, i.e. inside the read loop:
public synchronized void dowork()
{
// workersDone = 0;
//<code for opening a file here, other setup here, etc>
Worker a = new Worker(this);
Worker b = new Worker(this);
while ((line = reader.readLine()) != null)
{
workersDone = 0;
//<a large amount of processing on 'line'>
a.setData(line);
b.setData(line);
while (workersDone < totalWorkers)
{
wait();
}
}
}
I was trying to implement a reader-writer using notify and wait, but I think I'm stuck.
My sequence goes like this:
RRRRRRRRRRWWWWWWWWW (this happens if main starts with the reader invoked first)
or
WWWWWWWRRRRRRRRRRR (this happens if main starts with the writer invoked first).
It looks like the reader's notify isn't working at all; the writer thread never gets to execute.
If I make the while loop in the run method infinite, then it's just
RRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRR... and the writer never gets a chance to write.
Can you have a look at this?
DATA CLASS
public class Data {
private int q ;
private boolean isAnyOneReading;
public Data() {
}
public void readQ() {
synchronized (this){
isAnyOneReading = true;
System.out.println("Read start "+q);
}
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
synchronized (this){
isAnyOneReading = false;
System.out.println("Read end "+q);
notifyAll();
}
}
public synchronized void writeQ(int q) {
System.out.println(isAnyOneReading);
while (isAnyOneReading){
try{
wait();
} catch (InterruptedException e) {
e.printStackTrace();
System.out.println("Done");
Thread.currentThread().interrupt();
}
}
System.out.println("Write start "+q);
this.q = q;
try{
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
Thread.currentThread().interrupt();
}
System.out.println("Write end "+q);
notifyAll();
}
}
READER CLASS
public class Reader implements Runnable {
private Data data;
private Thread readerThread;
public Reader(Data data) {
this.data = data;
readerThread = new Thread(this, "ReaderThread");
}
void startThread(){
readerThread.start();
}
@Override
public void run() {
int i = 0 ;
while (i != 5){
data.readQ();
i++;
}
}
}
WRITER CLASS
public class Writer implements Runnable{
private Data data;
private Thread writerThread;
public Writer(Data data) {
this.data = data;
writerThread = new Thread(this, "WriterThread");
}
void startThread(){
writerThread.start();
}
@Override
public void run() {
int i = 0 ;
int j = 0 ;
while (j != 5){
data.writeQ(i++);
// i++;
j++;
}
}
}
MAIN CLASS
public class ReaderWriterDemo {
public static void main(String[] args) {
Data data = new Data();
Reader reader = new Reader(data);
Writer writer = new Writer(data);
reader.startThread();
writer.startThread();
}
}
Try removing the Thread.sleep from the Data class and adding Thread.sleep to the run methods instead, like so (pasting one example):
@Override
public void run() {
int i = 0;
while (i != 5) {
data.readQ();
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
} finally {
i++;
}
}
}
The read's notifyAll() works, but it seems that readQ() gets called again and changes isAnyOneReading's value before write() can do anything. That's why the check fails and write() starts waiting again. As Danny Fried suggested, moving Thread.sleep() to the run methods will help.
Looks like a simple case of starvation. Consider your writer's main loop:
while (j != 5){
data.writeQ(i++);
// i++;
j++;
}
data.writeQ() is a synchronized method: The very last thing it does before it returns is to unlock the lock. The very first thing it does on the next call is to re-lock the lock. Not much happens in-between--increment and test a local variable is all.
Java synchronized locks are not fair. (I.e., when a lock becomes available, the system does not guarantee that the winner will be the thread that's been waiting the longest.) In fact, it may be the opposite of fair: The OS may try to maximize efficient use of the CPU(s) by always choosing the thread that's easiest to wake up.
When the writer comes back to call data.writeQ() on each subsequent iteration, it may be that the OS has not even started to wake up the reader, and it simply lets the writer enter the synchronized block again.
Same thing happens with your reader. The code is a bit more complicated, but just like in the writer, the very last thing that data.readQ() does before returning is to unlock the lock, and the very first thing that it does on the next call is to lock it again.
Brute force solution: replace the synchronized blocks with a fair ReentrantLock object.
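For illustration only, here is a minimal sketch of what the writer side could look like with a fair lock and a condition. The field names mirror the Data class above, but the exact structure is my own sketch, not the original poster's code:

import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

public class Data {
    // fair = true: when contended, the longest-waiting thread acquires the lock next
    private final ReentrantLock lock = new ReentrantLock(true);
    private final Condition stateChanged = lock.newCondition();
    private boolean isAnyOneReading;
    private int q;

    public void writeQ(int q) {
        lock.lock();
        try {
            while (isAnyOneReading) {
                stateChanged.await(); // wait until readQ() signals it is done
            }
            System.out.println("Write start " + q);
            this.q = q;
            System.out.println("Write end " + q);
            stateChanged.signalAll();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } finally {
            lock.unlock();
        }
    }

    // readQ() would take the same lock, set and clear isAnyOneReading,
    // and call stateChanged.signalAll() when it finishes.
}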
Alternate solution, which is more typical of how many programs actually work: Have the threads do something else (e.g., have them do some I/O) in between calls to the locked function, thereby giving the other threads a chance to get in and use the locked resource.
What would be JUnit-based code to run these 3 methods, each as 10 concurrent threads?
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest
public class TestClass {
@Test
public void readFromDBOneRecord(){
try {
dbService.findOneByID("1");
} catch (Exception error) {
Assert.fail("Unexpected error occurred.");
}
}
@Test
public void writeToDBOneRecord(){
try {
dbService.save(entity.builder()
.setID("1").setName("John").build())
} catch (Exception error) {
Assert.fail("Unexpected error occurred.");
}
}
@Test
public void deleteDbRecord(){
try {
dbService.delete("1");
} catch (Exception error) {
Assert.fail("Unexpected error occurred.");
}
}
}
In some cases, some of the methods would throw exceptions, e.g. if the delete is executed before writeToDBOneRecord.
So the sequence, for say only 3 threads per method, could look like this:
OperationNr || OperationName || [ThreadNr/total threads per method] OperationType
1. write [2/3]w
2. read [1/3]r
3. read [3/3]r
4. delete [2/3]d
5. read [2/3]r
6. delete [3/3]d ->exception no record
7. write [1/3]w
8. write [3/3]w ->exception record already present
9. delete [1/3]d
What would be the code for executing these 3 test methods, each in 10 concurrent threads (30 in total)?
As you want to do everything in parallel, I would mix everything and rely on CountDownLatch instances to synchronize the threads, as follows:
@Test
public void testMultiThreading() throws Exception {
// Total of reader threads
int reader = 5;
// Total of writer threads
int writer = 3;
// Total of remover threads
int remover = 1;
// CountDownLatch used to release all the threads at the same time
final CountDownLatch startSignal = new CountDownLatch(1);
// CountDownLatch used to be notified when all threads did their task
final CountDownLatch doneSignal = new CountDownLatch(reader + writer + remover);
// List in which we collect all the errors
final List<Exception> errors = Collections.synchronizedList(new ArrayList<>());
// Create all the reader threads and start them
for (int i = 0; i < reader; i++) {
Thread thread = new Thread() {
public void run() {
try {
startSignal.await();
dbService.findOneByID("1");
} catch (Exception e) {
errors.add(e);
} finally {
doneSignal.countDown();
}
}
};
thread.start();
}
// Create all the writer threads and start them
for (int i = 0; i < writer; i++) {
Thread thread = new Thread() {
public void run() {
try {
startSignal.await();
dbService.save(entity.builder()
.setID("1").setName("John").build());
} catch (Exception e) {
errors.add(e);
} finally {
doneSignal.countDown();
}
}
};
thread.start();
}
// Create all the remover threads and start them
for (int i = 0; i < remover; i++) {
Thread thread = new Thread() {
public void run() {
try {
startSignal.await();
dbService.delete("1");
} catch (Exception e) {
errors.add(e);
} finally {
doneSignal.countDown();
}
}
};
thread.start();
}
// Release the threads
startSignal.countDown();
// Wait until all threads did their task
doneSignal.await();
// If an error has been collected, print the stack trace and throws the
// first error to make the test fail
if (!errors.isEmpty()) {
for (Exception e : errors) {
e.printStackTrace();
}
throw errors.get(0);
}
}
NB: If you want a given unit test to be executed by several concurrent threads, have a look at ConTiPerf, but it won't let you mix the tests the way you want to here.
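For completeness, here is a rough sketch of what a ConTiPerf-based test could look like. This is written from memory of that library, so treat the package names and annotation attributes as assumptions to verify:

import org.databene.contiperf.PerfTest;
import org.databene.contiperf.junit.ContiPerfRule;
import org.junit.Rule;
import org.junit.Test;

public class ReadPerfTest {

    // JUnit rule that runs annotated tests with the requested concurrency
    @Rule
    public ContiPerfRule contiPerfRule = new ContiPerfRule();

    @Test
    @PerfTest(invocations = 30, threads = 10) // 30 calls in total, spread over 10 threads
    public void readFromDBOneRecord() {
        // dbService.findOneByID("1");
    }
}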
I am trying to create a sort of console/terminal that allows the user to input a string, which then gets made into a process and the results are printed out. Just like a normal console. But I am having trouble managing the input/output streams. I have looked into this thread, but that solution sadly doesn't apply to my problem.
Along with the standard commands like "ipconfig" and "cmd.exe", I need to be able to run a script and use the same inputstream to pass some arguments, if the script is asking for input.
For example, after running a script ("python pyScript.py"), I should be able to pass further input to the script if it is asking for it (for example, raw_input), while also printing the output from the script. This is the basic behavior you would expect from a terminal.
What I've got so far:
import java.awt.BorderLayout;
import java.awt.Color;
import java.awt.Dimension;
import java.awt.event.KeyEvent;
import java.awt.event.KeyListener;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.JScrollPane;
import javax.swing.JTextPane;
import javax.swing.text.BadLocationException;
import javax.swing.text.Document;
public class Console extends JFrame{
JTextPane inPane, outPane;
InputStream inStream, inErrStream;
OutputStream outStream;
public Console(){
super("Console");
setPreferredSize(new Dimension(500, 600));
setLocationByPlatform(true);
setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
// GUI
outPane = new JTextPane();
outPane.setEditable(false);
outPane.setBackground(new Color(20, 20, 20));
outPane.setForeground(Color.white);
inPane = new JTextPane();
inPane.setBackground(new Color(40, 40, 40));
inPane.setForeground(Color.white);
inPane.setCaretColor(Color.white);
JPanel panel = new JPanel(new BorderLayout());
panel.add(outPane, BorderLayout.CENTER);
panel.add(inPane, BorderLayout.SOUTH);
JScrollPane scrollPanel = new JScrollPane(panel);
getContentPane().add(scrollPanel);
// LISTENER
inPane.addKeyListener(new KeyListener(){
@Override
public void keyPressed(KeyEvent e){
if(e.getKeyCode() == KeyEvent.VK_ENTER){
e.consume();
read(inPane.getText());
}
}
@Override
public void keyTyped(KeyEvent e) {}
@Override
public void keyReleased(KeyEvent e) {}
});
pack();
setVisible(true);
}
private void read(String command){
println(command);
// Write to Process
if (outStream != null) {
System.out.println("Outstream again");
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(outStream));
try {
writer.write(command);
//writer.flush();
//writer.close();
} catch (IOException e1) {
e1.printStackTrace();
}
}
// Execute Command
try {
exec(command);
} catch (IOException e) {}
inPane.setText("");
}
private void exec(String command) throws IOException{
Process pro = Runtime.getRuntime().exec(command, null);
inStream = pro.getInputStream();
inErrStream = pro.getErrorStream();
outStream = pro.getOutputStream();
Thread t1 = new Thread(new Runnable() {
public void run() {
try {
String line = null;
while(true){
BufferedReader in = new BufferedReader(new InputStreamReader(inStream));
while ((line = in.readLine()) != null) {
println(line);
}
BufferedReader inErr = new BufferedReader(new InputStreamReader(inErrStream));
while ((line = inErr.readLine()) != null) {
println(line);
}
Thread.sleep(1000);
}
} catch (Exception e) {
e.printStackTrace();
}
}
});
t1.start();
}
public void println(String line) {
Document doc = outPane.getDocument();
try {
doc.insertString(doc.getLength(), line + "\n", null);
} catch (BadLocationException e) {}
}
public static void main(String[] args){
new Console();
}
}
I don't use the mentioned ProcessBuilder, since I'd like to differentiate between the error stream and the normal stream.
UPDATE 29.08.2016
With the help of @ArcticLord we have achieved what was asked in the original question.
Now it is just a matter of ironing out any strange behavior, like the non-terminating process. The Console has a "stop" button that simply calls pro.destroy(). But for some reason this does not work for infinitely running processes that are spamming output.
Console: http://pastebin.com/vyxfPEXC
InputStreamLineBuffer: http://pastebin.com/TzFamwZ1
Example code that does not stop:
public class Infinity{
public static void main(String[] args){
while(true){
System.out.println(".");
}
}
}
Example code that does stop:
import java.util.concurrent.TimeUnit;
public class InfinitySlow{
public static void main(String[] args){
while(true){
try {
TimeUnit.SECONDS.sleep(1);
} catch (InterruptedException e) {
e.printStackTrace();
}
System.out.println(".");
}
}
}
You are on the right track with your code; there are only a few minor things you missed.
Let's start with your read method:
private void read(String command){
[...]
// Write to Process
if (outStream != null) {
[...]
try {
writer.write(command + "\n"); // add newline so your input will get processed
writer.flush(); // flush your input to your process
} catch (IOException e1) {
e1.printStackTrace();
}
}
// ELSE!! - if no outputstream is available
// Execute Command
else {
try {
exec(command);
} catch (IOException e) {
// Handle the exception here. Mostly this means
// that the command could not get executed
// because command was not found.
println("Command not found: " + command);
}
}
inPane.setText("");
}
Now let's fix your exec method. You should use separate threads for reading normal process output and error output. Additionally, I introduce a third thread that waits for the process to end and then closes the outputStream, so the next user input is not fed to that process but is treated as a new command.
private void exec(String command) throws IOException{
Process pro = Runtime.getRuntime().exec(command, null);
inStream = pro.getInputStream();
inErrStream = pro.getErrorStream();
outStream = pro.getOutputStream();
// Thread that reads process output
Thread outStreamReader = new Thread(new Runnable() {
public void run() {
try {
String line = null;
BufferedReader in = new BufferedReader(new InputStreamReader(inStream));
while ((line = in.readLine()) != null) {
println(line);
}
} catch (Exception e) {
e.printStackTrace();
}
System.out.println("Exit reading process output");
}
});
outStreamReader.start();
// Thread that reads process error output
Thread errStreamReader = new Thread(new Runnable() {
public void run() {
try {
String line = null;
BufferedReader inErr = new BufferedReader(new InputStreamReader(inErrStream));
while ((line = inErr.readLine()) != null) {
println(line);
}
} catch (Exception e) {
e.printStackTrace();
}
System.out.println("Exit reading error stream");
}
});
errStreamReader.start();
// Thread that waits for process to end
Thread exitWaiter = new Thread(new Runnable() {
public void run() {
try {
int retValue = pro.waitFor();
println("Command exit with return value " + retValue);
// close outStream
outStream.close();
outStream = null;
} catch (InterruptedException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
});
exitWaiter.start();
}
Now this should work.
If you enter ipconfig it prints the command output, closes the output stream and is ready for a new command.
If you enter cmd it prints the output and lets you enter more cmd commands like dir or cd and so on, until you enter exit. Then it closes the output stream and is ready for a new command.
You may run into problems executing Python scripts, because Java has trouble reading a Process's InputStream if the output is not flushed into the system pipeline.
See this example Python script:
print "Input something!"
str = raw_input()
print "Received input is : ", str
You could run this with your Java program and also enter the input, but you will not see the script's output until the script is finished.
The only fix I could find is to manually flush the output in the script.
import sys
print "Input something!"
sys.stdout.flush()
str = raw_input()
print "Received input is : ", str
sys.stdout.flush()
Running this script will behave as you expect.
You can read more about this problem at
Java: is there a way to run a system command and print the output during execution?
Why does reading from Process' InputStream block altough data is available
Java: can't get stdout data from Process unless its manually flushed
EDIT: I have just found another very easy solution for the stdout.flush() problem with Python scripts. Start them with python -u script.py and you don't need to flush manually. This should solve your problem.
EDIT 2: We discussed in the comments that with this solution the output and error streams will be mixed up, since they are read in different threads. The problem is that we cannot tell whether the output writing is finished when the error stream thread comes up; otherwise classic thread scheduling with locks could handle this situation. But we have a continuous stream until the process is finished, no matter whether data flows or not. So we need a mechanism that tracks how much time has elapsed since the last line was read from each stream.
For this I will introduce a class that takes an InputStream and starts a thread for reading the incoming data. This thread stores each line in a queue and stops when the end of the stream arrives. Additionally, it records the time when the last line was read and added to the queue.
import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.ConcurrentLinkedQueue;

public class InputStreamLineBuffer{
private InputStream inputStream;
private ConcurrentLinkedQueue<String> lines;
private long lastTimeModified;
private Thread inputCatcher;
private boolean isAlive;
public InputStreamLineBuffer(InputStream is){
inputStream = is;
lines = new ConcurrentLinkedQueue<String>();
lastTimeModified = System.currentTimeMillis();
isAlive = false;
inputCatcher = new Thread(new Runnable(){
@Override
public void run() {
StringBuilder sb = new StringBuilder(100);
int b;
try{
while ((b = inputStream.read()) != -1){
// read one char
if((char)b == '\n'){
// new Line -> add to queue
lines.offer(sb.toString());
sb.setLength(0); // reset StringBuilder
lastTimeModified = System.currentTimeMillis();
}
else sb.append((char)b); // append char to stringbuilder
}
} catch (IOException e){
e.printStackTrace();
} finally {
isAlive = false;
}
}});
}
// is the input reader thread alive
public boolean isAlive(){
return isAlive;
}
// start the input reader thread
public void start(){
isAlive = true;
inputCatcher.start();
}
// has Queue some lines
public boolean hasNext(){
return lines.size() > 0;
}
// get next line from Queue
public String getNext(){
return lines.poll();
}
// how much time has elapsed since last line was read
public long timeElapsed(){
return (System.currentTimeMillis() - lastTimeModified);
}
}
With this class we can combine the output and error reading threads into one, which lives while the input reader buffer threads are alive or still have unconsumed data. In each iteration it checks whether some time has passed since the last output was read, and if so it prints all unprinted lines at once; the same for the error output. Then it sleeps for a few millis so it doesn't waste CPU time.
private void exec(String command) throws IOException{
Process pro = Runtime.getRuntime().exec(command, null);
inStream = pro.getInputStream();
inErrStream = pro.getErrorStream();
outStream = pro.getOutputStream();
InputStreamLineBuffer outBuff = new InputStreamLineBuffer(inStream);
InputStreamLineBuffer errBuff = new InputStreamLineBuffer(inErrStream);
Thread streamReader = new Thread(new Runnable() {
public void run() {
// start the input reader buffer threads
outBuff.start();
errBuff.start();
// while an input reader buffer thread is alive
// or there are unconsumed data left
while(outBuff.isAlive() || outBuff.hasNext() ||
errBuff.isAlive() || errBuff.hasNext()){
// get the normal output if at least 50 millis have passed
if(outBuff.timeElapsed() > 50)
while(outBuff.hasNext())
println(outBuff.getNext());
// get the error output if at least 50 millis have passed
if(errBuff.timeElapsed() > 50)
while(errBuff.hasNext())
println(errBuff.getNext());
// sleep a bit before the next run
try {
Thread.sleep(100);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
System.out.println("Finish reading error and output stream");
}
});
streamReader.start();
// remove outStreamReader and errStreamReader Thread
[...]
}
Maybe this is not a perfect solution but it should handle the situation here.
EDIT (31.8.2016)
We discussed in the comments that there is still a problem with the code when implementing a stop button that kills the started
process using Process#destroy(). A process that produces a lot of output, e.g. in an infinite loop, is
destroyed immediately by calling destroy(). But since it has already produced a lot of output that still has to be consumed
by our streamReader, we can't get back to normal program behaviour.
So we need some small changes here:
We will introduce a destroy() method to the InputStreamLineBuffer that stops the output reading and clears the queue.
The changes will look like this:
public class InputStreamLineBuffer{
private boolean emergencyBrake = false;
[...]
public InputStreamLineBuffer(InputStream is){
[...]
while ((b = inputStream.read()) != -1 && !emergencyBrake){
[...]
}
}
[...]
// exits immediately and clears line buffer
public void destroy(){
emergencyBrake = true;
lines.clear();
}
}
And some little changes in the main programm
public class ExeConsole extends JFrame{
[...]
// The line buffers must be declared outside the method
InputStreamLineBuffer outBuff, errBuff;
public ExeConsole() {
[...]
btnStop.addActionListener(new ActionListener() {
public void actionPerformed(ActionEvent e) {
if(pro != null){
pro.destroy();
outBuff.destroy();
errBuff.destroy();
}
}});
}
[...]
private void exec(String command) throws IOException{
[...]
//InputStreamLineBuffer outBuff = new InputStreamLineBuffer(inStream);
//InputStreamLineBuffer errBuff = new InputStreamLineBuffer(inErrStream);
outBuff = new InputStreamLineBuffer(inStream);
errBuff = new InputStreamLineBuffer(inErrStream);
[...]
}
}
Now it should be able to destroy even output-spamming processes.
Note: I found out that Process#destroy() is not able to destroy child processes. So if you start cmd on Windows
and start a Java program from there, you will end up destroying the cmd process while the Java program is still running;
you will see it in the task manager. This problem cannot be solved with Java itself; it needs
some OS-dependent external tools to get the PIDs of these processes and kill them manually.
Although @ArcticLord's solution is nice and neat, recently I faced the same kind of problem and came up with a solution that's conceptually equivalent, but slightly different in its implementation.
The concept is the same, namely "bulk reads": when a reader thread acquires its turn, it consumes all of the stream it handles, and passes the hand only when it is done.
This guarantees the out/err print order.
But instead of using a timer-based turn assignment, I use a lock-based non-blocking read simulation:
// main method for testability: replace with private void exec(String command)
public static void main(String[] args) throws Exception
{
// create a lock that will be shared between reader threads
// the lock is fair to minimize starvation possibilities
ReentrantLock lock = new ReentrantLock(true);
// exec the command: I use nslookup for testing on windows
// because it is interactive and prints to stderr too
Process p = Runtime.getRuntime().exec("nslookup");
// create a thread to handle output from process (uses a test consumer)
Thread outThread = createThread(p.getInputStream(), lock, System.out::print);
outThread.setName("outThread");
outThread.start();
// create a thread to handle error from process (test consumer, again)
Thread errThread = createThread(p.getErrorStream(), lock, System.err::print);
errThread.setName("errThread");
errThread.start();
// create a thread to handle input to process (read from stdin for testing purpose)
PrintWriter writer = new PrintWriter(p.getOutputStream());
Thread inThread = createThread(System.in, null, str ->
{
writer.print(str);
writer.flush();
});
inThread.setName("inThread");
inThread.start();
// create a thread to handle termination gracefully. Not really needed in this simple
// scenario, but on a real application we don't want to block the UI until process dies
Thread endThread = new Thread(() ->
{
try
{
// wait until process is done
p.waitFor();
logger.debug("process exit");
// signal threads to exit
outThread.interrupt();
errThread.interrupt();
inThread.interrupt();
// close process streams
p.getOutputStream().close();
p.getInputStream().close();
p.getErrorStream().close();
// wait for threads to exit
outThread.join();
errThread.join();
inThread.join();
logger.debug("exit");
}
catch(Exception e)
{
throw new RuntimeException(e.getMessage(), e);
}
});
endThread.setName("endThread");
endThread.start();
// wait for full termination (process and related threads by cascade joins)
endThread.join();
logger.debug("END");
}
// convenience method to create a specific reader thread with exclusion by lock behavior
private static Thread createThread(InputStream input, ReentrantLock lock, Consumer<String> consumer)
{
return new Thread(() ->
{
// wrap input to be buffered (enables ready()) and to read chars
// using explicit encoding may be relevant in some case
BufferedReader reader = new BufferedReader(new InputStreamReader(input));
// create a char buffer for reading
char[] buffer = new char[8192];
try
{
// repeat until EOF or interruption
while(true)
{
try
{
// wait for your turn to bulk read
if(lock != null && !lock.isHeldByCurrentThread())
{
lock.lockInterruptibly();
}
// when there's nothing to read, pass the hand (bulk read ended)
if(!reader.ready())
{
if(lock != null)
{
lock.unlock();
}
// this enables a soft busy-waiting loop that simulates non-blocking reads
Thread.sleep(100);
continue;
}
// perform the read, as we are sure it will not block (input is "ready")
int len = reader.read(buffer);
if(len == -1)
{
return;
}
// transform to string an let consumer consume it
String str = new String(buffer, 0, len);
consumer.accept(str);
}
catch(InterruptedException e)
{
// catch interruptions either when sleeping and waiting for lock
// and restore interrupted flag (not necessary in this case, however it's a best practice)
Thread.currentThread().interrupt();
return;
}
catch(IOException e)
{
throw new RuntimeException(e.getMessage(), e);
}
}
}
finally
{
// protect the lock against unhandled exceptions
if(lock != null && lock.isHeldByCurrentThread())
{
lock.unlock();
}
logger.debug("exit");
}
});
}
Note that both solutions, @ArcticLord's and mine, are not totally starvation-safe; the chances (really small) are inversely proportional to the consumers' speed.
Happy 2016! ;)
I have rewritten this many times, but I could not find a solution to this problem for a while. Another class writes a gps.log file with lines like:
2014-09-02 10:23:13 35.185604 33.859077
2014-09-02 10:23:18 35.185620 33.859048
I am trying to read the last line of the file and update a text field in the user interface. The thread below is driving the CPU to 85-100%.
I keep the file very tiny (100 lines, < 5 KB). I have been working with CSV files for a long time, and I think reading this file every 3 seconds should not have this footprint on the CPU. Although I have read huge CSV files in the past, this is the first time I have this issue, now that I try to update the user interface every couple of seconds. Am I doing something wrong with how I am updating the text field? Any ideas?
Thanks for looking.
new Thread(new Runnable() {
public void run() {
while (true) {
Display.getDefault().asyncExec(new Runnable() {
public void run() {
try {
try { Thread.sleep(3000); } catch (Exception e) { }
BufferedReader gpslog = new BufferedReader(new FileReader("log/gps.log"));
String line = "";
String lastLine = "";
int i=0;
while (line != null) {
i++;
lastLine = line;
line = gpslog.readLine();
}
//System.out.println(lastLine);
gpslog.close();
if (lastLine != null) { txtGPSStatus.setText(lastLine); }
//If more than 100 gps entries, flush the file
if (i>100) {
PrintWriter writer = new PrintWriter("log/gps.log");
writer.close();
}
} catch (IOException e1) {
log.error(e1);
}
}
});
}
}
}).start();
Move
try { Thread.sleep(3000); } catch (Exception e) { }
so it is just after
while(true) {
Then you will run, wait 3 secs, run, etc.
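In other words, a minimal sketch of the suggested rearrangement (the file-reading body is elided; it stays as in the question):

new Thread(new Runnable() {
    public void run() {
        while (true) {
            // sleep on the background thread, not inside the UI runnable
            try { Thread.sleep(3000); } catch (Exception e) { }
            Display.getDefault().asyncExec(new Runnable() {
                public void run() {
                    // ...read the last line of log/gps.log and update txtGPSStatus as before...
                }
            });
        }
    }
}).start();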
You should get a clear idea of what should be done by the background thread and what the UI thread is for!
Executor executor = Executors.newSingleThreadExecutor();
executor.execute(new Runnable() {
public void run() {
try {
while (true) {
updateLog();
Thread.sleep(3000);
}
} catch (InterruptedException ex) {
// restore interruption flag
Thread.currentThread().interrupt();
}
}
});
private void updateLog() {
String lastLine = readLastLogLine();
Display.getDefault().syncExec(new Runnable() {
public void run() {
txtGPSStatus.setText(lastLine);
}
});
}
I have a problem with some threads.
My script
1 - loads over 10 million lines from a text file into an array
2 - creates an ExecutorPool of 5 fixed threads
3 - then iterates that list and submits tasks to the queue
executor.submit(new MyCustomThread(line,threadTimeout,"[THREAD "+Integer.toString(increment)+"]"));
Now the active threads never exceed the 5 fixed threads, which is good, but I observed that my processor goes to 100% load. I debugged a little and saw that the MyCustomThread constructor is being called, which means that even though I declare 5 fixed threads, the ExecutorService will still try to create 10 million objects.
The main question is:
How do I prevent this? I just want tasks to be rejected if there is no room for them, not to create 10 million objects and run them one by one.
Second question:
How do I get the number of currently active threads? I tried threadGroup.activeCount() but it always gives me 5 5 5 5 ...
THE CALLER CLASS :
System.out.println("Starting threads ...");
final ThreadGroup threadGroup = new ThreadGroup("workers");
//ExecutorService executor = Executors.newFixedThreadPool(howManyThreads);
ExecutorService executor = Executors.newFixedThreadPool(5,new ThreadFactory() {
public Thread newThread(Runnable r) {
return new Thread(threadGroup, r);
}
});
int increment = 0;
for(String line : arrayOfLines)
{
if(increment > 10000)
{
//System.out.println("TOO MANY!!");
//System.exit(0);
}
System.out.println(line);
System.out.println(threadGroup.activeCount());
if(threadGroup.activeCount() >= 5)
{
for(int i = 0; i < 10; i++)
{
System.out.println(threadGroup.activeCount());
System.out.println(threadGroup.activeGroupCount());
Thread.sleep(1000);
}
}
try
{
executor.submit(new MyCustomThread(line,threadTimeout,"[THREAD "+Integer.toString(increment)+"]"));
}
catch(Exception ex)
{
continue;
//System.exit(0);
}
increment++;
}
executor.awaitTermination(10, TimeUnit.MILLISECONDS);
executor.shutdown();
THREAD CLASS :
public class MyCustomThread extends Thread
{
private String ip;
private String threadName;
private int threadTimeout = 10;
public MyCustomThread(String ip)
{
this.ip = ip;
}
public MyCustomThread(String ip,int threadTimeout,String threadName)
{
this.ip = ip;
this.threadTimeout = threadTimeout;
this.threadName = threadName;
System.out.println("MyCustomThread constructor has been called!");
}
@Override
public void run()
{
// do some stuff that takes time ....
}
}
Thank you.
You are doing it a bit wrong. The philosophy with executors is that you implement the work unit as a Runnable or a Callable (instead of a Thread). Each Runnable or Callable should do one atomic piece of work which is mutually exclusive of other Runnables or Callables.
Executor services internally use a pool of threads, so creating your own thread group and extending Thread does not do any good.
Try this simple piece:
ExecutorService executor = Executors.newFixedThreadPool(5);
executor.execute(new MyRunnableWorker(ip)); // pass the data the worker needs
public class MyRunnableWorker implements Runnable{
private String ip;
private String threadName;
private int threadTimeout = 10;
public MyRunnableWorker(String ip){
this.ip = ip;
}
public MyRunnableWorker(String ip,int threadTimeout,String threadName){
this.ip = ip;
this.threadTimeout = threadTimeout;
this.threadName = threadName;
System.out.println("MyRunnableWorker constructor has been called!");
}
@Override
public void run() {
// do some stuff that takes time ....
}
}
This would give you what you want. Also try testing your thread code with VisualVM to see how the threads are running and what the load distribution is.
I think your biggest problem here is that MyCustomThread should implement Runnable, not extend Thread. When you use an ExecutorService you let it handle the Thread management (i.e. you don't need to create them.)
Here's an approximation of what I think you're trying to do. Hope this helps.
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class FileProcessor
{
public static void main(String[] args)
{
List<String> lines = readFile();
System.out.println("Starting threads ...");
ExecutorService executor = Executors.newFixedThreadPool(5);
for(String line : lines)
{
try
{
executor.submit(new MyCustomThread(line));
}
catch(Exception ex)
{
ex.printStackTrace();
}
}
try
{
executor.shutdown();
executor.awaitTermination(10, TimeUnit.SECONDS);
}
catch (InterruptedException e)
{
System.out.println("A processor took longer than the await time to complete.");
}
executor.shutdownNow();
}
protected static List<String> readFile()
{
List<String> lines = new ArrayList<String>();
try
{
String filename = "/temp/data.dat";
FileReader fileReader = new FileReader(filename );
BufferedReader bufferedReader = new BufferedReader(fileReader);
String line = null;
while ((line = bufferedReader.readLine()) != null) {
lines.add(line);
}
bufferedReader.close();
}
catch (Exception e)
{
e.printStackTrace();
}
return lines;
}
}
public class MyCustomThread implements Runnable
{
String line;
MyCustomThread(String line)
{
this.line = line;
}
@Override
public void run()
{
System.out.println(Thread.currentThread().getName() + " processed line:" + line);
}
}
EDIT:
This implementation does NOT block on the ExecutorService submit. What I mean by this is that a new instance of MyCustomThread is created for every line in the file regardless of whether any previously submitted MyCustomThreads have completed. You could add a blocking / limiting worker queue to prevent this.
ExecutorService executor = new ThreadPoolExecutor(5, 5, 0L, TimeUnit.MILLISECONDS, new LimitedQueue<Runnable>(10));
An example of a blocking / limiting queue implementation can be found here:
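(The link itself is omitted above.) As a rough illustration of the idea, and my own sketch rather than necessarily the linked code, such a LimitedQueue can extend LinkedBlockingQueue and make offer() block via put(), so the ThreadPoolExecutor waits for room instead of rejecting the task:

import java.util.concurrent.LinkedBlockingQueue;

// Bounded work queue whose offer() blocks until space is available,
// so ThreadPoolExecutor#execute waits instead of rejecting submissions.
public class LimitedQueue<E> extends LinkedBlockingQueue<E> {

    public LimitedQueue(int maxSize) {
        super(maxSize);
    }

    @Override
    public boolean offer(E e) {
        try {
            put(e); // block until the queue has room
            return true;
        } catch (InterruptedException ie) {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}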