I have some code that runs a process and reads from the stdout and stderr asynchronously and then handles when the process completes. It looks something like this:
Process process = builder.start();
Thread outThread = new Thread(() -> {
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
// Read stream here
} catch (Exception e) {
}
});
Thread errThread = new Thread(() -> {
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
// Read stream here
} catch (Exception e) {
}
});
outThread.start();
errThread.start();
new Thread(() -> {
int exitCode = -1;
try {
exitCode = process.waitFor();
outThread.join();
errThread.join();
} catch (Exception e) {
}
// Process completed and read all stdout and stderr here
}).start();
My issue is that I am using 3 threads to achieve this asynchronous "run-and-get-output" task; using 3 threads for it just doesn't feel right. I could allocate the threads out of a thread pool, but that would still block those threads.
Is there anything I can do, maybe with NIO, to reduce this to fewer (1?) thread? Anything I can think of will be constantly spinning a thread (unless I add a few sleeps), which I don't really want to do either...
NOTE: I do need to read as I go (rather than when the process has stopped) and I do need to keep stdout and stderr separate, so I can't do a redirect.
Since you've specified that you need to read the output as you go, there is no non-multi-threaded solution.
You can reduce the number of threads to one beyond your main thread though:
Process process = builder.start();
Thread errThread = new Thread(() -> {
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
// Read stream here
} catch (Exception e) {
}
});
errThread.start();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
// Read stream here
} catch (Exception e) {
}
// we got an end of file, so there can't be any more input. Now we need to wait for stderr/process exit.
int exitCode = -1;
try {
exitCode = process.waitFor();
errThread.join();
} catch (Exception e) {
}
// Process completed
If you truly don't need to deal with the error/output until after the process ends, you can simplify it a bit, redirect the process output to temporary files, and use only your main thread, like this:
File stderrFile = File.createTempFile("tmpErr", "out");
File stdoutFile = File.createTempFile("tmpStd", "out");
try {
ProcessBuilder builder = new ProcessBuilder("ls", "/tmp");
builder.redirectOutput(stdoutFile); // send stdout to the temp file
builder.redirectError(stderrFile);  // send stderr to the other temp file
Process p = builder.start();
int exitCode = -1;
boolean done = false;
while (!done) {
try {
exitCode = p.waitFor();
done = true;
} catch (InterruptedException ie) {
System.out.println("Interrupted waiting for process to exit.");
}
}
BufferedReader err = new BufferedReader(new FileReader(stderrFile));
BufferedReader in = new BufferedReader(new FileReader(stdoutFile));
....
} finally {
stderrFile.delete();
stdoutFile.delete();
}
This is probably not a good idea if you generate a lot of output from the process you are calling as it could run out of disk space... but it'll likely be slightly faster since it doesn't have to spin up another Thread.
Assuming you don't mind the standard output and error streams being merged, you could use only one thread with:
builder.redirectErrorStream(true); //merge input and error streams
Process process = builder.start();
Thread singleThread = new Thread(() -> {
int exitCode = -1;
//read from the merged stream
try (BufferedReader reader =
new BufferedReader(new InputStreamReader(process.getInputStream()))) {
String line;
//read until the stream is exhausted, meaning the process has terminated
while ((line = reader.readLine()) != null) {
System.out.println(line); //use the output here
}
//get the exit code if required
exitCode = process.waitFor();
} catch (Exception e) { }
});
singleThread.start();
Have a look at the ExecHelper from OstermillerUtils.
The idea is that the thread waiting for the process to complete does not just wait, but reads input from stdout and stderr if input is available and regularly checks whether the process has finished.
If you do not do any heavy processing with the input from stdout and stderr, you might not need an extra thread to handle the input. Just copy ExecHelper and add some extra functions/methods to process any new input. I've done this before to show the process output while the process is running; it is not difficult to do (but I lost the source code).
If you do need a separate thread for processing the input, make sure to synchronize the output and error StringBuffers when these buffers are updated or read.
Another thing you might want to consider is adding an abort time-out. It is a little bit harder to implement but was very valuable to me: if a process takes too much time, the process gets destroyed, which in turn ensures nothing remains hanging. You can find an old (possibly outdated) example in this gist.
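A minimal single-thread sketch of that polling idea (this is not ExecHelper itself; the command, buffer size and 30-second abort time-out are placeholder assumptions):
import java.io.IOException;
import java.io.InputStream;
public class PollingExec {
    public static void main(String[] args) throws Exception {
        Process process = new ProcessBuilder("some-command").start(); // placeholder command
        InputStream stdout = process.getInputStream();
        InputStream stderr = process.getErrorStream();
        StringBuilder out = new StringBuilder();
        StringBuilder err = new StringBuilder();
        byte[] buf = new byte[4096];
        long deadline = System.currentTimeMillis() + 30_000; // abort time-out
        while (true) {
            boolean exited = !process.isAlive(); // check first, drain after, so output
            drain(stdout, out, buf);             // written just before exit is still
            drain(stderr, err, buf);             // collected on the final pass
            if (exited) break;
            if (System.currentTimeMillis() > deadline) {
                process.destroy();               // process took too long: kill it
            }
            Thread.sleep(50);                    // regular check, no busy spinning
        }
        int exitCode = process.waitFor();
        System.out.println("exit=" + exitCode + "\nout=" + out + "\nerr=" + err);
    }
    // Read only the bytes that are already available, so this never blocks.
    private static void drain(InputStream in, StringBuilder sink, byte[] buf) throws IOException {
        while (in.available() > 0) {
            int n = in.read(buf, 0, Math.min(buf.length, in.available()));
            if (n <= 0) break;
            sink.append(new String(buf, 0, n));
        }
    }
}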
You'll have to compromise. Here are your options:
A. You can do it with 2 threads (instead of 3), as sketched after this list:
First thread:
read from stdout until readline returns null
call Process.waitFor()
join Thread#2
Second thread:
reads from stderr until readline returns null
B. Merge streams and use Debian's annotate-output to discriminate the 2 streams
http://manpages.debian.org/cgi-bin/man.cgi?query=annotate-output&sektion=1
C. If it's a short-lived process, just wait for it to end and read the output afterwards
D. If it's a long-lived process, you can alternate between the readers with a short sleep in between.
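A rough sketch of option A, assuming builder is an already-configured ProcessBuilder and the usual java.io imports (the per-line handling is left as comments):
Process process = builder.start();
// Thread #2: reads stderr until readLine() returns null.
Thread errThread = new Thread(() -> {
    try (BufferedReader err = new BufferedReader(
            new InputStreamReader(process.getErrorStream()))) {
        String line;
        while ((line = err.readLine()) != null) {
            // handle a line of stderr
        }
    } catch (IOException ignored) {
    }
});
errThread.start();
// Thread #1: reads stdout until readLine() returns null, then waits for the
// process to exit and joins thread #2.
new Thread(() -> {
    try (BufferedReader out = new BufferedReader(
            new InputStreamReader(process.getInputStream()))) {
        String line;
        while ((line = out.readLine()) != null) {
            // handle a line of stdout
        }
        int exitCode = process.waitFor();
        errThread.join();
        // process finished and both streams fully read
    } catch (IOException | InterruptedException ignored) {
    }
}).start();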
Related
I have the following code:
Process p = Runtime.getRuntime().exec(args);
and I want my program to wait for Runtime.getRuntime().exec(args) to finish (it lasts 2-3 seconds) and then continue.
Ideas?
Use Process.waitFor():
Process p = Runtime.getRuntime().exec(args);
int status = p.waitFor();
From JavaDoc:
causes the current thread to wait, if necessary, until the process represented by this Process object has terminated. This method returns immediately if the subprocess has already terminated. If the subprocess has not yet terminated, the calling thread will be blocked until the subprocess exits.
Here is a sample code:
Process proc = Runtime.getRuntime().exec("ANonJava.exe");
InputStream in = proc.getInputStream();
byte buff[] = new byte[1024];
int cbRead;
try {
while ((cbRead = in.read(buff)) != -1) {
// Use the output of the process...
}
} catch (IOException e) {
// Insert code to handle exceptions that occur
// when reading the process output
}
// No more output was available from the process, so...
// Ensure that the process completes
try {
proc.waitFor();
} catch (InterruptedException e) {
// Handle exception that could occur when waiting
// for a spawned process to terminate
}
// Then examine the process exit code
if (proc.exitValue() == 1) {
// Use the exit value...
}
You can find more on this site: http://docs.rinet.ru/JWP/ch14.htm
What is the proper way to produce and consume the streams (IO) of an external process from Java? As far as I know, the Java-end input streams (the process output) should be consumed in threads running in parallel with producing the process input, due to the possibly limited buffer size.
But I'm not sure if I eventually need to synchronize with those consumer threads, or whether it is enough just to wait for the process to exit with the waitFor method to be certain that all the process output is actually consumed. I.e. is it possible that, even if the process exits (closes its output stream), there is still unread data on the Java end of the stream? How does waitFor actually even know when the process is done? For the process in question, EOF (closing the Java end of its input stream) signals it to exit.
My current solution to handle the streams is the following:
public class Application {
private static final StringBuffer output = new StringBuffer();
private static final StringBuffer errOutput = new StringBuffer();
private static final CountDownLatch latch = new CountDownLatch(2);
public static void main(String[] args) throws IOException, InterruptedException {
Process exec = Runtime.getRuntime().exec("/bin/cat");
OutputStream procIn = exec.getOutputStream();
InputStream procOut = exec.getInputStream();
InputStream procErrOut = exec.getErrorStream();
new Thread(new StreamConsumer(procOut, output)).start();
new Thread(new StreamConsumer(procErrOut, errOutput)).start();
PrintWriter printWriter = new PrintWriter(procIn);
printWriter.print("hello world");
printWriter.flush();
printWriter.close();
int ret = exec.waitFor();
latch.await();
System.out.println(output.toString());
System.out.println(errOutput.toString());
}
public static class StreamConsumer implements Runnable {
private InputStream input;
private StringBuffer output;
public StreamConsumer(InputStream input, StringBuffer output) {
this.input = input;
this.output = output;
}
@Override
public void run() {
BufferedReader reader = new BufferedReader(new InputStreamReader(input));
String line;
try {
while ((line = reader.readLine()) != null) {
output.append(line + System.lineSeparator());
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} finally {
try {
reader.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} finally {
latch.countDown();
}
}
}
}
}
Is it necessary to use the latch here, or does waitFor imply that all the output has already been consumed? Also, if the output doesn't end with (or contain) a newline, will readLine miss that output, or still read everything that is left? Does reading null mean the process has closed its end of the stream, or is there any other scenario where null could be read?
What is the correct way to handle streams, could I do something better than in my example?
waitFor signals that the process ended, but you cannot be sure the threads which collect strings from its stdout and stderr have also finished, so using a latch is a step in the right direction, but not an optimal one.
Instead of waiting for the latch, you can wait for the threads directly:
Thread stdoutThread = new Thread(new StreamConsumer(procOut, output));
Thread stderrThread = new Thread(new StreamConsumer(procErrOut, errOutput));
stdoutThread.start();
stderrThread.start();
...
int ret = exec.waitFor();
stdoutThread.join();
stderrThread.join();
BTW, storing lines in StringBuffers is useless work. Use ArrayList<String> instead, put lines there without any conversion, and finally retrieve them in a loop.
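For instance, a consumer along those lines might look like the sketch below (LineConsumer and getLines are illustrative names, not from the original code):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
public class LineConsumer implements Runnable {
    private final InputStream input;
    private final List<String> lines = new ArrayList<>();
    public LineConsumer(InputStream input) {
        this.input = input;
    }
    @Override
    public void run() {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(input))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line); // no per-line string concatenation
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    public List<String> getLines() {
        // Safe to call after join()ing the thread that ran this consumer.
        return lines;
    }
}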
Your approach is right, but it's better to remove the CountDownLatch and use a thread pool instead of creating new Threads directly. From the thread pool you will get two Futures, which you can then wait on for completion.
But I'm not sure if I eventually need to synchronize with those consumer threads, or is it enough just to wait for the process to exit with the waitFor method, to be certain that all the process output is actually consumed? I.e. is it possible that, even if the process exits (closes its output stream), there is still unread data on the Java end of the stream?
Yes, this situation can occur: termination and reading the IO streams are unrelated, so the process can exit while data is still buffered on the Java side.
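A minimal sketch of that thread-pool suggestion, reusing the variables from the question's code (java.util.concurrent imports assumed; note that Future.get() also throws ExecutionException, which must be declared or handled):
ExecutorService pool = Executors.newFixedThreadPool(2);
Future<?> outFuture = pool.submit(new StreamConsumer(procOut, output));
Future<?> errFuture = pool.submit(new StreamConsumer(procErrOut, errOutput));
// ... write to procIn, flush and close it as before ...
int ret = exec.waitFor(); // the process has terminated...
outFuture.get();          // ...and both consumers have reached end-of-stream
errFuture.get();
pool.shutdown();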
I have a program where I use named pipes to share info with an external executable:
Process p = Runtime.getRuntime().exec("mkfifo /tmp/myfifo");
p.waitFor();
Process cat = Runtime.getRuntime().exec("cat /tmp/myfifo");
BufferedWriter fifo = new BufferedWriter(
new OutputStreamWriter(new FileOutputStream("/tmp/myfifo")));
fifo.write("Hello!\n");
fifo.close();
cat.waitFor();
When I execute this, the program hangs waiting for cat to finish. It seems that cat has not 'realized' that the fifo was closed.
I tried running $> touch /tmp/myfifo on the terminal, and that 'unhung' the process, which then finished properly; but when I added code to run this from within my program, it would still hang:
fifo.close();
Process touch = Runtime.getRuntime().exec("touch /tmp/myfifo");
touch.waitFor();
cat.waitFor();
The process will still hang waiting for cat to finish. I'm not sure what to do now.
NOTE - I have already added code to consume the output of the cat command, but the problem does not seem to be there.
Anyone know a workaround/fix for this?
Some native platforms only provide limited buffer size for standard input and output streams; failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock. You need to consume the output, for example by printing it to stdout or writing it to a file.
Try something like this:
Process cat = Runtime.getRuntime().exec("cat /tmp/myfifo");
new Thread(new Reader(cat.getErrorStream(), System.err)).start();
new Thread(new Reader(cat.getInputStream(), System.out)).start();
int returnCode = cat.waitFor();
System.out.println("Return code = " + returnCode);
class Reader implements Runnable
{
public Reader (InputStream istrm, OutputStream ostrm) {
this.istrm = istrm;
this.ostrm = ostrm;
}
public void run() {
try
{
final byte[] buffer = new byte[1024];
for (int length = 0; (length = istrm.read(buffer)) != -1; )
{
ostrm.write(buffer, 0, length);
}
}
catch (Exception e)
{
e.printStackTrace();
}
}
private final OutputStream ostrm;
private final InputStream istrm;
}
Essentially, I'm making a small program that's going to install some software, and then run some basic commands afterwards to prep that program. However, what is happening is that the program starts its install, and then immediately moves on to the following lines (registration, updates, etc). Of course, that can't happen until it's fully installed, so I'd like to find a way of waiting on the first process before running the second. For example,
Main.say("Installing...");
Process p1 = Runtime.getRuntime().exec(dir + "setup.exe /SILENT");
//Wait here, I need to finish installing first!
Main.say("Registering...");
Process p2 = Runtime.getRuntime().exec(installDir + "program.exe /register aaaa-bbbb-cccc");
Main.say("Updating...");
Process p4 = Runtime.getRuntime().exec(installDir + "program.exe /update -silent");
Call Process#waitFor(). Its Javadoc says:
Causes the current thread to wait, if necessary, until the process represented by this Process object has terminated.
Bonus: you get the exit value of the subprocess, so you can check whether it exited successfully with code 0 or whether an error occurred (non-zero exit code).
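For example, a small sketch based on the question's own call (the exception thrown here is just illustrative):
Process p1 = Runtime.getRuntime().exec(dir + "setup.exe /SILENT");
int exitCode = p1.waitFor(); // blocks until setup.exe has terminated
if (exitCode != 0) {
    // the installer reported a failure, so don't continue with registration
    throw new IllegalStateException("setup.exe exited with code " + exitCode);
}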
You can use the Process.waitFor() method.
The doc says:
Causes the current thread to wait, if necessary, until the process
represented by this Process object has terminated. This method returns
immediately if the subprocess has already terminated. If the
subprocess has not yet terminated, the calling thread will be blocked
until the subprocess exits.
If you are running a system command that returns a very long response, the output buffer fills up and the process appears to hang. This happened to me with sqlldr. If that appears to be the case, just read the process output as the process is running.
try {
ProcessBuilder pb = new ProcessBuilder("myCommand");
Process p = pb.start();
BufferedReader stdInput = new BufferedReader(new
InputStreamReader(p.getInputStream()));
BufferedReader stdError = new BufferedReader(new
InputStreamReader(p.getErrorStream()));
StringBuffer response = new StringBuffer();
StringBuffer errorStr = new StringBuffer();
boolean alreadyWaited = false;
while (p.isAlive()) {
try {
if(alreadyWaited) {
// read the output from the command because
//if we don't then the buffers fill up and
//the command stops and doesn't return
String temp;
while ((temp = stdInput.readLine()) != null) {
response.append(temp);
}
String errTemp;
while ((errTemp = stdError.readLine()) != null) {
errorStr.append(errTemp);
}
}
Thread.sleep(1000);
alreadyWaited = true;
} catch (InterruptedException e) {
e.printStackTrace();
}
logger.debug("Response is " + response);
logger.debug("Error is: " + errorStr);
}
} catch (IOException e) {
logger.error("Error running system command", e);
}
Include waitFor(). In your case, your code will look something like below.
Main.say("Installing...");
Process p1 = Runtime.getRuntime().exec(dir + "setup.exe /SILENT");
p1.waitFor();
Main.say("Registering...");
Process p2 = Runtime.getRuntime().exec(installDir + "program.exe /register aaaa-bbbb-cccc");
Main.say("Updating...");
Process p4 = Runtime.getRuntime().exec(installDir + "program.exe /update -silent");
I am wondering what the best way is to detect and kill a process if it exceeds a predefined time. I know the old way was to use the Watchdog/TimeoutObserver classes from the Ant package, but these are deprecated now, so I am wondering how it should be done today.
Here is the code I have which uses watchdog:
import org.apache.tools.ant.util.Watchdog;
import org.apache.tools.ant.util.TimeoutObserver;
public class executer implements TimeoutObserver {
private int timeOut = 0;
Process process = null;
private boolean killedByTimeout =false;
public executer(int to) {
timeOut = to;
}
public String executeCommand() throws Exception {
Watchdog watchDog = null;
String templine = null;
StringBuffer outputTrace = new StringBuffer();
StringBuffer errorTrace = new StringBuffer();
Runtime runtime = Runtime.getRuntime();
try {
//instantiate a new watch dog to kill the process
//if exceeds beyond the time
watchDog = new Watchdog(getTimeout());
watchDog.addTimeoutObserver(this);
watchDog.start();
process = runtime.exec(command);
//... Code to do the execution .....
InputStream inputStream = process.getInputStream();
InputStreamReader inputStreamReader = new InputStreamReader(inputStream);
BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
while (((templine = bufferedReader.readLine()) != null) && (!killedByTimeout)) {
outputTrace.append(templine);
outputTrace.append("\n");
}
this.setStandardOut(outputTrace);
int returnCode = process.waitFor();
//Set the return code
this.setReturnCode(returnCode);
if (killedByTimeout) {
//As process was killed by timeout just throw an exception
throw new InterruptedException("Process was killed before the waitFor was reached.");
}
} finally {
// stop the watchdog as no longer needed.
if (watchDog != null) {
aWatchDog.stop();
}
try {
// close buffered readers etc
} catch (Exception e) {
}
//Destroy process
// Process.destroy() sends a SIGTERM to the process. The default action
// when SIGTERM is received is to terminate, but any process is free to
// ignore the signal or catch it and respond differently.
//
// Also, the process started by Java might have created additional
// processes that don't receive the signal at all.
if(process != null) {
process.destroy();
}
}
return outputTrace.toString();
}
public void timeoutOccured(Watchdog arg0) {
killedByTimeout = true;
if (process != null){
process.destroy();
}
arg0.stop();
}
}
Any help would be greatly appreciated as I am a bit lost. I am trying to take this up to Java 7, but I am not up to date on the best way to kill the process if it hangs beyond the allotted time.
Thanks,
try
final Process p = ...
Thread t = new Thread() {
public void run() {
try {
Thread.sleep(1000);
p.destroy();
} catch (InterruptedException e) {
}
}
};
t.start();
p.waitFor();
t.interrupt();
Theoretically, Thread has a stop() method that kills the thread outright. This method has been deprecated since Java 1.1 because it may cause resource leaks, so you are really not recommended to use it.
The "right" solution is to implement your thread so that they can gracefully exit when receiving a special "signal". You can use "interruption" mechanism: your watchdog should call "interrupt()" of thread that exceeds the time limit. But thread should call isInterrupted() itself and exit if it is interrupted. The good news is that method like sleep() and wait() already support this, so if your thread is waiting and you interrupt it from outside it InterruptedException will be thrown.
I have written a set of ExecutorServices that will cancel processes after they have been given a certain period of time to execute. This code has been checked into GitHub.
The class to use to create the ExecutorService is CancelingExecutors. There are two main classes:
CancelingListeningExecutorService allows you to specify the timeout for each passed Callable
FixedTimeoutCancelingListeningExecutorService is configured to use a single timeout for all Callables
If your only concern is that Watchdog itself is deprecated, it is not difficult to use a TimerTask instead and call process.destroy() after a period of time.
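A small sketch of that idea, reusing the process field from the question's class (java.util.Timer/TimerTask imports assumed; the 30-second limit is illustrative):
Timer timer = new Timer(true); // daemon timer thread, won't keep the JVM alive
timer.schedule(new TimerTask() {
    @Override
    public void run() {
        process.destroy(); // safe to call even if the process has already exited
    }
}, 30_000);
int returnCode = process.waitFor(); // returns early if the task killed the process
timer.cancel(); // finished in time: cancel the pending kill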