Stream not closing appropriately while using named pipes in Java/Linux

I have a program where I use named pipes to share info with an external executable:
Process p = Runtime.getRuntime().exec("mkfifo /tmp/myfifo");
p.waitFor();
Process cat = Runtime.getRuntime().exec("cat /tmp/myfifo");
BufferedWriter fifo = new BufferedWriter(
        new OutputStreamWriter(new FileOutputStream("/tmp/myfifo")));
fifo.write("Hello!\n");
fifo.close();
cat.waitFor();
When I execute this, the program hangs waiting for cat to finish. It seems that cat has not 'realized' that the fifo was closed.
I tried running $> touch /tmp/myfifo from a terminal, and that 'unhung' the process, which then finished properly; but when I added code to run the same command from within my program, it kept hanging:
fifo.close();
Process touch = Runtime.getRuntime().exec("touch /tmp/myfifo");
touch.waitFor();
cat.waitFor();
The process will still hang waiting for cat to finish. I'm not sure what to do now.
NOTE - I have already added code to consume the output of the cat command, but the problem does not seem to be there.
Anyone know a workaround/fix for this?

As the Process javadoc warns, some native platforms only provide limited buffer size for standard input and output streams; failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock. So you need to consume the output, e.g. print it to stdout or write it to a file.
Try something like this:
Process cat = Runtime.getRuntime().exec("cat /tmp/myfifo");
new Thread(new Reader(cat.getErrorStream(), System.err)).start();
new Thread(new Reader(cat.getInputStream(), System.out)).start();
int returnCode = cat.waitFor();
System.out.println("Return code = " + returnCode);
import java.io.InputStream;
import java.io.OutputStream;

class Reader implements Runnable
{
    private final InputStream istrm;
    private final OutputStream ostrm;

    public Reader(InputStream istrm, OutputStream ostrm) {
        this.istrm = istrm;
        this.ostrm = ostrm;
    }

    public void run() {
        try {
            // copy everything from the process stream to the given sink
            final byte[] buffer = new byte[1024];
            for (int length = 0; (length = istrm.read(buffer)) != -1; ) {
                ostrm.write(buffer, 0, length);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

Related

Java exec method, how to handle streams correctly

What is the proper way to produce and consume the streams (IO) of an external process from Java? As far as I know, the Java-end input streams (the process's output) should be consumed in threads parallel to producing the process input, due to the possibly limited buffer size.
But I'm not sure if I eventually need to synchronize with those consumer threads, or whether it is enough to wait for the process to exit with the waitFor method to be certain that all the process output has actually been consumed. I.e., is it possible that, even if the process exits (closes its output stream), there is still unread data on the Java end of the stream? How does waitFor actually even know when the process is done? For the process in question, EOF (closing the Java end of its input stream) signals it to exit.
My current solution for handling the streams is the following:
import java.io.*;
import java.util.concurrent.CountDownLatch;

public class Application {

    private static final StringBuffer output = new StringBuffer();
    private static final StringBuffer errOutput = new StringBuffer();
    private static final CountDownLatch latch = new CountDownLatch(2);

    public static void main(String[] args) throws IOException, InterruptedException {
        Process exec = Runtime.getRuntime().exec("/bin/cat");
        OutputStream procIn = exec.getOutputStream();
        InputStream procOut = exec.getInputStream();
        InputStream procErrOut = exec.getErrorStream();

        new Thread(new StreamConsumer(procOut, output)).start();
        new Thread(new StreamConsumer(procErrOut, errOutput)).start();

        PrintWriter printWriter = new PrintWriter(procIn);
        printWriter.print("hello world");
        printWriter.flush();
        printWriter.close();

        int ret = exec.waitFor();
        latch.await();

        System.out.println(output.toString());
        System.out.println(errOutput.toString());
    }

    public static class StreamConsumer implements Runnable {

        private InputStream input;
        private StringBuffer output;

        public StreamConsumer(InputStream input, StringBuffer output) {
            this.input = input;
            this.output = output;
        }

        @Override
        public void run() {
            BufferedReader reader = new BufferedReader(new InputStreamReader(input));
            String line;
            try {
                while ((line = reader.readLine()) != null) {
                    output.append(line + System.lineSeparator());
                }
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                try {
                    reader.close();
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    latch.countDown();
                }
            }
        }
    }
}
Is it necessary to use the latch here, or does waitFor already imply that all the output has been consumed? Also, if the output doesn't end with (or contain) a newline, will readLine miss that output, or still read everything that is left? Does reading null mean the process has closed its end of the stream, or is there any other scenario where null could be read?
What is the correct way to handle the streams? Could I do something better than in my example?
waitFor signals that the process has ended, but you cannot be sure that the threads which collect strings from its stdout and stderr have finished as well, so using a latch is a step in the right direction, but not an optimal one.
Instead of waiting for the latch, you can wait for the threads directly:
Thread stdoutThread = new Thread(new StreamConsumer(procOut, output));
Thread stderrThread = new Thread(new StreamConsumer(procErrOut, errOutput));
stdoutThread.start();
stderrThread.start();
...
int ret = exec.waitFor();
stdoutThread.join();
stderrThread.join();
BTW, storing lines in StringBuffers is useless work. Use an ArrayList<String> instead: put the lines there without any conversion, and finally retrieve them in a loop.
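For example, the consumer could collect into a synchronized list (a sketch only; LineConsumer is a made-up name, not code from the question):
import java.io.*;
import java.util.*;

public class LineConsumer implements Runnable {
    private final InputStream input;
    private final List<String> lines = Collections.synchronizedList(new ArrayList<String>());

    public LineConsumer(InputStream input) {
        this.input = input;
    }

    public List<String> getLines() {
        return lines;
    }

    @Override
    public void run() {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(input))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line); // store the line as-is, no separator bookkeeping
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
After join() returns, getLines() can simply be iterated to print or post-process the output.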
Your approach is right, but it's better to remove the CountDownLatch and use a thread pool instead of creating new Threads directly. From the thread pool you will get two Futures, which you can then wait on for completion.
But I'm not sure if I eventually need to synchronize with those consumer threads, or whether it is enough to wait for the process to exit with the waitFor method to be certain that all the process output has actually been consumed. I.e., is it possible that, even if the process exits (closes its output stream), there is still unread data on the Java end of the stream?
Yes, this situation may occur. Termination of the process and the reading of its IO streams are unrelated events.
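A minimal sketch of that thread-pool idea, reusing the StreamConsumer from the question with the latch removed (needs java.util.concurrent imports):
ExecutorService pool = Executors.newFixedThreadPool(2);
Future<?> outFuture = pool.submit(new StreamConsumer(procOut, output));
Future<?> errFuture = pool.submit(new StreamConsumer(procErrOut, errOutput));

int ret = exec.waitFor();
try {
    outFuture.get(); // returns only once stdout has been fully drained
    errFuture.get(); // returns only once stderr has been fully drained
} catch (ExecutionException e) {
    e.printStackTrace();
}
pool.shutdown();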

Problems again with child processes in Java

I am on Ubuntu 14.04.
I am trying to run something like ps aux | grep whatevah through Java's ProcessBuilder class. I create two child processes and make them communicate synchronously, but for some reason I cannot see anything in the terminal.
This is the code:
try {
    // What comes out of process1 is our inputStream
    Process process1 = new ProcessBuilder("ps", "aux").start();
    InputStream is1 = process1.getInputStream();
    BufferedReader br1 = new BufferedReader(new InputStreamReader(is1));

    // What goes into process2 is our outputStream
    Process process2 = new ProcessBuilder("grep", "gedit").start();
    OutputStream os = process2.getOutputStream();
    BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(os));

    // Send the output of process1 to the input of process2
    String p1Output = null;
    while ((p1Output = br1.readLine()) != null) {
        bw.write(p1Output);
        System.out.println(p1Output);
    }

    // Synchronization
    int finish = process2.waitFor();
    System.out.println(finish);

    // What comes out of process2 is our inputStream
    InputStream is2 = process2.getInputStream();
    BufferedReader br2 = new BufferedReader(new InputStreamReader(is2));
    String combOutput = null;
    while ((combOutput = br2.readLine()) != null)
        System.out.println(combOutput);

    os.close();
    is1.close();
    is2.close();
} catch (IOException e) {
    System.out.println("Command execution error: " + e.getMessage());
} catch (Exception e) {
    System.out.println("General error: " + e.getMessage());
}
(The System.out.println(p1Output); is just there for me to check; the print that has to work is the last one, printing the result of ps aux | grep whatevah.)
I've tried several things; the least silly ones include:
If I comment everything regarding process2, I get the result of ps aux printed on the terminal
If I run the program as is, it prints nothing to the terminal.
If I uncomment the waitFor call, only ps aux gets printed.
If change the commands to, for example, ls -al and ls -al, then both get printed.
I tried changing "aux" for "aux |" but still nothing is printed.
Closed the buffers, also nothing
etc.
Any help will be sorely appreciated.
Cheers!
EDIT
Minutes after accepting Ryan's amazing answer I made my last try to make this code work. And I succeeded! I changed:
while ((p1Output = br1.readLine()) != null) {
    bw.write(p1Output);
    System.out.println(p1Output);
}
for:
while ((p1Output = br1.readLine()) != null) {
    bw.write(p1Output + "\n");
    System.out.println(p1Output);
}
bw.close();
and it works! I remember closing the buffer before, so I don't know what went wrong. Turns out you should not stay up late trying to make a piece of code work XD.
Ryan's answer down here is still amazing, though.
Given the advice in the comments, the important thing to note is the necessity of using threads to process a process's input/output in order to achieve what you want.
I've used the link posted by jtahlborn and adapted this solution that you might be able to use.
I created a simple example that will list files in a directory and grep through the output.
This example simulates the command ls -1 | grep some in a directory called test containing three files: somefile.txt, someotherfile.txt, and this_other_file.csv.
EDIT: The original solution didn't really fully use the "pipe" methodology, as it was waiting fully for p1 to finish before starting p2. Rather, it should start them both, and then the output of the first should be piped to the second. I've updated the solution with a class that accomplishes this.
import java.io.*;
import java.util.Scanner;

public class Main {
    public static void main(String[] args) {
        try {
            // construct a process
            ProcessBuilder pb1 = new ProcessBuilder("ls", "-1");
            // set working directory
            pb1.directory(new File("test"));
            // start process
            final Process process1 = pb1.start();

            // get input/error streams
            final InputStream p1InStream = process1.getInputStream();
            final InputStream p1ErrStream = process1.getErrorStream();

            // handle error stream
            Thread t1Err = new InputReaderThread(p1ErrStream, "Process 1 Err");
            t1Err.start();

            // the pipe thread created below will print out the data from process 1
            // (for illustration purposes) and redirect it to process 2
            Process process2 = new ProcessBuilder("grep", "some").start();

            // process 2 streams
            final InputStream p2InStream = process2.getInputStream();
            final InputStream p2ErrStream = process2.getErrorStream();
            final OutputStream p2OutStream = process2.getOutputStream();

            // do the same as process 1 for process 2...
            Thread t2In = new InputReaderThread(p2InStream, "Process 2 Out");
            t2In.start();
            Thread t2Err = new InputReaderThread(p2ErrStream, "Process 2 Err");
            t2Err.start();

            // create a new thread with our pipe class;
            // pass in the input stream of p1, the output stream of p2,
            // and the name of the input stream
            new Thread(new PipeClass(p1InStream, p2OutStream, "Process 1 Out")).start();

            // wait for p2 to finish
            process2.waitFor();
        } catch (IOException e) {
            System.out.println("Command execution error: " + e.getMessage());
        } catch (Exception e) {
            System.out.println("General error: " + e.getMessage());
        }
    }
}
This is a class that will be used to simulate a process pipe. It uses some loops to copy bytes around, and could be more efficient, depending on your needs, but for the illustration, it should work.
import java.io.*;
import java.util.Scanner;

// this class simulates a pipe between two processes
public class PipeClass implements Runnable {
    // the input stream
    InputStream is;
    // the output stream
    OutputStream os;
    // the name associated with the input stream (for printing purposes only...)
    String isName;

    // constructor
    public PipeClass(InputStream is, OutputStream os, String isName) {
        this.is = is;
        this.os = os;
        this.isName = isName;
    }

    @Override
    public void run() {
        try {
            // use a byte array output stream so we can clone the data and use it multiple times
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            // read the data into the output stream (it has to fit in memory for this to work...)
            byte[] buffer = new byte[512]; // adjust if you want
            int bytesRead;
            while ((bytesRead = is.read(buffer)) != -1) {
                baos.write(buffer, 0, bytesRead);
            }
            // clone it so we can print it out
            InputStream clonedIs1 = new ByteArrayInputStream(baos.toByteArray());
            Scanner sc = new Scanner(clonedIs1);
            // print the info
            while (sc.hasNextLine()) {
                System.out.println(this.isName + " >> " + sc.nextLine());
            }
            // clone again to redirect to the output of the other process
            InputStream clonedIs2 = new ByteArrayInputStream(baos.toByteArray());
            while ((bytesRead = clonedIs2.read(buffer)) != -1) {
                // write it out to the output stream
                os.write(buffer, 0, bytesRead);
            }
        } catch (IOException ex) {
            ex.printStackTrace();
        } finally {
            try {
                // close so the process will finish
                is.close();
                os.close();
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
    }
}
This is a class that was created for handling process output, adapted from this reference
// Thread reader class adapted from
// http://www.javaworld.com/article/2071275/core-java/when-runtime-exec---won-t.html
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.Scanner;

public class InputReaderThread extends Thread {
    // input stream
    InputStream is;
    // name
    String name;
    // is there data?
    boolean hasData = false;
    // the data itself
    StringBuilder data = new StringBuilder();

    // constructor
    public InputReaderThread(InputStream is, String name) {
        this.is = is;
        this.name = name;
    }

    // set if there's data to read
    public synchronized void setHasData(boolean hasData) {
        this.hasData = hasData;
    }

    // data available?
    public boolean hasData() { return this.hasData; }

    // get the data
    public StringBuilder getData() {
        setHasData(false); // clear the flag
        StringBuilder returnData = this.data;
        this.data = new StringBuilder();
        return returnData;
    }

    @Override
    public void run() {
        // input reader
        InputStreamReader isr = new InputStreamReader(this.is);
        Scanner sc = new Scanner(isr);
        // while data remains
        while (sc.hasNextLine()) {
            // print out and append to data
            String line = sc.nextLine();
            System.out.println(this.name + " >> " + line);
            this.data.append(line + "\n");
        }
        // flag that there's data available
        setHasData(true);
    }
}
The produced output is:
Process 1 Out >> somefile.txt
Process 1 Out >> someotherfile.txt
Process 1 Out >> this_other_file.csv
Process 2 Out >> somefile.txt
Process 2 Out >> someotherfile.txt
To show that piping is really working, if the command is changed to ps -a | grep usr, the output is:
Process 1 Out >> PID PPID PGID WINPID TTY UID STIME COMMAND
Process 1 Out >> I 15016 1 15016 15016 con 400 13:45:59 /usr/bin/grep
Process 1 Out >> 15156 1 15156 15156 con 400 14:21:54 /usr/bin/ps
Process 1 Out >> I 9784 1 9784 9784 con 400 14:21:54 /usr/bin/grep
Process 2 Out >> I 15016 1 15016 15016 con 400 13:45:59 /usr/bin/grep
Process 2 Out >> 15156 1 15156 15156 con 400 14:21:54 /usr/bin/ps
Process 2 Out >> I 9784 1 9784 9784 con 400 14:21:54 /usr/bin/grep
Seeing the grep command in process 2's output shows that the piping is working; with the old solution I posted, this would be missing.
Note the handling of the error stream, which is always good practice, even if you don't plan to use it.
This is a quick and dirty solution that could benefit from some additional thread management techniques, but it should get you what you want.
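As a side note, on Java 9 or newer, ProcessBuilder.startPipeline wires the pipe at the OS level, which makes the hand-written copy thread unnecessary (a minimal sketch, not part of the answer above):
import java.io.*;
import java.util.List;

public class PipelineSketch {
    public static void main(String[] args) throws IOException, InterruptedException {
        // run ls -1 | grep some; the JVM connects the intermediate streams itself
        List<Process> pipeline = ProcessBuilder.startPipeline(List.of(
                new ProcessBuilder("ls", "-1"),
                new ProcessBuilder("grep", "some")));
        Process last = pipeline.get(pipeline.size() - 1);
        // only the last process's stdout needs consuming here
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(last.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        last.waitFor();
    }
}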

Run a process asynchronously and read from stdout and stderr

I have some code that runs a process and reads from the stdout and stderr asynchronously and then handles when the process completes. It looks something like this:
Process process = builder.start();
Thread outThread = new Thread(() -> {
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
        // Read stream here
    } catch (Exception e) {
    }
});
Thread errThread = new Thread(() -> {
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
        // Read stream here
    } catch (Exception e) {
    }
});
outThread.start();
errThread.start();
new Thread(() -> {
    int exitCode = -1;
    try {
        exitCode = process.waitFor();
        outThread.join();
        errThread.join();
    } catch (Exception e) {
    }
    // Process completed and all stdout and stderr has been read here
}).start();
My issue is with the fact that I am using 3 threads to achieve this asynchronous "run-and-get-output" task. I don't know why, but using 3 threads for this doesn't feel right. I could allocate the threads out of a thread pool, but that would still block those threads.
Is there anything I can do, maybe with NIO, to reduce this to fewer (one?) threads? Anything I can think of would be constantly spinning a thread (unless I add a few sleeps), which I don't really want to do either...
NOTE: I do need to read the output as I go (rather than when the process has stopped), and I do need to keep stdout separate from stderr, so I can't do a redirect.
Since you've specified that you need to read the output as you go, there is no non-multi-threaded solution.
You can reduce the number of threads to one beyond your main thread, though:
Process process = builder.start();
Thread errThread = new Thread(() -> {
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
        // Read stream here
    } catch (Exception e) {
    }
});
errThread.start();

try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
    // Read stream here
} catch (Exception e) {
}

// we got an end of file, so there can't be any more input;
// now we need to wait for stderr/process exit
int exitCode = -1;
try {
    exitCode = process.waitFor();
    errThread.join();
} catch (Exception e) {
}
// Process completed
// Process completed
If you truly don't need to deal with the error/output until after the process ends, you can simplify it a bit and use only your main thread, redirecting both streams to temporary files:
File stderrFile = File.createTempFile("tmpErr", "out");
File stdoutFile = File.createTempFile("tmpStd", "out");
try {
    ProcessBuilder builder = new ProcessBuilder("ls", "/tmp");
    // send the process's stderr and stdout straight to the temp files
    builder.redirectError(stderrFile);
    builder.redirectOutput(stdoutFile);
    Process p = builder.start();
    int exitCode = -1;
    boolean done = false;
    while (!done) {
        try {
            exitCode = p.waitFor();
            done = true;
        } catch (InterruptedException ie) {
            System.out.println("Interrupted waiting for process to exit.");
        }
    }
    BufferedReader err = new BufferedReader(new FileReader(stderrFile));
    BufferedReader in = new BufferedReader(new FileReader(stdoutFile));
    ....
} finally {
    stderrFile.delete();
    stdoutFile.delete();
}
This is probably not a good idea if the process you are calling generates a lot of output, as it could run out of disk space... but it'll likely be slightly faster since it doesn't have to spin up another thread.
Assuming you don't mind the output and error streams being merged, you could use only one thread:
builder.redirectErrorStream(true); // merge the process's output and error streams
Process process = builder.start();
Thread singleThread = new Thread(() -> {
    int exitCode = -1;
    // read from the merged stream
    try (BufferedReader reader =
             new BufferedReader(new InputStreamReader(process.getInputStream()))) {
        String line;
        // read until the stream is exhausted, meaning the process has terminated
        while ((line = reader.readLine()) != null) {
            System.out.println(line); // use the output here
        }
        // get the exit code if required
        exitCode = process.waitFor();
    } catch (Exception e) {
    }
});
singleThread.start();
Have a look at the ExecHelper from OstermillerUtils.
The idea is that the thread waiting for the process to complete does not just wait, but reads input from stdout and stderr when input is available, and regularly checks whether the process has finished.
If you do not do any heavy processing with the input from stdout and stderr, you might not need an extra thread to handle the input. Just copy ExecHelper and add some extra functions/methods to process any new input. I've done this before to show the process output while the process is running; it is not difficult to do (but I lost the source code).
If you do need a separate thread for processing the input, make sure to synchronize the output and error StringBuffers when these buffers are updated or read.
Another thing you might want to consider is adding an abort time-out. It is a little bit harder to implement but was very valuable to me: if a process takes too much time, it gets destroyed, which in turn ensures nothing remains hanging. You can find an old (outdated?) example in this gist.
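That polling idea can be sketched with just the standard library (this is not ExecHelper itself; the 10 ms sleep and the 30-second time-out are arbitrary choices, and isAlive() needs Java 8):
// assumes Process p has already been started; throws IOException/InterruptedException
InputStream out = p.getInputStream();
InputStream err = p.getErrorStream();
byte[] buf = new byte[4096];
long deadline = System.currentTimeMillis() + 30_000; // abort time-out
while (true) {
    boolean finished = !p.isAlive();
    // drain whatever is currently available, without blocking
    while (out.available() > 0) {
        int n = out.read(buf, 0, Math.min(buf.length, out.available()));
        if (n > 0) System.out.write(buf, 0, n);
    }
    while (err.available() > 0) {
        int n = err.read(buf, 0, Math.min(buf.length, err.available()));
        if (n > 0) System.err.write(buf, 0, n);
    }
    if (finished) break; // we drained once more after exit, so nothing is lost
    if (System.currentTimeMillis() > deadline) {
        p.destroy(); // abort: make sure nothing remains hanging
        break;
    }
    Thread.sleep(10); // regularly re-check
}
System.out.flush();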
You'll have to compromise. Here are your options:
A. You can do it with 2 threads (instead of 3):
First thread:
- read from stdout until readLine returns null
- call Process.waitFor()
- join thread #2
Second thread:
- read from stderr until readLine returns null
B. Merge the streams and use Debian's annotate-output to discriminate the two streams (see the sketch after this list):
http://manpages.debian.org/cgi-bin/man.cgi?query=annotate-output&sektion=1
C. If it's a short-lived process, just wait for it to end.
D. If it's a long-lived process, you can spin between the readers with some sleep in between.
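For option B, the merged output can be split apart again by the prefixes annotate-output adds (a sketch, assuming the default "HH:MM:SS O:" / "HH:MM:SS E:" line annotations):
ProcessBuilder pb = new ProcessBuilder("annotate-output", "ls", "-al");
pb.redirectErrorStream(true); // merging is safe here: the prefix carries the stream identity
Process p = pb.start();
try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
    String line;
    while ((line = r.readLine()) != null) {
        if (line.contains(" E: ")) {
            System.err.println(line); // originally written to stderr
        } else {
            System.out.println(line); // stdout, plus annotate-output's own I: status lines
        }
    }
}
p.waitFor();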

ReadLine on TCPDump-Buffer sometimes blocks until kill tcpdump

I have a problem using TCPDump from my Android-Application.
It is supposed to read the output from tcpdump line by line and process it within my application. The problem is: sometimes the code works fine, reading the captured packets immediately. But sometimes, readLine blocks until I kill the tcpdump process from the Linux console (killall tcpdump). After doing that, my loop is processed for each line (sometimes 10 lines, sometimes 1 or 2) - which means the readLine should have worked, but didn't.
I read about similar problems, but did not find any solution for this problem... THANKS!!
public class ListenActivity extends Activity {

    static ArrayList<Packet> packetBuffer = new ArrayList<Packet>();
    static Process tcpDumpProcess = null;
    static ListenThread thread = null;
    public static final String TCPDUMP_COMMAND = "tcpdump -A -s0 | grep -i -e 'Cookie'\n";

    private InputStream inputStream = null;
    private OutputStream outputStream = null;

    @Override
    protected void onStart() {
        super.onStart();
        try {
            tcpDumpProcess = new ProcessBuilder().command("su").redirectErrorStream(true).start();
            inputStream = tcpDumpProcess.getInputStream();
            outputStream = tcpDumpProcess.getOutputStream();
            outputStream.write(TCPDUMP_COMMAND.getBytes("ASCII"));
        } catch (Exception e) {
            Log.e("FSE", "", e);
        }
        thread = new ListenThread(new BufferedReader(new InputStreamReader(inputStream)));
        thread.start();
    }

    private class ListenThread extends Thread {

        private BufferedReader reader = null;

        public ListenThread(BufferedReader reader) {
            this.reader = reader;
        }

        @Override
        public void run() {
            while (true) {
                try {
                    String received = reader.readLine();
                    Log.d("FS", received);
                    Packet pReceived = Packet.analyze(received);
                    if (pReceived != null) {
                        packetBuffer.add(pReceived);
                    }
                } catch (Exception e) {
                    Log.e("FSE", "", e);
                }
            }
        }
    }
}
Because output sent to pipes is usually block buffered, both the tcpdump process and the grep process will wait until they've received enough data before passing it on to your program. You're very lucky, though: both programs you have chosen are prepared to modify their buffering behavior (using the setvbuf(3) function internally, in case you're curious about the details):
For tcpdump(8):
-l    Make stdout line buffered. Useful if you want to see the data while
      capturing it. E.g., ``tcpdump -l | tee dat'' or
      ``tcpdump -l > dat & tail -f dat''.
For grep(1):
--line-buffered
      Use line buffering on output. This can cause a performance penalty.
Try this:
"tcpdump -l -A -s0 | grep --line-buffered -i -e 'Cookie'\n";
I don't understand why, but even with the -l option the buffer is too large if you read from the standard output of the process in which you run tcpdump.
I solved this problem by redirecting tcpdump's output to a file and reading that file from another thread. The tcpdump command should be something like:
tcpdump -l -A -s0 > /data/local/output.txt
The run method inside your thread has to be changed to read from the output file:
File dumpedFile = new File("/data/local/output.txt");
// open a reader on the tcpdump output file
BufferedReader reader = new BufferedReader(new FileReader(dumpedFile));
String temp = new String();
// the while loop is broken if the thread is interrupted
while (!Thread.interrupted()) {
    temp = reader.readLine();
    if (temp != null) {
        Log.e("READER", new String(temp));
    }
}
I don't know exactly what you want to do with grep, but I think it's possible to achieve the same thing with a regexp inside the Java code.
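For instance (a sketch only; the pattern mirrors grep -i -e 'Cookie' from the question):
// needs: import java.util.regex.Pattern;
Pattern cookiePattern = Pattern.compile("cookie", Pattern.CASE_INSENSITIVE);
// inside ListenThread's read loop:
String received = reader.readLine();
if (received != null && cookiePattern.matcher(received).find()) {
    Packet pReceived = Packet.analyze(received); // only lines grep would have matched
}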
You should also be aware that the tcpdump process will never end on its own, so you have to kill it when your activity is paused or destroyed.
You can have a look at my blog post, where I explain my whole code to start/stop tcpdump.
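The cleanup could live in the activity lifecycle, along these lines (a sketch; note that destroy() only kills the su shell, so a stray tcpdump may still need killall tcpdump, as the question observed):
@Override
protected void onPause() {
    super.onPause();
    if (tcpDumpProcess != null) {
        tcpDumpProcess.destroy(); // ends the su shell started in onStart()
        tcpDumpProcess = null;
    }
    if (thread != null) {
        thread.interrupt(); // stops a read loop that checks Thread.interrupted()
        thread = null;
    }
}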

java: ProcessBuilder makes a memory hog

I have some issues regarding ProcessBuilder.
The program is basically a simple wrapper invoking a command line script.
When running the script on its own via the terminal, the memory consumption stays below 2G.
When running the script via the java wrapper, the memory consumption explodes and even 8G is quickly filled up, resulting in out-of-memory errors.
The code to launch the process is simply:
public static int execute(String command) throws IOException
{
    System.out.println("Executing: " + command);
    ProcessBuilder pb = new ProcessBuilder(command.split(" +"));
    Process p = pb.start();

    // display any output in stderr or stdout
    StreamConsumer stderr = new StreamConsumer(p.getErrorStream(), "stderr");
    StreamConsumer stdout = new StreamConsumer(p.getInputStream(), "stdout");
    new Thread(stderr).start();
    new Thread(stdout).start();

    try {
        return p.waitFor();
    } catch (InterruptedException e) {
        throw new RuntimeException(e);
    }
}
The StreamConsumer class is simply a class which consumes the stdout/stderr streams and displays them on the console.
...the question is: why on earth does the memory consumption explode?
Regards,
Arnaud
Edit:
Whether I use ProcessBuilder or Runtime.getRuntime().exec(...), the result is the same.
The memory bursts tend to appear during the Unix 'sort' invoked by the shell script:
sort big-text-file > big-text-file.sorted
Edit 2, on request of Jim Garrison:
Ok, here is the StreamConsumer class, which I omitted because it is rather simple:
class StreamConsumer implements Runnable
{
    InputStream stream;
    String descr;

    StreamConsumer(InputStream stream, String descr) {
        this.stream = stream;
        this.descr = descr;
    }

    @Override
    public void run()
    {
        String line;
        BufferedReader brCleanUp =
            new BufferedReader(new InputStreamReader(stream));
        try {
            while ((line = brCleanUp.readLine()) != null)
                System.out.println("[" + descr + "] " + line);
            brCleanUp.close();
        } catch (IOException e) {
            // TODO: handle exception
        }
    }
}
If you change your command like this:
sort -o big-text-file.sorted big-text-file
is it still the same?
Maybe it's because those StreamConsumer threads are not daemons, so they don't die and get garbage collected when your processes return? You could try:
//...
final StreamConsumer stderr = new StreamConsumer(p.getErrorStream(), "stderr");
final StreamConsumer stdout = new StreamConsumer(p.getInputStream(), "stdout");
final Thread stderrThread = new Thread(stderr);
final Thread stdoutThread = new Thread(stdout);
stderrThread.setDaemon(true);
stdoutThread.setDaemon(true);
stderrThread.start();
stdoutThread.start();
//...
Is this behavior happening for a single invocation, or only after doing this many times?
