Create a process in Java which always stays alive - java

I created a class to execute CMD commands continuously. It works fine for the first iteration in the code below, but the problem is that the process dies after one iteration.
class CommandLine {
    Process Handle;
    OutputStreamWriter writer;
    Scanner getCommand;
    Socket socket;

    public CommandLine(Socket socket) throws IOException {
        this.socket = socket;
    }

    public void executeCommand() {
        try {
            getCommand = new Scanner(socket.getInputStream()).useDelimiter("\\A");
            Handle = new ProcessBuilder("cmd.exe").redirectErrorStream(true).start();
            while (getCommand.hasNextLine()) {
                try (PrintWriter stdin = new PrintWriter(Handle.getOutputStream())) {
                    stdin.write(getCommand.nextLine() + System.lineSeparator());
                    stdin.flush();
                }
                if (Handle.getInputStream().read() > 0) {
                    Scanner result = new Scanner(Handle.getInputStream()).useDelimiter("\\A");
                    while (result.hasNextLine()) {
                        System.out.print(result.nextLine() + "\n");
                    }
                }
            }
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
Thanks for any response.

You need to re-organise your code. The sub-process dies because you've got a try-with-resources block inside the loop:
try (PrintWriter stdin = new PrintWriter(Handle.getOutputStream())) {
    stdin.write(getCommand.nextLine() + System.lineSeparator());
    stdin.flush();
}
The try-with-resources block closes the PrintWriter at the end of each iteration, which closes STDIN of the sub-process, and CMD.EXE exits with it.
Also note that just moving the PrintWriter stdin outside the loop isn't enough. You won't be able to reliably supply STDIN and read STDOUT in the same loop, as STDOUT might produce many lines and block the sub-process while you are writing to STDIN.
The fix is easy: follow @VGR's suggestion and replace .redirectErrorStream(true) with either .redirectOutput(ProcessBuilder.Redirect.INHERIT) or .inheritIO(), which means you don't need to read getInputStream() at all. Alternatively, use a background thread either for the writes to STDIN or for the reads from getInputStream() / STDOUT.
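For illustration, here is a minimal sketch of the reorganised class along the INHERIT route. This is an assumption about what you want (the sub-process output going straight to the parent's console); the single long-lived PrintWriter is the key change:

import java.io.IOException;
import java.io.PrintWriter;
import java.net.Socket;
import java.util.Scanner;

class CommandLine {
    private final Socket socket;

    public CommandLine(Socket socket) {
        this.socket = socket;
    }

    public void executeCommand() throws IOException {
        // Sub-process stdout/stderr go straight to this JVM's console,
        // so there is no need to read getInputStream() at all.
        Process handle = new ProcessBuilder("cmd.exe")
                .redirectOutput(ProcessBuilder.Redirect.INHERIT)
                .redirectError(ProcessBuilder.Redirect.INHERIT)
                .start();
        Scanner getCommand = new Scanner(socket.getInputStream());
        // One PrintWriter for the lifetime of the process: closing it per
        // iteration is exactly what closed STDIN and killed cmd.exe before.
        PrintWriter stdin = new PrintWriter(handle.getOutputStream());
        while (getCommand.hasNextLine()) {
            stdin.println(getCommand.nextLine());
            stdin.flush();
        }
        stdin.close(); // no more commands; now cmd.exe may exit
    }
}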

Related

Java exec method, how to handle streams correctly

What is the proper way to produce and consume the streams (IO) of an external process from Java? As far as I know, the Java-end input streams (the process's output) should be consumed in threads running in parallel with producing the process's input, due to the possibly limited buffer size.
But I'm not sure if I eventually need to synchronize with those consumer threads, or whether it is enough to wait for the process to exit with the waitFor method to be certain that all the process output has actually been consumed. I.e. is it possible, even if the process exits (closes its output stream), that there is still unread data on the Java end of the stream? How does waitFor actually know when the process is done? For the process in question, EOF (closing the Java end of its input stream) signals it to exit.
My current solution for handling the streams is the following:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.util.concurrent.CountDownLatch;

public class Application {
    private static final StringBuffer output = new StringBuffer();
    private static final StringBuffer errOutput = new StringBuffer();
    private static final CountDownLatch latch = new CountDownLatch(2);

    public static void main(String[] args) throws IOException, InterruptedException {
        Process exec = Runtime.getRuntime().exec("/bin/cat");
        OutputStream procIn = exec.getOutputStream();
        InputStream procOut = exec.getInputStream();
        InputStream procErrOut = exec.getErrorStream();
        new Thread(new StreamConsumer(procOut, output)).start();
        new Thread(new StreamConsumer(procErrOut, errOutput)).start();
        PrintWriter printWriter = new PrintWriter(procIn);
        printWriter.print("hello world");
        printWriter.flush();
        printWriter.close();
        int ret = exec.waitFor();
        latch.await();
        System.out.println(output.toString());
        System.out.println(errOutput.toString());
    }

    public static class StreamConsumer implements Runnable {
        private InputStream input;
        private StringBuffer output;

        public StreamConsumer(InputStream input, StringBuffer output) {
            this.input = input;
            this.output = output;
        }

        @Override
        public void run() {
            BufferedReader reader = new BufferedReader(new InputStreamReader(input));
            String line;
            try {
                while ((line = reader.readLine()) != null) {
                    output.append(line + System.lineSeparator());
                }
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            } finally {
                try {
                    reader.close();
                } catch (IOException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                } finally {
                    latch.countDown();
                }
            }
        }
    }
}
Is it necessary to use the latch here, or does waitFor imply that all the output has already been consumed? Also, if the output doesn't end with (or contain) a newline, will readLine miss the output, or will it still read everything that is left? Does reading null mean the process has closed its end of the stream, or is there any other scenario in which null could be read?
What is the correct way to handle the streams? Could I do something better than in my example?
waitFor signals that the process has ended, but you cannot be sure the threads which collect strings from its stdout and stderr have finished as well, so using a latch is a step in the right direction, but not an optimal one.
Instead of waiting for the latch, you can wait for the threads directly:
Thread stdoutThread = new Thread(new StreamConsumer(procOut, output));
stdoutThread.start();
Thread stderrThread = ...
...
int ret = exec.waitFor();
stdoutThread.join();
stderrThread.join();
BTW, storing lines in StringBuffers is useless work. Use an ArrayList<String> instead, add lines to it without any conversion, and finally retrieve them in a loop.
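A minimal sketch of that change, with StreamConsumer as a top-level class this time. Note the plain ArrayList handoff is only safe here because the main thread calls join() before reading the list:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.List;

class StreamConsumer implements Runnable {
    private final InputStream input;
    private final List<String> output; // e.g. an ArrayList<String> supplied by the caller

    public StreamConsumer(InputStream input, List<String> output) {
        this.input = input;
        this.output = output;
    }

    @Override
    public void run() {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(input))) {
            String line;
            while ((line = reader.readLine()) != null) {
                output.add(line); // store the line as-is; no StringBuffer appends
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}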
Your approach is right, but it's better to remove the CountDownLatch and use a thread pool rather than creating new Threads directly. From the thread pool you get two futures, which you can then wait on for completion, as sketched below.
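A sketch of that thread-pool variant, keeping the /bin/cat example from the question; the readAll helper and the pool size of 2 are my assumptions, not part of the original code:

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

public class Application {
    public static void main(String[] args) throws Exception {
        Process exec = Runtime.getRuntime().exec("/bin/cat");
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // One task per stream; each drains its stream completely.
        Callable<List<String>> outTask = () -> readAll(exec.getInputStream());
        Callable<List<String>> errTask = () -> readAll(exec.getErrorStream());
        Future<List<String>> out = pool.submit(outTask);
        Future<List<String>> err = pool.submit(errTask);

        try (PrintWriter stdin = new PrintWriter(exec.getOutputStream())) {
            stdin.print("hello world");
        } // closing stdin sends EOF, which makes cat exit

        int ret = exec.waitFor();
        // Future.get() waits for each consumer to finish, replacing the latch.
        System.out.println("exit=" + ret);
        System.out.println(out.get());
        System.out.println(err.get());
        pool.shutdown();
    }

    private static List<String> readAll(InputStream in) {
        return new BufferedReader(new InputStreamReader(in))
                .lines().collect(Collectors.toList());
    }
}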
But I'm not sure if I eventually need to synchronize with those consumer threads, or whether it is enough to wait for the process to exit with the waitFor method to be certain that all the process output has actually been consumed. I.e. is it possible, even if the process exits (closes its output stream), that there is still unread data on the Java end of the stream?
Yes, this situation can occur. Process termination and the reading of its IO streams are unrelated.

PrintWriter.print blocks when the string is too long

I'm writing a wrapper program in Java that's just supposed to pass arguments to other processes by writing to their standard-in streams, and read the response from their standard-out streams. However, when the String I try to pass in is too large, PrintWriter.print simply blocks. No error, it just freezes. Is there a good workaround for this?
Relevant code
import java.io.OutputStreamWriter;
import java.io.PrintWriter;

public class Wrapper {
    Process process;
    PrintWriter writer;

    public Wrapper(String command) {
        start(command);
    }

    public void call(String args) {
        writer.println(args); // Blocks here
        writer.flush();
        // Other code
    }

    public void start(String command) {
        try {
            ProcessBuilder pb = new ProcessBuilder(command.split(" "));
            pb.redirectErrorStream(true);
            process = pb.start();
            // STDIN of the process.
            writer = new PrintWriter(new OutputStreamWriter(process.getOutputStream(), "UTF-8"));
        } catch (Exception e) {
            e.printStackTrace();
            System.out.println("Process ended catastrophically.");
        }
    }
}
If I try using
writer.print(args);
writer.print("\n");
it can handle a larger string before freezing, but still ultimately locks up.
Is there maybe a buffered-stream way to fix this? Does print block on the process's stream having enough space, or something like that?
Update
In response to some answers and comments, I've included more information.
Operating system is Windows 7.
BufferedWriter slows the run time, but doesn't stop it from blocking eventually.
Strings can get very long, as large as 100,000 characters.
The process's input is consumed, but line by line, i.e. Scanner.nextLine().
Test code
import java.io.IOException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeoutException;
import ProcessRunner.Wrapper;
public class test {
public static void main(String[] args){
System.out.println("Building...");
Wrapper w = new Wrapper("java echo");
System.out.println("Calling...");
String market = "aaaaaa";
for(int i = 0; i < 1000; i++){
try {
System.out.println(w.call(market, 1000));
} catch (InterruptedException | ExecutionException
| TimeoutException e) {
System.out.println("Timed out");
}
market = market + market;
System.out.println("Size = " + market.length());
}
System.out.println("Stopping...");
try {
w.stop();
} catch (IOException e) {
e.printStackTrace();
System.out.println("Stop failed :(");
}
}
}
Test process:
You first have to compile this file, and make sure the .class is in the same folder as the test .class file.
import java.util.Scanner;

public class echo {
    public static void main(String[] args) {
        Scanner stdIn = new Scanner(System.in); // create once, not per iteration
        while (true) {
            System.out.println(stdIn.nextLine());
        }
    }
}
I suspect that what is happening here is that the external process is writing to its standard output. Since your Java code doesn't read it, it eventually fills the external process's stdout (or stderr) pipe buffer. That blocks the external process, which means that it cannot read from its input pipe ... and your Java process freezes.
If this is the problem, then using a buffered writer won't fix it. You either need to read the external process's output or redirect it to a file (e.g. "/dev/null" on Linux).
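A sketch of the first option, as a hypothetical addition to the end of the Wrapper's start method from the question; the gobbler thread and its println consumer are my assumptions:

// Drain the (merged) output on a background thread so the child process
// never blocks on a full stdout pipe while we write to its stdin.
Thread gobbler = new Thread(() -> {
    try (BufferedReader r = new BufferedReader(
            new InputStreamReader(process.getInputStream(), "UTF-8"))) {
        String line;
        while ((line = r.readLine()) != null) {
            System.out.println(line); // hand the response to whoever needs it
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
});
gobbler.setDaemon(true); // don't let the gobbler keep the JVM alive
gobbler.start();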
Writing to any pipe or socket by any means in java.io blocks if the peer reads more slowly than you write.
Nothing you can do about it.

Run a process asynchronously and read from stdout and stderr

I have some code that runs a process and reads from the stdout and stderr asynchronously and then handles when the process completes. It looks something like this:
Process process = builder.start();
Thread outThread = new Thread(() -> {
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
        // Read stream here
    } catch (Exception e) {
    }
});
Thread errThread = new Thread(() -> {
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
        // Read stream here
    } catch (Exception e) {
    }
});
outThread.start();
errThread.start();
new Thread(() -> {
    int exitCode = -1;
    try {
        exitCode = process.waitFor();
        outThread.join();
        errThread.join();
    } catch (Exception e) {
    }
    // Process completed and read all stdout and stderr here
}).start();
My issue is with the fact that I am using 3 threads to achieve this asynchronous "run-and-get-output" task; using 3 threads for it just doesn't feel right. I could allocate the threads out of a thread pool, but that would still be blocking those threads.
Is there anything I can do, maybe with NIO, to reduce this to fewer (1?) threads? Anything I can think of will be constantly spinning a thread (unless I add a few sleeps), which I don't really want to do either...
NOTE: I do need to read as I go (rather than when the process has stopped), and I do need to separate stdout from stderr, so I can't do a redirect.
Since you've specified that you need to read the output as you go, there is no non-multi-threaded solution.
You can reduce the number of threads to one beyond your main thread though:
Process process = builder.start();
Thread errThread = new Thread(() -> {
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
        // Read stream here
    } catch (Exception e) {
    }
});
errThread.start();

try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
    // Read stream here
} catch (Exception e) {
}
// We got an end of file, so there can't be any more input.
// Now we need to wait for stderr/process exit.
int exitCode = -1;
try {
    exitCode = process.waitFor();
    errThread.join();
} catch (Exception e) {
}
// Process completed
If you truly don't need to deal with the error/output until after the process ends, you can simplify it a bit and only use your main thread, like this:
File stderrFile = File.createTempFile("tmpErr", "out");
File stdoutFile = File.createTempFile("tmpStd", "out");
try {
    ProcessBuilder builder = new ProcessBuilder("ls", "/tmp");
    builder.redirectOutput(stdoutFile); // without these redirects the
    builder.redirectError(stderrFile);  // temp files would stay empty
    Process p = builder.start();
    int exitCode = -1;
    boolean done = false;
    while (!done) {
        try {
            exitCode = p.waitFor();
            done = true;
        } catch (InterruptedException ie) {
            System.out.println("Interrupted waiting for process to exit.");
        }
    }
    BufferedReader err = new BufferedReader(new FileReader(stderrFile));
    BufferedReader in = new BufferedReader(new FileReader(stdoutFile));
    ....
} finally {
    stderrFile.delete();
    stdoutFile.delete();
}
This is probably not a good idea if the process you are calling generates a lot of output, as it could run out of disk space... but it'll likely be slightly faster since it doesn't have to spin up another thread.
Assuming you don't mind the output and error streams being merged, you could use only one thread:
builder.redirectErrorStream(true); // merge output and error streams
Process process = builder.start();
Thread singleThread = new Thread(() -> {
    int exitCode = -1;
    // read from the merged stream
    try (BufferedReader reader =
            new BufferedReader(new InputStreamReader(process.getInputStream()))) {
        String line;
        // read until the stream is exhausted, meaning the process has terminated
        while ((line = reader.readLine()) != null) {
            System.out.println(line); // use the output here
        }
        // get the exit code if required
        exitCode = process.waitFor();
    } catch (Exception e) {
    }
});
singleThread.start();
Have a look at the ExecHelper from OstermillerUtils.
The idea is that the thread waiting for the process to complete does not just wait, but reads input from stdout and stderr if input is available, and regularly checks whether the process has finished.
If you do not do any heavy processing with the input from stdout and stderr, you might not need an extra thread to handle the input. Just copy ExecHelper and add some extra functions/methods to process any new input. I've done this before to show the process output while the process is running; it is not difficult to do (but I lost the source code).
If you do need a separate thread for processing the input, make sure to synchronize the output and error StringBuffers when these buffers are updated or read.
Another thing you might want to consider is adding an abort timeout. It is a little bit harder to implement but was very valuable to me: if a process takes too much time, the process gets destroyed, which in turn ensures nothing remains hanging. You can find an old (outdated?) example in this gist.
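A rough, minimal sketch of that polling idea (not ExecHelper itself; just InputStream.available() plus Process.isAlive(), with "ls /tmp" standing in for the real command):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class PollingRunner {
    public static void main(String[] args) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("ls", "/tmp").start();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ByteArrayOutputStream err = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        boolean finished = false;
        while (!finished) {
            finished = !p.isAlive();      // check exit before the final drain
            drain(p.getInputStream(), out, buf);
            drain(p.getErrorStream(), err, buf);
            if (!finished) Thread.sleep(20); // avoid a busy spin
        }
        System.out.println("exit=" + p.exitValue());
        System.out.println(out.toString());
    }

    // Copy whatever is available right now without blocking.
    private static void drain(InputStream in, ByteArrayOutputStream sink, byte[] buf)
            throws IOException {
        while (in.available() > 0) {
            int n = in.read(buf, 0, Math.min(buf.length, in.available()));
            if (n < 0) break;
            sink.write(buf, 0, n);
        }
    }
}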
You'll have to compromise. Here are your options:
A. You can do it with 2 threads (instead of 3), as in the earlier answer's code:
The first thread:
reads from stdout until readLine returns null
calls Process.waitFor()
joins thread #2
The second thread:
reads from stderr until readLine returns null
B. Merge the streams and use Debian's annotate-output to discriminate the two streams:
http://manpages.debian.org/cgi-bin/man.cgi?query=annotate-output&sektion=1
C. If it's a short-lived process, just wait for it to end.
D. If it's a long-lived process, you can spin between the readers with some sleep in between.

Running a Perl script in Java using ProcessBuilder

I am using ProcessBuilder in Java to run a Perl script. When I run the Perl script while printing the InputStream of the process, the Java program runs for the duration of the Perl script. However, if I comment out the getOutput call in main, the Java program terminates very quickly and the Perl script does not run at all. Why does this happen?
private final static String SCENARIO = "scen";

/**
 * @param args
 */
public static void main(String[] args) {
    ProcessBuilder pb = new ProcessBuilder("perl", SCENARIO + ".pl");
    pb.directory(new File("t:/usr/aman/" + SCENARIO));
    try {
        Process p = pb.start();
        getOutput(p.getInputStream(), true);
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}

private static List<String> getOutput(InputStream is, boolean print) {
    List<String> output = new ArrayList<String>();
    BufferedReader reader = new BufferedReader(new InputStreamReader(is));
    String s = null;
    try {
        while ((s = reader.readLine()) != null) {
            output.add(s);
            if (print) {
                System.out.println(s);
            }
        }
        is.close();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        // e.printStackTrace();
        return null;
    }
    return output;
}
Likely the OS's output-stream buffer for your Perl script's process gets filled because nothing is emptying it, and this blocks the process. You need to gobble the output stream for this reason, which is what your getOutput method does for you.
Please read the classic reference on this problem: When Runtime.exec() won't. Per this article:
Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.

Calling a shell script from Java hangs

So I'm trying to execute a shell script which produces a lot of output (hundreds of MBs) from a Java file. This hangs the process, and it never completes.
However, if within the shell script I redirect the output of the script to some log file or to /dev/null, the Java file executes and completes in a jiffy.
Is it because of the amount of data that the Java program never completes?
If so, is there any documentation as such, or is there any documented limit on the amount of data?
Here's how you can simulate this scenario.
The Java file will look like:
import java.io.InputStream;

public class LotOfOutput {
    public static void main(String[] args) {
        String cmd = "sh a-script-which-outputs-huuggee-data.sh";
        try {
            ProcessBuilder pb = new ProcessBuilder("bash", "-c", cmd);
            pb.redirectErrorStream(true);
            Process shell = pb.start();
            InputStream shellIn = shell.getInputStream();
            int shellExitStatus = shell.waitFor();
            System.out.println(shellExitStatus);
            shellIn.close();
        } catch (Exception ignoreMe) {
        }
    }
}
The script 'a-script-which-outputs-huuggee-data.sh' may look like:
#!/bin/sh
# Toggle the line below
exec 3>&1 > /dev/null 2>&1
count=1
while [ $count -le 1000 ]
do
cat some-big-file
((count++))
done
echo
echo Yes I m done
Free beer for the right answer. :)
It's because you're not reading from the Process's output.
As per the class's Javadocs, if you don't do this then you may end up with a deadlock: the process fills its IO buffer and waits for the "shell" (or listening process) to read from it and empty it. Meanwhile your process, which should be doing this, is blocked waiting for the process to exit.
You'll want to call getInputStream() and read from that reliably (perhaps from another thread) to stop the process from blocking.
Also take a look at Five Java Process Pitfalls and When Runtime.exec() Won't - both informative articles about common problems with Process.
You're never reading the input stream, so it's probably blocking because the input buffer is full.
The input/output buffers have a limited size (depending on the operating system). If I remember correctly, this wasn't big, on Windows XP at least. Try creating a thread that reads the InputStream as fast as possible.
Something along these lines:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

// Note: the original snippet implemented an undefined 'Worker' interface and
// referenced an undeclared 'buffer' field; Runnable and a List<String> are
// assumed here to make it compile.
class StdInWorker implements Runnable {
    private final BufferedReader br;
    private final List<String> buffer = new ArrayList<String>();
    private boolean run = true;
    private int linesRead = 0;

    StdInWorker(Process prcs) {
        this.br = new BufferedReader(new InputStreamReader(prcs.getInputStream()));
    }

    public synchronized void run() {
        String in;
        try {
            while (this.run) {
                while ((in = this.br.readLine()) != null) {
                    this.buffer.add(in);
                    linesRead++;
                }
                Thread.sleep(50);
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        } catch (InterruptedException ie) {
        }
    }
}
