I'm trying to run a Perl script from Java code and read its output with the following code:
String cmd = "/var/tmp/./myscript";
Process process = Runtime.getRuntime().exec(cmd);
BufferedReader stdin = new BufferedReader(new InputStreamReader(process.getInputStream()));
String line;
while((line = stdin.readLine()) != null) {
System.out.println(line);
}
But the code always hangs on the readLine().
I tried using
stdin.read();
instead, but that also hangs.
I also tried modifying the cmd to
cmd = "perl /var/tmp/myscript";
and to
String[] cmd = {"perl", "/var/tmp/myscript"};
but those hang as well.
I tried reading the stdin in a separate thread, and reading both stdin and stderr in separate threads. Still no luck.
I know there are many questions here dealing with Process.waitFor() hanging because the streams aren't being read, as well as with BufferedReader.read() hanging. I tried all the suggested solutions, still no luck.
Of course, running the same script on the CLI itself writes output to standard output (the console) and exits with exit code 0.
I'm running on CentOS 6.6.
Any help will be appreciated.
I presume that when run directly from the command line, the script runs to completion, producing the expected output, and terminates cleanly. If not, then fix your script first.
The readLine() invocation hanging almost surely means that neither a line terminator nor end-of-file is encountered. In other words, the method is blocked waiting for the script. Perhaps the script produces no output at all under these conditions, but does not terminate. This might happen, for instance, if it expects to read data from its own standard input before it proceeds. It might also happen if it is blocked on output to its stderr.
In the general case, you must read both a Process's stdout and its stderr, in parallel, via the InputStreams provided by getInputStream() and getErrorStream(). You should also handle the OutputStream provided by getOutputStream() by either feeding it the needed standard input data (also in parallel with the reading) or by closing it. You can substitute closing the process's streams for reading them if the particular process you are running does not emit data to those streams, and you normally should close the Process's OutputStream when you have no more data for it. You need to read the two InputStreams even if you don't care about what you read from them, as the process may block or fail to terminate if you do not. This is tricky to get right, but easier to do for specific cases than it is to write generalized support for. And anyway, there's ProcessBuilder, which goes some way toward an easier general-purpose interface.
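A minimal sketch of that pattern for the script in the question might look like this (not a drop-in fix, just an illustration: no stdin to feed, stderr drained on a background thread, stdout read on the current thread):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class RunMyScript {
    public static void main(String[] args) throws Exception {
        final Process process = new ProcessBuilder("/var/tmp/myscript").start();

        // Nothing to feed the script, so close its stdin right away.
        process.getOutputStream().close();

        // Drain stderr on its own thread so the script can never block writing to it.
        Thread stderrDrainer = new Thread(new Runnable() {
            public void run() {
                try {
                    BufferedReader err = new BufferedReader(
                            new InputStreamReader(process.getErrorStream()));
                    String line;
                    while ((line = err.readLine()) != null) {
                        System.err.println(line);
                    }
                } catch (IOException ignored) {
                }
            }
        });
        stderrDrainer.start();

        // Read stdout on the current thread.
        BufferedReader stdout = new BufferedReader(
                new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = stdout.readLine()) != null) {
            System.out.println(line);
        }

        stderrDrainer.join();
        System.out.println("exit code: " + process.waitFor());
    }
}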
Try using ProcessBuilder like so:
String cmd = "/var/tmp/./myscript";
ProcessBuilder perlProcessBuilder = new ProcessBuilder(cmd);
perlProcessBuilder.redirectOutput(ProcessBuilder.Redirect.PIPE);
Process process = perlProcessBuilder.start();
BufferedReader stdin = new BufferedReader(new InputStreamReader(process.getInputStream()));
String line;
while((line = stdin.readLine()) != null) {
System.out.println(line);
}
From the ProcessBuilder javadoc:
public ProcessBuilder redirectOutput(ProcessBuilder.Redirect destination)
Sets this process builder's standard output destination. Subprocesses subsequently started by this object's start() method send their standard output to this destination.
If the destination is Redirect.PIPE (the initial value), then the standard output of a subprocess can be read using the input stream returned by Process.getInputStream(). If the destination is set to any other value, then Process.getInputStream() will return a null input stream.
Parameters:
destination - the new standard output destination
Returns:
this process builder
Throws:
IllegalArgumentException - if the redirect does not correspond to a valid destination of data, that is, has type READ
Since:
1.7
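If you don't actually need to consume the script's output in Java, the same API lets you point standard output somewhere else entirely, so there is no pipe to read and nothing to hang on. A minimal sketch, with a made-up log file path (requires java.io.File):

ProcessBuilder pb = new ProcessBuilder("/var/tmp/myscript");
pb.redirectErrorStream(true);   // fold stderr into stdout so one redirect covers both
pb.redirectOutput(ProcessBuilder.Redirect.to(new File("/var/tmp/myscript.log"))); // made-up path
Process p = pb.start();
p.getOutputStream().close();    // nothing to send on stdin
int exitCode = p.waitFor();     // safe to wait: no pipe is left to fill up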
try {
ProcessBuilder pb = new ProcessBuilder("C:\\Users\\--------\\PycharmProjects\\--------\\venv\\Scripts\\Python.exe", "---------.py");
Process p = pb.start();
System.out.println(p.getOutputStream());
}
catch(Exception e){
System.out.println("Exception: " + e);
}
">" So I am working on a program that grabs information from Spotify's API. I have a script in python that feeds a java program the data I need. Unfortunately, I am having trouble getting eclipse to run the .py script by itself. I am using ProcessBuilder and for some reason there are no errors but yet the program isn't executing the python script. I am new to integrating multiple languages in a project so any help is appreciated! I have done hours of research trying to get this figured out. I know that there are similar posts on here regarding the same topic but none of the answers seemed to work for me. Thanks!"<"
It is running the script, you just aren't getting the output, because you did two things wrong. First, see the javadoc for Process.getOutputStream:
Returns the output stream connected to the normal input of the process. Output to the stream is piped into the standard input of the process represented by this Process object.
That's not what you want. To get the output from the process, use Process.getInputStream:
Returns the input stream connected to the normal output of the process. The stream obtains data piped from the standard output of the process represented by this Process object. [plus stderr if merged]
Second, System.out.println(stream) (for an input stream) doesn't print the data that can be received on the stream, it prints only the stream object (as internal classname, atsign, hashcode). To display the data from the python process (i.e. the script) you must read it from the stream and then output the data that was read. There are examples of this everywhere; I can't imagine how you could spend hours without finding at least a hundred. Try for example:
read the output from java exec
Reading InputStream from Java Process
java Process, getInputStream, read newest line only
Cannot get the getInputStream from Runtime.getRunTime.exec()
Printing a Java InputStream from a Process
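Putting that together for the code above, a minimal sketch might look like this (the redacted paths are kept from the question):

ProcessBuilder pb = new ProcessBuilder(
        "C:\\Users\\--------\\PycharmProjects\\--------\\venv\\Scripts\\Python.exe",
        "---------.py");
pb.redirectErrorStream(true);          // fold stderr into the same stream
Process p = pb.start();
BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line;
while ((line = reader.readLine()) != null) {
    System.out.println(line);          // print the data read, not the stream object
}
int exitCode = p.waitFor();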
I want to execute multiple commands through a Java Process, but I don't want to spawn a new process for every command. So I made an object called Shell that holds the Process's InputStream and OutputStream.
The problem is that if I don't terminate a process by appending
"exit\n"
I can't tell where the end of the InputStream is; once I've read the whole output, the next read just blocks, so I need to know when to stop reading.
Is there some kind of a standard symbol at the end of the output?
Because what I came up with is
final String outputTerminationSignal = checksum(command);
command += ";echo \"" + outputTerminationSignal + "\";echo $?\n"
This way when I get the outputTerminationSignal line I can get the exit code and stop reading.
final String line = bufferedReader.readLine();
if (line != null && line.equals(outputTerminationSignal)) {
final String exitCode = bufferedReader.readLine();
}
Of course this is exploitable and error-prone, because the real output may in some cases match my generated outputTerminationSignal and the app will stop reading when it shouldn't.
I wonder if there is some standard, so-called "output termination signal" coming from the output that I am not aware of.
Unix doesn't use a special character or symbol to indicate the end of a stream. In Java, when a stream reaches end-of-file, a read simply reports it: InputStream.read() returns -1 and BufferedReader.readLine() returns null.
Having said that, if you're reading from a stream connected to a running program, you won't see end-of-file just because the other program is idle. You would only see end-of-file if the other program has exited, or if it explicitly closes its output stream (the one you are reading from). The situation you describe sounds like the shell is just idle, waiting for another command. You won't get an EOF indication from the stream in this case.
You could try getting the shell to print a command prompt when it's waiting for a command, then look for the command prompt as an "end of command" indicator. Shells normally print command prompts only when they're interactive, but you might be able to find a way around that.
If you want to make the shell process exit without sending it the "exit" command, you could try closing the stream that you're using to write to the shell process. The shell should see that as an end-of-file and exit.
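A minimal sketch of that close-the-stream approach, assuming process is the spawned shell Process and the ls command is just a stand-in:

// Send whatever commands you still have, then close stdin to signal end-of-input.
OutputStream shellStdin = process.getOutputStream();
shellStdin.write("ls /var/tmp\n".getBytes());   // stand-in command
shellStdin.flush();
shellStdin.close();   // the shell sees EOF on its stdin and exits by itself

// Because the shell exits, readLine() eventually returns null instead of blocking.
BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
String line;
while ((line = reader.readLine()) != null) {
    System.out.println(line);
}
int exitCode = process.waitFor();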
You could ask the shell for the PID of the spawned child, and monitor its state
So I have a Java program that calls a C program through ProcessBuilder, and I need the C program to inform the Java program when something happens. I have the following code for the Java program:
String cmd[] = // command to run the C program in the terminal, no problems here
ProcessBuilder builder = new ProcessBuilder(cmd);
builder.redirectErrorStream(true);
Process process = builder.start();
BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(process.getInputStream()));
System.out.println(bufferedReader.ready());
System.out.println(bufferedReader.readLine());
The C program at a given point will have to inform the Java program of something. I have tried many things, like
char Buff[] = "output";
write(0, Buff, strlen(Buff)+1);
write(1, Buff, strlen(Buff)+1);
printf("output\n");
But I can't get the Java program to read this; the only output I get is
false
null
The Java program won't see the output until buffers are flushed.
Buffering at the level of write is OS dependent and even within the same OS, different kinds of streams may have different default buffering modes. In Linux the documents imply a write to a pipe will immediately be readable by the other process, and ProcessBuilder uses a pipe at least in Android.
It's likely that if you use stdio.h, fflush will portably push the data all the way out to the socket or pipe. For example, I have had success with fflush for this purpose on Android using ProcessBuilder.
Line buffering is another possible default (chosen by the C library rather than the OS); in that case appending \n to your messages may be enough on its own.
By the way, mixing write and printf calls on the same file descriptor in the same program is asking for trouble. And as has been mentioned, write(0, ...) is an attempt to write to stdin, and strlen(Buff)+1 causes a trailing zero byte to be sent to the Java program, which is unlikely to be what you want.
I have the following Java code that starts a ProcessBuilder, opens the process's OutputStream, writes a string to that OutputStream, and then closes it. The whole thing hangs indefinitely when I try to close the OutputStream. This only happens on Windows, never on Mac or Linux.
Some of the related questions seem to be close to the same problem I'm having, but I haven't been able to figure out how to apply the answers to my problem, as I am a relative newbie with Java. Here is the code. You can see I have put in a lot of println statements to try to isolate the problem.
System.out.println("GenMic trying to get the input file now");
System.out.flush();
OutputStream out = child.getOutputStream();
try {
System.out.println("GenMic getting ready to write the input file to out");
System.out.flush();
out.write(intext.getBytes()); // intext is a string previously created
System.out.println("GenMic finished writing to out");
System.out.flush();
out.close();
System.out.println("GenMic closed OutputStream");
System.out.flush();
} catch (IOException iox) {
System.out.println("GenMic caught IOException 2");
System.out.flush();
String detailedMessage = iox.getMessage();
System.out.println("Exception: " + detailedMessage);
System.out.flush();
throw new RuntimeException(iox);
}
And here is the output when this chunk is executed:
GenMic trying to get the input file now
GenMic getting ready to write the input file to out
GenMic finished writing to out
You need to make sure that the streams returned by getInputStream() and getOutputStream() are drained on individual threads and these threads are different from the one on which you close the stream returned by getOutputStream().
Basically it is a requirement to have at least 3 threads per sub-process if you want to manipulate and examine its stdin, stdout and stderr. One of the threads, depending on your circumstances, may be your current execution thread ( the one on which you create ProcessBuilder ).
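For illustration, a sketch of that arrangement using the variables from the question (child is the Process and intext the string to send; child has to be final, or you need Java 8+, for the anonymous classes to reference it):

// Start the drain threads before writing to the child's stdin.
Thread outDrainer = new Thread(new Runnable() {
    public void run() {
        try {
            InputStream in = child.getInputStream();
            byte[] buf = new byte[4096];
            while (in.read(buf) != -1) { /* discard, or collect if needed */ }
        } catch (IOException ignored) {}
    }
});
Thread errDrainer = new Thread(new Runnable() {
    public void run() {
        try {
            InputStream in = child.getErrorStream();
            byte[] buf = new byte[4096];
            while (in.read(buf) != -1) { /* discard, or collect if needed */ }
        } catch (IOException ignored) {}
    }
});
outDrainer.start();
errDrainer.start();

OutputStream out = child.getOutputStream();
out.write(intext.getBytes());
out.close();   // no longer hangs: both pipes are being emptied concurrently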
When that happened to me it was because I hadn't read everything from the stream being written to by the process.
The API docs for the java.lang.Process class say:
The created subprocess does not have its own terminal or console. All its standard io (i.e. stdin, stdout, stderr) operations will be redirected to the parent process through three streams (getOutputStream(), getInputStream(), getErrorStream()). The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.
I would try calling getInputStream() on the Process instance and writing a loop to read one byte at a time until it reaches EOF. And I'd do the same thing with getErrorStream() just in case the process is writing to stderr.
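A minimal sketch of such a drain loop (process being the Process instance):

// Read until end-of-file so the child can't fill the pipe and block.
InputStream in = process.getInputStream();
int b;
while ((b = in.read()) != -1) {
    // ignore (or buffer) the byte
}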
Do you have anything reading from the stdout/stderr of the process?
It's quite likely the process tries to output something but gets blocked because no one is reading the output. That means your out.flush() or out.close() blocks, since the process can't get around to consuming its input while it's blocked doing output.
Is there a thread-safe way to concurrently consume the stdout from an external process, using ProcessBuilder in Java 1.6?
Background: I need to invoke pbzip2 to unzip large files to stdout and to process each line as the file is decompressed (pbzip2 utilizes multiple CPUs, unlike other implementations).
The logical approach is to create a child thread to loop over the InputStream (i.e. stdout; don't you just love the naming?), as follows:
while((line = reader.readLine()) != null)
{
// do stuff
}
However, unzipping is slow, so what I really need is for the reader.readLine method to quietly wait for the next line(s) to become available, instead of exiting.
Is there a good way to do this?
You should be able to wrap your input stream with an InputStreamReader and BufferedReader. You can then call readLine() and that will block as required.
Note that you should have a corresponding reader for the stderr. You don't have to do anything with it, but you will need to consume the stderr stream, otherwise your spawned process may well block. See this answer for links etc.
You more or less have the solution yourself. You just create a new thread which reads the next line in a loop from the stream of your external process and processes that line.
readLine() will block and wait until an entire new line is available. If you're on a multicore/multiprocessor machine, your external process can happily continue unzipping while your thread processes a line. At least, unzipping can continue until the OS pipes/buffers become full.
Just note that if your processing is slower than unzipping, you'll block the unzipping, and at this point it becomes a memory vs. speed issue. For example, you could create one thread that does nothing but read lines (so unzipping will not block) and buffer them up in an in-memory queue, and another thread, or even several, that consumes that queue.
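A rough sketch of that reader-thread-plus-queue arrangement (assuming the usual java.io and java.util.concurrent imports and a Process named process that has already been started; the names and queue bound are illustrative):

final BlockingQueue<String> lines = new LinkedBlockingQueue<String>(10000); // bounds memory use

Thread producer = new Thread(new Runnable() {
    public void run() {
        try {
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                lines.put(line);   // blocks only when the queue is full, throttling the reader
            }
        } catch (IOException e) {
            // log and fall through; the consumer will notice the thread has died
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
});
producer.start();

// Consumer side (this thread, or several worker threads):
while (true) {
    String line;
    try {
        line = lines.poll(100, TimeUnit.MILLISECONDS);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        break;
    }
    if (line != null) {
        // do stuff with 'line'
    } else if (!producer.isAlive() && lines.isEmpty()) {
        break;   // producer finished and queue fully drained
    }
}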
readLine method to quietly wait for the next line(s) to become available, instead of exiting
And that's exactly what readLine does: it will just block until a whole line is available.
Yes.
I have written some code that kicks off a time-consuming job (ffmpeg) in a Process (spawned by a ProcessBuilder), which in turn kicks off my OutputStreamReader class, an extension of Thread that consumes the output and does some magic with it.
The catch (for me) was redirecting the error stream. Here is my code snippet:
pb.redirectErrorStream(true);
proc = pb.start();
err = new MyOutputStreamReader(this, proc.getInputStream()); // extension of Thread
err.start();
int exitCode = proc.waitFor();