Runtime.exec stops unexpectedly - Java

I have a little executable program in C that produces a lot of output to a file.
When I call this program with Runtime, like this:
Runtime r = Runtime.getRuntime();
Process p = null;
p = r.exec("./my_program -in input.file -out output.file", null, new File(System.getProperty("java.io.tmpdir")));
When the program produces little output everything is OK, but when I call *my_program* with a large input it writes a large quantity of output to output.file, and in that case my Java program freezes and nothing happens...
I have tested *my_program* in a terminal with many large inputs and everything is OK, but when I call the program from Java with Runtime.exec, the Java program freezes.
--
Thanks in advance

Make sure you're reading from the Process's .getInputStream() and .getErrorStream() if you aren't already. Looking at your code snippet, it appears that you're just executing .exec(...) (and perhaps waiting for it to complete with a call, not shown, to .waitFor()?).
Per http://download.oracle.com/javase/6/docs/api/java/lang/Process.html (emphasis added):
The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, *failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock*.

Running .py file in Java Eclipse

try {
    ProcessBuilder pb = new ProcessBuilder("C:\\Users\\--------\\PycharmProjects\\--------\\venv\\Scripts\\Python.exe", "---------.py");
    Process p = pb.start();
    System.out.println(p.getOutputStream());
}
catch(Exception e){
    System.out.println("Exception: " + e);
}
">" So I am working on a program that grabs information from Spotify's API. I have a script in python that feeds a java program the data I need. Unfortunately, I am having trouble getting eclipse to run the .py script by itself. I am using ProcessBuilder and for some reason there are no errors but yet the program isn't executing the python script. I am new to integrating multiple languages in a project so any help is appreciated! I have done hours of research trying to get this figured out. I know that there are similar posts on here regarding the same topic but none of the answers seemed to work for me. Thanks!"<"
It is running the script; you just aren't getting the output, because you did two things wrong. First, see the javadoc for Process.getOutputStream:
Returns the output stream connected to the normal input of the process. Output to the stream is piped into the standard input of the process represented by this Process object.
That's not what you want. To get the output from the process, *use* Process.getInputStream:
Returns the input stream connected to the normal output of the process. The stream obtains data piped from the standard output of the process represented by this Process object. [plus stderr if merged]
Second, System.out.println(stream) (for an input stream) doesn't print the data that can be received on the stream; it prints only the stream object (as internal classname, at-sign, hashcode). To display the data from the Python process (i.e. the script) you must read it from the stream and then output the data that was read. There are examples of this everywhere; I can't imagine how you could spend hours without finding at least a hundred. Try for example (a minimal sketch also follows these links):
read the output from java exec
Reading InputStream from Java Process
java Process, getInputStream, read newest line only
Cannot get the getInputStream from Runtime.getRunTime.exec()
Printing a Java InputStream from a Process
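Putting both fixes together, a minimal sketch of the corrected snippet (same placeholder paths as in the question; java.io.BufferedReader and java.io.InputStreamReader imports assumed, and redirectErrorStream is an addition so Python tracebacks show up too):

try {
    ProcessBuilder pb = new ProcessBuilder("C:\\Users\\--------\\PycharmProjects\\--------\\venv\\Scripts\\Python.exe", "---------.py");
    pb.redirectErrorStream(true); // fold stderr into stdout so tracebacks are visible
    Process p = pb.start();
    // Read from getInputStream() -- the process's stdout -- not getOutputStream()
    BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line); // print the data, not the stream object
    }
    p.waitFor();
}
catch (Exception e) {
    System.out.println("Exception: " + e);
}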

Strange execution patterns with subprocess.Popen

I have a Python script wherein a JAR is called. After the JAR is called, two shell scripts are called. Initially I was doing this:
proc = subprocess.Popen(jar_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
proc.wait()
output, errors = proc.communicate()
proc = subprocess.Popen(prune_command, shell=True)
proc.wait()
proc = subprocess.call(push_command, shell=True)
I have to wait for the first two processes to finish, so I use Popen(), and the final one I can let run in the background, so I call() it. I pass shell=True because I want the called shell scripts to have access to environment variables.
The above works, however, I don't get any logging from the JAR process. I've tried calling it this way:
proc = subprocess.call(jar_command)
This logs as I would expect, but the two shell scripts that follow are not executed. Initially I thought the logs just weren't going to stdout, but it turns out they're not being executed at all, i.e. not removing superfluous files or pushing to a database.
Why are the followup shell scripts being ignored?
If you are certain your shell scripts are not running at all, and everything works with the first code, then the java command must be deadlocking or not terminating correctly when run via the call() function.
You can validate that by adding a dummy file creation in your bash scripts. Put it in the first line of the script, so if it is executed you'll get the dummy file created. If it's not created, that means the scripts weren't executed, probably due to something with the java execution.
I would try a couple of things:
First, I would use Popen instead of call. Instead of using wait(), use communicate():
Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for process to terminate.
communicate() returns a tuple (stdoutdata, stderrdata).
proc = subprocess.Popen(jar_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
proc.communicate()
Make sure to check both streams for data (stdout and stderr). You might miss an error the java process raises.
Next I would try disabling buffering by passing bufsize=0 to Popen. That rules out Python-level buffering as the cause.
If both options still don't work, try to see if there is an exception by using check_call():
proc = subprocess.check_call(jar_command)
Run command with arguments. Wait for command to complete. If the return code was zero then return, otherwise raise CalledProcessError.
These options might hold the answer; if not, they will help the debugging process. Feel free to comment on how this progresses.
Most likely, you are forgetting that the process's streams are in fact OS-level buffers with some finite capacity.
For example, if you run a process that produces a lot of output in PIPE mode, and you wait for it to finish before trying to consume whatever that process wrote to output, you have a deadlock:
The process has filled up the output buffer and is now blocked on writing more data to its output. Until somebody empties the buffer by reading from the pipe, the process cannot continue.
Your program is waiting for the subprocess to finish before you read the data from its buffer.
The correct way is to start a thread in your program that will "drain" the pipe constantly while the process is running and your main thread is waiting. You must first start the process, then start the drain threads, then wait for the process to finish.
For differential diagnosis, check whether the subprocess will run fine with little output (i.e. as long as the buffer does not fill up, such as a line or two).
The documentation for subprocess has a note about this.

How can I detect if a subprocess is hung because its output buffer is full?

Consider the following Bash script and Java program:
$ cat kb.sh
#!/bin/bash
# Prints $1 KB - fold adds a \n
tr '\0' '=' < /dev/zero | fold -w 1023 | head -n ${1:-10}
$ cat Demo.java
import java.util.concurrent.TimeUnit;
class Demo {
    public static void main(String[] args) throws Exception {
        Process p = Runtime.getRuntime()
            .exec("/tmp/kb.sh " + (args.length > 0 ? args[0] : ""));
        if (p.waitFor(10, TimeUnit.SECONDS)) {
            System.out.println("Process terminated");
        } else {
            System.err.println("Process did not terminate");
            p.destroy();
            System.exit(1);
        }
    }
}
The Demo class starts kb.sh as a subprocess and expects it to terminate quickly. kb.sh, for its part, outputs (presumably quickly) some number of KB of data. We can verify that it runs quickly in practice:
$ time /tmp/kb.sh 10000 | wc
10000 10000 10240000
real 0m0.398s
user 0m0.178s
sys 0m0.030s
When we run the Demo class however we see different behavior:
$ java -cp . Demo 64
Process terminated
$ java -cp . Demo 65
Process did not terminate
If we attempt to print ~65KB it hangs. I know why - Process is buffering the subprocess' output and when its buffer gets full the subprocess blocks until some data is read out of the buffer via Process.getInputStream(). If you added a call to ByteStreams.exhaust(p.getInputStream()); before p.waitFor() the process would always terminate successfully.
My question is, is there any way in Java to detect when a subprocess is being blocked like this? I fear the answer may be "not without reflection", since I don't see any such mechanism in any relevant APIs, but I could be missing something.
To forestall the inevitable "Why do you want to do this?", I'm writing a diagnostic utility to detect this in existing Process instances as it's an ongoing (and nefarious) source of bugs. I don't want to manipulate the Process or do anything destructive, I simply want to detect when the process has been stalled due to a full buffer so I can alert the caller.
NB: OS-dependent solutions, such as inspecting the output of ps, would be acceptable, but obviously aren't as ideal as a Java-only solution.
Short answer: there is no way to detect whether a full buffer is the cause of a hung subprocess.
Longer answer: The Java I/O stream APIs do not provide any way to determine the state of a buffered stream. You cannot determine whether a buffer is full. Worse, you can't even know how much space is available in the buffer, so it's difficult, if not impossible, to determine whether the next write() operation will block. And, of course, once it's blocked, it doesn't respond to anything.
Short of requiring the child process to produce or respond to "heartbeat" pings to prove it's alive and not hung -- and once it's hung there's no way to know why -- there's not much you can do, proactively or reactively, about full buffered streams other than reading them.
You don't have to detect it. You have to consume all of its output, from both standard output and standard error. If you have code that doesn't do that, fix it.
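For the Demo class above, that means draining stdout before waitFor(). A minimal sketch using plain JDK calls instead of Guava's ByteStreams.exhaust (InputStream.transferTo needs Java 9+, OutputStream.nullOutputStream needs Java 11+); this replaces the body of main:

Process p = Runtime.getRuntime()
    .exec("/tmp/kb.sh " + (args.length > 0 ? args[0] : ""));
// Drain stdout before waiting; the pipe buffer can then never fill up
// and block the child, no matter how much it writes.
p.getInputStream().transferTo(java.io.OutputStream.nullOutputStream());
if (p.waitFor(10, TimeUnit.SECONDS)) {
    System.out.println("Process terminated");
}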

Read stdout of c program in java

So I have a Java program that calls a C program through ProcessBuilder, and I need the C program to inform the Java program when something happens. I have the following code for the Java program:
String cmd[] = //string to run the c program in the terminal, no probs here
ProcessBuilder builder = new ProcessBuilder(cmd);
builder.redirectErrorStream(true);
Process process = builder.start();
BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(process.getInputStream()));
System.out.println(bufferedReader.ready());
System.out.println(bufferedReader.readLine());
The C program at a given point will have to inform the Java program of something. I have tried many things, like:
char Buff[] = "output";
write(0, Buff, strlen(Buff)+1);
write(1, Buff, strlen(Buff)+1);
printf("output\n");
But I can't get the Java program to read this; the only output I get is
false
null
The Java program won't see the output until buffers are flushed.
Buffering at the level of write is OS-dependent, and even within the same OS different kinds of streams may have different default buffering modes. On Linux the documentation implies that a write to a pipe is immediately readable by the other process, and ProcessBuilder uses a pipe, at least on Android.
It's likely that if you use stdio.h, fflush will portably push data all the way out to the socket or pipe. E.g. I have had success with fflush for this purpose on Android using ProcessBuilder.
Line buffering is another possible choice for the OS. In this case appending \n to your messages may have an effect.
By the way, mixing write and printf calls in the same program is asking for trouble. And as has been mentioned, write(0, ...) is an attempt to write to stdin, and strlen(Buff)+1 causes a final zero byte to be sent to the Java program, which is unlikely to be what you want.
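On the Java side, a read loop is more robust than the single ready()/readLine() pair in the question, since ready() is checked before the child has had a chance to produce anything. A minimal sketch, assuming the C side flushes after writing (e.g. printf("output\n"); fflush(stdout);) and java.io imports:

ProcessBuilder builder = new ProcessBuilder(cmd);
builder.redirectErrorStream(true);
Process process = builder.start();
try (BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
    String line;
    // readLine() blocks until a full line arrives or the child exits (null = EOF)
    while ((line = bufferedReader.readLine()) != null) {
        System.out.println(line);
    }
}
process.waitFor();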

Java output from process builder overwritten when using BufferedReader

I'm trying to run an external program in Java and read its output. The program is a Linux application in C++ that runs a data mining algorithm and prints the patterns found to standard output. I want to be able to read that output from my Java app and show the patterns in a table. The problem is that the size of the output is quite big (as a test it produces 6.5 MB in about 30 seconds). I'm using ProcessBuilder and reading the output through an InputStreamReader wrapped in a BufferedReader, as you can see in the following code:
String[] cmd = {"./clogen_periodic", selected, support, "-t 4"};
Process p = new ProcessBuilder(cmd).start();
input = new BufferedReader(new InputStreamReader(p.getInputStream()));
while ((line = input.readLine()) != null) {
    ...
    process line;
    ...
}
The problem is that the output gets corrupted. When I execute the same program in a console the output is correct, but when I use the Java app some lines are merged. More precisely, the output should be like this:
TMEmulation log_pseduo_allocation (34985) (2 45 76 89 90)
__divw clock timer (8273) (4 6 67 4 2)
but instead it is like this:
TMEmulation log_pseduo_allocation (34985) (2__divw 45clock 76timer (89 8273) 904) (6 67 4 2)
Any idea about the possible problem?
Thanks a lot in advance,
Patricia
A few possibilities, all to do with the called program:
1) As #Artefacto says, the C++ program's output might not be fully buffered, so call setvbuf to make it consistent. I.e. the first output is partially buffered and the second is not, so the first flushes after the end of the second. In general, buffering can differ between being run from the command line and being run from another process.
2) The program is multi-threaded and the output behaves differently when called from Java, so the output timing differs.
Basically you need to look at the code of the called program and force logging/output to all go through the same call.
Try calling setvbuf with the option _IOLBF in the C++ program. The end of the pipe exposed to the C++ program is probably unbuffered, while when you run the programs from the command line with |, it's line buffered.
If you're currently doing a System.out.print() or whatever for debugging in every iteration, try putting all lines from all iterations into one String and printing that in one go.
Maybe your output method prints asynchronously. If so, your printed output may be corrupted even though the data you got from the input stream is not.
Just an idea ...
You should be reading stdout and stderr in separate threads to avoid blocking issues.
I can't say for sure if that will fix your problem but it should be done anyway to avoid other problems you may hit (your app may deadlock waiting on stdout for example).
Luckily there's a very good example with sample code that walks you through this.
http://www.javaworld.com/javaworld/jw-12-2000/jw-1229-traps.html
The article states (see the bottom of page 2) that you should always read from stderr and stdout, even if you don't need the output, to prevent possible deadlocks:
Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.
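A minimal sketch of the two-thread approach for the clogen_periodic example (variable names as in the question, java.io imports assumed; note that "-t 4" is also split into two arguments here, on the assumption the program parses them separately):

String[] cmd = {"./clogen_periodic", selected, support, "-t", "4"};
Process p = new ProcessBuilder(cmd).start();

// Drain stderr on its own thread so neither pipe can fill up and stall the child.
Thread stderrDrainer = new Thread(() -> {
    try (BufferedReader err = new BufferedReader(new InputStreamReader(p.getErrorStream()))) {
        String line;
        while ((line = err.readLine()) != null) {
            System.err.println(line);
        }
    } catch (IOException ignored) {
        // the stream closes when the child exits
    }
});
stderrDrainer.start();

// Meanwhile the main thread consumes stdout line by line.
try (BufferedReader input = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
    String line;
    while ((line = input.readLine()) != null) {
        // process line
    }
}
stderrDrainer.join();
p.waitFor();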
