Running .py file in Java Eclipse - java

try {
    ProcessBuilder pb = new ProcessBuilder("C:\\Users\\--------\\PycharmProjects\\--------\\venv\\Scripts\\Python.exe", "---------.py");
    Process p = pb.start();
    System.out.println(p.getOutputStream());
}
catch (Exception e) {
    System.out.println("Exception: " + e);
}
">" So I am working on a program that grabs information from Spotify's API. I have a script in python that feeds a java program the data I need. Unfortunately, I am having trouble getting eclipse to run the .py script by itself. I am using ProcessBuilder and for some reason there are no errors but yet the program isn't executing the python script. I am new to integrating multiple languages in a project so any help is appreciated! I have done hours of research trying to get this figured out. I know that there are similar posts on here regarding the same topic but none of the answers seemed to work for me. Thanks!"<"

It is running the script; you just aren't getting any output, because you did two things wrong. First, see the javadoc for Process.getOutputStream:
Returns the output stream connected to the normal input of the process. Output to the stream is piped into the standard input of the process represented by this Process object.
That's not what you want. To get the output from the process, use Process.getInputStream:
Returns the input stream connected to the normal output of the process. The stream obtains data piped from the standard output of the process represented by this Process object. [plus stderr if merged]
Second, System.out.println(stream) (for an input stream) doesn't print the data that can be received on the stream; it prints only the stream object itself (class name, at sign, hash code). To display the data from the Python process (i.e. the script) you must read it from the stream and then output what you read. There are examples of this everywhere; I can't imagine how you could spend hours without finding at least a hundred. Try for example the questions below, or the sketch that follows them:
read the output from java exec
Reading InputStream from Java Process
java Process, getInputStream, read newest line only
Cannot get the getInputStream from Runtime.getRunTime.exec()
Printing a Java InputStream from a Process
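For reference, a minimal sketch of the corrected approach; the interpreter and script paths are hypothetical placeholders (substitute your own), and stderr is merged into stdout so a single reader sees everything:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunScript {
    public static void main(String[] args) throws Exception {
        // Hypothetical paths: replace with your own interpreter and script locations.
        ProcessBuilder pb = new ProcessBuilder(
                "C:\\Users\\me\\PycharmProjects\\demo\\venv\\Scripts\\python.exe",
                "script.py");
        pb.redirectErrorStream(true);               // fold stderr into stdout so one reader suffices
        Process p = pb.start();

        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = out.readLine()) != null) { // read what the script prints
                System.out.println(line);
            }
        }
        System.out.println("Exit code: " + p.waitFor());
    }
}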

Related

jar executable from python

I have a Python list of dicts that I parse to get value strings for specific keys. I need to send these strings to an executable jar that translates them, then take the translation and add it back to the dict. The jar runs from the command line as:
java -jar myJar.jar -a
this opens
Enter a word to begin:
I can enter as many words as I want and it gives the translation. Then Ctrl+Z and Return to close.
I tried
cmd = ['java', '-jar', 'myJar.jar', '-a']
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, stdin=subprocess.PIPE, universal_newlines=True)
stdout,stderr = process.communicate('word')
This works exactly once; then I have to call subprocess again. Is there a way to hold the jar open, translate a group of words, and pipe the output back to Python? I have to do them one at a time; it's not possible to send a list or array.
'Popen.communicate' sends the input and then waits for end-of-file on the output. By its documentation, it waits for the process to terminate before returning to its caller, so you cannot make multiple 'communicate' calls iteratively.
You need to get the input and output streams from the process and manage them yourself, rather than using 'communicate': loop, writing each word to the process's input and reading the translation back from its output.

BufferedReader.read() hangs when running a perl script using Runtime.exec()

I'm trying to run a Perl script from Java code and read its output with the following code:
String cmd = "/var/tmp/./myscript";
Process process = Runtime.getRuntime().exec(cmd);
BufferedReader stdin = new BufferedReader(new InputStreamReader(process.getInputStream()));
String line;
while ((line = stdin.readLine()) != null) {
    System.out.println(line);
}
But the code always hangs on the readLine().
I tried using
stdin.read();
instead, but that also hangs.
I tried modifying the cmd to
cmd = "perl /var/tmp/myscript";
And also
cmd = {"perl","/var/tmp/myscript"};
But that also hangs.
I tried reading stdin in a separate thread, and also reading both stdin and stderr in separate threads. Still no luck.
I know there are many questions here dealing with Process.waitFor() hanging due to the streams not being read, as well as with BufferedReader.read() hanging; I tried all the suggested solutions, still no luck.
Of course, running the same script from the CLI itself writes its output to standard output (the console) and exits with exit code 0.
I'm running on Centos 6.6.
Any help will be appreciated.
I presume that when run directly from the command line, the script runs to completion, producing the expected output, and terminates cleanly. If not, then fix your script first.
The readLine() invocation hanging almost surely means that neither a line terminator nor end-of-file is encountered. In other words, the method is blocked waiting for the script. Perhaps the script produces no output at all under these conditions, but does not terminate. This might happen, for instance, if it expects to read data from its own standard input before it proceeds. It might also happen if it is blocked on output to its stderr.
In the general case, you must read both a Process's stdout and its stderr, in parallel, via the InputStreams provided by getInputstream() and getErrorStream(). You should also handle the OutputStream provided by getOutputStream() by either feeding it the needed standard input data (also in parallel with the reading) or by closing it. You can substitute closing the process's streams for reading them if the particular process you are running does not emit data to those streams, and you normally should close the Process's OutputStream when you have no more data for it. You need to read the two InputStreams even if you don't care about what you read from them, as the process may block or fail to terminate if you do not. This is tricky to get right, but easier to do for specific cases than it is to write generalized support for. And anyway, there's ProcessBuilder, which goes some way toward an easier general-purpose interface.
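If the script needs no standard input, a minimal sketch of that arrangement could look like the following; the script path is taken from the question, while the stderr log location is a hypothetical choice for the example:

import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;

public class RunPerlScript {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("perl", "/var/tmp/myscript");
        // Send stderr to a file so it cannot block, and never interleaves with stdout.
        pb.redirectError(ProcessBuilder.Redirect.to(new File("/var/tmp/myscript.err")));

        Process process = pb.start();
        process.getOutputStream().close();            // we have no stdin data for the script

        try (BufferedReader stdout = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = stdout.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.out.println("Script exited with " + process.waitFor());
    }
}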
Try using ProcessBuilder like so:
String cmd = "/var/tmp/./myscript";
ProcessBuilder perlProcessBuilder = new ProcessBuilder(cmd);
perlProcessBuilder.redirectOutput(ProcessBuilder.Redirect.PIPE);
Process process = perlProcessBuilder.start();
BufferedReader stdin = new BufferedReader(new InputStreamReader(process.getInputStream()));
String line;
while ((line = stdin.readLine()) != null) {
    System.out.println(line);
}
From the ProcessBuilder javadoc:
public ProcessBuilder redirectOutput(ProcessBuilder.Redirect destination)
Sets this process builder's standard output destination. Subprocesses subsequently started by this object's start() method send their standard output to this destination.
If the destination is Redirect.PIPE (the initial value), then the standard output of a subprocess can be read using the input stream returned by Process.getInputStream(). If the destination is set to any other value, then Process.getInputStream() will return a null input stream.
Parameters:
destination - the new standard output destination
Returns:
this process builder
Throws:
IllegalArgumentException - if the redirect does not correspond to a valid destination of data, that is, has type READ
Since:
1.7

UNIX STDOUT end symbol

I want to execute multiple commands from a Java Process, but I don't want to spawn a new process for every command, so I made an object called Shell that holds the Process's InputStream and OutputStream.
The problem is that if I don't terminate the process by appending
"exit\n"
I can't tell where the end of the InputStream is; once I've read the whole output, the next read just blocks, so I need to know when to stop reading.
Is there some kind of a standard symbol at the end of the output?
Because what I came up with is
final String outputTerminationSignal = checksum(command);
command += ";echo \"" + outputTerminationSignal + "\";echo $?\n"
This way when I get the outputTerminationSignal line I can get the exit code and stop reading.
final String line = bufferedReader.readLine();
if (line != null && line.equals(outputTerminationSignal)) {
    final String exitCode = bufferedReader.readLine();
}
Of course this is exploitable and error-prone, because the real output could in some cases match my generated outputTerminationSignal and the app would stop reading when it shouldn't.
I wonder if there is some standard, so-called "outputTerminationSignal" coming from the output that I am not aware of.
Unix doesn't use a special character or symbol to indicate the end of a stream. In Java, when a stream reaches end-of-file, a read simply reports end-of-stream (for example, InputStream.read() returns -1 and BufferedReader.readLine() returns null).
Having said that, if you're reading from a stream connected to a running program, you won't see end-of-stream just because the other program is idle. You would only see it if the other program has exited, or if it explicitly closes the output stream that you are reading from. The situation you describe sounds like the shell is just sitting idle, waiting for another command; you won't get an EOF indication from the stream in that case.
You could try getting the shell to print a command prompt when it's waiting for a command, then look for the command prompt as an "end of command" indicator. Shells normally print command prompts only when they're interactive, but you might be able to find a way around that.
If you want to make the shell process exit without sending it the "exit" command, you could try closing the stream that you're using to write to the shell process. The shell should see that as an end-of-file and exit.
You could ask the shell for the PID of the spawned child, and monitor its state
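For illustration, a rough sketch of the sentinel approach described in the question, assuming /bin/sh is available; the marker is a random UUID that is merely unlikely (not guaranteed) to collide with real output, and the two commands run are arbitrary examples:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.util.UUID;

public class Shell {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("/bin/sh");
        pb.redirectErrorStream(true);                // merge stderr so one reader sees everything
        Process shell = pb.start();

        try (BufferedWriter toShell = new BufferedWriter(
                 new OutputStreamWriter(shell.getOutputStream()));
             BufferedReader fromShell = new BufferedReader(
                 new InputStreamReader(shell.getInputStream()))) {

            runCommand(toShell, fromShell, "ls /var/tmp");   // example commands
            runCommand(toShell, fromShell, "uname -a");

            toShell.write("exit\n");                 // done with the session
            toShell.flush();
        }
        shell.waitFor();
    }

    static void runCommand(BufferedWriter toShell, BufferedReader fromShell,
                           String command) throws Exception {
        String marker = UUID.randomUUID().toString();
        // Append "echo <marker> $?" so the output ends with a recognizable line plus the exit code.
        toShell.write(command + "; echo " + marker + " $?\n");
        toShell.flush();

        String line;
        while ((line = fromShell.readLine()) != null) {
            if (line.startsWith(marker)) {           // end of this command's output
                System.out.println("exit code: " + line.substring(marker.length()).trim());
                return;
            }
            System.out.println(line);
        }
    }
}

Closing the writer (or sending exit) ends the session, at which point the readLine() loop finally sees end-of-stream.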

Runtime Exec stop unexpectedly

I have a little executable program in C that produces a lot of output to a file.
When I call this program with Runtime, like this:
Runtime r = Runtime.getRuntime();
Process p = r.exec("./my_program -in input.file -out output.file", null, new File(System.getProperty("java.io.tmpdir")));
When the program produces little output everything is OK, but when I call my_program with a large input it writes a large amount of output to output.file, and in this case my Java program freezes and nothing happens...
I tested my_program in a terminal with lots of large inputs and everything was fine, but when I call the program from Java with Runtime.exec, the Java program freezes.
Thanks in advance
Make sure you're reading from the Process's .getInputStream() and .getErrorStream() if you aren't already. Looking at your code snippet, it appears that you're just executing .exec(...) (and maybe waiting for it to complete with a call to .waitFor() that isn't shown?).
Per http://download.oracle.com/javase/6/docs/api/java/lang/Process.html (emphasis added):
The parent process uses these streams to feed input to and get output
from the subprocess. Because some native platforms only provide
limited buffer size for standard input and output streams, failure to
promptly write the input stream or read the output stream of the
subprocess may cause the subprocess to block, and even deadlock.
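If the Java side doesn't need to capture that chatter at all, one simple option (a sketch assuming Java 7+) is to let the child inherit the parent's stdout and stderr, so no pipe can ever fill up:

import java.io.File;

public class RunMyProgram {
    public static void main(String[] args) throws Exception {
        // Same invocation as in the question; the working directory comes from java.io.tmpdir.
        ProcessBuilder pb = new ProcessBuilder(
                "./my_program", "-in", "input.file", "-out", "output.file");
        pb.directory(new File(System.getProperty("java.io.tmpdir")));
        pb.inheritIO();                    // child's stdout/stderr go straight to the console
        Process p = pb.start();
        System.out.println("my_program exited with " + p.waitFor());
    }
}

If you do need that text in Java, read getInputStream() and getErrorStream() promptly instead, as described above.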

Java output from process builder overwritten when using BufferedReader

I'm trying to run an external program from Java and to read its output. The program is a Linux application written in C++ that runs a data-mining algorithm and prints the patterns it finds on standard output. I want to read that output from my Java app and show the patterns in a table. The problem is that the output is quite big (as a test it produces 6.5 MB in about 30 seconds). I'm using ProcessBuilder and reading the output with an InputStreamReader wrapped in a BufferedReader, as you can see in the following code:
String[] cmd = {"./clogen_periodic", selected, support, "-t 4"};
Process p = new ProcessBuilder(cmd).start();
BufferedReader input = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line;
while ((line = input.readLine()) != null) {
    ...
    process line;
    ...
}
The problem is that the output gets corrupted. When I execute the same program in a console the output is correct, but when I use the Java app some lines are merged. More precisely, the output should look like this:
TMEmulation log_pseduo_allocation (34985) (2 45 76 89 90)
__divw clock timer (8273) (4 6 67 4 2)
but it is like this
TMEmulation log_pseduo_allocation (34985) (2__divw 45clock 76timer (89 8273) 904) (6 67 4 2)
Any idea about the possible problem?
Thanks a lot in advance,
Patricia
A few possibilities, all to do with the called program:
1) As #Artefacto says, the C++ program's output might not be fully buffered, so call setvbuf to make it consistent, i.e. the first output is partially buffered while the second is not, so the first flushes after the end of the second. In general, buffering can differ between a command-line run and a child process.
2) The program is multi-threaded and its output behaves differently when called from Java, so the output timing differs.
Basically you need to look at the code of the called program and force all logging/output through the same call.
Try calling setvbuf with the _IOLBF option in the C++ program. The end of the pipe exposed to the C++ program is probably unbuffered, while when you run the programs from the command line with |, it's line buffered.
If you're doing a System.out.print() or whatever for debugging in every iteration, then try collecting all lines from all iterations into one String and printing that in one go. Maybe your output method prints asynchronously, so the printed output may be corrupted even though the data you read from the input stream is not.
Just an idea ...
You should be reading stdout and stderr in separate threads to avoid blocking issues.
I can't say for sure if that will fix your problem but it should be done anyway to avoid other problems you may hit (your app may deadlock waiting on stdout for example).
Luckily there's a very good example with sample code that walks you through this.
http://www.javaworld.com/javaworld/jw-12-2000/jw-1229-traps.html
The article states (see bottom of page 2) that you should always read from stderr and stdout even if you don't need the output to prevent possible deadlocks.
Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.
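A compact sketch of that two-reader pattern, applied to the command from the question; the values of selected and support are made-up placeholders for the example:

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;

public class PatternRunner {
    public static void main(String[] args) throws Exception {
        String selected = "data.in", support = "10";      // placeholder argument values
        Process p = new ProcessBuilder("./clogen_periodic", selected, support, "-t 4").start();

        Thread errDrain = drain(p.getErrorStream(), "stderr");   // keep stderr from blocking
        errDrain.start();

        try (BufferedReader out = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = out.readLine()) != null) {
                // process line (e.g. parse the pattern and add it to the table model)
                System.out.println(line);
            }
        }
        errDrain.join();
        p.waitFor();
    }

    // Reads an entire stream on its own thread, echoing each line with a label.
    private static Thread drain(InputStream in, String label) {
        return new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.err.println(label + ": " + line);
                }
            } catch (Exception ignored) {
            }
        });
    }
}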
