I'm writing a wrapper script in Groovy (but the question is really a Java one) and would like to know if it's possible to create a Process without letting it run first. The problem is that the Process starts running and generating output on stdout and stderr. I would like to forward those to their appropriate destinations and at the same time create a merged stream for processing within the script. The problem I'm running into, however, is that the Process generates output too quickly and the first two lines of output come out a bit garbled. I would like to set up the streams before the process starts running. Is there any way to do that?
This consumes the output and error streams into two separate StringWriters, but I don't see anything "garbled"
new StringWriter().with { out ->
    new StringWriter().with { err ->
        'ls /tmp'.execute().with { proc ->
            consumeProcessOutput( out, err )
            waitFor()
        }
        println "OUT: $out"
        println "ERR: $err"
    }
}
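(A side note, from my reading of the Groovy API: consumeProcessOutput also has an OutputStream overload, so if you just want to forward the child's output as it arrives rather than capture it, you can pass System.out and System.err directly. It drains both streams on background threads, which is why the explicit waitFor() is still needed.)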
Related
I'm running a jar file from another jar, as somebody answers here, and waiting for the process:
Process proc = Runtime.getRuntime().exec("java -jar A.jar" + stringParams);
try {
    proc.waitFor();
} catch (InterruptedException e) {
    e.printStackTrace();
}
InputStream in = proc.getInputStream();
InputStream err = proc.getErrorStream();
My problem comes when I have no feedback on the status of the program being called, but I don't want my program to continue beyond those lines. I need the standard and error output, but the results only show up once the execution is over. Is there any way of executing the jar and getting those streams while it is still running?
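For reference, the usual pattern here is to start reader threads for stdout and stderr before calling waitFor(), so both streams are drained while the jar runs. Below is a minimal sketch of that pattern (the class name and the A.jar path are placeholders); whether lines actually arrive promptly still depends on how the child buffers its output, which is what the answers below discuss.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class RunJar {
    public static void main(String[] args) throws Exception {
        Process proc = new ProcessBuilder("java", "-jar", "A.jar").start();

        // Drain each stream on its own thread so the child never blocks on a full pipe.
        Thread out = pump(proc.getInputStream(), "OUT");
        Thread err = pump(proc.getErrorStream(), "ERR");

        int exit = proc.waitFor();   // returns only when the jar terminates
        out.join();
        err.join();
        System.out.println("exit code: " + exit);
    }

    private static Thread pump(InputStream in, String tag) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(tag + ": " + line);  // handle each line as it arrives
                }
            } catch (IOException ignored) {
                // the stream closes when the process exits
            }
        });
        t.start();
        return t;
    }
}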
Buffered/unbuffered
It seems like an issue with buffered output.
The executed process (in this case java -jar <path>) buffers its output and writes it only when it's done (in big chunks, which we don't want!).
So one way to go is to run the process through an unbuffering tool (very hacky):
unbuffer <command>
stdbuf -i0 -o0 -e0 <command>
use terminal emulation
Hacking
stdbuf is part of GNU tools.
https://www.gnu.org/software/coreutils/manual/html_node/stdbuf-invocation.html
unbuffer is part of the expect package.
https://wiki.tcl.tk/3548
The key thing is making the program think that it's in interactive mode (as if you were launching it from a console).
The first two options are very hacky and do not work in all cases (I don't know whether the java command works with them).
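For illustration, here is a sketch of the first option as launched from Java: prepend stdbuf to the command passed to ProcessBuilder. This is only a sketch, and, as noted above, it may have no effect on a JVM child because the JVM does not use C stdio buffering.

// Hedged sketch: wrap the real command with stdbuf (GNU coreutils) so the
// child's C stdio streams run unbuffered; the flags are the ones listed above.
ProcessBuilder pb = new ProcessBuilder(
        "stdbuf", "-i0", "-o0", "-e0",
        "java", "-jar", "path_to_jar");
pb.redirectErrorStream(true);   // merge stderr into stdout for simplicity
Process proc = pb.start();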
Emulation
The third option is most promising.
We launch a program (a terminal emulator) that emulates an interactive terminal, making the program think it's working in a real interactive session!
Pty4j
You might use pty4j too; the example below is from https://github.com/traff/pty4j:
// The command to run in a PTY...
String[] cmd = { "java", "-jar", "path_to_jar" };
// The initial environment to pass to the PTY child process...
String[] env = { "TERM=xterm" };
PtyProcess pty = PtyProcess.exec(cmd, env);
OutputStream os = pty.getOutputStream();
InputStream is = pty.getInputStream();
// ... work with the streams ...
// wait until the PTY child process terminates...
int result = pty.waitFor();
// free up resources.
pty.close();
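To fill in the "work with the streams" step, here is a hedged sketch of my own (not from the pty4j README): read the PTY output on a separate thread while the child is still running, where is refers to the InputStream obtained above.

// Because the child believes it is attached to a terminal, lines should
// arrive as soon as the program prints them.
Thread reader = new Thread(() -> {
    try (BufferedReader r = new BufferedReader(new InputStreamReader(is))) {
        String line;
        while ((line = r.readLine()) != null) {
            System.out.println(line);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
});
reader.start();
// after pty.waitFor() returns, call reader.join() to be sure the last lines were printed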
Zt-exec
Maybe it's worth trying zt-exec? I have no idea how it executes commands, but it may do the trick (I didn't test it).
Using https://github.com/zeroturnaround/zt-exec
new ProcessExecutor().command("java", "-jar", "path_to_jar")
    .redirectOutput(new LogOutputStream() {
        @Override
        protected void processLine(String line) {
            ...
        }
    })
    .execute();
That should work, but I didn't test it.
In general, there is no clean way to solve your problem.
Depending on what platforms you want to target, consider using unbuffer, stdbuf, or (the slowest option) terminal emulation...
Please let me know if that helps and good luck! :)
So I have a Java console jar that works with commands that I enter while it's running.
Is this also possible with PHP? I know executing the jar is, with exec(), but I can't really pass commands to the running jar or get its output.
What you'll want to do is initialize the jar with proc_open() instead of exec(). proc_open() allows you to have individual streams to read/write from/to the stdin/stdout/stderr of your Java process. So, you'll start the Java process, and then you'll use fwrite() to send commands to the stdin ($pipes[0]) of the Java process. See the examples on proc_open()'s documentation page for more info.
EDIT Here's a quick code sample (just a lightly modified version of the example on the proc_open docs):
$descriptorspec = array(
    0 => array("pipe", "r"),  // stdin is a pipe that the child will read from
    1 => array("pipe", "w"),  // stdout is a pipe that the child will write to
    2 => array("file", "/tmp/error-output.txt", "a") // stderr is a file to write to
);

$process = proc_open('java -jar example.jar', $descriptorspec, $pipes);

if (is_resource($process)) {
    // $pipes now looks like this:
    // 0 => writeable handle connected to child stdin
    // 1 => readable handle connected to child stdout
    // Any error output will be appended to /tmp/error-output.txt

    fwrite($pipes[0], 'this is a command!');
    fclose($pipes[0]);

    echo stream_get_contents($pipes[1]);
    fclose($pipes[1]);

    // It is important that you close any pipes before calling
    // proc_close in order to avoid a deadlock
    $return_value = proc_close($process);

    echo "command returned $return_value\n";
}
I'm building a C# ServiceProcess that will start a batch file (the batch file will start a Java application).
If I stop the service it kills the Java process. Stopping the Java process can take up to 2 minutes. The service has to wait for the Java application to stop, so I made it sleep:
System.Threading.Thread.Sleep( );
Is it possible to check whether the "java" process is closed and only then stop the ServiceProcess?
You can access the process using the Process class. It gives you a lot of information about a specific process. When you start java.exe directly (using Process.Start) you already have the Process instance.
When using a batch file, you need to find the process, which is not a problem at all: you can find it using Process.GetProcessesByName, but that returns all java processes running on your machine.
You could try something like the following:
Process[] proc = Process.GetProcessesByName("Java");
if (proc.Count() != 0)
{
    // Process alive
    Process prod = proc[0];
    prod.Kill();
    prod.WaitForExit();
}
else
{
    // Process dead
}
A better option would be to use the process ID (if you know it).
Warning: This will kill the first java process it finds, you need to check which one you need to kill...
I'm using this code:
http://groovy.codehaus.org/Expect+for+Groovy
to attempt to automate a Python-based CLI.
My test main function is below.
Running this, however, it seems that it never actually reads data from the process.
If I change the process to /bin/ls and expect some filename, it works correctly, which leads me to believe it can't handle the fact that python is waiting for input, while /bin/ls closes and flushes its stream.
Any ideas? Thanks.
public static void test2(String[] args){
    println "Main"
    def builder = new ProcessBuilder("/usr/bin/python");
    builder.redirectErrorStream()
    builder.redirectOutput(ProcessBuilder.Redirect.PIPE);
    builder.redirectInput(ProcessBuilder.Redirect.PIPE);
    def expectSession = new IOSession(builder.start());
    expectSession.expect(">>>");
    expectSession.send("print(%d) % (1+1)")
    expectSession.expect("2");
    expectSession.send("quit()");
    expectSession.close();
    println "Done...";
}
Looking through the source for IOSession it looks like this might be a bug in the constructor. Try:
def expectSession = new IOSession();
expectSession.addProcess(builder.start());
Also, you have to add \r to the end of the strings you are sending.
How do I run an external command (via a shell) from a Java program, such that no redirection takes place, and wait for the command to end? I want the file descriptors of the external program to be the same as those of the Java program. In particular I do not want the output to be redirected to a pipe that the Java program is reading. Having the Java program relay the output is not a solution.
This means that a plain invocation of java.lang.Runtime.exec is not the solution. I presume that java.lang.ProcessBuilder is involved, but how do I specify that output and error streams must be the same as the calling Java process?
class A {
    public static void main(String[] args) {
        try {
            ProcessBuilder pb = new ProcessBuilder("echo", "foo");
            /* TODO: pb.out = System.out; pb.err = System.err; */
            Process p = pb.start();
            p.waitFor();
        } catch (Exception e) {
            System.err.println(e);
            System.exit(1);
        }
    }
}
(This may or may not be the right approach.)
In other words, I'm looking for Java's equivalent of system(), but all I can find is (roughly) popen().
Here's an example of a situation where relaying cannot work: if the subprocess writes to both stdout and stderr and the Java program is relaying, then the Java program has no way to know the order of the write calls in the subprocess. So the order of the output on stdout and stderr from the Java program will be observably different if the two streams end up in the same file. Mixing stdout and stderr is of course not a solution because the caller may want to keep them separate.
While I think this question is of general interest, a Linux-specific solution would solve my immediate problem.
This is the intent of ProcessBuilder.redirectError/redirectOutput, which were introduced in Java 7. Using Redirect.INHERIT will make the child process share stderr/stdout with the Java process:
class A {
    public static void main(String[] args) {
        try {
            ProcessBuilder builder = new ProcessBuilder("echo", "foo");
            builder.redirectError(ProcessBuilder.Redirect.INHERIT);
            builder.redirectOutput(ProcessBuilder.Redirect.INHERIT);
            Process p = builder.start();
            p.waitFor();
        } catch (Exception e) {
            System.err.println(e);
            System.exit(1);
        }
    }
}
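As an aside, Java 7's ProcessBuilder also has a one-call shorthand, inheritIO(), which inherits stdin as well as stdout and stderr; a minimal variant of the example above:

ProcessBuilder builder = new ProcessBuilder("echo", "foo");
builder.inheritIO();   // same as redirecting input, output and error to INHERIT
Process p = builder.start();
p.waitFor();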
You might take a look at the NuProcess project. Disclaimer: I wrote it. It provides non-blocking I/O from spawned processes. You still have to relay in Java (you receive callbacks), but because it is using epoll() in the case of Linux, I would expect it to preserve the order of the underlying program. Only a single thread is epoll()'ing the pipes so you won't get any thread scheduling order issues.
I'm 100% sure order would be preserved on MacOS X, or any BSD variant, because it uses kqueue, which is definitely ordered. Anyway, you might want to give it a shot; it's trivial to code and test.
You can't. By default all standard I/O of the child process is redirected to the parent process (the JVM running your Java program).
From the javadoc of the Process class:
By default, the created subprocess does not have its own terminal or console. All its standard I/O (i.e. stdin, stdout, stderr) operations will be redirected to the parent process, where they can be accessed via the streams obtained using the methods getOutputStream(), getInputStream(), and getErrorStream(). The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.