I need to start a jar and provide input to it.
I've found how to start the jar, which works without problems, using Runtime.getRuntime().exec(), and I've also found that
String command = "stop";
BufferedWriter bufferedWriter = new BufferedWriter(new OutputStreamWriter(process.getOutputStream()));
bufferedWriter.write(String.format("%s\n", command));
bufferedWriter.flush();
should do the job. The problem is that it just doesn't work for me; it literally does nothing. Does anyone know how to do this?
Process:
public static Process startJar(File jarFile, String flags, String args, @Nullable File dir) throws IOException {
    if (dir == null) {
        return Runtime.getRuntime().exec(String.format("cmd /c start /wait \"\" java -jar %s \"%s\" %s", flags, jarFile.getAbsolutePath(), args));
    }
    return Runtime.getRuntime().exec(String.format("cmd /c start /wait \"\" java -jar %s \"%s\" %s", flags, jarFile.getAbsolutePath(), args), null, dir);
}
If the input is commandline args, pass it to exec directly.
If not, and assuming process is the other jar you are running, you'll need to write to the InputStream of process, not the OutputStream. process.getOutputStream() will give you the stream to which the process is outputting its results, not where it's reading its input from.
EDIT: After Taschi pointed out that the code above is actually correct, I found another question similar to yours. The accepted answer states that you have to either close the writer or pass a \n for it to work. Try that. If it still doesn't work, make sure the other JAR you're running is actually waiting for input on its STDIN.
I assume your process is blocked because it tried to write something to its output, and you did not read it. The API doc states:
Because some native platforms only provide limited buffer size for
standard input and output streams, failure to promptly write the input
stream or read the output stream of the subprocess may cause the
subprocess to block, or even deadlock.
So, read from process.getInputStream() and process.getErrorStream() unless both of them are empty, and then it should process your input just fine.
See Starting a process in Java? for a possible example of how to read them.
The problem was that I was starting the jar in a new window (cmd /c start); after removing that, everything works without a problem.
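For reference, a minimal sketch of the fixed setup. It is only an illustration: "server.jar" is a placeholder, and sending "stop" only does something if the jar actually reads commands from its stdin.
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.OutputStreamWriter;

public class JarStdinSketch {
    public static void main(String[] args) throws IOException {
        // Launch java directly -- without "cmd /c start" the child's stdin stays
        // connected to the stream returned by process.getOutputStream().
        Process process = new ProcessBuilder("java", "-jar", "server.jar")
                .redirectErrorStream(true)
                .start();

        BufferedWriter bufferedWriter = new BufferedWriter(
                new OutputStreamWriter(process.getOutputStream()));
        bufferedWriter.write(String.format("%s\n", "stop"));
        bufferedWriter.flush();
        // Remember to also drain process.getInputStream(), as discussed above,
        // so the child cannot block on a full output pipe.
    }
}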
Related
I've been trying to use Java's ProcessBuilder to launch an application in Linux that should run "long-term". The way this program runs is to launch a command (in this case, I am launching a media playback application), allow it to run, and check to ensure that it hasn't crashed. For instance, check to see if the PID is still active, and then relaunch the process, if it has died.
The problem I'm getting right now is that the PID remains alive in the system, but the GUI for the application hangs. I tried shifting the ProcessBuilder(cmd).start() into a separate thread, but that doesn't seem to be solving anything, as I hoped it would have.
Basically the result is that, to the user, the program APPEARS to have crashed, but killing the Java process that drives the ProcessBuilder.start() Process actually allows the created Process to resume its normal behavior. This means that something in the Java application is interfering with the spawned Process, but I have absolutely no idea what, at this point. (Hence why I tried separating it into another thread, which didn't seem to resolve anything)
If anyone has any input/thoughts, please let me know, as I can't for the life of me think of how to solve this problem.
Edit: I have no concern over the I/O stream created from the Process, and have thus taken no steps to deal with that--could this cause a hang in the Process itself?
If the process writes to stderr or stdout and you're not reading it, it will just "hang", blocking when it writes to stdout/err. Either redirect stdout/err to /dev/null using a shell, or merge stdout/err with redirectErrorStream(true) and spawn another thread that reads from the stdout of the process.
You want the trick?
Don't start your process from ProcessBuilder.start(). Don't try to mess with stream redirection/consumption from Java (especially if you give no s**t about it ; )
Use ProcessBuilder.start() to start a little shell script that gobbles all the input/output streams.
Something like that:
#!/bin/bash
nohup $1 >/dev/null 2>error.log &
That is: if you don't care about stdout and still want to log stderr (do you?) to a file (error.log here).
If you don't even care about stderr, just redirect it to stdout:
#!/bin/bash
nohup $1 >/dev/null 2>&1 &
And you call that tiny script from Java, giving it as an argument the name of the process you want to run.
If a process running on Linux that is redirecting both stdout and stderr to /dev/null still produces anything, then you've got a broken, non-compliant Linux install ;)
In other words: the above Just Works [TM] and gets rid of the problematic "you need to consume the streams in this and that order bla bla bla" Java-specific nonsense.
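For completeness, a rough sketch of the Java side, assuming the script above is saved as, say, gobbler.sh (the name and paths are made up) and is executable:
import java.io.IOException;

public class ScriptLauncherSketch {
    public static void main(String[] args) throws IOException, InterruptedException {
        // The wrapper script backgrounds the real program and swallows its output,
        // so there are no streams for the Java side to babysit.
        Process wrapper = new ProcessBuilder("/path/to/gobbler.sh", "/path/to/mediaplayer").start();
        // This only waits for the short-lived script, not for the media player itself.
        int rc = wrapper.waitFor();
        System.out.println("wrapper script exited with " + rc);
    }
}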
The thread running the process may block if it does not handle the output. This can be done by spawning a new thread that reads the output of the process.
final ProcessBuilder builder = new ProcessBuilder("script")
        .redirectErrorStream(true)
        .directory(workDirectory);
final Process process = builder.start();
final StringWriter writer = new StringWriter();
new Thread(new Runnable() {
    public void run() {
        try {
            // IOUtils is from Apache Commons IO; this drains the stream until EOF
            IOUtils.copy(process.getInputStream(), writer);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}).start();
final int exitValue = process.waitFor();
final String processOutput = writer.toString();
Just stumbled on this after I had a similar issue. Agreeing with nos, you need to handle the output. I had something like this:
ProcessBuilder myProc2 = new ProcessBuilder(command);
final Process process = myProc2.start();
and it was working great. The spawned process even produced a little output, but not much. When it started to output a lot more, it appeared my process wasn't even getting launched anymore. I updated to this:
ProcessBuilder myProc2 = new ProcessBuilder(command);
myProc2.redirectErrorStream(true);
final Process process = myProc2.start();
InputStream myIS = process.getInputStream();
String tempOut = convertStreamToStr(myIS);
and it started working again. (Refer to this link for convertStreamToStr() code)
Edit: I have no concern over the I/O stream created from the Process, and have thus taken no steps to deal with that--could this cause a hang in the Process itself?
If you don't read the output streams created by the process then it is possible that the application will block once the application's buffers are full. I've never seen this happen on Linux (although I'm not saying that it doesn't) but I have seen this exact problem on Windows. I think this is likely related.
JDK7 will have builtin support for subprocess I/O redirection:
http://download.oracle.com/javase/7/docs/api/java/lang/ProcessBuilder.html
In the meantime, if you really want to discard stdout/stderr, it seems best (on Linux) to invoke ProcessBuilder on a command that looks like:
["/bin/bash", "-c", "exec YOUR_COMMAND_HERE >/dev/null 2>&1"]
Another solution is to start the process with Redirect.PIPE and close the InputStream like this:
ProcessBuilder builder = new ProcessBuilder(cmd);
builder.redirectOutput(Redirect.PIPE);
builder.redirectErrorStream(true); // redirect the SysErr to SysOut
Process proc = builder.start();
proc.getInputStream().close(); // this will close the pipe and the output will "flow"
proc.waitFor(); //wait
I tested this on Windows and Linux, and it works!
In case you need to capture stdout and stderr and monitor the process, Apache Commons Exec helped me a lot.
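For illustration, a rough sketch with commons-exec on the classpath (the executable name and the 60-second timeout are placeholders):
import java.io.ByteArrayOutputStream;
import org.apache.commons.exec.CommandLine;
import org.apache.commons.exec.DefaultExecutor;
import org.apache.commons.exec.ExecuteWatchdog;
import org.apache.commons.exec.PumpStreamHandler;

public class CommonsExecSketch {
    public static void main(String[] args) throws Exception {
        CommandLine cmdLine = new CommandLine("wkhtmltopdf");   // placeholder executable
        cmdLine.addArgument("in.html");
        cmdLine.addArgument("out.pdf");

        ByteArrayOutputStream stdout = new ByteArrayOutputStream();
        ByteArrayOutputStream stderr = new ByteArrayOutputStream();

        DefaultExecutor executor = new DefaultExecutor();
        // PumpStreamHandler drains stdout/stderr in background threads for you.
        executor.setStreamHandler(new PumpStreamHandler(stdout, stderr));
        // Kill the child if it runs longer than 60 seconds.
        executor.setWatchdog(new ExecuteWatchdog(60000));

        int exitValue = executor.execute(cmdLine);
        System.out.println("exit=" + exitValue + ", stderr=" + stderr.toString());
    }
}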
I believe the problem is the buffering pipe from Linux itself.
Try to use stdbuf with your executable
new ProcessBuilder().command("/usr/bin/stdbuf", "-o0", "*executable*", "*arguments*");
The -o0 says not to buffer the output.
The same goes for -i0 and -e0 if you want to unbuffer the input and error pipes.
You need to read the output before waiting for the process to finish. You will not be notified if the output doesn't fill the buffer; if it does, the process will wait until you read the output.
Suppose you have some errors or responses regarding your command which you are not reading. This would cause the application to stop and waitFor to wait forever. A simple way around this is to redirect the errors to the regular output.
I spent 2 days on this issue.
public static void exeCuteCommand(String command) {
    try {
        boolean isWindows = System.getProperty("os.name").toLowerCase().startsWith("windows");
        ProcessBuilder builder = new ProcessBuilder();
        if (isWindows) {
            builder.command("cmd.exe", "/c", command);
        } else {
            builder.command("sh", "-c", command);
        }
        Process process = builder.start();
        // drain stdout so the process cannot block on a full pipe
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null)
            System.out.println("Cmd Response: " + line);
        process.waitFor();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
I have Java code that jars class files together:
List<String> args = new ArrayList<String>();
String path = FileSystemUtils.JavaBin() + "\\jar.exe";
args.add(path);
args.add("-cfv");
args.add(jarName);
args.addAll(FileSystemUtils.getAllFiles(directory, ".class"));
ProcessBuilder pb = new ProcessBuilder(args);
File wd = new File(directory);
pb.directory(wd);
Process p = pb.start();
//Waiting for process to exit
p.waitFor();
int res = p.exitValue();
This code works great.
However, on some computers (not on all of them), when there are 7+ files, p.waitFor() never returns, even though the jar was created.
Looking at the Task Manager, jar.exe really did not terminate... What can be the cause?
Running the same command manually from the command line exits immediately.
This seems very weird. Does someone have any hint?
Found the solution myself.
Apparently, if you use ProcessBuilder.start in Java to start an external process, you have to consume its stdout/stderr, otherwise the external process hangs.
This is because the OS creates a pipe.
All Unix-like OSs and Windows behave the same in this regard: a pipe with a 4K buffer is created between parent and child. When that pipe is full (because one side isn't reading), the writing process blocks.
It seems that with 7+ files the output of jar.exe exceeds 4K, and then it gets stuck.
Javadoc of Process:
By default, the created subprocess does not have its own terminal or
console. All its standard I/O (i.e. stdin, stdout, stderr) operations
will be redirected to the parent process, where they can be accessed
via the streams obtained using the methods getOutputStream(),
getInputStream(), and getErrorStream(). The parent process uses these
streams to feed input to and get output from the subprocess. Because
some native platforms only provide limited buffer size for standard
input and output streams, failure to promptly write the input stream
or read the output stream of the subprocess may cause the subprocess
to block, or even deadlock.
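A rough sketch of applying that to the jar-building snippet above; args and directory are the same variables used there, the usual java.io and java.util imports are assumed, and exceptions are left to the caller:
static int runJar(List<String> args, String directory) throws IOException, InterruptedException {
    ProcessBuilder pb = new ProcessBuilder(args);
    pb.directory(new File(directory));
    pb.redirectErrorStream(true);   // merge stderr into stdout: only one stream to drain
    Process p = pb.start();
    // Drain jar.exe's verbose -cfv output so the ~4K pipe can never fill up.
    try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
        while (r.readLine() != null) {
            // discard (or log) each line
        }
    }
    return p.waitFor();   // now returns even with 7+ class files
}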
Everything I read says that the only way to call a batch file from within a java program is to do something like this:
Process p = Runtime.getRuntime().exec("cmd /c start batch.bat");
From what I understand this creates a process to run CMD.exe, which in turn creates a process to run the batch file. However, the CMD.exe process appears to exit once it has instantiated the batch file process.
How can I confirm that the batch file has completed before the CMD process exits?
What jeb said, or try passing the /wait parameter to start. That should cause start to wait till the batch process completes. Try it at the command line first -- faster than rebuilding your Java app.
You could try to start the batch without the "start" command after cmd /c, as "start" creates a new process for batch.bat.
Process p = Runtime.getRuntime().exec("cmd /c batch.bat");
You currently have:
Process p = Runtime.getRuntime().exec("cmd /c start batch.bat");
To wait for the batch file to finish (exit), remove the "start" after the /c.
The /c tells cmd.exe to return when the command is finished, which is what you want; however, "start" means "launch a new cmd.exe process to execute the batch file", and at that point the cmd.exe you started is finished and exits. Probably not what you want.
Either of the two code snippets below will get the batch file running and get you a Process object that you will need.
Process p = Runtime.getRuntime().exec("cmd /c batch.bat");
Or
ProcessBuilder pb= new ProcessBuilder ("cmd", "/c", "batch.bat");
Process p = pb.start ();
Now that you have a Process object you have multiple ways to wait for your batch file to finish.
The simplest way is .waitFor()
int batchExitCode = -1;
try {
batchExitCode = p.waitFor ();
} catch (InterruptedException e) {
// kill batch and re-throw the interrupt
p.destroy (); // could also use p.destroyForcibly ()
Thread.currentThread ().interrupt ();
}
You could also use waitFor(long timeout, TimeUnit unit) or p.isAlive() if they meet your needs better.
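For example, a variant of the snippet above with a timeout (Java 8+, needs java.util.concurrent.TimeUnit; the 5-minute limit is only an example):
try {
    if (!p.waitFor (5, TimeUnit.MINUTES)) {   // false means the batch is still running
        p.destroyForcibly ();                 // it took too long, kill it
    }
} catch (InterruptedException e) {
    p.destroy ();
    Thread.currentThread ().interrupt ();
}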
Warning: If your batch is going to output a lot of data (probably more than 1024 characters, maybe less) to stdout and/or stderr, your program needs to handle this data in some manner. Otherwise, the pipe(s) between your program and the batch process will fill up, the batch file will hang waiting for room in the pipe to add new characters, and it will never return.
This is the problem that originally brought me to Stack Overflow this time. I had 17,000+ commands in a batch file and each command generated 300+ characters to stdout and any errors executing a command generated 1000+ characters.
Some of the ways to solve this:
Put @echo off as the 1st line in the batch file; with this, the cmd process will not echo each command in the batch file. If your batch file does not generate any other output to stdout or stderr, you are done.
If one or more of the commands in the batch file can or may generate a lot of stdout and / or stderr output then you have to handle that data yourself.
If you are using "Runtime.getRuntime().exec" you only have one way to handle this and that is to get the InputStreams for the batch file's stdout and stderr by calling:
InputStream outIS = p.getInputStream (); // this gets the batch file's stdout
InputStream errIS = p.getErrorStream (); // this gets the batch file's stderr
Now you have to read each of the InputStreams (only when they have data, or you will block waiting for some). If you get a lot more data from one of the InputStreams than the other, you are almost assured of hanging the batch file. There are ways to read multiple streams without hanging, however, they are beyond the scope of my already excessively long answer.
If you are using ProcessBuilder pb = new ProcessBuilder ("cmd", "/c", "batch.bat"); (and this is a major reason to do so), you have a lot more options available to you: you can ask for stdout and stderr to be combined with pb.redirectErrorStream (true);, which makes it easier to handle one InputStream without blocking, or you can redirect stdout and stderr to files (null files may work on Windows, definitely on Unix) so your program doesn't have to handle them.
Don't forget that if your batch file reads data from stdin, you have to handle that as well. This is supported in Process and with more options in ProcessBuilder.
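A rough sketch of that ProcessBuilder route (file names are placeholders; the File overloads exist since Java 7, and Redirect.DISCARD would need Java 9+):
static int runBatch () throws IOException, InterruptedException {
    ProcessBuilder pb = new ProcessBuilder ("cmd", "/c", "batch.bat");
    pb.redirectErrorStream (true);                      // fold stderr into stdout
    pb.redirectOutput (new File ("batch-output.log"));  // everything lands in a file, nothing to drain
    pb.redirectInput (new File ("batch-input.txt"));    // only if the batch reads stdin
    Process p = pb.start ();
    return p.waitFor ();                                // safe: no pipes left to fill up
}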
From the output of cmd /?
/C Carries out the command specified by string and then terminates
/K Carries out the command specified by string but remains
Thus what you need is:
Process p = Runtime.getRuntime().exec("cmd /k start batch.bat");
If you want to simply find out whether the batch file process finished or not, why not add...
echo DONE
at the end of the batch file?
Or if your program is for public use, something like...
echo finished>log.txt
would work. Then verify that it is finished from within your java program...
try (BufferedReader logReader = new BufferedReader(new FileReader("log.txt"))) {
    if ("finished".equals(logReader.readLine())) {
        System.out.println("the batch process has finished");
    }
}
I'm launching wkhtmltopdf from within my Java app (part of a Tomcat server, running in debug mode within Eclipse Helios on Win7 64-bit): I'd like to wait for it to complete, then Do More Stuff.
String cmd[] = {"wkhtmltopdf", htmlPathIn, pdfPathOut};
Process proc = Runtime.getRuntime().exec( cmd, null );
proc.waitFor();
But waitFor() never returns. I can still see the process in the Windows Task Manager (with the command line I passed to exec(): looks fine). AND IT WORKS. wkhtmltopdf produces the PDF I'd expect, right where I'd expect it. I can open it, rename it, whatever, even while the process is still running (before I manually terminate it).
From the command line, everything is fine:
c:\wrk>wkhtmltopdf C:\Temp\foo.html c:\wrk\foo.pdf
Loading pages (1/6)
Counting pages (2/6)
Resolving links (4/6)
Loading headers and footers (5/6)
Printing pages (6/6)
Done
The process exits just fine, and life goes on.
So what is it about runtime.exec() that's causing wkhtmltopdf to never terminate?
I could grab proc.getInputStream() and look for "Done", but that's... vile. I want something that is more general.
I've tried calling exec() with and without a working directory. I've tried with and without an empty "env" array. No joy.
Why is my process hanging, and what can I do to fix it?
PS: I've tried this with a couple other command line apps, and they both exhibit the same behavior.
Further exec woes.
I'm trying to read standard out & error, without success. From the command line, I know there's supposed to be output remarkably like my command-line experience, but when I read the input stream returned by proc.getInputStream(), I immediately get an EOF (-1, I'm using inputStream.read()).
I checked the JavaDoc for Process, and found this
The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the *subprocess to block, and even deadlock*.
Emphasis added. So I tried that. The first 'read()' on the Standard Out inputStream blocked until I killed the process...
That was WITH WKHTMLTOPDF.
With the generic command line app & no params (so it should "dump usage and terminate"), it sucks out the appropriate std::out, then terminates.
Interesting!
JVM version issue? I'm using 1.6.0_23. The latest is... v24. I just checked the change log and don't see anything promising, but I'll try updating anyway.
Okay. Don't let the Input Streams fill or they'll block. Check. .close() can also prevent this, but isn't terribly bright.
That works in general (including the generic command line apps I've tested).
In specific however, it falls down. It appears that wkhtmltopdf is using some terminal manipulation/cursor stuff to do an ASCII-graphic progress bar. I believe this is causing the inputStream to immediately return EOF rather than giving me the correct values.
Any ideas? Hardly a deal-breaker, but it would definitely be Nice To Have.
I had the same exact issue as you and I solved it. Here are my findings:
For some reason, the output from wkhtmltopdf goes to STDERR of the process and NOT STDOUT. I have verified this by calling wkhtmltopdf from Java as well as Perl.
So, for example in java, you would have to do:
//ProcessBuilder is the recommended way of creating processes since Java 1.5
//Runtime.getRuntime().exec() is not deprecated, but ProcessBuilder is generally preferred.
ProcessBuilder pb = new ProcessBuilder("wkhtmltopdf.exe", htmlFilePath, pdfFilePath);
Process process = pb.start();
BufferedReader errStreamReader = new BufferedReader(new InputStreamReader(process.getErrorStream()));
//not "process.getInputStream()"
String line = errStreamReader.readLine();
while(line != null)
{
    System.out.println(line); //or whatever else
    line = errStreamReader.readLine();
}
On a side note, if you spawn a process from java, you MUST read from the stdout and stderr streams (even if you do nothing with it) because otherwise the stream buffer will fill and the process will hang and never return.
To futureproof your code, just in case the devs of wkhtmltopdf decide to write to stdout, you can redirect stderr of the child process to stdout and read only one stream like this:
ProcessBuilder pb = new ProcessBuilder("wkhtmltopdf.exe", htmlFilePath, pdfFilePath);
pb.redirectErrorStream(true);
Process process = pb.start();
BufferedReader inStreamReader = new BufferedReader(new InputStreamReader(process.getInputStream()));
Actually, I do this in all the cases where I have to spawn an external process from java. That way I don't have to read two streams.
You should also read the streams of the spawned process in different threads if you don't want your main thread to block, since reading from streams is blocking.
Hope this helps.
UPDATE: I raised this issue in the project page and was replied that this is by design because wkhtmltopdf supports giving the actual pdf output in STDOUT. Please see the link for more details and java code.
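A hedged sketch of that mode, assuming "-" as the output name tells wkhtmltopdf to write the PDF bytes to stdout (check your version's documentation); htmlPathIn and pdfPathOut are the variables from the question:
static void htmlToPdf(String htmlPathIn, String pdfPathOut) throws IOException, InterruptedException {
    ProcessBuilder pb = new ProcessBuilder("wkhtmltopdf", htmlPathIn, "-");
    pb.redirectError(new File("wkhtmltopdf-progress.log"));   // keep stderr drained without a thread
    Process process = pb.start();
    try (InputStream pdf = process.getInputStream();
         FileOutputStream out = new FileOutputStream(pdfPathOut)) {
        byte[] buf = new byte[8192];
        int n;
        while ((n = pdf.read(buf)) != -1) {
            out.write(buf, 0, n);   // copy the raw PDF bytes into the target file
        }
    }
    process.waitFor();
}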
A process has 3 streams: input, output and error. You can read both the output and error streams at the same time using separate threads. See this question and its accepted answer, and also this one, for example.
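A sketch of reading both streams at the same time on separate threads (Java 8 lambdas; the command is a placeholder):
static int runAndDrain(String... command) throws IOException, InterruptedException {
    Process process = new ProcessBuilder(command).start();
    Thread outDrainer = new Thread(() -> {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println("OUT: " + line);
            }
        } catch (IOException ignored) { }
    });
    Thread errDrainer = new Thread(() -> {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.err.println("ERR: " + line);
            }
        } catch (IOException ignored) { }
    });
    outDrainer.start();
    errDrainer.start();
    return process.waitFor();   // both pipes are drained, so this cannot deadlock
}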
You should read from the streams in a different thread.
final Semaphore semaphore = new Semaphore(numOfThreads);
final String whktmlExe = tmpwhktmlExePath;
int doccount = 0;
try {
    File fileObject = new File(inputDir);
    for (final File f : fileObject.listFiles()) {
        if (f.getAbsolutePath().endsWith(".html")) {
            doccount++;
            if (doccount > 500) {
                LOG.info(" done with conversion of 1000 docs exiting ");
                break;
            }
            System.out.println(" inside for before " + semaphore.availablePermits());
            semaphore.acquire();
            System.out.println(" inside for after " + semaphore.availablePermits() + " ---" + f.getName());
            new java.lang.Thread() {
                public void run() {
                    try {
                        String F_ = f.getName().replaceAll(".html", ".pdf");
                        ProcessBuilder pb = new ProcessBuilder(whktmlExe, f.getAbsolutePath(), outPutDir + F_.replaceAll(" ", "_")); // "wkhtmltopdf.exe", htmlFilePath, pdfFilePath
                        pb.redirectErrorStream(true);
                        Process process = pb.start();
                        BufferedReader errStreamReader = new BufferedReader(new InputStreamReader(process.getInputStream()));
                        String line = errStreamReader.readLine();
                        while (line != null) {
                            System.err.println(line); //or whatever else
                            line = errStreamReader.readLine();
                        }
                        System.out.println("after completion for ");
                    } catch (Exception e) {
                        e.printStackTrace();
                    } finally {
                        System.out.println(" in finally releasing ");
                        semaphore.release();
                    }
                }
            }.start();
        }
    }
} catch (Exception ex) {
    LOG.error(" *** Error in pdf generation *** ", ex);
}
while (semaphore.availablePermits() < numOfThreads) { // till all threads finish
    LOG.info(" Waiting for all threads to exit " + semaphore.availablePermits() + " --- " + (numOfThreads - semaphore.availablePermits()));
    java.lang.Thread.sleep(10000);
}
I'm trying to launch an external program from my java swing app using this:
Process proc = Runtime.getRuntime().exec(cmd);
But the external program never actually gets launched until I close out of my Java app... every time.
It waits to launch only after I have closed out.
the external program I am trying to run is an exe that takes arguments so:
cmd = "externalProgram.exe -v --fullscreen --nowing";
What could possibly be wrong here?
Funnily enough, it works as expected if I try something simple like:
Process proc = Runtime.getRuntime().exec("notepad.exe");
You may need to read from the process's standard output, or close the standard input, before it will proceed. For reading the output, the problem is that the buffer can get full, blocking the program; for closing the input, the problem is that some programs will try to read data from there if it's available, waiting to do so. One or both of these tricks is very likely to straighten things out for you.
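A sketch of both tricks applied to the command from the question (deliberately minimal; error handling is up to you):
static Process launchAndDrain() throws IOException {
    Process proc = new ProcessBuilder("externalProgram.exe", "-v", "--fullscreen", "--nowing")
            .redirectErrorStream(true)
            .start();
    // Trick 1: close stdin so the program cannot sit waiting for input from us.
    proc.getOutputStream().close();
    // Trick 2: keep stdout (with stderr merged in) drained so the pipe buffer never fills.
    new Thread(() -> {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
            while (r.readLine() != null) { /* discard */ }
        } catch (IOException ignored) { }
    }).start();
    return proc;
}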
You may also read the error output stream to check whether the program is actually failing to execute:
String cmd = "svn.exe";
Process proc = Runtime.getRuntime().exec(cmd);
BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getErrorStream()));
String line = null;
while((line=reader.readLine())!=null){
System.out.println(line);
}
reader.close();
My console shows
Type 'svn help' for usage.
Which evidently shows the program was executed by Java.