So I have a problem with a process that I am running: whenever I try to stop it using process.destroy(), it does not stop.
I want to create a class (ProcessHandler) that extends Process and do the following:
ProcessHandler process = (ProcessHandler)Runtime.getRuntime().exec("cmd.exe /c \"java net/com/codeusa/Server 43594\"");
So, my problem is converting Process to ProcessHandler so that I can override the destroy() method and make it TSKILL itself. I have figured out how to do everything else, but when I try the above line of code, I get a ClassCastException.
Does anyone have an idea how I can make these compatible? By the way, the exec(String) method returns an instance of Process.
Firstly, if you are using Java 5 or later, I would recommend using ProcessBuilder instead of Runtime.getRuntime().exec(). For one thing, you don't have to worry about quoting arguments. Each separate command-line argument is a separate parameter. For example:
ProcessBuilder builder = new ProcessBuilder("cmd.exe", "/C", "java net/com/codeusa/Server 43594");
Process process = builder.start();
When starting a process using ProcessBuilder or Runtime.getRuntime().exec(), it's entirely up to the JVM to instantiate and return a subclass of Process of its choice, and there's no way to influence its decision. I assume your ProcessHandler class is one you've written yourself (I can't find a Java API class with that name). It might subclass Process, but even if it does there's no way for the JVM to return an instance of it when you use ProcessBuilder or Runtime.getRuntime().exec(). So your line of code above is guaranteed to throw a ClassCastException, assuming it doesn't throw some other exception.
I have had some experience in the past of processes that didn't respond to destroy() methods. Usually this was because the standard output or standard error being written by the process wasn't being read, and the process had ground to a halt because one or more of its I/O buffers had filled up. Does the process above write anything to its standard output or standard error, and if so, are you reading it?
Reading both the standard output and standard error streams is easier with ProcessBuilder: if you add the line builder.redirectErrorStream(true); between the two lines above, then you only need to read from the process's standard output. If you're stuck with Java 1.4 or earlier and Runtime.getRuntime().exec(), you'll have to set up two different stream-reading objects in two different threads, one for each stream.
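A minimal sketch of that pattern (the class and method names are mine, not from any API): merge stderr into stdout with redirectErrorStream(true) and drain the output continuously before waiting on the process, so the child can never stall on a full buffer. The example runs the current JVM's own java binary just to have something portable to launch.

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;

public class DrainDemo {
    // Reads everything the child writes so its output buffer never fills up.
    static String runAndDrain(ProcessBuilder builder) throws Exception {
        builder.redirectErrorStream(true);          // merge stderr into stdout
        Process process = builder.start();
        StringBuilder output = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                output.append(line).append('\n');   // drain as the child runs
            }
        }
        process.waitFor();                          // safe: buffers are empty
        return output.toString();
    }

    public static void main(String[] args) throws Exception {
        // Uses the JVM's own java binary so the example is self-contained.
        String java = System.getProperty("java.home")
                + File.separator + "bin" + File.separator + "java";
        System.out.print(runAndDrain(new ProcessBuilder(java, "-version")));
    }
}
```

Once the output is drained this way, destroy() has a much better chance of working, because the child is no longer blocked mid-write.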
I'm not sure what you are trying to achieve with your ProcessHandler class - you haven't provided the source code for it. Besides, I've never had the need to kill a process more forcibly than by using the destroy() method.
I figured out a whole new thing! When I call the destroy() method on the process, it destroys the cmd.exe process. But I replaced cmd.exe with "java", and now when I call destroy(), the java.exe process terminates. Hurray!
Related
I have a batch file called 'StartUpdate.bat' which contains something like this:
set CLASSPATH="myclasspath"
java -cp %CLASSPATH% UpdateProgram
runMyApp.bat
If I run 'StartUpdate.bat' directly from command line, it executes UpdateProgram and then runMyApp.bat immediately after. This is the intention.
However, if I call 'StartUpdate.bat' from another Java program, it terminates immediately after completing UpdateProgram. 'StartUpdate.bat' is called from this other Java program using
Runtime.getRuntime().exec(path + "StartUpdate.bat");
StartUpdate.bat is executed just fine, as is UpdateProgram inside it, but nothing else following UpdateProgram.
Why does it behave this way? What should I do so that it executes the remainder of the batch file?
You can use call or start to execute the Java program.
Explicitly use a user thread with setDaemon(false). It seems that was the problem.
As long as there is a user (non-daemon) thread, the JVM will keep the application alive. Daemon threads are killed once no user threads exist anymore.
Using daemon threads for such "server"-like purposes is a common misconception.
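A small sketch of the distinction (DaemonDemo and newUserWorker are illustrative names, not from any API):

```java
public class DaemonDemo {
    // Wraps a task in an explicit user (non-daemon) thread.
    static Thread newUserWorker(Runnable task) {
        Thread worker = new Thread(task);
        worker.setDaemon(false);   // explicit: the JVM will wait for this thread
        return worker;
    }

    public static void main(String[] args) {
        Thread worker = newUserWorker(() -> {
            try { Thread.sleep(200); } catch (InterruptedException e) { return; }
            System.out.println("user thread finished");
        });
        worker.start();
        // main() may return now; the JVM stays alive until the user thread
        // completes. Had we called setDaemon(true), the JVM could exit as
        // soon as main() returned, silently killing the pending work.
    }
}
```

Note that threads inherit daemon status from their creator, so a thread created from main() is a user thread by default; the explicit setDaemon(false) only matters when the creating thread is itself a daemon.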
For the rest, ProcessBuilder would be a more robust class for this task:
// "dir" is a cmd.exe built-in, so on Windows it must be launched through cmd:
ProcessBuilder pb = new ProcessBuilder("cmd.exe", "/c", "dir");
Process process = pb.start();
int returnCode = process.waitFor();
I have a Python script wherein a JAR is called. After the JAR is called, two shell scripts are called. Initially I was doing this:
proc = subprocess.Popen(jar_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
proc.wait()
output, errors = proc.communicate()
proc = subprocess.Popen(prune_command, shell=True)
proc.wait()
proc = subprocess.call(push_command, shell=True)
I have to wait for the first two processes to finish so I use Popen() and the final one I can let it run in the background, so I call() it. I pass shell=True because I want the called shell scripts to have access to environment variables.
The above works, however, I don't get any logging from the JAR process. I've tried calling it this way:
proc = subprocess.call(jar_command)
This logs as I would expect, but the two shell scripts that follow are not executed. Initially I thought the logs just weren't going to stdout, but it turns out they're not being executed at all, i.e. not removing superfluous files or pushing to a database.
Why are the followup shell scripts being ignored?
If you are certain your shell scripts are not running at all, and everything works with the first code, then the java command must be deadlocking or not terminating correctly when run via the call() function.
You can validate that by adding a dummy file creation in your bash scripts. Put it in the first line of the script, so if it is executed you'll get the dummy file created. If it's not created, that means the scripts weren't executed, probably due to something with the java execution.
I would try a couple of things:
First, I would use Popen instead of call, and instead of wait(), use communicate():
Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for process to terminate.
communicate() returns a tuple (stdoutdata, stderrdata).
proc = subprocess.Popen(jar_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
proc.communicate()
Make sure to check both streams for data (stdout and stderr). You might miss an error the java process raises.
Next, I would try disabling buffering by passing bufsize=0 to Popen. That will rule out Python-level buffering as the cause.
If both options still don't work, try to see if there is an exception by using check_call():
proc = subprocess.check_call(jar_command)
Run command with arguments. Wait for command to complete. If the return code was zero then return, otherwise raise CalledProcessError.
These options might have the answer; if not, they should help the debugging process. Feel free to comment on how this progresses.
Most likely, you are forgetting that the process's streams are in fact OS-level buffers with some finite capacity.
For example, if you run a process that produces a lot of output in PIPE mode, and you wait for it to finish before trying to consume whatever that process wrote to output, you have a deadlock:
The process has filled up the output buffer and is now blocked on writing more data to its output. Until somebody empties the buffer by reading from pipe, the process cannot continue.
Your program is waiting for the subprocess to finish before you read the data from its buffer.
The correct way is to start a thread in your program that will "drain" the pipe constantly as the process is running and while your main thread is waiting. You must first start the process, then start the drain threads, then wait for process to finish.
For differential diagnosis, check whether the subprocess will run fine with little output (i.e. as long as the buffer does not fill up, such as a line or two).
The documentation for subprocess has a note about this.
I am launching a process from Java to run a command for me. This process runs for a little while, then needs a password to continue. Now, I know that I can write to the input stream of the process, but I am not quite sure how to detect when I need to write to it.
Possible solutions:
Is there a way that I can detect that the process is blocking?
Can I just write to the standard in immediately after executing the command and when the process hits a point when it needs it, it can just read from it?
Any other ideas?
It is not necessary to detect if the child process is blocking or not. If the child process is designed to block until input is provided to it via stdin, it will block until such input is provided.
It is necessary to keep in mind that the standard input, output and error buffers are limited in size. The child process must consume the contents of its input buffer, and the parent process must consume the contents of the output and error buffers, as soon as possible. Not doing so can result in the child process hanging.
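As a sketch of that pattern, assuming a Unix-like system where cat is available as a stand-in for the real password-reading command (the class and method names are mine): the parent writes to the child's stdin while a background thread drains its stdout, so neither side blocks on a full pipe buffer.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class StdinFeedDemo {
    // Feeds input to the child's stdin while a background thread drains
    // its stdout; returns everything the child wrote.
    static String feed(String input) throws Exception {
        // 'cat' simply echoes stdin to stdout; it stands in for a real
        // command that waits for input (e.g. a password prompt).
        Process process = new ProcessBuilder("cat").start();
        StringBuilder output = new StringBuilder();
        Thread drainer = new Thread(() -> {
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(process.getInputStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    output.append(line).append('\n');
                }
            } catch (IOException ignored) { }
        });
        drainer.start();
        try (OutputStream stdin = process.getOutputStream()) {
            stdin.write(input.getBytes(StandardCharsets.UTF_8));
        }                        // closing stdin signals EOF to the child
        process.waitFor();
        drainer.join();
        return output.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.print(feed("secret-password\n"));
    }
}
```

Note that the input is written up front and simply sits in the pipe until the child reads it, which is exactly the second option proposed above.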
Maybe you should get around the runas problem by not using runas. Google found me this: http://www.source-code.biz/snippets/c/1.htm It lets you pass your password at runtime.
I wish to create a process using java's runtime:
For example: Process proc = Runtime.getRuntime().exec("cmd");
Then, I want to somehow wait for the process until it is in a 'ready for input' state, which will verify it has finished all of its work. Any idea how to do this?
One possibility is sending an echo "---finished---" command to the process's stdin and checking for that line on its output. But I dislike the idea.
Is there any better (more formal) approach to do this?
By the way, I need this effect as I described it. I want to wait before writing new commands to the batch file.
Thank you.
Is there any better (more formal) approach to do this?
No. Java provides no way to determine whether another program is waiting for input. Indeed, I don't believe it is possible to determine this at all (on Linux/UNIX) without digging around in kernel memory. (And that would be a really bad idea ... IMO.)
Your best bet is to check the external program's output for some text that indicates that a batch has just finished ... and hope that you don't get a false positive. (Or just queue up the batches without waiting for them to complete, though it sounds like that won't be a solution for you.)
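That marker-scanning approach might look like the sketch below. The sh command merely stands in for the real batch program, and the marker text and class name are illustrative.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class MarkerWaitDemo {
    // Blocks until the child prints a line containing the marker,
    // then returns true; returns false if the stream ends first.
    static boolean waitForMarker(Process process, String marker)
            throws IOException {
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            if (line.contains(marker)) {
                return true;            // the batch reported completion
            }
        }
        return false;                   // EOF without ever seeing the marker
    }

    public static void main(String[] args) throws Exception {
        // A stand-in child that does some "work" and then prints the marker;
        // assumes a Unix-like shell (substitute cmd.exe /c on Windows).
        Process p = new ProcessBuilder("sh", "-c",
                "echo working; echo ---finished---").start();
        System.out.println(waitForMarker(p, "---finished---"));
        p.waitFor();
    }
}
```

The caveat from the answer applies: if the child's normal output can ever contain the marker text, you will get a false positive, so choose something unlikely to occur naturally.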
Here are a couple ideas. They hinge on whether or not the program you're running expects to read and write from stdin and stdout.
There's no need to wait for a "ready for input" state once you've launched the process. If you need to send commands, use getOutputStream and write them. If the other program is still getting itself ready, the data you've written will happily sit in a buffer until the program reads from stdin.
Using ProcessBuilder will let you easily control arguments and environment. Example direct from JavaDocs:
ProcessBuilder pb = new ProcessBuilder("myCommand", "myArg1", "myArg2");
Map<String, String> env = pb.environment();
env.put("VAR1", "myValue");
env.remove("OTHERVAR");
env.put("VAR2", env.get("VAR1") + "suffix");
pb.directory(new File("myDir"));
Process p = pb.start();
Then, to read or write:
OutputStream out = p.getOutputStream();
out.write("some useful data".getBytes());
out.flush();
Your question seems like a really complicated way of obtaining the final required result.
First, Java has the capacity to write to arbitrary files. Why not use that capability in a single threaded manner? Calling flush will ensure the write is actually performed...
If you really need concurrency for some reason, use a thread over a process. Then, make the thread do what it needs to do and nothing more. Have the parent thread join() the child thread. When the child thread is finished, your parent thread will resume.
Finally, calling available() on the process's input stream can tell you how much data can be read without blocking. This isn't what you want exactly, but it can be adapted to work, I'm guessing.
This is in the context of a local Processing program. I would like to run an external program to get some data. Is there a popen() or equivalent function I can use?
Process process = Runtime.getRuntime().exec("your command");
Then you can read and write the data using the Process streams.
JDK 5 introduced ProcessBuilder for more control over process creation:
Process process = new ProcessBuilder(command).start();
Be aware that internally forkAndExec is invoked, and fork 'makes a copy of the entire parent address space', so even a small command can lead to an OutOfMemoryError when the parent process has acquired a large amount of memory.
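A rough popen("r") equivalent built from those pieces (popenRead is an illustrative name; the echo command assumes a Unix-like system):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class PopenDemo {
    // Runs a command and captures everything it writes, like popen(cmd, "r").
    static String popenRead(String... command)
            throws IOException, InterruptedException {
        ProcessBuilder builder = new ProcessBuilder(command);
        builder.redirectErrorStream(true);      // fold stderr into the result
        Process process = builder.start();
        StringBuilder output = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                output.append(line).append('\n');
            }
        }
        process.waitFor();
        return output.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.print(popenRead("echo", "hello from child"));
    }
}
```

For a popen(cmd, "w") equivalent, you would write to process.getOutputStream() instead and close it to signal EOF.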
A close relative of popen() is a named pipe used as input and/or output, as in UNIX:
mknod /tmp/mypipe.12345 p ; sort -o /tmp/mypipe.12345 /tmp/mypipe.12345 &
Then open /tmp/mypipe.12345, write, close; then open /tmp/mypipe.12345, read, close. Since sort cannot write anything until it sees EOF on its input, the open for output will occur after the close of the input. The popen() call cannot do this!
For simpler scenarios, the named pipe can just be read or written.
Of course, you still need to spin it off, as in a system(...) call.
You want to remove the named pipe when you are done. On some UNIX systems, /tmp is cleared upon reboot.
/tmp is shared so name collisions are quite possible. You can generate a partly random pipe file name (numeric part of /tmp/mypipe.12345) in Java to generally prevent this. In some systems, Bash creates named pipes in /var/tmp for every <(...) or >(...) use. Unfortunately, it is a bit of a challenge to determine when they can be removed without effect (fuser?)!
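From Java, the same trick can be sketched as follows, assuming a Unix-like system with mkfifo available (the JDK has no portable FIFO API, so the pipe is created by an external command; the class name and path are illustrative):

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;

public class NamedPipeDemo {
    // Creates a uniquely named FIFO, has an external writer send a line
    // through it, reads that line back, and cleans up the pipe.
    static String readThroughFifo() throws IOException, InterruptedException {
        // Partly random name to avoid collisions in the shared /tmp.
        File pipe = new File("/tmp/mypipe." + System.nanoTime());
        new ProcessBuilder("mkfifo", pipe.getPath()).start().waitFor();
        try {
            // The writer blocks opening the FIFO until we open it for
            // reading; the two opens rendezvous, as described above.
            Process writer = new ProcessBuilder("sh", "-c",
                    "echo through-the-pipe > " + pipe.getPath()).start();
            String line;
            try (BufferedReader reader =
                    new BufferedReader(new FileReader(pipe))) {
                line = reader.readLine();
            }
            writer.waitFor();
            return line;
        } finally {
            pipe.delete();   // remove the named pipe when done
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readThroughFifo());
    }
}
```

The explicit delete() in the finally block is the cleanup step mentioned above; nothing else removes the pipe for you.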