Child process stops when Thread.sleep() is called (in Java under Windows) - java

I have a Java application that launches an external process (Internet Explorer) using ProcessBuilder. Strangely enough, this child process freezes when the parent Java thread calls Thread.sleep. It does not happen with all processes, for instance Firefox, but with IE it happens all the time.
Any ideas?
P.S. I tried Robot.delay() with the same effect.

How are you consuming the child process's stdout and stderr? It may be worth posting your code.
You need to consume the output streams concurrently, otherwise either your stdout or stderr buffer will fill up and your child process will block. See here for more details.
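A minimal sketch of that two-reader pattern, using "java -version" as a stand-in child process (it conveniently writes to stderr); names like GobblerDemo are illustrative, not from the original posts:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class GobblerDemo {
    // Drains an InputStream on its own thread so the child never blocks
    // on a full stdout or stderr pipe buffer.
    static Thread gobble(InputStream in, StringBuilder sink) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    synchronized (sink) { sink.append(line).append('\n'); }
                }
            } catch (IOException ignored) {}
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("java", "-version").start();
        StringBuilder out = new StringBuilder(), err = new StringBuilder();
        Thread t1 = gobble(p.getInputStream(), out);  // child's stdout
        Thread t2 = gobble(p.getErrorStream(), err);  // child's stderr
        int exit = p.waitFor();   // safe: both pipes are being drained
        t1.join();
        t2.join();
        System.out.println("exit=" + exit);
    }
}
```

Both streams must have a dedicated reader before waitFor(), otherwise a chatty child (like IE's) can deadlock against the parent.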

Related

Proper shutdown of JVM when launching from C++

I'm launching a JVM from C++ code via JNI. The problem is that when my C++ process simply quits, some of the JVM's shutdown hooks don't run, so some temporary resources are left behind; in my particular case this prevents the JVM from launching the next time I start a C++ process.
I tried jvm->DestroyJavaVM(), but even after all my process windows were closed, I could still see the process running. What's the best way to ensure that the JVM shuts down properly when launched via JNI?
Thanks!
First of all, jvm->DestroyJavaVM() won't return until all non-daemon JVM threads have stopped; it does nothing but wait for them, so you should stop them on the Java side.
Secondly, System.exit will shut down the whole process.
So what you really need to do is check your Java code for which threads are still running, for example the background message-loop thread of a UI framework such as GWT or Swing.
The easiest way is to call System.exit via JNI.
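To illustrate why a clean System.exit matters here: shutdown hooks (which typically clean up the temp resources mentioned above) run only when the JVM exits normally. A small demo:

```java
public class HookDemo {
    public static void main(String[] args) {
        // Shutdown hooks run on System.exit and normal JVM termination;
        // they are skipped if the hosting process is simply killed.
        Runtime.getRuntime().addShutdownHook(new Thread(() ->
                System.out.println("hook ran")));
        System.out.println("exiting");
        System.exit(0);  // runs the hook, then terminates the process
    }
}
```

Calling System.exit via JNI (e.g. through CallStaticVoidMethod on java.lang.System) gives the same clean teardown from the C++ side.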

I/O completion ports and stdout processing

I'm using I/O completion ports for a process management library (yes, there's a reason for this). You can find the source for what I'm talking about here: https://github.com/jcommon/process/blob/master/src/main/java/jcommon/process/platform/win32/Win32ProcessLauncher.java (take a look at lines 559 and 1137 -- yes, that class needs to be refactored and cleaned up).
I'm launching a child process and using named pipes (not anonymous pipes, because I need asynchronous, overlapped ReadFile()/WriteFile()) in order to process the child process's stdout and stderr. This mostly works. In a test, I launch 1,000 concurrent processes and monitor their output, ensuring they emit the proper information. Typically either all 1,000 work fine, or 998 of them do, leaving a couple that have problems.
Those couple of processes are showing that not all their messages are being received. I know the message is being output, but the thread processing GetQueuedCompletionStatus() for that process returns from the read with ERROR_BROKEN_PIPE.
The expected behavior is that the OS (or the C libs) would flush any remaining bytes on the stdout buffer upon process exit. I would then expect for those bytes to be queued to my iocp before getting a broken pipe error. Instead, those bytes seem to disappear and the read completes with an ERROR_BROKEN_PIPE -- which in my code causes it to initiate the teardown for the child process.
I wrote a simple application to test and figure out the behavior (https://github.com/jcommon/process/blob/master/src/test/c/stdout-1.c). This application disables buffering on stdout so all writes should effectively be flushed immediately. Using that program in my tests yields the same issues as launching "cmd.exe /c echo hi". And at any rate, shouldn't the application (or OS?) flush any remaining bytes on stdout when the process exits?
The source is in Java, using direct-mapped JNA, but should be fairly easy for C/C++ engineers to follow.
Thanks for any help you can provide!
Are you sure that the broken pipe error isn't occurring with a non-zero ioSize? If ioSize is not zero, then you should process the data that was read, as well as noting that the file is now closed.
My C++ code which does this basically ignores ERROR_BROKEN_PIPE and ERROR_HANDLE_EOF and simply waits for either the next read attempt to fail with one of the above errors or the current read to complete with zero bytes read. The code in question works with files and pipes and I've never seen the problem that you describe when running the kind of tests that you describe.
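As a hypothetical sketch (not the poster's actual JNA code), the decision logic described above can be modeled like this: any bytes delivered with the completion must be consumed before the pipe is treated as closed, and ERROR_BROKEN_PIPE / ERROR_HANDLE_EOF are end-of-stream markers rather than hard failures:

```java
import java.util.ArrayList;
import java.util.List;

public class CompletionLogic {
    // Win32 error codes (values from winerror.h)
    static final int ERROR_SUCCESS = 0;
    static final int ERROR_HANDLE_EOF = 38;
    static final int ERROR_BROKEN_PIPE = 109;

    // Models the handling of one GetQueuedCompletionStatus() result.
    static List<String> handle(int error, int ioSize) {
        List<String> actions = new ArrayList<>();
        if (ioSize > 0) {
            // Consume the payload even if the completion also reports an error.
            actions.add("consume " + ioSize + " bytes");
        }
        if (error == ERROR_BROKEN_PIPE || error == ERROR_HANDLE_EOF
                || (error == ERROR_SUCCESS && ioSize == 0)) {
            actions.add("close");            // end of stream: tear down now
        } else {
            actions.add("issue next read");  // keep the pipe alive
        }
        return actions;
    }

    public static void main(String[] args) {
        System.out.println(handle(ERROR_BROKEN_PIPE, 17));
        System.out.println(handle(ERROR_SUCCESS, 0));
    }
}
```

Tearing down on the first ERROR_BROKEN_PIPE without first checking ioSize is exactly how those last few bytes get dropped.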

how to run shell script asynchronously from within Java program

I want to run a shell script from within a Java program asynchronously, i.e. after the Java program starts execution of the shell script, it carries on with other operations, and does further work only once the shell script returns a response; it does not explicitly stop and wait for the script's response.
Is this possible/feasible? How do I implement such functionality?
Basically, I will be monitoring multiple servers from a single managing server, which will run shell scripts on each of them. Since there are many servers, and in Java it's recommended that the number of threads not exceed the number of CPU cores, I need a solution that doesn't depend on threading (because of those threading limitations). That way I can fire off many such shell scripts simultaneously (or near-simultaneously) without waiting for any one script's response, since waiting would delay the other shell commands. One more issue: the shell commands need to be invoked either on the local machine or on remote machines, and a response is needed from both kinds of execution (local and remote).
did you try anything?
you can try something like:
ProcessBuilder builder = new ProcessBuilder("your command to launch script");
Process process = builder.start();
And it does NOT wait by default for the process to complete, so you can execute your code next.
And if you want to do some processing after the process is finished you can try:
int exitVal = process.waitFor();
When you execute another process and want to obtain a result from it, you usually have to read the output of that process, as the process might block if its output buffer becomes full. The easiest way to achieve this is by having a single thread in your Java application which starts the script and then reads its output into some buffer. Other threads of the Java application can do whatever they want to do, and if the process is done, the thread can signal others about that event and then terminate.
I don't know where your recommendation not to use more threads than CPUs comes from, but I wouldn't hold to it in general. It's true for worker threads, where each active thread keeps one core busy, but in your case most threads would be idle most of the time. There is some OS-level resource overhead even for idle threads, so if there are really very many processes, using a single thread to read from all the streams would be better, but also a lot more complicated.
You can use Runtime.exec or ProcessBuilder in a different thread than your application main thread to run your shell script asynchronously.
This post shows how to use Runtime or ProcessBuilder. Read this post to learn about Java threads if you are not familiar with them.
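Putting the pieces above together, here is a minimal sketch of running a command on a background thread so the caller never blocks. The class name and the use of "java -version" as a stand-in for the shell script are illustrative; Redirect.DISCARD (Java 9+) drops the output, and you would read it instead if you need the script's response:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncScript {
    private static final ExecutorService pool = Executors.newCachedThreadPool();

    // Starts the command and returns immediately; the Future completes
    // with the exit code once the process terminates.
    static Future<Integer> runAsync(String... cmd) {
        return pool.submit(() -> {
            Process p = new ProcessBuilder(cmd)
                    .redirectErrorStream(true)                        // merge stderr into stdout
                    .redirectOutput(ProcessBuilder.Redirect.DISCARD)  // drop output so nothing blocks
                    .start();
            return p.waitFor();
        });
    }

    public static void main(String[] args) throws Exception {
        Future<Integer> result = runAsync("java", "-version");
        // ... the caller is free to do other work while the script runs ...
        System.out.println("exit=" + result.get());  // collect the result when needed
        pool.shutdown();
    }
}
```

A cached thread pool keeps idle threads cheap, which addresses the thread-count worry: the per-script threads spend nearly all their time blocked in waitFor(), not occupying cores.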

Launch a process without consuming its output

I used this line to execute a python script from a java application:
Process process = Runtime.getRuntime().exec("python foo.py", null, directory);
The script runs a TCP server that communicates with my java app and other clients.
When I was debugging the script I had a few console prints here and there, and everything was fine. As soon as the script was launched from Java code, after a fixed amount of time my TCP server stopped responding. After some debugging and frustration, I removed my prints from the script and everything worked as expected.
It seems that some memory is allocated for the process's standard output and error streams, and if you don't consume it, the process gets stuck trying to write to a full buffer.
How to launch the process so that I don't have to consume the standard output stream? I'd like to keep the prints though for debugging, but don't want to start a thread that reads a stream that I don't need.
You have to consume the child process's output, or eventually it will block because the output buffer is full (don't forget about stderr, too). If you don't want to modify your Java program to consume it, perhaps you could add a flag to your script to turn off debugging altogether or at least direct it to a file (which could be /dev/null).
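A minimal sketch of the redirect-to-file approach (Java 7+), using "java -version" as a stand-in for "python foo.py". Once the output goes to a file, the OS drains the stream for you, so no reader thread is needed and the debug prints are preserved:

```java
import java.io.File;

public class NoConsume {
    public static void main(String[] args) throws Exception {
        // Hypothetical log location; use new File("/dev/null") to discard instead.
        File log = File.createTempFile("pyserver", ".log");
        Process p = new ProcessBuilder("java", "-version")  // stand-in for "python foo.py"
                .redirectErrorStream(true)  // fold stderr into stdout
                .redirectOutput(log)        // JVM no longer pipes the output; nothing can fill up
                .start();
        System.out.println("exit=" + p.waitFor());
    }
}
```

The log file then doubles as a persistent trace of the script's prints.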
Java exposes ProcessBuilder. This is something you can use to execute what you want. This does a fork-exec and also lets you handle the output and error streams.
The fact that your process hangs is not because of Java. It's a standard problem: the process blocks because the stream's buffer is full and no one is consuming it.
Try using the ProcessBuilder with a thread that reads the streams.
I did something similar in the past: I redirected the process's output to a log file specific to that process. Later you can look at the log to see what happened, and you keep the trace output from your Python script.

Spawn a process in Java that survives a JVM shutdown

I need to spawn a process in Java (under Linux exclusively) that will continue to run after the JVM has exited. How can I do this?
Basically the Java app should spawn an updater which stops the Java app, updates files and then starts it again.
I'm interested in a hack & slash method to just get it working as well as a better design proposal if you have one :)
If you're spawning the process using java.lang.Process it should "just work" - I don't believe the spawned process will die when the JVM exits. You might find that the Ant libraries make it easier for you to control the spawning though.
It does actually "just work", unless you're trying to be clever.
My wrapped java.lang.Process was trying to capture the script's output, so when the JVM died, the script didn't have anywhere to send output so it just dies. If I don't try to capture the output, or the script doesn't generate any or redirects everything to a file or /dev/null, everything works as it should.
I was having trouble with this and the launched process was getting killed when the JVM shutdown.
Redirecting stdout and stderr to a file fixed the issue. I guess the process was tied to the launched java app as by default it was expecting to pass its output to it.
Here's the code that worked for me (minus exception handling):
ProcessBuilder pb = new ProcessBuilder(cmd);
pb.redirectOutput(logFile);
pb.redirectError(logFile);
Process p = pb.start();
I thought the whole point of Java was that it's fully contained within the JVM. It's kinda hard to run bytecode when there's no runtime.
If you're looking to have a totally separate process you might look into trying to start a second java.exe instance. Although for your application, it might be easier to simply make a synchronized block that stops (but doesn't kill) your app, does the updating, and then re-initializes your app's data.
It won't always "just work". When the JVM spawns the child and then shuts down, the child process will also shut down in some cases. That is expected behaviour of the process. Under Win32 systems, it just works.
E.g. If WebLogic server was started up by a Java process, and then that process exits, it also sends the shutdown signal to the WebLogic via shutdown hook in JVM, which causes WebLogic to also shutdown.
If it "just works" for you, then there is no problem. However, if you find that the child process also shuts down with the JVM, it's worth having a look at the "nohup" command. The process will then ignore the SIGHUP signal that would otherwise terminate it, but still responds to SIGKILL and operates normally otherwise.
Update: the approach described above is a bit of overkill. Another way of doing this is to append "&" to the end of the command, which spawns a new process that is not a child of the current Java process.
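A minimal sketch of the nohup approach on Linux, with "sleep 30" standing in for a hypothetical updater script. The "&" backgrounds the job inside the shell, and the redirects detach it from the JVM's pipes, so it keeps running after the JVM exits (which also avoids the captured-output problem described above):

```java
public class Detach {
    public static void main(String[] args) throws Exception {
        // The shell backgrounds the job and exits immediately; the job's
        // stdout/stderr no longer point at the JVM, so it survives JVM exit.
        Process sh = new ProcessBuilder("sh", "-c",
                "nohup sleep 30 >/dev/null 2>&1 &").start();
        System.out.println("shell exit=" + sh.waitFor());
    }
}
```

Redirecting to a real log file instead of /dev/null keeps the updater's output inspectable afterwards.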
P.S. Sorry for so many updates, I have been learning and trying it from scratch.
>>don't believe the spawned process will die when the JVM exits.
The child process always dies on my box (SuSE) whenever I kill the Java process. I think the child process dies if it's dealing with the I/O of the parent (i.e. Java) process.
If you're looking at making an updater on Linux, you're probably barking up the wrong tree. I believe all major linux distros have a package manager built in. You should use the package manager to do your updating. Nothing frustrates me more than programs that try to self-update... (I'm looking at you, Eclipse)
