Keeping a Java subprocess alive in the background - java

I have a class with a method that creates a process using ProcessBuilder. I then have two other methods that use global variables to write commands to the process and collect responses from it.
I want to start the process from another class and then be able to issue commands through those other methods whenever I wish from this original class. However, I can't see a way to keep the sub-process alive. If I use .waitFor() then I'm unable to run any other commands, as it hangs waiting for the sub-process to exit. But if I just let the create method complete, by the time I've returned to the original class and call the input method, the process has been killed. Is there any way to keep the process alive but asleep, allowing the program to continue while still being able to call back to it?

You need to run your subprocess in a new Thread. If you are not familiar with the concepts of multi-threading and concurrent processing, check out this article: http://www.vogella.com/tutorials/JavaConcurrency/article.html
Further, note that a daemon thread will not keep the JVM alive after your main program has finished; if the thread that manages your subprocess needs to keep running after main returns, leave it as a normal (user) thread. The subprocess itself, being a separate OS process, survives the JVM either way.
To learn more about threads and daemon threads, check out the Java documentation: http://docs.oracle.com/javase/8/docs/api/java/lang/Thread.html
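For illustration, here is a minimal sketch of the kind of wrapper class described in the question: the process is started once, its streams are kept as fields, and the other methods write commands and read responses later. The command ("python", "-i"), the class name, and the method names are placeholders, not taken from the question; reading responses could equally be moved to a background thread as suggested above.

import java.io.*;

public class ProcessWrapper {
    private Process process;
    private BufferedWriter stdin;
    private BufferedReader stdout;

    // Start the subprocess once and keep its streams for later calls.
    // No waitFor() here: the process keeps running after this method returns,
    // as long as the wrapper (and its streams) stay referenced.
    public void create() throws IOException {
        ProcessBuilder pb = new ProcessBuilder("python", "-i"); // placeholder command
        pb.redirectErrorStream(true);
        process = pb.start();
        stdin = new BufferedWriter(new OutputStreamWriter(process.getOutputStream()));
        stdout = new BufferedReader(new InputStreamReader(process.getInputStream()));
    }

    // Write one command to the subprocess's stdin.
    public void send(String command) throws IOException {
        stdin.write(command);
        stdin.newLine();
        stdin.flush();
    }

    // Read one line of response from the subprocess's stdout.
    public String readResponse() throws IOException {
        return stdout.readLine();
    }

    public void close() {
        process.destroy();
    }
}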

Related

JNI - Java exits before native threads finish executing

I'm in the early stages of developing an API in C++, which I'm wrapping in Java using JNI. The native code creates a socket listener thread using WinAPI which should run indefinitely, thereby keeping the program open indefinitely (tested and works fine).
However, when I try to invoke this code in Java, the JVM still terminates when it reaches the end of main, ignoring the running thread. A little research has hinted that Java might think the thread is a daemon rather than a "user thread". But if that's the case, then I can't quite figure out how to convince Java that it actually is a user thread.
Does anyone have a clue about this?
You need to call AttachCurrentThread() for all your native threads, to ensure Java knows about them, so it will wait for them to finish.
Windows doesn't have daemon threads. The process exits when ExitProcess() is called or when the initial thread returns from the application's main function. (In principle, it will also exit if the last thread exits, but that can't be relied upon because Windows may create threads in your process that you don't know about.)
The Java runtime presumably waits for all of its own threads to exit (except for those that it considers daemon threads) before exiting the process. But your threads were created directly via the Win32 API, so Java doesn't know about them and therefore won't wait for them.
If your API wants to continue performing some task beyond the natural lifetime of the calling process, it should probably create a child process rather than a thread. (Or, if the API is Java-specific, it can presumably make use of JNI to ask that Java create the thread on its behalf, or to register the thread with Java.)

Launch a windows batch file from Java GUI Application

I have created a Desktop Application using Java Swing. It takes some input from the user, then creates a config file and a batch file to run some Python scripts. My concerns are:
- I want the GUI to remain active (responsive) while the batch file execution is in progress
- There is a ShowLog button in the app to check the console output during execution; it should work when clicked
- I have a "Task in Progress" kind of message in the GUI which should be replaced by "Task is Completed" when the batch file execution is done
- A "Stop" button is also there to stop the batch file execution forcefully; that should work fine as well
(Note: The batch file execution will take hours to complete)
Can anybody come up with some ideas for how I can achieve all of this?
As you seem to be aware, Swing is a single-threaded framework, which means that anything that runs within the context of the Event Dispatching Thread (EDT) will prevent it from updating the screen or responding to user input.
The basic solution would be to use a Thread to run the batch process in, but this raises issues with synchronisation of updates to the UI, as you should never modify or interact with the UI from outside the context of the EDT.
A better solution would be to use a SwingWorker, which lets you run long-running tasks in the background while still being able to publish updates to, and process them within, the context of the EDT. It also provides a done method, which is called after the doInBackground method exits and which itself runs within the context of the EDT.
Finally, it provides you with a cancel option - this, however, is where the problem occurs. Presumably you will be reading the output from the process in a secondary thread and waiting for the process to exit within the same thread (the SwingWorker) you started it in. SwingWorker relies on the interrupt functionality of Thread, which may not cause the waitFor method to return.
Having now gone and read the Process documentation, waitFor does throw an InterruptedException:
if the current thread is interrupted by another thread while it is
waiting, then the wait is ended and an InterruptedException is thrown.
This would suggest that when done is called, you would need to call isCancelled to check whether the worker was cancelled. If it was, you would need to call destroy on the Process and shut down any secondary Threads you might have running.
You could use an additional SwingWorker to read the output from the process and utilise its publish/process functionality to update the logs.
This would mean, you would start a SwingWorker to execute your external process. This would presumably be done in response to some event, like a button push.
When this worker's doInBackground method is called, it would execute the external process and call Process#waitFor. This would stop the doInBackground method from returning until the process has exited.
Before you call Process#waitFor, you could create another SwingWorker and pass it the Process's output (the stream returned by getInputStream). This would allow that worker to process the output from the process independently. You would then be able to use this to send output of the process back to the EDT via the SwingWorker's publish/process functionality, where it could be appended to something like a JTextArea.
This would save you a lot of hassle with dealing with SwingUtilities.invokeLater.
Do you need the second worker? That depends on what you want the workers to do. I tend to process all the output of an external process in separate threads and let whoever created the process call waitFor; it isolates the responsibilities a little more and prevents the IO from getting locked up and never reaching waitFor, but that's just me.
Take a look at Concurrency in Swing for more details
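As a rough sketch of how those pieces could fit together (the batch file name, the JTextArea, and the label update are placeholder assumptions, not taken from the question):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.List;
import javax.swing.JTextArea;
import javax.swing.SwingWorker;

public class BatchWorker extends SwingWorker<Integer, String> {
    private final JTextArea log;       // updated only on the EDT, via process()
    private volatile Process process;

    public BatchWorker(JTextArea log) {
        this.log = log;
    }

    @Override
    protected Integer doInBackground() throws Exception {
        ProcessBuilder pb = new ProcessBuilder("cmd", "/c", "your_batch_file.bat");
        pb.redirectErrorStream(true);           // merge stderr into stdout
        process = pb.start();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                publish(line);                  // forwarded to process() on the EDT
            }
        }
        return process.waitFor();               // exit code of the batch file
    }

    @Override
    protected void process(List<String> lines) {
        for (String line : lines) {
            log.append(line + "\n");            // safe: runs on the EDT
        }
    }

    @Override
    protected void done() {
        if (isCancelled() && process != null) {
            process.destroy();                  // the "Stop" button called cancel(true)
        }
        // update the "Task in Progress" label to "Task is Completed" here
    }
}

The button handler would create the worker and call execute(); the Stop button would call cancel(true) on it, and done() would then destroy the process.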
You can run a bat file from Java with Runtime:
Runtime.getRuntime().exec("cmd /c start your_batch_file.bat");

how to run shell script asynchronously from within Java program

I want to run a shell script from within a Java program asynchronously -- i.e. after the Java program starts execution of that shell script, it carries on with other operations and does further work only when the shell script returns a response to it; it does not explicitly stop and wait for the shell script's response.
Is this possible/feasible? How do I implement such functionality?
Basically I will be monitoring multiple servers using a single server that manages all of them; to do this it will run shell scripts on each of those servers. Since there are many servers, and in Java it is recommended that the number of threads not exceed the number of CPU cores, I need a solution that does not depend on threading (because of that threading limitation), so that I can fire off many such shell scripts simultaneously (or near-simultaneously) without waiting for any one script's response (waiting would delay the other shell script commands). One more issue: the shell commands need to be invoked either on the local machine or on remote machines, and a response is needed from both kinds of shell script execution (local and remote).
Did you try anything?
You can try something like:
ProcessBuilder builder = new ProcessBuilder("your command to launch script");
Process process = builder.start();
And it does NOT wait by default for the process to complete, so you can execute your code next.
And if you want to do some processing after the process is finished you can try:
int exitVal = process.waitFor();
When you execute another process and want to obtain a result from it, you usually have to read the output of that process, as the process might block if its output buffer becomes full. The easiest way to achieve this is by having a single thread in your Java application which starts the script and then reads its output into some buffer. Other threads of the Java application can do whatever they want to do, and if the process is done, the thread can signal others about that event and then terminate.
I don't know where your recommendation to not use more threads than CPUs originates from, but I'd not hold with that in general. This is true for worker threads, where each active thread keeps one core busy, but in your case, most threads would be idle most of the time. There is some OS level resource overhead associated even with idle threads, so if there are really really many processes, using a single thread to read from all the streams would be better, but a lot more complicated.
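As a sketch of the single-reader-thread idea described in this answer, using a CompletableFuture so the caller continues immediately and is signalled once the script has finished (the script path and the /bin/sh invocation are assumptions for illustration):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.concurrent.CompletableFuture;

public class AsyncScriptRunner {
    // Launches the script and returns immediately; the future completes
    // with the script's output once it has exited.
    public static CompletableFuture<String> run(String scriptPath) {
        return CompletableFuture.supplyAsync(() -> {
            try {
                Process p = new ProcessBuilder("/bin/sh", scriptPath)
                        .redirectErrorStream(true)
                        .start();
                StringBuilder output = new StringBuilder();
                try (BufferedReader in = new BufferedReader(
                        new InputStreamReader(p.getInputStream()))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        output.append(line).append('\n'); // drain so the process can't block on a full buffer
                    }
                }
                p.waitFor();
                return output.toString();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        });
    }
}

A caller could then do run("/path/to/script.sh").thenAccept(out -> ...) and carry on with other work in the meantime; the remote case could wrap the same idea around an ssh invocation.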
You can use Runtime.exec or ProcessBuilder in a different thread than your application main thread to run your shell script asynchronously.
This post shows how to use Runtime or ProcessBuilder. Read this post to learn about Java threads if you are not familiar with them.

Is it possible to know if a process is waiting in Blocked state on a Receive() call on Linux?

My main purpose is to execute processes one by one in a round-robin fashion until one calls receive() and is blocked, so that execution switches to the next process in the queue. There is a controller application, coded in Java, which executes these processes (also Java applications) using Runtime.getRuntime().exec() and keeps the return values, which are Process objects.
To achieve this purpose, I need to capture the receive() calls(or their states, which is blocked) and tell them to the controller(master) application.
I can go as low-level as you want if this is possible. My first thought was to get this information from the driver and then pass it to my controller Java application. I have written a Linux kernel network module which captures the send and receive operations, but AFAIK the socket.receive() function does not tell the network driver anything.
So I think the options are to get this information from either the JVM, from a Linux command of some kind, or possibly through the Linux kernel module?
What are your suggestions?
If you want to know if your threads are blocked, or exactly what they are blocked on, you can either take a thread dump or use a tool like jvisualvm to attach to the process and take a look (in jvisualvm you would attach to the process, take a thread dump, and then look at the activity of each thread).
Have you looked at SystemTap? It should be readily available on recent Fedora systems.
I don't know if this will help you, but you could get information about the state of a Java thread on your machine using the local attach API; the steps are below, followed by a rough sketch.
1) Add tools.jar to your classpath and use VirtualMachine.list() to get a list of the running JVMs on your machine.
2) Attach to the JVM process using VirtualMachine.attach(virtualMachineDescriptor)
3) Get the local connector address: vm.getAgentProperties().get("com.sun.management.jmxremote.localConnectorAddress")
4) Use JMXConnectorFactory.newJMXConnector(...) to connect to the JVM
5) From the JMX connection, look up the ThreadMXBean
6) From the ThreadMXBean you get an array of ThreadInfos that describe all threads in the JVM.
7) From ThreadInfo#getThreadState() you can check whether the state is Thread.State.BLOCKED
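A rough end-to-end sketch of those steps (this assumes the target JVMs already have their local management agent started, and on older JDKs it also needs tools.jar on the classpath; the names and output are illustrative only):

import com.sun.tools.attach.VirtualMachine;
import com.sun.tools.attach.VirtualMachineDescriptor;
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;
import javax.management.MBeanServerConnection;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class BlockedThreadChecker {
    public static void main(String[] args) throws Exception {
        // 1) List the JVMs running on this machine.
        for (VirtualMachineDescriptor vmd : VirtualMachine.list()) {
            // 2) Attach to the JVM process.
            VirtualMachine vm = VirtualMachine.attach(vmd);
            try {
                // 3) Get the local JMX connector address (assumes the target JVM's
                //    local management agent is already running).
                String address = vm.getAgentProperties()
                        .getProperty("com.sun.management.jmxremote.localConnectorAddress");
                if (address == null) {
                    continue; // no management agent in that JVM
                }
                // 4) Connect to the JVM over JMX.
                JMXConnector connector = JMXConnectorFactory.connect(new JMXServiceURL(address));
                try {
                    MBeanServerConnection mbsc = connector.getMBeanServerConnection();
                    // 5) Look up the remote ThreadMXBean.
                    ThreadMXBean threads = ManagementFactory.newPlatformMXBeanProxy(
                            mbsc, ManagementFactory.THREAD_MXBEAN_NAME, ThreadMXBean.class);
                    // 6) and 7) Walk the ThreadInfos and check each thread's state.
                    for (ThreadInfo info : threads.getThreadInfo(threads.getAllThreadIds())) {
                        if (info != null && info.getThreadState() == Thread.State.BLOCKED) {
                            System.out.println(vmd.id() + ": " + info.getThreadName() + " is BLOCKED");
                        }
                    }
                } finally {
                    connector.close();
                }
            } finally {
                vm.detach();
            }
        }
    }
}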
You should use interprocess communication primitives in your worker processes to notify the controller application that they are ready to receive data.
You can't make assumptions about how the child processes implement their socket read. They could be using recv, or select, or poll, etc., to wait for network data.
There are actually a few points here. The Linux scheduler is smart enough to pre-empt a blocked task. Meaning, if you call receive() and there's nothing waiting to receive, your task will probably be put to sleep until such a time that the call will return. You don't need to handle the scheduling; the Linux kernel will do it for you.
That said, if you need to know whether your task is blocked from some daemon application, if you're willing to write an LKM, why not just get the task in the task list that you're interested in, and check its state?
Of course, simply checking the state of the task might not tell you exactly what you want. If your task state is TASK_INTERRUPTIBLE, it only tells you that your task is waiting on something, but it might not be a trivial matter to figure out what that something is. Similarly, your task can be in a TASK_RUNNING state and not actually be running on the CPU at the current moment (but, at least, in the TASK_RUNNING state you know your task isn't blocked).
You can just send a QUIT signal (Ctrl-\ on the console) to get a thread dump.

Is it possible to kill a Java Virtual Machine from another Virtual Machine?

I have a Java application that launches another java application. The launcher has a watchdog timer and receives periodic notifications from the second VM. However, if no notifications are received then the second virtual machine should be killed and the launcher will perform some additional clean-up activities.
The question is: is there any way to do this using only Java? So far I have had to use some native methods to perform this operation, and it is somewhat ugly.
Thanks!
I may be missing something but can't you call the destroy() method on the Process object returned by Runtime.exec()?
You can use java.lang.Process to do what you want. Once you have created the nested process and have a reference to the Process instance, you can get references to its standard out and err streams. You can periodically monitor those, and call .destroy() if you want to close the process. The whole thing might look something like this:
Process nestedProcess = new ProcessBuilder("java", "mysubprocess").start(); // separate args, not one string
InputStream nestedStdOut = nestedProcess.getInputStream(); // the child's stdout (kinda backwards, I know)
InputStream nestedStdErr = nestedProcess.getErrorStream();
while (true) {
    /*
     * TODO: read from the std out or std err (or get notifications some other way),
     * then put the real "kill-me" logic here instead of if (false)
     */
    if (false) {
        nestedProcess.destroy();
        // perform post-destruction cleanup here
        return;
    }
    Thread.sleep(1000L); // wait for a bit (the enclosing method must handle InterruptedException)
}
You could also publish a service (via Burlap, Hessian, etc.) on the second JVM that calls System.exit(), and consume it from the watchdog JVM. Note, though, that if the second JVM has stopped sending those periodic notifications, it might not be in a state to respond to the service call.
Calling shell commands with java.lang.Runtime.exec() is probably your best bet.
The usual way to do this is to call Process.destroy()... however, it is an incomplete solution: when using the Sun JVM on *nix, destroy() maps onto SIGTERM, which is not guaranteed to terminate the process (for that you need SIGKILL as well). The net result is that you can't do real process management using Java.
There are some open bugs about this issue see:
link text
OK, the twist of the story is as follows:
I was using the Process API to close the second virtual machine, but it wouldn't work.
The reason is that my second application is an Eclipse RCP Application, and I launched it using the eclipse.exe launcher included.
However, that means that the Process API's destroy() method will target the eclipse.exe process. Killing this process leaves the Java process unscathed. So, one of my colleagues here wrote a small application that will kill the right process.
So one of the solutions for using the Process API (and removing redundant middle steps) is to do away with the Eclipse launcher, having my first virtual machine duplicate all of its functionality.
I guess I will have to get to work.
java.lang.Process has a waitFor() method to wait for a process to die, and a destroy() method to kill the subprocess.
You can have the Java code detect the platform at runtime and fire off the platform's kill-process command. This is really a refinement of your current solution.
There's also Process.destroy(), if you're using the ProcessBuilder API
Not exactly process management, but you could start an rmi server in the java virtual machine you are launching, and bind a remote instance with a method that does whatever cleanup required and calls System.exit(). The first vm could then call that remote method to shutdown the second vm.
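A minimal sketch of that RMI approach (the interface name, registry port, and binding name are placeholder assumptions):

import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.server.UnicastRemoteObject;

// Shared remote interface, visible to both VMs (Shutdown.java).
public interface Shutdown extends Remote {
    void shutdown() throws RemoteException;
}

// In the second (launched) VM, bind an instance at startup:
//   java.rmi.registry.Registry registry = java.rmi.registry.LocateRegistry.createRegistry(1099);
//   registry.rebind("shutdown", new ShutdownImpl());
class ShutdownImpl extends UnicastRemoteObject implements Shutdown {
    ShutdownImpl() throws RemoteException { }

    @Override
    public void shutdown() {
        // perform whatever cleanup is required, then exit this VM
        System.exit(0);
    }
}

// In the launcher VM, when the watchdog decides the second VM has to go:
//   java.rmi.registry.Registry registry = java.rmi.registry.LocateRegistry.getRegistry("localhost", 1099);
//   ((Shutdown) registry.lookup("shutdown")).shutdown();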
You should be able to do that with java.lang.Runtime.exec and shell commands.
