jvm forking duplicate child sporadically - java

I have a Java application that sporadically forks a child JVM to do some of its work. The child process looks exactly like the parent, including all of its arguments.
To confirm the parentage, I checked the parent PID, and it is always the original JVM.
The application runs as a plain Java process, i.e. not inside any server such as Apache or WebLogic.
Because both processes have the same arguments, they write to the same log files, and their open files are the same too.
This is new behaviour that was not present earlier. Could someone suggest what to check to find the cause?
JVM version: OpenJDK Runtime Environment (IcedTea 2.5.6) (7u79-2.5.6-0ubuntu1.14.04.1)
OS: Ubuntu 14.04

One possibility is an explicit call to the Runtime.exec method.
Something like the following:
Process process = Runtime.getRuntime().exec("java -jar myApp.jar");
This creates a new process as a child of the main process.
This can happen explicitly (as in the code above) or implicitly, through some external library that, for example, starts a new process depending on a configuration file.
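A minimal, self-contained sketch of the explicit case: it spawns a second JVM (here just running java -version, so the example terminates on its own) and, while the child runs, a second java process with the same parent is visible in ps, just as described in the question:

```java
import java.io.File;

public class ForkChild {
    public static void main(String[] args) throws Exception {
        // Resolve the current runtime's launcher so the example does not
        // depend on java being on the PATH.
        String java = new File(new File(System.getProperty("java.home"), "bin"),
                "java").getPath();

        // Explicitly spawn a child JVM; while it runs, a second java
        // process appears under this parent in the process table.
        Process child = Runtime.getRuntime().exec(new String[] { java, "-version" });
        int exit = child.waitFor();
        System.out.println("child JVM exited with " + exit);
    }
}
```

If the forking is not in your own code, this is the pattern to grep for in your dependencies: any use of Runtime.exec or ProcessBuilder.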

In our case, we were calling tar via ProcessBuilder, and it was hanging in forkAndExec. The fork succeeded, so the extra java process was visible, but the exec never completed: it spent most of its time hanging in the close syscall due to a high number of disk operations.
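For reference, a sketch of invoking tar via ProcessBuilder (the file names are placeholders; the example creates its own temp directory so it is self-contained). Inheriting the child's output streams also avoids a separate, common class of hangs where an undrained pipe buffer fills up:

```java
import java.nio.file.Files;
import java.nio.file.Path;

public class TarRunner {
    public static void main(String[] args) throws Exception {
        // Create a throwaway file so the example is self-contained.
        Path dir = Files.createTempDirectory("tar-demo");
        Files.writeString(dir.resolve("hello.txt"), "hello\n");

        ProcessBuilder pb = new ProcessBuilder(
                "tar", "-czf", dir.resolve("out.tar.gz").toString(),
                "-C", dir.toString(), "hello.txt");
        pb.inheritIO(); // avoid hangs from an undrained stdout/stderr pipe
        int exit = pb.start().waitFor();
        System.out.println("tar exited with " + exit);
    }
}
```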

Related

Spawn another process on same JVM

From a Java application I want to run another Java application on the same Java installation but in a separate process.
AFAIK for a new process I would use ProcessBuilder to run a command like
java -jar my.jar
but what if java is in a different directory, or should be java.exe because we are on Windows, or java.exe has some other name because the first application was jlinked and jpackaged?
Edit: What I learned meanwhile is that a jpackaged application comes with a native executable that sits in front of the JVM but passes all arguments to the application. That means it is no longer possible to specify an alternative jar to be executed, and some other mechanism is necessary.
If the jlink image used within jpackage-based apps is built without the --strip-native-commands flag, then the runtime image will contain bin/java (or bin/java.exe and bin/javaw.exe on Windows).
This means you should be able to determine a path that launches a new JVM, and that works both inside and outside of jpackage apps, based solely on the current runtime's java.home, appending extra VM arguments in the normal way for the classpath and the main class or module of the other application.
On Windows, the PATHEXT environment variable (echo %PATHEXT%) usually contains .EXE, so the following should resolve the java executable of the current runtime without needing to append the correct file extension on Windows:
String java = Path.of(System.getProperty("java.home"),"bin", "java").toString();
You can test the above inside your jpackaged and non-jpackaged app with a simple one-liner (note that this is not the proper way to use ProcessBuilder):
new ProcessBuilder(java, "-version").start().getErrorStream().transferTo(System.out);
Obviously, the above is no help if you wish to determine whether to use Windows console enabled java.exe versus non-console javaw.exe.
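Putting the pieces together, a small sketch that resolves the launcher from java.home and starts a second JVM with ProcessBuilder, handling the streams and exit code properly this time (java -version is used here as a stand-in for the real -jar or module arguments):

```java
import java.nio.file.Path;

public class LaunchJvm {
    public static void main(String[] args) throws Exception {
        // Resolve the launcher of the current runtime; on Windows no ".exe"
        // suffix is needed because process creation consults PATHEXT.
        String java = Path.of(System.getProperty("java.home"), "bin", "java")
                .toString();

        ProcessBuilder pb = new ProcessBuilder(java, "-version");
        pb.redirectErrorStream(true);   // -version prints to stderr
        Process p = pb.start();
        p.getInputStream().transferTo(System.out);
        System.out.println("exit code: " + p.waitFor());
    }
}
```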

Ktor creates an extra Java process

I am working on a Ktor server. I run the generated jar file using the java -jar command, so I expect only one Java process to be running. After running for a while, another Java process is created, bound to a different port.
I checked the details of the process using ps -a [PID] and found that this new Java process is the "kotlin-compiler-embeddable" program.
I am wondering why this process is created, what it is used for, and whether it is safe to kill it.
Thanks for any pointer.
kotlin-compiler-embeddable is used in scenarios where you have to package the compiler into a single jar with no external dependencies. This is the case for Ktor.
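As an alternative to ps, a sketch (Java 9+) that inspects such mystery processes from Java itself, using the ProcessHandle API to print the command line of every visible java process:

```java
public class ListJvms {
    public static void main(String[] args) {
        // Enumerate all processes visible to this JVM and print the command
        // lines of the java ones, to spot an unexpected child such as
        // kotlin-compiler-embeddable.
        long count = ProcessHandle.allProcesses()
                .filter(ph -> ph.info().command()
                        .map(c -> c.contains("java")).orElse(false))
                .peek(ph -> System.out.println(
                        ph.pid() + " -> " + ph.info().commandLine().orElse("?")))
                .count();
        System.out.println("java processes visible: " + count);
    }
}
```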

Java self-contained package wait for completion

I have an application that I need to be a self-contained app, installable, on computers that may not have Java. I'm using javapackager command to create an EXE that can be sent out to the users, containing all the parts needed. The app, in a simple sense, reads in a file referenced by the first param, does transformations on it, and writes back out next to the source file a result. All of that works when running it as a JAR directly, and also when running it via the built EXE.
The problem is that when triggering the executable, it immediately returns execution to the command prompt, rather than waiting for the process to finish. I don't want to have to poll the output directory to check if the file exists and then give some arbitrary timeout on when to stop looking - I want the app to know that once the console command has completed, the processing is done. At that time I can do logic based on if I find the result file, and alert if it is not found or whatever other logic is right.
Is there a way to tell the javapackager command to set a wait until Java has died (good or bad) before returning control? Barring that, is there a code snippet/concept that would make the app hold off releasing control back to the terminal until the JVM has died?
I can confirm this behaviour for apps that I've packaged using javapackager included with Oracle JDK 8, 9 and 10 as well as with snapshots from OpenJDK 13. While the resulting binaries behave "as expected" under macOS and Linux, the Windows binary merely seems to spawn a new process and exits immediately.
Beginning with Java 9, javapackager invokes jlink internally, which has an option --strip-native-commands that basically removes the launcher binaries from the embedded JRE. With Java 8 you don't have this option, and executables such as java.exe are always removed. But my tests show that the JRE is otherwise complete, and you should be able to simply include java.exe manually (test this thoroughly!).
Therefore I propose the following workaround:
1. Create the application directory using javapackager -deploy -native image.
2. Copy java.exe from your system-wide JRE installation to {OUTDIR}\runtime\bin (check whether your license covers redistributing java.exe!).
3. Create a .bat file inside {OUTDIR} that calls runtime\bin\java.exe -jar ./app/yourapp.jar.
4. Now add {OUTDIR} to your .msi or innosetup installer or .zip file or whatever means of distribution you choose ;-)
You can then invoke the .bat file instead of the .exe. Since it is just a wrapper around java -jar, it will wait until your program has finished.
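For step 3, the wrapper could look like the following (run.bat and yourapp.jar are placeholder names). Because cmd runs java.exe in the foreground, control only returns to the caller once the JVM has died, and the exit code is propagated:

```bat
@echo off
rem Wrapper sketch for {OUTDIR}\run.bat: start the bundled JRE's java.exe
rem synchronously so the caller waits for the JVM to exit.
runtime\bin\java.exe -jar app\yourapp.jar %*
exit /b %ERRORLEVEL%
```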

Scala - java.exe accessing internet?

Hm, so I set up Scala in order to start learning it.
When I compile a .scala script, though (i.e. running scala whatever.scala in the terminal), java.exe accesses the internet.
Why? Is that intended behaviour, or did I forget to configure something?
The script I run was fairly simple, if that should matter:
args.forall(println)
Scala compilation happens inside a JVM, so when you compile Scala, the java command is executed.
The Java JRE has a mechanism to update itself. When a new version is out, it asks the user (at least on Windows?) whether they want to install the new version.
It is possible that every time a java command is launched, it checks for updates.
Edit: it is also possible that this happens because in some cases a "compile server" is used for Scala. This means an extra JVM is spawned just for compilation and is kept alive after your initial compilation. The next compilation will then be faster, because the compile server is already running and all its classes are loaded.
It is possible that a client JVM communicates with the compile-server JVM using a network protocol.
Check some links:
http://blog.jetbrains.com/scala/2012/12/28/a-new-way-to-compile/
https://github.com/typesafehub/zinc

using javaw to run jars in batch files results in more than one java processes in process explorer - XYNTService

I have a somewhat strange issue. I have a Java application that installs a few services which run as jars. Previously I used the installed Java to run these jars. There are four services, and all of them are started from a single batch file with sequential calls. Something like this:
start %JAVA_HOME% commandtorunjarfile
would work, and all four services would run in the background with only one java.exe visible in Process Explorer. I had another service installed as a Windows service which would start and stop these services by calling the run or shutdown batch files.
Now the customer requirement has changed to using an internalized version of Java. I extract Java to a location, create our own environment variable named "ABC_HOME", and the required syntax in the batch file changes to
%ABC_HOME%\javaw commandtorunjarfile
When it's run, it works, but there is no stopping these processes. When I go to Process Explorer I see four java.exe processes, one for each of the four run commands in the batch file. If I stop the Windows service, all four keep running. If I restart the Windows service, the number of java.exe processes in Process Explorer goes up to eight and keeps growing until Windows has had enough of it.
How do I get around this? I think the solution should be to have one Java process in Process Explorer, but I can't seem to find any solution for that.
[EDIT]
The four sub-services are actually XYNT processes. In the normal scenario it would be something like this:
[Process1]
CommandLine = java -Xrs -DasService=yes -cp jarfiles
WorkingDir = c:\bin\scache
PauseStart = 1000
PauseEnd = 1000
UserInterface = No
Restart = Yes
For using Java from a specific location the following change was needed:
CommandLine = %JAVA_PATH%\bin\java.exe -Xrs -DasService=yes -cp jarfiles
but this wouldn't work, as XYNT does not expand the path variable in the XYNT.ini file. So I called a batch file there, and in that batch file I used the above command. Here is what the change looks like:
CommandLine = batchfile.bat
and in batchfile.bat:
%JAVA_PATH%\bin\java.exe -Xrs -DasService=yes -cp jarfiles
Usually, every Java program run on your system has its own virtual machine running, which means one java.exe/javaw.exe per instance of your program.
I cannot tell why it "worked" from your point of view with java.exe as you first described, but the behaviour you describe for javaw.exe (four Java processes in Process Explorer) is what I would have expected.
For me the question is not why you are seeing 4 vs. 1 Java processes, but how you can start/stop the "services". Killing the Java VM externally doesn't seem like a very good solution. I'd consider building some IPC into the Java services that allows you to gracefully terminate each process.
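One minimal form that IPC could take (a sketch, not a complete service): each service listens on a local port, and a connection to that port triggers a graceful shutdown. Here a demo client thread plays the role that the Windows service wrapper or a stop.bat would play in practice:

```java
import java.io.IOException;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class ShutdownListener {
    public static void main(String[] args) throws Exception {
        // The service listens on a loopback-only port; port 0 lets the OS
        // pick a free port for this demo (a real service would use a fixed,
        // per-service port).
        ServerSocket server = new ServerSocket(0, 1, InetAddress.getLoopbackAddress());
        int port = server.getLocalPort();
        System.out.println("service listening on port " + port);

        // Demo "stop" client; in reality the controlling Windows service
        // (or a stop.bat using e.g. a tiny client program) opens this
        // connection to request shutdown.
        new Thread(() -> {
            try (Socket s = new Socket(InetAddress.getLoopbackAddress(), port)) {
                // The connection itself is the shutdown signal.
            } catch (IOException e) {
                e.printStackTrace();
            }
        }).start();

        try (Socket ignored = server.accept()) {
            System.out.println("shutdown requested, cleaning up");
        }
        server.close();
        // ... release resources here, then let main return so the JVM exits ...
    }
}
```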
