I am using alpn-boot to add support for HTTP/2 requests with OkHttp. For that to work I always have to launch my compiled jar file from the command line and specify the path of the alpn-boot.jar file with an -Xbootclasspath argument:
java -jar -Xbootclasspath/p:<path_to_alpn-boot.jar> <path_to_myjar.jar>
Is there any way to avoid this and make the jar run with the -Xbootclasspath argument specified by default?
Edit: I thought about using a batch file to do this, but is there no other native way?
A batch file is convenient and standard enough. Unless you have a specific reason not to use one, a batch file will do the job.
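For example, a minimal wrapper could look like this (the paths here are placeholders, substitute your own):
@echo off
rem run.bat -- launches the application with alpn-boot prepended to the boot classpath
java -Xbootclasspath/p:C:\libs\alpn-boot.jar -jar C:\apps\myapp.jar %*
The %* at the end forwards any arguments given to the batch file on to your application.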
Related
I have written a small application with IntelliJ that parses a large XML file using SAX.
I pass the -DentityExpansionLimit=0 option to my application by going to Run > Edit Configurations... and setting the VM options there.
It works perfectly when I run the application from IntelliJ, but when I build the artifact with IntelliJ and run it, I get the very error that made me set that option in the first place. Obviously the option is not passed on to the created jar file.
How should I achieve this goal?
Is there a command I can put in a batch file or similar to set this option for my user? Is there a settings file I can modify to set this option machine-wide? (I use Windows 10.)
Usually, to pass system properties to a jar, the command looks something like this:
java -DentityExpansionLimit=0 -jar thejar.jar
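A quick way to verify that the property actually reached the JVM is to print it from your own code. A minimal sketch (the class name is made up):
public class PropCheck {
    public static void main(String[] args) {
        // Prints the value passed via -DentityExpansionLimit=..., or null if it was not set
        System.out.println(System.getProperty("entityExpansionLimit"));
    }
}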
You are mixing up two things here:
the JVM command line, where you can pass arguments to your application or properties to the JVM itself
your deployment artifact (probably a JAR file)
Meaning: it seems like you want to pass either command-line arguments (to your main method) or properties to your application. But the JAR file itself has no support for that.
JAR files are just containers of class files. You can add some meta information via the manifest (for example, which class to run), but that is about it. You can't magically push your IntelliJ run-configuration settings into the JAR.
In other words: IntelliJ has no way of putting these values into your JAR.
When you invoke java -jar Your.jar ..., then you (or some other tooling) have to add the required values to the command line.
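To illustrate the distinction (the input file name is made up): JVM properties go before the JAR name, program arguments go after it:
java -DentityExpansionLimit=0 -jar Your.jar someInputFile.xml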
I have built a jar file which has a log4j.properties file in it (mvn package put it there by default from the resources directory). But when I run this jar, I want to pass a different logging config, so I add -Dlog4j.configuration=file:{path to file}. The issue that bugs me is that the order matters, as follows:
When I run java -jar {path to jar} -Dlog4j.configuration=file:{path to file}, it reads the config packaged in the jar.
When I run java -Dlog4j.configuration=file:{path to file} -jar {path to jar}, it reads the config from the file I pass on the command line.
I have a rough understanding of how classpaths work in Java, and I know that if I were to load several Java classes with the same name, the ordering would make a difference. But here I am passing a config parameter with a -D prefix, so I expect some code in the log4j library to check whether log4j.configuration is set and, if so, load the config from there, otherwise look for it on the classpath.
Any ideas on what I am missing?
Anything you provide after the JAR file name is treated as an argument to your main method. For Log4j you actually have to define a system property, and that needs to appear before the JAR file name on the command line.
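A minimal sketch that makes the behavior visible (class name invented): anything after the JAR name lands in args[], while -D values given before it show up as system properties:
public class ArgsDemo {
    public static void main(String[] args) {
        for (String a : args) {
            System.out.println("program argument: " + a);
        }
        System.out.println("log4j.configuration = " + System.getProperty("log4j.configuration"));
    }
}
Run it both ways from the question and you will see the trailing -Dlog4j.configuration=... printed as a plain program argument rather than set as a property.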
I have a jar application that works with an external properties file. The file is used so that the user can override default properties. The default and core properties are part of the build so no problem there.
Normally I would do something like one of these:
java -jar MyAwesomeApp.jar user.properties
OR
java -jar MyAwesomeApp.jar -Dmyapp.userproperties=user.properties
But with Hadoop, the jar gets executed inside the Hadoop framework like this:
/bin/hadoop jar MyAwesomeApp.jar input output
And no matter where I put the -D, I can't get the value via System.getProperty(...). The property is simply not set. The Hadoop documentation says that -D is a GENERIC OPTION and goes right after the command, but if I do that I get an error that -D is not a valid jar file (duh...).
I aim to keep the application as clean as possible...so I only want to pass the user configuration as a parameter as a last resort, i.e.
/bin/hadoop jar MyAwesomeApp.jar input output user.properties
I hope someone can tell me what I have to do to get the -D working :/
Hadoop is running in pseudo-distributed mode, so I am actually using HDFS too...
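In case the generic -D route stays blocked, here is a minimal sketch of the last-resort approach from the question, assuming the properties file path is passed as a third program argument:
import java.io.FileInputStream;
import java.util.Properties;

public class MyAwesomeApp {
    public static void main(String[] args) throws Exception {
        Properties userProps = new Properties();
        // args: input, output, and optionally a user.properties path
        if (args.length > 2) {
            try (FileInputStream in = new FileInputStream(args[2])) {
                userProps.load(in);
            }
        }
        // ... merge userProps over the built-in defaults and run the job
    }
}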
Assume that I have a Java program A. Program A needs to call another Java program B (a jar), pass arguments to it, and receive a return value from B. How can I achieve this?
The easiest solution would be to have the jar as a build-time dependency and invoke it statically from your code.
Please check this: How to run a jar file from a separate jar file?
The above is more applicable to your case than Execute another jar in a java program
I believe your question is answered here: Execute another jar in a java program. Depending on your runtime environment, you may need to specify the path to java and/or to the jar file.
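A minimal sketch of the external-process approach from those links, assuming java is on the PATH and B.jar (a hypothetical name) is an executable jar; the exit code and standard output serve as B's "return value":
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class CallB {
    public static void main(String[] args) throws Exception {
        // Start B.jar in a separate JVM and pass it two arguments
        ProcessBuilder pb = new ProcessBuilder("java", "-jar", "B.jar", "arg1", "arg2");
        pb.redirectErrorStream(true);
        Process p = pb.start();

        // Read whatever B prints to stdout
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println("B says: " + line);
            }
        }
        System.out.println("B exited with code " + p.waitFor());
    }
}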
All we have to do is introduce an entry into the JAR file's manifest (MANIFEST.MF in the JAR's META-INF subdirectory), like this:
Main-Class: com.tedneward.jars.Hello
All a user has to do to execute the JAR file now is specify its filename on the command-line, via java -jar outapp.jar.
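For reference, the jar tool can write that manifest entry for you via the e flag (assuming your compiled classes live under a classes directory):
jar cfe outapp.jar com.tedneward.jars.Hello -C classes .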
If I have a jar that I need to run using java -jar FOO.jar on unix, does this depend on the read, write or execute bit? Or some combination thereof?
You just need read on the .jar, since java is what you're executing, and it reads the jar.
You will need read, since the jar is effectively executed by java (and java itself, of course, needs execute permission). But if you have a program that stores some data inside its own jar (it can happen, for example to save settings), I would suggest also setting the write bit.
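For completeness, a quick way to confirm this on a Unix shell (file name taken from the question):
chmod 400 FOO.jar
java -jar FOO.jar
Owner read permission alone (mode 400) is already enough for the jar to run.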