I'm trying to create a .bat file to run my generated executable JAR file. I found this method of creating .bat files for running a project. So, I read up on the plugin here and added the following to my pom.xml.
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>appassembler-maven-plugin</artifactId>
    <version>1.10</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>assemble</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <assembleDirectory>${assembleDir}</assembleDirectory>
        <generateRepository>false</generateRepository>
        <repositoryName>lib</repositoryName>
        <configurationDirectory>conf</configurationDirectory>
        <copyConfigurationDirectory>false</copyConfigurationDirectory>
        <programs>
            <program>
                <mainClass>com.companyname.tests.TestRunner</mainClass>
                <id>AutoConfigTest</id>
            </program>
        </programs>
    </configuration>
</plugin>
And, yes, as the name suggests, this JAR contains JUnit test cases.
I prevented the plugin from unpacking JARs and creating the repo folder, and pointed it instead to my already generated lib folder, which contains all the JARs (the executables and the dependencies). The .bat file is being generated, but when running it I'm getting the following error.
Error: Could not find or load main class com.companyname.tests.TestRunner
Also, I want the command prompt to stay open after execution. In this case it is closing immediately. Maybe it's because I'm getting an error; I'm not sure.
So, I got into searching again and found this. But as the accepted answer suggests, my pom.xml already contains -
<packaging>jar</packaging>
The assembled directory is -
AutoConfigTest
|
|--bin
| `- contains the .bat file
|--conf
| `- contains the property files and other configuration files
|--lib
`- contains all the JARs
What am I doing wrong here?
Maybe it's related to this (from the README.md):
All dependencies and the artifact of the project itself are placed in a generated Maven repository in a defined assemble directory. All artifacts (dependencies + the artifact from the project) are added to the classpath in the generated bin scripts.
In your pom.xml you prevent the generation of that repository, so you need to ensure that the artifact from the project is copied to the expected place.
Assuming the following project settings
<groupId>com.companyname</groupId>
<artifactId>Maven-AppAssembler</artifactId>
<version>0.0.1-SNAPSHOT</version>
the artifact is expected to be at (see the CLASSPATH setting in the script bin/AutoConfigTest)
"$REPO"/com/companyname/Maven-AppAssembler/0.0.1-SNAPSHOT/Maven-AppAssembler-0.0.1-SNAPSHOT.jar
where $REPO resolves to target/appassembler/lib.
I found the issue. @SubOptimal was correct in pointing out that the main class isn't visible to the batch file.
For some reason, the test JAR file (which contains the main class) isn't being added to the classpath variable of the batch file. As a result, whenever I ran the batch file, I was getting the error mentioned in the question.
I went back to the documentation and found this.
Sometimes it happens that you have many dependencies which means having a very long classpath, and becomes too long (in particular on Windows based platforms). This option can help in such situation. If you activate this option, your classpath contains only a classpath wildcard (REPO/*). But be aware that this works only in combination with Java 1.6 and above and with repositoryLayout flat.
So, instead of adding individual JAR files into the path, I added the whole lib directory to the classpath by adding the following to the pom.xml.
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>appassembler-maven-plugin</artifactId>
    <version>1.10</version>
    ...
    <configuration>
        ...
        <repositoryLayout>flat</repositoryLayout>
        <useWildcardClassPath>true</useWildcardClassPath>
        ...
    </configuration>
    ...
</plugin>
I could do this because the repository layout of lib was already flat. There were no hierarchies. No other change was required. The batch file now behaves as expected.
I have a project composed of more than one module, and an integration test (in the test folder) where I want to run this script using the @Sql annotation. By default the classpath resource type is used.
The test is inside this folder:
mainFolder/module1/src/test/java/com/.../.../controllers/TestClass.java
while the script is present in this folder:
mainFolder/scripts/postgres/script.sql
Basically I'm not sure which string (relative path) I should put in the value parameter of the @Sql annotation.
I am afraid that if your scripts are not copied to the class path there are not a lot of options.
Please, try something like:
#Sql("file:/path-to-mainFolder/mainFolder/scripts/postgres/script.sql")
As you can see in the documentation you can use any valid resource type.
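For illustration, a minimal test sketch using the file: variant might look like this (the class name, the Spring Boot test bootstrap, and the JUnit 5 imports are assumptions about your setup, not taken from your project):

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.jdbc.Sql;

// Assumes a Spring test context with a configured DataSource;
// adjust the bootstrap annotation to whatever your module already uses.
@SpringBootTest
class PostgresScriptIT {

    @Test
    @Sql("file:/path-to-mainFolder/mainFolder/scripts/postgres/script.sql")
    void dataFromExternalScriptIsAvailable() {
        // assertions against the rows inserted by script.sql go here
    }
}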
Having said that, I think the best option is to make these resources available in the classpath. If you are using maven, you can use for instance the copy-resources goal of the maven resources plugin to copy your resources when running your tests:
<project>
    ...
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-resources-plugin</artifactId>
                <version>3.2.0</version>
                <executions>
                    <execution>
                        <id>copy-resources</id>
                        <!-- here the phase you need: validate, test-compile... -->
                        <phase>validate</phase>
                        <goals>
                            <goal>copy-resources</goal>
                        </goals>
                        <configuration>
                            <!-- copy into the test output directory so the scripts end up on the test classpath -->
                            <outputDirectory>${project.build.testOutputDirectory}/scripts/postgres</outputDirectory>
                            <resources>
                                <resource>
                                    <!-- Depending on your project, try defining the scripts src location as you consider more appropriate -->
                                    <directory>mainFolder/scripts/postgres</directory>
                                </resource>
                            </resources>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    ...
</project>
With this setup, now you can define your script location as a classpath resource:
#Sql("/scripts/postgres/script.sql")
I am not sure about this, but perhaps you can use the plugin's testResources goal in a similar fashion as well.
If you want to use the scripts on your local machine you can also add the directory mainFolder/scripts/postgres as a dependency. In IntelliJ, you can go to Project Structure -> Modules -> your module -> dependencies -> + -> Jars or directories -> your script directory -> Classes.
Then you will be able to use
#Sql("/yourScript.sql")
In a DevOps pipeline you would need to add the dedicated directory to your application server's classpath.
Keep in mind that you will have to keep the names of your SQL files unique, because if you use a structure with multiple directory levels, I think the JVM will load only the first matching entry. It is not a recommended approach, but it is fast if you only need to generate a report or test something on your local environment.
Another approach would be to specify -Xbootclasspath/a: at runtime. Then you would be able to use @Sql with relative paths as you initially wanted.
E.g.
-Xbootclasspath/a:path-to-mainFolder
then you would be able to use
#Sql("/scripts/postgres/script.sql")
I have read almost all the links suggested under the title (when creating the topic) and more of them on Google, and did not find the answer.
So, the problem is: Jenkins builds a Maven web project. I installed the Deploy plugin so that Jenkins would publish the .WAR file to Tomcat.
The tests section passes and the WAR file is built - OK, but when Jenkins starts
[INFO] --- tomcat7-maven-plugin:2.1:run (default-cli) @ webapp ---
I see:
[ERROR] Error starting static Resources
java.lang.IllegalArgumentException: Document base /var/lib/jenkins/workspace/AppFolder/AppName/src/main/webapp does not exist or is not a readable directory
And if I look into the project's folder, there really is no such folder, because the actual path is:
/var/lib/jenkins/workspace/AppFolder/AppName/src/com/companyname/webapp
So, I just don't know where to fix the path. I tried to edit pom.xml:
<build>
    <sourceDirectory>src/com/companyname</sourceDirectory>
    ...
</build>
I just don't get it. Where is that path specified?
It looks like I made errors in the configuration.
<groupId>org.apache.tomcat.maven</groupId>
<artifactId>tomcat7-maven-plugin</artifactId>
<configuration> ... </configuration>
Now everything works. Thanks for your concern.
I think this email might help you. Have a read through it, but from what I can see this is the important part (Tomcat Maven docs - warSourceDirectory).
Quote:
<configuration>
<warSourceDirectory>target/${artifactId}-${version}</warSourceDirectory>
will be better with
<warSourceDirectory>${project.build.outputDirectory}/${artifactId}-${version}</warSourceDirectory>
</configuration>
Due to the way my build system is designed (RTC Build Engine), I would like to provide maven with property values via a properties file, instead of specifying -Dkey=value for every property.
I found a couple of questions on S.O. (How to set build properties from a file in Maven POM? and How to read an external properties file in Maven) that relate precisely to this question, but they are relatively old, and both require custom plugins to work (in alpha state).
I realize that passing parameters to Maven like this is probably not the best solution, but the other option is specifying everything on the command line via -D settings which is not ideal either.
Furthermore, given that this properties file is only really used by the build engine (and not by the individual developer), I don't truly believe it belongs in the pom. But I cannot find any other mechanism that would allow me to specify a plugin to use - settings.xml does not permit specifying plugins.
Is my only choice in this case to use a plugin and specify it in the project pom?
in the pom you can place...
<properties>
    <core-version>1234</core-version>
    <lib-version>1234</lib-version>
    <build-version>9999</build-version>
    <build-date>20150101</build-date>
</properties>
with all the properties you require.
Or you can use...
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>properties-maven-plugin</artifactId>
            <version>1.0-alpha-2</version>
            <executions>
                <execution>
                    <phase>initialize</phase>
                    <goals>
                        <goal>read-project-properties</goal>
                    </goals>
                    <configuration>
                        <files>
                            <file>dev.properties</file>
                        </files>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
and the file dev.properties will contain the properties
core-version=1234
lib-version=1234
build-version=9999
build-date=20150101
...
Or... you can inject the properties using a settings.xml file as shown here
You may also find the Maven build number plugin useful... here
The best option in such cases is to upgrade to at least Maven 3.2.1, which supports defining such properties on the command line like the following:
mvn -Drevision=1234 -Dchangelist=WhatEver -Dsha1=XXXX clean package
But you can only use the above property names, for example by declaring <version>${revision}</version> in the POM and supplying the value on the command line.
Excerpt from release notes:
A simple change to prevent Maven from emitting warnings about versions
with property expressions. Allowed property expressions in versions
include ${revision}, ${changelist}, and ${sha1}. These properties can
be set externally, but eventually a mechanism will be created in Maven
where these properties can be injected in a standard way. For example
you may want to glean the current Git revision and inject that value
into ${sha1}. This is by no means a complete solution for continuous
delivery but is a step in the right direction.
We use Jenkins, which uses MD5 fingerprinting to identify artifacts and to determine whether an artifact has changed since the last build. Unfortunately, Maven builds always generate binarily different artifacts.
Therefore I am looking into making Maven generate the same jar artifact for the same set of input files, regardless of where and when they were built, which amongst other things means that the entries in the jar file must be sorted - not only in the index, but also in the order they are written to the jar file.
After examining maven-jar-plugin, which uses maven-assembly-plugin, my conclusion is that they do not collect all the files to be written in memory before writing them all at once, but write them one at a time. This means that it may be better to post-process the generated jar instead of changing the current behavior, so that at that point I can sort the entries, zero the timestamps, etc.
I am unfamiliar with writing Maven plugins, so my question is: how should I write a plugin that Maven can tell where the artifact-jar-in-progress is located, and how do I hook it up in my pom.xml?
(At first I need this to work for jar files, but war files would be nice too).
As mentioned, this can be done based on something similar to maven-shade-plugin. I went ahead and wrote a simple plugin to add this capability -- see https://github.com/manouti/jar-timestamp-normalize-maven-plugin (available on the Central repo).
The behavior is based on the shade plugin. It consists of a single goal called normalize which can be bound to the package lifecycle phase and configured in the project's POM:
<plugins>
    <plugin>
        <groupId>com.github.manouti</groupId>
        <artifactId>jar-timestamp-normalize-maven-plugin</artifactId>
        <version>1.0-SNAPSHOT</version>
        <executions>
            <execution>
                <id>jar-normalize</id>
                <goals>
                    <goal>normalize</goal>
                </goals>
                <phase>package</phase>
            </execution>
        </executions>
    </plugin>
</plugins>
A few notes about the plugin:
The artifact under build is accessed via project#getArtifact() where project is a org.apache.maven.project.MavenProject.
Normalization consists of mainly three steps:
Setting the last modified time of all Jar entries to a specific timestamp (default value is 1970-01-01 00:00:00AM but can be changed via -Dtimestamp system property).
Reordering (alphabetically) of attributes in the manifest except for Manifest-Version which always comes first.
Removing comments from the pom.properties file which contain a timestamp that causes the Jar to differ from one build to another.
Once invoked, the goal will generate the output file next to the original artifact (named artifactId-version-normalized.jar), i.e. in the project.build.directory directory.
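For readers who just want to see the idea in code, here is a rough standalone sketch of that kind of normalization (this is not the plugin's actual source; the class name, file handling, and the fixed timestamp of 0 are assumptions):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarNormalizer {

    // Fixed timestamp applied to every entry (1970-01-01T00:00:00Z here).
    private static final long FIXED_TIME = 0L;

    public static void normalize(File in, File out) throws IOException {
        try (JarFile jar = new JarFile(in);
             JarOutputStream jos = new JarOutputStream(new FileOutputStream(out))) {

            // Sort entries by name so the write order no longer depends on the build.
            List<JarEntry> entries = Collections.list(jar.entries());
            entries.sort(Comparator.comparing(JarEntry::getName));

            byte[] buf = new byte[8192];
            for (JarEntry e : entries) {
                JarEntry copy = new JarEntry(e.getName());
                copy.setTime(FIXED_TIME); // normalize the per-entry timestamp
                jos.putNextEntry(copy);
                try (InputStream is = jar.getInputStream(e)) {
                    int n;
                    while ((n = is.read(buf)) != -1) {
                        jos.write(buf, 0, n); // copy the entry body unchanged
                    }
                }
                jos.closeEntry();
            }
        }
    }
}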
To create a Maven plugin project
mvn archetype:generate \
-DgroupId=sample.plugin \
-DartifactId=hello-maven-plugin \
-DarchetypeGroupId=org.apache.maven.archetypes \
-DarchetypeArtifactId=maven-archetype-plugin
Invoke this command and it will generate a skeleton project with a class called MyMojo.java.
Write your stuff inside the execute() method, and install that plugin into your repository with mvn clean install,
then attach its execution to your project, in your project's pom.xml:
<build>
    <plugins>
        <plugin>
            <groupId>sample.plugin</groupId>
            <artifactId>hello-maven-plugin</artifactId>
            <version>1.0-SNAPSHOT</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>sayhi</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
To access project properties inside your Mojo:
/**
 * The Maven project.
 *
 * @parameter expression="${project}"
 * @required
 * @readonly
 */
private MavenProject project;
and then
project.getBuild().getDirectory()
and read other build properties to determine where your packed jar file is located.
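Putting those pieces together, a minimal Mojo sketch could look roughly like the following (the goal name and the post-processing step are placeholders, not working normalization code):

import java.io.File;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.project.MavenProject;

/**
 * @goal sayhi
 * @phase package
 */
public class MyMojo extends AbstractMojo {

    /**
     * @parameter expression="${project}"
     * @required
     * @readonly
     */
    private MavenProject project;

    public void execute() throws MojoExecutionException {
        // The packaged artifact normally ends up at target/<finalName>.jar
        File jar = new File(project.getBuild().getDirectory(),
                project.getBuild().getFinalName() + ".jar");
        getLog().info("Post-processing " + jar.getAbsolutePath());
        // sort entries, zero timestamps, etc. here
    }
}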
See
maven: guide-java-plugin-development
I agree that creating a custom Maven plugin seems like the better option. I don't know of an existing plugin that provides a solution for what you asked.
The MD5 checksum (or SHA-1, in my repository) is generated by the install plugin, so it seems like you need to extend that or write a new plugin which works after the install phase.
I have 2 suggestions about this plugin:
1) Thinking simply, this plugin should (see the sketch after this list):
Read the generated jar:
Extract all entries.
Exclude some entries (e.g. MANIFEST.MF).
Sort the remaining entries.
Compute an MD5 for each one in memory.
Generate a single MD5 from all of those extracted.
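A rough sketch of that fingerprinting idea (hypothetical code, not an existing plugin; the exclusion list and hex formatting are assumptions):

import java.io.File;
import java.io.InputStream;
import java.security.MessageDigest;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class JarFingerprint {

    // Computes one MD5 over the contents of all entries, in name order,
    // skipping the manifest (which carries build-time metadata).
    public static String md5Of(File jarFile) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (JarFile jar = new JarFile(jarFile)) {
            List<JarEntry> entries = Collections.list(jar.entries());
            entries.sort(Comparator.comparing(JarEntry::getName));
            byte[] buf = new byte[8192];
            for (JarEntry e : entries) {
                if (e.isDirectory() || e.getName().equals("META-INF/MANIFEST.MF")) {
                    continue; // excluded entries
                }
                try (InputStream is = jar.getInputStream(e)) {
                    int n;
                    while ((n = is.read(buf)) != -1) {
                        md.update(buf, 0, n);
                    }
                }
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }
}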
However, when considering independence of where & when the build runs: according to the .class file structure (Java_class_file), minor and major version entries are held in compiled class files. So if the compiler changes, the .class files will change. In that case we would need a check at the source-code level instead :( So this solution becomes useless if there is no guarantee on the compiler version.
2) As a very dirty but easy solution, this plugin may simply take the MD5 of your module's pom.xml file. But then you must manually guarantee that each change in your jar is reflected in a minor version (or build number).
Instead of writing your own plugin you can write a Groovy script that is executed by groovy-maven-plugin:
<plugin>
    <groupId>org.codehaus.gmaven</groupId>
    <artifactId>groovy-maven-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <source>
                    import java.util.jar.*

                    String fileName = '${project.build.directory}/${project.build.finalName}.jar'
                    println "Editing file ${fileName}"
                    JarFile file = new JarFile(fileName);
                    // do your edit
                </source>
            </configuration>
        </execution>
    </executions>
</plugin>
I've been banging my head against a wall for about an hour on this: I'm trying to pass a simple property (java.library.path) to exec-maven-plugin. The goal is to have it integrate with Netbeans Right Click file > Run File procedure.
So I set my POM like this:
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <version>1.1.1</version>
            <configuration>
                <environmentVariables>
                    <java.library.path>native/win32-x86</java.library.path>
                </environmentVariables>
            </configuration>
        </plugin>
    </plugins>
</build>
(I use an old version so I can see the execution args, but it's fully reproducible with 1.2.)
Then I right click my file and click "Run File". Netbeans starts this process:
W:\programming\apache-maven-2.2.1\bin\mvn.bat -Dexec.classpathScope=runtime -Dexec.args=-classpath %classpath org.quackedcube.camera.CameraDemo -Dexec.executable=C:\Program Files\Java\jdk1.6.0_21\bin\java.exe -Dnetbeans.execution=true -Dmaven.repo.local=W:\programming\maven-repo process-classes exec:exec
(The original full classpath execution was changed to exec:exec so hopefully my configuration applied)
But my environment variable is apparently ignored, as the resulting executed program is:
Result of cmd.exe /X /C ""C:\Program Files\Java\jdk1.6.0_21\bin\java.exe" -classpath *snip* org.quackedcube.camera.CameraDemo" execution is: '1'.
I've tried
Using separate key and value tags inside an environmentVariable tag
Using a key and value tag directly inside an environmentVariables tag (worth a try)
Binding to a phase
Passing it as a Maven arg and using exec:java instead
Passing -Djava.library.path=native/win32-x86 as a Run argument and VM option in Project Configuration page
and all have failed. I'm really at a loss here.
I guess this is the disadvantage of using JNI in Maven: you have to pass it as an argument to your tests, your runtime, your module's run POM, and your parent POM.
So my question: how can I pass a java.library.path property to an executed file? It would be nice if it integrated with the NetBeans Run File functionality (so that I don't have to change the class name in a POM, build, then run).
I didn't know this, but apparently when doing this you need to put this property first (before the -classpath part of exec.args). I didn't think it was necessary since the classpath isn't immediately executed, but apparently it does make a difference.
To fix it, I simply changed this in Project Properties > Actions > Run File via Main
exec.classpathScope=${classPathScope}
exec.args=-Djava.library.path="native/win32-x86" -classpath %classpath ${packageClassName}
exec.executable=java
The reason you can't specify it in the POM is that NetBeans passes the classpath and what it executes via the command-line exec.args, which overrides what's in your POM.
While this might be ugly and platform-dependent, it's what happens when you mix JNI and Maven. There isn't really another way that I can see.
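If it helps while debugging, a quick generic check (not specific to this project) is to print the property from the class you are running, so you can see exactly what the forked JVM received:

public class LibraryPathCheck {
    public static void main(String[] args) {
        // Prints whatever -Djava.library.path the launched JVM actually received.
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
    }
}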
Not sure if you tried this, but since you need to set the property at the JVM level, it should be done with -Djava.library.path=/some/path
So in order to specify it for exec-maven-plugin you could write something like this:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.1</version>
    <configuration>
        <executable>java</executable>
        <arguments>
            <argument>-Djava.library.path=${java.library.path}</argument>
        </arguments>
    </configuration>
</plugin>
You need, of course, to update the executable and maybe add other attributes.