I have a question about the Exec Maven Plugin.
I want to execute my setup.iss file (generated with Inno Setup) with the exec-maven-plugin.
One question: should I define a path to the file in my POM, or where does the setup.iss have to be placed for Maven to find it?
Here is the code from my pom:
<profiles>
<profile>
<id>exec</id>
<activation>
<property>
<name>exec</name>
</property>
</activation>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.6.0</version>
<configuration>
<mainClass>de.audi.analysis.main.Main</mainClass>
<executable>ISCC.exe</executable>
<workingDirectory></workingDirectory>
<arguments>
<argument>firstsetup.iss</argument>
</arguments>
</configuration>
<executions>
<execution>
<phase>install</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
The exec-maven-plugin is simply calling ISCC.exe with the arguments you provide. In this instance the plugin would execute ISCC.exe firstsetup.iss.
I believe it assumes firstsetup.iss will be in the ${project.basedir} of the Maven project (where the pom.xml is), or in the workingDirectory if one is provided. A specific file path can be passed as an argument as well:
<argument>${project.basedir}/<some-path>/firstsetup.iss</argument>
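Putting that together, a minimal sketch of how the configuration might look with an explicit working directory (the src/main/innosetup folder is only an illustration, and ISCC.exe is assumed to be on the PATH):
<configuration>
<executable>ISCC.exe</executable>
<workingDirectory>${project.basedir}/src/main/innosetup</workingDirectory>
<arguments>
<argument>firstsetup.iss</argument>
</arguments>
</configuration>
With the profile from the question (activated by the exec property and bound to the install phase), the build would then presumably be started with mvn install -Dexec.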
The problem was that I had to add all the DLLs to my solution. After adding all the Inno Setup DLL files it works fine and I get a build success. Thank you for your answer, Adam. Here is my pom configuration:
<configuration>
<executable>src/main/resources/innosetup/ISCC.exe</executable>
<workingDirectory>src/main/resources/innosetup</workingDirectory>
<arguments>
<argument>audience-setup1.iss</argument>
</arguments>
</configuration>
<executions>
<execution>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
Using mvn and the maven-assembly-plugin, I create a .jar with dependencies and run it like this:
java -cp ../target/module-jar-with-dependencies.jar module.Launcher --project=example --network=toy_ags_network.sif
I wanted to create a Maven profile that does exactly that. So in my pom.xml I added this:
<profiles>
<profile>
<id>runExample</id>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.6.0</version>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>java</goal>
</goals>
<configuration>
<mainClass>module.Launcher</mainClass>
<arguments>
<argument>--project</argument>
<argument>example</argument>
<argument>--network</argument>
<argument>toy_ags_network.sif</argument>
</arguments>
</configuration>
</execution>
</executions>
<configuration>
<mainClass>com.test.Startup</mainClass>
<cleanupDaemonThreads>false</cleanupDaemonThreads>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
So, when I run mvn compile -P runExample I should get the same results. It seems though that some classes from a dependency are not fully loaded, which throws exceptions; when I don't include the particular code that uses those classes, everything is fine. I want to make sure that with my setup above all dependencies are included, i.e. that the java command and the Maven one are equivalent.
Edit:
I managed to get a simple plugin configuration that behaves the same way as the java command when run with mvn exec:exec:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.6.0</version>
<configuration>
<executable>java</executable>
<arguments>
<argument>-cp</argument>
<argument>target/module-jar-with-dependencies.jar</argument>
<argument>module.Launcher</argument>
<argument>--project</argument>
<argument>example</argument>
<argument>--network</argument>
<argument>toy_ags_network.sif</argument>
</arguments>
</configuration>
</plugin>
But I want a profile with that plugin inside it, and that's what I still don't have!
The correct configuration for the pom.xml is:
<profiles>
<profile>
<id>runExample</id>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.6.0</version>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>exec</goal>
</goals>
<configuration>
<executable>java</executable>
<arguments>
<argument>-cp</argument>
<argument>target/module-jar-with-dependencies.jar</argument>
<argument>module.Launcher</argument>
<argument>--project</argument>
<argument>example</argument>
<argument>--network</argument>
<argument>toy_ags_network.sif</argument>
</arguments>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
Thus, running: mvn compile -P runExample is the same as:
java -cp ../target/module-jar-with-dependencies.jar module.Launcher --project=example --network=toy_ags_network.sif
I would like to execute git describe as part of a Maven build and use the resulting output in the manifest when building a .jar package.
I know how to do this in Ant via the <exec> task with outputproperty capturing the output into an Ant property, but I have very little experience with Maven and don't even know where to look.
Is this possible?
I found this in a sample pom.xml file so adding something to the manifest looks pretty easy:
<project>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.6</version>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<mainClass>my.class.here.Myclass</mainClass>
<classpathLayoutType>custom</classpathLayoutType>
<customClasspathLayout>lib/$${artifact.artifactId}-$${artifact.version}$${dashClassifier?}.$${artifact.extension}</customClasspathLayout>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</build>
</project>
I'm not sure how to capture the output of the command execution though.
Here is a suggested approach:
Use the Exec Maven Plugin for launching your git commands and write to a properties file (name=value pattern, if possible)
Use the Properties Maven Plugin to load the configuration
The loaded configuration can then be used as properties in your POM; we are basically creating build properties dynamically. For this to work, the steps above must be executed as early as possible in the build flow (i.e. in the validate or initialize phase).
Below is an example flow, just tested and working perfectly (on a Windows machine):
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sample</groupId>
<artifactId>generation</artifactId>
<version>0.0.1-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.1</version>
<executions>
<execution>
<id>retrieve-config</id>
<phase>validate</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
<configuration>
<executable>echo</executable>
<arguments>
<argument>jar.name=from-exec</argument>
<argument>></argument>
<argument>config.properties</argument>
</arguments>
<workingDirectory>${basedir}/src/main/resources/</workingDirectory>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>properties-maven-plugin</artifactId>
<version>1.0-alpha-2</version>
<executions>
<execution>
<id>read-properties</id>
<phase>initialize</phase>
<goals>
<goal>read-project-properties</goal>
</goals>
<configuration>
<files>
<file>${basedir}/src/main/resources/config.properties</file>
</files>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<configuration>
<finalName>${jar.name}</finalName>
</configuration>
</plugin>
</plugins>
</build>
</project>
Basically, the exec plugin execution attached to the validate phase will run at the beginning of the build, writing the content jar.name=from-exec to a config.properties file (via the echo command).
Then the properties plugin attached to the initialize phase will read that config.properties file and load the properties to be used as part of the build.
Then, as an example, the jar plugin will use that property as part of its configuration (the <finalName>${jar.name}</finalName> part).
Running mvn clean package, you will find the from-exec.jar file in the target folder.
If you can't find a way of having the result of git describe in name=value form, you can (worst case) use two Exec Maven Plugin executions: the first writing the property name and the equals sign to the file (i.e. via an echo), the second (git describe) appending the property value to the file, as sketched below.
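A rough sketch of that two-execution idea, assuming a Unix-like shell is available (bash here, with the same file location as the example above; printf rather than echo, to avoid a trailing newline after the property name):
<executions>
<execution>
<id>write-property-name</id>
<phase>validate</phase>
<goals>
<goal>exec</goal>
</goals>
<configuration>
<executable>bash</executable>
<arguments>
<argument>-c</argument>
<argument>printf 'jar.name=' > config.properties</argument>
</arguments>
<workingDirectory>${basedir}/src/main/resources/</workingDirectory>
</configuration>
</execution>
<execution>
<id>append-property-value</id>
<phase>validate</phase>
<goals>
<goal>exec</goal>
</goals>
<configuration>
<executable>bash</executable>
<arguments>
<argument>-c</argument>
<argument>git describe >> config.properties</argument>
</arguments>
<workingDirectory>${basedir}/src/main/resources/</workingDirectory>
</configuration>
</execution>
</executions>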
There is a Maven plugin here https://github.com/ktoso/maven-git-commit-id-plugin that will do what you want.
If you hook it into your build it will generate a Maven property named ${git.commit.id.describe}, which you can then use with Maven's resource filtering to dynamically modify your manifest.
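As an alternative to filtering a manifest template, that property can also be written straight into the manifest through the maven-jar-plugin archive configuration; a minimal sketch (the Git-Describe entry name is made up for illustration):
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.6</version>
<configuration>
<archive>
<manifestEntries>
<Git-Describe>${git.commit.id.describe}</Git-Describe>
</manifestEntries>
</archive>
</configuration>
</plugin>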
Building on what @A_Di-Matteo said, you can get the git tag into the properties file like this in the Maven exec plugin:
<execution>
<id>set-git-tag</id>
<phase>validate</phase>
<goals>
<goal>exec</goal>
</goals>
<configuration>
<executable>bash</executable>
<arguments>
<argument>-c</argument>
<!-- the redirection has to be part of the -c string so that bash itself interprets it -->
<argument>echo git.tag=`git describe --always --dirty=-modified` > config.properties</argument>
</arguments>
<workingDirectory>${basedir}/src/main/resources/</workingDirectory>
</configuration>
</execution>
You can also append to an existing file, like so:
<execution>
<id>set-git-tag</id>
<phase>validate</phase>
<goals>
<goal>exec</goal>
</goals>
<configuration>
<executable>bash</executable>
<arguments>
<argument>-c</argument>
<argument>echo git.tag=`git describe --always --dirty=-modified` >> ${basedir}/src/main/resources/config.properties</argument>
</arguments>
</configuration>
</execution>
The limitation here is that it requires bash.
I have a Maven config file that triggers auto-generation of XSD and WSDL classes as follows:
<plugin>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-xjc-plugin</artifactId>
<version>${cxf-xjc-plugin}</version>
<executions>
<execution>
<id>generate-sources</id>
<phase>generate-sources</phase>
<configuration>
<sourceRoot>${project.build.directory}/generated/src/main/java</sourceRoot>
<xsdOptions>
//xsds, wsdls etc
</xsdOptions>
</configuration>
<goals>
<goal>xsdtojava</goal>
</goals>
</execution>
</executions>
</plugin>
The generated classes go to: target/generated/src/main/java.
Problem: running 'mvn clean package' will always remove those classes. How can I prevent that? Is it possible to have clean remove the full contents of the target directory apart from the generated/ one?
It is possible to keep a directory from being deleted using the maven-clean-plugin, but this is definitely not a good idea:
* it goes against Maven conventions
* it forces you to change your POM every time you want those classes to be generated
Solution to your exact question (NOT RECOMMENDED)
You can exclude directories with the maven-clean-plugin using excludeDefaultDirectories and filesets parameters:
<plugin>
<artifactId>maven-clean-plugin</artifactId>
<version>2.6.1</version>
<configuration>
<excludeDefaultDirectories>true</excludeDefaultDirectories>
<filesets>
<fileset>
<directory>${project.build.directory}</directory>
<excludes>
<exclude>generated/*</exclude>
</excludes>
</fileset>
</filesets>
</configuration>
</plugin>
Note that I strongly urge you NOT to use this solution.
Proposed solution
Your actual problem is that you do not want to re-generate the classes on every build, because it takes a lot of time. The goal is then to avoid generation by using a custom profile:
<profiles>
<profile>
<id>noGenerate</id>
<properties>
<xjc.generate>none</xjc.generate>
</properties>
</profile>
<profile>
<id>generate</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<properties>
<xjc.generate>generate-sources</xjc.generate>
</properties>
</profile>
</profiles>
With the following plugin definition:
<plugin>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-xjc-plugin</artifactId>
<version>${cxf-xjc-plugin}</version>
<executions>
<execution>
<id>generate-sources</id>
<phase>${xjc.generate}</phase>
<configuration>
<sourceRoot>${project.build.directory}/generated/src/main/java</sourceRoot>
<xsdOptions>
//xsds, wsdls etc
</xsdOptions>
</configuration>
<goals>
<goal>xsdtojava</goal>
</goals>
</execution>
</executions>
</plugin>
It seems cxf-xjc-plugin does not have any skip parameter, so we have to resort to setting the phase to none when we want to avoid execution (this is an undocumented feature, but it works).
The trick is to define two profiles: one, activated by default, sets a property telling the cxf-xjc-plugin to execute in the generate-sources phase, while the other one sets a property telling the cxf-xjc-plugin not to execute.
As such, when you want the classes to be generated, you can invoke Maven with mvn clean install, and when you do not want the classes to be generated, you can invoke Maven with mvn clean install -PnoGenerate.
The real gain here is that you do not need to change your POM every time you decide whether or not to generate the classes.
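In short, the two invocations are:
mvn clean install (the generate profile is active by default, so the classes are generated)
mvn clean install -PnoGenerate (generation is skipped)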
I would take another approach.
1st) Configure your XJC plugin not to erase any files, i.e.:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>jaxb2-maven-plugin</artifactId>
<version>2.2</version>
<executions>
<execution>
<goals>
<goal>xjc</goal>
</goals>
</execution>
</executions>
<configuration>
<clearOutputDir>false</clearOutputDir>
<outputDirectory>${project.build.directory}</outputDirectory>
<sources>
<source> [any xsd] </source>
</sources>
</configuration>
</plugin>
2nd) Use the maven-clean-plugin to clean the XJC-generated class directory in the clean phase:
<plugin>
<artifactId>maven-clean-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<filesets>
<fileset>
<directory>${basedir}/generated/src/main/java/...where the generated classes are</directory>
<includes>
<include>**/*.java</include>
</includes>
<followSymlinks>false</followSymlinks>
</fileset>
</filesets>
</configuration>
</plugin>
This works for me and I hope it is useful.
I'm trying to run a Java class which populates my database with dummy data. In Eclipse I do it just by right-clicking and running it as a Java application. The problem is I'd like to make Jenkins do it... the obvious solution would be running the class using Maven, as it would put everything needed on the classpath.
I have tried http://mojo.codehaus.org/exec-maven-plugin/ like this:
<profile>
<id>populatedb</id>
<activation>
<activeByDefault>false</activeByDefault>
<property>
<name>populatedb</name>
</property>
</activation>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<phase>install</phase>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
<configuration>
<mainClass>com.example.DatasetReader</mainClass>
</configuration>
</plugin>
</plugins>
</build>
</profile>
But it gives me a ClassNotFoundException for com.example.DatasetReader before the project is even built. I use this command:
mvn clean install exec:java -Dpopulatedb -Dclasspath -Dexec.mainClass="com.example.DatasetReader"
I think it has something to do with the execution phase... but there is nothing like post-install...
Thanks!
I think the problem is to do with the classpath used by the exec-maven-plugin. By default the exec-maven-plugin uses the runtime classpath. I presume that your DatasetReader class is a test class, so it is only available on the test classpath.
To pass a different classpath to the exec-maven-plugin you use the classpathScope property.
So you would use <classpathScope>test</classpathScope> in your pom to have the plugin run with the test classpath.
So you would simply need to modify your POM to be as follows:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<phase>install</phase>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
<configuration>
<classpathScope>test</classpathScope> <!-- this is the extra config -->
<mainClass>com.example.DatasetReader</mainClass>
</configuration>
</plugin>
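With that in place, the profile from the question (activated by the populatedb property) should be enough on its own; presumably the class would then run right after the install phase of a normal build, without the extra exec:java invocation:
mvn clean install -Dpopulatedb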
Try without additional phases:
mvn exec:java -Dexec.mainClass="com.example.DatasetReader"
or with the classpath scope set to runtime:
mvn exec:java -Dexec.mainClass="com.example.DatasetReader" -Dexec.classpathScope=runtime
Our application can be built for several application servers, and used in several environments.
The type of application server and the target environment should be specified using Maven profiles. One and only one profile of each type should be present when compiling the code. All profiles cause execution of one or several maven-antrun-plugin copy tasks in order to include the correct settings files in the generated JAR.
Below is part of the pom.xml file. Part of the AS profile "oracle" is included, as well as part of the environment profile "development". The intent is that, in order to create a JAR which can be deployed to Oracle AS in the development environment, the code is compiled using two profile switches: mvn -P oracle,development.
The AS profiles also have other tasks (not shown below) which have to be executed before the environment profile tasks take place (that's the reason the profiles use different phases).
My issue is that Maven refuses to execute the tasks in both of the selected profiles.
mvn -Poracle works just as it's supposed to. So does mvn -Pdevelopment. However, mvn -Poracle,development results in the execution of only the tasks in the oracle profile. If all the tasks in the oracle profile's antrun plugin are commented out, then the tasks in the development profile do get executed.
My questions are:
* Why does Maven refuse to execute ant tasks in both of these profiles?
* Is there a way to fix this?
Combining the profiles (oracle-development, jboss-development, etc.) is not an option for us, because this module is part of a bigger project and that would require modifications to several other projects.
We currently use Maven 2.2.0.
<profile>
<id>oracle</id>
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>validate</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<copy .../>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
...jboss, glassfish profiles...
<profile>
<id>development</id>
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<copy .../>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
...production, test profiles...
Add a unique execution id to each <execution>:
<profile>
<id>oracle</id>
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>execution1</id>
<phase>validate</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<echo>ORACLE</echo>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>development</id>
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>execution2</id>
<phase>compile</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<echo>DEV</echo>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
Tested, working solution :) Without the <id> element, I guess one <execution> overrides the other (both end up with the same default execution id, so they get merged).