How to resolve Maven expressions in source code in IntelliJ - java

I'm using the templating-maven-plugin in order to filter the main sources, so that it injects some values directly into the Java code while the project is building. It works well as long as I only use Maven, but when it comes to running the project in IntelliJ, it doesn't resolve Maven expressions in the code like "${someParam}".
Using a properties file is not a solution for me.
Example:
public class TestObject {
    String surname = "${someParam}";

    public void print() {
        System.out.println("My name is " + surname);
    }
}
POM CONFIG
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>templating-maven-plugin</artifactId>
    <version>1.0.0</version>
    <executions>
        <execution>
            <id>filter-src</id>
            <phase>process-resources</phase>
            <goals>
                <goal>filter-sources</goal> <!-- filter main sources -->
            </goals>
        </execution>
    </executions>
</plugin>
So when I do mvn clean install -DsomeParam=TEST it works well, but when I run it in IntelliJ, even with -DsomeParam=TEST in the run configuration, it prints My name is ${someParam}.
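For what it's worth, a hedged observation rather than a confirmed fix: the substitution only happens when the filter-sources goal actually runs, and IntelliJ's built-in builder does not execute Maven plugins, so it compiles the unfiltered source. Running the build through the Maven tool window (or delegating IDE build/run actions to Maven in the settings) should go through the plugin; in addition, giving the property a default value in the POM keeps the placeholder resolved even when -DsomeParam is not passed:

<properties>
    <!-- assumed default; overridden by -DsomeParam=TEST on the command line -->
    <someParam>DEV</someParam>
</properties>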

Related

Quarkus 2.0 maven build is not creating uber-jar for AWS lambda

I'm using Quarkus 2.0 to build uber-jar to be used as AWS lambda.
Maven build script is as follows:
<properties>
    <quarkus.package.type>uber-jar</quarkus.package.type>
</properties>
<dependencies>
    <dependency>
        <groupId>io.quarkus</groupId>
        <artifactId>quarkus-amazon-lambda</artifactId>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>io.quarkus</groupId>
            <artifactId>quarkus-maven-plugin</artifactId>
            <version>2.0.3.Final</version>
            <executions>
                <execution>
                    <goals>
                        <goal>build</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
application.properties also contains the quarkus.package.type=uber-jar config.
When I debug the Maven build, I see that at the moment the decision is made, quarkus-maven-plugin executes this code:
@BuildStep
public JarBuildItem buildRunnerJar(CurateOutcomeBuildItem curateOutcomeBuildItem, OutputTargetBuildItem outputTargetBuildItem,
        TransformedClassesBuildItem transformedClasses, ApplicationArchivesBuildItem applicationArchivesBuildItem,
        ApplicationInfoBuildItem applicationInfo, PackageConfig packageConfig, ClassLoadingConfig classLoadingConfig,
        List<GeneratedClassBuildItem> generatedClasses, List<GeneratedResourceBuildItem> generatedResources,
        List<UberJarRequiredBuildItem> uberJarRequired, List<UberJarMergedResourceBuildItem> uberJarMergedResourceBuildItems,
        List<UberJarIgnoredResourceBuildItem> uberJarIgnoredResourceBuildItems, List<LegacyJarRequiredBuildItem> legacyJarRequired,
        QuarkusBuildCloseablesBuildItem closeablesBuildItem, List<AdditionalApplicationArchiveBuildItem> additionalApplicationArchiveBuildItems,
        MainClassBuildItem mainClassBuildItem, Optional<AppCDSRequestedBuildItem> appCDS) throws Exception {
    if (appCDS.isPresent()) {
        this.handleAppCDSSupportFileGeneration(transformedClasses, generatedClasses, (AppCDSRequestedBuildItem) appCDS.get());
    }
    if (!uberJarRequired.isEmpty() && !legacyJarRequired.isEmpty()) {
        throw new RuntimeException("Extensions with conflicting package types. One extension requires uber-jar another requires legacy format");
    } else if (legacyJarRequired.isEmpty() && (!uberJarRequired.isEmpty() || packageConfig.type.equalsIgnoreCase("uber-jar"))) {
        /* I want it to get here, but it doesn't, because "legacyJarRequired" contains an item ("packageConfig == uber-jar" as expected) */
        return this.buildUberJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig,
                applicationInfo, generatedClasses, generatedResources, uberJarMergedResourceBuildItems, uberJarIgnoredResourceBuildItems, mainClassBuildItem);
    } else {
        /* execution gets here because "legacyJarRequired" contains an item */
        return legacyJarRequired.isEmpty() && !packageConfig.isLegacyJar() && !packageConfig.type.equalsIgnoreCase("legacy")
                ? this.buildThinJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig,
                        classLoadingConfig, applicationInfo, generatedClasses, generatedResources, additionalApplicationArchiveBuildItems, mainClassBuildItem)
                : this.buildLegacyThinJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig,
                        applicationInfo, generatedClasses, generatedResources, mainClassBuildItem);
    }
}
And the item in legacyJarRequired is added here:
@BuildStep(onlyIf = IsNormal.class, onlyIfNot = NativeBuild.class)
public void requireLegacy(BuildProducer<LegacyJarRequiredBuildItem> required) {
    required.produce(new LegacyJarRequiredBuildItem());
}
How can I avoid adding this element to the build config, so that I get a versioned xxx-yyy-zzz-runner.jar from my application build?
function.zip is built all right, but it's not an option for me, because I'd like to push the results of the build to a Maven repo.
I also needed to deploy an uber-jar to Artifactory, for further deployment as an AWS lambda. I finally solved it with the build-helper-maven-plugin:attach-artifact goal. It attaches function.zip to the artifact in Nexus, so Jenkins was able to get the archive and deploy it to AWS.
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <version>3.2.0</version>
    <executions>
        <execution>
            <id>attach-artifacts</id>
            <phase>package</phase>
            <goals>
                <goal>attach-artifact</goal>
            </goals>
            <configuration>
                <artifacts>
                    <artifact>
                        <file>./target/function.zip</file>
                        <type>zip</type>
                    </artifact>
                </artifacts>
            </configuration>
        </execution>
    </executions>
</plugin>
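As a hedged follow-up (the coordinates below are hypothetical, only for illustration): because attach-artifact registers function.zip without a classifier, mvn deploy publishes it next to the jar as artifactId-version.zip, and a downstream build can then pull it in by type:

<dependency>
    <!-- hypothetical coordinates of the lambda project -->
    <groupId>com.example</groupId>
    <artifactId>my-lambda</artifactId>
    <version>1.0.0</version>
    <type>zip</type>
</dependency>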

Share properties between maven plugin and the calling maven project pom

I have created a Maven plugin P, which I want to use as a dependency in another Maven project A. I am providing some parameters to the plugin P from the pom of Maven project A.
I want to set some properties in plugin P based on the parameters provided by project A, and I want them to be referenceable in the pom of project A. How can I do that?
I have tried setting properties on the MavenProject in plugin P. How can I refer to them in the pom of project A?
Project A pom snippet:
<plugin>
    <groupId>sample.plugin</groupId>
    <artifactId>sample-plugin</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <executions>
        <execution>
            <goals>
                <goal>testing</goal>
            </goals>
            <configuration>
                <param1>value1</param1>
                <param2>value2</param2>
            </configuration>
        </execution>
    </executions>
</plugin>
Plugin P code snippet
@Mojo(name = "testing")
public class TestMojo extends AbstractMojo
{
    // ...

    @Parameter(property = "param1")
    private String param1;

    @Parameter(property = "param2")
    private String param2;

    @Parameter(defaultValue = "${project}")
    private org.apache.maven.project.MavenProject project;

    public void execute() throws MojoExecutionException
    {
        if (param1.equalsIgnoreCase("value1")) {
            project.getProperties().setProperty("PROP1", "val1");
        } else {
            project.getProperties().setProperty("PROP1", "val3");
        }
        if (param2.equalsIgnoreCase("value2")) {
            project.getProperties().setProperty("PROP2", "val2");
        } else {
            project.getProperties().setProperty("PROP2", "val3");
        }
    }
}
I expect PROP1 and PROP2 to be usable in project A.
Found the solution: if we add ${project} as a parameter in the plugin configuration, we can add properties to it, which can then be referred to in project A's pom.
Ex:
<plugin>
    <groupId>sample.plugin</groupId>
    <artifactId>sample-plugin</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <executions>
        <execution>
            <goals>
                <goal>testing</goal>
            </goals>
            <configuration>
                <param1>value1</param1>
                <param2>value2</param2>
                <project>${project}</project>
            </configuration>
        </execution>
    </executions>
</plugin>
In the plugin one can then use this MavenProject:
project.getProperties().setProperty("projectProperty", propertyValue);
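A hedged sketch of how project A might then consume such a property: since plugin configuration expressions are evaluated at execution time, a plugin running later in the build can read what the mojo has just set (the antrun plugin here is only an example):

<plugin>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.8</version>
    <executions>
        <execution>
            <id>show-prop</id>
            <phase>package</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <target>
                    <!-- prints the value that sample-plugin set on the MavenProject earlier in the build -->
                    <echo message="projectProperty=${projectProperty}" />
                </target>
            </configuration>
        </execution>
    </executions>
</plugin>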
If I'm understanding this question correctly, try adding:
<dependencies>
    <dependency>
        <groupId>sample.plugin</groupId>
        <artifactId>sample-plugin</artifactId>
        <version>1.0.0-SNAPSHOT</version>
    </dependency>
</dependencies>
at the bottom of plugin P's pom.xml file, right before the closing </project> tag.
I am not entirely sure this will even work as I have limited knowledge of Maven, but please let me know.
Best of luck to you.

Maven: run plugin twice during a phase, interleaved with another plugin

For our end-to-end tests we need to execute the following logical flow:
Create and set up e2e schema (user) in the database (pre-integration-test)
Run Liquibase to initially populate the schema (pre-integration-test)
Add e2e-specific test data to the DB tables (pre-integration-test)
Start Tomcat (pre-integration-test)
Run the web application in Tomcat (integration-test) using Protractor
Shut down Tomcat (post-integration-test)
Clean up the DB: drop the schema (post-integration-test)
For running SQL the sql-maven-plugin is used; however, this flow doesn't fit the regular POM layout:
The SQL plugin has to run twice during pre-integration-test, before and after the liquibase-maven-plugin
The SQL plugin has to run before the Tomcat plugin during pre-integration-test, but after it during post-integration-test, so that the DB schema is dropped after Tomcat has shut down.
As far as I could conclude from the Maven docs, the order of plugins in the POM defines the order of execution within the same phase, and a plugin cannot be mentioned twice in the same POM.
Question: Is there any way to achieve this, apart from writing a shell script that would invoke Maven multiple times?
P.S. found a similar unanswered question.
Given the sample POM below:
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.sample</groupId>
    <artifactId>sample-project</artifactId>
    <version>0.0.2-SNAPSHOT</version>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-antrun-plugin</artifactId>
                <version>1.5</version>
                <executions>
                    <execution>
                        <id>print-hello</id>
                        <phase>validate</phase>
                        <goals>
                            <goal>run</goal>
                        </goals>
                        <configuration>
                            <target>
                                <echo message="hello there!" />
                            </target>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.5.0</version>
                <executions>
                    <execution>
                        <id>exec-echo</id>
                        <phase>validate</phase>
                        <configuration>
                            <executable>cmd</executable>
                            <arguments>
                                <argument>/C</argument>
                                <argument>echo</argument>
                                <argument>hello-from-exec</argument>
                            </arguments>
                        </configuration>
                        <goals>
                            <goal>exec</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-antrun-plugin</artifactId>
                <version>1.5</version>
                <executions>
                    <execution>
                        <id>print-hello-2</id>
                        <phase>validate</phase>
                        <goals>
                            <goal>run</goal>
                        </goals>
                        <configuration>
                            <target>
                                <echo message="hello there 2!" />
                            </target>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
We are actually configuring:
The maven-antrun-plugin to print the hello there! message
The exec-maven-plugin to print the hello-from-exec message
The maven-antrun-plugin to print the hello there 2! message
The goal executions are all attached to the same phase, validate, so we would expect them to be executed in that defined order.
However, when invoking the build (the -q option is used so that we get exactly and only their output):
mvn validate -q
we would have as output:
main:
[echo] hello there!
main:
[echo] hello there 2!
hello-from-exec
That is, for the same phase, Maven executed the defined plugins but merged all of the defined executions of the same plugin (even if defined in different plugin sections), and then executed them in the order of the merged definitions.
Unfortunately, there is no mechanism to avoid this merging. The only options we have for configuring plugin execution behavior are:
The inherited configuration entry:
true or false, whether or not this plugin configuration should apply to POMs which inherit from this one. Default value is true.
The combine.children and combine.self to
control how child POMs inherit configuration from parent POMs by adding attributes to the children of the configuration element.
None of these options helps here. We would need a kind of merge attribute on the execution element, or a different default behavior (that is, Maven should respect the definition order).
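For completeness, a minimal illustration of those combine attributes in a child POM (purely illustrative; as noted, they affect parent/child configuration merging, not execution ordering within a phase):

<plugin>
    <artifactId>maven-antrun-plugin</artifactId>
    <configuration combine.self="override">
        <!-- replaces the parent's configuration instead of merging with it -->
        <target>
            <echo message="child only" />
        </target>
    </configuration>
</plugin>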
Invoking the single executions from command line as below:
mvn antrun:run@print-hello exec:exec@exec-echo antrun:run@print-hello-2 -q
We would instead have the desired output:
main:
[echo] hello there!
hello-from-exec
main:
[echo] hello there 2!
But in this case:
We are not attached to any phase
We are invoking specific executions (and their configurations) directly from the command line (via a feature only available since Maven 3.3.1)
You can achieve exactly the same via scripting or via the exec-maven-plugin invoking Maven itself, but - again - the same would apply: no phase involved, only a sequence of executions.
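A hedged sketch of that last option, assuming mvn is available on the PATH (the nested build runs only the named executions, so the desired order is preserved, but again outside any lifecycle phase of the inner build):

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.5.0</version>
    <executions>
        <execution>
            <id>run-ordered-executions</id>
            <phase>validate</phase>
            <goals>
                <goal>exec</goal>
            </goals>
            <configuration>
                <executable>mvn</executable>
                <arguments>
                    <!-- invokes the three executions of the sample POM in the desired order -->
                    <argument>antrun:run@print-hello</argument>
                    <argument>exec:exec@exec-echo</argument>
                    <argument>antrun:run@print-hello-2</argument>
                    <argument>-q</argument>
                </arguments>
            </configuration>
        </execution>
    </executions>
</plugin>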

Maven: Replace token in source file before compilation

I want to replace a token #NAME# in a source file (in my case *.java) before compilation.
I tried to use the Google Code replacer plugin, but I am open to anything that will help me.
1. pom.xml
The pom file looks like this:
<plugin>
    <groupId>com.google.code.maven-replacer-plugin</groupId>
    <artifactId>replacer</artifactId>
    <version>1.5.3</version>
    <executions>
        <execution>
            <phase>generate-sources</phase>
            <goals>
                <goal>replace</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <includes>
            <include>src/main/java/com/test/sample/File.java</include>
        </includes>
        <replacements>
            <replacement>
                <token>#NAME#</token>
                <value>New content</value>
            </replacement>
        </replacements>
    </configuration>
</plugin>
But after I run mvn package the output is:
--- replacer:1.5.3:replace (default) @ MyProject ---
[INFO] Replacement run on 0 file.
Because there is no error I do not know what I have done wrong.
Maybe:
Defined phase is wrong
Defined include is wrong
...
Greetings!
I think there are two options.
If you keep using the plugin I think you need to add the ${basedir} to the include statement:
<include>${basedir}/src/main/java/com/test/sample/File.java</include>
If you don't want to modify the file in src/main, but rather filter it and add the filtered copy to the build, you can use standard resource filtering and the build-helper plugin to add those "generated sources" to the build.
So step one would be using resource filtering to copy the file: http://maven.apache.org/plugins/maven-resources-plugin/examples/filter.html
And then use the build-helper-maven-plugin (http://www.mojohaus.org/build-helper-maven-plugin/) to add those sources to the build.
Some IDEs (IntelliJ) will recognize /target/generated-sources automatically if you keep using that folder (it's not standard but very common). If you search for "maven" and "generated-sources" you will find quite a few tutorials.
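A hedged sketch of the build-helper part (the source path is an assumption; use whatever folder your filtered copy actually lands in):

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <version>3.2.0</version>
    <executions>
        <execution>
            <id>add-filtered-sources</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <!-- assumed output folder of the filtering step -->
                    <source>${project.build.directory}/generated-sources</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>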
Hope this helps :)
While this is something you usually should not do in the first place, sometimes you have no choice (in my case it was "converting" an old project to Maven while changing as little of the code as possible). The above somehow did not work for me: while I could replace a placeholder in the source file and add the generated-sources folder to be compiled, it complained about duplicate source files.
Then I found an easier way by using the templating-maven-plugin as described here http://www.mojohaus.org/templating-maven-plugin/examples/source-filtering.html:
Put the file with the placeholder in the folder /src/main/java-templates. Excerpt from my source code:
public static final String APPLICATION_VERSION = "r${project.version}";
Add the following to your pom's plugins section:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>templating-maven-plugin</artifactId>
    <version>1.0.0</version>
    <executions>
        <execution>
            <id>filter-src</id>
            <goals>
                <goal>filter-sources</goal>
            </goals>
        </execution>
    </executions>
</plugin>
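If I remember the plugin's defaults correctly (treat the paths below as assumptions rather than documented fact), the template and output folders can also be spelled out explicitly in the execution's configuration:

<configuration>
    <!-- assumed defaults of the filter-sources goal -->
    <sourceDirectory>${basedir}/src/main/java-templates</sourceDirectory>
    <outputDirectory>${project.build.directory}/generated-sources/java-templates</outputDirectory>
</configuration>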

Using property file in maven

I don't quite understand how it can be used. There is a property defined in a file. I try to use the properties-maven-plugin to read it and save it. The property is used in the liquibase plugin:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>properties-maven-plugin</artifactId>
    <version>1.0-alpha-1</version>
    <executions>
        <execution>
            <phase>initialize</phase>
            <goals>
                <goal>read-project-properties</goal>
            </goals>
            <configuration>
                <files>
                    <file>src/main/resources/properties/app.properties</file>
                </files>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>2.0.5</version>
    <configuration>
        <propertyFile>src/main/resources/db/config/${env}-data-access.properties</propertyFile>
        <changeLogFile>src/main/resources/db/changelog/db.changelog-master.xml</changeLogFile>
        <migrationSqlOutputFile>src/main/resources/db/gen/migrate.sql</migrationSqlOutputFile>
        <!--<logging>debug</logging>-->
        <logging>info</logging>
        <promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
        <!--<verbose>false</verbose>-->
        <dropFirst>true</dropFirst>
    </configuration>
</plugin>
According to the documentation, in order to read the property and save it I have to run mvn properties:read-project-properties. But I'm getting the following error in that case:
[ERROR] Failed to execute goal org.codehaus.mojo:properties-maven-plugin:1.0-alpha-2:read-project-properties (default-cli) on project SpringWebFlow:
The parameters 'files' for goal org.codehaus.mojo:properties-maven-plugin:1.0-alpha-2:read-project-properties are missing or invalid -> [Help 1]
I've changed pom.xml, removed the <execution> section and moved the <configuration> section:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>properties-maven-plugin</artifactId>
    <version>1.0-alpha-1</version>
    <configuration>
        <files>
            <file>src/main/resources/properties/app.properties</file>
        </files>
    </configuration>
</plugin>
OK, now when I run mvn properties:read-project-properties the error disappears. But where is the property saved in this case? Because when I run the following Maven goal:
mvn liquibase:update
I can see that the ${env} property is not defined. Liquibase tries to use the literal src/main/resources/db/config/${env}-data-access.properties file.
What am I doing wrong? How do I read a property from the file so that it is accessible from different Maven plugins?
The problem is that "mvn liquibase:update" is a direct plugin goal invocation and is not part of the Maven lifecycle. So the build never passes through the initialize phase, and the properties plugin is not executed.
The following will work
mvn initialize liquibase:update
One solution would be to bind liquibase:update to one of the Maven lifecycle phases like compile, package, etc., but then it would be executed on every build.
Or you use the exec-maven-plugin to call "initialize liquibase:update" from Maven. Or you create a profile where you bind liquibase:update to the lifecycle phase initialize, so the update is executed when you call
mvn initialize -Pliquibase
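A hedged sketch of that profile approach (the execution id is my choice, not prescribed anywhere):

<profiles>
    <profile>
        <id>liquibase</id>
        <build>
            <plugins>
                <plugin>
                    <groupId>org.liquibase</groupId>
                    <artifactId>liquibase-maven-plugin</artifactId>
                    <version>2.0.5</version>
                    <executions>
                        <execution>
                            <id>update-db</id>
                            <!-- bound to initialize so it runs together with the properties plugin above -->
                            <phase>initialize</phase>
                            <goals>
                                <goal>update</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>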
I do not know a better solution to this problem, and I could not find a more suitable one.
For reference:
Maven lifecycle
