I have a Java web project that we deploy on the servers of two different customers; 99% of the code is the same. Right now I have two ApplicationBuilders, which is the class that contains the customization for each customer.
Any time I want to deploy a new version I have to manually comment one line, build (with Maven), then uncomment that line, comment the other one and build again.
public class ApplicationBuilderFactory {

    private static final IApplicationBuilder app;

    static {
        // app = new Customer1ApplicationBuilder();
        app = new Customer2ApplicationBuilder();
    }

    public static IApplicationBuilder get() { return app; }
}
I want to avoid all this; the best solution would probably be to just create two different WARs.
What's a good way to do this? I don't use (nor like) dependency injection frameworks and it seems overkill to add one just for a single class, but I may consider it.
One way to approach this is to use the Maven WAR Plugin Overlays feature.
Instead of trying to build multiple artifacts from one project (which can become unwieldy after a while), you create one base WAR project, and then a separate WAR project for each customer that only contains the components that need to be different.
Each customer specific WAR will be overlaid with the base WAR. This will make it easier to customise not only the ApplicationBuilderFactory but also specific web content and assets.
This also has the following benefits:
customer-specific features are guaranteed to be isolated from each other;
different customers can have their own release cycle and source control repository;
it's easy to add subsequent customers.
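For illustration, here is a rough sketch of what a customer-specific POM could look like, assuming a hypothetical base artifact named base-webapp (the coordinates are placeholders, not from the question). Declaring the base WAR as a dependency of type war is enough for the maven-war-plugin to overlay it underneath the customer project's own files:
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>customer1-webapp</artifactId>
  <version>1.0.0</version>
  <packaging>war</packaging>

  <dependencies>
    <!-- The base WAR is overlaid under this project's own files, so only the
         customer-specific classes and web content live in this module. -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>base-webapp</artifactId>
      <version>1.0.0</version>
      <type>war</type>
    </dependency>
  </dependencies>
</project>
Any file present in both projects (for example the ApplicationBuilderFactory) is taken from the customer project, since the current build takes precedence over the overlay, and that is what makes the per-customer override work.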
Create 2 different Maven profiles, one for each customer, that copy a version of the ApplicationBuilderFactory class to the right directory before the compile phase.
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<id>copy-files</id>
<phase>generate-sources</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<target name="copy files">
<copy file="${project.build.sourceDirectory}/pkg/ApplicationBuilderFactory.java.${extension}" tofile="${project.build.sourceDirectory}/pkg/ApplicationBuilderFactory.java" />
</target>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<id>customer1</id>
<properties>
<extension>customer1</extension>
</properties>
</profile>
<profile>
<id>customer2</id>
<properties>
<extension>customer2</extension>
</properties>
</profile>
</profiles>
Instead of having only one src/main/java/pkg/ApplicationBuilderFactory.java, we have:
src/main/java/pkg/ApplicationBuilderFactory.java.customer1
src/main/java/pkg/ApplicationBuilderFactory.java.customer2
So before compiling the Java code, we copy one of these versions to src/main/java/pkg/ApplicationBuilderFactory.java.
Then generate the 2 different WARs using the 2 different profiles.
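For example, with the profile ids above:
mvn clean package -Pcustomer1
mvn clean package -Pcustomer2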
In Spring Boot, you can do the following:
src/main/resources/META-INF/spring.factories
# Auto Configure
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.AConfiguration,\
org.springframework.boot.autoconfigure.admin.BConfiguration,\
org.springframework.boot.autoconfigure.admin.CConfiguration,\
org.springframework.boot.autoconfigure.admin.DConfiguration,\
org.springframework.boot.autoconfigure.admin.EConfiguration,\
org.springframework.boot.autoconfigure.admin.FConfiguration,\
This is very nice. However, after a year of development the list of auto-configurations is now > 15 lines, which makes it hard to manage.
I would like to know if it is possible to separate the spring.factories into multiple files? Preferably I would like to keep the whole project in one JAR.
Or maybe there is another way to organize the EnableAutoConfiguration entries that I am not aware of?
Thanks in advance!
While using spring-boot we use multiple "starters", each with an auto-configuration and spring.factories file.
So, one way could be to split your project into modules - one for each auto-configuration, define a dedicated spring.factories file in the module, and import all the modules as a runtime dependency in the main application module.
You can use maven or gradle to manage the multi-module project and the dependencies among them:
Gradle: https://guides.gradle.org/creating-multi-project-builds/
Maven: https://www.baeldung.com/maven-multi-module
Example:
root
moduleA
src/main/resources/META-INF/spring.factories
moduleB
src/main/resources/META-INF/spring.factories
and so on...
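Each module's spring.factories then carries only that module's entry, for example (class name reused from the question):
# moduleA: src/main/resources/META-INF/spring.factories
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.AConfiguration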
I have found a solution for this question.
Note: This exact solution assumes that you only use EnableAutoConfiguration in your spring.factories; it would break if you use more than one type of config inside spring.factories.
One can do:
src/main/resources/META-INF/spring.factories
src/main/resources/META-INF/spring-2.factories
src/main/resources/META-INF/spring-3.factories
src/main/resources/META-INF/spring-4.factories
and merge these into one file at build time.
Note, I am using Maven Antrun but I suspect Gradle would also have a similar feature.
In your pom.xml, add the following:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.8</version>
<executions>
<execution>
<id>default-ci</id>
<goals>
<goal>run</goal>
</goals>
<phase>process-resources</phase>
<configuration>
<target>
<replace token='org.springframework.boot.autoconfigure.EnableAutoConfiguration=' value=','
dir="${project.build.directory}/classes/META-INF">
<include name="spring-*.factories"/>
</replace>
<concat destfile="${project.build.directory}/classes/META-INF/spring.factories" overwrite="yes" append="yes">
<fileset dir="${project.build.directory}/classes/META-INF" includes="spring-*.factories" />
</concat>
</target>
</configuration>
</execution>
</executions>
</plugin>
spring.factories contains the normal config:
# Auto Configure
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.AConfiguration,\
org.springframework.boot.autoconfigure.admin.BConfiguration
In spring-2.factories and the others you write the same header line; the replace task above turns it into a leading ,\ before the files are concatenated:
spring-2.factories:
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.CConfiguration
spring-3.factories:
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.DConfiguration
After all that, the resulting spring.factories in your output classes directory will be a very nice:
# Auto Configure
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.AConfiguration,\
org.springframework.boot.autoconfigure.admin.BConfiguration,\
org.springframework.boot.autoconfigure.admin.CConfiguration,\
org.springframework.boot.autoconfigure.admin.DConfiguration
I have packaged a number of composite components in a JAR. However, when using them in another project (using Maven), the Netbeans editor puts red error lines under the lines which use the composite components, even though the project compiles and runs as expected.
The folder structure for the composite component JAR looks like:
compositeComponent.jar
META-INF
faces-config.xml
highcharts-taglib.xml
MANIFEST.MF
web.xml
maven
// maven stuff.
resources
highcharts
Chart.xhtml
Series.xhtml
Tooltip.xml
nz
co
kevindoran
highcharts
example
NZPopulationTrend.class
The highcharts-taglib.xml looks like:
<facelet-taglib version="2.0" xmlns="http://java.sun.com/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-facelettaglibrary_2_0.xsd">
<namespace>http://nz.co.kevindoran/highcharts-jsf</namespace>
<composite-library-name>highcharts</composite-library-name>
</facelet-taglib>
[Side note: The faces-config.xml and web.xml are present to allow the 'JAR' to be deployed as a WAR by changing the file extension to WAR (this is done to run the examples).]
In my current project, I specify a Maven dependency on the above project like so:
<dependency>
<groupId>nz.co.kevindoran</groupId>
<artifactId>jsf-menu</artifactId>
<version>1.0-SNAPSHOT</version>
</dependency>
In a JSF page, I use one of the composite components like so:
<html xmlns="http://www.w3.org/1999/xhtml"
xmlns:hc="http://nz.co.kevindoran/highcharts-jsf">
....
<hc:TimeChart title="Price Over Time" xLabel="Date" yLabel="Sold Price (NZD)">
<hc:TimeSeries name="Sold" series="#{cc.attrs.model.priceVsTimeChart.soldSeries}"/>
</hc:TimeChart>
....
</html>
Red error lines appear under all lines above, with message: "No library found for namespace http://nz.co.kevindoran/highcharts-jsf"
How do I get these error lines to be removed? I have seen many Netbeans bug reports for similar issues, but all seem resolved.
This error occurs on Netbeans 7.1, 7.2 and 7.3 (including 7.3.1).
I have exactly the same problem. In my case it depends on the /src/main/java folder. If it exists (even only in the project and not in the jar), the project which includes this library shows the "No library found for namespace..." error.
When I remove the "java" folder it works, but then my backing bean class is missing from the jar...
Tried with Netbeans 7.2 and 7.3, Maven 2.
Solution:
Generate a second project which contains the Java source files (called jsf-lib-java here).
In the jsf-lib project (your composite component project with the xhtml files), delete the "java" folder and all *.java sources.
Add the following configuration to the jsf-lib pom.xml:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>unpack</id>
<phase>generate-resources</phase>
<goals>
<goal>unpack</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>com.mycompany.project</groupId>
<artifactId>jsf-lib-java</artifactId>
<version>1.0-SNAPSHOT</version>
<type>jar</type>
<overWrite>true</overWrite>
<outputDirectory>src/main/</outputDirectory>
<includes>**/*.class</includes>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
That's it. This will generate a "good" jar file with the required *.class files, so it's possible to "trick" Netbeans.
I work with this solution now. It's a hack, but I haven't found a better one.
I want to use Maven to handle artifact generation for the different local and testing regions. I believe I can use different profiles but I am not certain.
In Maven, can I select different directories of files to use at packaging time (such as application.properties)? How would I set that up?
An idea of what I want is to have the following folders for resources in my project:
local
build server
dev
sys
prod
Each folder should contain a different version of application.resources, which is a Spring file that can be used to handle hard-coded strings as variables. Our developers also work on different operating systems, so for local builds I want this to be seamless across different OSes as well.
Key outcomes would be:
Control Maven lifecycle phases from inside the IDE (IntelliJ)
Not complicate phases and team processes
Keep things as consistent for each developer
Make the different configurations per developer/region invisible when running a phase (e.g. install)
Ideally I would have my project set up according to best practices (Duvall, Matyas, Glover).
We currently provide different properties, but not by way of different folders. We do this via a mix of:
Spring's PropertyPlaceholderConfigurer
Maven profiles (something we use to build our Dev environment),
Build Server (TeamCity in our case)
Maven phases to produce the correct artifact
start-up and build arguments
My understanding of what we do is limited, but hopefully this serves as a useful example for others and maybe myself to consider.
We provide parameters, as you'll see below, to point to different property files.
Each property file has configuration for a region/environment. I'll explain the current use as best I can, in case it is of use to others.
To use Maven profiles we have created a profile in our pom identified as development, which includes a region configuration property called env. I don't yet know entirely how that is being used in our project; however, you'll see below that our POM includes a Maven Compiler plugin and a Maven Tomcat plugin.
Day to day, as developers, we run our applications locally on Tomcat from within IntelliJ and provide the env property. On start-up the env property is provided as an argument and set to classpath*:dev-common.properties.
This file is a properties configuration file, setting placeholder values for our different regions. The value of env is made available to our PropertyPlaceholderConfigurer.
Example 1 - Implementation of Maven profile in pom.xml:
The implementation of a profile in our pom is:
<profile>
<id>development</id>
<activation>
<property>
<name>env</name>
<value>development</value>
</property>
</activation>
<build>
<pluginManagement>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.tomcat.maven</groupId>
<artifactId>tomcat7-maven-plugin</artifactId>
<version>2.0-SNAPSHOT</version>
...
</plugin>
</plugins>
</pluginManagement>
</build>
</profile>
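For reference, a profile with this kind of activation block is switched on by passing the matching property on the command line, for example:
mvn clean install -Denv=development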
Example 2 - Property placeholder configurer for normal build:
We also make use of a Spring component, a PropertyPlaceholderConfigurer. We use this in collaboration with a build argument to set up a classpath pointer to resource files.
<bean id="propertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations">
<list>
<value>
${env}
</value>
</list>
</property>
</bean>
Example 3 - Property placeholder configurer for test:
We have Spring Contexts specifically set up for integration testing which also use the PropertyPlaceholderConfigurer. These are picked up by an integration testing class using a combination of @ContextConfiguration(locations = {"classpath:test-dataexchange-application-context.xml"}) and @RunWith(SpringJUnit4ClassRunner.class).
In the testing context we configure the PropertyPlaceholderConfigurer as follows to pick up the properties of an integration testing region:
<bean id="testpropertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations">
<list>
<value>classpath*:dev-local-common.properties</value>
</list>
</property>
</bean>
Other notes:
We make use of TeamCity for build management, but I have not seen how these settings are used there, if they are at all. I can imagine the above approaches could be combined to aid Continuous Integration and Delivery.
I do not see where the profile identified as development is being used. It is something I must follow up on with my fellow team members.
Resources:
Building for different environments at the Maven Project site.
Maven 3 does not allow configuration of a profile outside of a pom or settings.xml (the Maven configuration file), and says that users who used these external settings should now put them inside settings.xml.
If you are using Spring Boot, there is an easy way of doing this.
Create two profiles in Maven, and set a property in each profile with the name of the Spring profile you want to activate.
<profile>
<id>dev</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<properties>
<!-- Default spring profile to use -->
<spring.profiles.active>dev</spring.profiles.active>
<!-- Default environment -->
<environment>develop</environment>
</properties>
</profile>
Inside your application.properties, add this property:
spring.profiles.active=${spring.profiles.active}
Create an application.properties file for each profile, using the pattern application-<profile>.properties. For example:
application-dev.properties
application-prod.properties
Be sure to activate filtering in the resources plugin:
...
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
...
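The Spring profile baked into the artifact then follows the Maven profile used for the build, for example (assuming a prod profile is defined the same way as dev above):
mvn clean package          # dev is active by default
mvn clean package -Pprod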
Another way is to create a file called activeprofile.properties during the Maven build. Spring Boot reads this file to load the active profile. You can create the file as follows:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>prepare-package</phase>
<configuration>
<target>
<echo message="spring.profiles.active=${spring.profiles.active}" file="target/classes/config/activeprofile.properties" />
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<configuration>
</configuration>
</plugin>
Aim to generate an artifact for each environment in one go on the central server (CI/build server); aim to generate an artifact and start/test the application with one click locally; provide a consistent, easy-to-learn way to check out and run your build, and to check in and configure your CI.
You can use profiles in Maven and utilize Maven targets to achieve the right build using a build server which in our case is TeamCity.
Use a PropertyPlaceholderConfigurer in the Spring context with an application resources file for each region and a filename mask, e.g. application-resources-${region}.
I have a maven plugin that should run in the compile phase, so in the project that consumes my plugin, I have to do something like this:
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>my-goal</goal>
</goals>
</execution>
</executions>
What I need is for my-goal to be attached to the compile phase by default when the user has included my plugin (ideally the above part wouldn't be necessary, just the plugin declaration).
Is this possible?
Put an @phase annotation in your Mojo classdef annotations.
The doc says:
@phase <phaseName>
This annotation specifies the default phase for this goal. If you add an execution for this goal to a pom.xml and do not specify the phase, Maven will bind the goal to the phase specified in this annotation by default.
If this doesn't work, I guess a JIRA is warranted.
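A minimal sketch of a Mojo carrying that annotation (the class and goal names here are made up for illustration); with the newer maven-plugin-annotations the equivalent is @Mojo(name = "my-goal", defaultPhase = LifecyclePhase.COMPILE):
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;

/**
 * Example goal that binds itself to the compile phase by default.
 *
 * @goal my-goal
 * @phase compile
 */
public class MyGoalMojo extends AbstractMojo {

    @Override
    public void execute() throws MojoExecutionException {
        getLog().info("my-goal bound to the compile phase by default");
    }
}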
Create src/main/resources/META-INF/plexus/components.xml in your plugin.
In there create a lifecycle mapping for the artifact types that you want your Mojo to support. Make sure that it lists all the phases and plugins you want to support. Probably best to copy from the one in maven-core.jar.
Then add your plugin in to the appropriate LifeCycle(s) at the phase at which you want them built.
E.g. the consume-aar Mojo added into the compile phase of the aar lifecycle:
<!-- Android archive (aar) support -->
<component>
<role>org.apache.maven.lifecycle.mapping.LifecycleMapping</role>
<role-hint>aar</role-hint>
<implementation>
org.apache.maven.lifecycle.mapping.DefaultLifecycleMapping
</implementation>
<configuration>
<phases>
<generate-sources>
com.jayway.maven.plugins.android.generation2:android-maven-plugin:generate-sources
</generate-sources>
<process-resources>org.apache.maven.plugins:maven-resources-plugin:resources</process-resources>
<compile>
com.jayway.maven.plugins.android.generation2:android-maven-plugin:consume-aar,
org.apache.maven.plugins:maven-compiler-plugin:compile
</compile>
...
</phases>
</configuration>
</component>
This is possible, but it is an undocumented maven feature.
Use this components.xml:
<component-set>
<components>
<component>
<role>org.apache.maven.lifecycle.Lifecycle</role>
<implementation>org.apache.maven.lifecycle.Lifecycle</implementation>
<role-hint>myplugin</role-hint>
<configuration>
<id>accurest</id>
<phases>
<phase>my-plugin-not-used-phase</phase>
</phases>
<default-phases>
<compile>
my.package:my-plugin:${project.version}:my-goal
</compile>
</default-phases>
</configuration>
</component>
</components>
but your plugin needs to be added with <extensions>true</extensions> to modify the existing lifecycle.
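In the consuming project that declaration looks roughly like this (the coordinates mirror the my.package:my-plugin placeholder above; the version is made up):
<build>
  <plugins>
    <plugin>
      <groupId>my.package</groupId>
      <artifactId>my-plugin</artifactId>
      <version>1.0.0</version>
      <!-- required so the plugin's components.xml can extend the lifecycle -->
      <extensions>true</extensions>
    </plugin>
  </plugins>
</build>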
More: How to bind plugin mojos (goals) to few phases of default lifecycle?
Real project: https://github.com/spring-cloud/spring-cloud-contract/blob/master/spring-cloud-contract-tools/spring-cloud-contract-maven-plugin/src/main/resources/META-INF/plexus/components.xml
You associate a plugin with a Maven lifecycle phase; the plugin configuration is declared in the build section and bound to a phase.
For example, if you want to run some plugin during the build, you'll need to do something like this:
<project>
...
...
<build>
<plugins>
<plugin>
**Configuration of plugin**
</plugin>
</plugins>
</build>
</project>
Please read carefully about maven lifecycles here (it is fundamental for understanding of maven):
http://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
And then read about how to configure a plugin : http://maven.apache.org/guides/mini/guide-configuring-plugins.html
P.S. Getting into the logic of Maven is not easy at the beginning, but it is rewarding afterwards.
We are using ANT for our build process and don't have plans to change this in the near future.
Is it possible to use Maven to just fetch common Open Source jar files (e.g. Log4J, SWT, JFace) and put them in the right location of our project, so we don't have to store them in our version control, preferably without creating the typical Maven cache in the home directory?
NO NO NO Everyone!
If you're using Ant, the best way to use Maven repositories to download jar dependencies is to use Ivy with Ant. That's exactly what Ivy is for.
Installing Ivy and getting it to work with existing Ant projects is simple to do. It works with Nexus and Artifactory if you use those as your local Maven repositories.
Take a look at Ivy. It is probably exactly what you want.
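As a rough illustration of the setup (the module name and the Log4J coordinates below are just examples): an ivy.xml next to your build.xml declares the jars, and an <ivy:retrieve> target in the Ant build pulls them from the Maven repository into a lib folder.
<!-- ivy.xml -->
<ivy-module version="2.0">
    <info organisation="com.example" module="myproject"/>
    <dependencies>
        <dependency org="log4j" name="log4j" rev="1.2.17"/>
    </dependencies>
</ivy-module>

<!-- build.xml fragment -->
<project name="myproject" xmlns:ivy="antlib:org.apache.ivy.ant">
    <target name="resolve">
        <!-- resolves ivy.xml and copies the resolved jars into lib/ -->
        <ivy:retrieve pattern="lib/[artifact]-[revision].[ext]"/>
    </target>
</project>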
As a variation on org.life.java's answer, I would not do mvn install.
Instead, in the pom.xml I would add the following bit:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.1</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Now you just need to do mvn generate-sources, which is a lot faster than the full mvn install, and all dependencies will be copied to the specified directory.
Oh btw, isn't that what Apache Ivy is about? Extending Ant to understand Maven's dependency management?
It's possible; you should use the Maven Ant Tasks, in particular the dependencies Ant task. With this setup no Maven installation is required.
<?xml version="1.0" encoding="UTF-8"?>
<project
name="download-dependency"
basedir="."
default="download-dependency"
xmlns:artifact="antlib:org.apache.maven.artifact.ant"
>
<target name="download-dependency">
... define properties ...
<taskdef
resource="org/apache/maven/artifact/ant/antlib.xml"
uri="antlib:org.apache.maven.artifact.ant"
/>
<artifact:dependencies>
<localRepository path="${local-repo.dir}"/>
<remoteRepository id="central" url="${repository-uri}"/>
<dependency
groupId="${groupId}"
artifactId="${artifactId}"
version="${version}"
type="${type}"
classifier="${classifier}"
scope="runtime"
/>
</artifact:dependencies>
</target>
</project>
The only binary you should check into your project is maven-ant-tasks.jar.
Actually in our project I used the Sonatype Nexus (documentation) Maven repository manager to centralize access to different repositories and even maintain some binaries unique to our environment. With Nexus' help I just fetch maven-ant-tasks.jar with Ant's <get> task from a known URL. You don't have to use Nexus, but it greatly speeds up builds, because it caches binaries close to your developers' machines.
Ivy does just this when it bootstraps itself:
http://ant.apache.org/ivy/history/latest-milestone/samples/build.xml
<property name="ivy.install.version" value="2.0.0-beta1"/>
<property name="ivy.jar.dir" value="lib"/>
<property name="ivy.jar.file" value="${ivy.jar.dir}/ivy.jar"/>
<target name="resolve" unless="skip.download">
<mkdir dir="${ivy.jar.dir}"/>
<echo message="installing ivy..."/>
<get src="http://repo1.maven.org/maven2/org/apache/ivy/ivy/${ivy.install.version}/ivy-${ivy.install.version}.jar" dest="${ivy.jar.file}" usetimestamp="true"/>
</target>