In our infrastructure, we have lots of little Java projects built by Maven2. Each project has its own pom.xml that ultimately inherits from our one company "master" parent pom.
We've recently started adding small profiles to our parent pom, disabled by default, that, when enabled, execute a single plugin in a conventional manner.
Examples:
The 'sources' profile executes the maven-source-plugin to create the jar of project sources.
The 'clover' profile executes the maven-clover2-plugin to generate the Clover report. It also embeds our Clover license file so it need not be re-specified in child projects.
The 'fitnesse' profile executes the fitnesse-maven-plugin to run the FitNesse tests associated with the project. It contains the FitNesse server host and port and other information that need not be repeated.
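For illustration, the 'sources' profile is roughly along these lines (a simplified sketch, not the verbatim pom; the plugin's jar goal binds to the package phase by default):

<profile>
    <id>sources</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-source-plugin</artifactId>
                <executions>
                    <execution>
                        <id>attach-sources</id>
                        <goals>
                            <goal>jar</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>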
This is being used to specify builds in our CI server like:
mvn test -P clover
mvn deploy site-deploy -P fitnesse,sources
and so on.
So far, this seems to provide a convenient composition of optional features.
However, are there any dangers or pitfalls in continuing on with this approach (obvious or otherwise)? Could this type of functionality be better implemented or expressed in another way?
The problem with this solution is that you may be creating a "pick and choose" model, which is a bit un-mavenesque. With the profiles you're describing you're sort of in between; if each profile produces a decent result by itself you may be OK. The moment you start requiring specific combinations of profiles, I think you're heading for trouble.
Individual developers will typically run into consistency issues because they forget which set of profiles a given scenario requires. Your mileage may vary, but we had real problems with this: half our developers forgot the "correct" combinations after only a short time and regularly wasted hours running the wrong combinations at the wrong time.
The practical problem you'll have with this is that, AFAIK, there is no way to have a "meta" profile that activates a set of sub-profiles. An umbrella profile like that would be a really neat feature; your "fitnesse" and "sources" profiles could then be private, activated only by one or more meta-profiles. (You can, however, activate a default set in settings.xml for each developer.)
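For example, a developer's settings.xml could pin a default set like this (profile ids taken from the question):

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
    <activeProfiles>
        <activeProfile>sources</activeProfile>
        <activeProfile>clover</activeProfile>
    </activeProfiles>
</settings>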
There isn't a problem with having multiple profiles in Maven; in fact, I think they are an excellent way of allowing your build to enable and disable classes of functionality. I'd recommend naming them based on their function rather than the plugin, though, and consider grouping functionally related plugins in the same profile.
As a precedent for you to follow, the Maven super POM has a "release-profile" defined, which includes configurations for the source, javadoc, and deploy plugins.
You should consider following this approach, so your "fitnesse" profile would become "integration-test", and you could choose to define additional plugins in that profile if needed at a later date. Similarly the "clover" profile could be renamed "site", and you could define additional reports in that profile, e.g. configurations for the JDepend, JXR, PMD plugins.
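As a sketch, the renamed "site" profile might group reporting plugins like so (the plugin selection here is illustrative, not prescriptive):

<profile>
    <id>site</id>
    <reporting>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jxr-plugin</artifactId>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-pmd-plugin</artifactId>
            </plugin>
        </plugins>
    </reporting>
</profile>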
You seem slightly suspicious of that approach without being quite able to say why - after all, it is quite convenient. That's how I feel about it too: I can't really pin down the reason, but it seems somewhat odd.
Let's consider these two questions:
a) what are profiles meant for?
b) what are the alternative approaches we should compare your approach with?
Regarding a), I think profiles are meant for different build or execution environments. You may depend on locally installed software, where you would use a profile to define the path to the executable in the respective environments. Or you may have profiles for different runtime configurations, such as "development", "test", "production".
More about this is found on http://maven.apache.org/guides/mini/guide-building-for-different-environments.html and http://maven.apache.org/guides/introduction/introduction-to-profiles.html.
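For contrast, a typical environment profile carries environment-specific settings rather than optional features, e.g. (ids and values made up for illustration):

<profile>
    <id>production</id>
    <properties>
        <!-- environment-specific configuration, not optional build features -->
        <db.url>jdbc:oracle:thin:@prod-db:1521:PROD</db.url>
        <log.level>WARN</log.level>
    </properties>
</profile>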
As for b), ideas that come to mind:
Triggering the plug-ins with command-line properties, such as mvn -Dfitnesse=true deploy - like the well-known -DdownloadSources=true for the Eclipse plugin, or -Dmaven.test.skip=true for Surefire.
But that requires the plugin to have a flag that triggers its execution, and not all the plug-ins you need may have one (see the sketch after this list).
Calling the goals explicitly. You can call several goals on one command line, like "mvn clean package war:exploded". When fitnesse is executed automatically (using the respective profile), it means its execution is bound to a lifecycle phase. That is, whenever that phase in the lifecycle is reached, the plugin is executed.
Rather than binding plugin executions to lifecycle phases, you should be able to include the plugin, but only execute it when it is called explicitly.
So your call would look like "mvn fitnesse:run source:jar deploy".
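Regarding the first alternative, here is a minimal sketch of the property-flag pattern - the plugin coordinates and the skip parameter are purely hypothetical, for illustration only:

<!-- in the pom: skipped by default, enable with mvn -Dexample.skip=false deploy -->
<properties>
    <example.skip>true</example.skip>
</properties>

<plugin>
    <groupId>com.example</groupId>
    <artifactId>example-maven-plugin</artifactId>
    <configuration>
        <skip>${example.skip}</skip>
    </configuration>
</plugin>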
The answer to question a) might explain the "oddness". It is just not what profiles are meant for.
Therefore, I think alternative 2 could actually be a better approach. Using profiles might become problematic when "real" profiles for different execution or build environments come into play. You would end up with a possibly confusing mixture of profiles, where profiles mean very different things (e.g. "test" would denote an environment while "fitnesse" would denote a goal).
If you just called the goals explicitly, I think that would be very clear and flexible. Remembering the plugin/goal names should not be more difficult than remembering the profile names.
Related
How can I make a Maven/Gradle package change what it exports depending on scope? Is it possible?
For example, when used like this:
<dependency>
    <groupId>org.blahblah</groupId>
    <artifactId>anything</artifactId>
    <version>5.8</version>
    <scope>test</scope>
</dependency>
it would provide different binaries than when used this other way:
[...]
<scope>compile</scope>
[...]
yes and no - sort of.
The general guideline is to create one artifact per pom.xml - most tools work quite nicely with this concept. As soon as you go beyond it, funky stuff sometimes happens. Changing the jar based on scope alone isn't possible, AFAIK. It would also confuse people a lot, and it would probably make troubleshooting very difficult.
But there is a workaround. Since you mentioned tests: the jar plugin allows you to export the classes in src/test/java as a test-jar and use that as a dependency by specifying a type.
See How to create a test-jar.
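Following the linked documentation, the producing project attaches the test-jar, and a consumer pulls it in via the type element (reusing the coordinates from the question):

<!-- producer: attach the classes from src/test/java as an extra test-jar artifact -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>test-jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>

<!-- consumer: depend on the test classes by specifying the type -->
<dependency>
    <groupId>org.blahblah</groupId>
    <artifactId>anything</artifactId>
    <version>5.8</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>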
I assume the same mechanism with the type can be used for other things as well.
There is also the concept of classifiers (this is usually used for sources, javadoc and things like that). See this question.
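For example, the sources artifact attached by the maven-source-plugin is addressed with a classifier (again using the question's coordinates):

<dependency>
    <groupId>org.blahblah</groupId>
    <artifactId>anything</artifactId>
    <version>5.8</version>
    <classifier>sources</classifier>
</dependency>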
While these things tend to work with maven on the command line, IDEs sometimes start to behave a bit weird if you push type and classifier usage too far.
I'm building a project using Maven and I need to ensure it works on Java SE and on different Java EE containers. There are already integration tests written for the WildFly container, and now I'm moving to Java SE.
But faced with these multiple test environments, how should I handle them in Maven? Should I use <profile>, <module>, or something else?
<profile> is useful for switching between different configurations, each with its own specific dependencies. So in my case there might be profiles wildfly-embedded, wildfly-managed, java-se, etc. But I need to ensure the project works with every profile - is it possible to run all the profiles in one command?
<module> can handle project inheritance. After reading the post SO • Why and when to create a multi-module Maven project?, I'm still confused about whether I should use it in my case.
Can somebody give me some ideas? That would be very helpful, thanks.
If you keep in mind that the resulting artifact should always be the same no matter which profiles are activated, then you should understand that a profile is not the correct solution (although it is very often abused for this - don't follow that advice!).
Configuration should live outside the artifact, so you can reuse the same artifact over and over again. Since many people ask for a proper solution with embedded configuration, Karl Heinz created https://github.com/khmarbaise/multienv-maven-plugin. This is probably the closest you'll get to a valid Maven project setup.
I'm working on source code that is split across several projects with a specifically defined build order. I want to see the projects sorted by the build order so I can always tell which classes can be used in which projects. Does anyone know how to do this in Eclipse Kepler?
If you are not already using working sets in Eclipse, they provide a good way to organize your projects. The organization is single-level rather than hierarchical, but you can group projects and then quickly select, in the project explorer view settings, whether the working sets should be shown or not. A potentially useful detail is that a project may be contained in multiple working sets, so you can have multiple grouping criteria at the same time.
In your case, you could define a working set for each phase of your build, prefixing its name with a letter or number that would ensure its presentation in a specific order. Or you could define a working set for each set of projects with the same dependencies.
Alternatively, you might be able to just rename your projects appropriately. In many cases the project name itself is mostly cosmetic, although it is often used as a default in generated files.
In my opinion, however, the easiest way to "tell which classes can be used" is to just configure your project build paths correctly and let the editor do the rest. For me it is more natural not to use a class because it is not proposed for auto-completion or because any such use results in a compiler error, rather than explicitly checking the dependencies each and every time...
IMHO there is no such feature in Eclipse. But you can use the Resource Tagger or Resource Decorator plugins and filter resources based on different conditions.
I'm developing a Maven plugin that will provide 5 goals. You can either execute goals 1-4 individually, or execute goal5, which will execute goals 1-4 in sequence. I've been looking for a way to reuse (i.e. invoke) one Maven goal from within another, but haven't found one yet.
Of course, I could just have goalX delegate to ClassX for most of its functionality; then when goal5 is invoked, it delegates to Class1...Class4. But this still involves a certain amount of code duplication in specifying, reading, and validating each goal's configuration.
Is there a way to reuse one goal within another?
Thanks,
Don
Is there a way to reuse one goal within another?
AFAIK, the Maven API doesn't offer any facility for this, because the Maven folks don't want to promote a practice that leads to strong coupling between plugins, which is considered bad. You'll find background on that in Re: calling plugin in another plugin?.
That being said, this blog post shows how you could instantiate a Mojo and use reflection to set its fields before calling execute.
You might also want to check the mojo-executor library.
But be sure to read the mentioned thread, I think it's important.
Of course, I could just have goalX delegate to ClassX for most of its functionality; then when goal5 is invoked, it delegates to Class1...Class4. But this still involves a certain amount of code duplication in specifying, reading, and validating each goal's configuration.
So then why not provide a common class for your other classes for the purpose of goal validation? I think the easiest thing to do here is to have one goal invoke the other in your code.
The "Maven mindset" appears to be that configuration is the responsibility of the pom.xml author, not the Mojo implementor. If you move all your configuration and such into a common base class, you end up bypassing this mechanism.
It kind of sounds like what you want are sub-projects: each of your goals 1-4 lives in its own project, or you can run goal 5, which runs them all. Perhaps this might help?: http://i-proving.com/space/Technologies/Maven/Maven+Recipes/Split+Your+Project+Into+Sub-Projects
If your source trees don't split nicely along project lines, you might be able to do something with profiles (though I haven't tried this). Check out the accepted answer here: How to bind a plugin goal to another plugin goal.
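If the goals support explicit phase bindings, a rough sketch of that profile route might look like this - the plugin coordinates and goal names are hypothetical stand-ins for your goals 1-4:

<profile>
    <id>run-all</id>
    <build>
        <plugins>
            <plugin>
                <groupId>com.example</groupId>
                <artifactId>mytool-maven-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>verify</phase>
                        <!-- goals bound to the same execution run in declared order -->
                        <goals>
                            <goal>goal1</goal>
                            <goal>goal2</goal>
                            <goal>goal3</goal>
                            <goal>goal4</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>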
I want to run my unit tests automatically when I save my Eclipse project. The project is built automatically whenever I save a file, so I think this should be possible in some way.
How do I do it? Is the only option really to get an ant script and change the project build to use the ant script with targets build and compile?
Update: I will try two different approaches now:
Running an additional builder for my project that executes the ant target test (I have an ant script anyway)
ct-eclipse, recommended by Thorbjørn
It is surely unwise to run all the tests: we may have, for example, 20,000 tests while a change affects only, say, 50 of them - the tests for the class we changed plus the tests for the classes that collaborate with it.
There is a useful plugin called Infinitest (http://improvingworks.com/products/infinitest/) which runs only the tests related to the class you've just changed, right after you save. It also integrates quite nicely with the editor (using annotations) and with the problem view, displaying failing tests like compiler errors.
Right-click on your project > Properties > Builders > New, and add your Ant builder there.
But, in my opinion, it is unwise to run the unit tests on each save.
See if Eclipse has a plugin for Infinitest.
I'd also consider TestNG as an alternative to JUnit. It has a lot of features that might be helpful in partitioning your unit test classes into shorter and longer running groups.
I believe you are looking for http://ct-eclipse.tigris.org/
I've experimented with the concept earlier, and my personal conclusion was that for this to be useful you need a lot of tests, which take time to run. Personally I save very frequently, so the tests would run constantly, and I didn't find it to be an advantage. It might be different for you.
Instead we bit the bullet and set up a "build server" which watches our CVS repository and builds projects as they change. If the compilation fails or the tests fail we are notified quickly so we can remedy it.
It is as always a matter of taste what works for you. This is what I've found.
I would recommend Infinitest for the described situation. Infinitest is nowadays a GPL v3 licensed product. Eclipse update site: http://infinitest.github.com
Then you should use Infinitest. Infinitest helps you do continuous testing.
Whenever you make a change, Infinitest runs tests for you.
It selects tests intelligently, and only runs the ones you need. It reports unit test failures like compiler errors, and provides additional information that helps you write better tests.