Maven dependency based on argument - java

I have a Maven project with a number of modules. When building, I pass an argument that determines which directory config files and such are copied from, depending on the environment the build will run in, i.e. UAT, DEV, TEST, etc. I do not want to use profiles. Now, I want to package all integration tests into a separate jar that can be executed from the command line as well as in the integration-test phase. Basically there will be only one test class with one method that does something like
Class.forName("...").getMethod("main", String[].class).invoke(null, (Object) args);
The only problem is that, since I do not want to use profiles, I'd have to add/remove the dependency on the test-suite jar depending on whether I want to run integration tests or not. I would like to do something like
mvn clean install -Denv=IT
and let it be. Is there a way to do so?

The standard mechanism for running different kinds of builds in Maven is to use profiles (Maven is a highly opinionated build framework, so you are forced to play by its rules).
You also appear to be building binaries to match the system you intend to deploy to. This is generally a bad idea; you are better advised to look at some mechanism that allows run-time configuration of your application (in J2EE there is JNDI, but it could be just a simple property file). This allows you to certify a single binary that is ideally pushed into a shared repository and shared between development, test and production.
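If you do go the profile route after all, a minimal sketch of what the question describes might look like the following; the artifact coordinates are hypothetical, and the profile is keyed off the same -Denv=IT argument mentioned in the question:
<profiles>
  <profile>
    <id>integration-tests</id>
    <activation>
      <property>
        <!-- activates when the build is run with -Denv=IT -->
        <name>env</name>
        <value>IT</value>
      </property>
    </activation>
    <dependencies>
      <dependency>
        <!-- hypothetical coordinates of the packaged integration-test jar -->
        <groupId>com.example</groupId>
        <artifactId>integration-tests</artifactId>
        <version>${project.version}</version>
        <scope>test</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>
With that in place, mvn clean install -Denv=IT pulls the extra test dependency in, while a plain mvn clean install leaves it out.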

Related

Acceptance test within maven structure?

My question is about the directory location for a Docker-based acceptance test.
My project is a Spring Boot-based command line application which extracts data from a table and builds a spreadsheet. It has unit tests and a JUnit-based acceptance test. The JUnit runner for the acceptance test is a standard JUnit runner, not a Spring-based runner.
Finally, I have an acceptance test structure which tests the Docker components against a dedicated DB2 instance created specifically for each test. At this point, there's a docker-compose file that:
Launches a DB2 container instance exclusively and solely for this test.
Launches a Flyway migration container to load test data.
Launches a container that runs the above-mentioned Spring Boot command line application.
After docker-compose shuts down, the generated spreadsheet is compared against an expected file. If they're byte-for-byte equivalent, the test is considered passed.
Given that the acceptance test above is heavily Docker-laden and a few steps removed from the Java side, is it still appropriate to put this test under /src/test/acceptance?
There are many approaches to this. In general, Maven has two test plugins: Surefire and Failsafe. They are very similar in terms of configuration; however, Surefire is mainly aimed at running unit tests, while Failsafe is for integration tests.
So, first off, you probably want to configure the acceptance tests with the Failsafe plugin. You will:
Run them during a different build phase (well after the unit tests run).
Skip the acceptance/Failsafe tests when the build is broken and some unit tests fail, which can save some build time.
Get different reports for integration and unit tests (technically these plugins write to different report folders, surefire-reports and failsafe-reports).
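A minimal Failsafe setup along those lines might look like this; the version is a placeholder, and by default the plugin picks up the *IT classes mentioned in the naming-convention point below:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <!-- placeholder version; use whatever your build standardises on -->
  <version>3.2.5</version>
  <executions>
    <execution>
      <goals>
        <!-- integration-test runs the tests, verify fails the build afterwards -->
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>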
Now to physically separate the tests you can:
Merely rely on the naming convention. These plugins look for tests named differently, say: SampleTest.class will be run with Surefire while SampleTestIT.class will be run with the Failsafe plugin. Of course it's all customizable at the level of the plugins' configuration in the pom file.
Usually unit tests are expected to live in the same package as the real class, conceptually. For example: if you have a class Foo in com.myorg, you place it in src/main/java/com/myorg/Foo.java, and the corresponding unit test will be in src/test/java/com/myorg/FooTest.java. For integration tests this is not usually the case, so you can simply create an it folder or something similar and have those tests run with a different plugin automatically, again because you'll name the tests differently.
Another approach is to separate the folders within the same module, as already described above (see the sketch after this list). So technically you maintain src/test/java and src/test/resources, and next to them you will have something like src/it/java and src/it/resources. Probably you'll still want to use both the Surefire and Failsafe plugins as described above, and you'll still run both types of tests in the same Maven lifecycle.
The most "radical" approach is to separate the acceptance tests into a different Maven module. This gives you the ability to run the module with the acceptance tests separately, in a different build step, which might be handy in a CI tool, for example. Of course you can achieve a similar effect with properties or with Maven profiles.
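A sketch of the separate-folder variant, using the build-helper-maven-plugin to register src/it/java as an extra test-source root (the version is a placeholder and the path is just a convention):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <!-- placeholder version -->
  <version>3.5.0</version>
  <executions>
    <execution>
      <id>add-integration-test-sources</id>
      <phase>generate-test-sources</phase>
      <goals>
        <goal>add-test-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>src/it/java</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
Failsafe then picks the *IT classes out of src/it/java, while Surefire keeps running the plain *Test classes from src/test/java.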
As described in the standard directory layout guide, https://maven.apache.org/guides/introduction/introduction-to-the-standard-directory-layout.html,
it is best to have separate directories for tests. I would suggest putting all tests under test and having a separate folder, e.g. test/acceptance/docker-test, but overall it is up to you.
A separate folder does help to run and decouple the different kinds of tests.

Patching a classpath when running Surefire tests

We are developing code in the context of a legacy Java application that heavily uses static members and system properties, expecting files in various locations on the disk. The builds are run in Maven.
We are trying to allow unit testing of our code without having to deploy, configure and start the whole application. I have managed to do this by patching a small number of classes in the framework, providing my own variants of the relevant source files in Maven's test sources in src/test/java.
As a next step I would like to make this patch re-usable by providing a JAR file that can be pulled in as a test dependency on any project that develops a part of the larger application. I would like to deploy this via our normal binary repository.
Surefire offers an option to set <additionalClasspathElements>, but according to the documentation this works only with absolute paths and will add the dependency at the end of the class path.
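For reference, that option is configured roughly like this; the path shown is hypothetical, and per the documentation it is appended to the end of the classpath rather than prepended:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>
      <!-- hypothetical absolute path; Surefire appends it after the regular test classpath -->
      <additionalClasspathElement>/opt/patches/framework-test-patch.jar</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>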
In theory ordering the project dependencies correctly could work, but I cannot find any documentation on how that order works across multiple scopes. I would need Maven to guarantee that my test dependency is loaded before the runtime ones.
What is a reliable way of patching classes for a Surefire run by using a JAR pulled via Maven's dependency resolution mechanisms?

Maven - Separating Deployment & Project

What is the 'best practice' way of separating Maven deployment configuration from the build config?
I have a war project, that is built by Jenkins. I'd like Jenkins to deploy this to Elastic Beanstalk, but alas the best solution available at the moment is to use the beanstalk-maven-plugin.
I'm not sure it makes sense for the POM.xml to include information about deployment; after all, at build time that .war could end up anywhere.
In this situation, is there some way of using Maven modules to store the beanstalk-maven-plugin config in a separate POM to that of the actual software project?
I think you have two solutions.
Just add the beanstalk-maven-plugin definition to your regular pom.xml. The configuration can be stored in a separate properties file or provided via system properties on the command line (the -D option). Add the beanstalk goal to the Maven command line in Jenkins, so each build will be deployed to Beanstalk. Alternatively, you can define yet another project in Jenkins that just runs the deployment without compilation; you can run that deployment project on a schedule or via project dependencies in Jenkins.
Create yet another Maven project that just runs the Beanstalk plugin. I personally do not see serious advantages in doing this.
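A rough sketch of the first option. The plugin coordinates are real, but the configuration parameter names below are assumptions and should be checked against the beanstalk-maven-plugin documentation:
<plugin>
  <groupId>br.com.ingenieux</groupId>
  <artifactId>beanstalk-maven-plugin</artifactId>
  <!-- placeholder version property -->
  <version>${beanstalk-maven-plugin.version}</version>
  <configuration>
    <!-- assumed parameter names; the values come from -D properties or a properties file -->
    <applicationName>${beanstalk.applicationName}</applicationName>
    <environmentName>${beanstalk.environmentName}</environmentName>
    <s3Bucket>${beanstalk.s3Bucket}</s3Bucket>
  </configuration>
</plugin>
In Jenkins, the build step would then invoke the relevant beanstalk:* goal (as named in the plugin's documentation) with the -Dbeanstalk.* properties on the command line.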
A few thoughts:
a. I'm not sure (I admit I was a bit busy trying to get 0.2.7-RC7 out), but I think Elastic Beanstalk configuration files are supported for Java.
So it could perhaps be a good idea to separate them (I admit managing config in Beanstalker is boring).
b. Another option is to use war overlays via the maven-war-plugin's overlay feature, and create a war which depends on your other war.
In my personal case, if you ask, I do have a separate deployment profile in Maven, and that feature often comes in handy.
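A minimal sketch of the overlay idea in (b): a separate deployment war module that pulls your application war in as an overlay. The coordinates here are hypothetical.
<!-- in the deployment module's pom.xml -->
<packaging>war</packaging>
<dependencies>
  <dependency>
    <!-- hypothetical coordinates of the application war -->
    <groupId>com.example</groupId>
    <artifactId>my-app</artifactId>
    <version>1.0.0</version>
    <type>war</type>
  </dependency>
</dependencies>
The maven-war-plugin treats war-type dependencies as overlays by default, so the deployment module can carry the Beanstalk plugin configuration while the application war stays untouched.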

Using maven to produce production ready output

I have a multi-module Maven project, and I created a new module that depends on 3 other modules. (I already have a web app Maven module that produces a .war file; now I need this new one.)
This module's output is a .jar, and it also has a few resources:
Spring context XML file
properties file
Now I want to produce a production ready folder so I can upload it to my server. I am hoping maven can do this for me.
I need the following layout:
myjar.jar
/libs/ (the 3 other Maven modules that are dependencies)
/resources
Also, there are some generic dependencies that my parent pom.xml has, like slf4j/log4j, that I also need to package.
It would be cool if I could add a switch to mvn that will produce this like:
mvn clean install production
I plan on running this on my server via the command line.
I think what you are looking for is a Maven Assembly:
https://maven.apache.org/plugins/maven-assembly-plugin/
You can use profiles to disable the generation of the assembly by default (can speed up the development process).
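A sketch of what that could look like for the layout in the question. The descriptor file name and the resources directory are assumptions; the jar, libs/ and resources/ entries mirror the requested output.
<!-- src/assembly/production.xml (hypothetical location) -->
<assembly>
  <id>production</id>
  <formats>
    <!-- use zip or tar.gz instead if an archive is preferred -->
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <files>
    <file>
      <!-- the module's own jar at the root of the layout -->
      <source>${project.build.directory}/${project.build.finalName}.jar</source>
      <outputDirectory>/</outputDirectory>
    </file>
  </files>
  <dependencySets>
    <dependencySet>
      <!-- the 3 sibling modules plus slf4j/log4j etc. end up here -->
      <outputDirectory>libs</outputDirectory>
      <useProjectArtifact>false</useProjectArtifact>
    </dependencySet>
  </dependencySets>
  <fileSets>
    <fileSet>
      <directory>src/main/resources</directory>
      <outputDirectory>resources</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>
<!-- and in the module's pom.xml -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptors>
      <descriptor>src/assembly/production.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <id>production-layout</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>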
#puce is right in that you may be best off using the Assembly Plugin. What you can't do easily is add another lifecycle 'production' to Maven. If you have time you could write a plugin to do this, but you might be better off using a profile called 'production' or 'prod-deploy' to enable the copying into place on the server.
mvn clean install -Pprod-deploy
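A sketch of such a profile, reusing the assembly configuration from the previous answer, so the production layout is only built when -Pprod-deploy is given:
<profiles>
  <profile>
    <id>prod-deploy</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <executions>
            <execution>
              <id>production-layout</id>
              <phase>package</phase>
              <goals>
                <goal>single</goal>
              </goals>
              <configuration>
                <descriptors>
                  <descriptor>src/assembly/production.xml</descriptor>
                </descriptors>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>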
One thing to remember with Maven is that it is very good at building projects using its conventions, but it is pretty bad at scripting things to happen outside of the build lifecycle.
I have on several occasions used external scripting tools such as Ant, Python, Bash and Groovy to first run the build using mvn and then script the deployment in a more natural language.
The intention of Maven is building, not deployment to production. For that purpose I would recommend tools like Chef or Puppet. From a technical point of view it is of course possible to handle such things via Maven. It is also possible to build on a CI solution like Jenkins, and furthermore to run a script from Jenkins to do the deployment to production.

Why do I need Maven if I use Eclipse?

I have seen that if I right-click on a project in Eclipse and choose to run it on a server, I can see output, which means the project is running.
If everything is working fine without Maven, what's the point of using it? How is it different from simply running it via Eclipse?
Maven is a build tool (a build manager, in fact), similar to Ant. The main job of any build tool is to configure the project, compile it against the required projects, and do the final packaging. A build script in your project gives a blueprint of the project's deliverable structure. This frees you from any dependency on a specific IDE like Eclipse: all you need to know is the standard command to perform the build, and you can build your code almost anywhere.
Now, back to your question: why not just do it in Eclipse?
For a simple project and a small team, Maven is overkill. You can easily communicate the configuration and the IDE to use, and describe any special steps to be taken. In big projects, however, there exist lots of loosely coupled dependencies. To start with, there will be different settings for the developer-machine build, the test build and the production build. There are requirements to run automated tests and integration tests, store the build package (artifact) in a commonly accessible repository, and update versions of various modules.
Obviously, if all the steps mentioned above are done manually, there is a good chance of missing a step. Moreover, the manual process is time-consuming.
Ideally, you should prefer the tool that fits you best. If you think you can achieve what you require without Maven, it makes sense not to use Maven (or any build tool) just because everyone else uses it.
I suggest studying automated deployment; this will give you a bigger picture of all the things you can do with build tools. If you do not feel it adds any value to your current process, you probably don't need Maven or any other build tool right now.
Your question does not make much sense. Do you expect your users to access your application from Eclipse? If so, that is a very strange setup in my opinion.
Perhaps your question should be about how to build your project. Maven provides a way to centralize dependency libraries across the enterprise. It lets you automate your build process (most likely in conjunction with a CI server like Hudson, CruiseControl, etc.). It lets you automate your unit testing. Maven makes packaging an app very easy: a developer does not have to follow an arcane set of steps to package an application; you add the right plugin and Maven takes care of it as part of the build lifecycle. All of this works because of the principle of convention over configuration. There are many more benefits; I have just named a few.
Maven does not replace how you run the app; rather, it is about how you package the app, automate that process, and manage your app's dependencies.
Some links on why someone should use Maven:
Why maven ? What are the benefits?
why I use Maven
Why you should use Maven
Use Maven
