My Maven 2 project consists of several sub-modules. It is structured as a parent containing EJB, WAR and JAR sub-modules. Now I want to instrument the packaged EAR that contains all the sub-modules mentioned.
Example:
Interface Maven Project consists of several modules:
- InterfacePOM --> parent pom
- InterfaceEAR --> EAR module which does not contain actual code but is the packaging for Domain, EJB and WAR
- InterfaceEJB --> EJB module
- interfaceWAR --> WAR module
- interfaceDomain --> JAR module
When using Cobertura I can successfully instrument all the various independent modules, but that generates a .ser file per module. Is there a way to instrument an entire EAR file at once, so that the result is a single .ser file which I can use?
Short answer: no (have a look at this previous answer for more details). You'll need an external tool like the dashboard plugin (actually, don't use it, see my previous answer), XRadar or Sonar to aggregate the reports. In this area, Sonar is the clear winner (this project just rocks) and I'd recommend it without any hesitation. Check out Nemo, their public instance, pick any project and look at the code coverage drill-down (for example Apache CXF) to get an idea of what it can do.
UPDATE: It appears that I missed the point of the initial question so I'm updating my answer accordingly. Basically, I now understand the question as "how to instrument an ear with cobertura" and this is indeed a totally different story.
Unfortunately, while Cobertura can instrument an ear, sar, zip, war or jar, I don't think the cobertura-maven-plugin supports this out of the box, and it may be a better option to use Cobertura's Ant task with the antrun plugin. See MCOBERTURA-86, this thread and this discussion for more background on this (and an antrun sample).
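For what it's worth, a minimal antrun sketch of that approach, bound to the package phase of the EAR module, could look roughly like the following (the Cobertura version, phase and file locations are assumptions, not taken from the linked thread):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>instrument-ear</id>
      <phase>package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- make the cobertura-instrument task available from the plugin classpath -->
          <taskdef resource="tasks.properties" classpathref="maven.plugin.classpath"/>
          <!-- instrument the packaged EAR in place, writing a single .ser file -->
          <cobertura-instrument datafile="${project.build.directory}/cobertura.ser">
            <fileset dir="${project.build.directory}">
              <include name="*.ear"/>
            </fileset>
          </cobertura-instrument>
        </target>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <dependency>
      <groupId>net.sourceforge.cobertura</groupId>
      <artifactId>cobertura</artifactId>
      <version>1.9.4.1</version>
    </dependency>
  </dependencies>
</plugin>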
TBH, what you are trying to do is really not easy in terms of build lifecycle, packaging, reporting, etc., and it is going to be a tough task because of the lack of support from the Cobertura plugin. I'd really think twice about it (time to invest to get the whole thing working vs. the value generated) or consider spending that time (read: money) on a Clover license (which offers better support for this).
I implemented a solution now based on your previous answers around Maven 2. It is still not very easy to use, but so far it works OK. The implementation is as follows:
1. Modified my parent pom to generate Cobertura-instrumented classes when a specific profile is active. This generates the .ser files and the instrumented classes.
2. The instrumented classes are copied to the /target/classes folder using the maven-resources-plugin, so the actual packaging uses the instrumented classes.
3. As there is no module-wide .ser file, I manually combine the .ser files from the EJB, WAR, JAR and EAR modules using the command-line solution provided by Cobertura: cobertura-merge.bat/.sh (see the sketch after this list).
4. Deploy the .ser file into my JBoss container and also deploy the instrumented EAR.
5. After testing I run a report on the merged .ser file and voilà, it seems to work.
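For step 3, the same merge can also be done with Cobertura's cobertura-merge Ant task instead of the .bat/.sh wrapper; a minimal sketch, assuming each module writes its .ser file under target/cobertura (the paths are my assumption):

<!-- requires the Cobertura task definitions, e.g. -->
<!-- <taskdef resource="tasks.properties" classpath="path/to/cobertura.jar and its deps"/> -->
<cobertura-merge datafile="${basedir}/target/cobertura-merged.ser">
  <fileset dir="${basedir}">
    <include name="*/target/cobertura/cobertura.ser"/>
  </fileset>
</cobertura-merge>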
I surely will look into Clover, as the solution implemented is somewhat manual.
It looks like there is no task in the cobertura-maven-plugin for merging .ser files from individual projects into a single report.
A Google search turned up this feature request and a patch for the plugin to add a merge task, but it doesn't look like it was accepted. One of the comments suggests using the dashboard plugin to accomplish the same thing; you might have some success with that.
Related
We have a multi-module maven project. One of the modules has a bunch of .proto files, which we compile to java files. Pretty much every other module depends on this module. Most of them use Protobuf 2.4, but one needs to use 2.5.
Is there any nice way to do this? (The not nice way is to edit the pom file to say "2.5", build a jar, manually copy that jar to wherever we need it, and then change the pom file back to 2.4.)
I've never used protobuf, but as I understand it, it's a plugin that generates code.
So I'm going to give you some generic pointers, hoping they will help.
I think you should either try to build two jars with different classifiers from a single module, see https://maven.apache.org/plugins/maven-jar-plugin/examples/attached-jar.html
For example, classifiers proto2.4 and proto2.5.
Then you can add the classifier when you declare the dependency on that module, as sketched below.
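A rough sketch of that first option (all coordinates and directory names below are made up for illustration). In the proto module, attach a second jar with a classifier:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <executions>
    <execution>
      <id>proto25-jar</id>
      <phase>package</phase>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <!-- classes compiled against protobuf 2.5, written to a separate output directory (hypothetical path) -->
        <classesDirectory>${project.build.directory}/classes-proto2.5</classesDirectory>
        <classifier>proto2.5</classifier>
      </configuration>
    </execution>
  </executions>
</plugin>

and in the project that needs 2.5, depend on it with the classifier:

<dependency>
  <groupId>com.example</groupId>
  <artifactId>proto-module</artifactId>
  <version>1.0.0</version>
  <classifier>proto2.5</classifier>
</dependency>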
The other option I see is having two modules: the real one you have now, and another one for 2.5.
Generate a zip from the main one; the second module would be empty but would depend on the generated zip, unzip it and then compile with the plugin configured for 2.5.
Slower at execution and a bit dirtier IMHO, but it can be needed if, for example, you need more customization than just the version.
I finally got things to work more or less how I wanted. I created a new module with just a pom file; that pom refers to the .proto files of the original module, but compiles them with the proto 2.5 compiler and puts the result into its own directory.
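The answers above used David Trott's maven-protoc plugin; purely as an illustration of the idea, here is roughly what such a pom-only module could look like using the xolstice protobuf-maven-plugin instead (a different plugin than the one mentioned; the relative path and protoc location are assumptions):

<build>
  <plugins>
    <plugin>
      <groupId>org.xolstice.maven.plugins</groupId>
      <artifactId>protobuf-maven-plugin</artifactId>
      <version>0.6.1</version>
      <configuration>
        <!-- reuse the .proto files of the sibling module (hypothetical relative path) -->
        <protoSourceRoot>${basedir}/../proto-module/src/main/proto</protoSourceRoot>
        <!-- point this module at a locally installed protoc 2.5 binary (hypothetical path) -->
        <protocExecutable>/usr/local/bin/protoc-2.5.0</protocExecutable>
      </configuration>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>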
In my new project I am confronted with a complex infrastructure with several modules which have grown over the years in an unpleasant, uncontrolled way.
To come to the point: the build process is a horror. There are over 40 different, complex Ant files, which are interconnected in multiple ways, and the SOA framework also generates several dynamic Ant files. It took a few days to really understand all the dependencies and finally build the whole project without any errors.
My plan was or is to migrate the whole project from Ant to Maven, since new components are planned and I would like to avoid these problems in the future and well, because it is just horrible the way it is now ;-)
Since I am new to migrating bigger projects, I am a little confused about the best workflow. There are dozens of XML files and scripts involved, distributed in a non-Maven directory structure; overall there are over 3000 files. One of the main problems is that I don't know whether I should really try to migrate everything into the standard Maven directory structure and thereby risk endless editing and refactoring of every single file, or keep the folder structure as it is, bloat my pom.xml files and possibly run into problems with all the different plugins involved. Honestly, neither way sounds very constructive.
Does it even make sense to migrate a project in this dimension to Maven? Especially when the SOA framework must use its own Ant files - therefore a combination of Ant and Maven would be necessary. What would be the best strategy to simplify this process?
Thanks for all suggestions.
Here's a simple and quick answer to Mavenizing an Ant project:
DON'T DO IT!
This is not some anti-Maven screed. I use Maven, and I like Maven. It forces developers not to do stupid things. Developers are terrible at writing build scripts. They want to do things their own way and not the way everyone else does. Maven makes developers set up their projects in a way that everyone can understand.
The problem is that Ant allows developers to do wild and crazy things that you have to completely redo in Maven. It's more than just the directory structure. Ant allows for multiple build artifacts. Maven only allows for one per pom.xml.[1] What if your Ant project produces a half dozen different jar files -- and those jar files contain many of the same classes? You'll have to create a half dozen Maven projects just for the jars, and then another half dozen for the files that are in common between the jars.
I know because I did exactly this. The head of System Architecture decided that Maven is new and good while Ant must be bad and Evil. It didn't matter that the builds worked and were well structured. No, Ant must go, and Maven is the way.
The developers didn't want to do this, so it fell to me, the CM. I spent six months rewriting everything into Maven. We had WSDL, we had Hibernate, we had various frameworks, and somehow I had to restructure everything to get it to work in Maven. I had to spawn new projects. I had to move directories around. I had to figure out new ways of doing things, all without stopping the developers from doing massive amounts of development.
This was the inner most circle of Hell.
One of the reasons why your Ant projects are so complex probably has to do with dependency management. If you are like our current shop, some developer decided to hack together their own system of dependency management. After seeing this dependency management system, I now know two things developers should never write: their own build files, and dependency management systems.
Fortunately, there is an already existing dependency management system for Ant called Ivy. The nice thing about Ivy is that it works with the current Maven architecture. You can use your site's centralized Maven repository, and Ivy can deploy jars to that repository as Maven artifacts.
I created an Ivy project that automatically setup everything for the developers. It contained the necessary setup and configuration, and a few macros that could replace a few standard Ant tasks. I used svn:externals to attach this Ivy project to the main project.
Adding the project into the current build system wasn't too difficult:
I had to add in a few lines in the build.xml to integrate our ivy.dir project into the current project.
I had to define an ivy.xml file for that project.
I changed any instance of <jar and </jar> to <jar.macro and </jar.macro>. This macro did everything the standard <jar/> task did, but it also embedded the pom.xml in the jar just like Maven builds do (Ivy has a task for converting the ivy.xml file into a pom.xml). A sketch of such a macro appears after this list.
I ripped out all the old dependency management crap that the other developer added. This could reduce a build.xml file by a hundred lines. I also ripped out all the stuff that did checkouts and commits, or ftp'd or scp'd stuff over. All of that was for their Jenkins build system, but Jenkins can handle it without any help from the build files, thank you.
I added a few lines to integrate Ivy. The easiest way was to delete the jars in the lib directory and then just download them via ivy.xml. All together, it might take a dozen lines added or changed in the build.xml to do this.
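For illustration, a stripped-down version of such a jar.macro could look something like this (simplified: a real Maven-built jar puts the pom under META-INF/maven/<groupId>/<artifactId>/, which this sketch glosses over; the property names are assumptions):

<!-- assumes the Ivy antlib is loaded, e.g. xmlns:ivy="antlib:org.apache.ivy.ant" on <project> -->
<macrodef name="jar.macro">
  <attribute name="destfile"/>
  <element name="jar-elements" implicit="true"/>
  <sequential>
    <!-- convert ivy.xml into a pom.xml so the jar can be consumed like a Maven artifact -->
    <ivy:makepom ivyfile="${basedir}/ivy.xml" pomfile="${build.dir}/pom.xml"/>
    <jar destfile="@{destfile}">
      <!-- embed the generated pom under META-INF, similar to what Maven-built jars contain -->
      <metainf dir="${build.dir}" includes="pom.xml"/>
      <jar-elements/>
    </jar>
  </sequential>
</macrodef>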
I got to the point where I could integrate Ivy into a project in a few hours -- if the build process itself wasn't too messed up. If I had to rewrite the build.xml from scratch, it might take me two or three days.
Using Ivy cleaned up our Ant build process and allowed us many of the advantages we would have in Maven without having to take a complete restructuring.
By the way, the most helpful tool for this process is Beyond Compare. This allowed me to quickly verify that the new build process was compatible with the old.
Moving onto Maven Anyway...
The funny thing is that once you have integrated your Ant projects with Ivy, turning them into Maven projects isn't that difficult:
Clean up the logic in your build.xml. You might have to rewrite it from scratch, but without most of the dependency management garbage, it's not all that difficult.
Once the build.xml is cleaned up, start moving directories around until they match Maven's structure.
Change the source to match the new directory structure. You may have a WAR that contains *.css files in a non-standard location, with code hardwired to expect those files in that directory. You may have to change your Java code to match the new directory structure.
Break up Ant projects that build multiple artifacts into separate projects that each build a single artifact.
Add a pom.xml and delete the build.xml.
[1] Yes, I know this isn't entirely true. There are Maven projects with sub-projects and super POMs. But you will never have a Maven project that builds four different unrelated jars, while this is quite common in Ant.
I have done a similar migration in the past, and I had the same doubts you had; however, I went for the "keep the folder structure intact and specify the paths in the POM files" way and I noticed it wasn't as bad as I thought.
What I actually had to do was to appropriately set the <sourceDirectory> and the <outputDirectory> and maybe add some inclusion and exclusion filters. In the end I'd say that even if Maven's way is really convention-over-configuration-ish and makes your life easier if you follow its directives on where to place files, it doesn't really make it much harder if you don't.
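To give an idea, the relevant part of such a POM looked roughly like this (the directory names below are placeholders for whatever the legacy layout uses):

<build>
  <!-- point Maven at the existing, non-standard layout instead of moving files -->
  <sourceDirectory>${basedir}/src</sourceDirectory>
  <testSourceDirectory>${basedir}/test</testSourceDirectory>
  <outputDirectory>${basedir}/build/classes</outputDirectory>
  <resources>
    <resource>
      <directory>${basedir}/config</directory>
      <includes>
        <include>**/*.xml</include>
        <include>**/*.properties</include>
      </includes>
    </resource>
  </resources>
</build>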
Besides, something that really helped me when migrating was the possibility to divide the Maven project in modules, which I initially used to replicate the Ant structure (i.e. I had one Maven module for each build.xml file) making the first stage of the migration simpler, and then I changed the module aggregation to make it more meaningful and more Maven-like.
Not sure if this actually makes any sense to you, since I didn't have any generated Ant files, which I reckon may be the biggest issue for you, but I would definitely follow this road again instead of refactoring and moving files everywhere to Mavenize my project structure.
I currently manage a few separate Maven projects in which I use Protobufs as a serialization format and over the wire. I am using David Trott's maven-protoc plugin to generate the code at compile time.
All is good and well until I want those project to communicate between one another — or rather, use each other's protobufs. The protobuf language has an "import" directive which does what I want but I'm faced with the challenge of having project A exporting a ".proto" file (or possibly some intermediate format?) for project B to depend upon.
Maven provides a way for a project to bundle resources but AFAIK, these are meant to be used at runtime by the code and not by a goal during the compile / source generation phase — at least I haven't been able to find documentation that describes what I want to achieve.
I've found another way to achieve this, and it doesn't involve any Maven magic. Diving into the code of the maven-protoc plugin, I found that this is a supported use case -- the plugin will look for and collect any .proto files in dependent jars and unpack them into a temporary directory. That directory is then set as an import path for the protoc invocation.
All that needs to happen is for the .proto file to be included in the dependency's package, which I did by making it a resource:
projects/a/src/main/resources/a.proto
Now in projects/b/pom.xml, add 'a' as a regular Maven dependency and just import a.proto from b.proto as if it existed locally:
b.proto:
import "a.proto";
This isn't ideal, since file names may clash between various projects, but that should occur rarely enough.
You can package your .proto files in a separate .jar/.zip in the project where they are generated, and publish them in your repository using a dedicated classifier. Using the assembly plugin might help here to publish something close to "source jars" that are built during releases.
Then, on projects using them, add previously created artifact as dependency.
Use the dependency plugin with the "unpack-dependencies" goal, and bind it to a phase before "compile" (a sketch follows).
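A sketch of the consuming side, assuming the .proto files were published under a "proto" classifier (the classifier name and output directory are made up):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-proto-files</id>
      <!-- run before compile so code generation can see the unpacked .proto files -->
      <phase>generate-sources</phase>
      <goals>
        <goal>unpack-dependencies</goal>
      </goals>
      <configuration>
        <!-- only unpack the artifacts published with the "proto" classifier -->
        <includeClassifiers>proto</includeClassifiers>
        <includes>**/*.proto</includes>
        <outputDirectory>${project.build.directory}/proto-imports</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>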
I have 3 Java projects with the same entities.
I want to share entities between these projects because entities can evolve during the development phase.
We are thinking about building a jar with entities and sharing it using Maven (with a repository).
Maybe you have another solution ?
I can also recommend using Maven to share code between projects.
Here are some tips to get started:
- Use a Maven Repository Manager such as Nexus. It will help you to create a stable development environment.
- Every developer (and also the Continuous Integration Server user) should configure their settings file to use your Maven Repository Manager. Don't specify your repositories in the POMs, configure them only in your Maven Repository Manager: http://www.sonatype.com/books/nexus-book/reference/maven-sect-single-group.html
- Use the dependencyManagement and pluginManagement elements of your parent POMs to specify all versions of the plugins and dependencies you are using. Omit these versions in the other POMs; they will inherit them from the parent POM (see the sketch below).
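For instance (the coordinates are made up), the parent POM would pin the version of the shared entities once:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>shared-entities</artifactId>
      <version>1.2.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>

and the modules that need the entities declare the dependency without a version:

<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>shared-entities</artifactId>
    <!-- version inherited from the parent's dependencyManagement -->
  </dependency>
</dependencies>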
I also recommend using separate POMs for multi-module (aggregator) builds and parent POMs.
If you want to share common interfaces, classes, functionality or components, Maven is the way to go. In addition to the dependency management, you also get the added bonus of a standard project layout that will simplify things. Easy integration with most common continuous integration servers and a standard release process are further benefits.
Definitely take a look at Maven!
Making your own JAR library is definitely a good solution:
- The jar file is easy to distribute via dependency management (Maven, Ivy, Gradle, ...)
- The jar is versioned
- The projects using the library can be tested against a certain version. Otherwise you may run into problems if you change entities and forget to change a depending project -> integration tests
Regards
Entities are the representation of a given object, am I correct? If so, the default mechanism implemented by Java is object serialization - http://en.wikipedia.org/wiki/Serialization. In the case of jar files, if an entity changes you would have to rebuild the jar each time as well. It may be tedious.
Generate a standard war file in Roo, but then change its packaging to jar.
Then from any standard war file you can just deploy this jar (I'll use the jar as a Maven dependency). I'll maintain uniquely named application contexts, like pizzaShop-applicationContext.xml and pizzaShop-applicationContext-jpa.xml, so from a parent Spring project I can stack up various Roo projects in this fashion.
I'll also keep their generated webapps folder so the generator keeps working (this means I have to open up the pom.xml and keep changing it back to jar). It also helps as cut-and-paste fodder for web.xml entry additions in non-Roo-generated war files.
This seems like it may be a confusing point about Roo: you can just mix and match these jars as you would in any Spring project. They function like self-contained units of Spring and work fine sitting side by side with other Spring jars, all under the same webapp/web.xml context (see the sketch below).
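As an illustration, the parent project's context can then pull in those uniquely named files from the jars on its classpath (a sketch; the exact classpath location of the files inside each jar is an assumption):

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">

  <!-- import the uniquely named contexts packaged inside the Roo-built jars -->
  <import resource="classpath:pizzaShop-applicationContext.xml"/>
  <import resource="classpath:pizzaShop-applicationContext-jpa.xml"/>

</beans>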
It's tedious, but still better than writing Spring code by hand.
I am a newbie to J2EE and I am not able to understand the directory structure created when building a Java web project. After a bit of googling I understood what we store in WEB-INF, but:
1) I am not able to understand what we store in META-INF.
2) How does the target folder get created?
3) Where do we specify which files should be placed in the target folder?
I am using Maven to build the project which is a spring-hibernate based project.
thanks in advance
1) What's the purpose of META-INF?
2) Maven creates the target folder for you. It's where all of the Maven plugins dump their work by default.
3) Maven has mechanisms for excluding files from it.
The key to understanding Maven is that Maven works on conventions. That means that Maven will do a lot of things really well with almost no effort on your part if you structure your project according to Maven's expectations. For example, this is how you differentiate between Java classes and resources in the source directory:
src/main/java/com/mycompany/MyObj.java
src/main/resources/my/company/spring.context.xml
src/test/java/com/mycompany/MyObjTest.java
src/test/resources/my/company/spring.context.xml
When you run mvn test it will compile all of that, move it appropriately over to the target folder, load the JUnit runner and give you a classpath that will let Spring have easy access to the spring context under the test folder. It'll run the tests and drop the reports under target.
Maven is not like Ant. In Ant, you have to tell it everything. Maven works on the opposite end in that it assumes everything by default until you tell it otherwise.
This is a common problem because Java has grown so big. It's often hard to tell where one technology ends and another begins. You need to familiarize yourself with the documentation for all the various components you are using.
For instance, if you have a 'target' folder then I assume you are using Maven. Maven is a Java build and dependency management tool. When you 'mavenize' a project, you agree to adhere to a bunch of standards, and Maven in turn does a lot of the grunt work for you (compiling code, finding dependent libraries, and running tests). Part of what Maven does is create standard Maven directories, in this case 'target'.
more maven info - http://maven.apache.org/
As for META-INF, this is part of the Java EE spec. It does have a purpose concerning packaging and deployment, but you'll generally not find yourself using it very often. It's generally the same principle as Maven: you adhere to the Java EE standard and the Java EE-compliant tools do most of the work for you.
For more info look at this link - http://java.sun.com/blueprints/guidelines/designing_enterprise_applications/packaging_deployment/index.html
In general to understand these you should check out some tutorials on Java EE and refer to your container's examples and documentation.
1) What is the purpose of the META-INF folder?
2-3) The target folder is created by Maven, which manages all dependencies, etc.: one, two