We have a multi-module POM, which also serves as a parent POM for all sub-modules involved. Call it MultiModulePOM. We have about 70 modules, say numbered Module1 to Module70.
Now: the first 30 of these modules require a set of JAR files at compile-time only. That is, scope=provided. Since we're talking about a set of JAR files, it is quite tedious to keep those 30 modules in sync, and in general I am not a huge fan of copying definitions around.
So, I fell into the pitfall of dependency grouping. It seemed like a good idea; however, it doesn't work for provided dependencies. In other words: if I group the dependent JARs in a module called ExtDependencies, and make Module1 depend on ExtDependencies, the JARs referred to by ExtDependencies won't be transitively added to Module1, because their scope is provided.
(If the last paragraph is not true, please let me know as it could really get me out of a jam)
The only other option that I could see was to create a parent POM called (for example) IntermediaryPOM. IntermediaryPOM extends MultiModulePOM and declares the set of dependent JAR files with scope=provided. Modules Module1-Module30 then extend IntermediaryPOM.
That seemed to do the trick but I have three problems with it:
It adds another layer of POM that I'm not sure is really needed.
Later, at distribution time, I find myself having to install/deploy the intermediary POMs as well.
Consider the general case: the intermediary POM may have other siblings used for other sets of JARs (for modules 31-50). Therefore, this solution doesn't seem to scale well.
So my question is: in your experience, what is the best way to approach this? Any known best practices for such a use case?
I'm afraid there is no easy solution here.
You're right in saying that if you declare common dependencies in ExtDependencies as provided, they won't be added to the classpath of any other module that depends on ExtDependencies. That's how provided works.
But you could declare these common dependencies without a scope (i.e. with the default compile scope) and add a provided dependency on ExtDependencies. In this case all of ExtDependencies' dependencies will be added to the classpath, because Maven resolves a compile dependency of a provided dependency as provided. God, that's a lot of "dependencies" :)
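A minimal sketch of that arrangement (the coordinates and the servlet-api JAR are illustrative, not from the question):

<!-- ExtDependencies/pom.xml: the shared JARs, declared with the default compile scope -->
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>ExtDependencies</artifactId>
    <version>1.0</version>
    <packaging>pom</packaging>
    <dependencies>
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
            <version>2.5</version>
        </dependency>
        <!-- ...the rest of the shared set... -->
    </dependencies>
</project>

<!-- Module1/pom.xml: a single provided dependency pulls the whole set in as provided -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>ExtDependencies</artifactId>
    <version>1.0</version>
    <type>pom</type>
    <scope>provided</scope>
</dependency>

Note that because the grouping module uses pom packaging, the dependency on it needs type pom.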
You've also mentioned another possible option -- introducing another level of abstraction (which, as you might know, is a way to solve almost any problem). But such a multi-level hierarchy is less elegant and more difficult to maintain (I have it in our projects, so I've been there).
In general, I haven't come across this problem at such a scale, but if I were to solve it I'd go with the first option, taking into account the scoping suggestion above.
Related
How can a Maven/Gradle package change what it exports depending on scope? Is it possible?
For example, using it like this:
<dependency>
    <groupId>org.blahblah</groupId>
    <artifactId>anything</artifactId>
    <version>5.8</version>
    <scope>test</scope>
</dependency>
To get different binaries when it is used this other way:
[...]
<scope>compile</scope>
[...]
Yes and no - sort of.
The general guideline is to create one artifact per pom.xml - most tools work quite nicely with this concept. As soon as you go beyond it, funky stuff sometimes happens. Changing the jar based only on scope isn't possible, as far as I know. It would also confuse people a lot, and it would probably make troubleshooting very difficult.
But there is a workaround. As you mentioned tests: the jar plugin allows you to export the classes in src/test/java as a test-jar and use that as a dependency by specifying a type.
See How to create a test-jar.
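A sketch of the two sides of that (the consumer coordinates reuse the ones from the question):

<!-- in the producer's pom.xml: also package src/test/java as a test-jar -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>test-jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>

<!-- in the consumer's pom.xml: depend on it via the type element -->
<dependency>
    <groupId>org.blahblah</groupId>
    <artifactId>anything</artifactId>
    <version>5.8</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>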
I assume the same mechanism with the type can be used for other things as well.
There is also the concept of classifiers (this is usually used for sources, javadoc and things like that). See this question.
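For example, pulling in the sources artifact of the dependency from the question via a classifier:

<dependency>
    <groupId>org.blahblah</groupId>
    <artifactId>anything</artifactId>
    <version>5.8</version>
    <classifier>sources</classifier>
</dependency>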
While these things tend to work with maven on the command line, IDEs sometimes start to behave a bit weird if you push type and classifier usage too far.
I am working on a Maven-built Java project which consists of several modules. Like many before me, I face the issue of having classes that are used in multiple modules. I wish to find an elegant solution to the problem of sharing classes across all modules.
I am aware that this can be accomplished by creating another module, called for example common, where I would put all shared classes. This module can then be compiled into a separate jar and used as a dependency in other modules.
However, I do not find this solution elegant enough and am looking for a more direct way of sharing. Essentially, I would like to keep those classes in a separate module common, but this module would not be compiled into a separate jar; instead, its classes would be included directly in the compilation/packaging of every depending module.
Is this possible to achieve using maven?
UPD: To give an example of why I do not find the way mentioned above acceptable: when writing code and realizing midway that some changes should be done to the common classes, all IDEs would require running the install goal on the common module after those changes in order to have it as a compiled jar on the classpath (so that the changes become visible in the other modules). This is just one example of why I find this way inconvenient and am looking for a more elegant solution.
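One sketch of this direct-inclusion idea (my own illustration, not part of the original question; the plugin version and the relative path are assumptions) is to let each depending module compile the common sources itself via the build-helper-maven-plugin:

<!-- in each depending module's pom.xml: compile ../common's sources directly -->
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <version>3.4.0</version>
    <executions>
        <execution>
            <id>add-common-sources</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <!-- assumed layout: the common module sits next to this one -->
                    <source>${project.basedir}/../common/src/main/java</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>

The trade-off is that the shared classes are compiled (and shipped) once per consumer, which is exactly what the separate-jar approach avoids.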
I'm currently in the process of removing the Spring dependency from Flyway. In the future, though, other types of dependencies might be needed to support a subset of users (such as JBoss VFS support).
Which is the best way to support optional dependencies (optional=true in Maven POM)?
Qualities of the solution would be:
Ease of use for end-users (minimum work required to use functionality if dependency is present)
Ease of use for developers (code dealing with optional dependency should be as readable and as straightforward as possible)
No unnecessary required dependencies (if some end-users don't need this functionality, there is no need to pull in the dependency)
I think Maven's optional dependency functionality is quite limited.
http://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html
Optional dependencies will not be pulled down (as transitive dependencies) by default. However, if your users need these optional features, the missing dependencies must be explicitly declared in their POM.
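Both halves of that, as a sketch (the Spring coordinates are illustrative):

<!-- in the library's pom.xml: available at build time, invisible to consumers -->
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-jdbc</artifactId>
    <version>3.0.5.RELEASE</version>
    <optional>true</optional>
</dependency>

<!-- in the POM of any user who wants the Spring-based functionality: -->
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-jdbc</artifactId>
    <version>3.0.5.RELEASE</version>
</dependency>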
Personally, it's not clear to me how this is helpful to users... I suppose the optional dependencies in your POM do document which versions your code was built against. Not all users will read the POM, however; all they'll see is the NoClassDefFoundError :-(
My final observation is that this is one of those rare scenarios where a dependency manager like Ivy offers more flexibility. Ivy has a concept called "configurations": module authors can assemble different combinations of dependencies, for example "with-spring" or "without-spring".
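Roughly, an ivy.xml along these lines (the module names and Spring coordinates are illustrative):

<ivy-module version="2.0">
    <info organisation="com.example" module="mylib"/>
    <configurations>
        <conf name="core" description="no optional integrations"/>
        <conf name="with-spring" extends="core" description="core plus Spring support"/>
    </configurations>
    <dependencies>
        <!-- only pulled in by consumers that ask for the with-spring configuration -->
        <dependency org="org.springframework" name="spring-jdbc" rev="3.0.5.RELEASE"
                    conf="with-spring->default"/>
    </dependencies>
</ivy-module>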
There's a choice of:
keep the project in a single module and use optional dependencies; or
split the project into multiple modules, where each module has a (non-optional) dependency on the libraries it needs.
I think the first makes more sense in most cases: users need to figure out their way around fewer artifacts. Typically, they'll have to add fewer new dependencies to their pom. Unless the code to support third-party projects is large, this will help improve maven download times too (fewer round-trips). With the latter approach, you can find yourself in awkward situations where the user has defined their own set of versions, but only for some of the third-party dependencies.
I prefer to see the optional dependencies in the pom (I sometimes look to see which version it's built against). It's true that some people might not look. I think copy-and-pasteable pom snippets on the website are the best solution for that. For example, if you have a page about Spring integration, you could put the relevant pom snippet on that page.
I'd suggest that non-free dependencies (or anything not easily resolvable) be kept in a separate maven module, so that contributors are always able to build the primary artifact. (I had that problem with Quartz, which IIRC has an optional dependency on an Oracle JDBC jar).
Edit: If you're worried about users seeing NoClassDefFoundErrors, it wouldn't do any harm to check that the class can be resolved before trying to use it. For example, you could catch the resulting exception and throw a more meaningful error message pointing the user to documentation. SLF4J is a good example of this.
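A minimal sketch of that check (the class name, probe class and message are hypothetical, not Flyway's actual code):

// Hypothetical helper: fail fast with a helpful message when an optional
// integration's classes are missing from the classpath.
public final class SpringSupport {

    private SpringSupport() {}

    public static boolean isSpringPresent() {
        try {
            // Any class that only exists when the optional dependency is present.
            Class.forName("org.springframework.jdbc.core.JdbcTemplate");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void requireSpring() {
        if (!isSpringPresent()) {
            throw new IllegalStateException(
                "Spring integration requested, but spring-jdbc is not on the classpath. "
                + "Add the spring-jdbc dependency to your POM (see the documentation).");
        }
    }
}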
Do you follow any design guidelines for Java packaging?
Is proper packaging part of the design skill? Are there any documents about it?
Edit: How should packages depend on each other? Are cyclic packages unavoidable? This is not about jar or war files.
My approach that I try to follow normally looks like this:
Have packages of reasonable size. Fewer than 3 classes is strange. Fewer than 10 is good. More than 30 is not acceptable. I'm normally not very strict about this.
Don't have dependency cycles between packages. This one is tough, since many developers have a hard time figuring out any way to keep the dependencies cycle-free. BUT doing so teases out a lot of hidden structure in the code. It becomes easier to think about the structure of the code and easier to evolve it.
Define layers and modules and how they are represented in the code. Often I end up with something like <domain>.<application>.<module>.<layer>.<arbitrary substructure as needed> as the template for package names.
No cycles between layers; no cycles between modules.
In order to avoid cycles, one has to have checks. Many tools do that (JDepend, Sonar, ...). Unfortunately they don't help much with finding ways to fix cycles. That's why I started to work on Degraph, which should help with that by visualizing dependencies between classes, packages, modules and layers.
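JDepend, for example, can enforce the no-cycles rule from a plain test; a sketch (the classes directory path is an assumption about your build layout):

import java.io.IOException;
import jdepend.framework.JDepend;

public class PackageCycleTest {

    public void testNoPackageCycles() throws IOException {
        JDepend jdepend = new JDepend();
        // point it at the compiled classes of the module under test
        jdepend.addDirectory("target/classes");
        jdepend.analyze();
        if (jdepend.containsCycles()) {
            throw new AssertionError("Package dependency cycles detected");
        }
    }
}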
Packaging is normally about release management, and the general guidelines are:
consistency: when you release several deliveries into integration, pre-production or production environments, you want them organized (or "packaged") exactly the same way
small number of files: when you have to copy a set of files from one environment to another, you want to be able to copy them all at once; if their number is reasonable (10-20 max per component to deliver), you can just copy them (even if those files are large)
So you want to define a common structure for each delivery like:
aDelivery/
lib // all jar, ear, war, ...
bin // all scripts used to launch your application: sh, bat, ant files, ...
config // all properties files, config files
src // all sources zipped into jars
docs // javadoc zipped
...
Plus, all those common directory structures should be stored in one common repository (a VCS, or a Maven repo, or...), so they can be fetched without having to be rebuilt every time you need them (you do not need that if you have only one or two delivery components, but when you have 40 to 60 of them... a full rebuild is out of the question).
You can find a lot of information here:
What strategy do you use for package naming in Java projects and why?
The problem with packaging in Java is that it has very little relation to what you would like to do. For example, I like following the Eclipse convention of having packages marked internal, but then I can't define their classes with a "package" protection level.
I have a large application (~50 modules) using a structure similar to the following:
Application
    Communication modules
        Color communication module
        SSN communication module
        etc. communication module
    Router module
    Service modules
        Voting service module
            Web interface submodule for voting
            Vote collector submodule for voting
            etc. for voting
        Quiz service module
    etc. module
I would like to import the application into Maven and Subversion. After some research, I found that two practical approaches exist for this.
One is using a tree structure just like the one above. The drawback of this structure is that you need a ton of tweaking/hacks to get multi-module reporting working well with Maven. Another downside is that in Subversion the standard trunk/tags/branches approach adds even more complexity to the repository.
The other approach uses a flat structure, where there is only one parent project and all the modules, submodules and parts-of-the-submodules are direct children of the parent project. This approach works well for reporting and is easier in Subversion; however, I feel I lose a bit of the structure this way.
Which way would you choose in the long term and why?
We have a largish application (160+ OSGi bundles, where each bundle is a Maven module) and the lesson we learned, and continue to learn, is that flat is better. The problem with encoding semantics in your hierarchy is that you lose flexibility. A module that is 100% "communication" today may be partly "service" tomorrow, and then you'll need to move things around in your repository, which will break all sorts of scripts, documentation, references, etc.
So I would recommend a flat structure and to encode the semantics in another place (say for example an IDE workspace or documentation).
I've answered a question about version control layout in some detail with examples at another question, it may be relevant to your situation.
I think you're better off flattening your directory structure. Perhaps you want to come up with a naming convention for the directories such that they sort nicely when viewing all of the projects, but ultimately I don't think all of that extra hierarchy is necessary.
Assuming you're using Eclipse as your IDE, all of the projects are going to end up in a flat list once you import them anyway, so you don't really gain anything from the additional subdirectories. That, in addition to the fact that the configuration is so much simpler without all the extra hierarchy, makes the choice pretty clear in my mind.
You might also want to consider combining some of the modules. I know nothing about your app or domain, but it seems like a lot of those leaf level modules might be better suited as just packages or sets of packages inside another top level module. I'm all for keeping jars cohesive, but it can be taken too far sometimes.