I want to update this package's version, but I can't find the package in the pom.xml under my project's root directory.
How can I update its version? Do I need to edit the pom file inside the Maven package directly?
This is my dependency tree, and I want to upgrade to 1.31
If you don’t use it directly, then it is coming from one of your dependencies. You can check which one using
mvn dependency:tree
With IntelliJ IDEA, you can also open the Maven view, then right-click the project and select “Analyze Dependencies…” to get the same information.
Ideally, you should keep it as a transitive dependency; otherwise you will have to take care of upgrading it every time you upgrade the library that actually depends on it. Moreover, issues can arise if you upgrade only the transitive dependency and not the intermediate one (e.g. with Spring).
The best solution would thus be to upgrade that intermediate dependency, assuming a new version of it has been released (SnakeYAML 1.29 being affected by CVE-2022-25857, chances are good).
Only if you can't do that should you add the dependency to the <dependencyManagement> section of your pom.xml, but don't forget to maintain it from now on (and remove it once the intermediate dependency is upgraded).
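As a sketch, assuming SnakeYAML's usual coordinates (org.yaml:snakeyaml), pinning the transitive version in the root pom.xml could look like this:

```xml
<dependencyManagement>
    <dependencies>
        <!-- Pin the transitive SnakeYAML version until the intermediate
             dependency ships with a fixed one; remove this entry then -->
        <dependency>
            <groupId>org.yaml</groupId>
            <artifactId>snakeyaml</artifactId>
            <version>1.31</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

Once the library that actually depends on SnakeYAML is upgraded, this entry should be removed again.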
If you can't find it in your pom, then it's a transitive dependency pulled in by one of your other dependencies. You can redefine it as a direct dependency in your pom, and that will override the version to whatever you like.
I need to check the Maven dependency injection order automatically. In a Maven module I have two dependencies with the same class and package names. One dependency should always have priority over the other one; this means it has to be declared first, because of the order of dependency injection in Maven.
It's possible to verify this with the dependency tree, but only manually; I wish there were a way to check it automatically.
Do you know if it is possible to do this with Maven?
First of all, "dependency injection" is something completely different and unrelated: managed, somewhat magical mechanisms for passing needed objects in from "outside" when constructing objects.
Maven just compiles code with the dependency jars on the classpath. Any variant of the monkeypatched classes should allow your code to compile identically, so (to minimize breakage) you should avoid duplicate or conflicting Maven dependencies.
Reliably loading your replacement classes at runtime, when Maven is no longer involved, is an entirely different problem, whose solution depends on how your application is packaged and executed. For example, the order of jars in a typical classpath list should be reliable, and most application servers offer ways to specify various global and application-specific classpaths and to override one with another.
I'm struggling with how to approach jar dependency hell. I have a Maven/IntelliJ Scala project that uses some AWS SDKs. Recently, adding the Kinesis SDK introduced incompatible versions of Jackson.
My question is: how do I systematically approach the problem of jar hell?
I understand class loaders and how Maven chooses between duplicate jars, but I am still at a loss regarding actual practical steps to fix the issue.
My attempts at the moment are based on trial and error, and I am outlining them here with the Jackson example:
First, I see what the actual exception is, in this case a NoSuchMethodError on Jackson databind's ObjectMapper class. I then look at the Jackson docs to see when the method was added or removed. This is usually quite tedious, as I manually check the API docs for each version (question 1: is there a better way?).
Then, I use mvn dependency:tree to figure out which version of Jackson I am actually using (question 2: is there an automatic way of asking Maven which version of a jar is in use, rather than combing through the tree output?).
Next, I compare the mvn dependency:tree output before and after adding the Kinesis SDK, to detect differences and hopefully see whether the Jackson version changed (question 3: How does Maven use the libraries in shaded jars when dependency resolution occurs? Same as any other?).
Finally, after comparing the tree outputs, I try to add the latest working version of Jackson explicitly in the POM, to take precedence in Maven's dependency resolution chain. If the latest does not work, I try the next most recent version, and so forth.
This entire procedure is incredibly tedious. Besides the specific questions I asked, I am also curious about other people's systematic approaches to this problem. Does anyone have any resources that they use?
I then look at the Jackson docs to see when the method was added or removed. This is usually quite tedious, as I manually check the api docs for each version (question 1: is there a better way?)
To check API (breaking) compatibility, there are several tools that will automatically analyze jars and give you the right information. This Stack Overflow post has nice hints for some handy tools.
JAPICC seems quite good.
Then, I use mvn dependency:tree to figure out which version of the Jackson I am actually using (question 2: is there an automatic way of asking maven which version of a jar is in use, rather than combing through the tree output?)
The dependency:tree goal is definitely the way to go, but you can filter its output from the start, limiting it to only what you are actually looking for, using its includes option as follows:
mvn dependency:tree -Dincludes=<groupId>
Note: you can also provide further info to the includes option in the form groupId:artifactId:type:version, or use wildcards like *:artifactId.
It seems a small hint, but in large projects with many dependencies, narrowing down the output is a great help. Normally the groupId alone should be enough as a filter, though *:artifactId is probably the fastest if you are looking for a specific dependency.
If you are interested in a flat list of dependencies (not a tree), alphabetically ordered (quite handy in many scenarios), then the following may also help:
mvn dependency:list -Dsort=true -DincludeGroupIds=<groupId>
question 3: How does maven use the libraries in shaded jars, when dependency resolution occurs? Same as any other?
By shaded jars you may mean:
fat jars, which also bring other jars (or their classes) into the classpath. In this case they are seen as one dependency, one unit for Maven dependency mediation, and their content then becomes part of the project classpath. In general, you shouldn't have fat jars among your dependencies, since you don't have control over the packed libraries they bring in.
jars with shaded (renamed) packages. In this case, again, there is no control as far as Maven dependency mediation is concerned: it's one unit, one jar, identified by its GAVC (GroupId, ArtifactId, Version, Classifier), which makes it unique. Its content is then added to the project classpath (according to the dependency scope), but since its packages were renamed, you may get conflicts that are difficult to handle. Again, you shouldn't have jars with renamed packages among your project dependencies (but often you can't know that in advance).
Does any one have any resources that they use?
In general, you should understand well how Maven handles dependencies and use the resources it offers (its tools and mechanisms). Below are some important points:
dependencyManagement is definitely the entry point on this topic: here you can deal with Maven dependency mediation and influence its decisions on transitive dependencies, their versions, and their scope. One important point: what you add to dependencyManagement is not automatically added as a dependency. dependencyManagement is only taken into account once a certain dependency of the project (declared in the pom.xml file or arriving transitively) matches one of its entries; otherwise it is simply ignored. It's an important part of the pom.xml since it helps govern dependencies and their transitive graphs, and that's why it is often used in parent poms: if you want to manage, in one centralized place, which version of, e.g., log4j should be used in all of your Maven projects, you declare it in the dependencyManagement of a common/shared parent pom and make sure it will be used as such. Centralization means better governance and better maintenance.
The dependencies section is important for declaring dependencies: normally, you should declare here only the direct dependencies you need. A good rule of thumb is: declare here, with the default compile scope, only what you actually use in import statements in your code (though you sometimes need to go beyond that, e.g. a JDBC driver required at runtime but never referenced in your code, which would then be in runtime scope). Also remember: the order of declaration matters. The first declared dependency wins in case of a conflict with a transitive dependency, hence by explicitly re-declaring a dependency you can effectively influence dependency mediation.
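As a sketch (the com.example:some-framework coordinates are hypothetical), re-declaring a transitive dependency directly pins its version during mediation:

```xml
<dependencies>
    <!-- Declared first, so in a version conflict with whatever
         some-framework pulls in transitively, this one wins -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.13.4</version>
    </dependency>
    <!-- Hypothetical library that transitively depends on jackson-databind -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>some-framework</artifactId>
        <version>1.0.0</version>
    </dependency>
</dependencies>
```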
Don't abuse exclusions in dependencies to handle transitive dependencies: use dependencyManagement and the order of dependencies for that, if you can. Abusing exclusions makes maintenance much more difficult; use them only if you really need to. Also, when adding exclusions, always add an XML comment explaining why: your team mates and/or your future self will appreciate it.
Use dependency scopes thoughtfully. Use the default (compile) scope for what you really need for compilation and testing (e.g. log4j), use test only (and only test) for what is used exclusively under test (e.g. junit), mind the provided scope for what is already provided by your target container (e.g. servlet-api), and use the runtime scope only for what you need at runtime but should never compile against (e.g. JDBC drivers). Don't use the system scope, since it only brings trouble (e.g. it is not packaged with your final artifact).
Don't play with version ranges unless you have specific reasons, and be aware that a specified version is a minimum requirement by default; the [<version>] expression is the strongest one, but you will rarely need it.
Use a Maven property as a placeholder for the version element of families of libraries, in order to have one centralised place for the versioning of a set of dependencies which should all share the same version value. Classic examples would be a spring.version or hibernate.version property used for several dependencies. Again, centralisation means better governance and maintenance, which also means fewer headaches and less hell.
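A minimal sketch of such a property (the artifact names are from the standard Spring distribution; the version value is illustrative):

```xml
<properties>
    <spring.version>5.3.23</spring.version>
</properties>

<dependencies>
    <!-- Both artifacts stay in lockstep via the single property -->
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-core</artifactId>
        <version>${spring.version}</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-context</artifactId>
        <version>${spring.version}</version>
    </dependency>
</dependencies>
```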
When provided, import a BOM as an alternative to the point above, to better handle families of dependencies (e.g. jboss), delegating the management of a certain set of dependencies to another pom.xml file.
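A sketch of a BOM import (using Jackson's published BOM as an example; the version value is illustrative):

```xml
<dependencyManagement>
    <dependencies>
        <!-- scope=import pulls the BOM's dependencyManagement entries
             into this project, aligning all Jackson module versions -->
        <dependency>
            <groupId>com.fasterxml.jackson</groupId>
            <artifactId>jackson-bom</artifactId>
            <version>2.13.4</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

Dependencies on the individual Jackson modules can then omit their version element entirely.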
Don't (ab)use SNAPSHOT dependencies (or do so as little as possible). If you really need to, make sure you never release using a SNAPSHOT dependency: build reproducibility will otherwise be in great danger.
When troubleshooting, always check the full hierarchy of your pom.xml file; using help:effective-pom can be really useful for checking the effective dependencyManagement, dependencies, and properties that determine the final dependency graph.
Use other Maven plugins to help you with this governance. The maven-dependency-plugin is really helpful during troubleshooting, but the maven-enforcer-plugin also comes to the rescue. Here are a few examples worth mentioning:
The following example will make sure that no one (you, your team mates, your future self) can add a well-known test library in compile scope: the build will fail. It makes sure junit will never reach PROD (e.g. packaged into your war):
<plugin>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>1.4.1</version>
    <executions>
        <execution>
            <id>enforce-test-scope</id>
            <phase>validate</phase>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <bannedDependencies>
                        <excludes>
                            <exclude>junit:junit:*:*:compile</exclude>
                            <exclude>org.mockito:mockito-*:*:*:compile</exclude>
                            <exclude>org.easymock:easymock*:*:*:compile</exclude>
                            <exclude>org.powermock:powermock-*:*:*:compile</exclude>
                            <exclude>org.seleniumhq.selenium:selenium-*:*:*:compile</exclude>
                            <exclude>org.springframework:spring-test:*:*:compile</exclude>
                            <exclude>org.hamcrest:hamcrest-all:*:*:compile</exclude>
                        </excludes>
                        <message>Test dependencies should be in test scope!</message>
                    </bannedDependencies>
                </rules>
                <fail>true</fail>
            </configuration>
        </execution>
    </executions>
</plugin>
Have a look at the other standard rules this plugin offers: many can be useful for breaking the build in undesirable scenarios:
you can ban a dependency (even transitively), which is really handy in many cases
you can fail the build in case a SNAPSHOT is used, handy in a release profile, for example.
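A sketch of the latter, using the plugin's standard requireReleaseDeps rule inside a release profile:

```xml
<profile>
    <id>release</id>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-enforcer-plugin</artifactId>
                <version>1.4.1</version>
                <executions>
                    <execution>
                        <id>no-snapshots-on-release</id>
                        <goals>
                            <goal>enforce</goal>
                        </goals>
                        <configuration>
                            <rules>
                                <!-- Fails the build if any dependency is a SNAPSHOT -->
                                <requireReleaseDeps>
                                    <message>No SNAPSHOT dependencies allowed in a release!</message>
                                </requireReleaseDeps>
                            </rules>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>
```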
Again, a common parent pom could combine more than one of these mechanisms (dependencyManagement, the enforcer plugin, properties for dependency families) and make sure certain rules are respected. You may not cover every possible scenario, but it would definitely decrease the degree of hell you perceive and experience.
Use the Maven Helper plugin to easily resolve conflicts by excluding old versions of dependencies.
In my experience I haven't found anything fully automated, but I find the following approach quite systematic and useful:
First of all, I try to get a clear map of the project structure and the relations between projects. I usually use Eclipse's graphical dependency view, which tells me, for example, if a dependency is omitted because of a conflict with another one.
Moreover, it tells you the resolved dependencies for the project.
I sincerely don't use IntelliJ IDEA, but I believe it has a similar feature.
Usually I try to put very common dependencies higher in the structure, and I exploit the <dependencyManagement> feature to take care of the versions of transitive dependencies and, most importantly, to avoid duplicates in the project structure.
In this Maven - Manage Dependencies blog post you can find a good tutorial about dependency management.
When adding a new dependency to my project, as in your case, I take care of where it is added in my project structure and make changes accordingly, but in most cases the dependency management mechanism is capable of dealing with this problem.
In this Maven Best Practices blog post you can find:
Maven's dependencyManagement section allows a parent pom.xml to define
dependencies that are potentially reused in child projects. This
avoids duplication; without the dependencyManagement section, each
child project has to define its own dependency and duplicate the
version, scope, and type of the dependency.
Obviously, if you need a particular version of a dependency for one project, you can always specify the version you need locally, deeper in the hierarchy.
I agree with you, it can be quite tedious, but dependency management can be of good help.
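A minimal sketch of that pattern (the guava version values are illustrative): the parent manages the version, and a child inherits it without repeating it:

```xml
<!-- parent pom.xml -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>31.1-jre</version>
        </dependency>
    </dependencies>
</dependencyManagement>

<!-- child pom.xml: no version element needed, 31.1-jre is inherited;
     declaring a different version here would override it locally -->
<dependencies>
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
    </dependency>
</dependencies>
```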
Even after replacing all the jars with the same name, you can still have some classes with the same fully qualified name. I used the maven-shade-plugin in one of my projects; it prints classes with the same fully qualified name coming from different jars. Maybe that can help you.
I have a project that has 3rd party dependencies, as well as dependencies on internal projects. I need to strip the version numbers from the dependent artifacts that are developed in-house.
For example: spring-2.5.6.jar should be in the final output as spring-2.5.6.jar but MyInternalProject-1.0.17.jar needs to be changed to MyInternalProject.jar.
I can identify the internal dependencies easily enough by their group ID (they are all something like com.mycompany.*). The maven-dependency-plugin has a stripVersion option, but it does not seem to be selective enough. Is there a way to do this, short of explicitly naming each dependency and what their final name should be?
Phrased another way:
I would like to have different outputFileNameMappings for the maven-assembly-plugin for artifacts based on group ID. Is there a way to do this?
I think you can, using the following recipe:
First, in your aggregator pom, use the dependency:copy-dependencies goal to copy your jars to some intermediate location. You will need two executions: one with <stripVersion>true</stripVersion> for your internal dependencies, and one with <stripVersion>false</stripVersion> for 3rd-party libraries. You may include/exclude artifacts based on groupId; see http://maven.apache.org/plugins/maven-dependency-plugin/copy-dependencies-mojo.html for full details.
Then it should be a simple task to build a .zip using the maven-assembly-plugin!
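The two executions could be sketched like this (the com.mycompany group ID comes from the question; the deps output directory name is an assumption):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <!-- Internal artifacts: copied without version numbers -->
        <execution>
            <id>copy-internal</id>
            <phase>package</phase>
            <goals><goal>copy-dependencies</goal></goals>
            <configuration>
                <includeGroupIds>com.mycompany</includeGroupIds>
                <stripVersion>true</stripVersion>
                <outputDirectory>${project.build.directory}/deps</outputDirectory>
            </configuration>
        </execution>
        <!-- Everything else: versions kept as-is -->
        <execution>
            <id>copy-third-party</id>
            <phase>package</phase>
            <goals><goal>copy-dependencies</goal></goals>
            <configuration>
                <excludeGroupIds>com.mycompany</excludeGroupIds>
                <stripVersion>false</stripVersion>
                <outputDirectory>${project.build.directory}/deps</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>
```

The assembly descriptor can then simply pick everything up from that one directory.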
Based on the comments, I would re-evaluate your approach here. Generally, checking jars into source control is not a good idea, especially unversioned jars. Imagine if I just had a project that referenced someArtifact.jar and I was trying to debug it: how would I know which version it used?
Tools like Artifactory and Nexus were built for storing versions of your jars, both internal and 3rd-party (they can also proxy public repositories like Maven Central). To keep builds reproducible, I would check your binaries into a tool designed for that purpose, and then you can reference them by version. During development you can reference SNAPSHOT versions of your jars to get the latest, and when you do a release you can reference stable versions of your binaries.
Source control systems were meant for storing source, not binaries.
I am still fairly new to Maven. I finally have it set up how I want, but now I need to break it all apart again.
Here is my scenario:
I need to write two different server applications, which use identical core functionality; just, what is done with that framework is very different. One server application is very easy/simple - it's already done - whereas the other is a lot more complicated.
The code is written in a dependency injection style (using Guice, if it matters), so it should be extremely easy to break apart.
My question is this: how would you structure the projects in Eclipse, using Maven? Would you set up three different projects, something like:
server-core
server-appEasy
server-appComplicated
where each server would have its own pom. Or would you keep it all in one project? I need to be able to easily recompile appEasy in, say, a month from now, while I work on appComplicated. The classes for appEasy are already in a subpackage. Note: core would not work by itself without at least a mock dependency injection; it doesn't have a main class.
All thoughts appreciated, even on things I haven't thought of.
I would have a structure like this:
/server
    /server-core
        pom.xml
    /server-appeasy
        pom.xml
    /server-appcomplicated
        pom.xml
    pom.xml
So each project has its own pom.xml that allows you to build that project in isolation.
However, the parent folder also has a pom.xml which, when run, will build all the projects. You can do this by including the projects as modules in the parent pom.
E.g. In the parent pom.xml
<modules>
<module>server-core</module>
<module>server-appeasy</module>
<module>server-appcomplicated</module>
</modules>
You can also use managed dependencies in the parent pom to allow you to centralise external dependency and plugin version numbers.
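Assuming a hypothetical groupId of com.example for the parent, each module's pom.xml would then reference it like this:

```xml
<!-- server-appeasy/pom.xml -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.example</groupId>   <!-- hypothetical coordinates -->
        <artifactId>server</artifactId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <artifactId>server-appeasy</artifactId>
    <dependencies>
        <!-- sibling-module dependency on the shared core -->
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>server-core</artifactId>
            <version>${project.version}</version>
        </dependency>
    </dependencies>
</project>
```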
I would suggest structuring it all as a Maven multi-module project.
The parent project would have the three projects as modules, the third-party dependency versions, and the version of your project as a property.
Then, in server-appComplicated and server-appEasy, I would add a dependency on server-core.
In this way you will gain:
1- A root project to compile (the parent), which will generate the two servers and the core lib.
2- A single place to handle version numbers and the common dependencies.
I hope it helps
I'm not a Maven expert, but here are my 2 cents.
Each project needs its own pom.
Do you need to build all the projects together? In that case it might make sense to have a parent pom which has all the common dependencies.
EDIT: In that case, I feel you should just have three separate pom files, one for each project.
There are multiple ways to do this, depending on what you need when it comes to deployment. Assuming that 'server-core' is a shared artifact among your 'server-appEasy' and 'server-appComplicated' artifacts, I would suggest something like the following:
Create a Maven Project 'server-core'
Add two module projects, named 'server-appEasy' and 'server-appComplicated'
Make sure the module projects have their parent set to 'server-core'
In the end you should have three projects (each has separate pom.xml), where
a. Building 'server-core' will also build the modules
b. Building either the 'easy' or the 'complicated' module independently, as needed, will also build the server-core.
Hope this helps!