What are unused/undeclared dependencies in Maven? What to do with them?

Maven dependency:analyze complains about the dependencies in my project. How does it determine which are unused and which are undeclared? What should I do about them?
Example:
$ mvn dependency:analyze
...
[WARNING] Used undeclared dependencies found:
[WARNING] org.slf4j:slf4j-api:jar:1.5.0:provided
[WARNING] commons-logging:commons-logging:jar:1.1.1:compile
[WARNING] commons-dbutils:commons-dbutils:jar:1.1-osgi:provided
[WARNING] org.codehaus.jackson:jackson-core-asl:jar:1.6.1:compile
...
[WARNING] Unused declared dependencies found:
[WARNING] commons-cli:commons-cli:jar:1.0:compile
[WARNING] org.mortbay.jetty:servlet-api:jar:2.5-20081211:test
[WARNING] org.apache.httpcomponents:httpclient:jar:4.0-alpha4:compile
[WARNING] commons-collections:commons-collections:jar:3.2:provided
[WARNING] javax.mail:mail:jar:1.4:provided
Note:
A lot of these dependencies are used in my runtime container and I declared them as provided to avoid having the same library on the classpath twice with different versions.

I'm not sure exactly how Maven determines this. You are not required to address every item it reports, but the information can be used as appropriate.
Used undeclared dependencies are those that are required, but have not been explicitly declared as dependencies in your project. They are nevertheless available thanks to the transitive dependencies of other dependencies in your project. It is a good idea to declare these dependencies explicitly. This also allows you to control the versions of these dependencies (perhaps matching the versions provided by your runtime).
As for unused declared dependencies, it is a good idea to remove them. Why add an unnecessary dependency to your project? However, transitivity can bring these in anyway, possibly conflicting with your runtime versions. In that case, you will need to declare them explicitly, essentially to control the version.
By the way, mvn dependency:tree prints the dependency tree of the project, which gives you a better perspective on how each dependency fits into your project.
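For example, a minimal sketch of promoting one of the used-undeclared artifacts from the warning above to an explicit dependency (the version shown is only illustrative; pin whatever your runtime actually provides):
<dependency>
  <!-- previously reached only transitively; now declared explicitly so the version is under our control -->
  <groupId>commons-logging</groupId>
  <artifactId>commons-logging</artifactId>
  <!-- illustrative version, taken from the warning output above -->
  <version>1.1.1</version>
</dependency>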

The answer to:
"How does it determine which are unused and which are undeclared?".
Maven uses the ObjectWeb ASM framework, which analyzes your raw bytecode. It goes through all of your classes and builds a list of every class they reference. That is the how.
As to what to do, I would not recommend removing the "unused declared dependencies" unless you are absolutely sure they are actually unused.
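If you want this analysis to run on every build rather than only when invoked by hand, a minimal sketch of binding the analyze-only goal is shown below (the explicit verify phase and the failOnWarning flag are my assumptions; adjust to taste):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>analyze</id>
      <!-- run after the classes have been compiled and packaged -->
      <phase>verify</phase>
      <goals>
        <!-- analyze-only reuses the already compiled classes instead of forking a new build -->
        <goal>analyze-only</goal>
      </goals>
      <configuration>
        <!-- assumption: fail the build on warnings; set to false to merely report them -->
        <failOnWarning>true</failOnWarning>
      </configuration>
    </execution>
  </executions>
</plugin>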

Used undeclared dependencies
Simply put, these are the transitive dependencies that you are using WITHOUT declaring them explicitly in your POM file.
In the diagram below, they are the orange-colored ones.
Hint:
It is a good idea to declare them in your POM file so that you are loosely coupled to your first-level dependencies: if those dependencies later change their implementation and stop using this transitive dependency, your application will still be safe!
Unused declared dependencies
Simply put, these are the dependencies that you declare in your POM file WITHOUT using them in your application code.
In the diagram below, they are the red-colored ones.
Hint:
It is a good idea to remove them from your POM file: they are not used, removing them reduces the final size of the application artifact, and it prevents developers from accidentally using the wrong classes!

This can easily be fixed by adding ignoredUnusedDeclaredDependencies to the maven-dependency-plugin configuration in pom.xml:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <!-- plugin-level configuration applies both to mvn dependency:analyze run from the
       command line and to any analyze executions bound to the build -->
  <configuration>
    <ignoredUnusedDeclaredDependencies>
      <ignoredUnusedDeclaredDependency>org.slf4j:slf4j-api</ignoredUnusedDeclaredDependency>
    </ignoredUnusedDeclaredDependencies>
  </configuration>
</plugin>

Related

How to list all dependencies of a package from maven (including scopes)

I have a graph with 40K artifacts and I can list all possible dependencies of a package (I do so by parsing a list of effective poms)
For example, I have the following for this package:
There are 2 dependencies, not counting different versions.
I would like to show that these results are valid by showing that Maven also lists these dependencies for this package. But when I run mvn dependency:tree after adding com.google.guava:guava:14.0.1, no dependencies are listed for it.
This is the pom file of the package:
It clearly has those 2 dependencies, but their scopes are provided. Even if I use -Dinclude=provided or -Dscope=provided as a parameter, I still cannot list them.
So, how do I list all dependencies of a package no matter the scope used?
Use the Analyze Dependencies... action in the Maven tool window:
It will show the list of dependencies in the project together with their scopes and usages in the project:
Scope provided means the dependency is expected to be provided at runtime by the environment, which implies that it is not a transitive dependency of the package:
A dependency with this scope is added to the classpath used for compilation and test, but not the runtime classpath. It is not transitive.
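In other words, even when guava itself is declared as a normal compile-scope dependency, its own provided-scope dependencies never appear in your project's tree, because provided is not transitive. A minimal sketch (guava 14.0.1 as in the question; the comment describes the behaviour, not extra configuration):
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>14.0.1</version>
  <!-- guava's own provided-scope dependencies stay on guava's compile/test classpath
       only; they are not transitive, so mvn dependency:tree in this project
       will not list them under guava -->
</dependency>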

Is there a way to use jars with no-compliant name? [duplicate]

My project depends on the Netty Epoll transport. Here is the dependency:
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-transport-native-epoll</artifactId>
  <version>${netty.version}</version>
  <classifier>${epoll.os}</classifier>
</dependency>
The auto-generated module name for this dependency is:
netty.transport.native.epoll
And as native is a reserved Java keyword (and therefore not allowed as a segment of a module name), I can't add this module as a dependency of my project:
module core {
  requires netty.transport.native.epoll;
}
Due to:
module not found: netty.transport.<error>
Additionally the jar tool --describe-module reports the following:
Unable to derive module descriptor for:
netty-transport-native-epoll-4.1.17.Final-SNAPSHOT-linux-x86_64.jar
netty.transport.native.epoll: Invalid module name: 'native' is not a
Java identifier
Are there any workarounds? (except "release correct netty artifact", of course).
EDIT:
As a quick fix for maintainers, you can add the following manifest entry to the build:
<manifestEntries>
  <Automatic-Module-Name>netty.transport.epoll</Automatic-Module-Name>
</manifestEntries>
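For context, that manifestEntries element normally sits under the maven-jar-plugin's archive configuration; a sketch (the plugin version is omitted, and the module name simply mirrors the suggestion above):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifestEntries>
        <!-- a stable automatic module name that avoids the reserved word 'native' -->
        <Automatic-Module-Name>netty.transport.epoll</Automatic-Module-Name>
      </manifestEntries>
    </archive>
  </configuration>
</plugin>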
The solution to this seems to be:
One possible way to keep using the same artifact name with a new (different) module name is to package the artifact's META-INF/MANIFEST.MF with an Automatic-Module-Name attribute, which governs the name used for the module when the JAR is treated as an automatic module.
OR
Artifact owners can add module declarations to their JARs using module-info.java (this could result in a slow, bottom-up migration).
The module declaration is defined in the spec as:
A module declaration introduces a module name that can be used in
other module declarations to express relationships between modules. A
module name consists of one or more Java identifiers (§3.8) separated
by "." tokens.
Interestingly, the spec also suggests:
In some cases, the Internet domain name may not be a valid package
name. Here are some suggested conventions for dealing with these
situations:
If the domain name contains a hyphen, or any other special character
not allowed in an identifier (§3.8), convert it into an underscore.
If any of the resulting package name components are keywords (§3.9),
append an underscore to them.
If any of the resulting package name components start with a digit, or
any other character that is not allowed as an initial character of an
identifier, have an underscore prefixed to the component.
But keep in mind as you do so that the underscore on its own is a keyword in Java 9:
int _;       // error: as of Java 9, '_' alone is a keyword and javac rejects it
int _native; // works fine
You can also use this small Maven plugin to automatically modify the manifest file of a Scala jar in your local Maven repo: https://github.com/makingthematrix/scala-suffix
Under the link you will find an overview of the whole issue and what you need to add to your pom.xml, but I was asked to also explain it here, so here it goes:
As mentioned already, Java does not recognize suffixes in module names like _2.13 as version numbers and treats them as integral parts of the module names. So when your project tries to use a class from the Scala dependency, it looks for your.scala.dependency.2.13 instead of just your.scala.dependency, fails to find it, and crashes.
To fix this on your side (i.e. without any action from the library's creator) add this to the <plugins> section of your pom.xml:
<plugin>
  <groupId>io.github.makingthematrix</groupId>
  <artifactId>scala-suffix-maven-plugin</artifactId>
  <version>0.1.0</version>
  <configuration>
    <libraries>
      <param>your-scala-dependency</param>
    </libraries>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>suffix</goal>
      </goals>
    </execution>
  </executions>
</plugin>
where your-scala-dependency is the name of your Scala dependency without the version suffix (if there is more than one, just add them with more <param> tags). This should be the same as the artifactId in your <dependency> section.
The plugin modifies the dependency's JAR file in your local Maven repository. It opens the jar, reads META-INF/MANIFEST.MF and adds to it a line:
Automatic-Module-Name: your-scala-dependency
If the property Automatic-Module-Name already exists, the plugin does nothing - we assume that in that case the dependency should already work. This prevents the plugin from modifying the same JAR file more than once.

Maven-dependency-plugin and annotations with SOURCE RetentionPolicy

In a Maven project where I am using the maven-dependency-plugin to detect unused dependencies, there is seemingly no dependency scope I can specify for Google's AutoValue (com.google.auto.value:auto-value) that will convince the plugin the dependency is being used, even though annotations from the package are used (e.g. @AutoValue) and the project won't build if auto-value is excluded.
Now one solution is simply adding a configuration entry to my plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <configuration>
    <usedDependencies>
      <usedDependency>com.google.auto.value:auto-value</usedDependency>
    </usedDependencies>
  </configuration>
</plugin>
But I would be curious to know whether it's possible to configure either the maven-dependency-plugin or the dependency entry for auto-value in a way that detects usage of the dependency via its annotations.
My suspicion is that this isn't possible because the annotations I'm using from auto-value have RetentionPolicy.SOURCE and are discarded by the compiler. Is this correct?
Unfortunately, your suspicion is correct. The maven-dependency-analyzer documentation specifically lists this as a concern for source-level annotations: http://maven.apache.org/shared/maven-dependency-analyzer/
Warning: Analysis is not done at source but bytecode level, then some cases are not detected (constants, annotations with source-only retention, links in javadoc) which can lead to wrong result if they are the only use of a dependency.
You can force AutoValue as used with usedDependencies as you have in your example or use the ignoredUnusedDeclaredDependencies configuration instead (which is what I did recently).
I don't believe it is possible to configure the dependency section to avoid this, because Maven doesn't provide a compile-only scope. I mark AutoValue with the provided scope to keep it out of any shaded jars I might make.
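For illustration, that looks roughly like the following in the POM (the version property is a placeholder, not something from the question):
<dependency>
  <groupId>com.google.auto.value</groupId>
  <artifactId>auto-value</artifactId>
  <!-- placeholder: use whatever AutoValue version your project already declares -->
  <version>${auto-value.version}</version>
  <!-- provided keeps the annotation-only dependency off the runtime classpath and out of shaded jars -->
  <scope>provided</scope>
</dependency>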
Lastly, you could write (or find, if it exists) a custom dependency analyzer that takes source-level annotations into account. See the documentation at http://maven.apache.org/plugins/maven-dependency-plugin/analyze-mojo.html#analyzer. It is probably not worth the effort.

Maven: Resolving Duplicate Dependencies

I'm developing an application that will be used internally at our company. In order for it to interop with our other internal systems I have to use some maven dependencies that we use internally, but this is causing some issues with using some external 3rd party dependencies that I also need.
So essentially my pom looks like this:
<dependencies>
  <dependency>
    internal-framework-artifact
  </dependency>
  <dependency>
    necessary-third-party-artifact
  </dependency>
</dependencies>
I've come to find that both of these dependencies have Apache's commons-collections as one of their own dependencies (among a large number of others, but we'll keep it to one for this question's simplicity).
If I place exclusion rules for commons-collections on both of them, I can compile the project, but my resulting jar won't have access to either version of commons-collections and will just produce a java.lang.NoClassDefFoundError at runtime. Removing the exclusion rule from either of them results in an enforcer failure during the build:
[WARNING] Rule 2: org.apache.maven.plugins.enforcer.BanDuplicateClasses failed with message:
Duplicate classes found:
I've been looking through various SO Q&As and I can't really seem to find something that's 100% relevant to my situation. I'm really at a loss as to how to resolve this. Am I missing something really obvious?
I've never actually used the maven-shade-plugin for shading, but I think this is the exact use case it was designed for.
Create a new project that uses the maven-shade-plugin (see: http://maven.apache.org/plugins/maven-shade-plugin/) to produce an uber-jar version of internal-framework-artifact which contains the classes in internal-framework-artifact and all of its dependencies. Configure the plugin so that it relocates all the classes that are also dependencies of necessary-third-party-artifact to non-conflicting package names. This new project should produce a .jar with a different name, something like internal-framework-artifact-with-dependencies.
Now modify your original pom so that it is dependent on internal-framework-artifact-with-dependencies instead, and it should work.
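A rough sketch of the shade configuration for that wrapper project is shown below. The relocation of commons-collections is just the example from the question, and the shaded package prefix internal.shaded is a placeholder:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- move the copy of commons-collections bundled in this uber-jar to a package
                 that cannot clash with the copy pulled in by necessary-third-party-artifact -->
            <pattern>org.apache.commons.collections</pattern>
            <shadedPattern>internal.shaded.org.apache.commons.collections</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>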
