I am coming from a .NET background and I need to do some Java work these days. One thing I don't quite understand is how the Java runtime resolves its jar dependencies. For example, I want to use javax.jcr to do some node adding. So I know I need to add these two dependencies, because I need to use javax.jcr.Node and org.apache.jackrabbit.commons.JcrUtils.
<dependency>
    <groupId>javax.jcr</groupId>
    <artifactId>jcr</artifactId>
    <version>2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.jackrabbit</groupId>
    <artifactId>jackrabbit-jcr-commons</artifactId>
    <version>2.8.0</version>
</dependency>
Now it passes compilation, but I get an exception at runtime. Then someone told me to add one more dependency, which solved the problem.
<dependency>
    <groupId>org.apache.jackrabbit</groupId>
    <artifactId>jackrabbit-jcr2dav</artifactId>
    <version>2.6.0</version>
</dependency>
From my understanding, jackrabbit-jcr-commons needs jackrabbit-jcr2dav to run. If a jar is missing a dependency, how can it pass compilation? And how would I know that I am missing this particular dependency of jackrabbit-jcr-commons? This is a general question; it doesn't have to be specific to Java JCR.
Java doesn't have any built-in way to declare dependencies between libraries. At runtime, when a class is needed, the Java ClassLoader tries to load it from all the jars on the classpath, and if the class is missing, you get an exception. All the jars you need must be explicitly listed on the classpath. You can't just add one jar and hope for Java to transitively load classes from that jar's dependencies, because jar dependencies are a Maven concept, not a Java concept. Nothing, by the way, forbids a library writer from compiling 1000 interdependent classes at once but then distributing the compiled classes across three different jars.
So what's left is Maven. I know nothing about JCR. But if a jar A published on Maven depends on a jar B published on Maven, then it should list B in its list of dependencies, and Maven should download B when it downloads A (and put both jars in the classpath).
The problem, however, is that some libraries have a loose dependency on other libraries. For example, Spring has native support for Hibernate. If you choose to use Spring with Hibernate, then you will need to explicitly declare Hibernate in your dependencies. But you could also choose to use Spring without Hibernate, and in that case you don't need to put Hibernate in the dependencies. Spring thus chooses to not declare Hibernate as one of its own dependencies, because Hibernate is not always necessary when using Spring.
In the end, it boils down to reading the documentation of the libraries you're using, to know which dependencies you need to add based on the features you use from these libraries.
Maven calculates transitive dependencies at compile time, so compilation passes fine. The issue here is that, by default, Maven won't build a proper java -cp command line to launch your application with all of its dependencies (direct and transitive).
Two options to solve it:
Adjust your Maven project to build a "fat jar" -- a jar which includes all the needed classes from all dependencies. See this SO answer for a pom.xml snippet to do this: https://stackoverflow.com/a/16222971/162634 (a sketch also follows these two options). Then you can launch with just java -cp myfatjar.jar my.app.MainClass
For a multi-module project with several resulting artifacts (that is, usually, different Java programs), it makes sense to write a custom assembly.xml which tells Maven how to package your artifacts and which dependencies to include. You'll need to provide some kind of script in the resulting package which contains the proper java -cp ... command. As far as I know, there's no "official" Maven plugin to build such a script during compilation/packaging.
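For the first option, here is a minimal sketch of a maven-shade-plugin configuration (the plugin version and the main class my.app.MainClass are placeholders, not something from the original question):
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <!-- placeholder version; use a current release -->
            <version>3.4.1</version>
            <executions>
                <execution>
                    <!-- bind to the package phase so "mvn package" produces the fat jar -->
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <!-- record Main-Class in the manifest so the jar can also be run with java -jar -->
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>my.app.MainClass</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
With that in place, the shaded jar in target/ contains your classes plus all direct and transitive dependencies, so a plain java -cp (or java -jar) invocation finds everything at runtime.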
There's a free Maven book which more or less explains how dependencies and assemblies work.
Your question mixes Maven (a Java-centric dependency resolution tool) with Java's compile-time and run-time class resolution. The two are quite different.
A Java .jar is, in simplified terms, a .zip file of Java .class files. During compilation, each Java source file, say MyClass.java, results in a Java bytecode file with the same name (MyClass.class). For compilation to be successful, all classes mentioned in a Java file must be available on the classpath at compile time (but note that the use of reflection and run-time class-name resolution, à la Class.forName("MyOtherClass"), can avoid this entirely; also, you can use several class loaders, which may be scoped independently of each other...).
However, after compilation, you do not need to place all your .class files together in the same jar. Developers can split their .class files between jars however they see fit. As long as the classes a program refers to at compile time and loads at run time have all of their dependencies available at compile time and run time respectively, you will not see any errors. Classes in a .jar file are not recompiled when you compile a program that uses them; but if any of their dependencies is missing at run time, you will get a run-time exception.
When using Maven, each Maven artifact (typically a jar file) declares, in its pom.xml file, the artifacts that it depends on. If it makes any sense to use my-company:my-library-core without needing my-company:my-library-random-extension, it is best practice not to make -core depend on -random-extension, although typically -random-extension will depend on -core. Any dependencies of an artifact that you depend on will be resolved and "brought in" when Maven runs.
Also, from your question, a word of warning -- it is highly probable that jackrabbit-jcr2dav version 2.6.0 expects to run alongside jackrabbit-jcr-commons version 2.6.0, and not 2.8.0.
If I had to guess (without spending too much time delving into the Maven hierarchies of this particular project), I believe your problem is caused by the fact that jackrabbit-jcr-commons has an optional dependency on jackrabbit-api. That means that you will not automatically get that dependency (and its dependencies) unless you re-declare it in your POM.
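If that is indeed the cause, the fix is simply to redeclare the optional dependency in your own POM; a sketch (the version here is only an example and should be aligned with the other Jackrabbit artifacts you use):
<dependency>
    <groupId>org.apache.jackrabbit</groupId>
    <artifactId>jackrabbit-api</artifactId>
    <!-- example version; align it with your other Jackrabbit jars -->
    <version>2.6.0</version>
</dependency>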
Generally speaking, optional dependencies are a band-aid solution to structural problems within a project. To quote the maven documentation on the subject (http://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html):
Optional dependencies are used when it's not really possible (for whatever reason) to split a project up into sub-modules. The idea is that some of the dependencies are only used for certain features in the project, and will not be needed if that feature isn't used. Ideally, such a feature would be split into a sub-module that depended on the core functionality project...this new subproject would have only non-optional dependencies, since you'd need them all if you decided to use the subproject's functionality.
However, since the project cannot be split up (again, for whatever reason), these dependencies are declared optional. If a user wants to use functionality related to an optional dependency, they will have to redeclare that optional dependency in their own project. This is not the most clear way to handle this situation, but then again both optional dependencies and dependency exclusions are stop-gap solutions.
Generally speaking, exploring the POMs of your dependencies will reveal this kind of problem, though that process can be quite painful.
From the viewpoint of a Gradle java library author, I understand that a dependency specified in the implementation configuration will be marked with the runtime scope in the resulting POM file that gets published (using the maven-publish Gradle plugin). This makes sense, as anyone consuming my published library doesn't need the dependency for compilation (it is an internal dependency), but instead only for runtime. If I specify a dependency in the api configuration, it will be marked with the compile scope in the resulting POM file, which again makes sense, as anyone consuming my library needs this for compilation (and runtime).
This makes me believe that the meaning of a Maven dependency scope is relative to anyone consuming the component, and not relative to the component itself. For a published Maven library (containing Java class files), a dependency marked with compile should mean:
If you compile against me, then use this dependency on the compilation classpath too!
However, according to the Maven docs, it seems that it means:
I was compiled with that dependency on my compilation classpath, and if you want to compile me again, do the same!
If this were true, then one could not distinguish between API dependencies and implementation dependencies the way Gradle does. Also, it would only make sense if the published component actually contained the sources, not only the class files.
Did Gradle actually "misuse" the meaning of these scopes to make some improvements, or did I fundamentally misunderstand something?
Gradle cleverly "misuses" the scopes.
Maven has the design flaw that the build POM is published 1:1 as the consumer POM (this will change with the upcoming Maven 4.x). So Maven has no way to say that something is needed for compilation within the project itself, but only at runtime when the project is consumed by another project (at least not without applying tricks). The Maven docs therefore do not discuss the possibility of "implementation/api".
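As an illustration, suppose a Gradle library declares one api dependency and one implementation dependency (the com.example artifacts below are hypothetical). The POM published by the maven-publish plugin would contain roughly this:
<dependencies>
    <!-- declared with "api" in Gradle: consumers get it on their compile classpath -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>exposed-lib</artifactId>
        <version>1.0</version>
        <scope>compile</scope>
    </dependency>
    <!-- declared with "implementation" in Gradle: consumers only get it at runtime -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>internal-lib</artifactId>
        <version>1.0</version>
        <scope>runtime</scope>
    </dependency>
</dependencies>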
Sorry for my newbie question.
Suppose I have a package A which declares B and C as its dependencies in its Maven file, for example. B and C use two different versions of log4j for logging. I have a couple of questions:
If I use Maven and declare B and C as A's dependencies, when Maven pulls in the artifacts (.jar) of B and C from the Maven Central repo, do the B and C jar files contain the log4j class files, or only their own compiled files (B's and C's own sources, not their dependencies)?
If I understand correctly, when the build happens, at the end there will only be one log4j in the build (even if B and C use different versions of log4j). Which version of log4j is selected for the build? Does it mean that I need to declare log4j as a dependency of A as well (in A's Maven build file), and that version will be the one selected for the build?
B and C might use totally different log4j versions, whose APIs might be completely different. Shouldn't that cause problems at run time? But in reality it seems very rare. Why so?
Thanks.
Jar files usually do not contain their dependencies. There is a way to do this, called a fat jar (see What is a fat JAR?), but let's assume you are using regular jar dependencies. The jars only declare their dependencies in their own pom.xml. So for your example, B and C will contain only their own compiled source code.
It really depends on how you package the files. In general, if you only generate a simple jar, it will not contain the dependencies, and it is the responsibility of the runner to supply the correct dependencies. In the case of a war, for example, Maven will throw in all the dependencies. Another way, as mentioned before, is a fat jar. One more common way is to zip all the dependencies and supply them separately.
I do not know why you have not encountered a conflict before; I have, with many other libraries, though I do not remember a case with log4j. One way to handle these kinds of conflicts, as a library maintainer, is to change the package name when you make a non-backward-compatible change; that way the user can safely have multiple versions on the classpath (which should be avoided anyway).
Maven has a way to resolve these kinds of conflicts: it gives priority to the dependency version defined closest to the root. For example, if you have version a declared in A and version b declared in B, then the effective version will be a, because A is the root of the dependency tree.
Also, there are some other mechanisms like dependency management. You can look here: https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html
This topic is very serious and may cause a lot of hard-to-detect production errors. Hope this helps...
They should only contain their own classes, not the classes of their dependencies. You can just open the jar files and see for yourself; jar files are just zip files.
Maven will solve the conflict by picking the version that is the closest to the root of the dependency tree. If both versions are at the same depth in the dependency tree, then the first one is picked (IIRC). If A itself depends on log4j, or if you want a specific version to be used at runtime, you should specify log4j as a direct dependency of A, with the version you want. Or at least specify it in the dependencyManagement section of your build.
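For example, a sketch of pinning the log4j version for the whole build of A via dependencyManagement (coordinates and version are illustrative):
<dependencyManagement>
    <dependencies>
        <!-- whatever versions B and C ask for, this one wins when A is built -->
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>
    </dependencies>
</dependencyManagement>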
Because libraries as popular as Log4J strive to have a very stable API, which thus doesn't break code compiled against older versions of the library.
An artefact will normally not contain its dependencies (however there are packaging options that do).
Maven will determine only one version using some rules (that I don't remember in detail). If you have to override this for some reason you can put a dependency management section into the POM.
Yes this can cause problems. They can only be avoided by being careful when making changes to the public API.
If log4j is specified as a dependency of B and C, and you do not use special plugins that create uber-jars/fat-jars, then neither B nor C will contain log4j class files.
Only one dependency with the same coordinates (groupId, artifactId) is used. As someone here already mentioned, the version is usually picked by the shortest path to the root. So if you want to use a specific log4j version, you can just specify it in your pom.
If you use log4j the standard way, i.e. by just specifying the config files, both versions (log4j and log4j2) can usually coexist, because they use different packages and different configuration files. Just check the migration page of log4j: Migration from log4j to log4j2
We are building an EAR that is going to run on WebSphere, where j2ee.jar is provided.
Now we have the situation that an ejb (call it ejb.jar) depends on another jar (call it util.jar) which depends on j2ee.jar.
If we mark j2ee.jar as "provided" in the pom of util.jar, ejb.jar won't build, because provided dependencies are not transitive. If we mark it as "compile", it may become a compile dependency of the ear, unless we override the scope.
What is the best approach? Should util.jar have provided dependencies, even if it is just a humble jar? Or should jars only have compile dependencies?
JARs can have provided dependencies... but the user having a dependency on it needs to make sure that this dependency is actually going to be provided at run-time. Since provided dependencies are not transitive, they also need to make sure that they do not depend on it for compilation; but if they do, the best practice would be to declare it explicitly with the compile (or provided) scope, and not rely on some form of transitivity (look at the analyze goal of the Dependency Plugin, which, for example, lists used, but undeclared, dependencies).
Provided dependencies in JARs can be useful when creating executable JARs. Consider the building of an uber-jar (a JAR with the classes of all of its dependencies included in it): you may want to say that a specific dependency shouldn't end up in the uber-jar, because the container launching it will provide it at run-time.
Also, a JAR may need a dependency to compile its code but not actually need it to run; as an example, consider Maven plugins, which declare maven-plugin-annotations as a provided dependency because they only need the annotations to be built.
Final point: there are JARs that have a good idea of the context in which they are going to be used. Spring WebMVC, for example, certainly depends on the Servlet API to compile, but at run-time it knows it's going to be used in a Java EE context, and that the Servlet API will be provided by the Java EE server.
As a rule of thumb though, apart from the cases above, you probably don't want to have provided dependencies inside of a JAR project: it should be up to the client to decide whether some compile-time dependencies of yours are going to be provided in their specific case, and to override the scope accordingly. As a library writer, you don't really know how your library is going to be used.
In your specific case, since ejb.jar actually needs j2ee.jar to compile, it would be best to declare that dependency with the compile, or even the provided, scope, regardless of what scope util.jar has set for j2ee.jar. (I'll note that it's a bit odd for a utility JAR to have a dependency on what appears to be a JAR of Java EE web application classes.)
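Concretely, a sketch of what ejb.jar's pom could declare (the coordinates below are placeholders; use whatever coordinates your WebSphere-provided Java EE jar is published under):
<dependency>
    <!-- placeholder coordinates for the container-provided Java EE APIs -->
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
    <version>6.0</version>
    <!-- needed to compile, but WebSphere supplies it at run-time, so keep it out of the EAR -->
    <scope>provided</scope>
</dependency>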
I'm getting below error in STS:
The type org.springframework.core.env.EnvironmentCapable cannot be resolved. It is indirectly referenced from required .class files
This sounds like a transitive dependency issue. What this means is that your code relies on a jar or library to do something - evidently, you depend on Spring framework code. Well, all that Spring code also depends on libraries and jars.
Most likely, you need to add the correctly versioned org.springframework.core jar to your classpath so that the EnvironmentCapable class can be found when your IDE attempts to build your project.
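If the project is a Maven project, a sketch of the dependency to add would look like this (EnvironmentCapable lives in the spring-core artifact; the version is a placeholder and should match your other Spring jars):
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-core</artifactId>
    <!-- placeholder version; keep it in line with the rest of your Spring dependencies -->
    <version>4.2.5.RELEASE</version>
</dependency>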
This might also be a jar collision issue, although that sounds less likely. When an application experiences jar collision (also known as "DLL hell"), the compiler finds multiple jars and classes with the same fully-qualified name. For example, let's say you added Spring to your classpath, along with the entire Tomcat server library. Those two libraries may contain classes with exactly the same names, maybe the same version, maybe different versions. Either way, when the compiler looks for that EnvironmentCapable class, it finds two (in this contrived example) - one in the Spring jar and one in the Tomcat jar. It doesn't know which one to choose, so it throws a NoClassDefFoundError, which could manifest itself as the error you experienced.
I faced the same error while working with Spring Security (spring-security-config). I just deleted that jar from the Maven repo and did Maven -> Update Project in Eclipse.
That resolved it. Please try it.
From the command line, run "mvn clean install"; you'll see the project fail, and the logs will show which artifacts cause the problem.
After that, remove those artifacts from .m2/repository, then run a Maven update from Eclipse.
To avoid jar collisions, make sure you declare your dependency versions under the properties tag in the aggregate pom.xml, and use the property name as a placeholder throughout the project. For example, declare <spring.version>4.2.5.RELEASE</spring.version> in the parent pom, and then in the child modules just use ${spring.version} instead of 4.2.5.RELEASE. This way you can avoid having two different versions of the same library on the classpath.
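A sketch of that setup, assuming spring.version as the property name:
<!-- parent/aggregate pom.xml -->
<properties>
    <spring.version>4.2.5.RELEASE</spring.version>
</properties>

<!-- child module pom.xml -->
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-web</artifactId>
    <!-- resolved from the parent's properties section -->
    <version>${spring.version}</version>
</dependency>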
Also, it is recommended to be consistent with the versions of your Spring dependencies: use the same version for spring-core, spring-web, etc.
If you are using Maven, you can use the Maven Enforcer Plugin to ensure dependency convergence and avoid further issues with transitive dependencies.
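A sketch of the enforcer configuration with the dependencyConvergence rule (the plugin version is a placeholder):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>3.0.0</version>
    <executions>
        <execution>
            <id>enforce-dependency-convergence</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <!-- fails the build if two different versions of the same artifact would be resolved -->
                    <dependencyConvergence/>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>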
This is more a question about what's out there, and about future directions for resolution tools such as Ivy. Is there anything that can express class-level dependencies for packages, rather than package-level dependencies?
For example, let's say I have an apache-xyxy package that comes with an ivy.xml listing all its dependencies. But suppose I only use class WX from apache-xyxy, which doesn't require most of those dependencies. Couldn't a resolver be intelligent enough to identify that class WX can only possibly invoke a certain set of other classes (AB, DC, EF), and that none of those classes use any of the other dependencies, and so compute a minimal subset of required dependencies? This would be easier and safer than cherry-picking package dependencies to remove based on the specific classes used from that package, and it would also avoid breaking larger packages into smaller ones just for this reason.
Then, if I later decided to use class GH from apache-xyxy, I could do an ivy resolve, and it would dynamically bring in the additional required libraries.
When packaging compiled Java code for distribution, it's common practice to bundle Java "packages" together. It's also quite possible (but silly) to split a Java package across multiple jars. Large frameworks (like Spring) have lots of sub-packages in different jars so that users can pick and choose what they need at run-time... Of course, the more jar options one has, the more complex it becomes to populate the run-time classpath...
The keyword here is "run-time"... Tools like Apache Ivy and Apache Maven are primarily designed to manage the dependencies needed at build time...
Apache Maven does have a "runtime" scope for its dependencies, but it's limited to a single list of jars. Typically this scope is used for deciding which jars are needed for testing and for populating the lib directory of a WAR file.
Apache Ivy has a similar, more flexible mechanism called "configurations". It's possible to create as many runtime configurations as you need, and these can be used to decide which jars are downloaded by Ivy.
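For instance, a sketch of an ivy.xml that separates a lean runtime configuration from the compile one (module names and revisions are made up):
<ivy-module version="2.0">
    <info organisation="com.example" module="myapp"/>
    <configurations>
        <conf name="compile" description="needed to build"/>
        <!-- only what the launched program really needs -->
        <conf name="runtime" extends="compile" description="needed to run"/>
    </configurations>
    <dependencies>
        <dependency org="org.apache" name="apache-xyxy" rev="1.0" conf="compile->default"/>
        <dependency org="log4j" name="log4j" rev="1.2.17" conf="runtime->default"/>
    </dependencies>
</ivy-module>
Resolving with the runtime configuration then pulls down only the jars mapped to it, which is the closest Ivy gets to the per-feature selection asked about above.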
So while it would appear that Ivy has the answer, I've rarely seen Ivy used when launching programs (the one exception being Groovy's Grape annotations).
So what, you might ask, is the answer?
The future of "run-time" classpath management is either OSGi or Project Jigsaw. I'm more familiar with OSGi, where special dependency indicators are added to the jar file's manifest, stating what its dependencies are. The idea is that when a container loads a jar (called a "bundle"), it can check whether the other dependencies are already loaded. These dependencies can be retrieved and loaded from a common repository. This is a fundamentally different way to launch Java. Traditionally, each application is loaded onto its own isolated classpath...
Time will tell if either project catches on. In the meantime we use Apache Ivy and Apache Maven to build self-contained, and possibly over-bloated, WAR (EAR, etc.) packages.