This is more a question about what's out there, and about future directions for resolving tools such as Ivy. Is there anything that can express class-level dependencies, rather than package-level dependencies?
For example, let's say I have an apache-xyxy package that comes with an ivy.xml listing all its dependencies. But suppose I only use class WX in apache-xyxy, which doesn't require most of those dependencies. Couldn't a resolver be intelligent enough to identify that class WX can only possibly invoke a certain set of other classes (AB, DC, EF), and that none of those classes uses any of the other dependencies, and so compute a minimal subset of required dependencies? This would be easier and safer than cherry-picking package dependencies to remove by hand based on which classes are actually used, and it would also avoid having to break larger packages into smaller ones just for this reason.
Then, if I later decided to use class GH from apache-xyxy, I could do an Ivy resolve, and it would dynamically bring in the additional required libraries.
When packaging compiled Java code for distribution, it's common practice to bundle Java "packages" together. It's also quite possible (but silly) to split a Java package across multiple jars. Large frameworks (like Spring) have lots of sub-packages in different jars so that users can pick and choose what they need at run-time. Of course, the more jar options one has, the more complex it becomes to populate the run-time classpath.
The keyword here is "run-time". Tools like Apache Ivy and Apache Maven are primarily designed to manage dependencies needed at build time.
Apache Maven does have a "runtime" scope for its dependencies, but it's limited to a single list of jars. Typically this scope is used for deciding which jars are needed for testing and for populating the lib directory of a WAR file.
Apache Ivy has a similar, more flexible mechanism called "configurations". It's possible to create as many runtime configurations as you need, and these can be used to decide which jars are downloaded by Ivy.
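For illustration, a minimal ivy.xml sketch with two runtime configurations (the organisation, module and revision values here are made up):

<ivy-module version="2.0">
    <info organisation="com.example" module="myapp"/>
    <configurations>
        <conf name="compile" description="everything needed at build time"/>
        <conf name="runtime-min" extends="compile" description="trimmed set of jars for launching"/>
        <conf name="runtime-full" extends="runtime-min" description="adds jars for optional features"/>
    </configurations>
    <dependencies>
        <!-- map our runtime-min configuration to the dependency's default configuration -->
        <dependency org="org.apache" name="apache-xyxy" rev="1.0" conf="runtime-min->default"/>
    </dependencies>
</ivy-module>

Resolving against "runtime-min" then pulls down only the jars mapped into that configuration, rather than the whole dependency tree.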
So while it would appear Ivy has the answer, I've rarely seen Ivy used when launching programs (the one exception being Groovy's Grape annotations).
So what, you might ask, is the answer?
The future of "run-time" classpath management is either OSGI or project jigsaw. I'm more familiar with OSGI where special dependency indicators are added the the jar file's manifest, stating what it's dependencies are. The idea is that when a container loads a jar (called a "bundle") it can check and see whether the other dependencies are already loaded. These dependencies can be retrieved and loaded from a common repository. This is fundentally different way to launch java. Traditionally each application is loaded onto it's own isolated classpath.....
Time will tell if either project catches on. In the meantime we use Apache Ivy and Apache Maven to build self-contained and possibly over-bloated WAR (EAR, etc.) packages.
Related
If, by default, the dependencies are not included when I produce a jar application, does that mean the user has to download all the dependencies of my app in order to use it?
Why is including dependencies in the jar not the default?
How can I expect users to have/download all the dependencies with the exact versions needed?
From https://imagej.net/Uber-JAR:
Advantages:
A single JAR file is simpler to deploy. There is no chance of mismatched versions of multiple JAR files. It is easier to construct a Java classpath, since only a single JAR needs to be included.
Disadvantages:
Every time you need to update the version of the software, you must redeploy the entire uber-JAR (e.g., ImageJ is ~68 MB as of May 2015). If you bundle individual JAR components, you need only update those that changed. This issue is of particular relevance to Java applications deployed via Java Web Start, since it automatically downloads the latest available version of each JAR dependency; in that case, your application startup time will suffer if you use the uber-JAR.
You cannot cherry-pick only the JARs containing the functionality you need, so your application's footprint may suffer from bloat.
If downstream code relies on any of the same dependencies which are embedded in an unshaded uber-JAR, you may run into trouble (e.g., NoSuchMethodError for unshaded uber-JARs) with multiple copies of those dependencies on your classpath, especially if you need to use a different version of that dependency than is bundled with the uber-JAR.
As you can see, it is important to understand how use of the uber-JAR will affect your application. In particular, Java applications will likely be better served using the individual component JARs, ideally managed using a dependency management platform such as Maven or Ivy. But for non-Java applications, the uber-JAR may be sufficient to your needs.
It basically depends on the use case. If the jar will be used in the development of other applications and its dependencies might be updated from time to time, it makes sense to use a normal jar; but if the jar is to be run/deployed, it might be better to use an uber/fat jar.
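For reference, a minimal sketch of producing an uber/fat JAR with the Maven Shade Plugin (the plugin version and main class shown are illustrative):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <!-- set the Main-Class of the resulting fat jar (class name is hypothetical) -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>com.example.Main</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>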
The basic problem is as such: I've got a project that already uses multiple Maven modules for various sub-projects. However, one of the modules (the core module) could itself be split into multiple OSGi bundles when created. This is because the core module contains several optional dependencies, each of which has isolated Java packages where it is required. For instance, support for JSON input files is optional, as it requires the optional dependencies from Jackson. The classes that rely on the Jackson dependencies are all isolated to certain json packages within the module. Thus, in theory, I could create a minimal bundle from core that doesn't include the packages that rely on optional dependencies.
Normally, I'd simply split this module up into more Maven modules to make life easier when creating bundles via Felix's maven-bundle-plugin. The problem here is that I still want to create a single core JAR for non-OSGi users, who shouldn't have to include several extra JARs just to use optional functionality (which already requires them to provide the optional dependencies on the classpath). Not only that, but I don't wish to split this module into more modules, as that makes development on the project more tedious for the developers as well, especially when we're already splitting code into proper package-based modules as it is.
The way we were trying to use OSGi already was to make the API module a fragment host (in order to allow it to load a provider bundle without requiring OSGi support), then make the other bundles use said fragment host. This seemed to work well for the smaller modules outside of core, but for core, we wanted to be able to provide multiple bundles from a single module so that optional dependencies wouldn't be required in the bundle itself. As it stands, for plugins, we already have a mechanism for scanning them and ignoring plugins that don't have all the required classes to load them (e.g., if a plugin requires a JPA provider but the JPA API is not available, that plugin isn't loaded). Once we can successfully split up the core module into multiple bundles, I can use declarative services as the plugin method in an OSGi environment (instead of the default class path JAR scanning mechanism in place for normal Java environments), so that isn't an issue.
Is there any way to do all this using Felix's maven-bundle-plugin? Or will I have to use the assembly plugin to copy subsets of the module where bundles can be generated from? Or will I have to resort to writing an Ant script (or Maven plugin) to do this? We've tried using separate Maven modules that simply import the core module as a dependency and generating a bundle from there, but the resultant bundle is always empty regardless of import/export package settings and embed dependencies.
Or is there a better way to do this? We already use the <optional>true</optional> configuration for the optional dependencies, yet the Felix plugin doesn't seem to care about that and imports all of those dependencies anyway, without using the optional attribute.
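For reference, the kind of bnd instruction I'd expect to need for marking those imports optional looks something like this (the Jackson package pattern is only an illustration):

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- mark the Jackson packages optional; '*' keeps the default behaviour for everything else -->
      <Import-Package>
        com.fasterxml.jackson.*;resolution:=optional,
        *
      </Import-Package>
    </instructions>
  </configuration>
</plugin>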
Well, this is what I'm ending up doing to accomplish this. I'm using the maven-assembly-plugin to copy the binaries I need, filtering out the classes I don't want to include using the <fileSets/> element, similar to the <fileset/> element in Ant.
Using the generated directories for each assembly, I'm using the maven-bundle-plugin along with the <buildDirectory/> configuration option to specify where the bundle's class files are located.
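A minimal sketch of such an assembly descriptor, assuming the optional Jackson-dependent classes live under json packages (the id, paths and exclude pattern are illustrative):

<assembly>
  <id>core-minimal</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <!-- copy the compiled classes, minus the packages that need the optional deps -->
      <directory>${project.build.outputDirectory}</directory>
      <outputDirectory>.</outputDirectory>
      <excludes>
        <exclude>**/json/**</exclude>
      </excludes>
    </fileSet>
  </fileSets>
</assembly>

The maven-bundle-plugin is then pointed at the resulting directory, as described above.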
It's not ideal, but it's better than writing an Ant script for a Maven project!
I'm new to a web project where we have to develop plugins (called "extensions" in that specific project). The main application runs in a modified Tomcat web server, and we have to add our plugins' .jars to a common lib folder. I'm still not very familiar with the application and how it works, but I'm pretty sure there is a common classloader for the application and all its plugins. All libraries in that lib folder are shared.
My question is how to deal with the plugins' dependencies, and potentially conflicting versions, in that environment.
Should we have shared libraries, for example some-common-lib-1.3.4, as jars in the lib folder, with plugins having to use those versions whenever they need a library?
Or should a plugin contain its own dependencies (using the Maven Shade Plugin, for example) so that different versions of the same dependency are not an issue?
The problem I see with having shared libraries in a specific version for all plugins is transitive dependencies. If a common library has a dependency on some-transitive-dependency-1.0.0, and we have a specific plugin which requires a new library which itself has a transitive dependency on some-transitive-dependency-2.0.0, then we're screwed... We would then need both some-transitive-dependency-1.0.0 and some-transitive-dependency-2.0.0 in the lib folder, and who knows what will happen.
Also, if for one specific plugin we need to update a dependency to a new major version, we may have to update all plugins since that library is shared by all.
Any real world experience with such situation? Any tips?
Since OSGi is not an option, and potentially everyone could create new plugins, the only feasible way to separate them is, as you already suggested, using the shade plugin or some similar technique.
Since you cannot separate the classloaders, since recompiling all plugins (for which you might not even have the source code) is really not an option, and since you might sometimes have non-resolvable conflicts (asm 1.x and 2.x are totally incompatible), you have to build your own "poor man's OSGi" and use shade.
Note, however, that this limits the ability of plugins to work together or to share common data that isn't defined in the main application.
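For instance, a hedged sketch of shading with relocation, using the asm clash mentioned above (the shaded package prefix is whatever you choose):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- move asm into a plugin-private namespace so two plugins can ship different versions -->
          <relocation>
            <pattern>org.objectweb.asm</pattern>
            <shadedPattern>com.example.myplugin.shaded.asm</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>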
I'm developing a web application with a lot of libraries, like Spring, Apache CXF, Hibernate, Apache Axis, Apache Commons and so on. Each of these frameworks comes with a lot of *.jar libraries.
For development I simply take all of the delivered libraries and add them to my classpath.
Not all of these libraries are required for deployment, so is there a quick way to determine all the required libraries (*.jar) which are used by my source code?
If you move your project to use Maven, such things become easier:
mvn dependency:analyze
mvn dependency:tree
For your example, Maven + IDE + nice dependency diagrams could help a lot.
See an example of this: it's much easier to figure out what happens in a project this way, and you don't need to add "all delivered libraries" to your project, just what's required.
JDepend traverses Java class file directories and generates design quality metrics for each Java package. JDepend allows you to automatically measure the quality of a design in terms of its extensibility, reusability, and maintainability to manage package dependencies effectively.
So, as a quick, dirty, and potentially inefficient way, you can try this in Eclipse:
Create two copies of your project.
In project copy #2 remove all the jars from the classpath.
Pick a source file that now has errors because it can't resolve a class reference. Pick one of the unresolved classes and note its fully qualified class name.
Do Control-Shift-T and locate the unresolved class. You should be able to see which jar it's contained in, since all the jars are still in the classpath for project copy #1.
Add the jar that contains this unresolved class back into your classpath in project copy #2, then repeat steps 3 and 4 until all class references are resolved.
Unfortunately you're not done yet since the jar files themselves may also have dependencies. Two ways to deal with this:
Go read the documentation for all the third-party packages you're using. Each package should tell you what its dependencies are.
Run your application and see if you get any ClassNotFoundExceptions. If you do, then use Control-Shift-T to figure out what jar that class comes from and add it to your classpath. Repeat until your project runs without throwing any ClassNotFoundExceptions.
The problem with #2 is that you don't really know you've resolved all the dependencies since you can't simulate every possible execution path your project might take.
Quite new to Maven here, so let me first explain what I am trying to do:
We have certain JAR files which will not be added to the repo. This is because they are specific to Oracle ADF and are already placed on our application server. There is only one version to be used by all apps at any one time. In order to compile, though, we need to have these on the classpath. There are a LOT of these JARs, so if we were to upgrade to a newer version of ADF, we would have to go into every application and redefine some pretty redundant dependencies. So again, my goal is to just add these JARs to the classpath, since we will control what version is actually used elsewhere.
So basically, I want to add every JAR in a given network directory (which devs do not have permission to modify) to Maven's classpath for when it compiles, without putting any of these JAR files in a repository. And of course, these JARs are not to be packaged into any EAR/WAR.
edit:
Among other reasons why I do not want to add these to the corporate repo:
These JARs are not used by anything else. There are a lot of them, uncommon and exclusive to Oracle.
There will only be one version of a given JAR used at any one time. There will never be a case where Application A depends on 1.0 and Application B depends on 1.1; both App A and App B will depend solely on either 1.1 or 1.2.
We are planning to maintain 100+ applications. That is a lot of pom.xml files, meaning that any time we upgrade Oracle ADF, if any dependency wasn't correctly specified (human error), we will have to fix each mistake across those 100+ pom.xml files.
I see three options:
Put the dependencies in a repository (could be a file repository, as described in this answer) and declare them with the provided scope.
Use the dirty system scope trick (i.e., declare the dependencies with a system scope and set the path to the jars in your file system).
A little variation of #2: create a jar with a MANIFEST.MF referencing all the jars (using relative paths), and declare a dependency on this almost-empty jar with a system scope.
The clean way is option #1, but the others would work too in your case. Option #3 seems to be the closest to what you're looking for.
Update: To clarify option #3
Let's say you have a directory with a.jar and b.jar. Create a c.jar with a Class-Path entry in its META-INF/MANIFEST.MF listing other jars, something like this:
Class-Path: ./a.jar ./b.jar
Then declare a dependency in your POM on c (and only on c) with a system scope; the other jars will become "visible" without having to explicitly list them in your POM (sure, you need to declare them in the manifest, but this can be very easily scripted).
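The corresponding POM declaration would look something like this (the groupId and version are placeholders):

<dependency>
  <groupId>com.example.local</groupId>
  <artifactId>c</artifactId>
  <version>1.0</version>
  <scope>system</scope>
  <!-- points at the manifest-only jar; a.jar and b.jar are picked up via its Class-Path entry -->
  <systemPath>${basedir}/lib/c.jar</systemPath>
</dependency>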
Although you explicitly stated you don't want them in the repository, your reasons are not justified. Here's my suggestion:
install these jars in your repository
add them as Maven dependencies, with <scope>provided</scope>. This means that they are provided by your runtime (the application server) and will not be included in your artifacts (war/ear).
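A sketch of such a declaration (the ADF coordinates here are made up):

<dependency>
  <groupId>oracle.adf</groupId>
  <artifactId>adf-share</artifactId>
  <version>11.1.1</version>
  <!-- provided: available for compilation, supplied by the app server at run-time -->
  <scope>provided</scope>
</dependency>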
Check this similar question
It is advisable for an organization that's using Maven extensively to have its own repository; see Nexus, for example. You can then install these jars in your repository and all developers will use them, rather than having the jars only in each local repository.
(The "ugliest" option would be not to use maven at all, put put the jars on a relative location and add them to the classpath of the project, submitting the classpath properties file (depending on the IDE))
If you are developing ADF (10g/11g, I guess) components, I suppose you'll be using JDeveloper as your IDE. JDeveloper comes with a very rich library management tool that allows you to define which libraries are required for compiling and which ones should be packaged for deployment. I suppose you already know how to add libraries to projects and indicate in the deployment profile which ones should be picked up while packaging. If you want to keep your libraries out of Maven, maybe this could be the best approach. Let's say the libraries you refer to are the "WebCenter" ones; using this approach will guarantee you have the adequate libraries, as JDeveloper will come with the right versions of those libraries.
Nevertheless, as you are using Maven, I would not recommend keeping some libraries out of its control and out of Maven repositories. I'd recommend choosing between Maven and Oracle JDeveloper library management. In our current project we are working with JDeveloper ADF 11g (and WebCenter) and we use Maven; it simply makes library management easier. At the end of the day, we have a large number of third-party libraries (say Apache, Spring, etc.) that are useful to manage with Maven, and not so many Oracle libraries really required for compiling in the IDE (as you would only need the API ones and not their implementations). Our approach has been to add the Oracle libraries to our Maven repository whenever they are required and let Maven control the whole of dependency management.
As others say in their answers, if you don't want the dependencies to be included in any of your artifacts, use <scope>provided</scope>. Once you configure your development environment you will be grateful that Maven does the work, and you can (almost) forget about dependency management. To build the JDeveloper IDE files we are using the Maven jdev plugin, so mvn jdev:jdev generates our project files and sets up the dependencies on libraries (and among the projects) so everything compiles properly.
Updated:
Of course, you need to refer to ADF libraries in your POM files. In our project we just refer to the ones used in each application, say ADF tag libraries or a specific service, not the whole ADF/WebCenter stack. For this purpose use the "provided" scope. You can still let JDeveloper manage your libraries, but we have found that it's simpler to have either a 100% JDeveloper libraries approach or a 100% Maven approach. If you go with the Maven approach, it will take you some time to build your local repo at first, but once that's done it's very easy to maintain, and the whole cycle (development, build, test, packaging and deployment) will be simpler, with a more consistent configuration. It's true that in the future you'll have to update to later ADF versions, but as your repository structure will already be defined, that should be fast. For future upgrades I'd recommend keeping the ADF version as a property in the top POM, as sketched below; that will allow you to switch to a new version faster.
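A sketch of that setup in the parent POM (the property name and coordinates are illustrative):

<properties>
  <adf.version>11.1.1.7.0</adf.version>
</properties>

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>oracle.adf</groupId>
      <artifactId>adf-faces-rt</artifactId>
      <!-- bump this one property to move every module to a new ADF release -->
      <version>${adf.version}</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</dependencyManagement>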