Eclipse plugin dependency analysis in an automated manner - java

I am in the process of creating a Maven repository for a Java module which is part of Eclipse (probably OSGi). I am trying to get the transitive dependencies of the JAR files I need using this approach (http://wiki.eclipse.org/JFace).
It uses the Plugin Dependency Analysis feature of Eclipse to create a tree. I am wondering how it works in the background. I can find the imported namespaces in the manifest file of a JAR, but how does it find the relevant JAR file using that information?
My end goal is to export all these transitive dependency JARs and convert them into a Maven repository. Ideally I want to automate this so I don't have to do it manually whenever there is an update.

In general Eclipse PDE projects have a target platform. This target platform holds the dependencies.
In OSGi bundles, the dependencies are specified in the manifest. There are many types of dependencies; the OSGi dependency model is magnitudes more powerful than the simplistic 'require' model of other module systems. Not only does it have many types of dependencies (require bundle (the classic one), import package, require execution environment, require an implementation for an API, require a service, etc.), specified in namespaces, it also supports a powerful filter that is asserted on the properties of a capability. A capability is the opposite of a requirement.
To handle these capabilities and requirements, OSGi has a resolver. It takes a set of initial requirements and finds a solution where all requirements are satisfied and all the rules are followed. The result of the resolver is a set of wires that connect bundles together.
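As an illustration, the requirements and capabilities the resolver matches are plain manifest headers. Below is a sketch of two bundle manifests (a consumer first, then a provider); the bundle names, package, and versions are invented, but the header syntax is the OSGi standard one:

```text
Bundle-SymbolicName: com.example.consumer
Import-Package: com.example.parser;version="[1.0,2.0)"
Require-Capability: osgi.ee;filter:="(&(osgi.ee=JavaSE)(version>=1.8))"

Bundle-SymbolicName: com.example.provider
Export-Package: com.example.parser;version="1.2.0"
```

Here the resolver would wire the consumer's Import-Package requirement to the provider's Export-Package capability, since the exported version 1.2.0 falls inside the requested range [1.0,2.0).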
By far the best environment to play with this is Bndtools. In Bndtools you can use a P2 repository directly (normally the target platform is a P2 repository). (You can also directly use a Maven repository or an OSGi standardized repository, aka OBR.) In the Bndtools bndrun file you can then select one or more root bundles of your graph. The resolver then calculates the set of bundles that form a happy closure. This is all done at the GUI level.
There is also a command-line tool called 'bnd' that you can install via brew or download as a jar from https://bndtools.ci.cloudbees.com/job/bnd.master/lastSuccessfulBuild/artifact/biz.aQute.bnd/generated/
That is a miraculous little tool that can do wonders, but it is very badly documented, although it does have a help command. :-( (I am the author)

Related

How to start with OSGi

At my workplace, they asked me to learn the OSGi framework and to decide on the best approach to work with it.
In the last two weeks, I surfed the web and discovered a lot of different approaches to working with OSGi. For example, I found the OSGi enRoute approach and an Eclipse plug-in called Bndtools. I discovered that I can simply use Declarative Services, or frameworks like AIOLOS.
I'm a little bit confused about all these different approaches and technologies... What do you think is the best approach to get started with OSGi for a beginner? Is there an implementation that is better than the others (for instance Equinox)? Do you have a preferred approach to work with this framework?
Thank you very much in advance!
update To address this question better, I've written a booklet & videos that let you get into OSGi quickly. It is very interactive, using Bndtools and the OSGi Gogo shell. You can find it here.
updated text for readability & current state
I can understand the confusion ... there is a plethora of build tools, and nowadays they all support OSGi. Since you sometimes need combinations of tools, the space is complex.
bnd
To use OSGi you need to build bundles. A bundle is a JAR file, the default format for Java libraries/executables. A JAR is a bundle when its manifest contains OSGi metadata. This metadata tells tooling and the OSGi framework what capabilities the bundle requires from the runtime and what capabilities it provides to the runtime. This information is used to assemble runtimes, as well as to verify at runtime that things are compatible.
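For a concrete picture, the OSGi metadata in a bundle's MANIFEST.MF looks roughly like this (a sketch; all names and versions are invented):

```text
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.greeter
Bundle-Version: 1.0.0
Export-Package: com.example.greeter.api;version="1.0.0"
Import-Package: org.osgi.framework;version="[1.8,2)"
```

Export-Package describes capabilities the bundle provides; Import-Package describes capabilities it requires from the runtime.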
Maintaining this information by hand is a lot of nauseating work. For this reason bnd was developed by me, already about 19 years ago. bnd is currently the primary library in the industry to create this metadata, simplify decorating the metadata, and verifying the validity of the metadata. It does extensive analysis and annotations of the bundle to minimize the manual work.
In the bnd case, it also supports the OSGi standard build annotations for Declarative Services, Manifest annotations, and more. (Many OSGi standards originated in bnd.)
IDE & Continuous Integration
IDEs are the preferred tools to read, write and debug code. However, without a continuous integration solution that runs on a remote server, you cannot rely on the results, since your IDE might depend on information that exists only on your laptop. For professional development it is therefore a must to have some server that builds your software from scratch without using caches.
Clearly, it is paramount that when you develop software on your laptop, the results are identical when you build on the server. For this reason, bnd provides a library that can be used in IDEs and in different build tools. Although a myriad of combinations are feasible with bnd, a few are the most popular.
Models
Maven Only
Maven is a popular build tool for Java applications. It defines all build information in POM (Project Object Model) files, which are XML files. POMs can inherit from other POMs. Each POM (and artifact) is identified by a group id, an artifact id, and an opaque version. Maven has a pretty fixed structure for a project. All build work is done via plugins that get their configuration from the POM.
There are two Maven plugins, based on bnd, that provide the necessary OSGi metadata generation.
Apache Felix Maven Plugin
Bnd Maven Plugins
In this model, bnd is only used for providing the metadata in the bundle. All dependencies must be in Maven repositories.
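As a sketch of this model (the plugin version below is illustrative; check the bnd documentation for the current one), the bnd-maven-plugin is wired into the POM like this:

```xml
<plugin>
  <groupId>biz.aQute.bnd</groupId>
  <artifactId>bnd-maven-plugin</artifactId>
  <version>6.4.0</version>
  <executions>
    <execution>
      <goals>
        <goal>bnd-process</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

The jar plugin must additionally be told to use the MANIFEST.MF that bnd-process writes into the output directory; the plugin's documentation shows the exact archive configuration.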
There is a third plugin, Tycho, that builds bundles using the Eclipse PDE model. I hear few people recommending PDE/Tycho today; its development has not been painless, and many PDE users are looking at alternatives. This plugin must bridge a very large semantic gap between PDE and Maven.
Gradle Only
Although Gradle also relies on plugins for all the low-level work, it leans heavily on Groovy to provide the build actions.
The bndtools group provides a plugin that makes it trivial to generate the OSGi metadata in the normal Java build:
bnd Gradle plugin
In this model, all dependencies must be stored in repositories accessible by Gradle, which are generally Maven repositories.
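A minimal sketch of this setup (the plugin version and the bnd instructions are illustrative; the bnd extension on the jar task is how the plugin accepts instructions):

```groovy
// build.gradle: generate OSGi metadata in the normal Java build
plugins {
    id 'java'
    id 'biz.aQute.bnd.builder' version '6.4.0'
}

repositories {
    mavenCentral()
}

jar {
    // illustrative bnd instructions; names are made up
    bnd('Bundle-SymbolicName': 'com.example.greeter',
        'Export-Package': 'com.example.greeter.api')
}
```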
Eclipse, M2E, Maven, and Bndtools
In this quartet, Eclipse is the basic IDE, M2E is a plugin that teaches Eclipse how to build bundles according to a Maven specification (POM file), and bnd runs inside Maven as a plugin. Bndtools provides some bonus OSGi IDE functionality that M2E is lacking, mainly focused on creating assemblies for OSGi runtimes and viewing bundles.
In this model, all build information is stored in Maven POMs. This is the model that BJ Hargrave posted in another answer to this question.
In this model, bnd is only used for providing the metadata in the bundle. All dependencies must be in Maven repositories.
Eclipse, Bndtools, Gradle
The other model, which was developed specifically for OSGi/bnd, is the bnd workspace model. In this model, a workspace is a directory with the OSGi bundle projects; a special directory (cnf) holds workspace-wide information. All build information is stored in bnd files, which are good old Java property files.
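To give a feel for it, a project's bnd.bnd file in such a workspace is just properties (the names below are illustrative; -buildpath is the bnd instruction that lists the build-time dependencies):

```properties
Bundle-Version: 1.0.0.SNAPSHOT
Export-Package: com.example.greeter.api
-buildpath: \
    osgi.annotation,\
    org.osgi.service.component.annotations
```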
Eclipse/Bndtools and Gradle (and for that matter Ant!) have plugins that read the information in the workspace and project directories and teach their respective build tools where the sources are, where the binaries should be stored, etc. The plugins for these tools that use bnd go out of their way to ensure the build results are identical.
The original archived OSGi enRoute was based on this model. Although it is archived, it is still the primary model for Bndtools and used by many companies. At EclipseCon 2018 there are several presentations about this model:
How OSGi drives cross-sector energy management – Fully based on the v2Archive OSGi enRoute model
How we used OSGi to build open liberty – Build with Bndtools and Gradle
Migrating from PDE to Bndtools in Practice – Migrated recently from PDE (the original Eclipse bundle build model) to Bndtools
This is also the model that OSGi uses itself to build the specification JARs, Reference Implementations, and the Compliance Test suites.
The bnd workspace model is the only model that supports all repository standards. This includes Maven repositories (Maven Central!), OSGi repository standard, Eclipse P2 repositories, directory-based repositories, and even POM based repositories. The support for external repositories in the bnd workspace model is extraordinarily flexible.
Where to Start?
Generally, developers who start with OSGi already have Java experience, and that legacy generally drives their choice of tool.
If you can start with a blank slate, then my personal preference is the bnd workspace model. It puts the priority on the IDE, which is where you spend most of your time, while having extremely good fidelity with the continuous integration build. In the last 2 years I've helped two companies, one starting with OSGi from scratch, the other having 8 years of experience with PDE. Both are now based on this workspace model, and I am quite impressed by how they've been able to reap the benefits of OSGi, without any prior experience, much faster than I'd ever seen before.
An interactive tutorial, using Bndtools and the OSGi Gogo shell can be found here. With videos!
Start with OSGi enRoute. It will discuss using Bndtools as an IDE. It already uses the Bnd maven plugins to build bundles and demonstrates using Declarative Services to code providing and using services.

Is Maven similar to npm?

I have worked with npm, which looks for dependencies in the package.json file and downloads them for you. Similarly, I see a pom.xml file in Java projects. Does Maven look in this file and download dependencies for me? Can I pass around this pom.xml file like package.json, rather than shipping the dependency jars? Are these tools similar, just built for different platforms?
Same tool, different language?
Maven is the most popular build and dependency resolution tool for Java, just like NPM is for JS. But it's not just the same tool for a different language. There are obviously huge differences between Java and JS builds, and these differences are directly visible in the way Maven operates. For example, while many JS tools rely on Git to do some heavy-lifting, Maven works with custom filesystem-based Maven repositories, as Maven predates Git and needs to handle binary artifacts, which Git historically didn't handle well. In Maven there's a clear separation between sources and binaries, while they are often the same thing in JS world.
Maven basics
Maven in its purest form follows a declarative model, where pom.xml (similar to package.json) defines different properties of the build, but contains no scripts. The disadvantage is it can be a challenge to fine-tune some aspects of the build without using scripts as you have to rely on plugins. The advantage is it can be easier to understand other builds just by looking at pom.xml, as they usually follow the same approach without too much customization. Gradle is a popular Groovy-based tool built on top of Maven standards and conventions, and is specifically designed to simplify pom.xml and break this "no script" barrier.
Referencing your dependencies
Similarly to package.json, you don't work with pom.xml of your dependency directly, but rather define dependency coordinates and let your build tool handle the rest. In Maven the basic form of these coordinates is GAV (groupId, artifactId, version).
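For example, a single dependency declared by its GAV coordinates (the artifact is picked arbitrarily for illustration):

```xml
<dependency>
  <groupId>com.google.code.gson</groupId>
  <artifactId>gson</artifactId>
  <version>2.8.9</version>
</dependency>
```

The npm counterpart would be an entry like "lodash": "^4.17.21" under dependencies in package.json; in both cases the tool fetches the artifact and its transitive dependencies from the configured repository.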
Flat dependency tree?
As noted in comments on the other answer, Maven provides a "flat dependency tree", not the "nested dependency tree" that NPM provides by default. Maven does not allow multiple versions of the same dependency. If different versions are requested, Maven uses dependency mediation to pick a single version. This means that sometimes your transitive dependencies will get a different version than they require, but there are ways to manage this. However, this limitation comes from Java, not Maven: (normally) a Java class loader will only provide access to a single class definition, even if multiple definitions are found on the classpath. Since Java is not particularly good at handling this, Maven tries to avoid the scenario in the first place.
Note: since npm v3 the dependencies are flattened. The alternative package manager Yarn does the same.
Maturity
Furthermore, Maven is considerably older than NPM, has a larger user base, huge number of custom plugins, and so far could probably be considered more mature overall. Sometimes Maven is used for non-Java or even polyglot projects, as there are plugins for handling other languages or specific environments, such as Android. There are plugins that bridge Maven and other build tools, such as frontend-maven-plugin that actually handles multiple JS build tools.
Yes, they are similar in that their main purpose is to provide a way of describing the project dependencies, instead of keeping them within the project code, and their secondary purpose is to give developers an easy way to perform, define and share dev-time/build-time tasks. Both are expressed inside a descriptor file.
Now, deciding which one to use is most of the time straightforward, because it depends on the primary language you are working with. A rough grouping is:
java: maven
javascript/typescript: npm
Below I provide a detailed explanation of the common features and differences. I use | to separate between maven | npm terms respectively:
Common features:
Both tools support dynamic fetching of dependencies (artifacts | packages) based on a descriptor file (pom.xml | package.json), and also allow you to deploy | publish your own artifacts | packages.
They both have a default public repository | registry (http://repo.maven.apache.org/maven2/ | https://registry.npmjs.org), but 3rd-party ones can also be used (via settings.xml | .npmrc).
They both support the concept of build-level dependencies (plugins | devDependencies used in scripts). *Maven supports provided dependencies as well, but this does not seem to apply to npm, since JavaScript is rarely deployed into containers.
They both support dependency namespacing: groupId|scope
Differences:
maven has an additional local repository (cache):
No need to fetch the same dependency again for different projects.
Artifacts that are installed locally are automatically accessible to other local projects.
Dependencies from a project build in maven are downloaded into <homedir>/.m2. With npm they are downloaded into <projectdir>/node_modules.
Building in maven is commonly a one-step process: mvn package (fetch deps, build). In npm it is a 2-step process: npm install (fetch deps), npm run build (build).
maven defines build lifecycles (for building, testing, deploying) consisting of phases, to which default operations (plugin goals) are attached based on different packaging options (.jar, .war, .ear, etc.). You can then override these operations or inject new ones (via the plugin system). This provides a kind of out-of-the-box solution for build, docgen, test, deploy, etc.
The npm approach is more simplistic (see: scripts).
Due to the above, npm is labeled as a package-management tool for javascript, while maven is labeled as a build-automation and dependency-management tool for java.
In maven, setting up the build process more commonly involves editing the pom.xml.
In npm, it involves writing code or configuring complementary build tools like gulp, webpack, etc.
For some reason, version ranges defined by users in npm modules are much looser than in maven. This can cause issues with transitive dependencies, which is why an additional file was recently added: package-lock.json.
With npm it is much more straightforward to start a new project: npm init. With maven, you need to know how to write a minimal pom.xml, or read about archetypes.
In general, it is much more common to edit pom.xml than package.json. E.g. adding dependencies in maven is done manually (or via the IDE), while in npm it is done via the command line.
As with all build tools, you can call one tool from inside the other, but I think it's much more common to call npm from inside maven than the opposite.
npm supports dev and production builds. In maven this needs to be defined through profiles.
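To illustrate the "minimal pom.xml" and "profiles" points above, here is roughly the smallest useful POM, with a production profile added (all names and the property are placeholders):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>demo</artifactId>
  <version>1.0-SNAPSHOT</version>

  <profiles>
    <profile>
      <id>production</id>
      <properties>
        <env>prod</env>
      </properties>
    </profile>
  </profiles>
</project>
```

The profile is activated with mvn package -Pproduction, which is the moral equivalent of an npm production build.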
Yes, it's a similar packaging tool for Java. Look at Gradle also, which gives you more liberty with the Groovy language, but to start with you can use maven to organize your dependencies. You include them as tags there and maven does the job for you.
It traverses the dependency tree and downloads all the appropriate jars.
Yes, and the same goes for Gradle, but they are not as user-friendly as npm.

How can I generate multiple OSGi bundles from a single Maven project?

The basic problem is as such: I've got a project that already uses multiple Maven modules for various sub-projects. However, one of the modules (the core module) could itself be split into multiple OSGi bundles when created. This is due to the core module containing several optional dependencies, each of which has isolated Java packages where it is required. For instance, support for JSON input files is optional, as it requires the optional dependencies from Jackson. The classes that rely on the Jackson dependencies are all isolated to certain json packages within the module. Thus, in theory, I could create a minimal bundle from core that doesn't include the packages that rely on optional dependencies.
Normally, I'd simply split up this module into more Maven modules to make life easier for creating bundles via Felix's maven-bundle-plugin. The problem here is that I still want to create a core JAR for non-OSGi users who don't want to have to include several extra JARs just to use optional functionality (which requires they provide the optional dependencies on the class path as it is). Not only that, but I don't wish to have to split up this module into more modules as it makes development on the project more tedious for the developers as well, especially when we're already splitting up code into proper package-based modules as it is.
The way we were trying to use OSGi already was to make the API module a fragment host (in order to allow it to load a provider bundle without requiring OSGi support), then make the other bundles use said fragment host. This seemed to work well for the smaller modules outside of core, but for core, we wanted to be able to provide multiple bundles from a single module so that optional dependencies wouldn't be required in the bundle itself. As it stands, for plugins, we already have a mechanism for scanning them and ignoring plugins that don't have all the required classes to load them (e.g., if a plugin requires a JPA provider but the JPA API is not available, that plugin isn't loaded). Once we can successfully split up the core module into multiple bundles, I can use declarative services as the plugin method in an OSGi environment (instead of the default class path JAR scanning mechanism in place for normal Java environments), so that isn't an issue.
Is there any way to do all this using Felix's maven-bundle-plugin? Or will I have to use the assembly plugin to copy subsets of the module where bundles can be generated from? Or will I have to resort to writing an Ant script (or Maven plugin) to do this? We've tried using separate Maven modules that simply import the core module as a dependency and generating a bundle from there, but the resultant bundle is always empty regardless of import/export package settings and embed dependencies.
Or, is there a better way to do this? We already use the <optional>true</optional> configuration for the optional dependencies, yet the Felix plugin doesn't seem to care about that and imports all of those dependencies anyway, without using the optional attribute.
Well, this is what I ended up doing to accomplish this. I'm using the maven-assembly-plugin to copy the binaries I need, filtering out the classes I don't want to include using the <fileSets/> element, similar to the <fileset/> element in Ant.
Using the generated directories for each assembly, I'm using the maven-bundle-plugin along with the <buildDirectory/> configuration option to specify where the bundle's class files are located.
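In POM terms the two pieces look roughly like this (the package names, paths, and plugin configuration values are placeholders from my setup, not something to copy verbatim):

```xml
<!-- assembly descriptor fragment: copy the compiled classes,
     excluding the packages that need optional dependencies -->
<fileSets>
  <fileSet>
    <directory>${project.build.outputDirectory}</directory>
    <outputDirectory>/</outputDirectory>
    <excludes>
      <exclude>com/example/core/json/**</exclude>
    </excludes>
  </fileSet>
</fileSets>

<!-- pom fragment: point maven-bundle-plugin at the assembled classes -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <configuration>
    <buildDirectory>${project.build.directory}/core-minimal</buildDirectory>
  </configuration>
</plugin>
```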
It's not ideal, but it's better than writing an Ant script for a Maven project!

Is there a dynamic java class level Ivy-like resolver?

This is more a question about what's out there and about future directions for resolving tools such as Ivy. Is there anything that can express class-level dependencies for packages, rather than package-level dependencies?
For example, let's say I have an apache-xyxy package that comes with an ivy.xml listing all its dependencies. But suppose I only use class WX in apache-xyxy, which doesn't require most of those dependencies. Couldn't a resolver be intelligent and identify that class WX can only possibly invoke the set of other classes (AB, DC, EF), and that none of those classes use any of the other dependencies, to create a minimal subset of required dependencies? This would be easier and safer than cherry-picking to remove package dependencies that aren't needed because of the specific classes used in that package, and would also avoid breaking larger packages into smaller ones just for this reason.
Then, if I later decided to use class GH from apache-xyxy, I could do an ivy resolve, and it would dynamically bring in the additional required libraries.
When packaging compiled Java code for distribution, it's common practice to bundle Java "packages" together. It's also quite possible (but silly) to split a Java package across multiple jars. Large frameworks (like Spring) have lots of sub-packages in different jars so that users can pick and choose what they need at run-time... Of course, the more jar options one has, the more complex it becomes to populate the run-time classpath...
The keyword here is "run-time"... Tools like Apache Ivy and Apache Maven are primarily designed to manage dependencies needed at build time...
Apache Maven does have a "runtime" scope for its dependencies, but it's limited to a single list of jars. Typically this scope is used to decide which jars are needed for testing and for populating the lib directory of a WAR file.
Apache ivy has a similar more flexible mechanism called "configurations". It's possible to create as many runtime configurations as you need, and these can be used to decide which jars are downloaded by ivy.
So while it would appear Ivy has the answer, I've rarely seen Ivy used when launching programs (the one exception being Groovy's Grape annotations).
So what, you might ask, is the answer?
The future of "run-time" classpath management is either OSGi or Project Jigsaw. I'm more familiar with OSGi, where special dependency indicators are added to the jar file's manifest, stating what its dependencies are. The idea is that when a container loads a jar (called a "bundle"), it can check whether the other dependencies are already loaded. These dependencies can be retrieved and loaded from a common repository. This is a fundamentally different way to launch Java. Traditionally each application is loaded onto its own isolated classpath...
Time will tell if either project catches on. In the meantime we use Apache ivy and Apache Maven to build self-contained and possibly over-bloated WAR (EAR, etc) packages.

Avoid duplicating OSGi imports in maven dependencies?

Currently, when I am writing a bundle that depends on a package, I have to "import" or "depend" on a whole other bundle in Maven that contains that package.
This seems like it is counter-productive to what OSGi gives me.
For example let's say I have two bundles: BundleAPI and BundleImpl.
BundleAPI provides the API interfaces:
// BundleAPI's manifest
export-package: com.service.api
BundleImpl provides the implementation:
//BundleImpl's manifest
import-package com.service.api
However, when I am coding BundleImpl in Eclipse, I am forced to "depend" in maven POM on BundleAPI itself - so that eclipse does not complain.
//BundleImpl's POM
<dependency>
<groupId>com.service</groupId>
<artifactId>com.service.api</artifactId>
[...]
</dependency>
So - on one hand, I am depending only on the package com.service.api, while on the other - I need to have the whole bundle - BundleAPI.
Is there a way to make maven or eclipse smart enough to just find the packages somewhere, instead of whole bundles?
I am very much confused as to how this works - any type of clarity here would be great. Maybe I am missing something fundamentally simple?
The key is to distinguish between build-time dependencies and runtime dependencies.
At build time you have to depend on a whole artifact, i.e. a JAR file or bundle. That's pretty much unavoidable because of the way Java compilers work. However at runtime you depend only on the packages you use in your bundle, and this is how OSGi manages runtime substitution. This is the Import-Package statement in your final bundle.
Of course, as a developer you don't want to list two parallel sets of dependencies; that would be crazy. Fortunately maven-bundle-plugin is based on a tool called bnd that calculates the Import-Package statement for you by analysing your code and discovering the actual packages used. Other tools such as Bndtools (an Eclipse-based IDE for OSGi development) also use bnd in this way. Incidentally, bnd is much more reliable and accurate at this job than any human!
So, you define only the module-level dependencies that you need at build time, and the tool generates the runtime package-level dependencies.
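For instance (a sketch using the coordinates from the question; the version and plugin configuration are illustrative), you list only the build-time module dependency and let the plugin generate the package-level imports:

```xml
<dependency>
  <groupId>com.service</groupId>
  <artifactId>com.service.api</artifactId>
  <version>1.0.0</version>
</dependency>

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Import-Package>*</Import-Package>
    </instructions>
  </configuration>
</plugin>
```

The * instruction (the default) tells bnd to import every package the code actually references, with versions derived from the classpath.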
I would recommend against using Tycho because it forces you to use Eclipse PDE, which in turn forces you to manually manage imported packages (for the sake of full disclosure, I am the author of bndtools which competes against PDE).
You cannot develop bundles like regular Java projects with Maven and Eclipse. You basically have 2 options.
Apache Felix Bundle Plugin: Basically you develop the project as a regular Java project and use Maven as you normally would. This plugin is used to add all the OSGi specifics to the jar manifest at packaging time to OSGi-enable it. The disadvantage of this approach is that you are using a Java project in your workspace instead of a bundle, which makes running your project in the OSGi container a little extra work, since Eclipse doesn't recognize it as a plug-in project. Thus you have to add the jar from the Maven build to the target platform manually.
Tycho: This is another Maven plugin that attempts to actually bring these two environments together, and it does a pretty good job of it. In this scenario, you actually create an Eclipse bundle/plugin project, which obviously makes for seamless integration in Eclipse. The POM then marks the project as being of the eclipse-plugin type, which effectively makes Maven resolve the project dependencies (defined in the manifest) via the target platform instead of via Maven itself.
I would take the Tycho approach as it gives a much more integrated approach with Eclipse.
Having the whole jar as a dependency shouldn't be a problem, that's how you have to do it with Maven anyway.
