I am using the Maven Felix bundle plugin to create OSGi bundles. However, suppose you have a package "com.example" that exists in both project1 and project2, and project2 has a dependency on project1.
If you export the package in project2, it will contain the code from both project2 and project1. This, to me, is really odd behavior. The only reason I can think of for enabling such behavior is that OSGi somehow requires it? (I have already looked at http://felix.apache.org/site/apache-felix-maven-bundle-plugin-bnd.html but can't seem to find a way to turn it off.)
If two jars (A and B) export the same package but with different classes in it, and a third jar (C) has a dependency on that package, I would assume that C can see both A and B at runtime. Or does OSGi require a different package per jar?
If OSGi is not mandating this, how can I turn this "feature" off?
If OSGi is mandating it, then...why?
UPDATE
The answer provided by Christian clears up the OSGi rules for having the same package in different jars. However, I still have a problem with Felix where I have an "api" jar that contains:
com.example.api: the actual interfaces
com.example: a factory class, a utility class,...
And an implementation package that has:
com.example.impl
Now when I build the implementation project with Felix and export "com.example.impl", the resulting bundle does indeed contain everything in "com.example.impl", but for some reason it also includes all the classes in "com.example" (not those in .api). No combination of settings I have tried prevents Felix from adding the "base" package.
So basically, the resulting jar of the "impl" project actually contains the class com.example.MyFactory, which belongs in the api jar. How can I block this?
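For reference, the kind of configuration I have been trying looks roughly like this (the `!com.example` negation is standard bnd syntax for excluding a package; project and package names are from the example above):

```xml
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- export only the impl package; the leading '!' should
           exclude the base com.example package itself -->
      <Export-Package>!com.example, com.example.impl</Export-Package>
    </instructions>
  </configuration>
</plugin>
```

Even with the explicit negation, the classes from the base package still end up in the jar.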
OSGi does not mandate that you have the same package in two projects. In fact, you should avoid having the same package/version combination in two bundles with different contents.
In OSGi, the wiring happens when a bundle goes from installed to resolved. In that step the framework matches each Import-Package statement against an exported package with a matching name and version range. Only one exported package will be wired to each importing bundle, even if several bundles export the same package. This is different from standard Java, where you would get a mix of the classes from all jars that contain the package, which can have quite unpredictable results.
There is one pattern in OSGi where you do have the same package in several bundles; it is used a lot for the official OSGi APIs. When you implement such an API, you also include the API package in your bundle and declare both an Import-Package and an Export-Package statement for it. This allows the implementing bundle to be installed without needing a separate API bundle. It works well even when more than one bundle includes the API, as the framework will select one of the exported API packages and wire all the others to it. So they all see the same set of classes and there is no conflict.
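The manifest of such an implementing bundle carries both headers for the same package, roughly like this (package and version numbers are illustrative):

```
Export-Package: org.osgi.service.event;version="1.3"
Import-Package: org.osgi.service.event;version="[1.3,2.0)"
```

At resolve time the framework is free to wire the import to another bundle's export of that package, in which case the embedded copy is simply not used.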
You can also do this for your own applications, but there it is more common to have the API package in just one bundle and have all others import it.
You can find some info in the Apache Felix OSGi Frequently Asked Questions.
To answer your updated question: I guess you only export the com.example.api package. The maven-bundle-plugin knows it can refer to that package using an Import-Package statement. As com.example is not exported, the plugin knows that an Import-Package would not work, so it embeds the classes instead.
So the takeaway is that you need to export all packages that are needed by other bundles. By the way, you normally do not export an impl package in OSGi. Instead, you hide the impl behind a service: the service interface is placed in the API, the impl bundle implements that interface and publishes the implementation as an OSGi service, and other bundles bind the service by its interface, so the whole impl can stay private.
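As a sketch, the impl bundle's instructions then export nothing and keep the implementation private (package names reused from the question; whether the explicit Import-Package list is needed depends on your setup):

```xml
<instructions>
  <!-- export nothing; the implementation stays private -->
  <Export-Package>!*</Export-Package>
  <Private-Package>com.example.impl</Private-Package>
  <!-- the API packages are imported from the api bundle at runtime -->
  <Import-Package>com.example.api, com.example, *</Import-Package>
</instructions>
```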
Related
Is there a way to use an OSGi bundle as a Maven dependency without getting all the packages it doesn't even export onto the classpath?
Background of the question: we just added the org.apache.sling.xss bundle as a Maven dependency to our project. As you can see in its pom.xml, it exports only the package org.apache.sling.xss but embeds various dependencies directly into its jar as Private-Package. While OSGi hides this stuff from other bundles, Maven does not, so we get collisions: org.apache.sling.xss embeds e.g. version 1.7.0 of commons-beanutils, which is incompatible with the newer version of commons-beanutils we have been using, so we now get compile errors.
Is there any good solution to this, both from our perspective and from the perspective of the maintainers of the org.apache.sling.xss bundle? Ideally, you'd get only the exported org.apache.sling.xss package onto the classpath when you use that bundle. So maybe a good solution would be for org.apache.sling.xss to provide a separate API jar containing only that package? Is there a standard way to do this, or another solution? Can we somehow tell Maven to include only that package in the classpath? (I know Maven has dependency exclusions, but that doesn't help here, since the problematic stuff is not a transitive dependency of sling.xss but is actually included in the org.apache.sling.xss jar.)
Unfortunately there is no good way to handle such bundles with maven.
The recommended way is to split the bundle into an API bundle that just defines the API and an implementation bundle that imports the API and imports or embeds all implementation dependencies.
This way consumers will only have a maven dependency to the API and the problem does not occur at all.
Can you open a JIRA issue for Sling XSS to provide an API bundle?
You could include in your build system a step that performs OSGi resolution, for example using the bnd-export-maven-plugin.
That way the build as a whole will fail if you write code that depends on a non-exported package of another bundle: bnd will automatically add an Import-Package for the package you used in your bundle, but the resolve step will be unable to find a corresponding Export-Package in the other bundle.
Of course this is not ideal: you will still be able to write the code in your IDE and compile it with the maven-compiler-plugin. But you at least find out about the problem before you hit it at runtime.
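A minimal sketch of wiring the resolution step into the build (the `myapp.bndrun` file name is hypothetical; that file lists the bundles and run requirements to resolve against):

```xml
<plugin>
  <groupId>biz.aQute.bnd</groupId>
  <artifactId>bnd-export-maven-plugin</artifactId>
  <configuration>
    <bndruns>
      <!-- hypothetical bndrun describing the target assembly -->
      <bndrun>myapp.bndrun</bndrun>
    </bndruns>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>export</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```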
I have an OSGi bundle that is working perfectly. I added unirest, a lightweight HTTP library, as a Maven dependency, but when deploying to ServiceMix I get a missing requirement:
filter:="(osgi.wiring.package=com.mashape.unirest.http)"
Of course I am using that package in my bundle, but as far as ServiceMix is concerned, that library is just classes on my classpath like my own classes.
I guess I'm missing something here.
I know it is possible to embed a library, but I don't understand why any additional manipulation is needed. How is that different from just adding the library as a Maven dependency?
Any answers and pointers to articles/documentation are really appreciated.
There is a difference between the way dependencies are treated in Maven and OSGi.
Maven treats the dependencies as a single classpath/classloader, where the classes from every jar can access the classes from every other jar.
This is historical, coming from plain main() applications, and it's the way Maven runs unit tests.
In the OSGi runtime environment each bundle has its own classloader and, by default, only has access to the classes and jars embedded in its own bundle.
The classes in each bundle are isolated unless the package is exported from one bundle and imported to another.
The maven-bundle-plugin attempts to bridge the gap. Just adding a dependency to Maven is not enough; one must also tell the maven-bundle-plugin how to package and deploy it. The options are: 1) embed the dependency in your bundle, or 2) deploy the dependency as a separate bundle.
The maven-bundle-plugin defaults to option 2: import everything.
To understand the exports of your dependency better, I suggest looking at the manifest file in the bundle jar (META-INF/MANIFEST.MF).
I just looked at the unirest jar on Maven Central and found no packages exported in the manifest. In fact, unirest has no bundle headers at all, so it's not an OSGi bundle.
The error message
missing requirement : filter:="(osgi.wiring.package=com.mashape.unirest.http)"
means your bundle is attempting to import a package, but no bundle in ServiceMix exports it.
To understand this better, I suggest looking at the manifest file in your own bundle jar (META-INF/MANIFEST.MF); I expect it will contain an import of com.mashape.unirest.http. Also check whether anything in ServiceMix exports it:
From the console, try: exports | grep com.mashape.unirest.http
I expect you will find no bundle exporting it.
I suggest you configure the maven-bundle-plugin to embed unirest instead of importing it, like this:
<Embed-Dependency>unirest-java</Embed-Dependency>
(if you already have an Embed-Dependency configuration you will need to merge them.)
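In context, the full plugin configuration would look something like this sketch (the Embed-Transitive instruction is an assumption, needed only if unirest's own dependencies, such as its HTTP client, should be pulled in as well):

```xml
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- copy the unirest-java jar's classes into this bundle -->
      <Embed-Dependency>unirest-java</Embed-Dependency>
      <!-- assumption: also embed unirest's own dependencies -->
      <Embed-Transitive>true</Embed-Transitive>
    </instructions>
  </configuration>
</plugin>
```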
Ref. http://felix.apache.org/documentation/subprojects/apache-felix-maven-bundle-plugin-bnd.html#detailed-how-to
Another way is to deploy unirest-java as a bundle by wrapping it.
That could be as simple as copying it to the deploy folder:
https://karaf.apache.org/manual/latest/#_wrap_deployer
That would have the advantage of sharing unirest-java.
Lastly, if unirest does its own class loading, then that may turn out to be incompatible with osgi.
I have two third party bundles that both depend on the javax.transaction package. This package is exported by the system bundle as version 0.0.0. One of the bundles imports any version of the package and declares it as a uses-constraint on its exported package. The other bundle explicitly requires version 1.1.0. This package is provided by a different bundle.
My own bundle requires both third party bundles. However, since the system bundle is resolved first, the third party bundle that accepts any version is wired against version 0.0.0. As a result, a package uses conflict arises.
What are the options to fix this?
Options 1:
Add a version to the Import-Package of every bundle where javax.transaction is imported. I would not suggest this. :)
Option 2:
Do not expose the javax.transaction package via boot delegation. That is what we do, too. You can set the packages that the system bundle should export via the org.osgi.framework.system.packages system property. To see an example, look for that name in this pom file; there is an example for Felix and one for Equinox.
With this option you might run into the problem that javax.sql depends on the javax.transaction.xa package. In that case, you can use javax.sql from a bundle that is available here (JDBC version 4.0.0). Of course, you must exclude the javax.sql packages from the boot delegation as well (the examples do exclude them).
As javax.transaction.xa is used only by javax.sql, and javax.sql is not used by any other JDK packages, these packages can be safely separated out and provided by bundles.
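A sketch of what this looks like in a Felix config.properties (the real list is long and JDK-specific, so the '...' literally stands for all the other packages and must not be copied as-is; the point is simply that javax.transaction* and javax.sql* are absent):

```properties
# Redefine the packages exported by the system bundle, leaving out
# javax.transaction, javax.transaction.xa and javax.sql so that
# bundles can provide them instead.
org.osgi.framework.system.packages= \
 javax.accessibility, \
 javax.activation, \
 ..., \
 javax.xml.xpath
```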
You basically have two bundles (the system bundle and some other bundle) that export this package. If the contents of the package are the same in both cases, the easiest solution is to provide it only once. Since one of your consuming bundles needs at least version 1.1.0, you should either make sure you export at least that version, or change your consumer to accept any version.
So you can either remove the "other bundle" that provides the package (or modify it in case it provides more than just this package), or you can modify the list of packages that the framework exports. The specification defines a property for this, "org.osgi.framework.system.packages", that lists all these packages, so you can supply a new list that excludes this package.
Like Balazs says, I would stay away from boot delegation, but judging from your description you were not using that anyway.
The basic problem is as such: I've got a project that already uses multiple Maven modules for various sub-projects. However, one of the modules (the core module) could itself be split into multiple OSGi bundles when created. This is due to the core module containing several optional dependencies, each of which have isolated Java packages where they're required. For instance, support for JSON input files are optional as they require the optional dependencies from Jackson. The classes that rely on the Jackson dependencies are all isolated to certain json packages within the module. Thus, in theory, I could create a minimal bundle from core that doesn't include the packages that rely on optional dependencies.
Normally, I'd simply split this module into more Maven modules to make life easier when creating bundles via Felix's maven-bundle-plugin. The problem is that I still want to create a single core JAR for non-OSGi users, who shouldn't have to include several extra JARs just to use optional functionality (which already requires them to provide the optional dependencies on the classpath). Not only that, but I don't wish to split this module into more modules, as it makes development more tedious for the developers as well, especially when we're already splitting code into proper package-based modules as it is.
The way we were trying to use OSGi already was to make the API module a fragment host (in order to allow it to load a provider bundle without requiring OSGi support), then make the other bundles use said fragment host. This seemed to work well for the smaller modules outside of core, but for core, we wanted to be able to provide multiple bundles from a single module so that optional dependencies wouldn't be required in the bundle itself. As it stands, for plugins, we already have a mechanism for scanning them and ignoring plugins that don't have all the required classes to load them (e.g., if a plugin requires a JPA provider but the JPA API is not available, that plugin isn't loaded). Once we can successfully split up the core module into multiple bundles, I can use declarative services as the plugin method in an OSGi environment (instead of the default class path JAR scanning mechanism in place for normal Java environments), so that isn't an issue.
Is there any way to do all this using Felix's maven-bundle-plugin? Or will I have to use the assembly plugin to copy subsets of the module from which bundles can be generated? Or will I have to resort to writing an Ant script (or Maven plugin) to do this? We've tried using separate Maven modules that simply import the core module as a dependency and generating a bundle from there, but the resulting bundle is always empty, regardless of the import/export package settings and embedded dependencies.
Or, is there a better way to do this? We already use the <optional>true</optional> configuration for the optional dependencies, yet the Felix plugin doesn't seem to care about that and imports all of those dependencies anyway, without using the optional attribute.
Well, this is what I ended up doing to accomplish this. I'm using the maven-assembly-plugin to copy the binaries I need, filtering out the classes I don't want with the <fileSets/> element, similar to the <fileset/> element in Ant.
Using the generated directory for each assembly, I then run the maven-bundle-plugin with the <buildDirectory/> configuration option to point at where that bundle's class files are located.
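As a sketch, the assembly descriptor fragment doing the filtering looks something like this (the com/example/json path is illustrative, standing in for the classes that depend on the optional Jackson dependency):

```xml
<fileSets>
  <fileSet>
    <!-- copy the compiled classes of the core module... -->
    <directory>${project.build.outputDirectory}</directory>
    <outputDirectory>/</outputDirectory>
    <!-- ...but leave out the optional JSON support -->
    <excludes>
      <exclude>com/example/json/**</exclude>
    </excludes>
  </fileSet>
</fileSets>
```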
It's not ideal, but it's better than writing an Ant script for a Maven project!
This is more a question about what's out there and about future directions for resolution tools such as Ivy. Is there anything that can express class-level dependencies for packages, rather than package-level dependencies?
For example, let's say I have an apache-xyxy package that comes with an ivy.xml listing all its dependencies. But suppose I only use class WX from apache-xyxy, which doesn't require most of those dependencies. Couldn't a resolver be intelligent enough to identify that class WX can only possibly invoke a certain set of other classes (AB, DC, EF), and that none of those classes use any of the other dependencies, and so compute a minimal subset of required dependencies? This would be easier and safer than cherry-picking package dependencies to remove based on the specific classes used, and would also avoid breaking larger packages into smaller ones just for this reason.
Then, if I later decided to use class GH from apache-xyxy, I could do an ivy resolve, and it would dynamically bring in the additional required libraries.
When packaging compiled Java code for distribution, it's common practice to bundle Java packages together. It's also quite possible (but silly) to split a Java package across multiple jars. Large frameworks (like Spring) ship lots of sub-packages in different jars so that users can pick and choose what they need at run-time. Of course, the more jar options one has, the more complex it becomes to populate the run-time classpath.
The keyword here is "run-time". Tools like Apache Ivy and Apache Maven are primarily designed to manage dependencies needed at build time.
Apache Maven does have a "runtime" scope for its dependencies, but it's limited to a single list of jars. Typically this scope is used to decide which jars are needed for testing and for populating the lib directory of a WAR file.
Apache Ivy has a similar but more flexible mechanism called "configurations". It's possible to create as many runtime configurations as you need, and these can be used to decide which jars are downloaded by Ivy.
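As a sketch, an ivy.xml could declare a minimal core configuration plus optional ones that pull in extra jars only when asked for (module names and revisions are illustrative):

```xml
<configurations>
  <conf name="core" description="minimal runtime"/>
  <conf name="json" extends="core" description="optional JSON support"/>
</configurations>
<dependencies>
  <!-- only resolved when the 'json' configuration is requested -->
  <dependency org="com.fasterxml.jackson.core" name="jackson-databind"
              rev="2.9.8" conf="json->default"/>
</dependencies>
```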
So while it would appear Ivy has the answer, I've rarely seen Ivy used when launching programs (the one exception being Groovy's Grape annotations).
So what, you might ask, is the answer?
The future of run-time classpath management is either OSGi or Project Jigsaw. I'm more familiar with OSGi, where special dependency indicators are added to the jar file's manifest stating what its dependencies are. The idea is that when a container loads a jar (called a "bundle"), it can check whether the other dependencies are already loaded; these dependencies can be retrieved and loaded from a common repository. This is a fundamentally different way to launch Java. Traditionally, each application is loaded onto its own isolated classpath.
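For illustration, the relevant manifest headers of such a bundle might look like this (names and version ranges are made up):

```
Bundle-SymbolicName: com.example.myapp
Bundle-Version: 1.0.0
Import-Package: org.slf4j;version="[1.7,2.0)"
Export-Package: com.example.myapp.api;version="1.0.0"
```

The container refuses to resolve the bundle until some other bundle exports a matching org.slf4j package.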
Time will tell whether either project catches on. In the meantime, we use Apache Ivy and Apache Maven to build self-contained and possibly over-bloated WAR (EAR, etc.) packages.