tl;dr
We have two groups of bundles. Call them Group A and Group B. Bundles in Group A MUST import packages from ONLY other bundles in Group A, regardless of whether some bundle in Group B exports a package which could satisfy the dependency. Bundles in Group B MAY resolve their package imports from any bundle, regardless of whether it is in Group A or Group B.
Longer explanation
My team is trying to find a way to enforce a kind of "safe mode" for a product built using OSGi. Essentially we have our core product, and we allow customers to install their own components on top to extend our functionality. Obviously, this is the type of thing OSGi is made for.
However, we have noticed that if a customer installs a bundle which happens to export a package used by one of our core bundles, there is a chance that the core bundle will get wired up to something installed by a third party. While I understand that semantic versioning implies that this is not a major cause for concern, we have noticed that a significant portion of our core bundles are restarted if/when some third party bundles are refreshed.
What we want to do is ensure that bundles in our core product do not wire up to any bundle installed by a third party. We are using bundle start levels to set our core bundles to a start level before third party bundles. This lets us set the framework start level to exclude all bundles after our core in the event that we need to debug issues with third party code. However, start levels alone are not enough to prevent package level dependencies from wiring up to our core components.
The only way I can think of to do this (which I do not believe is a good solution and we have no plan to implement) is to maintain a list of all third party bundles added to the runtime after our core product is set up. Then, when the framework shuts down, uninstall all bundles in this list (backing up the actual bundle file). On startup, the core product would start and wire up correctly, then we'd automate the re-installation of all the third party bundles that have been installed. This feels to me like a very fragile, hacky, and just plain wrong way to achieve what we want, but I can't think of any other way. So I am turning to the SO community for help!
If you have developed your system around services then sometimes the best approach is to start another framework and run the customer code in the other framework. If you're more classic-Java oriented, this is usually too much work.
Then there are a number of possibilities:
Resolve Hooks – With these hooks you can hide bundles and very likely control things exactly as you want.
Security – With the security manager you can limit package imports and exports
Attributes – Export packages from the core with a magic attribute and import only when that attribute is set.
Weaving – Have a weaving hook that adds a non-existent package import to every non-core bundle. Initialize core, synthesize a bundle with the non-existent package and install/resolve it.
Obviously both Resolve Hooks and Weaving require your bundle to run very early. If you use the Bndtools launcher then there is a way to do this.
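As a hedged sketch of the resolve-hook option: the OSGi types (ResolverHookFactory, ResolverHook, BundleRevision, BundleCapability) are elided here and plain bundle symbolic names stand in for them, so this shows only the filtering rule that would run inside ResolverHook.filterMatches. Identifying core bundles by an explicit set of symbolic names is an assumption for illustration.

```java
import java.util.Collection;
import java.util.Set;

public class CoreIsolationFilter {
    // Symbolic names of the core product's bundles (Group A).
    private final Set<String> coreBundles;

    public CoreIsolationFilter(Set<String> coreBundles) {
        this.coreBundles = coreBundles;
    }

    // Mirrors ResolverHook.filterMatches(requirement, candidates):
    // when the requiring bundle is a core bundle, remove every candidate
    // provider that is not itself core. Non-core bundles keep all their
    // candidates, so Group B may still wire to Group A or Group B.
    public void filterMatches(String requirerSymbolicName,
                              Collection<String> candidateProviders) {
        if (coreBundles.contains(requirerSymbolicName)) {
            candidateProviders.removeIf(p -> !coreBundles.contains(p));
        }
    }
}
```

In a real hook, the ResolverHookFactory is registered as a service by a bundle that runs before any third-party bundle resolves, which is why the early start level matters.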
Related
Good localtime community,
First a disclaimer: I am a greenhorn when it comes to Maven and OSGi, though I have read through half a dozen threads on the subject here. We are migrating from our current RESTful architecture to Apache's Fuse (ESB), and of course this requires us to sort out our dependencies and carefully craft our POMs for each bundle/container. Generally speaking, each container will contain 3 in-house dependency bundles:
The container-specific bundle, which requires the next two in-house bundles.
A processing bundle that all of our in-house containers will require (though they might require different versions), and which requires the bundle below.
A utility bundle that all of our containers will require (though they might require different versions).
Additionally, each container can require one or more (possibly more than 100) third-party jars, which are often reused between containers (though again there is no guarantee the versions will be the same).
We have (or will soon have) profiles for each container and for the ESB as a whole. What I would like to know is:
What is the best (cleanest) way to reuse third party jars between containers when we can?
I've read that we should never use if we can avoid it, but we are headed for jarmageddon when we try to explicitly import the jars/packages we need for our three bundles. I have read that *;resolution:=optional is not the way to go either (and for what it's worth, it doesn't seem to work for us anyway). Any thoughts?
I read on some forums that bundling third-party jars is the way to go (though that seems a little overboard), and I have read on others that it would kind of defeat the purpose of OSGi. Any thoughts there?
Our in-house bundles often require many of the same third party jars/versions. Is this simply a matter of building/installing our bundles in the correct order (utility, processing, and container specific) and exporting jars that can be reused by the next bundle(s)?
We are in a position where we can get our container working if we do some things we do not want to do (export everything, import *), but we'd like to handle this as cleanly as possible since we will have to repeat the process for many containers (with increasing dependencies) and since we will have to live with supporting/updating our implementation.
Thanks in advance for your guidance.
I have a large ecosystem of applications and libraries which are currently deployed as a collection of .jars in various application servers (e.g. JBoss AS), and I'm trying to figure out a good set of tools to manage dependencies and life-cycles of the various packages.
I think of the packages as being in one of (at least) three possible states: "unloaded", "pending" and "loaded", loosely defined as follows:
Unloaded: The package is not available at the moment.
Pending: The package itself is available, but not all its dependencies. Therefore, it cannot be used at the moment.
Loaded: The package is available and has all its dependencies satisfied. If it's an application, it can run - if it's a library, it is ready to be used by another package.
(There might also be a few more states, such as "failed" for packages that tried to load but failed for some other reason than that dependencies were not satisfied, etc...)
In the life-cycle of a package, a number of things might cause a package to change state between these three:
A package with no dependencies is loaded, and goes from unloaded to loaded.
A package tries to load, but not all dependencies are satisfied; it goes from unloaded to pending.
A package in pending state suddenly has all its dependencies satisfied (because some other package went to state loaded), and automatically starts loading itself; transition from pending to loaded.
A package is unloaded. All loaded packages which depend on the now unloaded package go from loaded to pending.
A package is updated to a newer version. All dependent packages are automatically reloaded, to get access to the updated version.
I've started working on using OSGi for defining the dependencies - it rolls well with our build system and produces reliable dependency information. However, if I load the two OSGi bundles A and B into JBoss, where B depends on A, and then unload A, it seems like B happily keeps running. I've understood that there are some hooks that I could use to control this on a low level (framework events), but my spider sense is tingling, saying that there must be a better way to do this.
Is there a nice tool/framework/whatever-you-want-to-call-it that will complement OSGi in these aspects?
In OSGi, existing class loaders remain active until you refresh the framework. So if you unload B (where A depends on B), A will continue churning happily along until you refresh. You can refresh the whole framework or only the bundles affected by a given bundle (e.g. refresh B).
The purpose of the refresh is to update/uninstall/install a set of bundles and then apply the changes in an 'atomic' operation.
The old (and most used) model is to get the PackageAdmin service and then call refreshPackages(null) on it (osgi.org/javadoc/r4v43/core/org/osgi/service/packageadmin/…).
This service is, however, deprecated. So the current way is:
FrameworkWiring fw = context.getBundle(0).adapt(FrameworkWiring.class);
fw.refreshBundles(null);
Update:
If you declare dependencies between modules, the lifecycle is properly managed and the descendant dependencies must be stopped before the top-level module is stopped.
However, stopping a module does nothing more than sending events to the bundle activator and removing references from the class loader. Any activities (such as threads or distributed instances) must be cleaned up manually. For example, you must call org.apache.commons.logging.LogFactory.release(ClassLoader) (if using commons-logging) or remove any injected UI components.
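A sketch of the kind of manual cleanup meant here, in plain Java (the WorkerComponent name is invented; in a real bundle the stop() logic would be triggered from BundleActivator.stop()):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class WorkerComponent {
    private final ExecutorService pool = Executors.newSingleThreadExecutor();

    public void start() {
        // An activity started while the bundle is active; the framework
        // will not stop this thread for us when the bundle stops.
        pool.submit(() -> { /* background work */ });
    }

    // The cleanup that BundleActivator.stop() must trigger explicitly;
    // returns true when the worker thread has actually terminated.
    public boolean stop() {
        pool.shutdownNow(); // interrupt our own threads
        try {
            // Also release state held against our class loader elsewhere,
            // e.g. LogFactory.release(classLoader) for commons-logging.
            return pool.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}
```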
I'm looking to create a Java 'library' (JAR) that can be used within other projects, but I'd like my library to be extensible. I'm after something OSGi-esque, where my library has 'extension points' that other JARs can hook into. The thinking is that my library will have a core set of methods that will be called by the projects it's used in, but the exact implementation of these methods might vary based on the project/context.
I've looked into OSGi (e.g. Equinox), but I'm not sure it can be used in the way I'm after. It seems to be geared towards standalone apps rather than creating a library.
What would be the best way of achieving this? Can OSGi be used in this way, and if not are there frameworks out there that will do this?
I hope all that's clear - I have a clear idea of what I want, but it's hard to articulate.
OSGi is great, but I don't think that this is what you need. By using OSGi (-Services), you force the user of your library to use an OSGi environment, too.
I think, as @Peter stated, you can do this by simply extending classes of your library in the specific project/context.
But in case you want to use OSGi, there is a simple way to achieve this. It's called bundle fragments. This way you can create a bundle and extend a so-called "host bundle", i.e. your library, without altering the original library. A popular use case for this is when you have platform-specific code in your bundles.
What you are calling a Java library is called a "bundle" in the OSGi context.
An OSGi bundle is a JAR file with some special meta-information in its MANIFEST.MF file. Every OSGi bundle can have Export-Package and Import-Package headers.
Through the Export-Package manifest header, you declare which packages you are exporting. Your other project can then simply add the packages it wants to use to its Import-Package header.
Here's an example:
Bundle A manifest:
Export-Package: com.demo.exported;version="1.0.0"
Bundle B manifest:
Import-Package: com.demo.exported;version="[1.0.0,2.0.0)"
This way your bundle B (a different project) can call the methods of the classes in the package it imported from bundle A.
The version range you see in Import-Package states which package versions the bundle will accept. You can have two bundles with two different implementations of some interfaces provide the package in two different versions; both will be available.
So far, I have been talking about static package dependencies.
You can also expose your services dynamically through Declarative Services. In that case you define an XML file (a component definition) declaring which services your bundle exposes, and in the other bundle you define another XML file declaring which services it requires.
These are called provided services and referenced services.
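As an illustration, a Declarative Services component definition might look like this (all names here are invented; the file would be referenced from the bundle's Service-Component manifest header):

```xml
<!-- OSGI-INF/greeter.xml -->
<scr:component xmlns:scr="http://www.osgi.org/xmlns/scr/v1.1.0"
               name="com.demo.greeter">
  <implementation class="com.demo.impl.GreeterImpl"/>
  <!-- provided service: what this bundle exposes -->
  <service>
    <provide interface="com.demo.api.Greeter"/>
  </service>
  <!-- referenced service: what it requires from other bundles -->
  <reference name="log"
             interface="org.osgi.service.log.LogService"
             bind="setLog" unbind="unsetLog"
             cardinality="1..1" policy="dynamic"/>
</scr:component>
```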
I think this will give you a little idea about what can be done.
If I have misinterpreted your problem somewhere, please say so.
NOTE: Of course, OSGi is used for creating independent bundles that can be reused in other projects; they bring modularity to your project.
As others have mentioned, you don't need OSGi or any framework for this. You can do this by employing patterns like the template method pattern or the strategy pattern. There are several other patterns for dynamically modifying/extending functionality, but these seem to fit your description best. They do not require any framework.
The benefit you would get from a framework like OSGi is that it would manage the wiring for you. Normally, you have to write some code that glues your libraries and the extensions together; with a framework like OSGi, this is automated, with minimal overhead (in the case of OSGi, some entries in the JAR manifest).
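A minimal sketch of the no-framework route using the strategy pattern (all names invented): the library defines the extension point as an interface, ships a default, and the consuming project swaps in its own implementation.

```java
import java.util.Objects;

// The extension point: the library defines the interface...
interface Formatter {
    String format(String input);
}

// ...and ships a default implementation.
class PlainFormatter implements Formatter {
    public String format(String input) { return input; }
}

// The library's core method delegates to whichever strategy was supplied.
public class Library {
    private final Formatter formatter;

    public Library() { this(new PlainFormatter()); }

    public Library(Formatter formatter) {
        this.formatter = Objects.requireNonNull(formatter);
    }

    public String render(String input) {
        return formatter.format(input);
    }
}
```

A consuming project extends the library without any container, e.g. new Library(s -> "<<" + s + ">>").render("hi") yields "<<hi>>".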
[Clarification] Forgive the lack of clarity in the initial description. Allow me to re-phrase the question.
Does there exist a way to perform runtime compilation using the javax.tools API, usable in OSGi (again stressing runtime), which understands a bundle's dependencies and security constraints?
[update]
Please see https://github.com/rotty3000/phidias
It's a well formed OSGi bundle.
The readme provides all the details of the very tiny 4-class API (8k module).
In order to get from a set of package imports and exports to a list of bundles which can be used for compilation, you'll need some sort of repository of candidate bundles, and a provisioner to work out which bundles best provide which packages. If you're using 'Require-Bundle' (not a best practice), you'll know the bundle names, but not necessarily the versions, so some provisioning is still required.
For example, in Eclipse PDE, the target platform is used as the basic repository for compilation. You can also do more sophisticated things like using Eclipse's p2 provisioning to provision your target platform, so you can use an external p2 repository as your repository instead of setting one up yourself. For command line builds, Tycho allows Maven builds to use the same sort of mechanisms for resolving the classpath as Eclipse itself uses.
An alternative approach is to list your 'classpath' as Maven dependencies, and let the maven bundle plugin (based on bnd) generate your manifest for you.
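A hedged sketch of that setup: a maven-bundle-plugin fragment where bnd derives Import-Package from the compiled bytecode (the group/artifact coordinates are the Felix plugin's real ones; the package names are placeholders):

```xml
<!-- pom.xml build/plugins fragment -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- bnd computes Import-Package by inspecting the bytecode;
           only the exports need to be stated explicitly -->
      <Export-Package>com.company.mylib.api</Export-Package>
      <Private-Package>com.company.mylib.impl</Private-Package>
    </instructions>
  </configuration>
</plugin>
```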
If you can't take advantage of existing command-line tools because you're compiling programmatically (it's not entirely clear from your question what problem you're trying to solve), your best bet is probably to take advantage of an existing provisioning technology, like OBR, Eclipse p2, or Apache ACE, to work out the bundles which should be on the class path for compilation.
This is exactly what we do in bndtools ... If I had a bit of time I would add a compiler to bnd so it could also do this.
Sure you can: you just have to write a custom JavaFileManager that supplies the JavaCompiler with the right classes to compile against.
For example, you can write one that gets its classes from an OSGi runtime. If you don't mind having a dependency from your compiler bundle on the libraries you need, it's pretty easy; otherwise you can use the wiring API to look into other bundles as well (OSGi 4.3+ only). If you intercept which packages it requests while compiling, you can generate Import-Package statements so you can generate a bundle.
I made a rough GitHub example a few months back:
https://github.com/flyaruu/test-dynamic-compiler
There were some issues (I couldn't get the Eclipse ecj compiler to work, for example; I didn't look into bundle security at all; and due to the dynamic nature of OSGi you have to listen to bundle changes to update your compilation path), but it works fine.
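To make the JavaFileManager interception concrete, here is a self-contained sketch using only the JDK's javax.tools API: it compiles a class held in a String entirely in memory. The class names are invented; in an OSGi setting, the overridden methods of the ForwardingJavaFileManager are where you would resolve classes from bundle wirings and record which packages the compiler asked for.

```java
import javax.tools.*;
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.net.URI;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class InMemoryCompiler {

    // A compilation unit whose source lives in a String instead of a file.
    static class StringSource extends SimpleJavaFileObject {
        final String code;
        StringSource(String className, String code) {
            super(URI.create("string:///" + className.replace('.', '/') + ".java"),
                  Kind.SOURCE);
            this.code = code;
        }
        @Override public CharSequence getCharContent(boolean ignoreErrors) { return code; }
    }

    // A class file captured in memory; in an OSGi setting the bytes could
    // be defined straight into a bundle's class loader.
    static class MemoryClassFile extends SimpleJavaFileObject {
        final ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        MemoryClassFile(String className) {
            super(URI.create("mem:///" + className.replace('.', '/') + ".class"),
                  Kind.CLASS);
        }
        @Override public OutputStream openOutputStream() { return bytes; }
    }

    /** Compiles one class entirely in memory; returns true on success. */
    public static boolean compile(String className, String source) {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        Map<String, MemoryClassFile> output = new HashMap<>();
        // The ForwardingJavaFileManager is the interception point: a custom
        // manager can resolve classes from somewhere other than the disk
        // and observe which classes the compiler requests.
        JavaFileManager fm = new ForwardingJavaFileManager<StandardJavaFileManager>(
                compiler.getStandardFileManager(null, null, null)) {
            @Override
            public JavaFileObject getJavaFileForOutput(Location location, String name,
                    JavaFileObject.Kind kind, FileObject sibling) {
                MemoryClassFile file = new MemoryClassFile(name);
                output.put(name, file);
                return file;
            }
        };
        Boolean ok = compiler.getTask(null, fm, null, null, null,
                List.of(new StringSource(className, source))).call();
        return Boolean.TRUE.equals(ok) && output.containsKey(className);
    }
}
```

Note that ToolProvider.getSystemJavaCompiler() returns null on a JRE without compiler support, so a production version should check for that.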
I've so far found that the real answer is "No there is not!"
The predominant runtime compilation scenario for Java today is JSP compilation. The app servers I've had occasion to review use one of these methods:
invocation of javac (through a system call)
use of ecj/jdt
use of javax.tools in a non-OSGi-aware way
All of these approaches are based on collecting the available classpath by directly introspecting jars or classes in the file system.
None of the current approaches are aware of OSGi characteristics like the dynamic nature of the environment or the underlying restrictions imposed by the framework itself.
I've been thinking about "good practice" regarding package structure within OSGi bundles. Currently, on average, we have something like 8-12 classes per bundle. One of my initiatives/proposals has been to have two packages: com.company_name.osgi.services.api for API-related classes/interfaces (exported externally) and com.company_name.osgi.services.impl for the implementation (not exported). What are the pros and cons of this? Any other suggestions?
You might also consider putting the interfaces in com.company_name.subsystem and the implementation in com.company_name.subsystem.impl; the OSGi-specific code, if there is any, could go in com.company_name.subsystem.osgi.
Sometimes you might have multiple implementations of the same interfaces. In that case you could consider com.company_name.subsystem.impl1 and com.company_name.subsystem.impl2, for example:
com.company.scm // the scm api
com.company.scm.git // its git implementation
com.company.scm.svn // its subversion implementation
com.company.scm.osgi // the place to put an OSGI Activator
In this sense the package structure can be OSGi-agnostic; if you later move to a different container, you just add an additional
com.company.scm.sca // or whatever component model you might think of
Always having api and impl in your package name could be annoying. If in doubt use impl but not api.
It's not the number of classes that is important but the concepts. In my opinion you should have one conceptual entity per bundle. In some cases this might be just a few classes, in others several packages with hundreds of classes.
What is important is that you separate the API and the implementation. One bundle contains the API of your concept and the other the implementation. That way you can provide different implementations for a well-defined API. In some cases this might even be necessary, for example if you want to access the services of a bundle remotely (using e.g. R-OSGi).
The API bundles are then used by code sharing and the services from the implementation bundles by service sharing. The best way to explore these possibilities is to look at ServiceTracker.
In your case you could have the implementation in the same package, but with all of its classes package-private. This way you can export the package while the implementation remains invisible to the outside, even when the jar is used outside OSGi (as a regular jar file).
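A minimal sketch of that layout (names invented): everything lives in the single exported package, but only the interface and a factory are visible to callers, so importers never see the implementation class.

```java
// All types below live in the one exported package, e.g. com.company.scm.

// Public API: visible to anyone importing the package.
// (It would be public in a real codebase; it is package-private here
// only so the sketch fits in one file.)
interface Scm {
    String name();
}

// Package-private implementation: invisible outside the package,
// even though the package itself is exported.
class GitScm implements Scm {
    public String name() { return "git"; }
}

// Public factory: the only way outsiders obtain an implementation.
public class ScmFactory {
    public static Scm create() {
        return new GitScm();
    }
}
```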