Best practices for runtime-only dependencies in OSGi - java

In line with the Open-Closed Principle, I typically design my Java packages and libraries in such a way that there is a generic "interface" or "API" package/library and one or more implementations (quite similar to many common APIs like JDBC or JAXP/SAX).
To locate an implementation (or sometimes multiple implementations) in the base API library without violating OCP, I commonly use Java's ServiceLoader mechanism, or occasionally classpath scanning via third-party libraries like ClassGraph or Reflections. From a Maven perspective, the implementations are brought in as runtime dependencies (as they're only needed at execution time, but not at compile time). Pretty standard stuff.
So, now, I want to make some of these packages available as OSGi bundles (with API and implementation in separate bundles), but since in OSGi each bundle has its own class loader, neither classpath scanning nor the ServiceLoader API will work for this purpose. At first glance, OSGi's "fragment" mechanism seems to be the closest equivalent to the plain-Java setup described above. In that scenario, the API bundle would be the "fragment host", and concrete implementations would attach as fragments to that host bundle. As the fragment host and all its attached fragments use the same class loader, the standard plain-Java mechanisms like ServiceLoader or ClassGraph would conceivably still work. This would also have the advantage that there would be no need to detect whether a library/bundle is running in an OSGi context or not, and no OSGi framework dependencies are needed.
So, in a nutshell, my question is: are fragments the correct way to implement runtime-only dependencies in OSGi or is there a better (or more standard) way? Preferably, I'm looking for a solution that works in an OSGi container but does not require a dependency on OSGi itself.

No. Fragments are almost always wrong outside of translations (localization fragments). The OSGi model is to use services.
The way to go, then, is to use Declarative Services (DS). Using bnd (in Maven, Gradle, Ant, sbt, or Bndtools) you can create components. A component is a Plain Old Java Object (POJO) annotated with injection and activation instructions. You can make those components take all their dependencies in the constructor.
bnd uses the annotations to generate an XML file that is used at runtime to create, activate, inject, and register those components. This then works out of the box in an OSGi framework. The annotations are build-time only, so they do not add dependencies to your runtime.
In your non-OSGi environment, you are responsible for calling that constructor yourself: you gather the dependencies using the ServiceLoader and then construct the components in the right order (see the sketch after the component example below).
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component
public class MyComponent implements Foo {
    final Bar bar;

    @Activate
    public MyComponent(@Reference Bar bar) {
        this.bar = bar;
    }
    ...
}
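For the non-OSGi path, here is a minimal sketch of that manual wiring, reusing the Foo/Bar/MyComponent names from above and assuming the Bar implementation is registered under META-INF/services (ServiceLoader.findFirst() needs Java 9+):

import java.util.ServiceLoader;

public class PlainJavaBootstrap {
    public static void main(String[] args) {
        // Discover a Bar implementation the plain-Java way.
        Bar bar = ServiceLoader.load(Bar.class)
                .findFirst()
                .orElseThrow(() -> new IllegalStateException(
                        "No Bar implementation on the classpath"));

        // Do by hand the wiring that DS does in an OSGi framework.
        Foo foo = new MyComponent(bar);
    }
}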

Related

Can a custom plugin for IntelliJ IDEA use DI in its code?

I am developing a plugin for IntelliJ IDEA 2018.2+ which will provide some additional inspections.
I have already learnt that there is a plugin.xml file which is the "heart" of a plugin and is responsible for its main behaviour.
As I understand it, to implement (for example) additional inspection behaviour we need to define inspectionToolProvider in plugin.xml and implement the InspectionToolProvider interface. The same structure applies to other extensions: we define something in the .xml and implement some interface.
What bothers me is that if I want to implement a more-or-less complex algorithm, it looks like I need lots of static methods and utility classes, because I haven't found a way to use DI (e.g. Spring) during plugin development.
Some examples in the IntelliJ IDEA SDK docs also show "helper" methods as static ones defined in utility classes.
So the overall question is: is there a way to use dependency injection during IntelliJ IDEA plugin development?
IntelliJ IDEA has its own dependency injection, managed by PicoContainer. It allows you to inject any component or service into the constructor of any component, service or extension created on the same or lower level (possible levels are application, project and module). To use it, you simply declare a constructor parameter of the corresponding type; you do not need to apply any extra annotations.
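For illustration, a sketch of what that looks like in practice (MyInspectionService is a hypothetical project-level service that would be registered in plugin.xml):

import com.intellij.openapi.project.Project;

public class MyInspectionService {

    private final Project project;

    // No annotations needed: the platform resolves constructor
    // parameters of known component/service types on its own.
    public MyInspectionService(Project project) {
        this.project = project;
    }
}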
You can also start your own DI container (using Spring or any other framework) in your plugin, but then it will be your own responsibility to support the injection of core IntelliJ IDEA components.

Factory which is not dependant on implementation

I have an API which has some base implementations. I have a factory which gives instances of that API to the clients.
I want to make my factory more generic so that, if a new implementation of the API is created and its jar file is put on the classpath, the factory picks it up without needing any changes.
Use the Java SPI, the Service Provider Interface.
API jar - provides one single interface.
Provider jar - provides implementations; you can even put several implementations in one jar. A text file META-INF/services/my.package.MyInterface lists the implementing class(es).
Application - in the application, the implementing jar is not needed for compilation: in Maven, scope runtime.
The service discovery happens with a ServiceLoader<T>:
public static void main(String[] args) {
    ServiceLoader<MyInterface> loader = ServiceLoader.load(MyInterface.class);
    for (MyInterface api : loader) {
        api. ...
    }
    // Or take the first implementation (this throws
    // NoSuchElementException if no provider is found):
    MyInterface api = loader.iterator().next();
}
You could provide a class in the API jar with a static function for that discovery mechanism.
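For example, a possible discovery helper in the API jar (a sketch; the class and method names are made up for illustration):

import java.util.Iterator;
import java.util.ServiceLoader;

public final class MyInterfaceFactory {

    private MyInterfaceFactory() {}

    // Returns the first implementation found on the classpath.
    public static MyInterface newInstance() {
        Iterator<MyInterface> it =
                ServiceLoader.load(MyInterface.class).iterator();
        if (!it.hasNext()) {
            throw new IllegalStateException(
                    "No MyInterface implementation on the classpath");
        }
        return it.next();
    }
}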
Advantages:
Separation
Several implementations possible
Selection of implementation can be done dynamically
Example of jars
xxx-api.jar
    my/package/MyInterface.class
xxx-first-impl.jar
    META-INF/services/my.package.MyInterface
        (a text file containing the line: my.package.impl.MyImpl1)
    my/package/impl/MyImpl1.class
        public class MyImpl1 implements MyInterface { ... }
myapp1.jar
If you'd like to start with theory, please read about the Dependency Inversion Principle:
In object-oriented programming, the dependency inversion principle refers to a specific form of decoupling software modules. When following this principle, the conventional dependency relationships established from high-level, policy-setting modules to low-level, dependency modules are inverted (i.e. reversed), thus rendering high-level modules independent of the low-level module implementation details.
A. High-level modules should not depend on low-level modules. Both should depend on abstractions.
B. Abstractions should not depend on details. Details should depend on abstractions.
The principle inverts the way some people may think about object-oriented design, dictating that both high- and low-level objects must depend on the same abstraction.
Dependency Injection Library
As for specific implementations, you have many in Java, specifically for the dependency-injection approach. Spring Framework obviously comes to mind, but you can also look at Java EE Contexts and Dependency Injection (CDI).
Interface Injection
You can also:
Load the jars manually (see the sketch after this list): How should I load Jars dynamically at runtime?
Use interface injection: Spring interface injection example (the title says Spring, but the answers show no Spring is needed for interface injection)
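For the first option, a minimal sketch of loading a provider jar at runtime and handing its class loader to the ServiceLoader (the jar path is an illustrative assumption):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ServiceLoader;

public class DynamicLoading {
    public static void main(String[] args) throws Exception {
        URL implJar = new File("plugins/xxx-first-impl.jar").toURI().toURL();
        ClassLoader loader = new URLClassLoader(
                new URL[] { implJar }, MyInterface.class.getClassLoader());

        // Pass the class loader explicitly so providers inside the
        // freshly loaded jar are discovered as well.
        for (MyInterface api : ServiceLoader.load(MyInterface.class, loader)) {
            System.out.println("Found: " + api.getClass().getName());
        }
    }
}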

Programmatically build and launch contained applications based on Maven dependencies

I would like to take several lists of Maven dependencies from the user, resolve and load each of them as contained applications. Here are the steps:
collect a list of all Maven dependencies (DONE)
resolve all dependencies with Aether (DONE)
resolve classpath with Aether (DONE)
bundle the above in a separate "container" (so that different Maven dependencies with potentially conflicting versions can be used).
repeat with other lists.
To give some context: I want to use the above in the context of UIMA, to be able to run different (natural language processing) pipelines that rely on different sets of libraries with different versions. My goal is to create an annotation-server in which one defines (Maven) dependencies and pipelines that can be called in a RESTful way. The pipelines (and their corresponding dependencies) should each run in a contained classpath environment (so as to avoid classpath clashes).
Is OSGi the way to go? Based on a classpath (i.e. a list of resolved jars), can I then build an OSGi bundle and deploy it, all programmatically? I do not have control over the Maven dependencies (they are UIMA components, that's it), so there is no way to add OSGi metadata there.
Would maven-assembly-plugin combined with maven profiles take care of this for you?
You can filter dependencies differently on a per-profile basis. You can use profile-specific assembly descriptor documents and generate a custom manifest to be placed in the war. You are describing a J2EE web application (war) assembly: wars run in a firewalled classloader inside a servlet container, so you can generate a bunch of them from the same source (just vary the web app context and the contents of WEB-INF/lib on a per-profile basis).
Drop them into the same Tomcat server, for example, and you are ready to go. Was this what you meant?
HTH,
Nick
You can certainly create a bundle that contains a list of jars, put all of those on the bundle's own classpath and deploy that bundle into an OSGi container. You probably do need to create a BundleActivator (which is the entry point for that bundle, like the main method is for traditional Java).
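A minimal activator sketch (the class name is an assumption; the bundle's manifest must point at it via the Bundle-Activator header):

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class PipelineActivator implements BundleActivator {

    @Override
    public void start(BundleContext context) throws Exception {
        // Entry point: set up the pipeline, register services, etc.
        System.out.println("Started: " + context.getBundle().getSymbolicName());
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        // Release resources when the bundle is stopped.
    }
}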
You then say you have multiple such bundles; do I understand correctly that you want to deploy each bundle in a separate container? If so, you can either use some kind of REST library to provide a REST endpoint for each bundle, or you can use OSGi Remote Services to publish a service that can be discovered by other containers.
I am not sure if this is what you mean, so I am also not sure if OSGi is the right way to go. From your description you use neither services (a very important reason to use OSGi as that decouples parts of your application from each other) nor do you intend to create different bundles for the components (another important reason to use OSGi). You are almost describing an architectural style currently hyped as "micro services". Can you please elaborate a bit more?
Based on your use case I'd suggest you look into the Java ServiceLoader API. The ServiceLoader API allows you to define an interface, and load implementations of that interface from different self-contained JARs. You can build your different libraries into their own jars, exposing the methods you need via the interface, and load them from your Java program independently. The ServiceLoader will even list the different implementations available for you.
From the documentation:
Suppose we have a service type com.example.CodecSet which is intended to represent sets of encoder/decoder pairs for some protocol. In this case it is an abstract class with two abstract methods:
public abstract Encoder getEncoder(String encodingName);
public abstract Decoder getDecoder(String encodingName);
Each method returns an appropriate object or null if the provider does not support the given encoding. Typical providers support more than one encoding.
If com.example.impl.StandardCodecs is an implementation of the CodecSet service then its jar file also contains a file named
META-INF/services/com.example.CodecSet
This file contains the single line:
com.example.impl.StandardCodecs # Standard codecs
The CodecSet class creates and saves a single service instance at initialization:
private static ServiceLoader<CodecSet> codecSetLoader
        = ServiceLoader.load(CodecSet.class);
To locate an encoder for a given encoding name it defines a static factory method which iterates through the known and available providers, returning only when it has located a suitable encoder or has run out of providers.
public static Encoder getEncoder(String encodingName) {
    for (CodecSet cp : codecSetLoader) {
        Encoder enc = cp.getEncoder(encodingName);
        if (enc != null)
            return enc;
    }
    return null;
}
A getDecoder method is defined similarly.
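A sketch of that getDecoder counterpart, following the same pattern (not quoted from the documentation):

public static Decoder getDecoder(String encodingName) {
    for (CodecSet cp : codecSetLoader) {
        Decoder dec = cp.getDecoder(encodingName);
        if (dec != null)
            return dec;
    }
    return null;
}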
That sounds a lot like you could use Apache Stanbol. It's a framework focused on semantic enhancement of content but can be used for any web-based workflow involving content. You can define pipelines to process and/or store your data. There are components for NLP using Apache Tika and OpenNLP. As far as I know you can also integrate UIMA. It uses RESTful services and is based on OSGi.
If Stanbol doesn't fit your use case and you need to roll your own application, I think OSGi is still the way to go.
Depending on your use case, you can either deploy bundles to a container or simply embed the OSGi framework in a small launcher app that loads the bundles you create.
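For the embedded option, a minimal launcher sketch using the standard OSGi framework launch API (the bundle path is an illustrative assumption; any framework implementation such as Felix or Equinox on the classpath will do):

import java.util.HashMap;
import java.util.ServiceLoader;
import org.osgi.framework.BundleContext;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;

public class Launcher {
    public static void main(String[] args) throws Exception {
        // Picks up whichever OSGi framework is on the classpath.
        FrameworkFactory factory =
                ServiceLoader.load(FrameworkFactory.class).iterator().next();

        Framework framework = factory.newFramework(new HashMap<>());
        framework.start();

        // Install and start the bundles that make up one pipeline.
        BundleContext context = framework.getBundleContext();
        context.installBundle("file:bundles/pipeline-a.jar").start();

        framework.waitForStop(0);
    }
}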
Many Maven artifacts already contain OSGi metadata. Most of the time you can copy them to your bundle directory using the maven-dependency-plugin and load them directly as OSGi bundles.
Non-OSGi dependencies can be embedded in the bundles that need them. It should also be possible to set up a few Maven plugins to modify the manifest, adding metadata based on the Maven artifact ids and versions, and repack the dependencies as bundles (this won't work all the time, though, since the Maven pom version and the packages' versions aren't always the same).
The user's code and any required dependencies can be bundled up using the maven-bundle-plugin. It can generate the manifest for you.
For REST interfaces I would usually recommend JAX-RS (Jersey or Apache CXF DOSGi), but I haven't used the programmatic approach with those frameworks yet.

Creating an extensible Java library (JAR)

I'm looking to create a Java 'library' (JAR) that can be used within other projects, but I'd like my library to be extensible. I'm after something OSGi-esque, where my library has 'extension points' that other JARs can hook into. The thinking is that my library will have a core set of methods that will be called by the projects it's used in, but the exact implementation of these methods might vary based on the project/context.
I've looked into OSGi (e.g. Equinox), but I'm not sure it can be used in the way I'm after. It seems to be geared towards standalone apps rather than creating a library.
What would be the best way of achieving this? Can OSGi be used in this way, and if not are there frameworks out there that will do this?
I hope all that's clear - I have a clear idea of what I want, but it's hard to articulate.
OSGi is great, but I don't think this is what you need. By using OSGi (services), you force the user of your library to use an OSGi environment, too.
I think, as @Peter stated, you can do this by simply extending classes of your library in the specific project/context.
But in case you want to use OSGi, there is a simple way to achieve this. It's called bundle fragments. This way you can create a bundle that extends a so-called "host bundle", i.e. your library, without altering the original library. A popular use case for this is platform-specific code in your bundles.
What you are calling a Java library is called a "bundle" in the OSGi context.
An OSGi bundle is a JAR file with some special meta-information in its MANIFEST.MF file. Every OSGi bundle can have Export-Package and Import-Package headers.
Through the Export-Package manifest header, you declare which packages you are exporting, and your other project can simply add the packages it wants to use to its Import-Package header.
Here's an example:
Bundle A manifest:
Export-Package: com.demo.exported
Bundle B manifest:
Import-Package: com.demo.exported;version="(1.0.0,2.0.0]"
This way your bundle B (a different project) can call the methods of the classes in the package it imported from bundle A.
The version range you see in the Import-Package header just states which package versions the importer accepts. You can have two bundles with two different implementations of some interfaces and provide this package in two different versions; both will be available.
Up to now, this has all been about static dependencies.
You can also have your services exposed dynamically through Declarative Services. In this case you define an XML file (a component definition) declaring which services your bundle exposes, and in the other bundle you define another XML file declaring which services it requires.
These are called provided services and referenced services.
I think this gives you an idea of what can be done. If I have misinterpreted your problem somewhere, please say so.
Note: of course, OSGi is used for creating independent bundles that can be re-used in other projects; they bring modularity to your project.
As others have mentioned, you don't need OSGi or any framework for this. You can do this by employing patterns like the template method pattern or the strategy pattern. There are several other patterns for dynamically modifying or extending functionality, but these seem to fit your description best. They do not require any framework.
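For instance, a small strategy-pattern sketch (the names are illustrative): the library defines the extension point, and the embedding project plugs in the behaviour:

interface RenderStrategy {
    String render(String input);
}

class Library {
    private final RenderStrategy strategy;

    Library(RenderStrategy strategy) {
        this.strategy = strategy;
    }

    String process(String input) {
        // Fixed skeleton; the variable step comes from the strategy.
        return strategy.render(input.trim());
    }
}

// In the embedding project:
// Library lib = new Library(s -> "<p>" + s + "</p>");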
The benefit you would get from a framework like OSGi is that it would manage the wiring for you. Normally, you'll have to write some code that glues your libraries and the extensions together; with a framework like OSGi, this is automated, at the cost of minimal overhead (in the case of OSGi, some entries in the JAR manifest).

OSGi bundle's package structure

I've been thinking about "good practice" regarding package structure within OSGi bundles. Currently, on average, we have about 8-12 classes per bundle. One of my proposals has been to have two packages: com.company_name.osgi.services.api for API-related classes/interfaces (which is exported externally) and com.company_name.osgi.services.impl for the implementation (not exported). What are the pros and cons of this? Any other suggestions?
You might also consider putting the interfaces in com.company_name.subsystem and the implementation in com.company_name.subsystem.impl; the OSGi-specific code, if there is any, could go in com.company_name.subsystem.osgi.
Sometimes you might have multiple implementations of the same interfaces. In this case you could consider com.company_name.subsystem.impl1 and com.company_name.subsystem.impl2, for example:
com.company.scm // the scm api
com.company.scm.git // its git implementation
com.company.scm.svn // its subversion implementation
com.company.scm.osgi // the place to put an OSGi Activator
In this sense the package structure is OSGi-agnostic; if you later move to a different container, you just add an additional
com.company.scm.sca // or whatever component model you might think of
Always having api and impl in your package name could be annoying. If in doubt use impl but not api.
It's not the number of classes that is important but the concepts. In my opinion you should have one conceptual entity in a bundle. In some cases this might be just a few classes in other several packages with 100s of classes.
What is important is that you separate the API and the implementation. One bundle contains the API of your concept and the other the implementation. Like this, you can provide different implementations for a well-defined API. In some cases this might even be necessary, e.g. if you want to access the services from a bundle remotely (using, for instance, R-OSGi).
The API bundles are then used via code sharing, and the services from the implementation bundles via service sharing. The best way to explore those possibilities is to look at ServiceTracker.
In your case you could have the implementation in the same package, but with all of its classes package-private. This way you can export the package, and the implementation would not be visible to the outside, even when not using OSGi (but using it as a regular jar file).
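A sketch of that approach (names are illustrative; each type would live in its own file within the same package):

// Scm.java - the public API
public interface Scm {
    void checkout(String url);
}

// GitScm.java - package-private implementation, invisible outside
// the package even when the package is exported (or the jar is used
// as a plain, non-OSGi library)
class GitScm implements Scm {
    public void checkout(String url) { /* ... */ }
}

// ScmFactory.java - public entry point that hands out the hidden impl
public final class ScmFactory {
    private ScmFactory() {}

    public static Scm create() {
        return new GitScm();
    }
}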
