Programmatically build and launch contained applications based on Maven dependencies - java

I would like to take several lists of Maven dependencies from the user, then resolve and load each list as a contained application. Here are the steps:
collect a list of all Maven dependencies (DONE)
resolve all dependencies with Aether (DONE)
resolve the classpath with Aether (DONE; a minimal sketch of the Aether steps follows this list)
bundle the above in a separate "container" (so that different Maven dependencies with potentially conflicting versions can be used).
repeat with other lists.
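For reference, the Aether steps above look roughly like the sketch below, using the Eclipse Aether APIs. This is a minimal sketch, not my production code: the artifact coordinates, remote repository URL and local repository path are placeholders.

import org.apache.maven.repository.internal.MavenRepositorySystemUtils;
import org.eclipse.aether.DefaultRepositorySystemSession;
import org.eclipse.aether.RepositorySystem;
import org.eclipse.aether.artifact.DefaultArtifact;
import org.eclipse.aether.collection.CollectRequest;
import org.eclipse.aether.connector.basic.BasicRepositoryConnectorFactory;
import org.eclipse.aether.graph.Dependency;
import org.eclipse.aether.impl.DefaultServiceLocator;
import org.eclipse.aether.repository.LocalRepository;
import org.eclipse.aether.repository.RemoteRepository;
import org.eclipse.aether.resolution.DependencyRequest;
import org.eclipse.aether.resolution.DependencyResult;
import org.eclipse.aether.spi.connector.RepositoryConnectorFactory;
import org.eclipse.aether.spi.connector.transport.TransporterFactory;
import org.eclipse.aether.transport.http.HttpTransporterFactory;
import org.eclipse.aether.util.graph.visitor.PreorderNodeListGenerator;

public class AetherClasspathResolver {

    public static String resolveClasspath(String coords) throws Exception {
        // Bootstrap the repository system via the service locator.
        DefaultServiceLocator locator = MavenRepositorySystemUtils.newServiceLocator();
        locator.addService(RepositoryConnectorFactory.class, BasicRepositoryConnectorFactory.class);
        locator.addService(TransporterFactory.class, HttpTransporterFactory.class);
        RepositorySystem system = locator.getService(RepositorySystem.class);

        DefaultRepositorySystemSession session = MavenRepositorySystemUtils.newSession();
        session.setLocalRepositoryManager(
                system.newLocalRepositoryManager(session, new LocalRepository("target/local-repo")));

        // Collect and resolve the transitive dependencies of one root artifact.
        CollectRequest collect = new CollectRequest();
        collect.setRoot(new Dependency(new DefaultArtifact(coords), "runtime"));
        collect.addRepository(new RemoteRepository.Builder(
                "central", "default", "https://repo1.maven.org/maven2/").build());
        DependencyResult result = system.resolveDependencies(session, new DependencyRequest(collect, null));

        // Flatten the resolved dependency graph into a platform-specific classpath string.
        PreorderNodeListGenerator nlg = new PreorderNodeListGenerator();
        result.getRoot().accept(nlg);
        return nlg.getClassPath();
    }
}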
To give some context: I want to use the above in the context of UIMA, to be able to run different (natural language processing) pipelines that rely on different sets of libraries with different versions. My goal is to create an annotation-server in which one defines (Maven) dependencies and pipelines that can be called in a RESTful way. The pipelines (and their corresponding dependencies) should each run in a contained classpath environment (so as to avoid classpath clashes).
Is OSGi the way to go? Based on a classpath (i.e. a list of resolved JARs), can I then build an OSGi bundle and deploy it, all programmatically? I do not have control over the Maven dependencies (they are UIMA components, that's it), so there is no way to add OSGi metadata there.

Would the maven-assembly-plugin combined with Maven profiles take care of this for you?
You can filter dependencies differently on a per-profile basis, use profile-specific assembly descriptor documents, and generate a custom manifest to be placed in the WAR. You are describing a J2EE web application (WAR) assembly -- WARs run in a firewalled classloader inside a servlet container, so you generate a bunch of them based on the same source (just vary the web app context and the contents of WEB-INF/lib on a per-profile basis).
Drop them into the same Tomcat server, for example, and you are ready to go. Was this what you meant?
HTH,
Nick

You can certainly create a bundle that contains a list of JARs, put all of those on the bundle's own classpath, and deploy that bundle into an OSGi container. You probably do need to create a BundleActivator (the entry point for that bundle, much like the main method is for a traditional Java application).
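A minimal activator sketch (the class name is made up):

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class PipelineActivator implements BundleActivator {

    @Override
    public void start(BundleContext context) throws Exception {
        // Entry point for the bundle: set up the pipeline, register services, etc.
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        // Release any resources acquired in start().
    }
}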
You then say you have multiple such bundles; do I understand correctly that you want to deploy each bundle in a separate container? If so, you can either use some kind of REST library to provide a REST endpoint for each bundle, or you can use OSGi remote services to publish a service that can be discovered by other containers.
I am not sure if this is what you mean, so I am also not sure if OSGi is the right way to go. From your description you neither use services (a very important reason to use OSGi, as they decouple parts of your application from each other) nor intend to create different bundles for the components (another important reason to use OSGi). You are almost describing an architectural style currently hyped as "microservices". Can you please elaborate a bit more?

Based on your use case I'd suggest you look into the Java ServiceLoader API. The ServiceLoader API allows you to define an interface, and load implementations of that interface from different self-contained JARs. You can build your different libraries into their own jars, exposing the methods you need via the interface, and load them from your Java program independently. The ServiceLoader will even list the different implementations available for you.
From the documentation:
Suppose we have a service type com.example.CodecSet which is intended to represent sets of encoder/decoder pairs for some protocol. In this case it is an abstract class with two abstract methods:
public abstract Encoder getEncoder(String encodingName);
public abstract Decoder getDecoder(String encodingName);
Each method returns an appropriate object or null if the provider does not support the given encoding. Typical providers support more than one encoding.
If com.example.impl.StandardCodecs is an implementation of the CodecSet service then its jar file also contains a file named
META-INF/services/com.example.CodecSet
This file contains the single line:
com.example.impl.StandardCodecs # Standard codecs
The CodecSet class creates and saves a single service instance at initialization:
private static ServiceLoader<CodecSet> codecSetLoader
= ServiceLoader.load(CodecSet.class);
To locate an encoder for a given encoding name it defines a static factory method which iterates through the known and available providers, returning only when it has located a suitable encoder or has run out of providers.
public static Encoder getEncoder(String encodingName) {
    for (CodecSet cp : codecSetLoader) {
        Encoder enc = cp.getEncoder(encodingName);
        if (enc != null)
            return enc;
    }
    return null;
}
A getDecoder method is defined similarly.
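Applied to your question, loading pipeline implementations from a directory of JARs could look roughly like the sketch below. AnnotationPipeline and the pipelines/sentiment directory are hypothetical; also note that a plain URLClassLoader is parent-first, so it isolates additional classes but not versions that conflict with the host classpath.

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Hypothetical extension point the host application would define.
interface AnnotationPipeline {
    String process(String text);
}

class PipelineRunner {
    public static void main(String[] args) throws Exception {
        // Collect the JARs that make up one self-contained pipeline.
        List<URL> jars = new ArrayList<>();
        for (File jar : new File("pipelines/sentiment").listFiles()) {
            if (jar.getName().endsWith(".jar")) {
                jars.add(jar.toURI().toURL());
            }
        }

        // Parent-first class loading: classes already on the host classpath win.
        URLClassLoader loader = new URLClassLoader(
                jars.toArray(new URL[0]), AnnotationPipeline.class.getClassLoader());

        // Discover implementations registered in META-INF/services inside those JARs.
        for (AnnotationPipeline pipeline : ServiceLoader.load(AnnotationPipeline.class, loader)) {
            System.out.println(pipeline.process("Some input text"));
        }
    }
}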

That sounds a lot like you could use Apache Stanbol. It is a framework focused on semantic enhancement of content, but it can be used for any web-based workflow involving content. You can define pipelines to process and/or store your data. There are components for NLP using Apache Tika and OpenNLP, and as far as I know you can also integrate UIMA. It uses RESTful services and is based on OSGi.
If Stanbol doesn't fit your use case and you need to roll your own application, I think OSGi is still the way to go.
Depending on your use case you can either deploy bundles to a container or simply embed the OSGi framework in a small launcher app that loads the bundles you create.
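A rough sketch of such a launcher using the standard OSGi framework launch API; the "bundles" directory and the cache location are assumptions, and the actual framework (Felix, Equinox, ...) is discovered from the classpath:

import java.io.File;
import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;
import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;

public class EmbeddedOsgiLauncher {
    public static void main(String[] args) throws Exception {
        // Discover whichever OSGi framework implementation is on the classpath.
        FrameworkFactory factory = ServiceLoader.load(FrameworkFactory.class).iterator().next();

        Map<String, String> config = new HashMap<>();
        config.put("org.osgi.framework.storage", "target/osgi-cache");
        Framework framework = factory.newFramework(config);
        framework.start();

        // Install and start every bundle found in the (hypothetical) "bundles" directory.
        BundleContext ctx = framework.getBundleContext();
        for (File jar : new File("bundles").listFiles((dir, name) -> name.endsWith(".jar"))) {
            Bundle bundle = ctx.installBundle(jar.toURI().toString());
            bundle.start();
        }

        // Block until the framework is shut down.
        framework.waitForStop(0);
    }
}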
Many Maven artifacts already contain OSGi metadata. Most of the time you can copy them to your bundle directory using the maven-dependency-plugin and load them directly as OSGi bundles.
Non-OSGi dependencies can be embedded in the bundles that need them. It should also be possible to set up a few Maven plugins to modify the manifest, adding metadata based on the Maven artifact IDs and versions, and to repack the dependencies as bundles (this won't work every time, though, since the Maven POM version and the packages' versions aren't always the same).
The user's code and any required dependencies can be bundled up using the maven-bundle-plugin, which can generate the manifest for you.
For REST interfaces I would usually recommend JAX-RS (Jersey or Apache CXF DOSGi), but I haven't used the programmatic approach with those frameworks yet.

Related

Creating an extensible Java library (JAR)

I'm looking to create a Java 'library' (JAR) that can be used within other projects, but I'd like my library to be extensible. I'm after something OSGi-esque, where my library has 'extension points' that other JARs can hook into. The thinking is that my library will have a core set of methods that will be called by the projects it's used in, but the exact implementation of these methods might vary based on the project/context.
I've looked into OSGi (e.g. Equinox), but I'm not sure it can be used in the way I'm after. It seems to be geared towards standalone apps rather than creating a library.
What would be the best way of achieving this? Can OSGi be used in this way, and if not are there frameworks out there that will do this?
I hope all that's clear - I have a clear idea of what I want, but it's hard to articulate.
OSGi is great, but I don't think this is what you need. By using OSGi services, you force the user of your library to use an OSGi environment, too.
I think, as @Peter stated, you can do this by simply extending classes of your library in the specific project/context.
But in case you do want to use OSGi, there is a simple way to achieve this: bundle fragments. A fragment lets you extend a so-called "host bundle" (i.e. your library) without altering the original library. A popular use case for this is platform-specific code in your bundles.
What you are calling a Java library is called a "bundle" in the OSGi context.
An OSGi bundle is a JAR file with some special meta-information in its MANIFEST.MF file. Every OSGi bundle can have Export-Package and/or Import-Package headers.
Through the Export-Package manifest header you declare which packages you are exporting, and another project can simply add the packages it wants to use to its Import-Package header.
Here's an example:
Bundle A manifest:
Export-Package: com.demo.exported
Bundle B manifest:
Import-Package: com.demo.exported;version="(1.0.0,2.0.0]"
This way your bundle B (a different project) can call methods from the classes in the package it imports from bundle A.
The version range in the Import-Package header simply states which package versions bundle B accepts. You can have two bundles providing two different implementations of some interfaces and export the package in two different versions; both will be available.
So far this covers static package wiring.
You can also expose your services dynamically through Declarative Services. In that case you define an XML file (a component definition) declaring which services your bundle exposes, and in the other bundle you define another XML file declaring which services it requires.
These are called provided services and referenced services.
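For illustration only: the same wiring the XML component definitions describe is nowadays commonly written with the standard Declarative Services annotations, from which bnd/maven-bundle-plugin generates that XML. A sketch, with Codec as a hypothetical service interface (in a real bundle each type would be a public class in its own file):

import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

// Hypothetical service interface shared between the bundles.
interface Codec {
    byte[] encode(byte[] input);
}

// Provider bundle: publishes an implementation of Codec as an OSGi service.
@Component(service = Codec.class)
class StandardCodec implements Codec {
    public byte[] encode(byte[] input) {
        return input; // trivial placeholder implementation
    }
}

// Consumer bundle: gets a Codec injected once one becomes available.
@Component
class CodecUser {
    @Reference
    private Codec codec;
}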
I think this gives you a rough idea of what can be done; if I have misinterpreted your problem anywhere, please say so.
Note: OSGi is of course also used for creating independent bundles that can be re-used in other projects; it brings modularity to your project.
As others have mentioned, you don't need OSGi or any framework for this. You can do it by employing patterns like the template method pattern or the strategy pattern. There are several other patterns for dynamically modifying/extending functionality, but these two seem to fit your description best, and they do not require any framework.
The benefit you would get from a framework like OSGi is that it manages the wiring for you. Normally you have to write some code that glues your libraries and the extensions together; with a framework like OSGi, this is automated with minimal overhead (in the case of OSGi, the overhead is some entries in the JAR manifest).
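A minimal strategy-pattern sketch with made-up names: the library defines the extension point as an interface and delegates the variable step to whatever implementation the host project supplies, no framework required.

// Extension point defined by the library.
interface CompressionStrategy {
    byte[] compress(byte[] data);
}

// Core library class; only the variable step is delegated.
class Archiver {
    private final CompressionStrategy strategy;

    Archiver(CompressionStrategy strategy) {
        this.strategy = strategy;
    }

    byte[] archive(byte[] data) {
        // Core library logic stays here.
        return strategy.compress(data);
    }
}

// Host project plugs in its own behaviour.
class NoOpCompression implements CompressionStrategy {
    public byte[] compress(byte[] data) {
        return data;
    }
}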

Best way for sharing of domain objects across Eclipse plugins?

I am making a set of Eclipse Plugins for the Eclipse Workbench.
I want these Eclipse plugins to communicate with each other through some shared data structures/managers.
Is there some bootstrapping or other initialization process wherein I can pass the shared domain objects to the plugins through their constructors (dependency injection)?
What is the standard and best practice for achieving sharing of data across plugins?
Eclipse is OSGi-based, using the Equinox runtime, and OSGi manages all of the runtime dependencies you need.
The simplest way is to deploy your common code as a bundle (plugin) and export all of the packages other plugins need (Export-Package header in MANIFEST.MF).
In the plugins that need those packages, declare them as imported packages (Import-Package in the MANIFEST.MF file).
If you want to go the extra mile, expose the managers as services, and add service consumers in the plugins that need them.
Here's a simple tutorial on using services:
http://www.knopflerfish.org/osgi_service_tutorial.html
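A rough sketch of the service approach using plain bundle activators; DomainModelManager and all class names below are hypothetical, and in a real plugin each class would be public in its own file (or you would use Declarative Services instead of activators):

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;

// Shared API exported by the common plugin (hypothetical).
interface DomainModelManager {
    void store(String key, Object value);
}

class DefaultDomainModelManager implements DomainModelManager {
    public void store(String key, Object value) { /* ... */ }
}

// Activator of the common plugin: publishes the shared manager as a service.
class CommonActivator implements BundleActivator {
    public void start(BundleContext context) {
        context.registerService(DomainModelManager.class, new DefaultDomainModelManager(), null);
    }
    public void stop(BundleContext context) {
        // Services registered by this bundle are unregistered automatically on stop.
    }
}

// Activator of a consuming plugin: looks the shared manager up instead of holding statics.
class ConsumerActivator implements BundleActivator {
    private ServiceReference<DomainModelManager> ref;

    public void start(BundleContext context) {
        ref = context.getServiceReference(DomainModelManager.class);
        DomainModelManager manager = context.getService(ref);
        manager.store("example", "value");
    }
    public void stop(BundleContext context) {
        context.ungetService(ref);
    }
}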

Spring jar auto-loading

My project uses a simple plugin mechanism based on multiple application contexts defined in plugin JARs. However, for this to work I have to include all of the plugin JARs on the classpath. It would be nice if Spring could automatically load the JARs and the components they contain on its own, for example from the 'plugins' subdirectory of my project.
Is there some solution for this?
I went a bit further and tried to solve this with Jar Class Loader (JCL).
Because I'm instantiating the Spring application context manually, I can do the following:
GenericApplicationContext ctx = new GenericApplicationContext();
// Load context definitions from plugin jars
JarClassLoader jcl = new JarClassLoader();
jcl.add("plugins/");
XmlBeanDefinitionReader classPathBeansReader = new XmlBeanDefinitionReader(ctx);
classPathBeansReader.setBeanClassLoader(jcl);
classPathBeansReader.setResourceLoader(new PathMatchingResourcePatternResolver(jcl));
classPathBeansReader.loadBeanDefinitions("classpath*:META-INF/my-plugins-*.xml");
However, this is not working. From Spring's log I can see that it doesn't read the XML definition in the plugin JAR. If I replace the bottom block with
XmlBeanDefinitionReader classPathBeansReader = new XmlBeanDefinitionReader(ctx);
classPathBeansReader.setBeanClassLoader(jcl);
classPathBeansReader.loadBeanDefinitions(new ClassPathResource("META-INF/my-plugins-somemodule.xml",jcl));
it finds and loads the XML definition file and beans from the JAR. However, this way I'm hardwiring the XML resource name for one plugin, which I don't want. How can I make the pattern matching work with JCL?
You might like to consider using OSGi as your plugin loading mechanism.
The Eclipse Virgo open source project provides an OSGi runtime environment that is suited to your project because it has Spring built in. Virgo offers Tomcat and Jetty based servers and a standalone kernel which can be used on its own or to construct other types of server. See the Virgo web site for features and benefits.
OSGi has quite a different design point than you may be used to in Java. It gives you controlled isolation between plugins, known as bundles, unlike a linear classpath. Bundles are wired together in a dependency graph and support versioning and dynamic life cycle operations.
The preferred means for a bundle to use the facilities of other bundles is via the OSGi service registry. The Spring DM project enables normal Spring beans to be published to the service registry and looked up from the service registry. Spring DM is also built in to Virgo. Spring DM has been donated to Eclipse as the Gemini Blueprint project.
To use Virgo, you would add some Spring DM configuration to each of your plugins in the META-INF/spring directory. This configuration, which is a normal XML Spring configuration file, can reference beans in your other Spring files and publish those beans in the service registry, or can provide beans for services looked up in the service registry which may then be referenced by, and injected into, beans in your other Spring files.
You would then deploy your plugins into Virgo using any of the supported mechanisms. You could simply drop them in dependency order into the pickup directory. Or you could use the web admin console or shell console to deploy them.
Alternatively, and this would seem to fit your requirement rather well, you could place plugins providing packages for other plugins in the Virgo repository by dropping them into repository/usr and then deploy the plugins which depend (transitively) on the repository plugins via the pickup directory or web admin console. Virgo will automatically deploy the dependencies from the repository as the dependent plugins are deployed.
You could also group plugins together either in an archive, known as a PAR, or by storing them in the Virgo repository and then referencing them in an XML file, known as a plan. You would then deploy the PAR or plan as described above. You can even put some of the dependencies in the Virgo repository and reduce the PAR or plan to contain just the dependent plugins.
If you would like further information about Virgo, just ask on the Virgo community forum.
It seems that JCL doesn't override ClassLoader#findResource(String)
JarClassLoader.java
AbstractClassLoader.java
PathMatchingResourcePatternResolver JavaDocs state:
Internally, this happens via a ClassLoader.getResources() call
The JavaDocs for ClassLoader#getResources(String) defer to the documentation for ClassLoader#findResource(String), which states:
Finds the resource with the given name. Class loader implementations should override this method to specify where to find resources.
So, while my answer is based on just reading a few bits of docs, I'd surmise that JCL doesn't support this because it doesn't override the documented methods.
You could verify this by subclassing JarClassLoader and implementing findResource(String).
Of course, I could be wildly wrong.
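Not part of the original answers, but if a plain URLClassLoader is an acceptable substitute for JCL, classpath*: matching does work, because URLClassLoader overrides findResource/findResources. A sketch under that assumption, reusing the question's plugins directory and XML pattern:

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.context.support.GenericApplicationContext;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

public class PluginContextLoader {
    public static GenericApplicationContext load(File pluginDir) throws Exception {
        // Build a standard URLClassLoader over every JAR in the plugins directory.
        List<URL> urls = new ArrayList<>();
        for (File jar : pluginDir.listFiles((dir, name) -> name.endsWith(".jar"))) {
            urls.add(jar.toURI().toURL());
        }
        URLClassLoader pluginLoader = new URLClassLoader(
                urls.toArray(new URL[0]), PluginContextLoader.class.getClassLoader());

        // Let Spring read bean definitions through that class loader.
        GenericApplicationContext ctx = new GenericApplicationContext();
        XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(ctx);
        reader.setBeanClassLoader(pluginLoader);
        reader.setResourceLoader(new PathMatchingResourcePatternResolver(pluginLoader));
        reader.loadBeanDefinitions("classpath*:META-INF/my-plugins-*.xml");
        ctx.refresh();
        return ctx;
    }
}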

Best Practice For Referencing an External Module In a Java Project

I have a Java project that expects external modules to be registered with it. These modules:
Implement a particular interface in the main project
Are packaged into a uni-jar (along with any dependencies)
Contain some human-readable meta-information (like the module name).
My main project needs to be able to load any of these external modules at runtime (e.g. using its own classloader). My question is: what's the best way of registering these modules with the main project? (I'd prefer to keep this vanilla Java and not use any third-party frameworks/libraries for this isolated issue.)
My current solution is to keep a single .properties file in the main project with key=name and value=class |delimiter| human-readable-name (or to coordinate two .properties files in order to avoid the delimiter parsing). At runtime, the main project loads the .properties file and uses any entries it finds to drive the classloader.
This feels hokey to me. Is there a better way to do this?
The standard approach in Java is to define a Service Provider.
Let all modules express their metadata via a standard XML file; call it "my-module-data.xml".
On startup, your main container looks for classpath*:my-module-data.xml (each of which can name a FrontController class) and delegates to the individual module's FrontController class to do whatever it wants :)
Also google for Spring-OSGi; their documentation can be helpful here.
Expanding on @ZZ Coder's answer...
The Service Provider pattern mentioned, and used internally within the JDK, is now a little more formalized in JDK 6 with ServiceLoader. The concept is further expanded on by the NetBeans Lookup API.
The underlying infrastructure is identical: both APIs use the same artifacts in the same way. The NetBeans version is just a more flexible and robust API (allowing alternative lookup services, for example, as well as the default one).
Of course, it would be remiss not to mention the dominant, more "heavyweight" standards: EJB, Spring, and OSGi.
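To make the ServiceLoader route concrete for this question, a sketch with a hypothetical Module interface that carries the human-readable name. Each module JAR registers its implementation in a text file under META-INF/services named after the interface's fully qualified name; in a real project each type would be a public class in its own file.

import java.util.ServiceLoader;

// Hypothetical extension-point interface defined in the main project.
interface Module {
    String getName();   // human-readable metadata
    void register();    // hook called by the main project
}

// In each module JAR: an implementation, plus a file
// META-INF/services/<fully.qualified.Module> containing the line
// with this implementation's fully qualified class name.
class ExampleModule implements Module {
    public String getName() { return "Example module"; }
    public void register()  { /* module-specific setup */ }
}

// In the main project: discover every registered module on the classpath
// (or on a classloader built over the module JARs).
class ModuleRegistry {
    public static void loadAll() {
        for (Module m : ServiceLoader.load(Module.class)) {
            System.out.println("Registering " + m.getName());
            m.register();
        }
    }
}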

Existing implementations of OSGi Configuration Admin Service?

We are considering using the Configuration Admin Service as the primary API for configuring components in our OSGi-based application. It would be nice if we could reuse an existing implementation, so I'm trying to investigate and evaluate the most popular ones. I know there are:
Apache Felix Config Admin (org.apache.felix.cm)
Equinox Config Admin (org.eclipse.equinox.cm)
Are there any other implementations to be considered?
Also, I was not able to find any good documentation for these implementations. I would mainly be interested in the implementation-specific details. For example, I was wondering how the different implementations persist the configuration data (e.g. multiple property files? an XML file? multiple XML files? a database? ...).
Felix's Configuration Admin has a default implementation that persists to the file system, but it defines a service interface (org.apache.felix.cm.PersistenceManager) for alternative backends that you can plug in instead.
The default implementation does the following:
The FilePersistenceManager class stores configuration data in properties-like files inside a given directory. All configuration files are located in the same directory.
Configuration files are created in the configuration directory by appending the extension ".config" to the PID of the configuration. The PID is converted into a relative path name by replacing enclosed dots with slashes. Non-symbolic-name characters in the PID are encoded with their Unicode character code in hexadecimal.
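For context, consuming the service looks the same through the standard API regardless of which implementation you choose; a rough sketch (the PID and property names are made up), after which the implementation persists the dictionary using its own storage back end (Felix via the PersistenceManager described above):

import java.util.Hashtable;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;
import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;

public class ConfigExample {
    public static void updatePipelineConfig(BundleContext ctx) throws Exception {
        // Look up the ConfigurationAdmin service provided by Felix/Equinox/Knopflerfish.
        ServiceReference<ConfigurationAdmin> ref = ctx.getServiceReference(ConfigurationAdmin.class);
        ConfigurationAdmin cm = ctx.getService(ref);

        // Get (or create) the configuration for a hypothetical PID and update it;
        // the implementation takes care of persisting it.
        Configuration config = cm.getConfiguration("com.example.pipeline", null);
        Hashtable<String, Object> props = new Hashtable<>();
        props.put("pipeline.name", "sentiment");
        config.update(props);
    }
}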
The three public implementations I know of are
Apache Felix
Equinox …source (this has moved recently)
Knopflerfish …front page and …source
Equinox's implementation of the ConfigurationAdmin service appears not to support fine control over the persistence policy, as Felix's does, and the Knopflerfish implementation looks similar to Equinox's (I've only read its source briefly).
The Felix one appears to be the most recently updated and the most reliable.
At present these are the only ones I can find; at dm Server we made the decision to use Felix's bundle, and this is now obtainable from the SpringSource Enterprise Bundle Repository, where you can quick-search for Apache Felix or ConfigAdmin.
Just to complete the answer further: I personally also prefer the Felix implementation. For an example of how to change the way storage occurs at the back end using a PersistenceManager, see also this implementation that uses standard Java property files as backing storage. It has some limitations, but it at least allows you to store your configuration with your application and apart from your OSGi framework implementation.
