I have a Java project that expects external modules to be registered with it. These modules:
Implement a particular interface in the main project
Are packaged into a uni-jar (along with any dependencies)
Contain some human-readable meta-information (like the module name).
My main project needs to be able to load at runtime (e.g. using its own classloader) any of these external modules. My question is: what's the best way of registering these modules with the main project (I'd prefer to keep this vanilla Java, and not use any third-party frameworks/libraries for this isolated issue)?
My current solution is to keep a single .properties file in the main project with key=name, value=class |delimiter| human-readable-name (or coordinate two .properties files in order to avoid the delimiter parsing). At runtime, the main project loads in the .properties file and uses any entries it finds to drive the classloader.
This feels hokey to me. Is there a better way to do this?
The standard approach in Java is to define a Service Provider.
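For example, a minimal sketch using JDK 6's ServiceLoader (Module, the com.example package, and ModuleRegistry are placeholder names, not anything standard): each module jar ships a provider-configuration file META-INF/services/com.example.Module whose content is the fully qualified name of its implementation class, and the main project discovers everything registered:

// The interface every module implements (lives in the main project).
public interface Module {
    String getName();   // human-readable meta-information
}

// At startup, the main project discovers every implementation that a
// module jar registered in its META-INF/services/com.example.Module file.
public class ModuleRegistry {
    public static void main(String[] args) {
        for (Module m : java.util.ServiceLoader.load(Module.class)) {
            System.out.println("Registered module: " + m.getName());
        }
    }
}

No delimiter parsing needed: the human-readable name comes from the interface itself.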
Let all modules express their metadata via a standard XML file; call it "my-module-data.xml".
On startup, your main container looks for classpath*:my-module-data.xml (each of which can name a FrontController class) and delegates to each individual module's FrontController class to do whatever it wants :)
Also, Google for Spring-OSGi; its documentation can be helpful here.
Expanding on #ZZ Coder...
The Service Provider pattern mentioned, and used internally within the JDK, is now a little more formalized in JDK 6 with ServiceLoader. The concept is further expanded upon by the NetBeans Lookup API.
The underlying infrastructure is identical. That is, both APIs use the same artifacts, the same way. The NetBeans version is just a more flexible and robust API (allowing alternative lookup services, for example, as well as the default one).
Of course, it would be remiss to not mention the dominant, more "heavyweight" standards of EJB, Spring, and OSGi.
I would like to take several lists of Maven dependencies from the user, resolve and load each of them as contained applications. Here are the steps:
collect a list of all Maven dependencies (DONE)
resolve all dependencies with Aether (DONE)
resolve classpath with Aether (DONE)
bundle the above in a separate "container" (so that different Maven dependencies with potentially conflicting versions can be used).
repeat with other lists.
To give some context: I want to use the above in the context of UIMA, to be able to run different (natural language processing) pipelines that rely on different sets of libraries with different versions. My goal is to create an annotation-server in which one defines (Maven) dependencies and pipelines that can be called in a RESTful way. The pipelines (and their corresponding dependencies) should each run in a contained classpath environment (so as to avoid classpath clashes).
Is OSGi the way to go? Based on a classpath (i.e., a list of resolved jars), can I then build an OSGi bundle and deploy it, all programmatically? I do not have control over the Maven dependencies (they are UIMA components, that's it), so there is no way to add OSGi metadata there.
Would maven-assembly-plugin combined with maven profiles take care of this for you?
You can filter dependencies differently on a per-profile basis. You can use profile-specific assembly descriptor documents and generate a custom manifest to be placed in the war. You are describing a J2EE web application (war) assembly -- they will run in a firewalled classloader inside a servlet container, so you can generate a bunch of them based on the same source (just vary the web app context and the contents of WEB-INF/lib on a per-profile basis).
Drop them into the same Tomcat server, for example, and you are ready to go. Was this what you meant?
You can certainly create a bundle that contains a list of jars, put all of those on the bundle's own classpath and deploy that bundle into an OSGi container. You probably do need to create a BundleActivator (which is the entry point for that bundle, like the main method is for traditional Java).
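A minimal activator sketch (the class name and log output are just illustrative); the bundle's manifest points at it via the Bundle-Activator header:

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {
    // Called when the container starts the bundle; kick off your pipeline here.
    public void start(BundleContext context) throws Exception {
        System.out.println("Started: " + context.getBundle().getSymbolicName());
    }

    // Called when the bundle is stopped; release resources here.
    public void stop(BundleContext context) throws Exception {
        System.out.println("Stopped");
    }
}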
You then say you have multiple such bundles; do I understand correctly that you want to deploy each bundle in a separate container? If so, you can either use some kind of REST library to provide a REST endpoint for each bundle, or you can use OSGi remote services to publish a service that can be discovered by other containers.
I am not sure if this is what you mean, so I am also not sure if OSGi is the right way to go. From your description you use neither services (a very important reason to use OSGi as that decouples parts of your application from each other) nor do you intend to create different bundles for the components (another important reason to use OSGi). You are almost describing an architectural style currently hyped as "micro services". Can you please elaborate a bit more?
Based on your use case I'd suggest you look into the Java ServiceLoader API. The ServiceLoader API allows you to define an interface, and load implementations of that interface from different self-contained JARs. You can build your different libraries into their own jars, exposing the methods you need via the interface, and load them from your Java program independently. The ServiceLoader will even list the different implementations available for you.
From the documentation:
Suppose we have a service type com.example.CodecSet which is intended to represent sets of encoder/decoder pairs for some protocol. In this case it is an abstract class with two abstract methods:
public abstract Encoder getEncoder(String encodingName);
public abstract Decoder getDecoder(String encodingName);
Each method returns an appropriate object or null if the provider does not support the given encoding. Typical providers support more than one encoding.
If com.example.impl.StandardCodecs is an implementation of the CodecSet service then its jar file also contains a file named
META-INF/services/com.example.CodecSet
This file contains the single line:
com.example.impl.StandardCodecs # Standard codecs
The CodecSet class creates and saves a single service instance at initialization:
private static ServiceLoader<CodecSet> codecSetLoader
    = ServiceLoader.load(CodecSet.class);
To locate an encoder for a given encoding name it defines a static factory method which iterates through the known and available providers, returning only when it has located a suitable encoder or has run out of providers.
public static Encoder getEncoder(String encodingName) {
    for (CodecSet cp : codecSetLoader) {
        Encoder enc = cp.getEncoder(encodingName);
        if (enc != null)
            return enc;
    }
    return null;
}
A getDecoder method is defined similarly.
That sounds a lot like you could use Apache Stanbol. It's a framework focused on semantic enhancement of content, but it can be used for any web-based workflow involving content. You can define pipelines to process and/or store your data. There are components for NLP using Apache Tika and OpenNLP. As far as I know you can also integrate UIMA. It uses RESTful services and is based on OSGi.
If Stanbol doesn't fit your use case and you need to roll your own application, I think OSGi is still the way to go.
Depending on your use case, you can either deploy bundles to a container or simply embed the OSGi framework in a small launcher app that loads the bundles you create.
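For the embedded route, a sketch using the standard OSGi launch API (the bundle path is a placeholder; this works with Felix, Equinox, or whichever FrameworkFactory is on the classpath):

import java.util.HashMap;
import java.util.ServiceLoader;
import org.osgi.framework.BundleContext;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;

public class Launcher {
    public static void main(String[] args) throws Exception {
        // Discover whichever OSGi framework implementation is on the classpath.
        FrameworkFactory factory =
            ServiceLoader.load(FrameworkFactory.class).iterator().next();
        Framework framework = factory.newFramework(new HashMap<String, String>());
        framework.start();

        // Install and start the bundle(s) you created.
        BundleContext context = framework.getBundleContext();
        context.installBundle("file:bundles/my-pipeline-bundle.jar").start();
    }
}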
Many Maven artifacts already contain OSGi metadata. Most of the time you can copy them to your bundle directory using the maven-dependency-plugin and load them directly as OSGi bundles.
Non-OSGi dependencies can be embedded in the bundles that need them. It should also be possible to set up a few Maven plugins to modify the manifest, adding some metadata based on the Maven artifact IDs and versions, and repack the dependencies as bundles (this won't work all the time, though, since the Maven pom version and the packages' versions aren't always the same).
The user's code and any required dependencies can be bundled up using the maven-bundle-plugin, which can generate the manifest for you.
For REST interfaces I would usually recommend JAX-RS (Jersey or Apache CXF DOSGi), but I haven't used the programmatic approach with those frameworks yet.
I had a situation where I needed to add a .dic file (DICOM: Digital Imaging and Communications in Medicine format) to a Swing Java application, and I had Dicom.jar to recognize the .dic extension.
Once I added Dicom.jar to the build path, the code ran normally; when I removed the jar from the build path, there weren't any errors in the code (I never used any of the classes included in Dicom.jar).
But I'm very confused: how could the code recognize Dicom.jar if it doesn't use any of its classes?
Thanks in advance :)
The Dicom library probably uses an approach called SPI (Service Provider Interface). This link provides a pretty good explanation.
This approach is used for images, but also for sound, and you can create your own SPI when needed.
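ImageIO makes this easy to see: adding a jar that registers an image-reader provider changes the output of a snippet like this, even though nothing in your code references the jar's classes:

import javax.imageio.ImageIO;

public class FormatList {
    public static void main(String[] args) {
        // Re-scan the classpath for registered image reader/writer providers.
        ImageIO.scanForPlugins();
        for (String format : ImageIO.getReaderFormatNames()) {
            System.out.println("Supported format: " + format);
        }
    }
}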
The key to understanding what is going on is "dynamic class loading". Something in the runtime libraries that your application uses is finding and loading classes from that JAR file.
This is most likely happening because Dicom provides code that conforms to the SPI mechanism used by one of the subsystems of the Java standard libraries that do this kind of thing. (And there are a few instances of SPIs in the Java libraries.)
Dicom could also be interacting with other 3rd-party code in other ways; e.g. via a 3rd-party SPI or old-fashioned "put the classname into a property file" mechanisms.
I have a solution split into two projects:
The independent project contains the interface ExampleInf and declares some services needed by the application. Those services are provided by a third-party API (the Hadoop client API). This project contains GUI components and other application logic, but it does not link the third-party libraries that provide the services declared by ExampleInf. There is no class implementing ExampleInf in this project.
The dependent project contains links to the third-party libraries. This project contains the class ExampleImpl, which encapsulates the third-party API and implements ExampleInf.
In the independent project there is a class (let's call it class A) that consumes (uses) the services declared by ExampleInf. Because the independent project does not link the dependent project, class A needs to load the implementation ExampleImpl dynamically at runtime in order to use ExampleInf. It also needs to dynamically load all the third-party libraries required by ExampleImpl.
Currently this is done with a bunch of constants (public static final String attributes) that contain paths to the dependent project's dynamically loaded resources, plus a lot of messy ClassLoader code. I do not consider this a good solution. Is there any pattern, best practice, or common way to do this? What would you recommend from your experience?
This pattern reminds me a bit of dependency injection in Java EE. At the least, I think it is a good idea to externalize the locations of the classes and libraries (.jars) that need to be loaded dynamically into XML and then load them all in a loop, instead of calling ClassLoader.loadClass separately for each constant. Is there a nice, clean way to load an XML file from the same package and then load the classes and jars it specifies? A code example would be much appreciated.
You can use the ServiceLoader utility to do this (this is how many of the JDK's own services are loaded, e.g. the XML libraries and modern JDBC driver libraries). If the dependent project is part of the classpath at startup, then you are good to go (assuming it is set up correctly). Otherwise, you would need to load the dependent project in a nested classloader and pass that to the load(Class, ClassLoader) method (or set the classloader as the current context classloader before calling load(Class)).
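A sketch of the nested-classloader variant, assuming the dependent project's jars (ExampleImpl plus the Hadoop client jars) sit in a known directory and the impl jar carries a META-INF/services entry for ExampleInf; the directory path is a placeholder, and both load variants mentioned above are shown:

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ServiceLoader;

public class ImplLoader {
    public static void main(String[] args) throws Exception {
        File dir = new File("dependent-project/lib");   // placeholder location
        File[] jars = dir.listFiles((d, name) -> name.endsWith(".jar"));
        URL[] urls = new URL[jars.length];
        for (int i = 0; i < jars.length; i++) {
            urls[i] = jars[i].toURI().toURL();
        }
        ClassLoader loader = new URLClassLoader(urls, ExampleInf.class.getClassLoader());

        // Variant A: pass the classloader explicitly.
        ServiceLoader<ExampleInf> impls = ServiceLoader.load(ExampleInf.class, loader);

        // Variant B: set it as the context classloader, then call the one-arg load.
        Thread.currentThread().setContextClassLoader(loader);
        ServiceLoader<ExampleInf> impls2 = ServiceLoader.load(ExampleInf.class);

        for (ExampleInf impl : impls) {
            System.out.println("Loaded implementation: " + impl.getClass().getName());
        }
    }
}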
I see many Java packages have api, impl and bundle jars (name-api.jar, name-impl.jar, name-bundle.jar). Could someone explain what those mean? Are all three needed by the app?
The idea is to separate the dependencies of the application, in an attempt to make applications more portable. You make the application depend on the api.jar when compiling; then, when you want to run the program, you can switch in the appropriate implementation jar (impl.jar) and the appropriate resource bundle jar (bundle.jar).
As an example suppose the library does some database interaction. You write your code so that it references the api.jar. Now suppose you need it to work with a specific type of database e.g. MySQL - you would then add the impl.jar that is specific to MySQL databases to the classpath to get it to work (if you need a different database later - you only need to switch that jar in the classpath).
The bundle.jar is a bit more obscure and not as common. It could be used to supply configuration settings for the library, for example language-specific settings or some more specific config. In the case of the database library, it might be that the implementation is designed for all versions of MySQL, and the resource bundle jar provides config files that allow it to work with a specific MySQL version.
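To make the database example concrete (every name here is invented for illustration, and ServiceLoader is just one common way to pick the implementation off the classpath):

// In api.jar: the contract your application compiles against.
public interface Database {
    void store(String key, String value);
}

// In the MySQL impl.jar (registered in META-INF/services/com.example.Database).
public class MySqlDatabase implements Database {
    public void store(String key, String value) { /* MySQL-specific calls */ }
}

// In the application: it depends only on api.jar; swapping the impl jar on
// the classpath swaps the database without recompiling.
public class App {
    public static void main(String[] args) {
        Database db = java.util.ServiceLoader.load(Database.class).iterator().next();
        db.store("greeting", "hello");
    }
}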
Often:
name-api.jar contains only the interface of the API.
name-impl.jar provides an implementation of all interfaces in the name-api.jar
name-bundle.jar bundles everything with all the needed classes to run a Java application.
api.jar contains API interfaces. These are interfaces as a contract that the implementation of the API should follow.
impl.jar is the implementation of the api.jar. You can't just have the impl.jar without the api.jar.
bundle.jar contains the resources (if I'm not mistaken): the resources the implementation code needs in order to run.
I've never seen such an arrangement.
If the designer packaged the app into three JARs, then I'd say all three are needed.
But you should recognize that it's just a choice made by the designer. It's possible that s/he could have just created a single JAR with everything in it and you'd be none the wiser.
I'm guessing now, but if you were to open those JARs you'd see only interfaces in the API JAR, implementations of those interfaces in the impl JAR, and resource bundles and other .properties files in the bundle JAR. Try it and see. You'll learn something.
I'd like to implement a dynamic plugin feature in a Java application. Ideally:
The application would define an interface Plugin with a method like getCapabilities().
A plugin would be a JAR pluginX.jar containing a class PluginXImpl implementing Plugin (and maybe some others).
The user would put pluginX.jar in a special directory or set a configuration parameter pointing to it. The user should not necessarily have to include pluginX.jar in their classpath.
The application would find PluginXImpl (maybe via the JAR manifest, maybe by reflection) and add it to a registry.
The client could get an instance of PluginXImpl, e.g., by invoking a method like getPluginWithCapabilities("X"). The user should not necessarily have to know the name of the plugin.
I've got a sense I should be able to do this with peaberry, but I can't make any sense of the documentation. I've invested some time in learning Guice, so my preferred answer would not be "use Spring Dynamic Modules."
Can anybody give me a simple idea of how to go about doing this using Guice/peaberry, OSGi, or just plain Java?
This is actually quite easy using plain Java means:
Since you don't want the user to configure the classpath before starting the application, I would first create a URLClassLoader with an array of URLs to the files in your plugin directory. Use File.listFiles to find all plugin jars and then File.toURI().toURL() to get a URL to each file. You should pass the system classloader (ClassLoader.getSystemClassLoader()) as a parent to your URLClassLoader.
If the plugin jars contain a configuration file in META-INF/services as described in the API documentation for java.util.ServiceLoader, you can now use ServiceLoader.load(Plugin.class, myUrlClassLoader) to obtain a service loader for your Plugin interface and call iterator() on it to get instances of all configured Plugin implementations.
You still have to provide your own wrapper around this to filter plugin capabilities, but that shouldn't be too much trouble, I suppose.
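Putting both steps together, a sketch under the assumptions above (a "plugins" directory, and plugin jars that register their implementations under META-INF/services):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ServiceLoader;

public class PluginFinder {
    public static void main(String[] args) throws Exception {
        // 1. Build a classloader over every jar in the plugin directory.
        File[] jars = new File("plugins").listFiles((d, n) -> n.endsWith(".jar"));
        URL[] urls = new URL[jars.length];
        for (int i = 0; i < jars.length; i++) {
            urls[i] = jars[i].toURI().toURL();
        }
        ClassLoader loader = new URLClassLoader(urls, ClassLoader.getSystemClassLoader());

        // 2. ServiceLoader instantiates whatever each jar registered
        //    under META-INF/services/<package>.Plugin.
        for (Plugin p : ServiceLoader.load(Plugin.class, loader)) {
            System.out.println("Plugin capabilities: " + p.getCapabilities());
        }
    }
}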
OSGi would be fine if you want to replace the plugins at runtime, e.g. for bugfixes in a 24/7 environment. I played with OSGi for a while, but it took too much time because it wasn't a requirement, and you need a plan B if you remove a bundle.
My humble solution then was to provide a properties file with the class names of the plugin descriptor classes and let the server call them to register themselves (including querying their capabilities).
This is obviously suboptimal, but I can't wait to read the accepted answer.
Any chance you can leverage the Service Provider Interface?
The best way to implement plug-ins with Guice is with Multibindings. The linked page goes into detail on how to use multibindings to host plugins.
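A minimal multibinding sketch (Plugin and PluginXImpl are the names assumed in the question):

import com.google.inject.AbstractModule;
import com.google.inject.multibindings.Multibinder;

public class PluginModule extends AbstractModule {
    @Override
    protected void configure() {
        // Every plugin module contributes its implementation to one Set<Plugin>.
        Multibinder<Plugin> plugins = Multibinder.newSetBinder(binder(), Plugin.class);
        plugins.addBinding().to(PluginXImpl.class);
    }
}

Consumers then inject the whole Set<Plugin> and filter it by capability, which maps naturally onto a getPluginWithCapabilities("X") lookup.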
Apologies if you already know this, but check out the forName method of Class. It is used, at least in JDBC, to dynamically load DBMS-specific driver classes at runtime by class name.
Then I guess it would not be difficult to enumerate all class/jar files in a directory, load each of them, and define an interface with a getCapabilities() method (or any name you choose) that returns their capabilities/description in whatever terms and format make sense for your system.
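A rough sketch of that approach (the jar path and class name are placeholders, the Plugin interface is the one assumed in the question, and exception handling is reduced to a throws clause):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class ReflectiveLoader {
    public static void main(String[] args) throws Exception {
        File jar = new File("plugins/pluginX.jar");
        ClassLoader loader = new URLClassLoader(new URL[] { jar.toURI().toURL() });
        // Load by name, the same way JDBC drivers were loaded via Class.forName.
        Class<?> cls = Class.forName("com.example.PluginXImpl", true, loader);
        Plugin plugin = (Plugin) cls.getDeclaredConstructor().newInstance();
        System.out.println(plugin.getCapabilities());
    }
}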