I have a solution split into two projects:
The independent project contains the interface ExampleInf, which declares some services needed by the application. Those services are provided by a third-party API (the Hadoop client API). This project contains GUI components and other application logic, but it does not link the third-party libraries that provide the services declared by ExampleInf. There is no class implementing ExampleInf in this project.
The dependent project links the third-party libraries. It contains the class ExampleImpl, which encapsulates the third-party API and implements ExampleInf.
In the independent project there is a class (let's call it class A) that consumes the services declared by ExampleInf. Because the independent project does not link the dependent project, it needs to load the implementation ExampleImpl dynamically at runtime in order to use ExampleInf. It also needs to dynamically load all the third-party libraries that ExampleImpl requires.
Currently this is done with a bunch of constants (public static final String attributes) that contain paths into the dependent project where the dynamically loaded resources are located, plus a lot of messy ClassLoader code. I do not consider this a good solution. Is there any pattern, best practice, or common way to do this? What would you recommend from your experience?
This pattern reminds me a bit of dependency injection in Java EE. At the very least, I think it is a good idea to externalize the locations of the classes and libraries (.jars) that need to be loaded dynamically into an XML file, and then load them all in a loop instead of calling ClassLoader.loadClass separately for each constant. Is there a clean way to load an XML file from the same package and then load the classes and jars it specifies? A code example would be much appreciated.
You can use the ServiceLoader utility to do this (this is how many JDK services are loaded, e.g. the XML libraries and modern JDBC driver libraries). If the dependent project is part of the classpath at startup, then you are good to go (assuming it is set up correctly). Otherwise, you would need to load the dependent project in a nested classloader and pass that to the load(Class, ClassLoader) method (or set that classloader as the current context classloader before calling load(Class)).
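A minimal sketch of that second case, assuming ExampleInf is your own interface and the dependent project's jars sit in some known directory (the class and method names here are illustrative, and ExampleInf is nested only to keep the example self-contained):

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ServiceLoader;

public class ExampleInfLocator {

    // Stand-in for the interface declared in the independent project.
    public interface ExampleInf { }

    // Builds a nested classloader over the dependent project's jars and asks
    // ServiceLoader for the first ExampleInf provider registered in them.
    public static ExampleInf loadImplementation(File[] jars) throws Exception {
        URL[] urls = new URL[jars.length];
        for (int i = 0; i < jars.length; i++) {
            urls[i] = jars[i].toURI().toURL();
        }
        ClassLoader nested = new URLClassLoader(urls, ExampleInfLocator.class.getClassLoader());
        for (ExampleInf impl : ServiceLoader.load(ExampleInf.class, nested)) {
            return impl; // first provider wins
        }
        throw new IllegalStateException("No ExampleInf provider found");
    }
}
```

In the real project, ExampleInf would be your top-level interface and ExampleImpl's jar would carry a META-INF/services file named after ExampleInf's fully qualified name, containing the single line with ExampleImpl's class name; no path constants or explicit loadClass calls are needed.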
Related
I am working on delivering a new feature for a Java-based enterprise application. A lot of work areas/projects will be adopting my feature, but I do not want us to import the jar containing the new feature and include it in the individual classpaths of all those projects. I have been wondering whether it would be possible to instantiate a Java object and use its methods without importing the containing jar into the project's classpath. The generic ways to instantiate a Java object (new, clone(), ApplicationContext.getBean/autowiring in the Spring framework, to mention a few) all generally expect the containing jar to be present in the context. Is there any Java-based framework that could facilitate this programming routine?
One working example is from Oracle ADF, where we can make use of an ApplicationModule in the data model. With the help of the ApplicationModule instance, we can access any view object (read: Java object) across any project in the workspace without importing the containing jar into the classpath.
I want to emulate a given type from a third-party library (GAE),
a Java class that is not supported by GWT:
com.google.appengine.api.datastore.GeoPt;
How do I emulate this class so GWT will support it? Where should I put the GeoPt.java file in my GWT app?
I cannot put it in the client path because its packaging is different from my app's. What could be the solution for this?
Further, I assume you have a module com.example.Example.gwt.xml.
I think you have two options. You can create a separate module, e.g. AppEngine.gwt.xml, with its source tag set to "api", and place it at the com.google.appengine level. Your module then needs to inherit it: <inherits name="com.google.appengine.AppEngine"/>. It can even be in the same project; one project may have multiple modules.
Another approach is for when you want, for example, to reimplement a class only in GWT while using the same one in pure Java. In your module you create a tag that points to a folder acting as a root for the replaced classes. So in your module you add <super-source path="appengine"/>, and then you put the class at com.example.appengine.com.google.appengine.api.datastore.GeoPt. You can read more on this under Organizing Projects, in the "Overriding one package implementation with another" section.
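As a sketch, the separate module from the first option might look like this (the file location and the "api" source path are assumptions about your layout):

```xml
<!-- com/google/appengine/AppEngine.gwt.xml (hypothetical location) -->
<module>
    <!-- Only sources under com/google/appengine/api are handed to the GWT compiler -->
    <source path="api"/>
</module>
```

Your own Example.gwt.xml then pulls it in with the inherits line shown above.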
I would like to take several lists of Maven dependencies from the user, resolve and load each of them as contained applications. Here are the steps:
collect a list of all Maven dependencies (DONE)
resolve all dependencies with Aether (DONE)
resolve classpath with Aether (DONE)
bundle the above in a separate "container" (so that different Maven dependencies with potentially conflicting versions can be used).
repeat with other lists.
To give some context: I want to use the above in the context of UIMA, to be able to run different (natural language processing) pipelines that rely on different sets of libraries with different versions. My goal is to create an annotation-server in which one defines (Maven) dependencies and pipelines that can be called in a RESTful way. The pipelines (and their corresponding dependencies) should each run in a contained classpath environment (so as to avoid classpath clashes).
Is OSGi the way to go? Based on a classpath (i.e. a list of resolved jars), can I build an OSGi bundle and deploy it, all programmatically? I do not have control over the Maven dependencies (they are UIMA components, that's it), so there is no way to add OSGi metadata to them.
Would the maven-assembly-plugin combined with Maven profiles take care of this for you?
You can filter dependencies differently on a per-profile basis. You can use profile-specific assembly descriptor documents and generate a custom manifest to be placed in the war. You are describing a J2EE web application (war) assembly: wars run in a firewalled classloader inside a servlet container, so you can generate a bunch of them from the same source (just vary the web app context and the contents of WEB-INF/lib on a per-profile basis).
Drop them into the same Tomcat server, for example, and you are ready to go. Was this what you meant?
HTH,
Nick
You can certainly create a bundle that contains a list of jars, put all of those on the bundle's own classpath and deploy that bundle into an OSGi container. You probably do need to create a BundleActivator (which is the entry point for that bundle, like the main method is for traditional Java).
You then say you have multiple such bundles; do I understand correctly that you want to deploy each bundle in a separate container? If so, you can either use some kind of REST library to provide a REST endpoint for each bundle, or use OSGi remote services to publish a service that can be discovered by other containers.
I am not sure if this is what you mean, so I am also not sure OSGi is the right way to go. From your description you use neither services (a very important reason to use OSGi, as they decouple parts of your application from each other) nor do you intend to create different bundles for the components (another important reason to use OSGi). You are almost describing an architectural style currently hyped as "microservices". Can you elaborate a bit more?
Based on your use case I'd suggest you look into the Java ServiceLoader API. The ServiceLoader API allows you to define an interface, and load implementations of that interface from different self-contained JARs. You can build your different libraries into their own jars, exposing the methods you need via the interface, and load them from your Java program independently. The ServiceLoader will even list the different implementations available for you.
From the documentation:
Suppose we have a service type com.example.CodecSet which is intended to represent sets of encoder/decoder pairs for some protocol. In this case it is an abstract class with two abstract methods:
public abstract Encoder getEncoder(String encodingName);
public abstract Decoder getDecoder(String encodingName);
Each method returns an appropriate object or null if the provider does not support the given encoding. Typical providers support more than one encoding.
If com.example.impl.StandardCodecs is an implementation of the CodecSet service then its jar file also contains a file named
META-INF/services/com.example.CodecSet
This file contains the single line:
com.example.impl.StandardCodecs # Standard codecs
The CodecSet class creates and saves a single service instance at initialization:
private static ServiceLoader<CodecSet> codecSetLoader
    = ServiceLoader.load(CodecSet.class);
To locate an encoder for a given encoding name it defines a static factory method which iterates through the known and available providers, returning only when it has located a suitable encoder or has run out of providers.
public static Encoder getEncoder(String encodingName) {
    for (CodecSet cp : codecSetLoader) {
        Encoder enc = cp.getEncoder(encodingName);
        if (enc != null)
            return enc;
    }
    return null;
}
A getDecoder method is defined similarly.
That sounds a lot like you could use Apache Stanbol. It's a framework focused on semantic enhancement of content, but it can be used for any web-based workflow involving content. You can define pipelines to process and/or store your data. There are components for NLP using Apache Tika and OpenNLP. As far as I know, you can also integrate UIMA. It uses RESTful services and is based on OSGi.
If Stanbol doesn't fit your use case and you need to roll your own application, I think OSGi is still the way to go.
Depending on your use case, you can either deploy bundles to a container or simply embed the OSGi framework in a small launcher app that loads the bundles you create.
Many Maven artifacts already contain OSGi metadata. Most of the time you can copy them to your bundle directory using the maven-dependency-plugin and load them directly as OSGi bundles.
Non-OSGi dependencies can be embedded in the bundles that need them. It should also be possible to set up a few Maven plugins to modify the manifest, adding metadata based on the Maven artifact IDs and versions, and to repack the dependencies as bundles (this won't always work, though, since the Maven POM version and the packages' versions aren't always the same).
The user's code and any required dependencies can be bundled up using the maven-bundle-plugin, which can generate the manifest for you.
For REST interfaces I would usually recommend JAX-RS (Jersey or Apache CXF DOSGi), but I haven't used the programmatic approach with those frameworks yet.
I have a Java project that expects external modules to be registered with it. These modules:
Implement a particular interface in the main project
Are packaged into a uni-jar (along with any dependencies)
Contain some human-readable meta-information (like the module name).
My main project needs to be able to load at runtime (e.g. using its own classloader) any of these external modules. My question is: what's the best way of registering these modules with the main project (I'd prefer to keep this vanilla Java, and not use any third-party frameworks/libraries for this isolated issue)?
My current solution is to keep a single .properties file in the main project with key=name and value=class |delimiter| human-readable-name (or to coordinate two .properties files in order to avoid the delimiter parsing). At runtime, the main project loads the .properties file and uses any entries it finds to drive the classloader.
This feels hokey to me. Is there a better way to this?
The standard approach in Java is to define a Service Provider.
Let all modules express their metadata via a standard XML file. Call it "my-module-data.xml".
On startup, your main container looks for classpath*:my-module-data.xml (each file can name a FrontController class) and delegates to each module's FrontController class to do whatever it wants :)
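The discovery step can be sketched with plain ClassLoader.getResources rather than Spring's classpath*: syntax (the descriptor file name my-module-data.xml is the hypothetical one from above):

```java
import java.net.URL;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;

public class ModuleScanner {

    // Finds every my-module-data.xml visible to the given classloader,
    // one per module jar that ships the descriptor at its root.
    public static List<URL> findModuleDescriptors(ClassLoader loader) throws Exception {
        List<URL> descriptors = new ArrayList<>();
        Enumeration<URL> urls = loader.getResources("my-module-data.xml");
        while (urls.hasMoreElements()) {
            descriptors.add(urls.nextElement());
        }
        return descriptors;
    }
}
```

The container would then parse each URL it gets back, read the FrontController class name out of the XML, and load that class via the same classloader.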
Also, Google for Spring-OSGi; their documentation can be helpful here.
Expanding on #ZZ Coder...
The Service Provider pattern mentioned there, and used internally within the JDK, is a little more formalized in JDK 6 with ServiceLoader. The concept is further expanded upon by the NetBeans Lookup API.
The underlying infrastructure is identical. That is, both APIs use the same artifacts in the same way. The NetBeans version is just a more flexible and robust API (allowing alternative lookup services, for example, as well as the default one).
Of course, it would be remiss to not mention the dominant, more "heavyweight" standards of EJB, Spring, and OSGi.
I'd like to implement a dynamic plugin feature in a Java application. Ideally:
The application would define an interface Plugin with a method like getCapabilities().
A plugin would be a JAR pluginX.jar containing a class PluginXImpl implementing Plugin (and maybe some others).
The user would put pluginX.jar in a special directory or set a configuration parameter pointing to it. The user should not necessarily have to include pluginX.jar in their classpath.
The application would find PluginXImpl (maybe via the JAR manifest, maybe by reflection) and add it to a registry.
The client could get an instance of PluginXImpl, e.g., by invoking a method like getPluginWithCapabilities("X"). The user should not necessarily have to know the name of the plugin.
I've got a sense I should be able to do this with peaberry, but I can't make any sense of the documentation. I've invested some time in learning Guice, so my preferred answer would not be "use Spring Dynamic Modules."
Can anybody give me a simple idea of how to go about doing this using Guice/peaberry, OSGi, or just plain Java?
This is actually quite easy using plain Java means:
Since you don't want the user to configure the classpath before starting the application, I would first create a URLClassLoader with an array of URLs to the files in your plugin directory. Use File.listFiles to find all plugin jars and then File.toURI().toURL() to get a URL to each file. You should pass the system classloader (ClassLoader.getSystemClassLoader()) as a parent to your URLClassLoader.
If the plugin jars contain a configuration file in META-INF/services as described in the API documentation for java.util.ServiceLoader, you can now use ServiceLoader.load(Plugin.class, myUrlClassLoader) to obtain a service loader for your Plugin interface and call iterator() on it to get instances of all configured Plugin implementations.
You still have to provide your own wrapper around this to filter plugin capabilities, but that shouldn't be too much trouble, I suppose.
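Putting the two steps together, a minimal sketch might look like this (Plugin is nested here only to keep the example self-contained; in the real application it would be your top-level interface):

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

public class PluginHost {

    // The application's plugin contract.
    public interface Plugin {
        String getCapabilities();
    }

    // Builds a classloader over every jar in the plugin directory and asks
    // ServiceLoader for all Plugin implementations registered in the jars'
    // META-INF/services files.
    public static List<Plugin> loadPlugins(File pluginDir) throws Exception {
        File[] jars = pluginDir.listFiles((dir, name) -> name.endsWith(".jar"));
        if (jars == null) {
            jars = new File[0]; // directory missing or unreadable
        }
        URL[] urls = new URL[jars.length];
        for (int i = 0; i < jars.length; i++) {
            urls[i] = jars[i].toURI().toURL();
        }
        ClassLoader loader = new URLClassLoader(urls, ClassLoader.getSystemClassLoader());
        List<Plugin> plugins = new ArrayList<>();
        for (Plugin p : ServiceLoader.load(Plugin.class, loader)) {
            plugins.add(p);
        }
        return plugins;
    }
}
```

A getPluginWithCapabilities("X") wrapper would then just iterate this list and match on each plugin's getCapabilities() result.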
OSGi would be fine if you want to replace plugins at runtime, e.g. for bugfixes in a 24/7 environment. I played with OSGi for a while, but it took too much time, because it wasn't a requirement, and you need a plan B if you remove a bundle.
My humble solution was to provide a properties file with the class names of the plugin descriptor classes and let the server call them to register (including querying their capabilities).
This is obviously suboptimal, but I can't wait to read the accepted answer.
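For what it's worth, the properties-file route can at least be kept tidy. A sketch, with a made-up key=value format of pluginId=descriptor-class-name:

```java
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

public class DescriptorRegistry {

    // Reads plugin descriptor class names from a properties file and loads
    // each class so the server can register it and query its capabilities.
    public static List<Class<?>> loadDescriptors(InputStream propsStream, ClassLoader loader)
            throws Exception {
        Properties props = new Properties();
        props.load(propsStream);
        List<Class<?>> descriptors = new ArrayList<>();
        for (Object value : props.values()) {
            descriptors.add(Class.forName((String) value, true, loader));
        }
        return descriptors;
    }
}
```

Pointing this at a plugins.properties stream yields the descriptor classes ready for registration.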
Any chance you can leverage the Service Provider Interface?
The best way to implement plug-ins with Guice is with Multibindings. The linked page goes into detail on how to use multibindings to host plugins.
Apologies if you already know this, but check out the forName method of Class. It is used, at least in JDBC, to dynamically load DBMS-specific driver classes at runtime by class name.
Then I guess it would not be difficult to enumerate all class/jar files in a directory, load each of them, and define an interface with a method getCapabilities() (or any name you choose) that returns their capabilities/description in whatever terms and format make sense for your system.
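The forName route in miniature (the class name passed in is just a stand-in for whatever your directory scan turns up):

```java
public class DynamicLoad {

    // Loads a class by name at runtime and instantiates it reflectively;
    // this is the same mechanism old-style JDBC code used via
    // Class.forName with a driver class name.
    public static Object instantiate(String className) throws Exception {
        Class<?> clazz = Class.forName(className);
        return clazz.getDeclaredConstructor().newInstance();
    }
}
```

After instantiation you would cast the object to your capabilities interface and call getCapabilities() on it.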