I am making a set of Eclipse Plugins for the Eclipse Workbench.
I want these Eclipse Plugins to communicate with each other through some shared
data structures/managers.
Is there some bootstrapping or other initialization process in which I
can pass the shared domain objects to the plugins through their constructors (dependency injection)?
What is the standard and best practice for achieving sharing of data across plugins?
Eclipse is OSGi-based, using the Equinox runtime. OSGi manages all of the runtime dependencies you need.
The simplest way is to deploy your common code as a bundle (plugin). Export all of the packages you want to share with other plugins (Export-Package header in MANIFEST.MF).
In the plugins that need those packages, declare them as imported packages (Import-Package in the MANIFEST.MF file).
If you want to go the extra mile, expose the shared managers as OSGi services, and add service consumers in the plugins that need them.
Here's a simple tutorial to using services:
http://www.knopflerfish.org/osgi_service_tutorial.html
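To make that concrete, here is a minimal sketch of the service approach, assuming a hypothetical SharedDataManager interface exported by a common plugin. None of the names below are Eclipse or OSGi APIs except the org.osgi.framework types; everything else is illustrative.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;

// Hypothetical shared interface, exported by the common plugin.
interface SharedDataManager {
    void put(String key, Object value);
    Object get(String key);
}

// Hypothetical implementation kept inside the common plugin.
class SharedDataManagerImpl implements SharedDataManager {
    private final Map<String, Object> data = new ConcurrentHashMap<String, Object>();
    public void put(String key, Object value) { data.put(key, value); }
    public Object get(String key) { return data.get(key); }
}

// Activator of the common plugin: publishes the manager as an OSGi service.
public class CommonActivator implements BundleActivator {
    public void start(BundleContext context) {
        context.registerService(SharedDataManager.class.getName(),
                new SharedDataManagerImpl(), null);
    }
    public void stop(BundleContext context) {
        // Services registered by this bundle are unregistered automatically when it stops.
    }
}

// Activator of a consuming plugin: looks the shared manager up from the service registry.
class ConsumerActivator implements BundleActivator {
    private ServiceReference reference;

    public void start(BundleContext context) {
        reference = context.getServiceReference(SharedDataManager.class.getName());
        if (reference != null) {
            SharedDataManager manager = (SharedDataManager) context.getService(reference);
            manager.put("greeting", "hello from another plugin");
        }
    }
    public void stop(BundleContext context) {
        if (reference != null) {
            context.ungetService(reference);
        }
    }
}

In a real workspace the interface and implementation would live in the common plugin (with only the interface's package exported), and each consumer would just import that package and resolve the service at runtime.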
I'm creating a small hobby Java task/todo application. I want to be able to write plugins for it, which will be stored in a directory somewhere, probably in a plugins directory next to the myapplication.jar.
I have some idea of how to load these plugins, and I want to write interfaces which the plugin creator can use, like SomeActionInterface, which, when implemented, allows the plugin to add functionality to SomeAction.
My question is, where does that SomeActionInterface go, and how would the plugin creator access said interface?
Does the interface go in the main myapplication.jar which the user should have loaded on their classpath, or does it go in a separate myapplication-plugininterfaces.jar?
Normally you would expose an SPI and API that plugin authors can use to implement their code. These classes are usually packaged as a separate JAR, which lets a plugin be built against a minimal dependency. A rough sketch follows the examples below.
There are some good examples of plugin architecture that you can explore:
JDBC exposes java.sql.Connection and related classes so that database projects can implement drivers for Java.
SLF4J handles new logger framework bindings as plugins. There are slf4j-api and slf4j-ext dependencies that are used to implement a binding.
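For the task application above, a hypothetical version of that layout could look like this; the method names and TaskContext type are invented purely for illustration.

// Contents of the API/SPI jar (e.g. myapplication-plugininterfaces.jar):
// only interfaces, so plugin authors compile against a minimal dependency.
public interface SomeActionInterface {
    // A short identifier for the action; purely illustrative.
    String name();
    // Called by the host application; TaskContext is a hypothetical callback interface.
    void execute(TaskContext context);
}

interface TaskContext {
    void addTask(String description);
}

// Contents of a plugin author's jar: it depends only on the SPI jar above.
class ReminderAction implements SomeActionInterface {
    public String name() {
        return "reminder";
    }
    public void execute(TaskContext context) {
        context.addTask("Water the plants");
    }
}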
I would like to take several lists of Maven dependencies from the user, resolve and load each of them as contained applications. Here are the steps:
collect a list of all Maven dependencies (DONE)
resolve all dependencies with Aether (DONE)
resolve classpath with Aether (DONE)
bundle the above in a separate "container" (so that different Maven dependencies with potentially conflicting versions can be used).
repeat with other lists.
To give some context: I want to use the above in the context of UIMA, to be able to run different (natural language processing) pipelines that rely on different sets of libraries with different versions. My goal is to create an annotation-server in which one defines (Maven) dependencies and pipelines that can be called in a RESTful way. The pipelines (and their corresponding dependencies) should each run in a contained classpath environment (so as to avoid classpath clashes).
Is OSGi the way to go? Based on a classpath (i.e. a list of resolved jars), can I then build an OSGi bundle and deploy it? All programmatically? I do not have control over the Maven dependencies (they are UIMA components, that's it), so there is no way to add OSGi metadata there.
Would the maven-assembly-plugin combined with Maven profiles take care of this for you?
You can filter dependencies differently on a per-profile basis. You can use profile-specific assembly descriptor documents and generate a custom manifest to be placed in the war. You are describing a J2EE web application (war) assembly: the wars will run in a firewalled classloader inside a servlet container, so you can generate a bunch of them from the same source (just vary the web app context and the contents of WEB-INF/lib on a per-profile basis).
Drop them into the same Tomcat server, for example, and you are ready to go. Was this what you meant?
You can certainly create a bundle that contains a list of jars, put all of those on the bundle's own classpath, and deploy that bundle into an OSGi container. You probably do need to create a BundleActivator (which is the entry point for that bundle, like the main method is for a traditional Java application).
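A BundleActivator can be as small as the sketch below; the class name and print statements are placeholders, and the class is referenced from the Bundle-Activator header in the bundle's manifest.

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

// Minimal entry point for the bundle, named in the Bundle-Activator manifest header.
public class PipelineActivator implements BundleActivator {

    public void start(BundleContext context) throws Exception {
        // Wire up the pipeline here, e.g. instantiate the UIMA components
        // that were packed onto this bundle's classpath.
        System.out.println("Started " + context.getBundle().getSymbolicName());
    }

    public void stop(BundleContext context) throws Exception {
        // Release any resources the pipeline holds.
        System.out.println("Stopped " + context.getBundle().getSymbolicName());
    }
}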
You then say you have multiple such bundles; do I understand correctly that you want to deploy each bundle in a separate container? If so, you can either use some kind of REST library to provide a REST endpoint for each bundle, or you can use OSGi Remote Services to publish a service that can be discovered by other containers.
I am not sure if this is what you mean, so I am also not sure if OSGi is the right way to go. From your description you use neither services (a very important reason to use OSGi as that decouples parts of your application from each other) nor do you intend to create different bundles for the components (another important reason to use OSGi). You are almost describing an architectural style currently hyped as "micro services". Can you please elaborate a bit more?
Based on your use case I'd suggest you look into the Java ServiceLoader API. The ServiceLoader API allows you to define an interface, and load implementations of that interface from different self-contained JARs. You can build your different libraries into their own jars, exposing the methods you need via the interface, and load them from your Java program independently. The ServiceLoader will even list the different implementations available for you.
From the documentation:
Suppose we have a service type com.example.CodecSet which is intended to represent sets of encoder/decoder pairs for some protocol. In this case it is an abstract class with two abstract methods:
public abstract Encoder getEncoder(String encodingName);
public abstract Decoder getDecoder(String encodingName);
Each method returns an appropriate object or null if the provider does not support the given encoding. Typical providers support more than one encoding.
If com.example.impl.StandardCodecs is an implementation of the CodecSet service then its jar file also contains a file named
META-INF/services/com.example.CodecSet
This file contains the single line:
com.example.impl.StandardCodecs # Standard codecs
The CodecSet class creates and saves a single service instance at initialization:
private static ServiceLoader<CodecSet> codecSetLoader
    = ServiceLoader.load(CodecSet.class);
To locate an encoder for a given encoding name it defines a static factory method which iterates through the known and available providers, returning only when it has located a suitable encoder or has run out of providers.
public static Encoder getEncoder(String encodingName) {
    for (CodecSet cp : codecSetLoader) {
        Encoder enc = cp.getEncoder(encodingName);
        if (enc != null)
            return enc;
    }
    return null;
}
A getDecoder method is defined similarly.
That sounds a lot like you could use Apache Stanbol. It's a framework focused on semantic enhancement of content, but it can be used for any web-based workflows involving content. You can define pipelines to process and/or store your data. There are components for NLP using Apache Tika and OpenNLP. As far as I know you can also integrate UIMA. It uses RESTful services and is based on OSGi.
If Stanbol doesn't fit your use case and you need to roll your own application, I think OSGi is still the way to go.
Depending on your use case, you can either deploy bundles to a container or simply embed the OSGi framework in a small launcher app that loads the bundles you create.
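For the embedded option, a bare-bones launcher using the standard OSGi launch API could look roughly like this. The bundle location and config values are placeholders; whichever framework implementation (Felix, Equinox, ...) is on the classpath gets picked up.

import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;

public class EmbeddedLauncher {

    public static void main(String[] args) throws Exception {
        // Discover whichever OSGi framework implementation is on the classpath.
        FrameworkFactory factory = ServiceLoader.load(FrameworkFactory.class).iterator().next();

        Map<String, String> config = new HashMap<String, String>();
        config.put("org.osgi.framework.storage.clean", "onFirstInit");

        Framework framework = factory.newFramework(config);
        framework.start();

        // Install and start one of the pipeline bundles; the path is a placeholder.
        BundleContext context = framework.getBundleContext();
        Bundle pipeline = context.installBundle("file:bundles/pipeline-a.jar");
        pipeline.start();

        // ... serve requests, then shut down cleanly.
        framework.stop();
        framework.waitForStop(0);
    }
}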
Many Maven artifacts already contain OSGi metadata. Most of the time you can copy them to your bundle directory using the maven-dependency-plugin and load them directly as OSGi bundles.
Non-OSGi dependencies can be embedded in the bundles that need them. It should also be possible to set up a few Maven plugins to modify the manifest, add some metadata based on the Maven artifact IDs and versions, and repack the dependencies as bundles (this won't work all the time though, since the Maven pom version and the packages' versions aren't always the same).
The user's code and any required dependencies can be bundled up using the maven-bundle-plugin. It can generate the manifest for you.
For REST interfaces I would usually recommend JAX-RS (Jersey or Apache CXF DOSGi), but I haven't used the programmatic approach with those frameworks yet.
I have two Java projects that are fairly independent apart from the fact that they share a common MySQL database.
I wanted to refactor these projects and extract everything regarding the common data layer. I am using jOOQ, so most of this layer gets autogenerated in my build. Besides that, I then have a few common entity classes that are used in both projects.
What would be the best practice to separate this, so that any change can be made in one place and still propagate to both projects? Create a third, simple Java project with the common code? What would you do?
I work on a distributed system, and multiple daemons need access to the same Postgres database via jOOQ. Since each daemon is its own Java project, I am in the same boat as you basically.
The solution I've been using is to create a third Java project as a Java Library. If you're using NetBeans you can just include it as a subproject dependency, and any changes to the library project can be recompiled into the individual application projects.
One thing of note: you'll need to specify the jOOQ library jars in all 3 projects. In NetBeans it's easy to specify a project's library directory and have multiple projects share these dependencies. NetBeans will copy the dependencies at deployment time.
Edit:
The steps are basically:
create a master layout for the system, i.e.:
/master-project/
/master-project/library
/master-project/software
/master-project/software/daemon1
/master-project/software/daemon2
/master-project/common
/master-project/common/utility1
/master-project/common/utility2
create third-party "library" bundles of {jar,src,docs} under /master-project/library.
create "application" projects under /master-project/software, making sure to tell Netbeans to only use third-party libraries under /master-project/library.
create "library" projects under /master-project/common, making sure to tell NB only to use third-party libraries under /master-project/library.
create a "library" for jOOQ code to be shared, as in step 4.
Each project is responsible for its own compile script (including generating jOOQ code, if desirable), and correctly specifying its dependencies out of /master-project/library, and /master-project/common.
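For the jOOQ part, the shared library project could expose something along these lines. This is only a sketch: the "task" table name is a placeholder, and in practice you would probably use the jOOQ-generated classes produced by the library's build rather than plain-SQL table references.

import java.sql.Connection;

import org.jooq.DSLContext;
import org.jooq.Record;
import org.jooq.Result;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;

// Lives in the shared library project and is compiled into both daemons.
public class SharedTaskRepository {

    private final DSLContext ctx;

    public SharedTaskRepository(Connection connection) {
        this.ctx = DSL.using(connection, SQLDialect.POSTGRES);
    }

    public Result<Record> fetchAllTasks() {
        // Plain-SQL style query; generated table classes would normally go here.
        return ctx.select().from(DSL.table("task")).fetch();
    }
}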
We are trying to develop a web application framework and build implementations on top of it. This framework will be versioned in SVN and live its own life in parallel to those implementations. It will have lots of Spring config files, security config, and so on. We would like to reuse those in the implementations.
What structure should such a project have? Keep everything together? Link particular folders (implementations) via svn:externals? We would like to use Maven and create an archetype for those implementations, but is it possible to update the archetype after it has been changed in the implementation applications?
This is a good example:
http://www.sonatype.com/books/mvnex-book/reference/web.html
Also, this book is a very useful resource when starting with Maven.
I found this as well:
http://www.avajava.com/tutorials/lessons/how-do-i-create-a-web-application-project-using-maven.html
I'd suggest you create your framework project as a simple jar project to include in your implementation, which would be war projects. For the Spring config files you have three options then:
Package them into your framework jar. This would make it hard for the implementations to customize it. I would not recommend it, unless your configuration is definitively fixed.
Use svn:externals. I don't have much experience with that, but I think dependencies between SVN repositories would be hard to manage.
Maintain these configuration files per implementation. An archetype would then help to get started with an initial configuration, and you maintain these configuration files as your framework evolves (see the sketch below). This is what we do most of the time. The good thing about Spring configuration is that it rarely needs to be touched once you are confident with it.
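If the implementations use Spring's Java-based configuration rather than XML, the same split (framework defaults plus per-implementation additions) could look roughly like this; the package and class names are hypothetical.

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;

// Shipped inside the framework jar: the framework's default beans.
@Configuration
@ComponentScan("com.example.framework")
class FrameworkConfiguration {
}

// Maintained in each implementation (war) project: pulls in the framework
// defaults and adds or overrides implementation-specific beans.
@Configuration
@Import(FrameworkConfiguration.class)
@ComponentScan("com.example.implementation")
public class ImplementationConfiguration {
}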
In Java, I can dynamically add things to the classpath and load classes ("dynamically" meaning without restarting my application). Is there a known framework/library which deals with dynamic loading/unloading of modules without a restart?
The usual setup, especially for web-apps, is load balancer, several instances of application, and gradual deployment and restart of new version. I'm looking for something else - application with several services/plugins, possibly single-instance desktop application, where disabling single service is cheap, but bringing down or restarting complete application is not feasible.
I'm thinking about typical plugin infrastructure, where plugins can be upgraded or installed without restarting application. Do I have to program that from scratch, or is something already available? Spring-compatible and opensource is a plus, but not a requirement.
You might consider running your Spring application in an OSGi framework.
I believe SpringSource dm Server is a module-based Java application server designed to run enterprise Java applications and Spring-powered applications, based on OSGi.
You can find more details in this Hello, OSGi, Part 2: Introduction to Spring Dynamic Modules article, in particular how to use Spring DM to dynamically install, update, and uninstall modules in a running system.
Note: when you speak about "plugins can be upgraded or installed without restarting the application", OSGi is the first candidate framework that comes to mind.
It is all about modularization of applications into smaller bundles.
Each bundle is a tightly-coupled, dynamically loadable collection of classes, jars, and configuration files that explicitly declare their external dependencies (if any).
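Underneath tools like Spring DM, the raw OSGi operations for installing, updating, and uninstalling a bundle at runtime are quite small; here is a sketch in which the bundle location string is a placeholder.

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleException;

public class ModuleManager {

    private final BundleContext context;

    public ModuleManager(BundleContext context) {
        this.context = context;
    }

    public Bundle install(String location) throws BundleException {
        // e.g. "file:plugins/reporting-1.0.jar"
        Bundle bundle = context.installBundle(location);
        bundle.start();
        return bundle;
    }

    public void upgrade(Bundle bundle) throws BundleException {
        // Re-reads the bundle from its original location.
        bundle.update();
    }

    public void remove(Bundle bundle) throws BundleException {
        bundle.stop();
        bundle.uninstall();
    }
}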
Perhaps the simplest approach is to load each plugin with its own class loader, then discard the class loader and create a new one to reload the plugin. You will want init() and destroy() methods in the plugin API to allow for startup/shutdown-type functionality.
This also has the advantage of isolating the plugins from each other.
A URLClassLoader is your starting point for this. The general idea is that you provide an XxxPlugin superclass that any plugin subclasses. Consider the example of Applet, which is essentially a GUI plugin (or MIDlet, etc.).
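Here is a minimal sketch of that class-loader-per-plugin idea, with a hypothetical Plugin superclass providing the init()/destroy() hooks mentioned above.

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Hypothetical plugin contract shipped with the host application.
abstract class Plugin {
    public abstract void init();
    public abstract void destroy();
}

// Loads each plugin jar in its own class loader so it can be discarded and reloaded.
public class PluginHost {

    public Plugin load(File pluginJar, String pluginClassName) throws Exception {
        URLClassLoader loader = new URLClassLoader(
                new URL[] { pluginJar.toURI().toURL() },
                Plugin.class.getClassLoader()); // parent exposes only the host/plugin API
        Class<?> pluginClass = loader.loadClass(pluginClassName);
        Plugin plugin = (Plugin) pluginClass.newInstance();
        plugin.init();
        return plugin;
    }

    public void unload(Plugin plugin) {
        plugin.destroy();
        // Dropping all references to the plugin and its URLClassLoader lets the
        // classes be garbage collected, so a fresh loader can pick up a new version.
    }
}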