How to write a pluggable application using CDI (Weld)?

I want to write a small SE application to run OS-specific commands. These commands are supplied to the main application as "plugins", so that new command implementations can be added at runtime. This is a hard requirement: no redeploy of the main application may be needed to execute new plugins.
So, I went about trying to setup something using CDI:
// On a common dependency
@Qualifier
@Retention(RUNTIME)
@Target({TYPE, METHOD, FIELD, PARAMETER})
public @interface Plugin {
    String value();
}

public interface Pluggable {
    void execute(PluginContext context);
}
A plugin implementation would be something like this (in a separate jar):
@Plugin("greeting")
public class GreetingPlugin implements Pluggable {

    public void execute(PluginContext context) {
        String greet = context.get("hello.world");
        System.out.println(String.format("Hello, %s", greet));
    }
}
And that works fine, when loaded using the following injection point, plus a select() call:
@Inject @Any Instance<Pluggable> plugin;
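For reference, here is a hedged sketch of how that select() call can pick a plugin by its qualifier value, using an AnnotationLiteral; the PluginLiteral helper and the usage lines below are illustrative additions, not part of the original code:

// illustrative helper: an AnnotationLiteral for the @Plugin qualifier
// (javax.enterprise.util.AnnotationLiteral ships with the CDI API)
public class PluginLiteral extends AnnotationLiteral<Plugin> implements Plugin {

    private final String value;

    public PluginLiteral(String value) {
        this.value = value;
    }

    @Override
    public String value() {
        return value;
    }
}

// usage at the injection point above:
Pluggable greeting = plugin.select(new PluginLiteral("greeting")).get();
greeting.execute(context);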
However, I wonder what the best approach is to add classes at runtime, so that dropping a new file into the "plugins" directory automatically registers it with the ClassLoader and the Weld container.
Any suggestions? Pitfalls I've not yet considered? My experience with CDI is rather limited, and perhaps it might not even be a suitable choice for this problem.
Disclaimer: OSGi is ruled out due to company licensing policy. Can't help on that front.

It seems to me that what you're looking for was requested as a feature for CDI 1.1, but it is very unlikely to make its way in even for CDI 2.0; see this JIRA issue. Several alternatives are discussed there that you might want to consider.
The simple answer is: no, CDI doesn't provide such functionality by itself. That said, assuming you can implement the dynamic class loading yourself, in an SE environment it is trivial to simply restart the CDI container, essentially re-configuring your application with the newly loaded plugins; see Bootstrapping CDI.
So you'd watch your /plugins directory for changes. A change would trigger the dynamic class loading and then a Weld restart. The dynamic class loading part can get hairy, so I'll let you figure that out.
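For concreteness, a minimal sketch of that approach, assuming Weld SE and the JDK 7 WatchService; the class name PluginDirectoryWatcher, the class-loader handling, and the exact Weld bootstrapping details are assumptions that vary with the Weld version:

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.*;
import java.util.ArrayList;
import java.util.List;

import org.jboss.weld.environment.se.Weld;
import org.jboss.weld.environment.se.WeldContainer;

public class PluginDirectoryWatcher {

    private Weld weld;

    public void run(Path pluginDir) throws Exception {
        restart(pluginDir); // initial bootstrap

        WatchService watcher = FileSystems.getDefault().newWatchService();
        pluginDir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);

        while (true) {
            WatchKey key = watcher.take(); // blocks until something lands in /plugins
            key.pollEvents();              // drain the events; we rescan the directory anyway
            restart(pluginDir);
            key.reset();
        }
    }

    private void restart(Path pluginDir) throws Exception {
        if (weld != null) {
            weld.shutdown();               // stop the previous container
        }
        List<URL> jars = new ArrayList<URL>();
        for (File file : pluginDir.toFile().listFiles()) {
            if (file.getName().endsWith(".jar")) {
                jars.add(file.toURI().toURL());
            }
        }
        // each plugin jar needs a META-INF/beans.xml to be treated as a bean archive
        ClassLoader pluginLoader =
                new URLClassLoader(jars.toArray(new URL[0]), getClass().getClassLoader());
        // assumption: Weld SE discovers bean archives via the thread context class
        // loader; depending on the version you may have to configure the deployment
        Thread.currentThread().setContextClassLoader(pluginLoader);
        weld = new Weld();
        WeldContainer container = weld.initialize();
        // plugins are then reachable via container.instance().select(Pluggable.class)
    }
}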
Hope this helps.

Related

Best practices for runtime-only dependencies in OSGi

In line with the Open-Closed Principle, I typically design my Java packages and libraries in such a way that there is a generic "interface" or "API" package/library and one or more implementations (quite similar to many common APIs like JDBC or JAXP/SAX).
To locate an implementation (or sometimes multiple implementations) in the base API library without violating OCP, I commonly use Java's ServiceLoader mechanism, or occasionally classpath scanning via third-party libraries like ClassGraph or Reflections. From a Maven perspective, the implementations are brought in as runtime dependencies (as they're only needed at execution time, but not at compile time). Pretty standard stuff.
So, now, I want to make some of these packages available as OSGi bundles (with API and implementation in separate bundles), but since in OSGi each bundle has its own class loader, neither classpath scanning nor the ServiceLoader API will work for this purpose. At first glance, OSGi's "fragment" mechanism seems to be the closest equivalent to the plain-Java setup described above. In that scenario, the API bundle would be the "fragment host", and concrete implementations would attach as fragments to that host bundle. As the fragment host and all its attached fragments use the same class loader, the standard plain-Java mechanisms like ServiceLoader or ClassGraph would conceivably still work. This would also have the advantage that there would be no need to detect whether a library/bundle is running in an OSGi context or not, and no OSGi framework dependencies are needed.
So, in a nutshell, my question is: are fragments the correct way to implement runtime-only dependencies in OSGi or is there a better (or more standard) way? Preferably, I'm looking for a solution that works in an OSGi container but does not require a dependency on OSGi itself.
No. Fragments are almost always wrong outside of translations (localization fragments). The OSGi model is to use services.
The way to go, then, is to use Declarative Services (DS). Using bnd (in Maven, Gradle, Ant, sbt, or Bndtools) you can create components. A component is a Plain Old Java Object (POJO) annotated with injection and activation instructions. You can make those components take all their dependencies in the constructor.
bnd uses the annotations to generate an XML file that is read at runtime to create, activate, inject, and register those components. This works out of the box in an OSGi framework. The annotations are build-time only, so they do not add dependencies to your runtime.
In your non-OSGi environment, you are responsible for calling that constructor yourself: gather the dependencies using the ServiceLoader and then construct the components in the right order, as sketched after the example below.
@Component
public class MyComponent implements Foo {

    final Bar bar;

    @Activate
    public MyComponent(@Reference Bar bar) {
        this.bar = bar;
    }
    ...
}
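For the non-OSGi path mentioned above, here is a hedged sketch of the manual wiring, assuming the Bar implementation jar ships a standard META-INF/services entry for Bar; the class PlainJavaWiring is illustrative, while Foo, Bar and MyComponent are the types from the snippet above:

import java.util.ServiceLoader;

public class PlainJavaWiring {

    public static Foo createFoo() {
        // outside OSGi there is no DS runtime, so gather the dependency yourself...
        Bar bar = ServiceLoader.load(Bar.class).iterator().next();
        // ...and call the injection constructor in the right order by hand
        return new MyComponent(bar);
    }
}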

How to handle Spring beans required by library?

I am refactoring an application that uses Spring, by moving some shared components (a webservice client) into a library. The components cannot work on their own, so they still need some beans from the application that uses the library. What is the best practice for this?
I have created a @Configuration class so the application only needs to @Import it to use the library, but the application also needs to supply a Jackson ObjectMapper and a Settings object describing how to contact the webservice. I autowire the ObjectMapper and Settings beans into various beans used in the library. The application uses the library by injecting the Client into its code and calling it.
This works, but I'm not sure it's the right style. As I develop the library, IntelliJ IDEA marks the beans the library injects with a red underline because they don't exist, which is true: they don't exist in the library itself. But normally when I see red all over files that cannot be resolved, that tells me maybe I'm not doing it the right way.
The library needs to be used with applications using Spring 3 and 5. What is the proper way for a library to ask for things like ObjectMapper when it's not appropriate to define its own (because the app will already have Jackson), and "proprietary" beans like the Settings object?
Your question is a bit broad, but hopefully I can give you a hint in the right direction.
The components cannot work on their own so still need some beans from the application that uses the library. What is the best practice for this?
First: these components should depend on interfaces rather than on concrete beans.
Second: a reusable library typically needs some configuration that cannot be part of the library itself, because it depends on the application that uses the library.
Third: because of the second (and first) point, your library should not rely on any form of autowiring; it should rely on explicit (or default) configuration.
And that solves the problem: use interfaces plus an explicit configuration of your library in your application, and add good documentation of the required configuration to your lib.
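As a hedged illustration of that advice (every name below is made up for the example): the library depends only on an interface it defines, and the application supplies the implementation through its own explicit configuration:

// --- library code: depends on an interface, no autowiring, no Spring required ---
public interface EndpointSettings {
    String baseUrl();
}

public class WebserviceClient {

    private final EndpointSettings settings;

    public WebserviceClient(EndpointSettings settings) {
        this.settings = settings;
    }
}

// --- application code: explicit configuration of the library ---
@Configuration
class WebserviceClientConfig {

    @Bean
    WebserviceClient webserviceClient() {
        return new WebserviceClient(() -> "https://example.org/api");
    }
}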
Using inspiration from @Kayaman and @Ralph, I decided it's not appropriate to expose Spring as part of a library to be used directly from the application's context. I realize now it's also not appropriate because the library could define duplicate "private" beans it did not want to expose. I was overthinking it. If I wanted to use Spring, I found out I could do this:
public class Factory {

    public static Client createClient(ObjectMapper mapper, Settings settings) {
        DefaultListableBeanFactory beanFactory = new DefaultListableBeanFactory();
        beanFactory.registerSingleton("mapper", mapper);
        beanFactory.registerSingleton("settings", settings);

        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(beanFactory);
        ctx.registerBean(ClientConfiguration.class);
        ctx.refresh();
        return ctx.getBean(Client.class);
    }
}
Essentially, it's OK to use Spring as an implementation detail. Since the configuration I made exposed only one bean, it makes sense as a factory method. In the application I would create a method like this:
@Bean
public Client makeClient(ObjectMapper mapper, Settings settings) {
    return Factory.createClient(mapper, settings);
}
Of course the @Bean method would have the ObjectMapper and Settings injected from the application's context, or they could be constructed inline.
In the end, though, since the client had few enough beans and none were lazy, I removed the Spring annotations entirely and built the object graph by hand, in almost as much code as the Spring context took. Now the library has the benefit of not requiring Spring at all at runtime, which matters in a non-Spring application.
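A hedged sketch of what that hand-built object graph might look like; the internal collaborator types (RequestMarshaller, HttpTransport) are invented here because the question does not name them:

public class Factory {

    public static Client createClient(ObjectMapper mapper, Settings settings) {
        // plain constructor calls instead of a Spring context;
        // the collaborators below are hypothetical stand-ins
        RequestMarshaller marshaller = new RequestMarshaller(mapper);
        HttpTransport transport = new HttpTransport(settings);
        return new Client(marshaller, transport);
    }
}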

Play 2.4 dependency injection forgets about builtin classes in unit tests if modules are added

We're migrating our Java-only Play application from Play 2.4 to Play 2.5. First step: get rid of GlobalSettings, still completely in the 2.4 realm. I wrote a StartModule which will take over that functionality, as the migration guide and "the internet" describe. I add
play.modules.enabled += "de.[...].modules.StartModule"
to the application's .conf file. Executing this via sbt run or sbt start works as expected. Massive problems, however, arise when I try to unit-test this with sbt test or sbt test-only.
We have a rather elaborated unit test setup as the application is complex and has large legacy parts. Eventually, the unit test instance of the Play server is started with
Helpers.start(testserver = Helpers.testServer(playhttpport,
        app = new GuiceApplicationBuilder()
                .configure(getConfiguration())
                .build()));
This works as long as the play.modules.enabled line above is not visible to the unit test code. As soon as I enable it, I get a number of errors like
Test de.[...]Tests failed: com.google.inject.CreationException:
Unable to create injector, see the following errors:
1) No implementation for play.inject.ApplicationLifecycle was bound.
   while locating play.inject.ApplicationLifecycle
or
2) Could not find a suitable constructor in play.api.Environment.
   Classes must have either one (and only one) constructor annotated with @Inject
   or a zero-argument constructor that is not private.
Same thing happens if I remove the play.modules.enabled line and change the server start to
Helpers.start(testserver = Helpers.testServer(playhttpport,
        app = new GuiceApplicationBuilder()
                .load(Guiceable.modules(new StartModule()))
                .configure(getConfiguration())
                .build()));
In my limited understanding, it seems that GuiceApplicationBuilder (or whatever) "forgets" about all built-in dependency injection configuration as soon as any additional module definitions are given. Unfortunately, I have not found any applicable postings here or anywhere else that would lead me to a solution.
Questions:
Is my analysis correct?
How can I make my unit test code functional with the additional module in the DI framework?
Would it be helpful to move directly on to Play 2.5? I'd like to solve this problem first, as that migration step will bring its own plethora of things to handle, and I'd really like to have a functional base for it - including an operational unit test framework...
Any insight and help greatly appreciated!
Kind regards,
Dirk
Update: This is StartModule:
public class StartModule extends AbstractModule {
    @Override
    protected void configure() {
        bind(InnerServerLauncher.class).asEagerSingleton();
    }
}
And this is the InnerServerLauncher:
@Singleton
public class InnerServerLauncher {

    @Inject
    public InnerServerLauncher(ApplicationLifecycle lifecycle,
                               Environment environment) {
        System.err.println("*** Server launcher called ***");
    }
}
I should add that the problem also arises if I put a totally different class into play.modules.enabled like
play.modules.enabled += "play.filters.cors.CORSModule"
Ok, I finally got it. The problem is the getConfiguration() method which I mentioned above but did not elaborate on. As I said, we have quite some legacy in our system, so we have a mechanism which constructs the configuration for the unit tests independently of Play's config files. GuiceApplicationBuilder.configure() (and, by the way, also the fakeApplication()-based methods) merges this with the Play-internal configuration, but only at the topmost layer. For plain settings (strings, numbers etc.) that's fine, but for value lists it means that the complete list is overwritten and replaced.
play.modules.enabled is used by Play internally to gather the default modules which have to be registered with the dependency injection framework. The documentation states very clearly that statements in application.conf must only add elements to play.modules.enabled, i.e.
play.modules.enabled += "package.Module"
Our "special way" of constructing the configuration environment for unit tests, however, overwrote Play's own play.modules.enabled as soon as we set any value of our own for it. And that destroyed Play's complete dependency injection setup, as none of its own built-in modules were registered any more. Bummer!
I solved this by using a "real" configuration file, which is read normally by GuiceApplicationBuilder and which contains those play.modules.enabled += ... lines. As this config file is still artificially and temporarily generated for the unit test scenario, I pass its name to GuiceApplicationBuilder via System.setProperty:
System.setProperty("config.file", conffilename);
Helpers.start(testserver = Helpers.testServer(playhttpport,
        app = new GuiceApplicationBuilder().build()));
Now the configuration is created correctly, with the internal default settings for play.modules.enabled, and I can finally start to actually move the code from GlobalSettings into that injected, eagerly loaded module. And it was only ten hours of hunting...
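For illustration, a hedged sketch of that temporary-config-file trick; the file contents, the placeholder package name and the helper setting are assumptions, and only the config.file property is Play's own:

// write the test configuration to a real HOCON file so that
// play.modules.enabled is only appended to, never replaced
Path conffile = Files.createTempFile("test-application", ".conf");
Files.write(conffile, Arrays.asList(
        "play.modules.enabled += \"com.example.modules.StartModule\"", // placeholder package
        "my.test.setting = 42"),                                       // placeholder setting
        StandardCharsets.UTF_8);
System.setProperty("config.file", conffile.toString());

Helpers.start(testserver = Helpers.testServer(playhttpport,
        app = new GuiceApplicationBuilder().build()));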

Forcing external java (jar project) library beans to be managed by Spring fabric

I have two projects: the first is a Spring Boot app, and the second is a static Java library project that does not depend on anything except Java itself. In the past the two were one project, but I separated them since they represent two different logical components, and the Java library is used in other sub-projects as well. Now, since the static library is simply a jar, I cannot instantiate its classes based on an interface name the way we do in Spring. For example, if my interface were located at:
edu.university.ServiceLayer.StudentInterface
I could simply do:
Student object = (Student) applicationContext.getBean("StudentInterface");
and that gives me the Student object.
Now I would like to do the same with the external Java library. Since I have never done this, my question is: what would be the best way to do it, keeping in mind that I would like the library to depend on nothing except Java?
In my Spring Boot project, to achieve that I simply need to annotate the selected bean with the correct stereotype, i.e. @Component, @Repository, @Service etc., and those beans are then automatically managed by the Spring container. I can then see them by printing the bean definition names of the applicationContext, i.e.
String[] beanNames = applicationContext.getBeanDefinitionNames();
But that trick does not work with an external jar. Now, what would be the best compromise for this constellation, i.e. shall I really add a Spring dependency to my Java library, or is there some magical way to do it without adding that dependency? Again, my goal is to allow Spring to manage selected beans of the external library, i.e. I would like them to appear under:
applicationContext.getBeanDefinitionNames()
Is there any pattern-like way that is used to accomplish this?
Many thanks for the ideas.
This doesn't relate to Spring Integration at all.
Plus, you should read more of the documentation.
Any class available on the CLASSPATH can be instantiated as a bean in the Spring container, e.g.
@Bean
public Foo foo() {
    return new Foo();
}
This works when that Foo is in your jar; there is no need to modify such classes with stereotype annotations.
BTW, to be even clearer, you can create beans even for JDK classes:
@Bean
@Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public Date now() {
    return new Date();
}

Is there an alternative to loading a class with Class.forName()?

In my library's code, I load class names from an XML file using JAXB, in order to instantiate them later on using Class.forName(). A fictive example to illustrate the case:
public void libraryMethod() throws Exception {
    List<String> classNames = loadClassNamesFromXML();
    for (String className : classNames) {
        Class.forName(className).newInstance().doThings();
    }
}
Now, some users use OSGi to configure their applications, and they load my library with a different class loader than the classes they configure using my XML structure. This means that loading may fail, as the class cannot be found.
Is there any more reliable way to load such classes? Or is there any other way to configure these instances? I'm open to suggestions that result in this:
public void libraryMethod() throws Exception {
    // Spring does things like this:
    List<SomeType> instances = loadInstancesFromXML();
    for (SomeType instance : instances) {
        instance.doThings();
    }
}
Some constraints:
From a library perspective, the lifecycle of those instances is unimportant. If they have (user) state, my library would not notice.
I want to keep things simple in this library, so I want to avoid creating external dependencies on configuration frameworks like Spring, for instance. So I'm only interested in solutions that can be achieved with standard JDK 6+ distributions.
I would really like to keep my simple XML configuration file (slight adaptations to the XML structure are OK).
If you want your library to be flexible and work in both OSGi and non-OSGi environments, you should allow users to provide their own ClassLoaders, or let them tell your library which class names they have. Read Neil Bartlett's blog post.
Original link returns a 404. You can access the article on the Wayback Machine.
Thanks for the explanation. Spring would not make this any easier in OSGi: you cannot simply inject an implementation class from a package you do not import. In OSGi you typically use OSGi services to inject implementations that originate outside your bundle and are unknown to you at compile time.
So your user would implement an interface you specify and publish the implementation as an OSGi service. You could then either pick up all such services or let the user specify an LDAP filter for the service in the XML config.
The advantage of this approach is that you do not have to load classes or care about class loaders, so it is the recommended way in OSGi. If you want the same solution both inside and outside OSGi, Ivan's approach of specifying a classloader + classname is an alternative; a sketch follows.
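A hedged sketch of that classloader + classname alternative; the ClassLoader parameter is the only addition to the libraryMethod() from the question, and is purely illustrative:

public void libraryMethod(ClassLoader loader) throws Exception {
    List<String> classNames = loadClassNamesFromXML();
    for (String className : classNames) {
        // resolve against the caller-supplied loader instead of the library's own
        Object instance = Class.forName(className, true, loader).newInstance();
        ((SomeType) instance).doThings();
    }
}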
In general, in OSGi you should use a service. The reason the Class.forName/XML configuration style is so popular is that only a single class gets control; to configure the rest, it needs to know which classes to initialize and call.
In OSGi this problem does not exist, since each module (bundle) can get control through Declarative Services (or, in the old-fashioned way, through an activator). So in OSGi you have a peer-to-peer model: anybody can register a service and depend on other services.
So instead of specifying class names and assuming they are globally unique (they are not, in large systems), it is a lot easier to use services and never leave the Java compiler; those class names are very error prone. In general this means you often just register your service and wait to be called, since there is no need to initialize your clients. However, the whiteboard pattern addresses the situation where you do want to find out about your clients (with Bndtools and bnd annotations):
The "server"
@Component
public class MyLib {

    @Reference(type = '*')
    void addSomeType(SomeType st) {
        st.doThings();
    }
}
The client
@Component
public class MyClient implements SomeType {
    public void doThings() { ... }
}
Hope this helps.
JDBC 4 drivers include a META-INF/services/java.sql.Driver file in the jar, which uses the service provider mechanism to register the Driver implementation with the JVM (see the java.util.ServiceLoader javadocs). Having the driver on the class path registers it automatically, obviating the need to use Class.forName; instead, application code uses ServiceLoader.load to discover the registered drivers. The same mechanism can be used for other configuration; perhaps something like that could work here? As an aside, when registering one's own implementations with the service provider mechanism, using an annotation library like spi looks pretty convenient.
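A hedged sketch of how that could look for the library above, assuming the configured instances all implement the SomeType interface from the earlier snippet and each implementation jar ships a META-INF/services file naming its classes:

public void libraryMethod() {
    // java.util.ServiceLoader discovers every implementation listed in
    // META-INF/services/<fully qualified name of SomeType>
    for (SomeType instance : ServiceLoader.load(SomeType.class)) {
        instance.doThings();
    }
}

Note that inside OSGi a plain ServiceLoader call is still subject to the same per-bundle class-loader visibility, so on its own this mainly helps the non-OSGi case.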
