We have a system in which the implementations of our interfaces are meant to be consumed from a separate jar. The scenario is that clients consume our work and provide their own implementations to override the default ones.
The question is what is the best way to bind/wire the actual implementation classes into our system?
One way is to let Spring wire the dependencies. That is currently not an option, since not all clients use Spring.
I looked into some options, such as resolving interface implementation classes using reflection, but I'm not very happy with that solution.
Another good old option is to expose the implementation class name as a configurable property and let clients set it. That looks reasonable.
But I wanted to find a more elegant option if one is available.
Also, any idea how SLF4J / EL resolve their implementations automatically?
I'd suggest you use SPI (Service Provider Interface).
It requires creating a file that enumerates all available implementations of a specific service, which may be annoying. Fortunately, you can use this open source library that does the work for you: http://code.google.com/p/spi/
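For illustration, here is a minimal sketch of that lookup with plain java.util.ServiceLoader (MyService and DefaultMyService are made-up names standing in for your interface and your bundled default): the client jar ships a META-INF/services file named after the interface and containing its implementation class, and your code falls back to the default when nothing is registered.

import java.util.Iterator;
import java.util.ServiceLoader;

// Hypothetical names for illustration only.
interface MyService {
    String describe();
}

class DefaultMyService implements MyService {
    @Override public String describe() { return "built-in default"; }
}

public final class MyServiceLookup {
    public static MyService resolve() {
        // ServiceLoader scans every jar on the classpath for a
        // META-INF/services/<fully qualified name of MyService> file listing
        // an implementation class; a client jar that ships such a file
        // effectively overrides the default below.
        Iterator<MyService> it = ServiceLoader.load(MyService.class).iterator();
        return it.hasNext() ? it.next() : new DefaultMyService();
    }
}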
Perhaps the Reflections library is what you are looking for.
Reflections scans your classpath, indexes the metadata, allows you to query it at runtime, and can save and collect that information for many modules within your project.
Using Reflections you can query your metadata for things such as (see the sketch after this list):
get all subtypes of some type
get all types/methods/fields annotated with some annotation, with or without matching the annotation parameters
get all resources matching a regular expression
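A rough sketch of those queries (the package prefix, SomeType and MyAnnotation are placeholders for your own names):

import java.util.Set;
import java.util.regex.Pattern;
import org.reflections.Reflections;
import org.reflections.scanners.ResourcesScanner;
import org.reflections.scanners.SubTypesScanner;
import org.reflections.scanners.TypeAnnotationsScanner;

// Scan only the given package prefix, with the scanners needed for the queries below.
Reflections reflections = new Reflections("com.example.plugins",
        new SubTypesScanner(), new TypeAnnotationsScanner(), new ResourcesScanner());

Set<Class<? extends SomeType>> subTypes = reflections.getSubTypesOf(SomeType.class);
Set<Class<?>> annotated = reflections.getTypesAnnotatedWith(MyAnnotation.class);
Set<String> propertyFiles = reflections.getResources(Pattern.compile(".*\\.properties"));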
I have been trying to design a software architecture with Guice that allows for separate implementations by different teams in different regions.
The idea is like:
API-A ->
API-B -> API-Global
API-C ->
API Global has a bunch of classes (related to GraphQL) that should be non-interface shells of what needs to be implemented.
Because the region-specific APIs depend on the Global package, I can't have region-specific code there. Is there a way to create Guice bindings in an overall graph that the Global API can find through introspection for consumption?
I looked at Guice Multibindings to do this, but I am not sure that's the best way to do it. I know this is possible using Spring, but I would really like to use Guice for this.
Thanks.
The way OSGi and other frameworks handle this is to let each implementation come with a standardized file which contains meta-information (meaning the implementation class), and which Global-API can then find on the classpath to configure itself.
(Consider using a standardized framework instead of reinventing the wheel, but this is the general process which I have seen in these frameworks).
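As a rough sketch of that pattern with plain Java plus Guice (the class names here are invented): each regional jar ships a META-INF/services/com.google.inject.Module file listing its own Module, and API-Global assembles the injector from whatever it finds on the classpath.

import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Module;
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// In API-Global: discover whichever region modules are present on the classpath.
// Each region jar registers its Module implementation (which must have a
// no-argument constructor) in META-INF/services/com.google.inject.Module.
public final class GlobalBootstrap {
    public static Injector createInjector() {
        List<Module> modules = new ArrayList<>();
        for (Module regionModule : ServiceLoader.load(Module.class)) {
            modules.add(regionModule);
        }
        return Guice.createInjector(modules);
    }
}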
I'm looking for different ways to prevent internals from leaking into an API. This is a huge problem, because once internals leak into the API you can run into unexpected incompatibility issues or into frozen internals.
One of the simplest ways to do so is to use separate Maven modules: one module with the API and one module with the implementation. This way it is impossible to expose the implementation from the API.
Unfortunately not everyone agrees this is the best approach. Are there other alternatives, e.g. using Checkstyle or other 'architecture checking' tools?
PS: Java 9 is not usable for us, since we are about to upgrade to Java 8 and that will be the lowest supported version for quite some time to come.
Following your checkstyle idea, it should be possible to set up rules which examine import statements in source files.
Checkstyle has built-in support for that, specifically the IllegalImport and ImportControl rules.
This of course works best if public and internal classes can be easily separated by package names.
The idea for IllegalImport would be that you configure a TreeWalker in Checkstyle which only looks at your API sources and which rejects imports from internal packages.
With the ImportControl rule on the other hand you can define very detailed access rules for the whole application/module in a separate XML file.
It is standard in Java to define an API using interfaces and implement them using classes. That way you can change the "internals" however you want and nothing changes for the user(s) of the API.
One alternative is to have one module (jar file) for both API and implementation (but then again, is it an API or just any kind of library?). Inside it, one separates classes and interfaces by using packages, e.g. com.acme.stuff.api and com.acme.stuff.impl. It is important to give the classes inside the latter package package-private (default) visibility.
Not only does the package name show the consuming developer "hey, this is the implementation", it is also not possible to use anything inside it (let's omit reflection at this point for the sake of simplicity).
But again: This is against the idea of an API, because usually the implementation can be changed. With this approach one cannot separate API from implementation, because both are inside the same module.
If it is only about hiding internals of a library, then this is one (not the one) feasible approach.
And just in case you meant a library instead of an API, which only exposes its "frontend" (by using interfaces or abstract classes and such), use different package names, e.g. com.acme.stuff and com.acme.stuff.internal. The same visibility rules apply of course.
Also: This way one does not need Checkstyle and other burdens.
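A compact sketch of that package layout (the class names are invented): the interface lives in com.acme.stuff.api, the implementation is package-private in com.acme.stuff.impl, and a small public factory is the only way into the impl package.

// com/acme/stuff/api/Widget.java -- the published contract
package com.acme.stuff.api;

public interface Widget {
    void render();
}

// com/acme/stuff/impl/DefaultWidget.java -- package-private, invisible outside .impl
package com.acme.stuff.impl;

import com.acme.stuff.api.Widget;

class DefaultWidget implements Widget {
    @Override
    public void render() {
        System.out.println("rendering the default widget");
    }
}

// com/acme/stuff/impl/Widgets.java -- the single public entry point into the impl package
package com.acme.stuff.impl;

import com.acme.stuff.api.Widget;

public final class Widgets {
    private Widgets() {}

    public static Widget create() {
        return new DefaultWidget();
    }
}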
Here is a good start: http://wiki.netbeans.org/API_Design
Key point: "Do not expose more than you want." Obviously, the less of the implementation is expressed in the API, the more flexibility one has in the future. There are some tricks one can use to hide the implementation but still deliver the desired functionality.
I think you don't need any checkstyle or anything like that, just a good old solid design and architecture should be enough. Polymorphism is all you need here.
"One of the simplest ways to do so is just make use of different Maven modules; one module with API and one module with implementation. This way it is impossible to expose the implementation from the API."
Yes, I totally agree: hide as much as possible and separate your interface into a standalone project.
This is more a general question by example:
I'm using xstream and woodstox, woodstox comes with a service provider for javax.xml.stream.XMLOutputFactory in woodstox jar registering com.ctc.wstx.stax.WstxOutputFactory.
I want to provide my own javax.xml.stream.XMLOutputFactory and still have the Woodstox jar on the classpath. I know I can provide my own with the system property javax.xml.stream.XMLOutputFactory, but I'm trying to take the hassle off our dev ops team and do it with a service file in my jar, or maybe in my war's META-INF/services folder. Looking at the code of javax.xml.stream.FactoryFinder, how can I make sure that my
META-INF/services/javax.xml.stream.XMLOutputFactory file will be the one used by FactoryFinder?
We use XStream with Camel and could not find a way to inject the factory into XStreamDataFormat.
First: instead of relying on the JDK SPI mechanism, I strongly recommend simplifying your life and NOT using it. It really adds no value over injecting XMLInputFactory and/or XMLOutputFactory yourself. For injection you can use Guice (or Spring), or just pass the factory manually. Since these factories have no dependencies of their own, this is easy.
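For example, a minimal sketch of the "just pass it yourself" approach (MyOutputFactory is a placeholder for your own implementation):

import javax.xml.stream.XMLOutputFactory;

// Constructing your own factory directly sidesteps FactoryFinder entirely, so it
// does not matter which provider the Woodstox jar registers on the classpath.
XMLOutputFactory outputFactory = new MyOutputFactory();

// Hand the instance to whatever needs it (constructor injection by hand), or bind
// it in Guice/Spring if you use a DI container, e.g.:
// bind(XMLOutputFactory.class).toInstance(outputFactory);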
But if you choose to (or have to) use XMLInputFactory.newInstance(), you can define a system property for "javax.xml.stream.XMLOutputFactory" and "javax.xml.stream.XMLInputFactory".
So why not use JDK approach? Multiple reasons:
It adds overhead: if you do not specify the system property, it has to scan the whole classpath, and with big app servers this takes 10x-100x as long as most parsing.
Precedence of implementations is undefined: if you have multiple in the classpath, which one will you get? Who knows... (and note: it might even change when you add new jars to the classpath).
You are very likely to get multiple implementations via transitive dependencies.
Unfortunately, Oracle still seems to insist on adding this known-faulty method for registering service providers. Why? Probably because they do not have a DI lib/framework of their own (Guice is by google, Spring by Springsource), and they tend to be pretty control hungry.
You can just do it like this to specify the XMLOutputFactory implementation you want to use:
System.setProperty("javax.xml.stream.XMLOutputFactory", ... full classname You want to use ...);
Source:
http://docs.oracle.com/cd/E17802_01/webservices/webservices/docs/1.6/tutorial/doc/SJSXP4.html
Deriving from JAXP, the XMLInputFactory.newInstance() method determines the specific XMLInputFactory implementation class to load by using the following lookup procedure:
1. Use the javax.xml.stream.XMLInputFactory system property.
2. Use the lib/xml.stream.properties file in the JRE directory.
3. Use the Services API, if available, to determine the classname by looking in the META-INF/services/javax.xml.stream.XMLInputFactory files in jars available to the JRE.
4. Use the platform default XMLInputFactory instance.
I discovered that if I put the service file under WEB-INF/classes/META-INF/services/javax.xml.stream.XMLOutputFactory, then it comes first on the classpath, before the jars in WEB-INF/lib. And that's my solution.
We had a similar issue where parsing would work locally but fail on the server. After debugging, we found the server was using the reader com.ctc.wstx.evt.WstxEventReader, whereas locally the reader was com.sun.xml.internal.stream.XMLEventReaderImpl.
We set the following property to resolve it.
System.setProperty("javax.xml.stream.XMLInputFactory", "com.sun.xml.internal.stream.XMLInputFactoryImpl");
If your implementation is in a jar, make sure it comes before woodstox.jar on the classpath; then FactoryFinder will use your implementation.
I am using Google Reflections in a Java library I am developing.
The reason I use Reflections is because I want to find all the classes with a particular annotation.
Simplifying things, in my library I have a method returning those classes, which is invoked with a line like:
//this method uses the Reflections library
Repository.getDefault().getMyAnnotatedClasses()
In the current version of my library, I require the user to explicitly add the package name where the Reflections library need to look for the annotated classes:
Repository.getDefault().addSearchPath("...");
In this way Reflections will look for classes located only in that package.
If the user of my library does not add this search path, I configure Reflections to search in all the classes in the system class loader. Obviously this solution is quite inefficient. However, I really want to get rid of the requirement of asking the user to ALWAYS set the search path.
- A side note in case it is important: in Reflections you can configure a search path with a package name, a URL (the classpath location where the classes can be found), or a class loader.
So my question is: is there a way to find, from my library code, the classpath of the class invoking my library?
In this way, I could detect that if the user has not explicitly set a search location, I will add that location by default, which seems to be a better solution than adding the entire system class loader as the alternative search path.
I know I could manually inspect the method call stack from my library code, but this seems a bit of a hacky and dirty solution, so I am looking for alternative ideas.
Thanks in advance.
You can find the caller of a method by analyzing the stack trace. Although not a good idea in general IMHO, as it breaks encapsulation and is sensitive to managed environments (proxies and so on), it will work. Look more in this question.
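A rough sketch of the stack-trace approach (error handling simplified): resolve the caller's class and derive its code-source URL, which can then be fed to Reflections as a search location.

import java.net.URL;

// Sketch only: the stack index assumes the user's code calls this method directly;
// wrappers, proxies or lambdas shift the index, and getCodeSource() may be null
// for JDK classes, so real code needs more defensive checks.
public final class CallerLocator {
    public static URL callerCodeSource() throws ClassNotFoundException {
        StackTraceElement[] stack = Thread.currentThread().getStackTrace();
        // stack[0] = Thread.getStackTrace, stack[1] = callerCodeSource, stack[2] = the caller
        Class<?> caller = Class.forName(stack[2].getClassName());
        return caller.getProtectionDomain().getCodeSource().getLocation();
    }
}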
What else you can do, to avoid the inefficiency of scanning the whole classpath every time, is to scan and save once at compile time with Reflections, and then collect it at bootstrap time.
To scan and save once as XML:
new Reflections(...).save([somepath]);
And then collect without scanning:
Reflections reflections = Reflections.collect([somepath]);
If you're using Maven, you can automate it and do it as part of every build using the Reflections-Maven plugin.
Look for more info under "collect pre-scanned metadata" in the Reflections UseCases wiki page.
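Putting the two halves together (the package prefix and file path are only examples; the Maven plugin can perform the save step for you):

import org.reflections.Reflections;

// Build time (or via the Reflections-Maven plugin): scan once and save the result
// as XML somewhere that ends up on the runtime classpath.
new Reflections("com.example.myapp")
        .save("target/classes/META-INF/reflections/myapp-reflections.xml");

// Runtime: no scanning, just merge the pre-scanned metadata found under
// META-INF/reflections on the classpath.
Reflections reflections = Reflections.collect();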
I'd like to implement a dynamic plugin feature in a Java application. Ideally:
The application would define an interface Plugin with a method like getCapabilities().
A plugin would be a JAR pluginX.jar containing a class PluginXImpl implementing Plugin (and maybe some others).
The user would put pluginX.jar in a special directory or set a configuration parameter pointing to it. The user should not necessarily have to include pluginX.jar in their classpath.
The application would find PluginXImpl (maybe via the JAR manifest, maybe by reflection) and add it to a registry.
The client could get an instance of PluginXImpl, e.g., by invoking a method like getPluginWithCapabilities("X"). The user should not necessarily have to know the name of the plugin.
I've got a sense I should be able to do this with peaberry, but I can't make any sense of the documentation. I've invested some time in learning Guice, so my preferred answer would not be "use Spring Dynamic Modules."
Can anybody give me a simple idea of how to go about doing this using Guice/peaberry, OSGi, or just plain Java?
This is actually quite easy using plain Java means:
Since you don't want the user to configure the classpath before starting the application, I would first create a URLClassLoader with an array of URLs to the files in your plugin directory. Use File.listFiles to find all plugin jars and then File.toURI().toURL() to get a URL to each file. You should pass the system classloader (ClassLoader.getSystemClassLoader()) as a parent to your URLClassLoader.
If the plugin jars contain a configuration file in META-INF/services as described in the API documentation for java.util.ServiceLoader, you can now use ServiceLoader.load(Plugin.class, myUrlClassLoader) to obtain a service loader for your Plugin interface and call iterator() on it to get instances of all configured Plugin implementations.
You still have to provide your own wrapper around this to filter plugin capabilites, but that shouldn't be too much trouble, I suppose.
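A condensed sketch of those steps (the plugin directory and error handling are illustrative; Plugin is the interface from the question):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Load every jar from a plugin directory and ask ServiceLoader for all Plugin
// implementations those jars register under META-INF/services/.
public final class PluginLoader {
    public static List<Plugin> loadPlugins(File pluginDir) throws Exception {
        File[] jars = pluginDir.listFiles((dir, name) -> name.endsWith(".jar"));
        List<URL> urls = new ArrayList<>();
        if (jars != null) {
            for (File jar : jars) {
                urls.add(jar.toURI().toURL());
            }
        }
        ClassLoader loader = new URLClassLoader(
                urls.toArray(new URL[0]), ClassLoader.getSystemClassLoader());
        List<Plugin> plugins = new ArrayList<>();
        for (Plugin plugin : ServiceLoader.load(Plugin.class, loader)) {
            plugins.add(plugin); // filter here by plugin.getCapabilities() as needed
        }
        return plugins;
    }
}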
OSGi would be fine if you want to replace plugins at runtime, e.g. for bugfixes in a 24/7 environment. I played with OSGi for a while, but it took too much time, because it wasn't a requirement, and you need a plan B if you remove a bundle.
My humble solution then was to provide a properties file with the class names of the plugin descriptor classes and let the server call them to register themselves (including querying their capabilities).
This is obviously suboptimal, but I can't wait to read the accepted answer.
Any chance you can leverage the Service Provider Interface?
The best way to implement plug-ins with Guice is with Multibindings. The linked page goes into detail on how to use multibindings to host plugins.
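A minimal sketch of a multibinding-based plugin registry (PluginXImpl is the example implementation from the question; the module name is invented):

import com.google.inject.AbstractModule;
import com.google.inject.multibindings.Multibinder;

// Each plugin contributes a module like this; the application then injects a
// Set<Plugin> and picks the entries whose getCapabilities() match.
public class PluginXModule extends AbstractModule {
    @Override
    protected void configure() {
        Multibinder<Plugin> plugins = Multibinder.newSetBinder(binder(), Plugin.class);
        plugins.addBinding().to(PluginXImpl.class);
    }
}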
Apologies if you already know this, but check out the forName method of Class. It is used, at least in JDBC, to dynamically load the DBMS-specific driver classes at runtime by class name.
Then I guess it would not be difficult to enumerate all class/jar files in a directory, load each of them, and define an interface for a static method getCapabilities() (or any name you choose) that returns their capabilities/description in whatever terms and format that makes sense for your system.
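For reference, that route looks roughly like this (the property name, class loader and Plugin cast are illustrative):

// Illustrative only: read a configured class name, load it through a chosen class
// loader (e.g. a URLClassLoader over the plugin directory), and instantiate it.
String className = System.getProperty("plugin.impl", "com.example.PluginXImpl");
Class<?> clazz = Class.forName(className, true, pluginClassLoader);
Plugin plugin = (Plugin) clazz.getDeclaredConstructor().newInstance();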