Implementing a (compile-time) plugin architecture without split packages - java

In a past question I asked how to design a system where:
A class contains one or more optional methods.
Optional methods are implemented by plugins that may or may not be present at compile-time.
If a user invokes a method whose associated plugin is not present at compile-time, they will get a compile-time error.
I provided one possible solution that works in Java 8.
Unfortunately, this solution depends on the use of split packages (two modules exporting the same package) which are disallowed by the Java 9 Module System.
How can this be implemented in Java 9?

Services
If I have understood the question correctly, what you're looking for are services from the module system.
Java has long supported services via the java.util.ServiceLoader class, which locates service providers at run time by searching the classpath.
The module system could identify uses of services by scanning the class files in module artifacts for invocations of the ServiceLoader::load method.
With your current project structure, you should define an abstract class or an interface that can be extended or implemented by the classes in the guava and core modules and is provided by them.
That a module uses a particular service is an important aspect of that module's definition, so for both efficiency and clarity it is expressed in the module's declaration with a uses clause:
module com.foo.bar.sql {
    uses com.foo.Verifiers;
}
That a module provides an implementation of a particular service is equally fundamental, so it is expressed in the module's declaration with a provides clause:
module guava {
    provides com.foo.Verifiers with com.guava.GuavaVerifier;
}
module core {
    provides com.foo.Verifiers with com.core.CoreVerifier;
}
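Given those declarations, the consuming module can look providers up at run time with ServiceLoader; the Verifier interface below is a hypothetical stand-in for the com.foo.Verifiers service named above:

```java
import java.util.ServiceLoader;

public class VerifierClient {
    // Hypothetical service interface corresponding to the `uses` clause above.
    public interface Verifier {
        boolean verify(String input);
    }

    public static void main(String[] args) {
        // Finds every implementation declared with a `provides ... with` clause
        // in a module on the module path; absent plugins are simply not listed.
        ServiceLoader<Verifier> loader = ServiceLoader.load(Verifier.class);
        for (Verifier v : loader) {
            System.out.println(v.getClass().getName());
        }
    }
}
```

A missing plugin module then means no provider is found at run time, while the compile-time guarantee the question asks about comes from the consuming code being compiled against the service interface.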

Related

Java module system: how to avoid leaking all packages of transitive modules

In the Java module system, we can have:
module hellomodule {
    exports com.name.hello;
    requires transitive greetings;
}
By doing this, the packages exposed by the greetings module will effectively become part of the API exposed by hellomodule.
We may want to avoid this to a certain degree; it would be nice, for example, to allow visibility only of certain classes, perhaps just the ones hellomodule uses in the signatures of the methods it contains.
Is there any way to do this, i.e. to allow only certain classes or packages to be leaked?

Clean architecture: separate IO and Core in different .jar files

In clean architecture, the structure looks like this:
CORE:
CoreClass.java
SomeDAOInterface.java
IO
SomeDAOInterfaceImpl.java (implement SomeDAOInterface)
If I was supposed to split Core and IO in different .jar files, different projects, how am I supposed to handle "SomeDAOInterface" dependency in IO part? It is only contained in Core part, so I cannot really implement it without compiler error (no class SomeDAOInterface found).
What you describe is far from an unusual design, and there are plenty of examples around. For example, Java EE declares a number of interfaces which are implemented by various containers, and JDBC declares interfaces which are implemented by database engines.
There are two possible designs, depending on whether the binding occurs at build time or at run time.
When binding occurs at build time (common for JDBC, for example), you must have an implementation available at build time; for example, you declare a MySQL database driver in your project. In your example, it means that the IO project will depend on the Core one.
When binding occurs at run time (Java EE, for example), you compile against a dummy project that contains only the interface classes (SomeDAOInterface in your example) and not the implementations, and you declare to the build tool that it should not be linked into the final jar but will be provided at run time. At run time you then provide on the classpath a full implementation, containing both the interface classes (SomeDAOInterface) and the implementation ones (SomeDAOInterfaceImpl in your example). Read your build system's documentation to learn how to declare that.
Alternatively, you can link the dummy project into the core jar and declare that it will be provided by the implementation one.
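A minimal sketch of the build-time variant, with the IO jar depending on the Core jar; only the class names come from the question, the method and its behavior are made up for illustration:

```java
// Core jar: the interface lives here.
interface SomeDAOInterface {
    String findById(long id);
}

// IO jar: compiled against the Core jar, so the interface resolves at compile time.
class SomeDAOInterfaceImpl implements SomeDAOInterface {
    @Override
    public String findById(long id) {
        // Stand-in for real IO (database, file system, ...).
        return "record-" + id;
    }
}

public class Demo {
    public static void main(String[] args) {
        // Application code depends only on the interface from the Core jar.
        SomeDAOInterface dao = new SomeDAOInterfaceImpl();
        System.out.println(dao.findById(42)); // prints "record-42"
    }
}
```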
If you are talking about Clean Architecture from Uncle Bob, then I wonder what the CORE project is.
In case you refer to the "entities circle", then having the interface defined there is fine IF it is really part of your core business rules.
You would then create a dependency from your IO project (which is in the frameworks or interface adapters layer) to the CORE project, which is correct according to the dependency rule.
For a more detailed discussion of project structures in Clean Architecture, please refer to my post: https://plainionist.github.io/Implementing-Clean-Architecture-Scream/

Using different versions of dependencies in separated Java platform modules

I expected it to be possible to use, for example, Guava 19 in myModuleA and Guava 20 in myModuleB, since Jigsaw modules have their own classpath.
Let's say myModuleA uses Iterators.emptyIterator(), which was removed in Guava 20, and myModuleB uses the new static method FluentIterable.of(), which wasn't available in Guava 19. Unfortunately, my test is negative: at compile time everything looks fine, but at run time the result is a NoSuchMethodError. That means whichever class comes first for the class loader decides which module fails.
Why this encapsulation while the underlying coupling remains? I found one reason myself: it couldn't be supported, because transitive dependencies would have the same problem as before. If a Guava class with version conflicts occurred in a signature in ModuleA, and ModuleB depends on it, which class should be used?
But then why do we read all over the internet that "Jigsaw - the module system - stops the classpath hell"? We now have multiple smaller "similar-to-classpaths" with the same problems. It's more an uncertainty than a question.
Version Conflicts
First a correction: you say that modules have their own class path, which is not correct. The application's class path remains as it is. Parallel to it, the module path was introduced, but it essentially works the same way. In particular, all application classes are loaded by the same class loader (by default, at least).
That there is only a single class loader for all application classes also explains why there can't be two versions of the same class: the entire class-loading infrastructure is built on the assumption that a fully qualified class name suffices to identify a class within a class loader.
This also points the way to a solution for multiple versions: as before, you can achieve that by using different class loaders. The module system's native way to do that is to create additional layers (each layer has its own loader).
Module Hell?
So does the module system replace class path hell with module hell? Well, multiple versions of the same library are still not possible without creating new class loaders, so this fundamental problem remains.
On the other hand, you now at least get an error at compile time or at launch time due to split packages. This prevents the program from subtly misbehaving at run time, which is already an improvement.
Theoretically it is possible to use different versions of the same library within your application. The concept that enables this: layering!
When you study Jigsaw under the hood you find a whole section dedicated to this topic.
The idea is basically that you can further group modules using these layers. Layers are constructed at runtime, and each layer has its own class loader. Meaning: it should be absolutely possible to use modules in different versions within one application - they just need to go into different layers. And as shown, this kind of "multiple version support" is actively discussed by the people working on Java/Jigsaw. It is not an obscure feature - it is meant to support different module versions under one hood.
The only disclaimer at this point: unfortunately there are no "complete" source code examples out there (of which I know), thus I can only link to that Oracle presentation.
In other words: some sort of solution to this versioning problem is on the horizon, but it will take more time to gather real-world experience with this new idea. And to be precise: you can have different layers that are isolated by different class loaders. There is no support that would allow "the same object" to use modV1 and modV2 at the same time. You can only have two objects, one using modV1 and the other modV2.
(German readers might want to have a look here - that publication contains another introduction to the topic of layers.)
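The layer mechanism described above can be driven through the java.lang.ModuleLayer API; in this sketch the directory paths and module names are hypothetical placeholders:

```java
import java.lang.module.Configuration;
import java.lang.module.ModuleFinder;
import java.nio.file.Path;
import java.util.List;

public class LayerDemo {
    // Loads the named module (and its dependencies) from the given directory
    // into a fresh layer with its own class loader.
    static ModuleLayer createLayer(Path modulePath, String moduleName) {
        ModuleFinder finder = ModuleFinder.of(modulePath);
        ModuleLayer parent = ModuleLayer.boot();
        Configuration cf = parent.configuration()
                .resolve(finder, ModuleFinder.of(), List.of(moduleName));
        return parent.defineModulesWithOneLoader(cf, ClassLoader.getSystemClassLoader());
    }

    public static void main(String[] args) {
        // Each call could point at a directory holding a different Guava version;
        // the two layers get separate class loaders, so the versions don't clash:
        // ModuleLayer layerA = createLayer(Path.of("mods-guava19"), "myModuleA");
        // ModuleLayer layerB = createLayer(Path.of("mods-guava20"), "myModuleB");
        System.out.println(ModuleLayer.boot().findModule("java.base").isPresent()); // prints "true"
    }
}
```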
Java 9 doesn't solve such problems. In a nutshell, what Java 9 does is extend the classic access modifiers (public, protected, package-private, private) to the jar level.
Prior to Java 9, if a module A depended on module B, then all public classes from B were visible to A.
With Java 9, visibility can be configured, so it can be limited to a subset of classes: each module declares which packages it exports and which modules it requires.
Most of those checks are done by the compiler.
From a run-time perspective (class-loader architecture) there is no big change: all application modules are loaded by the same class loader, so it's not possible to have the same class in different versions in the same JVM unless you use a modular framework like OSGi or manipulate class loaders yourself.
As others have hinted, JPMS layers can help with that. You can use them manually, but Layrry might be helpful to you: a fluent API and configuration-based launcher for running layered applications. It allows you to define the layer structure by means of configuration, and it will fire up the layer graph for you. It also supports the dynamic addition and removal of layers at runtime.
Disclaimer: I'm the initial creator of Layrry

Factory which is not dependant on implementation

I have an API which has some base implementations, and a factory which gives instances of that API to clients.
I want to change my factory to make it more generic, so that if a new implementation of the API is built and its jar file is put on the classpath, the factory will recognize it without any further changes.
Use the Java SPI, the Service Provider Interface.
API jar - provides one single interface.
Provider jar - provides the implementations; you can even put several implementations in one jar. A text file META-INF/services/my.package.MyInterface lists the implementing class(es).
Application - in the application, the implementing jar should not be needed for compilation:
in Maven, scope runtime.
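In Maven, that runtime-only dependency might be declared like this (the group and artifact IDs are placeholders):

```xml
<dependency>
    <groupId>com.example</groupId>
    <artifactId>xxx-first-impl</artifactId>
    <version>1.0</version>
    <scope>runtime</scope>
</dependency>
```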
The service discovery happens with a ServiceLoader<T>:
public static void main(String[] args) {
    ServiceLoader<MyInterface> loader = ServiceLoader.load(MyInterface.class);
    for (MyInterface api : loader) {
        // call methods on api
    }
    // Or take the first implementation:
    MyInterface api = loader.iterator().next();
}
You could provide a class in the API jar with a static function for that discovery mechanism.
Advantages:
Separation
Several implementations possible
Selection of implementation can be done dynamically
Example of jars
xxx-api.jar
  my/package/MyInterface.class
xxx-first-impl.jar
  META-INF/services/my.package.MyInterface (a text file containing the line: my.package.impl.MyImpl1)
  my/package/impl/MyImpl1.class (public class MyImpl1 implements MyInterface { ... })
myapp1.jar
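The static discovery function mentioned above could be sketched like this; MyInterface and its newInstance method are illustrative, not part of any existing library:

```java
import java.util.ServiceLoader;

// Lives in the API jar, so clients never touch ServiceLoader directly.
public interface MyInterface {
    String name();

    static MyInterface newInstance() {
        // Picks the first implementation registered via META-INF/services.
        return ServiceLoader.load(MyInterface.class)
                .findFirst()
                .orElseThrow(() -> new IllegalStateException(
                        "No MyInterface implementation found on the classpath"));
    }
}
```

Clients then simply call MyInterface.newInstance() and never need a compile-time dependency on any provider jar.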
If you'd like to start with theory, read about the Dependency inversion principle.
In object-oriented programming, the dependency inversion principle refers to a specific form of decoupling software modules. When following this principle, the conventional dependency relationships established from high-level, policy-setting modules to low-level, dependency modules are inverted (i.e. reversed), thus rendering high-level modules independent of the low-level module implementation details.
A. High-level modules should not depend on low-level modules. Both should depend on abstractions.
B. Abstractions should not depend on details. Details should depend on abstractions.
The principle inverts the way some people may think about object-oriented design, dictating that both high- and low-level objects must depend on the same abstraction.
Dependency Injection Library
As for specific implementations, you have many options in Java, specifically for the Dependency Injection approach. Spring Framework obviously comes to mind, but you can also look at Java EE Contexts and Dependency Injection (CDI).
Interface Injection
You can also...
Load the Jars manually : How should I load Jars dynamically at runtime?
Use Interface Injection: Spring interface injection example (the title says Spring, but the answers show that no Spring is needed for Interface Injection)

Best Practice For Referencing an External Module In a Java Project

I have a Java project that expects external modules to be registered with it. These modules:
Implement a particular interface in the main project
Are packaged into a uni-jar (along with any dependencies)
Contain some human-readable meta-information (like the module name).
My main project needs to be able to load at runtime (e.g. using its own classloader) any of these external modules. My question is: what's the best way of registering these modules with the main project (I'd prefer to keep this vanilla Java, and not use any third-party frameworks/libraries for this isolated issue)?
My current solution is to keep a single .properties file in the main project with key=name and value=class|delimiter|human-readable-name (or to coordinate two .properties files in order to avoid the delimiter parsing). At runtime, the main project loads the .properties file and uses any entries it finds to drive the classloader.
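That registry could be sketched as follows; the value format mirrors the description above, while the class and method names are made up for illustration:

```java
import java.util.Properties;

public class ModuleRegistry {
    // Resolves a registered module: value format is "class-name|human-readable-name".
    public static Object loadModule(Properties registry, String name) throws Exception {
        String[] parts = registry.getProperty(name).split("\\|");
        String className = parts[0].trim();
        // parts[1] would be the human-readable module name.
        return Class.forName(className).getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Properties registry = new Properties();
        registry.setProperty("sample", "java.util.ArrayList|Sample Module");
        Object module = loadModule(registry, "sample");
        System.out.println(module.getClass().getName()); // prints "java.util.ArrayList"
    }
}
```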
This feels hokey to me. Is there a better way to this?
The standard approach in Java is to define a Service Provider.
Let all modules express their metadata via a standard XML file; call it "my-module-data.xml".
On startup, your main container looks for classpath*:my-module-data.xml (which can name a FrontController class) and delegates to each module's FrontController class to do whatever it wants :)
Also google for Spring-OSGi; their documentation can be helpful here.
Expanding on ZZ Coder's answer...
The Service Provider pattern mentioned there, and used internally within the JDK, is a little more formalized in JDK 6 with ServiceLoader. The concept is further expanded upon by the NetBeans Lookup API.
The underlying infrastructure is identical. That is, both APIs use the same artifacts in the same way. The NetBeans version is just a more flexible and robust API (allowing alternative lookup services, for example, as well as the default one).
Of course, it would be remiss to not mention the dominant, more "heavyweight" standards of EJB, Spring, and OSGi.

Categories

Resources