Java Beans Introspector requires desktop module

I'm investigating using Jigsaw to reduce the footprint of a microservice. One of the last dependencies I had to find was java.beans.Introspector.
Imagine my surprise when I discovered I needed to bring in the whole module java.desktop which contains all sorts of irrelevant stuff like awt, applets, swing etc.
This seems crazy to me, surely bean introspection should be a part of the fundamental language and not related to UI functionality. I think the dependency comes from the embedded Tomcat from Spring Boot so it's not something I can modify myself.
The Question: Are modules the finest granularity you can access, or is there another way to trim the fat, as it were?

The dependency exists because BeanInfo and SimpleBeanInfo have references to Icon and Image from the AWT package. Further, PropertyEditor declares the methods getCustomEditor and paintValue, creating dependencies on the classes Component, Graphics, and Rectangle. There are also some classes not visible in the API that have references to desktop classes, i.e. the default property editor and persistence delegate implementations shipped for these desktop classes.
Since Java modules do not allow packages spread across multiple modules, there is no way to split the functionality into an AWT dependent and a non-dependent module (in a backward compatible manner). The dynamically loaded artifacts, i.e. actual bean infos, editors and persistence delegates, could have been moved into another module, but not the BeanInfo and PropertyEditor interfaces and their SimpleBeanInfo and PropertyEditorSupport implementations.
There is no finer granularity and no way to use the bean classes without creating that dependency. This is best illustrated by how the JDK developers dealt with the problems caused by this decision. Since java.util.logging.LogManager and java.util.jar.Pack200.Packer/Unpacker had support for java.beans.PropertyChangeListener, which would have caused a dependency on java.desktop if kept, these methods became some of the first methods ever removed from the standard Java API: deprecated in Java 8 and already removed again in Java 9.
I think that if there had been a way to depend on fundamental bean classes like PropertyChangeListener without creating the unwanted dependency on java.desktop, the JDK developers would not have set that precedent.
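To make the consequence concrete, here is a minimal sketch (the module name is hypothetical) of what any module that uses java.beans.Introspector currently has to declare:

// module-info.java: the java.beans package lives in the java.desktop module,
// so even a headless service that only introspects beans must require it.
module com.example.microservice {
    requires java.desktop;
}

jdeps reports the same thing: every use of java.beans.* resolves to java.desktop, and there is no smaller module that provides it.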

Related

Best practices for runtime-only dependencies in OSGi

In line with the Open-Closed Principle, I typically design my Java packages and libraries in such a way that there is a generic "interface" or "API" package/library and one or more implementations (quite similar to many common APIs like JDBC or JAXP/SAX).
To locate an implementation (or sometimes multiple implementations) in the base API library without violating OCP, I commonly use Java's ServiceLoader mechanism, or occasionally classpath scanning via third-party libraries like ClassGraph or Reflections. From a Maven perspective, the implementations are brought in as runtime dependencies (as they're only needed at execution time, but not at compile time). Pretty standard stuff.
So, now, I want to make some of these packages available as OSGi bundles (with API and implementation in separate bundles), but since in OSGi each bundle has its own class loader, neither classpath scanning nor the ServiceLoader API will work for this purpose. At first glance, OSGi's "fragment" mechanism seems to be the closest equivalent to the plain-Java setup described above. In that scenario, the API bundle would be the "fragment host", and concrete implementations would attach as fragments to that host bundle. As the fragment host and all its attached fragments use the same class loader, the standard plain-Java mechanisms like ServiceLoader or ClassGraph would conceivably still work. This would also have the advantage that there would be no need to detect whether a library/bundle is running in an OSGi context or not, and no OSGi framework dependencies are needed.
So, in a nutshell, my question is: are fragments the correct way to implement runtime-only dependencies in OSGi or is there a better (or more standard) way? Preferably, I'm looking for a solution that works in an OSGi container but does not require a dependency on OSGi itself.
No. Fragments are almost always wrong outside of translations (localization fragments). The OSGi model is to use services.
The way to go, then, is to use Declarative Services (DS). Using bnd (in Maven, Gradle, Ant, sbt, or Bndtools) you can create components. A component is a Plain Old Java Object (POJO) that is annotated with injection and activation instructions. You can make those components take all their dependencies in the constructor.
bnd uses the annotations to generate an XML file that is used at runtime to create, activate, inject, and register those components. This works out of the box in an OSGi framework. The annotations are build-time only, so they do not create dependencies in your runtime.
In your non-OSGi environment, you'd be responsible for calling that constructor yourself. So you gather your dependencies using the ServiceLoader and then construct the components in the right order.
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component
public class MyComponent implements Foo {

    final Bar bar;

    @Activate
    public MyComponent(@Reference Bar bar) {
        this.bar = bar;
    }
    // ...
}
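Outside OSGi, the wiring described above could look roughly like this minimal sketch, assuming a Bar implementation is registered as a ServiceLoader provider under META-INF/services (class names are illustrative; ServiceLoader.findFirst requires Java 9+):

import java.util.ServiceLoader;

public class PlainJavaBootstrap {
    public static void main(String[] args) {
        // Gather the dependency via ServiceLoader instead of the OSGi service registry.
        Bar bar = ServiceLoader.load(Bar.class)
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("no Bar implementation found"));

        // Call the injection constructor yourself, in dependency order.
        Foo foo = new MyComponent(bar);
    }
}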

Using different versions of dependencies in separated Java platform modules

I expected it to be possible to use, e.g., Guava 19 in myModuleA and Guava 20 in myModuleB, since Jigsaw modules have their own class path.
Let's say myModuleA uses Iterators.emptyIterator(), which was removed in Guava 20, and myModuleB uses the new static method FluentIterable.of(), which wasn't available in Guava 19. Unfortunately, my test is negative: at compile time everything looks fine, but at runtime the result is a NoSuchMethodError. That means whichever class the class loader finds first decides which module fails.
I found a partial explanation for myself: it couldn't be supported, because transitive dependencies would have the same problem as before. If a Guava class with version conflicts appears in a signature in myModuleA, and myModuleB depends on it, which class should be used?
But why, then, can we read all over the internet that "Jigsaw, the module system, ends classpath hell"? We now have multiple smaller "similar-to-classpath" structures with the same problems. This is more of an uncertainty than a question.
Version Conflicts
First a correction: You say that modules have their own class path, which is not correct. The application's class path remains as it is. Parallel to it the module path was introduced but it essentially works in the same way. Particularly, all application classes are loaded by the same class loader (by default at least).
That there is only a single class loader for all application classes also explains why there can't be two versions of the same class: The entire class loading infrastructure is built on the assumption that a fully qualified class name suffices to identify a class with a class loader.
This also opens the path to the solution for multiple versions. Like before you can achieve that by using different class loaders. The module system native way to do that would be to create additional layers (each layer has its own loader).
Module Hell?
So does the module system replace class path hell with module hell? Well, multiple versions of the same library are still not possible without creating new class loaders, so this fundamental problem remains.
On the other hand, now you at least get an error at compile or launch due to split packages. This prevents the program from subtly misbehaving, which is not that bad, either.
Theoretically it is possible to use different versions of the same library within your application. The concept that enables this: layering!
When you study Jigsaw under the hood you find a whole section dedicated to this topic.
The idea is basically that you can further group modules using these layers. Layers are constructed at runtime, and each layer has its own class loader. Meaning: it should be absolutely possible to use modules in different versions within one application; they just need to go into different layers. And as shown, this kind of "multiple version support" is actively discussed by the people working on Java/Jigsaw. It is not an obscure feature; it is meant to support different module versions under one hood.
The only disclaimer at this point: unfortunately there are no "complete" source code examples out there (of which I know), thus I can only link to that Oracle presentation.
In other words: there is some sort of solution to this versioning problem on the horizon, but it will take more time to gain real-world experience with this new idea. And to be precise: you can have different layers that are isolated by different class loaders. There is no support for having "the same object" use modV1 and modV2 at the same time. You can only have two objects, one using modV1 and the other modV2.
( German readers might want to have a look here; that publication contains another introduction to the topic of layers. )
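As a rough illustration of what the layer approach looks like in code, here is a minimal sketch (module names and paths are hypothetical; each directory is assumed to contain one Guava version plus the module that uses it):

import java.lang.module.Configuration;
import java.lang.module.ModuleFinder;
import java.nio.file.Path;
import java.util.List;

public class LayerDemo {

    public static void main(String[] args) {
        ModuleLayer boot = ModuleLayer.boot();

        // Two sibling layers on top of the boot layer, each with its own loader,
        // so the two Guava versions never meet in the same class loader.
        ModuleLayer layerA = createLayer(boot, Path.of("mods-guava19"), "myModuleA");
        ModuleLayer layerB = createLayer(boot, Path.of("mods-guava20"), "myModuleB");

        ClassLoader loaderA = layerA.findLoader("myModuleA");
        ClassLoader loaderB = layerB.findLoader("myModuleB");
        // loaderA and loaderB are distinct, so each module sees "its" Guava.
    }

    private static ModuleLayer createLayer(ModuleLayer parent, Path modulePath, String rootModule) {
        ModuleFinder finder = ModuleFinder.of(modulePath);
        Configuration cf = parent.configuration()
                .resolve(finder, ModuleFinder.of(), List.of(rootModule));
        return parent.defineModulesWithOneLoader(cf, ClassLoader.getSystemClassLoader());
    }
}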
Java 9 doesn't solve such problems. In a nutshell, what was done in Java 9 is to extend the classic access modifiers (public, protected, package-private, private) to the JAR level.
Prior to Java 9, if module A depended on module B, then all public classes from B were visible to A.
With Java 9, visibility can be configured and limited to a subset of classes: each module defines which packages it exports and which modules it requires.
Most of those checks are done by the compiler.
From a runtime perspective (class loader architecture), there is no big change: all application modules are loaded by the same class loader, so it's not possible to have the same class in different versions in the same JVM unless you use a modular framework like OSGi or manipulate class loaders yourself.
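For illustration, a minimal module descriptor using those compile-time controls might look like this (module and package names are made up):

// module-info.java for the hypothetical myModuleB
module myModuleB {
    requires com.google.common;     // which modules it needs
    exports com.example.b.api;      // only this package is visible to consumers
    // com.example.b.internal is not exported and stays hidden,
    // even though its classes may be public
}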
As others have hinted, JPMS layers can help with that. You can use them manually, but Layrry might be helpful to you: it is a fluent API and configuration-based launcher for running layered applications. It allows you to define the layer structure by means of configuration, and it will fire up the layer graph for you. It also supports dynamic addition and removal of layers at runtime.
Disclaimer: I'm the initial creator of Layrry

How to do Constructor Dependency Injection in Netbeans Platform

I've been asking myself for a while how to do CDI in the NetBeans Platform, especially with TopComponents or subclasses of them. Right now I'm using Lookups to get my dependencies and for inter-module communication, and it's working fine, but the dependencies of my components are not visible to the outside as they would be with CDI. So I'm searching for a way to populate the constructors of my TopComponents with the right arguments (loosely coupled through interface types). I'm currently using 3 modules:
API - contains the interfaces
Core - contains implementations of the interfaces; has API as a dependency
GUI - contains my GUI and logic code encapsulated in TopComponents; also has a dependency on API
As you can see, both modules (GUI and Core) rely on API because of the loose coupling in the modular system. I think it is nice to use Lookups to find the right implementations for the interfaces in API, but as I said, I also want the dependencies of my component classes to be visible to the outside.
So is there any way to do constructor dependency injection in a modular, loosely coupled architecture using the NetBeans Platform (version 8.0.2), and if so, how?
If not, what is the best way to provide a clear view of the dependencies of the component classes?
Any help will be appreciated.
I think I found a solution that solves part of the problem. Normally my TopComponents are accessed through the
Window -> TopComponentNameHere
action (which is generated by the annotations I use for the TopComponent).
The idea is that you can also instantiate the TopComponent inside an Action and show it from there. There you use a plain new statement to create the TopComponent you want to show. Here is the point: you can look up the component's dependencies beforehand via Lookup and pass them directly into the constructor (in my eyes not a full solution, because the action has to initialize the TopComponent, but still).
I think this is much nicer than grabbing the dependencies right in the component's initialization code, and it's enough for me to live with for the moment.
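A minimal sketch of that pattern might look like this (MyService and MyTopComponent are illustrative names; a real action would normally also carry the usual NetBeans registration annotations):

import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import org.openide.util.Lookup;

public final class OpenMyTopComponentAction implements ActionListener {

    @Override
    public void actionPerformed(ActionEvent e) {
        // Resolve the dependency from the global Lookup (implementation provided by the Core module).
        MyService service = Lookup.getDefault().lookup(MyService.class);

        // Pass it into the constructor, so the TopComponent's dependency is explicit.
        MyTopComponent tc = new MyTopComponent(service);
        tc.open();
        tc.requestActive();
    }
}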

Is it possible to share common JSR 330 code between Dagger and CDI?

I'm the tech lead on Agorava, a framework that helps with consuming social network data.
Today Agorava is built on CDI to ease its usage in the Java EE stack, but we want to provide an implementation with Dagger to have a lighter solution that works on Android.
My question is: can we share common JSR 330 compliant code between the CDI and Dagger implementations? In other words, is it possible with Dagger to have compiled code in a jar bearing JSR 330 annotations, and source code extending or using this code in a Dagger-specific jar (with @Provides, @Module, and other Dagger-specific items)?
If the answer is no, is there any issue with compiling my common JSR 330 jar with the Dagger compiler and using it in my CDI implementation? More precisely, will @Inject, qualifiers, and other JSR 330 specifics still be available at runtime, and will the classes bearing these annotations stay untouched by the Dagger compiler? Finally, is there any kind of marker on Dagger-generated code (class name, annotation) that would allow CDI to detect it and ignore it?
You can share client code between Dagger and any other JSR-330 implementation, so long as your code does not implement behaviours that are not compatible with Dagger. Dagger 1.0, for instance, does not support method injection. Dagger 2.0 uses component interfaces instead of injectors, so your code would have to not care about that.
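For example, a shared class in the common jar could stick to plain javax.inject annotations, roughly like this sketch (class names are made up); both CDI and Dagger can then instantiate and inject it:

import javax.inject.Inject;
import javax.inject.Singleton;

@Singleton
public class TwitterService {

    private final OAuthClient client;   // another shared JSR-330 bean (hypothetical)

    // Constructor injection works in both Dagger and CDI; method injection
    // would not work with Dagger 1.0, as noted above.
    @Inject
    public TwitterService(OAuthClient client) {
        this.client = client;
    }
}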
@Inject and other JSR-330 API elements will still appear at runtime. Dagger does not read them at runtime; instead it generates code at compile time that interprets these annotations. But these classes will still be valid, JSR-330 compliant injectable classes for any JSR-330 app.
What might be problematic is that Dagger will generate these extra classes, and you would have to post-process the jar, or reconfigure your build system, in order to strip out the generated code and move it to a supplementary jar. But that is a build-system configuration issue, and Dagger is agnostic to it, as long as the generated code is available at runtime in the Dagger-using application.
One build-time option is to run the compiler with -proc:none, and in a second configuration with -proc:only, and pipe the output of the latter into another output folder, and jar that up. This can be done in maven by making different executions of the maven-compiler-plugin.
Dagger-generated classes should all have @Generated on them (coming soon), but they also all descend from dagger.internal.ModuleAdapter, dagger.internal.Binding, or dagger.internal.StaticInjection. Subclasses of these are all safely ignorable by a non-Dagger framework. In fact, they could be ProGuarded away.

Java package politics

I always have doubts when creating packages: I want to take advantage of package-limited access, but at the same time I want to have similar classes divided into separate packages.
The problem comes when you understand that packages are not hierarchical in Java: "At first, packages appear to be hierarchical, but they are not." (source)
Imagine I have an API defined with its classes in foo.bar, where only the classes the API client needs are public. Then I have another package, foo.bar.pojos, with some internal objects the API needs. These classes have to be public so they can be accessed from foo.bar, but this means the API client can also access them if the package foo.bar.pojos is imported.
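To make the problem concrete, here is a sketch using the package names from above (the class itself is made up):

// File foo/bar/pojos/InternalPojo.java: must be public so that classes in foo.bar
// can use it, but that also makes it visible to any API client that imports
// foo.bar.pojos. Java offers no "visible to foo.bar only" modifier.
package foo.bar.pojos;

public class InternalPojo {
    public String rawData;   // internal detail, now part of the de facto public surface
}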
What is the common package policy that should be followed?
I've seen two ways of doing this.
The first one consists in separating the public API and internal classes into two different artifacts (jars). The documentation is separated as well, and it's thus easy for the end user to make the distinction between what is internal and what is not. But it sometimes makes things more complex to have two jars, two source trees, etc.
The second one consists in delivering a single jar, but have a good documentation allowing to know what's internal and what's not. The textual documentation can explain how to use the API (and thus avoids talking about the internals). And the javadoc can specify that a class is for internal use and is thus subject to changes.
Yes, Java packages don't give you enough control over your dependencies. The classic way to deal with this is to put external APIs in one package and internal implementation classes in another, and rely on people's good sense to avoid creating dependencies on the latter.
With Maven and OSGi, you have an additional mechanism for managing dependencies between modules / bundles of packages. In the case of OSGi, you can explicitly declare some packages as not exported, and an OSGi-aware development environment will prevent people from creating harmful dependencies. Maven's module support is weaker, but at least it controls dependency cycles.
Finally, you could use custom PMD rules to enforce your project's modularization conventions ... in the same way that there are rules to discourage dependencies on Java's "com.sun.*" package tree.
It is a mess.
Using only what Java itself offers, you have to put everything in the same package. You end up with a single package (or a few) with lots of classes and no good way to group them for yourself (but at least that problem does not leak outside). Most people don't do that, though, and as a result, your public classpath (as a developer building on top of these libraries) is littered with stuff you should never need to see.
You might like OSGi, which has (and enforces) the concept of bundle-private packages. Those are not exported to the outside world.
