How to do Constructor Dependency Injection in NetBeans Platform - java

I've been asking myself for a while how to do CDI in the NetBeans Platform, especially with TopComponents or subclasses of them. Right now I'm using Lookups to get my dependencies and for inter-module communication, and it works fine, but the dependencies of my components are not visible to the outside as they would be with CDI. So I'm searching for a way to populate the constructors of my TopComponents with the right arguments (loosely coupled through interface types). I'm currently using 3 modules:
API - contains the interfaces
Core - contains implementations of the interfaces; has API as a dependency
GUI - contains my GUI and logic code encapsulated in TopComponents; also has a dependency on API
As you can see, both modules (GUI and Core) rely on API because of the loose coupling in the modular system. I think it is nice to use Lookups to find the right implementations of the interfaces in API, but as I said, I also want the dependencies of my component classes to be visible to the outside.
So is there any way to do constructor dependency injection in a modular, loosely coupled architecture using the NetBeans Platform (version 8.0.2), and if yes, how?
If not, what is the best solution for providing a clear view of the dependencies of the component classes?
Any help will be appreciated.

I think I found a solution that solves part of the problem. Normally my TopComponents are opened through the
Window -> TopComponentNameHere
action (which is generated by the annotations I use on the TopComponent).
The idea is that you can also initialize a TopComponent inside your own actions and show it there. You use a simple new statement to create the TopComponent you want to show, and here is the point: you can load the component's dependencies beforehand via Lookups and pass them directly into the constructor (in my eyes not a full solution, because the action has to initialize the TopComponent, but still).
I think this is far nicer than grabbing the dependencies inside the component's initialization code, and it is enough for me to live with for the moment.
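For illustration, such an action might look like the following minimal sketch. MyService and MyTopComponent are hypothetical names, and MyTopComponent is assumed to be a TopComponent subclass with a constructor taking a MyService:

import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import org.openide.util.Lookup;

public final class ShowMyTopComponentAction implements ActionListener {
    @Override
    public void actionPerformed(ActionEvent e) {
        // Resolve the implementation registered by the Core module
        MyService service = Lookup.getDefault().lookup(MyService.class);
        // Constructor injection: the dependency is now visible in the component's API
        MyTopComponent tc = new MyTopComponent(service);
        tc.open();
        tc.requestActive();
    }
}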

Related

Can custom plugin for Intellij IDEA use DI in its code?

I am developing a plugin for Intellij IDEA 2018.2+ which will provide some additional inspections.
I have already learnt that there is a plugin.xml file which is the "heart" of the plugin and is responsible for its main behaviour.
As I understand it, to implement (for example) additional inspection behaviour we need to define an inspectionToolProvider in plugin.xml and implement the InspectionToolProvider interface. The same structure applies to other extensions - we define something in the .xml and implement some interface.
What bothers me is that if I want to implement a more-or-less complex algorithm, it looks like I need lots of static methods and utility classes, because I haven't found a way to use DI (e.g. the Spring one) during plugin development.
Some examples in Intellij IDEA SDK docs also show "helper" methods as static ones defined in utility classes.
So overall question: is there a way to use dependency injection during Intellij IDEA plugin development?
IntelliJ IDEA has its own dependency injection, managed by PicoContainer. It allows you to inject any component or service into the constructor of any component, service or extension created on the same or lower level (possible levels are application, project and module). To use it, you simply declare a constructor parameter of the corresponding type; you do not need to apply any extra annotations.
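For example, a project-level class might receive the Project through its constructor like this (a minimal sketch; MyInspectionHelper is a hypothetical name):

import com.intellij.openapi.project.Project;

public class MyInspectionHelper {
    private final Project project;

    // The platform's PicoContainer supplies the Project instance;
    // no annotations are required on the constructor parameter.
    public MyInspectionHelper(Project project) {
        this.project = project;
    }
}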
You can also start your own DI container (using Spring or any other framework) in your plugin, but then it will be your own responsibility to support the injection of core IntelliJ IDEA components.

Java Beans Introspector requires desktop module

I'm investigating using Jigsaw to reduce the footprint of a microservice. One of the last dependencies I had to find was java.beans.Introspector.
Imagine my surprise when I discovered I needed to bring in the whole java.desktop module, which contains all sorts of irrelevant stuff like AWT, applets, Swing, etc.
This seems crazy to me; surely bean introspection should be part of the fundamental language and not related to UI functionality. I think the dependency comes from the embedded Tomcat in Spring Boot, so it's not something I can modify myself.
The question: are modules the finest granularity you can access, or is there another way to trim the fat, as it were?
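For reference, this is what the situation looks like in a module descriptor (a minimal sketch; the module name is hypothetical). Since java.beans lives in java.desktop, requiring the Introspector means requiring the whole module:

// module-info.java
module com.example.microservice {
    requires java.desktop; // needed for java.beans.Introspector, but drags in AWT/Swing too
}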
The dependency exists because BeanInfo and SimpleBeanInfo have references to Icon and Image from the AWT package. Further, PropertyEditor declares the methods getCustomEditor and paintValue, creating dependencies on the classes Component, Graphics, and Rectangle. There are also some classes not visible in the API that have references to desktop classes, i.e. the default property editor and persistence delegate implementations shipped for these desktop classes.
Since Java modules do not allow packages spread across multiple modules, there is no way to split the functionality into an AWT dependent and a non-dependent module (in a backward compatible manner). The dynamically loaded artifacts, i.e. actual bean infos, editors and persistence delegates, could have been moved into another module, but not the BeanInfo and PropertyEditor interfaces and their SimpleBeanInfo and PropertyEditorSupport implementations.
There is no finer granularity and no way to use the bean classes without creating that dependency. This is best illustrated by how the JDK developers dealt with the problems caused by this decision: since java.util.logging.LogManager and java.util.jar.Pack200.Packer/Unpacker had support for java.beans.PropertyChangeListener, which would have caused a dependency on java.desktop if kept, these methods were the first methods ever removed from the standard Java API, deprecated in Java 8 and already gone in Java 9.
I think that if there had been a way to declare a dependency on the fundamental bean classes like PropertyChangeListener without creating the unwanted dependency on java.desktop, the JDK developers would not have set that precedent.

Proper Class Construction: Using Multiple Hard Dependencies

I'm trying to integrate the Single Responsibility Principle into my Java code by refactoring large classes (2000+ lines) into smaller, cohesive classes (~200 lines). However, I'm confused about how to properly reduce coupling between classes, since certain classes seem bound to create multiple "hard dependencies" via the new keyword.
I'm using dependency injection primarily via constructors, followed by setter methods, or methods which accept the dependency as a parameter and use it amongst other logic within the method body (not just a simple this.val = val; setter).
IntelliJ's automatic refactoring instantiates each newly extracted class and passes (injects) it a this reference to the LoadController. If I refactor a 2000-line class, this auto-instantiation + injection will of course occur each time I extract a new class. The following LoadController is a JavaFX controller class for the program's main stage, which acts as the starting point for various features:
public class LoadController {
    private final DBConnection dbConnection = new DBConnection(this);
    private final UpdateLabels updateLabels = new UpdateLabels(this);
    private final OpenCloseMenu openCloseMenu = new OpenCloseMenu(this);
    private final CreateVBox createVBox = new CreateVBox(this, dbConnection);
    private final ...
    private final ...
}
Is this wrong? My understanding is that large, separate functions should be in their own classes ... BUT some classes must have multiple hard dependencies like the above, in order to "guide" the flow of logic between the various other classes.
If you are doing dependency injection into JavaFX controllers, you might want to look into using something like Gluon Ignite to assist you.
Gluon Ignite allows developers to use popular dependency injection frameworks in their JavaFX applications, including inside their FXML controllers. Gluon Ignite creates a common abstraction over several popular dependency injection frameworks.
The injection framework you choose (e.g. Guice or Spring) will be responsible for creating the injectable components (e.g. you don't invoke new) and injecting the relevant references into your code (e.g. you don't need to write dbConnection = <some value>). The injection framework will have extensive documentation and blog articles on how it works and how it may best be used, so a full discussion of that is outside the scope of this answer.
An alternative to Gluon Ignite is afterburner.fx, which is similar but uses a small custom implementation for @Inject, so it is more lightweight (and a little less powerful) than the more established dependency injection frameworks (though very simple to use).
This is just one option; there are other ways you can handle this. But seeing as you state that you wish to perform dependency injection with JavaFX, it seems to make sense to use proven frameworks for this rather than trying to roll your own implementation.
some classes must have multiple hard dependencies like above, in order to "guide" the flow of logic between the use of various other classes.
Using something like Guice, you provide a module that defines bindings between interface types and implementations. These bindings tell Guice how to construct the dependencies, so you don't need to hard-code the dependencies in your classes. See the BillingModule in the Guice getting started guide for a module example. If you need multiple instances of injectable objects, you can use Providers in Guice. Spring has similar concepts, but different names.
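A minimal Guice sketch, assuming a hypothetical ConnectionProvider interface in place of the hard-coded DBConnection field from the question:

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Inject;
import com.google.inject.Injector;

// Hypothetical interface and implementation for illustration
interface ConnectionProvider {}
class DBConnection implements ConnectionProvider {}

class LoadController {
    private final ConnectionProvider connection;

    @Inject // Guice supplies the bound implementation; no new in this class
    LoadController(ConnectionProvider connection) {
        this.connection = connection;
    }
}

public class AppModule extends AbstractModule {
    @Override
    protected void configure() {
        // The binding tells Guice which implementation backs the interface
        bind(ConnectionProvider.class).to(DBConnection.class);
    }

    public static void main(String[] args) {
        Injector injector = Guice.createInjector(new AppModule());
        LoadController controller = injector.getInstance(LoadController.class);
    }
}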
Deciding whether or not to use a dependency injection framework is a tradeoff between the work you would need to do without one and the additional time and complexity of integrating the framework into your application. So it needs to be an architectural decision that you make; there is no generic right or wrong answer for every application on whether the use of such frameworks is justified.
If I decide that using an injection framework is superfluous for my requirements, then am I not doing anything inherently incorrect by having multiple hard dependencies in some classes, as shown above?
Well, the dependencies need to be defined somewhere: either in an inferred or centralized location, as dependency injection systems use, or locally in the given classes, as you might determine from a traditional responsibility-driven design approach. So you are not necessarily doing anything wrong by having hard dependencies; abstract decoupling patterns such as dependency injection aren't always required.
The trick is determining which dependencies to have where and how to manage them. Often it is just obvious and falls naturally out of the problem domain; sometimes techniques such as CRC modeling can help structure the dependencies.
Related article:
Inversion of Control Containers and the Dependency Injection pattern by Martin Fowler.
My assumption is that I can refactor my large classes into smaller, cohesive classes with some of these classes having multiple hard dependencies, using new.
Yes, you can certainly do that.
Can an injection framework be integrated later on in a project's life, rather than early on when it may not be required yet?
Yes it can. It will be a bit of work to do so, but if the application is well structured, not all that difficult. It is more difficult to go the other way and try to remove usage of a dependency injection framework from applications and libraries that are already based on it.
Related:
Passing Parameters JavaFX FXML

How do I add an unknown amount of unknown modules in Dagger 2?

I have several modules that I don't know at compile time (think "plugins"). They all implement a "tag" interface MyModule:
public interface MyModule {}
I've instantiated them thanks to ServiceLoader and @AutoService.
How do I add them all to my component builder?
The Dagger 2 authors seem to think that this question is relevant to Stack Overflow. I don't believe that it is, because it looks like a missing use case, but well, I give them the benefit of the doubt here and post it.
I know I could use Guice or Dagger 1, but as said in the ticket to the Dagger 2 team, I don't want any reflection (bar ServiceLoader), and Dagger 1 is now deprecated, with the recommendation to switch to Dagger 2 (which is why I'm trying to upgrade my Dagger 1 project to Dagger 2).
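To make the setup concrete, a plugin registered this way might look like the following sketch (SomePluginModule and its provided value are hypothetical; @AutoService generates the META-INF/services entry that ServiceLoader reads):

import com.google.auto.service.AutoService;
import dagger.Module;
import dagger.Provides;

// MyModule is the tag interface from the question
@AutoService(MyModule.class) // discovered at runtime via ServiceLoader.load(MyModule.class)
@Module
public class SomePluginModule implements MyModule {
    @Provides
    String providePluginName() {
        return "some-plugin";
    }
}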
This is impossible as it stands, and is also outside of Dagger 2's charter. See the opening sentence of the project overview (emphasis mine):
Dagger is a fully static, compile-time dependency injection framework for both Java and Android.
When you're asking for your object graph to handle arbitrary marker-interface modules, you're precluding Dagger from knowing which @Provides methods it will have access to, which deprives Dagger of the ability to inspect and wire up your component at compile time. So for Dagger to support your use case as you've stated it, it would have to reverse a number of core architectural decisions and give up the advantages of compile-time inspection and code generation. Though I'm not on the Dagger team, I imagine this would make it a poor fit for the foreseeable future, which I imagine is part of the reason your issue was marked Working As Intended.
That said, there are a number of ways you could use multiple coexisting components (one per plugin) to keep static analysis and code generation in the core application while supporting an arbitrary number of plugins. For instance, if your plugins have a predictable set of dependencies, you can create a new PluginModule(PluginFactory... factoriesToSupport), which would offer a @Provides Set<Plugin> createPluginSet(Dep1 dep1, Dep2 dep2) method to iterate across the plugin factories and return a set of plugins. In addition, if the plugin dependencies represent a large proportion of your top-level Component and your build graph supports it, you could inject the Component itself into the @Provides method, and then let your individually-built PluginFactory instances defer to their own (statically analyzed, code-generated) Dagger components to produce the plugin instances. In short, as long as you're in Dagger, you'll need to play by Dagger's compile-time analysis rules, but within those constraints you can still likely come to a workable solution.
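A sketch of the first suggestion, using the names from the paragraph above (Plugin, PluginFactory, Dep1 and Dep2 are placeholders standing in for your actual types):

import dagger.Module;
import dagger.Provides;
import java.util.LinkedHashSet;
import java.util.Set;

// Placeholder types for illustration only
interface Plugin {}
interface PluginFactory { Plugin create(Dep1 dep1, Dep2 dep2); }
class Dep1 {}
class Dep2 {}

@Module
public class PluginModule {
    private final PluginFactory[] factories;

    // The factories are discovered at runtime (e.g. via ServiceLoader),
    // but their dependencies are still wired statically by Dagger.
    public PluginModule(PluginFactory... factoriesToSupport) {
        this.factories = factoriesToSupport;
    }

    @Provides
    Set<Plugin> createPluginSet(Dep1 dep1, Dep2 dep2) {
        Set<Plugin> plugins = new LinkedHashSet<>();
        for (PluginFactory factory : factories) {
            plugins.add(factory.create(dep1, dep2)); // predictable dependency set
        }
        return plugins;
    }
}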

Java SE - Clever way to implement "plug and play" for different library modules

I'm trying to do something clever. I am creating a weather application in which we can replace the weather API with another weather API without affecting the code base. So I started with a Maven project with multiple modules.
I have a Base module that contains the Interface class and the Base class. The Interface class contains the calls to the APIs (all calls are similar, if not exact) and the Base class contains the properties to the APIs (again, all properties are similar, if not exact).
I have a module for each of the two weather APIs we are testing with plans to create more modules for new weather APIs as we grow the application.
Finally, I have created a Core module (includes main) to implement the specific module class for the weather API I want to test.
Now, I know the simplest way to do this would be to use a switch statement and enumeration. But I want to know if there is a more clever way to do this. Maybe using a Pattern? Any suggestions?
Here is a picture of the structure I have just described:
Here is the UML representation:
This is a learning process for me. I want to discover how a real Java Guru would implement the appropriate module and class based on a specified configuration.
Thank you for your suggestions.
I'm trying to do something clever. I am creating a weather application
in which we can replace the weather API with another weather API
without affecting the code base.
Without reading further down, this first statement makes me think of a plugin architecture design. But in the process of software design, decisions must not be rushed: the longer you delay, the more information you have, and the better informed your decision can be. For now it is just an idea to keep in mind.
I have a Base module that contains the Interface class and the Base
class. The Interface class contains the calls to the APIs (all calls
are similar, if not exact) and the Base class contains the properties
to the APIs (again, all properties are similar, if not exact).
When different modules share behaviour/state, it is a good idea to refactor them and produce abstract base classes and interfaces, so you are on the right track. But if there are differences, those shouldn't be refactored into the base module. The reason is simple: maintainability. If you start adding if clauses or switches to deal with these differences, you have just introduced coupling between modules, and you'll always be making changes in the base module whenever you add or modify other modules, which is not desirable at all.
This is reflected by the Open/Closed principle from the SOLID principles, which states that a class should be open for extension but closed for modification.
So after you've refactored the common behaviour into the base module, each new API should extend the base module, as you did.
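As a sketch of that base module (using the IWeather name that appears further down; the method and field names are made up):

// Interface class: the calls all weather APIs share
interface IWeather {
    String getForecast(String location);
}

// Base class: the properties all weather APIs share
abstract class WeatherBase implements IWeather {
    protected String apiKey;
    protected String endpoint;
}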
Finally, I have created a Core module (includes main) to implement the
specific module class for the weather API I want to test.
Now, I know the simplest way to do this would be to use a switch
statement and enumeration. But I want to know if there is a more
clever way to do this. Maybe using a Pattern? Any suggestions?
Indeed, making use of a switch makes it work, but it's not a clean design at all, for the same reason as before: adding, modifying or removing modules would require modifying this module as well, and the code can potentially break.
One possible solution would be to delegate this responsibility to a new component and make use of a creational design pattern like the Abstract Factory, which provides an interface for instantiating components without specifying their concrete classes.
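A minimal Abstract Factory sketch along those lines, reusing the IWeather and WeatherBase types sketched above (ForecastIoFactory and ForecastIoWeather are hypothetical names; the Core module depends only on the factory interface):

interface WeatherApiFactory {
    IWeather createWeatherApi();
}

// Lives in the Forecast IO module; Core never references ForecastIoWeather directly
class ForecastIoFactory implements WeatherApiFactory {
    @Override
    public IWeather createWeatherApi() {
        return new ForecastIoWeather();
    }
}

class ForecastIoWeather extends WeatherBase {
    @Override
    public String getForecast(String location) {
        return "sunny"; // stub implementation for illustration
    }
}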
As for the architecture, the plugin idea still makes sense so far. But what if the different modules extend the base contract, adding more features? One option is to use the Facade pattern to adapt the module calls and provide an output that implements the interface clients expect.
Then again, these are the solutions I'd suggest with the details provided, but the scenario should be studied carefully and in greater detail in order to be sure these are the right tools for the job before committing to them.
In addition to Salvador Juan Martinez's answer...
To implement a plugin architecture, Java's Jar File Specification provides support for service provider interfaces (SPI) and defines how they are looked up.
As of Java 1.6 you can use the ServiceLoader to look up service providers. For Java 1.5 and below you must do it on your own or use a library, e.g. commons-discovery.
The usage is quite simple. In your case, put a META-INF/services/com.a2i.weatherbase.IWeather file in each plugin module.
In the Weather Forecast IO module the file should contain only one line:
com.a2i.weatherforecastio.ForecastIO
The line must be the fully qualified name of an IWeather implementation class.
Do the same for the other module and you can load the implementations via ServiceLoader.
import java.util.Iterator;
import java.util.ServiceLoader;

ServiceLoader<IWeather> weatherServicesLoader = ServiceLoader.load(IWeather.class);
Iterator<IWeather> weatherServices = weatherServicesLoader.iterator();
Now it depends on your runtime classpath how many services will be found. Try adding and removing module jar archives from the classpath and running your application.
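For example, since ServiceLoader is Iterable, a for-each loop over the discovered implementations works as well:

for (IWeather weather : weatherServicesLoader) {
    System.out.println("Found provider: " + weather.getClass().getName());
}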
EDIT
I wrote a blog about a pluggable architecture with standard java. See http://www.link-intersystems.com/blog/2016/01/02/a-plug-in-architecture-implemented-with-java/
Source code is also available at https://github.com/link-intersystems/blog/tree/master/java-plugin-architecture
One solution is to define a common interface with all the identified common operations. The extensions/plugins need to implement that interface and provide implementations of the common operations.
You can use an abstract factory design pattern to hook up the exact implementation at runtime based on the input parameters.
Interfaces and abstract classes are always good in such scenarios. Thanks.
