Can custom plugin for Intellij IDEA use DI in its code? - java

I am developing a plugin for Intellij IDEA 2018.2+ which will provide some additional inspections.
I have already learnt that there is a plugin.xml file which is the "heart" of the plugin and is responsible for its main behaviour.
As I understand it, to implement (for example) additional inspection behaviour we need to define an inspectionToolProvider in plugin.xml and implement the InspectionToolProvider interface. The same structure applies to other extensions - we define something in the .xml and implement some interface.
What bothers me is that if I want to implement some more-or-less complex algorithm, it looks like I need to use lots of static methods and utility classes, because I haven't found a way to use DI (e.g. Spring) during plugin development.
Some examples in Intellij IDEA SDK docs also show "helper" methods as static ones defined in utility classes.
So overall question: is there a way to use dependency injection during Intellij IDEA plugin development?

IntelliJ IDEA has its own dependency injection, managed by PicoContainer. It allows you to inject any component or service into the constructor of any component, service or extension created on the same or lower level (possible levels are application, project and module). To use it, you simply declare a constructor parameter of the corresponding type; you do not need to apply any extra annotations.
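For example, a project-level service can simply declare its dependency as a constructor parameter. The following is a minimal sketch, not actual plugin code: Project is stood in by a local interface so the snippet compiles on its own, and MyInspectionHelper is a hypothetical name; a real plugin would import com.intellij.openapi.project.Project instead.

```java
// Local stand-in so this sketch is self-contained; real plugin code would use
// com.intellij.openapi.project.Project.
interface Project {
    String getName();
}

// A project-level service: when IntelliJ creates it, the platform's container
// (PicoContainer) passes the enclosing Project to the constructor automatically.
// No @Inject-style annotation is required.
class MyInspectionHelper {
    private final Project project;

    MyInspectionHelper(Project project) {
        this.project = project;
    }

    String describe() {
        return "Inspections for " + project.getName();
    }
}
```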
You can also start your own DI container (using Spring or any other framework) in your plugin, but then it will be your own responsibility to support the injection of core IntelliJ IDEA components.

Related

Best practices for runtime-only dependencies in OSGi

In line with the Open-Closed Principle, I typically design my Java packages and libraries in such a way that there is a generic "interface" or "API" package/library and one or more implementations (quite similar to many common APIs like JDBC or JAXP/SAX).
To locate an implementation (or sometimes multiple implementations) in the base API library without violating OCP, I commonly use Java's ServiceLoader mechanism, or occasionally classpath scanning via third-party libraries like ClassGraph or Reflections. From a Maven perspective, the implementations are brought in as runtime dependencies (as they're only needed at execution time, but not at compile time). Pretty standard stuff.
So, now, I want to make some of these packages available as OSGi bundles (with API and implementation in separate bundles), but since in OSGi each bundle has its own class loader, neither classpath scanning nor the ServiceLoader API will work for this purpose. At first glance, OSGi's "fragment" mechanism seems to be the closest equivalent to the plain-Java setup described above. In that scenario, the API bundle would be the "fragment host", and concrete implementations would attach as fragments to that host bundle. As the fragment host and all its attached fragments use the same class loader, the standard plain-Java mechanisms like ServiceLoader or ClassGraph would conceivably still work. This would also have the advantage that there would be no need to detect whether a library/bundle is running in an OSGi context or not, and no OSGi framework dependencies are needed.
So, in a nutshell, my question is: are fragments the correct way to implement runtime-only dependencies in OSGi or is there a better (or more standard) way? Preferably, I'm looking for a solution that works in an OSGi container but does not require a dependency on OSGi itself.
No. Fragments are almost always wrong outside of translations (localization). The OSGi model is to use services.
The way to go then is to use DS (Declarative Services). Using bnd (in Maven, Gradle, Ant, sbt, or Bndtools) you can create components. A component is a Plain Old Java Object (POJO) that is annotated with injection and activation instructions. You can make those components take all their dependencies in the constructor.
bnd uses the annotations to generate an XML file that is used at runtime to create, activate, inject, and register those components. This works out of the box in an OSGi framework. The annotations are build-time only, so they do not add dependencies to your runtime.
In your non-OSGi environment, you are responsible for calling that constructor yourself. So you gather your dependencies using the ServiceLoader and then construct the components in the right order.
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component
public class MyComponent implements Foo {
    final Bar bar;

    @Activate
    public MyComponent(@Reference Bar bar) {
        this.bar = bar;
    }
    ...
}
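Outside OSGi, the same POJO can be wired by hand, as described above. A self-contained sketch of that manual wiring follows; the Foo and Bar types are restated locally, and the default-bar fallback is an assumption added so the snippet runs without a provider jar on the classpath.

```java
import java.util.ServiceLoader;

interface Bar { String name(); }
interface Foo { String greet(); }

class MyComponent implements Foo {
    final Bar bar;

    MyComponent(Bar bar) { this.bar = bar; }

    public String greet() { return "hello from " + bar.name(); }
}

class PlainJavaWiring {
    static Foo wire() {
        // Look up a Bar registered via META-INF/services; fall back to a
        // default here so the sketch works without any provider jar.
        Bar bar = ServiceLoader.load(Bar.class)
                .findFirst()
                .orElse(() -> "default-bar");
        // The same constructor DS would call inside an OSGi framework:
        return new MyComponent(bar);
    }
}
```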

How to do Constructor Dependency Injection in Netbeans Platform

I've been asking myself for a while how to do CDI in the NetBeans Platform, especially with TopComponents or subclasses of them. Right now I'm using Lookups to get my dependencies and for inter-module communication, and it's working fine, but the dependencies of my components are not visible from the outside as they would be with CDI. So I'm searching for a way to populate the constructors of my TopComponents with the right arguments (loosely coupled through interface types). I'm currently using 3 modules:
API - contains the interfaces
Core - contains implementations of the interfaces, have API as a dependency
GUI - contains my GUI and logic code encapsulated in TopComponents also have a dependency on API
As you see both Modules (GUI and Core) rely on API because of the loose-coupling in the modular system. I think it is nice to use LookUps to find the right implementations for the interfaces in API but as I said I also want to have the dependencies visible to the outside of my component classes.
So is there any way for doing Constructor Dependency Injection in a modular loose-coupled architecture using Netbeans Platform(Version 8.0.2) and if yes how?
If no what is the best solution to provide a clear view on the dependencies of the component classes?
Any help will be appreciated.
I think I found a solution that solves part of the problem. Normally my TopComponents are accessed through the
Window -> TopComponentNameHere
action (which is generated by the annotations I use for the TopComponent).
The thought was that you can also initialize the TopComponent inside of actions and show it there too. There you use a simple new statement to create an instance of the TopComponent you want to show. Here is the point: you can load the component's dependencies beforehand via Lookups and pass them directly into the constructor (in my eyes not a full solution, because the action has to initialize the TopComponent, but still).
I think this is way nicer than grabbing the dependencies right in the component initialization code, and for the moment it is enough for me to live with.
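The pattern described above can be sketched as follows. All type names here are hypothetical stand-ins so the snippet compiles on its own; in a real module, ApiService would live in the API module and the action would resolve it via org.openide.util.Lookup.getDefault().lookup(ApiService.class).

```java
// Hypothetical API interface (stand-in for something resolved via Lookup).
interface ApiService {
    String query(String input);
}

// The TopComponent subclass now shows its dependency in the constructor:
class SemanticTopComponent /* extends TopComponent */ {
    private final ApiService service;

    SemanticTopComponent(ApiService service) {
        this.service = service;
    }

    String render() {
        return service.query("hello");
    }
}

// The action resolves the dependency and hands it to the constructor:
class OpenSemanticTopComponentAction {
    static SemanticTopComponent perform(ApiService service) {
        SemanticTopComponent tc = new SemanticTopComponent(service);
        // tc.open(); tc.requestActive();  // the real TopComponent calls
        return tc;
    }
}
```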

Splitting my gwt app into specific gwt code and a generic Java project?

I have a gwt project that acts as a semantic engine for other projects.
I recently realized very, very little of the code is specific to GWT. It's almost all pretty basic Java. In fact, the only thing specific to GWT is retrieving files.
So what I would like to do is separate out the GWT parts completely so I can use the same basic code for other Java projects - such as Android or Processing apps.
That way, the "Semantic Core" project could be inherited by GWT, Android and Processing apps and I won't have to maintain separate versions for each.
To do this, however, I need some way for other projects to "give" the Semantic Core project their own file-handling methods.
My current idea of how to do this:
One method would be to have SemanticCore define an interface FileManager with a method like
getFile(String, MyErrorHandler, MySuccessHandler)
and then have the classes for MyErrorHandler and MySuccessHandler also defined in the SemanticCore project, effectively being runnables that take a String as a parameter.
With this interface defined, other projects (GWT, Android, etc.) have to define their own class that implements it,
e.g. GWTFileHandler implements FileManager.
Then create an object of this class and pass it to the SemanticCore:
SemanticCore.setFileManager(new GWTFileHandler());
The SemanticCore can then use it at its leisure to retrieve files in a way suitable for the platform it's running on.
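The handshake just described can be sketched as follows. The names come from the question; the exact handler signatures are assumptions made for the sake of a self-contained example.

```java
// Handlers defined in the SemanticCore project; effectively runnables that
// take a String parameter (signatures are assumptions).
interface MyErrorHandler { void run(String error); }
interface MySuccessHandler { void run(String content); }

interface FileManager {
    void getFile(String name, MyErrorHandler onError, MySuccessHandler onSuccess);
}

class SemanticCore {
    private static FileManager fileManager;

    static void setFileManager(FileManager fm) { fileManager = fm; }

    // The core asks the injected FileManager for a file; the platform-specific
    // implementation (GWT, Android, ...) decides how the bytes are fetched.
    static void loadFile(String name, MySuccessHandler onSuccess) {
        fileManager.getFile(name, err -> System.err.println(err), onSuccess);
    }
}
```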
Question;
Is this a good way to do it? It seems wrong to me that I am creating a new object when I'll only be using static methods from that class.
Alternatives?
I hope my description is clear. As all of this has to be GWT compatible in the "SemanticCore" project, any use of reflection is ruled out.
Thanks,
The recommended approach IMO is to use Deferred Binding to pick the GWT-compatible version of your FileHandler or other GWT-specific implementations. Extract the common interface for all versions, and in your GWT module file point to the correct GWT implementation.
You can then instantiate your specific implementation using GWT.create:
MyInterface implementation = GWT.create(MyInterface.class);
More in-depth info is on the gwtproject site:
Deferred Binding is a technique used by the GWT compiler to create and select a specific implementation of a class based on a set of parameters. In essence, deferred binding is the GWT answer to Java reflection. It allows the GWT developer to produce several variations of their applications custom to each browser environment and have only one of them actually downloaded and executed in the browser.
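The module-file rule for such a replacement looks roughly like this (a sketch; the package and class names are hypothetical, reusing the FileManager/GWTFileHandler names from the question above):

```xml
<!-- In the project's .gwt.xml module file -->
<module>
  <replace-with class="com.example.gwt.GWTFileHandler">
    <when-type-is class="com.example.core.FileManager"/>
  </replace-with>
</module>
```

With this rule in place, GWT.create(FileManager.class) returns a GWTFileHandler in the compiled browser code.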

How to refactor a codebase that uses spring autowiring

I've inherited two fairly non-trivial codebases that use Spring for configuring the applications. Now I need to reconfigure the applications, but much of the configuration is provided through autowiring, so it is almost impossible to find out what the actual configuration is.
The projects are moderately sized, some 20-ish maven modules per project including integration test modules and such. Most modules define a few application contexts for various purposes, that contain one or two local spring config files along with one or two from the core modules it depends on. The result is a myriad of configurations, and that I cannot alter a class or variable name (or setter method) without risking breaking dependencies in some upstream or downstream module, even if no such dependency is visible anywhere in the project.
How do I work effectively with autowired dependencies in spring?
Can anyone, perhaps someone who actually likes autowiring, provide some insight into how you work with them effectively?
(I also inherited a small project that combines xml-files, autowiring and annotation-driven config, making dependency relations completely intractable, but I'll save those annotations for a separate question later)
You can refactor autowired beans using IntelliJ IDEA (I have version 9 Ultimate). IntelliJ also has an option for making autowired dependencies explicit; link provided below:
http://blogs.jetbrains.com/idea/2009/03/making-spring-autowired-dependencies-explicit/
What IDE are you using? Spring STS (an Eclipse-based IDE) has a lot of tools for working with Spring annotations and autowiring, as well as a good set of refactoring tools.

Best Practice For Referencing an External Module In a Java Project

I have a Java project that expects external modules to be registered with it. These modules:
Implement a particular interface in the main project
Are packaged into a uni-jar (along with any dependencies)
Contain some human-readable meta-information (like the module name).
My main project needs to be able to load at runtime (e.g. using its own classloader) any of these external modules. My question is: what's the best way of registering these modules with the main project (I'd prefer to keep this vanilla Java, and not use any third-party frameworks/libraries for this isolated issue)?
My current solution is to keep a single .properties file in the main project with key=name, value=class |delimiter| human-readable-name (or coordinate two .properties files in order to avoid the delimiter parsing). At runtime, the main project loads in the .properties file and uses any entries it finds to drive the classloader.
This feels hokey to me. Is there a better way to do this?
The standard approach in Java is to define a Service Provider.
Let all modules express their metadata via a standard XML file. Call it "my-module-data.xml".
On startup, your main container looks for classpath*:my-module-data.xml (which can declare a FrontController class) and delegates to each module's FrontController class to do whatever it wants :)
Also google for Spring-OSGi; their docs can be helpful here.
Expanding on @ZZ Coder...
The Service Provider pattern mentioned, and used internally within the JDK, is now a little more formalized in JDK 6 with ServiceLoader. The concept is further expanded upon by the NetBeans Lookup API.
The underlying infrastructure is identical. That is, both API use the same artifacts, the same way. The NetBeans version is just a more flexible and robust API (allowing alternative lookup services, for example, as well as the default one).
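A minimal ServiceLoader sketch of the registration described in these answers follows. The PluginModule interface and its methods are hypothetical stand-ins for the main project's module contract; real external jars would list their implementation class in a META-INF/services provider-configuration file named after the fully qualified interface name.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Hypothetical contract from the question: an interface in the main project
// plus human-readable metadata such as the module name.
interface PluginModule {
    String name();
}

class ModuleRegistry {
    // Each external jar advertises its implementation via a
    // META-INF/services provider file; no .properties bookkeeping needed.
    static List<String> discoverModuleNames() {
        List<String> names = new ArrayList<>();
        for (PluginModule m : ServiceLoader.load(PluginModule.class)) {
            names.add(m.name());
        }
        return names; // empty when no provider jars are on the classpath
    }
}
```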
Of course, it would be remiss to not mention the dominant, more "heavyweight" standards of EJB, Spring, and OSGi.
