Very new to Eclipse plugin development.
I have converted a jar project into an Eclipse plugin, but I really don't know how to make use of it. Some basic doubts:
How do I call a method available in the plugin from my program?
Must every exposed method be public in order to be usable from my program?
My idea is something like a plugin that sums two numbers: the user installs the plugin and calls its add(x, y) method, just like calling a method from an included jar.
There are many tutorials explaining how to create a plugin, but I couldn't find one explaining how to use it.
What you are describing is a plain OSGi bundle with no Eclipse-specific features. In terms of the New Plug-in wizard, yours "doesn't contribute to the UI"; technically, that means it doesn't need a plugin.xml.
The way outside code perceives the bundle is just as if it were a regular jar: you can access its classes, instantiate them, and call their methods, or call static methods, just as you are used to.
The additional layer provided by OSGi lets you declare which Java packages your bundle exports to its users. Therefore a class that is public but doesn't reside in an exported package is not accessible to other bundles (this applies only in strict mode, however; otherwise you only get an Access Restriction warning).
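For example, a minimal sketch (names hypothetical): a public class in a package listed in the bundle's Export-Package manifest header is called like a class from any ordinary jar:

// in the bundle; com.example.math appears in the manifest's Export-Package header
package com.example.math;

public class Calculator {
    // A plain public static method; nothing OSGi-specific is required.
    public static int add(int x, int y) {
        return x + y;
    }
}

A client that can see the bundle then simply calls Calculator.add(2, 3), exactly as with a jar on the classpath.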
I think this is the situation you are describing...
You have a plugin that you want Eclipse Java (JDT) users to install. In their Java projects, you want them to be able to use some of the Java classes in your plugin.
In Java, a class has to be found on a classpath by a class loader. JDT manages the classpath for projects through "classpath containers." The first example of this is when you create a Java project: JDT adds "JRE System Library" as a container. You can see it under the project in the Package Explorer.
Another example of this is the JUnit plugin. You'll notice that the first time you add a JUnit Test Case to a JDT project, a dialog asks about adding the JUnit library to the build path. (This is an explicit behavior of the JUnit plugin's New File Wizard.) If you agree, you'll see the "JUnit 4" container in the Package Explorer.
Yet another example: PDE expands on what JDT does. When you create a Plugin project, PDE adds a "Plug-in Dependencies" container that it manages based on the plugin dependencies you declare in the plugin manifest.
Users can create and reference their own classpath containers for their favorite libraries.
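For reference, each container appears in the project's .classpath file as a single "con" entry; roughly, for the JRE and JUnit 4 containers:

<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.jdt.junit.JUNIT_CONTAINER/4"/>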
But, of course, as a library provider, you want to give them one like the JUnit plugin does. To do that, in your plugin:
Add a dependency on JDT Core
Extend this extension point: org.eclipse.jdt.core.classpathContainerInitializer (see the plugin.xml sketch below)
If you want a wizard page to create or edit a classpath container entry:
Add a dependency on JDT UI
Extend from this extension point: org.eclipse.jdt.ui.classpathContainerPage
Some plugins use the wizard page to customize the container (JUnit allows picking JUnit 3 or 4); others just use the page to provide information about the container.
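A minimal plugin.xml sketch for the first extension point (the id and class values are hypothetical):

<extension point="org.eclipse.jdt.core.classpathContainerInitializer">
   <classpathContainerInitializer
         id="com.example.MY_LIBRARY"
         class="com.example.MyLibraryContainerInitializer"/>
</extension>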
See the JDT documentation topic Setting the Java build path and cross-reference the source code of any examples that are familiar to you.
Here is a good article: Simplify Eclipse classpaths using classpath containers
To answer your questions:
You have to add the classes to the classpath using the initialize method of your subclass of ClasspathContainerInitializer (see the sketch below).
Yes, methods that you want clients to call must be public and be members of the classes you add to the classpath.
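A rough sketch of that initialize method (the jar path, names, and container description are hypothetical; error handling omitted):

import org.eclipse.core.runtime.CoreException;
import org.eclipse.core.runtime.IPath;
import org.eclipse.core.runtime.Path;
import org.eclipse.jdt.core.ClasspathContainerInitializer;
import org.eclipse.jdt.core.IClasspathContainer;
import org.eclipse.jdt.core.IClasspathEntry;
import org.eclipse.jdt.core.IJavaProject;
import org.eclipse.jdt.core.JavaCore;

public class MyLibraryContainerInitializer extends ClasspathContainerInitializer {
    @Override
    public void initialize(final IPath containerPath, IJavaProject project) throws CoreException {
        IClasspathContainer container = new IClasspathContainer() {
            public IClasspathEntry[] getClasspathEntries() {
                // Resolve the container to the library jar your plugin ships (hypothetical path).
                return new IClasspathEntry[] {
                    JavaCore.newLibraryEntry(new Path("/path/to/my-library.jar"), null, null)
                };
            }
            public String getDescription() { return "My Library"; }
            public int getKind() { return IClasspathContainer.K_APPLICATION; }
            public IPath getPath() { return containerPath; }
        };
        // Bind the container path to this concrete container for the project.
        JavaCore.setClasspathContainer(containerPath,
                new IJavaProject[] { project },
                new IClasspathContainer[] { container }, null);
    }
}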
I have a project 'java11-core' that generates a test jar artifact to share with project 'java11-app'. These projects build fine with command line Maven, but within Eclipse, the classes shared in the test jar cannot be found.
Version Info:
Apache Maven 3.6.0 (command line and Eclipse)
Java version: 11.0.1, vendor: Oracle Corporation
Eclipse IDE: Version: 2018-09 (4.9.0)
M2E Plugin: 1.9.1.20180912-1601
I originally created these two projects as traditional non-JPMS projects. They compiled and ran tests normally, as expected. After I added module-info.java to both java11-core and java11-app, the Eclipse compiler could no longer recognize the shared test files from the core project.
Here is a snapshot of the package explorer for an overview of the project structure.
The added java11-app and java11-core module-info contents respectively:
module com.java11.app {
    exports com.java11.app;
    requires com.java11.core;
}

module com.java11.core {
    exports com.java11.core;
}
As you can see, I do not export the test utilities package from com.java11.core. I do not want to export the test packages because this would make the test classes publicly available. I also do not wish to introduce a new test project, because in real-world scenarios, this is very likely to require cyclic dependencies between test utilities and the projects they assist in testing.
Build errors appear in AppTest.java. What is interesting about the failure reported by Eclipse is that it does not claim it cannot find the CoreTestUtil class, but rather:
The type com.java11.test.core.util.CoreTestUtil is not accessible AppTest.java /java11-app/src/test/java/com/java11/app line 8 Java Problem
CoreTestUtil cannot be resolved AppTest.java /java11-app/src/test/java/com/java11/app line 21 Java Problem
My assumption is that the lack of an export for this package from java11-core, and/or the lack of a requires for it in java11-app, makes Eclipse believe the access is restricted, even though the classes exist in a separate test-jar.
The module path for java11-app shows it includes java11-core as a module, and the Without test code is set to No.
I know I am working with newly released features and suspect that sharing test classes across Eclipse JPMS projects is not yet supported. But I am not sure where to look (Eclipse? the M2E plugin?) for an update on when it will be supported. I am also not aware of a workaround that would let me stay productive while adopting JPMS for my software projects.
For those that believe test utilities should not be shared this way...
This subject has been characterized as a best-practice issue that should be resolved by refactoring test utilities into a separate module. I respect this perspective, but in attempting to follow that guidance, I found myself forced to violate other best practices, including DRY (Don't Repeat Yourself), and to introduce cyclic dependencies between packages.
It is common for a test utility to emerge while developing a module that both assists in effective testing of that module and depends on it. This creates a cycle if those utilities are pulled out into a separate module. Furthermore, some of these utilities can be equally useful when testing other modules that depend on that module. This creates duplicate code if those utilities are copied into a new test module for dependents. This reasoning may be why Maven's 'test-jar' support was originally added.
Eclipse does not support multiple module-info files per project: whichever source folder it lives in (main or test), you may only have one module-info.
From Eclipse's point of view, your only option is to create a new Java project referencing the other, with its own module-info/exports:
module mod.a {
    exports com.example.a;
    // com.example.a.Main
}

module mod.a.tests { // (1)
    exports com.example.a.tests;
    // com.example.a.tests.MainUtils calling com.example.a.Main
    requires mod.a;
}
In case (1), you will have problems if you don't use mod.a.tests: Java will never find com.example.a.Main, probably because the second project shadows the first.
I am not an OSGi expert, but I think that is one of the reasons why most Eclipse plugins have a main project and a test project: org.eclipse.m2e.core is patched by org.eclipse.m2e.core.tests.
However, module-info has no knowledge of "patches": you may patch a module on the command line (java --patch-module), but not in module-info itself. Perhaps Eclipse could do that on your behalf, but it doesn't.
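For reference, the command-line equivalent looks roughly like this (module names and paths are hypothetical):

# compile the test sources as if they were part of mod.a
javac --module-path mods --add-modules mod.a --patch-module mod.a=src/test/java -d patch/mod.a src/test/java/com/example/a/tests/MainUtils.java
# run a test driver with the test classes merged into mod.a (assuming MainUtils has a main method)
java --module-path mods --patch-module mod.a=patch/mod.a --module mod.a/com.example.a.tests.MainUtils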
As you can see, two projects in Eclipse = two Maven modules.
In the case of Maven, you can certainly create other artifacts from the same build (though I think it tends to pollute the dependencies, because every time a secondary artifact requires a dependency, that dependency has to go into the common scope).
This can be done using maven-compiler-plugin, maven-shade-plugin, and maven-jar-plugin, with two caveats:
I think you should not rely on test-jar, because you want to emulate Java's --patch-module by merging the classes and test-classes directories.
You don't want to import this project into Eclipse, due to the multiple module-info files; or you must ensure that its module-info is only visible to Maven (you can use a profile + the m2e.version property to detect m2e and disable it).
I fully agree with you. Why should I only use the src-main code from some core module when I could also inherit some of its src-test functionality?
But how to handle the scope problem? When I use the "test" scope, I lose the relation to the src-main code. When I don't use the test scope, I lose the relation to the src-test code.
My core test code does not change very often, so to get things working in Eclipse, I install the test-jar into my local repository, and everything works fine.
I'm developing an Eclipse plugin that scans and modifies the AST of the currently open Java project.
I want to create a Java annotation that will appear as a known annotation in projects that use the plugin. The annotation's RetentionPolicy will be SOURCE (so it is discarded by the compiler), yet the plugin will be able to identify (using the AST) methods marked with this annotation and handle them accordingly.
For example:
@SkipAnalysis
public void foo() {...}
This annotation will be analyzed by the plugin while traversing the AST, yet it holds no value for the compiler.
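Declaring the annotation itself is straightforward; a minimal sketch:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Source-only marker: discarded by the compiler, but visible to
// AST-based tooling while the source is being analyzed.
@Retention(RetentionPolicy.SOURCE)
@Target(ElementType.METHOD)
public @interface SkipAnalysis {}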
How can my plugin contribute annotations to an open project in the workspace?
After some research, it turns out that this is impossible since annotations (and other classes or interfaces) can only be contributed via the build path. Eclipse plugins can change the build path, but cannot contribute their own source code to any project.
One possibility is to create a library project containing the annotation, and then use the plugin's ability to modify the build path to add that library to the build path. However, this is cumbersome and adds unneeded dependencies to the project, and if the library is not available to every other developer working on the project, it may lead to compilation errors. An exception is when the project uses a build automation system (like Maven or Gradle) and the library is stored in a public repository (like Maven Central), ensuring that every user who imports the project, in any IDE, will download that library. Again: possible, but cumbersome.
This is more a question about what's out there and about future directions for dependency resolvers such as Ivy. Is there anything that can express class-level dependencies for a package, rather than package-level dependencies?
For example, let's say I have an apache-xyxy package that comes with an ivy.xml listing all its dependencies. But suppose I only use class WX in apache-xyxy, which doesn't require most of those dependencies. Couldn't a resolver be intelligent enough to identify that class WX can only possibly invoke a certain set of other classes (AB, DC, EF), and that none of those classes use any of the other dependencies, and so compute a minimal subset of required dependencies? This would be easier and safer than cherry-picking package dependencies to remove based on the specific classes used, and it would also avoid breaking larger packages into smaller ones just for this reason.
Then, if I later decided to use class GH from apache-xyxy, I could do an ivy resolve, and it would dynamically bring in the additional required libraries.
When packaging compiled Java code for distribution, it's common practice to bundle Java "packages" together. It's also quite possible (but silly) to split a Java package across multiple jars. Large frameworks (like Spring) have lots of sub-packages in different jars so that users can pick and choose what they need at run-time. Of course, the more jar options one has, the more complex it becomes to populate the run-time classpath.
The keyword here is "run-time". Tools like Apache Ivy and Apache Maven are primarily designed to manage dependencies needed at build time.
Apache Maven does have a "runtime" scope for its dependencies, but it's limited to a single list of jars. Typically this scope is used for deciding which jars are needed for testing and for populating the lib directory of a WAR file.
Apache Ivy has a similar but more flexible mechanism called "configurations". It's possible to create as many runtime configurations as you need, and these can be used to decide which jars are downloaded by Ivy.
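For example, a trimmed ivy.xml sketch (the organisation and module names are hypothetical) declaring a dedicated runtime configuration:

<ivy-module version="2.0">
    <info organisation="com.example" module="my-app"/>
    <configurations>
        <conf name="compile"/>
        <conf name="runtime" extends="compile"/>
    </configurations>
    <dependencies>
        <!-- only pulled in when resolving the "runtime" configuration -->
        <dependency org="org.apache" name="apache-xyxy" rev="1.0" conf="runtime->default"/>
    </dependencies>
</ivy-module>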
So while it would appear Ivy has the answer, I've rarely seen Ivy used when launching programs (the one exception is Groovy's Grape annotations).
So what, you might ask, is the answer?
The future of "run-time" classpath management is either OSGi or Project Jigsaw. I'm more familiar with OSGi, where special dependency indicators are added to the jar file's manifest, stating what its dependencies are. The idea is that when a container loads a jar (called a "bundle"), it can check whether the other dependencies are already loaded. These dependencies can be retrieved and loaded from a common repository. This is a fundamentally different way to launch Java; traditionally, each application is loaded onto its own isolated classpath.
Time will tell if either project catches on. In the meantime, we use Apache Ivy and Apache Maven to build self-contained, and possibly over-bloated, WAR (EAR, etc.) packages.
Currently, when I am writing a bundle that depends on a package, I have to "import" or "depend on" a whole other bundle in Maven that contains that package.
This seems like it is counter-productive to what OSGi gives me.
For example let's say I have two bundles: BundleAPI and BundleImpl.
BundleAPI provides the API interfaces:
// BundleAPI's manifest
Export-Package: com.service.api
BundleImpl provides the implementation:
// BundleImpl's manifest
Import-Package: com.service.api
However, when I am coding BundleImpl in Eclipse, I am forced to "depend" in the Maven POM on BundleAPI itself, so that Eclipse does not complain.
// BundleImpl's POM
<dependency>
    <groupId>com.service</groupId>
    <artifactId>com.service.api</artifactId>
    [...]
</dependency>
So, on the one hand I am depending only on the package com.service.api, while on the other I need to have the whole bundle, BundleAPI.
Is there a way to make maven or eclipse smart enough to just find the packages somewhere, instead of whole bundles?
I am very much confused as to how this works - any type of clarity here would be great. Maybe I am missing something fundamentally simple?
The key is to distinguish between build-time dependencies and runtime dependencies.
At build time you have to depend on a whole artifact, i.e. a JAR file or bundle. That's pretty much unavoidable because of the way Java compilers work. However at runtime you depend only on the packages you use in your bundle, and this is how OSGi manages runtime substitution. This is the Import-Package statement in your final bundle.
Of course as a developer you don't want to list two parallel sets of dependencies, that would be crazy. Fortunately maven-bundle-plugin is based on a tool called bnd that calculates the Import-Package statement for you based on analysing your code and discovering the actual packages used. Other tools such as bndtools (an Eclipse-based IDE for OSGi development) also use bnd in this way. Incidentally bnd is much more reliable and accurate than any human at doing this job!
So, you define only the module-level dependencies that you need at build time, and the tool generates the runtime package-level dependencies.
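For instance, with <packaging>bundle</packaging> set on the project, a minimal maven-bundle-plugin sketch lets bnd compute the Import-Package header from your compiled classes (the instruction shown is the default, included only for illustration):

<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <instructions>
            <!-- bnd analyses the classes and generates Import-Package -->
            <Import-Package>*</Import-Package>
        </instructions>
    </configuration>
</plugin>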
I would recommend against using Tycho because it forces you to use Eclipse PDE, which in turn forces you to manually manage imported packages (for the sake of full disclosure, I am the author of bndtools which competes against PDE).
You cannot develop bundles like regular Java projects with Maven and Eclipse. You basically have two options.
Apache Felix Bundle Plugin: you develop the project as a regular Java project and use Maven as you normally would. This plugin adds all the OSGi specifics to the jar manifest at packaging time to OSGi-enable it. The disadvantage of this approach is that you are working with a plain Java project in your workspace instead of a bundle, which makes running your project in the OSGi container a little extra work, since Eclipse doesn't recognize it as a plugin project. Thus you have to add the jar from the Maven build to the target platform manually.
Tycho: this is another Maven plugin that attempts to actually bring these two environments together, and it does a pretty good job of it. In this scenario, you create an Eclipse bundle/plugin project, which obviously makes for seamless integration in Eclipse. The POM then marks the project as being of the eclipse-plugin type, which effectively makes Maven resolve the project dependencies (defined in the manifest) via the target platform instead of via Maven itself.
I would take the Tycho approach as it gives a much more integrated approach with Eclipse.
Having the whole jar as a dependency shouldn't be a problem, that's how you have to do it with Maven anyway.
I am developing a web application with a lot of libraries, like Spring, Apache CXF, Hibernate, Apache Axis, Apache Commons, and so on. Each of these frameworks comes with a lot of *.jar libraries.
For development I simply take all of the delivered libraries and add them to my classpath.
Not all of these libraries are required for deployment, so is there a quick way to determine the required libraries (*.jar) that are actually used by my source code?
If you move your project to Maven, such things become easier:
mvn dependency:analyze
mvn dependency:tree
For your example, Maven + IDE + nice dependency diagrams could help a lot.
See an example of this: it's much easier to figure out what happens in a project this way, and you don't need to add "all delivered libraries" to your project - just what is required.
JDepend traverses Java class file directories and generates design quality metrics for each Java package. JDepend allows you to automatically measure the quality of a design in terms of its extensibility, reusability, and maintainability to manage package dependencies effectively.
So, as a quick, dirty, and potentially inefficient way, you can try this in Eclipse:
Create two copies of your project.
In project copy #2 remove all the jars from the classpath.
Pick a source file that now has errors because it can't resolve a class reference. Pick one of the unresolved classes and note its fully qualified class name.
Do Control-Shift-T and locate the unresolved class. You should be able to see which jar it's contained in, since all the jars are still in the classpath of project copy #1.
Add the jar that contains this unresolved class back into your classpath in project copy #2, then repeat steps 3 and 4 until all class references are resolved.
Unfortunately you're not done yet since the jar files themselves may also have dependencies. Two ways to deal with this:
Go read the documentation for all the third-party packages you're using. Each package should tell you what its dependencies are.
Run your application and see if you get any ClassNotFoundExceptions. If you do, then use Control-Shift-T to figure out what jar that class comes from and add it to your classpath. Repeat until your project runs without throwing any ClassNotFoundExceptions.
The problem with #2 is that you don't really know you've resolved all the dependencies since you can't simulate every possible execution path your project might take.