I'm developing a plugin for IntelliJ IDEA. How can the plugin get the name and version of the libraries that are imported into the project it is inspecting? I have a PsiClass from the project, but I cannot convert it to a java.lang.Class. Is there maybe a way to get a ClassLoader from a PsiElement?
super.visitImportStatement(psiImport);
Class importedClass = Class.forName(psiImport.getQualifiedName(), true, psiImport.getClass().getClassLoader());
psiImport.getClass().getClassLoader() returns the ClassLoader of the PsiImportStatementImpl class instead of the ClassLoader of the class that I've imported.
IntelliJ does mostly static analysis on your code. In fact, the IDE and the projects you run/debug have completely different classpaths. When you open a project, your dependencies are not added to the IDE classpath. Instead, the IDE indexes the JARs, meaning it automatically discovers all the declarations (classes, methods, interfaces, etc.) and saves them for later in a cache.
When you write code in your editor, the static analysis tool leverages the contents of this index to validate your code and show errors when, for example, you're trying to use unknown definitions.
On the other hand, when you run a Main class from your project, it will spawn a new java process that has its own classpath. This classpath will likely contain every dependency declared in your module.
Knowing this, you should now understand why you can't "transform" a PsiClass to a corresponding Class.
Back to your original question:
How can the plugin get the name and version of the libraries that are imported into the project it is inspecting?
You don't need to access Class objects for this. Instead, you can use the IntelliJ SDK APIs. Here's an example:
Module mod = ModuleUtil.findModuleForFile(virtualFile, myProject);
ModuleRootManager.getInstance(mod).orderEntries().forEachLibrary(library -> {
    // do your thing here with `library`
    return true;
});
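If you specifically need the name and version, here is a minimal sketch building on the snippet above (assuming `virtualFile` and `myProject` are in scope; for libraries imported from Gradle/Maven the library name usually encodes the coordinates, e.g. "Gradle: com.google.guava:guava:31.1-jre", so the version can often be parsed out of it, but that naming is an assumption about how the project was set up):

import com.intellij.openapi.module.Module;
import com.intellij.openapi.module.ModuleUtil;
import com.intellij.openapi.roots.ModuleRootManager;
import com.intellij.openapi.roots.OrderRootType;
import com.intellij.openapi.vfs.VirtualFile;

// ...

Module mod = ModuleUtil.findModuleForFile(virtualFile, myProject);
if (mod != null) {
    ModuleRootManager.getInstance(mod).orderEntries().forEachLibrary(library -> {
        String name = library.getName();                               // may be null for unnamed libraries
        VirtualFile[] jars = library.getFiles(OrderRootType.CLASSES);  // the JAR roots themselves
        System.out.println(name + " -> " + jars.length + " root(s)");
        return true; // keep iterating over the remaining libraries
    });
}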
I have started to learn the Java 9 Jigsaw (module system) feature and have read some articles and videos.
I can't understand the concept of optional dependencies (requires static).
quote from article:
When a module needs to be compiled against types from another module
but does not want to depend on it at run time, it can use a requires
static clause. If foo requires static bar, the module system behaves
differently at compile and run time:
At compile time, bar must be present or there will be an error. During
compilation bar is readable by foo.
At run time, bar might be absent
and that will cause neither error nor warning. If it is present, it is
readable by foo.
So I want to know a couple of things:
What is the reason to make a module depend on another module at compile time but not at run time? Any examples? Tools like Lombok?
Are there any analogs of optional dependencies in Java prior to Java 9?
P.S.
I found one more explanation:
quote from article:
Sometimes we write code that references another module, but that users
of our library will never want to use.
For instance, we might write a utility function that pretty-prints our
internal state when another logging module is present. But, not every
consumer of our library will want this functionality, and they don’t
want to include an extra logging library.
In these cases, we want to use an optional dependency. By using the
requires static directive, we create a compile-time-only dependency:
module my.module {
    requires static module.name;
}
But it is still completely unclear to me. Could anyone explain it in a simple way?
There are a decent number of libraries out there where it only makes sense to have them at compile time. Mostly this deals with annotations that only exist to help during development (e.g. prevent bugs, reduce boilerplate). Some examples include:
java-annotations by JetBrains
spotbugs-annotations by SpotBugs (successor of FindBugs)
Project Lombok (as you mentioned)
jcip-annotations
These annotations tend to have a RetentionPolicy of SOURCE or CLASS, which means they aren't useful (or even available) at runtime. Why ship these dependencies with the rest of your application when you deploy? Without requires static you would be forced to include them when you deploy, otherwise your application would fail to start due to missing dependencies.
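As a small, self-contained illustration of why such annotations are not needed at run time (the annotation here is made up for the demo), an annotation with CLASS retention is written into the .class file but is invisible to reflection, so the library that defines it never has to be present when the application runs:

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class RetentionDemo {

    // Kept in the .class file, but not loaded by the JVM at run time.
    @Retention(RetentionPolicy.CLASS)
    @interface CompileTimeOnly {}

    @CompileTimeOnly
    static void annotatedMethod() {}

    public static void main(String[] args) throws NoSuchMethodException {
        // Prints 0: reflection cannot see the annotation, so nothing at run time
        // depends on the annotation's defining library.
        System.out.println(
            RetentionDemo.class.getDeclaredMethod("annotatedMethod").getAnnotations().length);
    }
}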
You would declare these dependencies as optional pre-Java 9 as well. Many Java projects of any significance use a build tool such as Maven or Gradle. In addition to those tools automatically building and testing your project, a large part of what they do is dependency management. I'm not familiar enough with Maven, but when using Gradle one would use:
dependencies {
    compileOnly 'group.id:artifact-id:version'
}
to declare dependencies that are not needed at runtime.
If a dependent module must be available at compile time but is optional at run time, such a dependency is called an optional dependency. We can specify an optional dependency by using the static keyword.
Note: the static keyword says, "This dependency is mandatory at compile time and optional at runtime."
Example 1:
module moduleB {
    requires moduleA;
}
Here moduleA must be available at both compile time and run time; it is not an optional dependency.
Example 2:
module moduleB {
    requires static moduleA;
}
At compile time moduleA must be available, but at run time it is optional; that is, even if moduleA is not available at run time, the JVM will still execute the code.
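To make that concrete, here is a minimal sketch of how code inside moduleB typically guards its optional use of moduleA (the class name com.example.modulea.Formatter is made up; code that actually touches moduleA's types would normally live in a separate helper class that is only loaded after this check succeeds):

public final class ReportPrinter {

    // Detect once whether the optional module's classes were resolved at run time.
    private static final boolean MODULE_A_PRESENT = detectModuleA();

    private static boolean detectModuleA() {
        try {
            Class.forName("com.example.modulea.Formatter");
            return true;
        } catch (ClassNotFoundException e) {
            return false; // moduleA was not resolved; fall back to plain behaviour
        }
    }

    public static void print(Object state) {
        if (MODULE_A_PRESENT) {
            // delegate to a helper that uses moduleA's types for pretty-printing
            System.out.println("pretty: " + state);
        } else {
            System.out.println(state);
        }
    }
}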
I have a project with different classes and packages as dependencies. Note that everything written below occurs in one project.
I have a class that at some point runs the code getDiagramPanel().setRelationsPaintOrder(new Comparator() { ... }).
getDiagramPanel() calls the method from DjtSheet.class, which is located in a dependency .jar file. This method returns a DjtDiagramPanel object. I also have a DjtDiagramPanel.java file, which should override the one from the package and contains the method setRelationsPaintOrder().
In Java 7, this works fine: it correctly calls the method from the dependency, which returns the object as the class that overrides the panel class from the dependency package.
In Java 6, however, the panel class from the dependency package is returned instead of the one from my project.
java.lang.NoSuchMethodError:
com.dlsc.djt.gantt.DjtDiagramPanel.setRelationsPaintOrder(Ljava/util/Comparator;)V
Note that this message occurs at runtime! Compiling the project gives no errors.
How can I solve this?
This problem definitely means that you have a classpath problem. My guess is that the class DjtDiagramPanel is duplicated and you have two different versions: one that has the method setRelationsPaintOrder and one that does not. Apparently you compile the code against the "good" version and run against the "bad" one.
When this happens you can probably change the order of class loading by playing with the order of dependencies in Eclipse's project properties, but it will just fail later (in production). So you should find the root cause of the duplication.
First, find these two versions of the same class. Then find out how the bad version arrived on your classpath. It typically happens because of third-party dependencies. If you are using Maven, you can use the dependency plugin to find the root cause and exclude the offender with an "exclusion" tag.
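A small sketch that can help with that first step, namely finding out which JAR a class is actually loaded from at run time (using the class from the question; getCodeSource() may return null for classes loaded by the bootstrap class loader):

import java.net.URL;
import java.security.CodeSource;

public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Replace the name below with the class you suspect is duplicated.
        Class<?> clazz = Class.forName("com.dlsc.djt.gantt.DjtDiagramPanel");
        CodeSource source = clazz.getProtectionDomain().getCodeSource();
        URL location = (source != null) ? source.getLocation() : null;
        System.out.println(clazz.getName() + " was loaded from " + location);
    }
}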
I'm now working together with others on a Grails project. I have to write some Java classes, but I need access to a searchable object created with Groovy. It seems that this object has to be placed in the default package.
My question is: is there a way to access this object in the default package from a Java class in a named package?
You can't use classes in the default package from a named package.
(Technically you can, as shown in Sharique Abdullah's answer, through the reflection API, but classes from the unnamed namespace are not in scope in an import declaration.)
Prior to J2SE 1.4 you could import classes from the default package using a syntax like this:
import Unfinished;
That's no longer allowed. So to access a default package class from within a packaged class requires moving the default package class into a package of its own.
If you have access to the source generated by Groovy, some post-processing is needed to move the file into a dedicated package and add the "package" directive at its beginning.
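For instance (the file name and package are made up), if Groovy generates a class Searchable.java at the source root, the post-processing step moves it to com/example/search/Searchable.java and prepends the package declaration:

// com/example/search/Searchable.java
package com.example.search;

public class Searchable {
    // ... generated body unchanged ...
}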
Update 2014: bug 6975015, for JDK 7 and JDK 8, describes an even stricter prohibition against imports from the unnamed package.
The TypeName must be the canonical name of a class type, interface type, enum type, or annotation type.
The type must be either a member of a named package, or a member of a type whose outermost lexically enclosing type is a member of a named package, or a compile-time error occurs.
Andreas points out in the comments:
"why is [the default package] there in the first place? design error?"
No, it's deliberate.
JLS 7.4.2. Unnamed Packages says: "Unnamed packages are provided by the Java SE platform principally for convenience when developing small or temporary applications or when just beginning development".
In fact, you can.
Using the reflection API you can access any class so far. At least I was able to :)
// "FooBar" lives in the default package; looking it up by name at run time
// avoids the import restriction.
Class<?> fooClass = Class.forName("FooBar");
Method fooMethod = fooClass.getMethod("fooMethod", String.class);
String fooReturned = (String) fooMethod.invoke(fooClass.newInstance(), "I did it");
Use jarjar to repackage the jar file with the following rule:
rule * <target package name>.#1
All classes in the default package of the source jar file will be moved to the target package and thus become accessible.
You can use packages in the Groovy code, and things will work just fine.
It may mean a minor reorganization of code under grails-app and a little bit of pain at first, but on a large Grails project it just makes sense to organize things in packages. We use the standard Java package naming convention com.foo.<app>.<package>.
Having everything in the default package becomes a hindrance to integration, as you're finding.
Controllers seem to be the one Grails artifact (or artefact) that resists being put in a Java package. Probably I just haven't figured out the Convention for that yet. ;-)
Just to complete the idea: from inside the default package you can access objects residing in named packages.
I'm packaging a Java library as a JAR, and it's throwing many java.lang.IncompatibleClassChangeErrors when I try to invoke methods from it. These errors seem to appear at random. What kinds of problems could be causing this error?
This means that you have made some incompatible binary changes to the library without recompiling the client code. Java Language Specification §13 details all such changes, most prominently, changing non-static non-private fields/methods to be static or vice versa.
Recompile the client code against the new library, and you should be good to go.
UPDATE: If you publish a public library, you should avoid making incompatible binary changes as much as possible to preserve what's known as "binary backward compatibility". Updating dependency jars alone ideally shouldn't break the application or the build. If you do have to break binary backward compatibility, it's recommended to increase the major version number (e.g. from 1.x.y to 2.0.0) before releasing the change.
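As a minimal illustration of such a break (the class names are made up): a client compiled against version 1 of a library invokes the method as an instance method; if version 2 makes that method static and the client is not recompiled, the JVM throws IncompatibleClassChangeError when the call is resolved at run time.

// greeter-1.0.jar, what the client was compiled against:
public class Greeter {
    public void greet() {
        System.out.println("hello");
    }
}

// greeter-2.0.jar, an incompatible binary change (the instance method became static):
public class Greeter {
    public static void greet() {
        System.out.println("hello");
    }
}

// Client.java, compiled against 1.0 but run with 2.0 on the classpath, without recompiling:
public class Client {
    public static void main(String[] args) {
        new Greeter().greet(); // throws java.lang.IncompatibleClassChangeError at run time
    }
}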
Your newly packaged library is not backward binary compatible (BC) with the old version. For this reason some of the library's clients that are not recompiled may throw the exception.
This is a complete list of changes in Java library API that may cause clients built with an old version of the library to throw java.lang.IncompatibleClassChangeError if they run on a new one (i.e. breaking BC):
A non-final field becomes static,
A non-constant field becomes non-static,
A class becomes an interface,
An interface becomes a class,
If you add a new field to a class/interface (or add a new super-class/super-interface), then a static field from a super-interface of a client class C may hide an added field (with the same name) inherited from the super-class of C (a very rare case).
Note: There are many other exceptions caused by other incompatible changes: NoSuchFieldError, NoSuchMethodError, IllegalAccessError, InstantiationError, VerifyError, NoClassDefFoundError and AbstractMethodError.
A good paper about BC is "Evolving Java-based APIs 2: Achieving API Binary Compatibility" by Jim des Rivières.
There are also some automatic tools to detect such changes:
japi-compliance-checker
clirr
japitools
sigtest
japi-checker
Usage of japi-compliance-checker for your library:
japi-compliance-checker OLD.jar NEW.jar
Usage of clirr tool:
java -jar clirr-core-0.6-uber.jar -o OLD.jar -n NEW.jar
Good luck!
While these answers are all correct, resolving the problem is often more difficult. It's generally the result of two mildly different versions of the same dependency on the classpath, and is almost always caused by either a different superclass than the one originally compiled against being on the classpath, or by some import of the transitive closure being different, but generally at class instantiation and constructor invocation. (After successful class loading and constructor invocation, you'll get NoSuchMethodError or the like.)
If the behavior appears random, it's likely the result of a multithreaded program classloading different transitive dependencies based on what code got hit first.
To resolve these, try launching the JVM with -verbose:class as an argument, then look at the classes that were being loaded when the exception occurs. You should see some surprising information, for instance multiple copies of the same dependency, or versions you never expected or would not have accepted if you had known they were being included.
Resolving duplicate jars with Maven is best done with a combination of the maven-dependency-plugin and the maven-enforcer-plugin (or SBT's Dependency Graph Plugin), and then pinning or excluding those jars in your top-level POM, or via dependency overrides in SBT, to remove the duplicated versions.
Good luck!
I have also discovered that, when using JNI, invoking a Java method from C++, if you pass parameters to the invoked Java method in the wrong order, you will get this error when you attempt to use the parameters inside the called method (because they won't be the right type). I was initially taken aback that JNI does not do this checking for you as part of the class signature checking when you invoke the method, but I assume they don't do this kind of checking because you may be passing polymorphic parameters and they have to assume you know what you are doing.
Example C++ JNI Code:
void invokeFooDoSomething() {
jobject javaFred = FredFactory::getFred(); // Get a Fred jobject
jobject javaFoo = FooFactory::getFoo(); // Get a Foo jobject
jobject javaBar = FooFactory::getBar(); // Get a Bar jobject
jmethodID methodID = getDoSomethingMethodId(); // Get the JNI Method ID
jniEnv->CallVoidMethod(javaFoo,
methodID,
javaFred, // Woops! I switched the Fred and Bar parameters!
javaBar);
// << Insert error handling code here to discover the JNI Exception >>
// ... This is where the IncompatibleClassChangeError will show up.
}
Example Java Code:
class Bar { ... }
class Fred {
public int size() { ... }
}
class Foo {
public void doSomething(Fred aFred, Bar anotherObject) {
if (aFred.size() > 0) { // Will throw a cryptic java.lang.IncompatibleClassChangeError, because aFred is actually a Bar
// Do some stuff...
}
}
}
I had the same issue, and later I figured out that I was running the application on Java 1.4 while the application had been compiled with Java 6.
Actually, the reason was a duplicate library: one was located on the classpath directly, and the other was included inside a jar file that was on the classpath.
In my case the error appeared when I added the com.nimbusds library to my application deployed on WebSphere 8.5.
The below exception occurred:
Caused by: java.lang.IncompatibleClassChangeError: org.objectweb.asm.AnnotationVisitor
The solution was to exclude the asm jar from the library:
<dependency>
    <groupId>com.nimbusds</groupId>
    <artifactId>nimbus-jose-jwt</artifactId>
    <version>5.1</version>
    <exclusions>
        <exclusion>
            <artifactId>asm</artifactId>
            <groupId>org.ow2.asm</groupId>
        </exclusion>
    </exclusions>
</dependency>
Another situation where this error can appear is with Emma Code Coverage.
This happens when assigning an object to an interface. I guess it has something to do with the object being instrumented and no longer being binary compatible.
http://sourceforge.net/tracker/?func=detail&aid=3178921&group_id=177969&atid=883351
Fortunately this problem doesn't happen with Cobertura, so I've added the cobertura-maven-plugin to the reporting plugins of my pom.xml.
I've faced this issue while undeploying and redeploying a war with GlassFish. My class structure was like this:
public interface A{
}
public class AImpl implements A{
}
and it was changed to
public abstract class A{
}
public class AImpl extends A{
}
After stopping and restarting the domain, it worked out fine.
I was using GlassFish 3.1.43.
All of the above applied to me: for whatever reason I was doing a big refactor and started getting this. I renamed the package my interface was in and that cleared it. Hope that helps.
I have a web application that deploys perfectly fine on my local machine's Tomcat (8.0.20). However, when I put it into the QA environment (Tomcat 8.0.20), it kept giving me the IncompatibleClassChangeError, complaining that I was extending an interface. That interface had been changed to an abstract class. I compiled the parent and child classes, and still I kept getting the same issue. Finally, I wanted to debug, so I changed the version on the parent to x.0.1-SNAPSHOT, compiled everything, and now it is working. If someone is still hitting the problem after following the answers given here, please make sure the versions in your pom.xml are also correct. Change the versions to see if that works; if so, then fix the version problem.
My answer, I believe, will be IntelliJ-specific.
I had rebuilt clean, even going as far as to manually delete the "out" and "target" dirs. IntelliJ has an "Invalidate Caches and Restart" action, which sometimes clears odd errors. This time it didn't work. The dependency versions all looked correct in the Project Settings -> Modules menu.
The final answer was to manually delete my problem dependency from my local Maven repo. An old version of Bouncy Castle was the culprit (I had just changed versions and knew that would be the problem), and although the old version showed up nowhere in what was being built, deleting it solved my problem. I was using IntelliJ version 14 and upgraded to 15 during this process.
In my case, I ran into this error as follows: the pom.xml of my project defined two dependencies, A and B, and both A and B declared a dependency on the same artifact (call it C), but on different versions of it (C.1 and C.2). When this happens, for each class in C, Maven can only select one version of the class from the two versions (while building an uber-jar). It will select the "nearest" version based on its dependency mediation rules and will output a warning "We have a duplicate class...". If a method/class signature changes between the versions, it can cause a java.lang.IncompatibleClassChangeError exception if the incorrect version is used at runtime.
Advanced: if A must use v1 of C and B must use v2 of C, then we must relocate C in A's and B's poms to avoid the class conflict (the duplicate class warning) when building the final project that depends on both A and B.
An additional cause of this issue is having Instant Run enabled in Android Studio.
The fix
If you find you start getting this error, turn off Instant Run.
Android Studio main settings
Build, Execution, Deployment
Instant Run
Untick "Enable instant run..."
Why
Instant Run modifies a large number of things during development to make it quicker to push updates to your running app, hence "Instant Run". When it works, it is really useful. However, when an issue such as this strikes, the best thing to do is to turn off Instant Run until the next version of Android Studio is released.
Please check whether your code consists of two module projects that have the same class names and package definitions. For example, this could happen if someone uses copy-paste to create a new implementation of an interface based on a previous implementation.
If this is a record of possible occurrences of this error, then:
I just got this error on WAS (8.5.0.1) during the CXF (2.6.0) loading of the Spring (3.1.1.RELEASE) configuration, where a BeanInstantiationException rolled up a CXF ExtensionException, which rolled up an IncompatibleClassChangeError. The following snippet shows the gist of the stack trace:
Caused by: org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.apache.cxf.bus.spring.SpringBus]: Constructor threw exception; nested exception is org.apache.cxf.bus.extension.ExtensionException
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:162)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:76)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:990)
... 116 more
Caused by: org.apache.cxf.bus.extension.ExtensionException
at org.apache.cxf.bus.extension.Extension.tryClass(Extension.java:167)
at org.apache.cxf.bus.extension.Extension.getClassObject(Extension.java:179)
at org.apache.cxf.bus.extension.ExtensionManagerImpl.activateAllByType(ExtensionManagerImpl.java:138)
at org.apache.cxf.bus.extension.ExtensionManagerBus.<init>(ExtensionManagerBus.java:131)
[etc...]
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
... 118 more
Caused by: java.lang.IncompatibleClassChangeError:
org.apache.neethi.AssertionBuilderFactory
at java.lang.ClassLoader.defineClassImpl(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:284)
[etc...]
at com.ibm.ws.classloader.CompoundClassLoader.loadClass(CompoundClassLoader.java:586)
at java.lang.ClassLoader.loadClass(ClassLoader.java:658)
at org.apache.cxf.bus.extension.Extension.tryClass(Extension.java:163)
... 128 more
In this case, the solution was to change the classpath order of the module in my war file. That is, open up the war application in the WAS console and select the client module(s). In the module configuration, set the class loading to be "parent last".
This is found in the WAS console:
Applications -> Application Types -> WebSphere Enterprise Applications
Click link representing your application (war)
Click "Manage Modules" under "Modules" section
Click link for the underlying module(s)
Change "Class loader order" to be "(parent last)".
Documenting another scenario after burning way too much time.
Make sure you don't have a dependency jar that has a class with an EJB annotation on it.
We had a common jar file that had a class with an @Local annotation. That class was later moved out of that common project and into our main EJB jar project. Our EJB jar and our common jar are both bundled within an EAR. The version of our common jar dependency was not updated. Thus there were two classes trying to be the same thing, with incompatible changes.
For some reason the same exception is also thrown when using JNI and passing the jclass argument instead of the jobject when calling a Call*Method().
This is similar to the answer from Ogre Psalm33.
void example(JNIEnv *env, jobject inJavaList) {
jclass class_List = env->FindClass("java/util/List");
jmethodID method_size = env->GetMethodID(class_List, "size", "()I");
long size = env->CallIntMethod(class_List, method_size); // should be passing 'inJavaList' instead of 'class_List'
std::cout << "LIST SIZE " << size << std::endl;
}
I know it is a bit late to answer this question 5 years after being asked but this is one of the top hits when searching for java.lang.IncompatibleClassChangeError so I wanted to document this special case.
Adding my 2 cents: if you are using Scala, sbt, and scala-logging as a dependency, then this can happen because an earlier version of scala-logging had the name scala-logging-api. Essentially, the dependency resolution does not happen because of the different names, leading to runtime errors while launching the Scala application.
I got this error because I had an abstract base class which promised that it implements a certain interface, but I had forgotten to add the implementations of the interface methods, and then I created a non-abstract (concrete) bytecode-generated class which extended the abstract class, without providing implementations for those methods, either.
When I tried to create an instance of the bytecode-generated class, the JVM complained with java.lang.IncompatibleClassChangeError.
Luckily, the exception has a "message" member which provides more detailed information as to what went wrong. In my case the message clearly said that the particular class was supposed to implement the particular interface, but it did not actually implement it.
If you come from Android development, then give the Rebuild option a try; it might be the fix for you.
In my case:
I have a project containing a few modules, including app, test, integrationTest
I created OneElementCache in the app module.
Then I created a file Cache in the test module; the file contains some helpers for creating OneElementCache in tests.
Up to this point, everything worked perfectly (both test and integrationTest passed).
After that, I created a file Cache in the app module.
I got this while running integrationTest:
Caused by: java.lang.IncompatibleClassChangeError:
class app.cache.CacheImpl can not implement app.cache.Cache, because it is not an interface (app.cache.Cache is in unnamed module of loader 'app')
The reason was a conflict in naming in different modules (app/test). Changing the filename in test did the job.