How to define qualified exports to unknown modules?

I have a Maven project with two Maven modules A and B. A contains the following Java module definition:
module A {
    exports internal.util to B;
    exports external.A;
}
B contains the following Java module definition:
module B {
    requires A;
    exports external.B;
}
When I build the project, I get an error:
[WARNING] module-info.java:[16,106] module not found: B
Module B exists, but because module A is compiled before B and does not depend on it, the compiler has no way of knowing that. Because I configured the compiler to treat warnings as errors (-Werror), the build fails.
Seeing as I want to keep treating warnings as errors, what is the best way to resolve this problem?
Is there a way to hint to the compiler that this module will be declared in the future?
Is there a way to suppress all warnings of this type?

I figured out a workaround by scanning through the JDK 11 source code: -Xlint:-module. I am still open to a better solution if someone finds one.
UPDATE: An alternative is to use --module-source-path as demonstrated by https://stackoverflow.com/a/53717183/14731
Thank you Alan Bateman for pointing me in this direction!
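For reference, the --module-source-path approach boils down to compiling both modules in a single javac invocation, so that B's declaration is visible while A is compiled. A minimal sketch, assuming a layout where each module's sources live in a directory named after the module (src/A, src/B; the src and out names are placeholders):
javac -d out --module-source-path src $(find src -name "*.java")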

As I infer from the question, your scenario is that modules A and B coexist and module A does not depend on module B.
From the module system's perspective, since A makes a qualified export to module B, a warning is emitted if module B cannot be resolved on the module path while you build A.
In IntelliJ IDEA I could fix such an issue by adding a dependency on module B to module A, without any declaration changes. With Maven, though, I can't see a way to do that without introducing a cyclic dependency; maybe the suggestion by khmarbaise could help.
Note: An alternative way to fix this is to disable such warnings using the command-line argument:
-Xlint:-exports
Or, as Alan points out, the correct key is:
-Xlint:-module
Thinking out loud, that is because the warning is triggered by a qualified export, but it is reported as a module-not-found warning.
Useful: you can find the details of these command-line arguments in the javac documentation.
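For instance, one way to keep -Werror while silencing only this category is to disable the module key but leave the other lint checks enabled (a sketch; the remaining options and source files are placeholders):
javac -Werror -Xlint:all,-module -d target <other options and source files>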

Related

Changed jdeps behavior in OpenJDK 11.0.11 (JDK-8214213)

Summary
Our build pipeline broke after some machines were updated from JDK 11.0.10 and below to JDK 11.0.11 and above. This happens due to changed jdeps behavior. After some research it became evident that this is likely due to the changes introduced with JDK-8214213:
https://mail.openjdk.java.net/pipermail/jdk-updates-dev/2021-April/005860.html
Assuming we were retrieving dependencies for sentry-1.7.25.jar, our usage of jdeps via the CLI is as follows:
jdeps --list-deps -filter:module --multi-release=11 "..\somePath\sentry-1.7.25.jar"
The resulting dependency lists look like this:
11.0.10 and below
java.base
java.logging
java.naming
11.0.11 and above
Error: Missing dependencies: classes not found from the module path and classpath.
To suppress this error, use --ignore-missing-deps to continue.
sentry-1.7.25.jar
io.sentry.event.helper.BasicRemoteAddressResolver -> javax.servlet.http.HttpServletRequest not found
io.sentry.event.helper.ForwardedAddressResolver -> javax.servlet.http.HttpServletRequest not found
io.sentry.event.helper.HttpEventBuilderHelper -> javax.servlet.http.HttpServletRequest not found
io.sentry.event.helper.RemoteAddressResolver -> javax.servlet.http.HttpServletRequest not found
io.sentry.event.interfaces.HttpInterface -> javax.servlet.http.Cookie not found
io.sentry.event.interfaces.HttpInterface -> javax.servlet.http.HttpServletRequest not found
io.sentry.servlet.SentryServletContainerInitializer -> javax.servlet.ServletContainerInitializer not found
io.sentry.servlet.SentryServletContainerInitializer -> javax.servlet.ServletContext not found
io.sentry.servlet.SentryServletContainerInitializer -> javax.servlet.ServletException not found
io.sentry.servlet.SentryServletRequestListener -> javax.servlet.ServletRequest not found
io.sentry.servlet.SentryServletRequestListener -> javax.servlet.ServletRequestEvent not found
io.sentry.servlet.SentryServletRequestListener -> javax.servlet.ServletRequestListener not found
io.sentry.servlet.SentryServletRequestListener -> javax.servlet.http.HttpServletRequest not found
In order to fix this on OpenJDK 11.0.11+, it's necessary to pass --ignore-missing-deps when calling jdeps. If done, the output looks correct again:
java.base
java.logging
java.naming
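For context, this module list is what we then feed to jlink to build the custom runtime mentioned below (a rough sketch; the module path and output directory name are placeholders):
jlink --module-path "%JAVA_HOME%\jmods" --add-modules java.base,java.logging,java.naming --output customRuntime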
Question
So I am able to produce the same output with jdeps using JDK 11.0.11+ as I was able to with JDK 11.0.10-. That being said, this output is used to create a custom runtime, and in the description of JDK-8214213 it is explicitly stated:
Note that a custom image is created with the list of modules output by jdeps when using the --ignore-missing-deps option for a non-modular application. Such an application, running on the custom image, might fail at runtime when missing dependence errors are suppressed.
From my understanding this means that if there is a transitive dependency involved, where the dependency of a dependency requires a runtime module that is not required by any of the top-level dependencies, then this can lead to a custom runtime incapable of running the application, since the transitive dependency cannot be resolved. In other words, if my application requires dependency A, which requires dependency B and module C, but dependency B also requires module D, then my application is at risk of encountering runtime errors, since my custom runtime is not provided with module D.
My question now is this, since I am unable to derive it from documentation:
With JDK 11.0.11+ I can only get the same dependency list output if --ignore-missing-deps is used. Does that mean that...
...jdeps was able to resolve transitive dependencies prior to 11.0.11, but cannot do so any longer above said version, e.g. because dependency analysis is done differently internally in jdeps?
...jdeps acted as if it was using --ignore-missing-deps by default prior to 11.0.11, hence, since the default changed, jdeps now throws an error on 11.0.11+?
...something else is going on?
The resulting dependency list might be the same simply because there are a lot of libraries, so most modules are used either way. However, I am trying to determine whether
jdeps --list-deps -filter:module --multi-release=11 "..\somePath\sentry-1.7.25.jar" (11.0.10)
and
jdeps --list-deps --ignore-missing-deps -filter:module --multi-release=11 "..\somePath\sentry-1.7.25.jar" (11.0.11)
behave exactly the same, or whether using --ignore-missing-deps introduces a new risk when adding new libraries to our project, as they may at some point require a module that is not part of the current jdeps list.
Bear in mind, to me this is rather a deep dive into OpenJDK specifics, so if there is faulty terminology or a problem with my understanding of these scenarios, feel free to point it out and correct it.

How to solve circular dependency in gradle multi-project build

Consider the following situation. I have two gradle (sub-)projects called "A" and "B". A defines some classes/interfaces that are being referenced by B. So B has a compile dependency to A. Now A is a web server that should be started with B on the classpath. How do you achieve that with gradle?
Of course it is not possible to add B as a compile dependency of A because that would mean a circular dependency between A and B. Even adding B as a runtime dependency of A did not work, because then compile errors in B state that referenced classes from A do not exist. But why?
One solution would be to move code from B into A but I really would like to separate that code because there might be another implementation of B later that I want to swap easily in A (e.g. by exchanging the jar in runtime classpath).
Another solution I was thinking about is to separate classes from A referenced by B into a new module and make both A and B depend on that new module. This sounds valid but that would imply to move persistence layer from A to that new module which feels wrong.
Additional information: A is a Spring boot web application with persistence layer, web services etc, B produces a JAR.
Circular dependencies are a well-known problem when you try to get dependency injection working. In this case, you have something similar, but at the module level.
The only way I see to solve your issue is by creating a third module C with the common code (probably the A interfaces referenced by B).
This way you can compile C (it doesn't have any dependencies), then A (it depends on C) and B (it depends on C), and launch A with B on its classpath, as sketched below.
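A minimal sketch of that layout with Gradle (Groovy DSL; project names are the ones from the question, and whether B ends up on A's runtime classpath via a runtimeOnly dependency as below, or is added at launch time, is up to you; older Gradle versions use compile/runtime instead of implementation/runtimeOnly):
// settings.gradle
include 'A', 'B', 'C'
// C/build.gradle: the shared interfaces, no project dependencies needed
// B/build.gradle
dependencies {
    implementation project(':C')
}
// A/build.gradle
dependencies {
    implementation project(':C')
    // B is only needed at runtime, so there is no compile-time cycle
    runtimeOnly project(':B')
}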
Every time you end up with a circular dependency, you should probably introduce another entity to break the cycle.
Have a look at my explanation in this other Q&A article (it's dealing with packages and classes, but the idea is the same): What does it mean and how to fix SonarQube Java issue "Cycles between packages should be removed" (squid:CycleBetweenPackages)

warning: unknown enum constant Status.STABLE

In the quest to solve this and that, I was trying to create packages to subdivide the main and test classes, and then to use the compiler with added modules to execute the unit tests. Not a very good way, agreed, but just a hypothetical structure for now.
A few open questions as I proceeded further were:
Add a JDK9 based module to the project.
Add JUnit5 to the classpath using IntelliJ's shortcut. (lib folder) [junit-jupiter-api-5.0.0.jar]
Q. Note that it brings along opentest4j-1.0.0.jar to the lib/ folder. Why is that so, and what is the other JAR used for?
Add the classes and generate some tests method correspondingly.
Compile the sample project (shared just to draw a picture of the directory structure in use) using the command
javac --module-path lib -d "target" $(find src -name "*.java")
This results in warnings:
warning: unknown enum constant Status.STABLE
reason: class file for org.apiguardian.api.API$Status not found
warning: unknown enum constant Status.STABLE
2 warnings
Note:
I find the junit-jupiter usage suspicious, since if I comment out the code that uses JUnit and execute the same command, things seem to work fine.
Libraries/tools used, in case it matters:
junit-jupiter-api-5.0.0 with
Java version "9" (build 9+181)
IntelliJ 2017.2.5
Q. What could be a probable cause of such a warning? Moreover, I am unable to find API.Status either in my project or in classes outside the project.
The compilation warning can simply be ignored. Moreover, it will no longer appear starting with version 5.1.0 (currently in development). It is all explained in the Release Notes:
In 5.0.1, all artifacts were changed to have an optional instead of a mandatory dependency on the @API Guardian JAR in their published Maven POMs. However, although the Java compiler should ignore missing annotation types, a lot of users have reported that compiling tests without having the @API Guardian JAR on the classpath results in warnings emitted by javac that look like this:
warning: unknown enum constant Status.STABLE
reason: class file for org.apiguardian.api.API$Status not found
To avoid confusion, the JUnit team has decided to make the dependency on the @API Guardian JAR mandatory again.
For reference also see:
Remove compile dependency on apiguardian-api in Maven POMs
Reintroduce compile dependency on apiguardian-api in Maven POMs
1) opentest4j
opentest4j is a transitive dependency of junit-jupiter-api. See the dependency graph:
+--- org.junit.jupiter:junit-jupiter-api:5.0.1
|    +--- org.opentest4j:opentest4j:1.0.0
|    \--- org.junit.platform:junit-platform-commons:1.0.1
2) unknown enum constant Status.STABLE
You need to add the following dependency: apiguardian-api.
For example in Gradle, you can do it via:
dependencies {
    testCompile 'org.junit.jupiter:junit-jupiter-api:5.0.1'
    testRuntime 'org.junit.jupiter:junit-jupiter-engine:5.0.1'
    testCompileOnly 'org.apiguardian:apiguardian-api:1.0.0'
}
But overall, the dependency is build-tool-independent, so you can also add it in a plain IDE setup, without Gradle or Maven.
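For example, in the setup from the question it should be enough to download apiguardian-api-1.0.0.jar into the same lib/ folder as the JUnit JARs and re-run the original command unchanged (a sketch; the warning goes away only once the JAR is actually resolvable on the module path):
javac --module-path lib -d "target" $(find src -name "*.java")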

Android Annotation Processor accessing Annotated classes from different modules

I have an Android Studio project with 2 modules: A and B. (I do not include the annotation processor and the annotations module here.)
B depends on A.
B is an Android library module, and A is a plain Java library module. I also have an annotation processor running on module B.
The problem I'm facing is:
I want to generate some code based on annotated files placed in both modules, A and B. The problem comes from the way the annotation processor works: only with source files (*.java), not with compiled (*.class) ones. Unfortunately, during the compilation of B, the annotation processor doesn't have access to those source files from A...
The only thing I was able to think of as a kind of solution, even an ugly one, was to include the folder with the annotated classes from module A as a source set of module B. This way I give module B access to those files during compilation.
sourceSets {
    main {
        java {
            srcDirs = ['src/main/java', '../module_A/src/main/java/path/to/annotated/classes/folder']
        }
    }
}
That solves the problem - now the Annotation Processor has access to all the annotated classes from both modules, but...
Unfortunately, it introduces another issue... those annotated classes from module A are now compiled twice, and they are included both in module A's JAR file and in module B's AAR file.
Question 1: Is there another way to access those source files of module A from the annotation processor running on B? (From what I was able to find, the answer is NO, but checking...)
Question 2: How can I exclude those compiled files (the repeated ones) from the AAR final package of module B?
Question 3: Maybe... that's an absolutely wrong approach? Any suggestions?
Thanks in advance!
Nope, you cannot achieve what you want using just the java.lang.model API. At least not without some additional tricks.
The issue is not binary vs. source. Annotation processors can use Elements#getTypeElement to introspect compiled classes as well as source-defined classes:
Elements elementUtil = processingEnvironment.getElementUtils();
TypeElement integerClass = elementUtil.getTypeElement("java.lang.Integer");
TypeElement myClass = elementUtil.getTypeElement("currently.compiled.Class");
But you still need to have the class on the compilation classpath to observe it, and the class must be in the process of being compiled to be visible to getElementsAnnotatedWith.
You can work around the latter limitation by using a tool like FastClasspathScanner: it uses its own mechanisms to find annotations in compiled bytecode and reports them to you separately from the compilation process. But you cannot work around the classpath issue: if you don't have some dependency on the compilation classpath, it cannot be processed. So you have to compile the modules together, either by merging them into one (as you did) or by declaring one to depend on the other. In the latter case you might not be able to use getElementsAnnotatedWith, but getTypeElement and FastClasspathScanner will work.
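To make the distinction concrete, here is a minimal processor sketch (the annotation and class names are made up): getElementsAnnotatedWith only returns elements whose sources are compiled in the current round, while getTypeElement can also reach already-compiled classes, as long as they are on the compilation classpath.
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

@SupportedAnnotationTypes("com.example.MyAnnotation") // hypothetical annotation
public class MyProcessor extends AbstractProcessor {

    @Override
    public SourceVersion getSupportedSourceVersion() {
        return SourceVersion.latestSupported();
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            // Only classes whose sources are part of the current compilation show up here;
            // annotated classes that arrive pre-compiled (e.g. from module A's artifact) do not.
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                processingEnv.getMessager().printMessage(Diagnostic.Kind.NOTE, "Found in this round: " + element);
            }
        }
        // Already-compiled classes can still be introspected by name,
        // provided they are on the compilation classpath (class name hypothetical):
        TypeElement fromModuleA = processingEnv.getElementUtils().getTypeElement("com.example.modulea.SomeAnnotatedClass");
        return false;
    }
}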

The hierarchy of the type Classe is inconsistent

I have a class that extends an abstract class. The abstract class is in another package of my project. I added the package containing the abstract class via Configure Build Path/Projects. The implementing class now shows the following error: The hierarchy of the type Classe is inconsistent
Do I have to add this package somewhere else?
Thank you!
These errors happen because some interface or class in the hierarchy cannot be resolved.
For example: the error is in your class X, X inherits from Y, and in turn Y inherits from Z. However, the compiler cannot resolve Z (in the above error), because Z belongs to a library that is not included.
Therefore, you have to add the package containing Z to the classpath, or to the project's Java Build Path (if you are using Eclipse).
Hope it may help.
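A minimal sketch of that situation (the class names and the JAR are made up):
// Z lives in library.jar:
public class Z { }

// Y lives in a project that has library.jar on its build path:
public class Y extends Z { }

// X lives in your project. If library.jar is NOT on this project's build path,
// the compiler cannot resolve Z while building X's hierarchy and reports
// "The hierarchy of the type X is inconsistent".
public class X extends Y { }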
Go to the Project Explorer.
Right Click on your Project
Build Path
Configure Build Path
Remove JRE System Library
Click on Add Library
Add JRE System Library
Click on Next and then Finish
The errors will be resolved.
This means you have made an incompatible change in a super class but haven't recompiled it.
I suggest you use a build system like Maven or Ant and/or use an IDE to build all your code.
I was facing this issue in one of my RCP applications.
Cause: I had not added the core plugin org.eclipse.core.runtime to the dependencies section of the manifest file.
When I added this dependency, the issue was resolved.
Thanks,
Sid
That means the class you implemented has a reference to an interface or class which in turn references other classes or interfaces in other libraries, and those are not available.
The springframework.aop JAR depends on aopalliance.jar; add it to your classpath and it will resolve your problem. I hope it works, because I was facing the same problem and adding the dependent JAR resolved it.
Right-click your project, then Properties > Java Build Path > Source, include all the source folders (it may inherit the files), and sync.
This worked for me.
I also had this problem when I tried to use a class from one plugin project in another.
I had something like myClass extends pp1Class; this is where I had the error, even though the pp1Class plugin was added as a dependency. pp1Class extends pp2Class, which was a dependency of pp1 (plugin project 1) but not of my plugin. What I did was go to pp1, find where the dependency on pp2 is defined, click Properties, and check "Reexport this dependency" (this is in the MANIFEST.MF).
This should solve the problem, it solved mine.
Another reason for this error is that one of your base classes implements an interface from an external library, and your .classpath file is kept in a source control system (and is therefore read-only).
For instance, your ClassB extends ClassA and ClassA implements InterfaceA which is in LibraryA.jar. ClassA is in ProjectA, ClassB is in ProjectB. ProjectA .classpath file is readonly.
Here you have to export LibraryA.jar from your ProjectA. But, I guess due to an Eclipse bug, when a new teammate connects these projects (or occasionally when you prepare another workspace), he gets this type hierarchy error. The only way to solve this problem is to check out the .classpath file in ProjectA and remove and re-add a library (it does not have to be LibraryA.jar). This operation somehow resolves the error.
In the Eclipse OSGi environment, the required package can be added under the MANIFEST.MF > Dependencies tab > Imported Packages. This will solve the issue. Alternatively, the plugin which contains the class can be added to the Required Plug-ins.
I also faced this issue in my Maven project with the Eclipse Oxygen.1a IDE:
The hierarchy of the type MyClassName is inconsistent
The error was showing at the class-name level.
I then took a complete SVN update from the repository, followed by Maven > Update Project and then a project refresh.
The error was gone...
As per my understanding, this was happening due to an unmanaged project version.
I had actually added all the related JARs and interfaces to the build path but was still facing the error, so later someone suggested that I add j2ee.jar to the build path, and my error just went away.
The hierarchy of the type A is inconsistent
The above error is mainly caused by some JARs missing from the classpath.
E.g.: I was trying to implement the interface "MethodBeforeAdvice".
Here MethodBeforeAdvice extends BeforeAdvice, and these two interfaces were present in one JAR file called "Spring-aop-4.2.5.Release.jar".
But the "BeforeAdvice" interface extends the interface "Advice", which was present in another JAR, "aop-alliance-1.0.0.jar", which was not on my classpath.
