Not finding automatic modules when compiling a Java application

I have a large multi-module (hundreds of modules) Java project and have been experimenting with adopting Java module support. This is using Java 17 (Temurin), Gradle 7.6, and IntelliJ 2022.3.
I have hit a couple of stubborn errors with Java modules where the module cannot be found.
I have one project which has some Java code that uses plexus-utils, i.e.:
import org.codehaus.plexus.util.Base64;
...
byte[] encodedAuthorizationString = Base64.encodeBase64(authorizationString.getBytes(StandardCharsets.US_ASCII));
It has a Gradle dependency:
implementation 'org.codehaus.plexus:plexus-utils'
This has a version constraint in our main build.gradle (only the salient lines included):
plexusVersion = '3.5.0'
implementation("org.codehaus.plexus:plexus-utils:${plexusVersion}")
Prior to adding module support, this worked fine.
Now, with a module-info.java:
module egeria.open.metadata.implementation.adapters.open.connectors.rest.client.connectors.spring.rest.client.connector.main {
    requires egeria.open.metadata.implementation.adapters.authentication.plugins.http.helper.main;
    requires egeria.open.metadata.implementation.adapters.open.connectors.rest.client.connectors.rest.client.connectors.api.main;
    //requires egeria.open.metadata.implementation.adapters.open.connectors.rest.client.connectors.rest.client.factory.main;
    requires egeria.open.metadata.implementation.frameworks.open.connector.framework.main;
    requires plexus.utils;
    requires org.slf4j;
    requires spring.core;
    requires spring.web;
    exports org.odpi.openmetadata.adapters.connectors.restclients.spring;
}
I am getting a compile error
Task ':open-metadata-implementation:adapters:open-connectors:rest-client-connectors:spring-rest-client-connector:compileJava' is not up-to-date because:
Task has failed previously.
The input changes require a full rebuild for incremental task ':open-metadata-implementation:adapters:open-connectors:rest-client-connectors:spring-rest-client-connector:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with toolchain '/Library/Java/JavaVirtualMachines/temurin-19.jdk/Contents/Home'.
Compiling with JDK Java compiler API.
/Users/jonesn/IdeaProjects/egeria/v4/open-metadata-implementation/adapters/open-connectors/rest-client-connectors/spring-rest-client-connector/src/main/java/module-info.java:6: error: module not found: plexus.utils
requires plexus.utils;
^
1 error
This is despite the fact that, having downloaded the jar file, the automatic module name appears to be exactly what I am using, i.e.:
jar --file=/Users/jonesn/Downloads/plexus-utils-3.5.0.jar --describe-module
No module descriptor found. Derived automatic module.
plexus.utils#3.5.0 automatic
requires java.base mandated
contains org.codehaus.plexus.util
contains org.codehaus.plexus.util.cli
contains org.codehaus.plexus.util.cli.shell
contains org.codehaus.plexus.util.dag
contains org.codehaus.plexus.util.introspection
contains org.codehaus.plexus.util.io
contains org.codehaus.plexus.util.reflection
contains org.codehaus.plexus.util.xml
contains org.codehaus.plexus.util.xml.pull
I am seeing the same error with kafka-clients.
For most other code, including libraries without full module support, all is good.
I have tried various compilers, such as OpenJDK 17 and Temurin 19, and have built both at the CLI and within IntelliJ.
I was expecting this module to resolve OK.
I have also reviewed "Java 9 automatic modules not found" but note that other automatic modules (including org.slf4j) are working just fine.
I should add that I could refactor this code to use java.util.Base64 (which probably makes sense), but I'm still confused as to why I get the module error, which I also see in another project with 'kafka.clients'.
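For reference, the refactor mentioned above is small. A minimal sketch using only the JDK (class and variable names here are just for illustration); since java.util.Base64 lives in java.base, no extra requires clause is needed:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch: replacing org.codehaus.plexus.util.Base64 with the JDK's java.util.Base64.
public class BasicAuthHeader {
    static byte[] encode(String authorizationString) {
        return Base64.getEncoder()
                     .encode(authorizationString.getBytes(StandardCharsets.US_ASCII));
    }

    public static void main(String[] args) {
        System.out.println(new String(encode("user:password"), StandardCharsets.US_ASCII));
    }
}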

Related

JavaFX with Gradle error module not found

I'm creating a sample demo application with JavaFX in IntelliJ, and I need to use a library called JavaFaker. I'm using Gradle as the build system, but every time I try to add the library, either as an implementation dependency in the build.gradle file or via the IntelliJ project structure options, the module-info.java file says error: module not found. I've already tried adding it to modules but nothing changes.
module-info.java
module com.example.demo1 {
    requires javafx.controls;
    requires javafx.fxml;
    requires javafaker;

    opens com.example.demo1 to javafx.fxml;
    exports com.example.demo1;
}
build.gradle
plugins {
    id 'java'
    id 'application'
    id 'org.openjfx.javafxplugin' version '0.0.10'
    id 'org.beryx.jlink' version '2.24.1'
}

group 'com.example'
version '1.0-SNAPSHOT'

repositories {
    mavenCentral()
}

ext {
    junitVersion = '5.8.2'
    javaFakerVersion = '1.0.2'
}

sourceCompatibility = '17'
targetCompatibility = '17'

tasks.withType(JavaCompile) {
    options.encoding = 'UTF-8'
}

application {
    mainModule = 'com.example.demo1'
    mainClass = 'com.example.demo1.HelloApplication'
}

javafx {
    version = '17.0.1'
    modules = ['javafx.controls', 'javafx.fxml']
}

dependencies {
    implementation("com.github.javafaker:javafaker:${javaFakerVersion}")
    testImplementation("org.junit.jupiter:junit-jupiter-api:${junitVersion}")
    testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine:${junitVersion}")
}

test {
    useJUnitPlatform()
}

jlink {
    imageZip = project.file("${buildDir}/distributions/app-${javafx.platform.classifier}.zip") as RegularFile
    options = ['--strip-debug', '--compress', '2', '--no-header-files', '--no-man-pages']
    launcher {
        name = 'app'
    }
}

jlinkZip {
    group = 'distribution'
}
error message
> Task :HelloApplication.main() FAILED
Error occurred during initialization of boot layer
java.lang.module.FindException: Module javafaker not found, required by com.example.demo1
I tried for a while to get this to work with Gradle but was unable to. I don't know Gradle well, but unless you do, I don't advise trying it.
Alternate option: use requires static
I didn't try this, but this is suggested in another answer.
Before you try this, see:
What's the difference between requires and requires static in module declaration
It is, IMO, a bit of a hack in this use case. It makes the module optional at run time, but if the module is on the classpath instead of the module path its code can still be used. More information, quoted from the linked answer:
A requires static clause expresses a dependency that is optional at
run time. That means at compile time the module system behaves exactly
as described above.
At run time, on the other hand, it mostly ignores requires static
clauses. If it encounters one, it does not resolve it. That means, if
an observable module is only referenced with requires static, it does
not make it into the module graph!
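Applied to the module-info.java from the question, the declaration would look roughly like this (a sketch only; as noted above, I did not try this myself):

// module-info.java - sketch of the 'requires static' variant
module com.example.demo1 {
    requires javafx.controls;
    requires javafx.fxml;
    requires static javafaker;   // resolved at compile time, mostly ignored at run time

    opens com.example.demo1 to javafx.fxml;
    exports com.example.demo1;
}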
Alternate option: Non-modular project
You can fix this issue by making your project non-modular:
Delete your module-info.java file.
Run your application with JavaFX modules on the module-path.
The org.openjfx.javafxplugin plugin you are already applying will help achieve this by specifying the JavaFX modules to be used.
To execute the application directly in the IDE rather than through Gradle, you will need to specify the module options to the VM for the IDE execution configuration (information on that is in the getting started documentation at openjfx.io).
For packaging, switch to using the badass-runtime-plugin rather than the badass-jlink-plugin. This will package the application via jpackage rather than jlink (which cannot package non-modular applications or applications with automatic modules).
In the application block of your build file, you no longer need to specify the module for your application as you no longer have one.
While that means that your application is no longer modular, in this case, in my opinion, this is not such a big loss. The dependencies you are using are not well-defined modules, so you can't use jlink to create a package for your application, and you don't have the level of modular encapsulation and definition you would normally receive for fully modular projects.
For more information, see the Getting started instructions at:
https://openjfx.io/openjfx-docs/
Under the sections "Non-Modular with Gradle" for your selected IDE.
Alternate option: Using Maven
It is easy to get this to work with Maven.
Create a new JavaFX project.
Choose Maven as your build system instead of Gradle.
Add the javafaker dependency to your pom.xml.
<dependency>
    <groupId>com.github.javafaker</groupId>
    <artifactId>javafaker</artifactId>
    <version>1.0.2</version>
</dependency>
Press the refresh icon in the Maven window to reimport the Maven project into the IDE.
Add the requires clause for the javafaker module into your module-info.java
requires javafaker;
Add the code to use javafaker to your app.
I don't have code to use javafaker, so I could not verify that the last step would work, but give it a try.
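A minimal smoke test might look like the following sketch (unverified; it assumes javafaker's com.github.javafaker.Faker class and its name() accessor):

import com.github.javafaker.Faker;

// Sketch: if this compiles and runs, the javafaker module is being resolved
// correctly from module-info.java.
public class FakerSmokeTest {
    public static void main(String[] args) {
        Faker faker = new Faker();
        System.out.println("Generated name: " + faker.name().fullName());
    }
}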
Why you can receive this issue when using Gradle, but not Maven
Looking at the Gradle Documentation section "Using libraries that are not modules":
A third case are traditional libraries that provide no module information at all — for example commons-cli:commons-cli:1.4. Gradle puts such libraries on the classpath instead of the module path. The classpath is then treated as one module (the so called unnamed module) by Java.
This is the case with the javafaker dependency that you are using. It has no module-info.java and does not define the property Automatic-Module-Name in its manifest file (which are the other two cases in the section). Both the other cases result in Gradle putting the library on the module path, but the case you have means that it is on the class path.
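If you want to check which of these three cases a particular jar falls into, a small sketch like this (a hypothetical helper, reading the same information the jar tool uses) can tell you:

import java.util.jar.JarFile;
import java.util.jar.Manifest;

// Sketch: classify a jar the way Gradle's module support does - explicit module,
// automatic module (Automatic-Module-Name), or plain classpath library.
// Note: this ignores multi-release jars that keep module-info.class under META-INF/versions.
public class ClassifyJar {
    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile(args[0])) {        // e.g. javafaker-1.0.2.jar
            if (jar.getEntry("module-info.class") != null) {
                System.out.println("explicit module (module path)");
                return;
            }
            Manifest manifest = jar.getManifest();
            String autoName = manifest == null ? null
                    : manifest.getMainAttributes().getValue("Automatic-Module-Name");
            if (autoName != null) {
                System.out.println("automatic module '" + autoName + "' (module path)");
            } else {
                System.out.println("no module information (Gradle puts it on the classpath)");
            }
        }
    }
}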
This is a problem when you want to access the code from a named module that you define, which you have because you created a module-info.java.
Your module can only find code and resources of modules it requires (which need to be on the module path), so you add requires javafaker to the module-info.java, and get the following when you try to run through the IDE:
java.lang.module.FindException: Module javafaker not found, required by com.example.demo1
So you remove the requires javafaker from the module-info.java as advised by the Gradle documentation I linked and you get the following when you try to compile:
Package 'com.github.javafaker' is declared in module 'javafaker', but module 'com.example.demo1' does not read it
So you must place the library in the module-info to use it, but you can't place the library in the module-info because Gradle puts it on the classpath -> catch-22.
There are workarounds to this such as providing VM arguments to allow access to the unnamed module (which is the classpath), or maybe modifying the module path handling of the Gradle build and/or IDE somehow (I don't know how), but they are pretty ugly.
On the other hand, for this case, Maven acts differently from Gradle, it places the dependent library on the module path, even if it does not have a module-info.java or Automatic-Module-Name defined. This means that it was (for me) much easier to set up and use.
Incidental advice on module naming
This is not an error, but note: Although module names with numbers in them are now allowed due to a change in the module system specification, it is probably best not to put numbers in module names to prevent the module name and version info being confused.
I've had a similar issue recently. Adding static to the requires statement helped, however. Maybe this will fix your issue without having to switch to Maven.
So you'd need to add: requires static javafaker;

Are there any JDK 11+ system modules which are not root modules?

In JDKs 9 and 10, there used to be a few modules such as java.xml.bind, containing Java EE classes. They were marked as deprecated and to be removed with the advent of JDK 9 and finally removed in 11 (see JEP 320). In a product I am contributing to, there used to be tests for the javac compiler option --add-modules, adding those modules as root modules. Those tests have been deactivated for JDK 11+. Instead of removing them, I would like to reactivate them, if there are any other JDK modules which are also non-root by default. The tests could then just use those modules instead.
I know I can just test --add-modules with my own modules, but then I have to specify them on the module path. The test case that an extra module path is not necessary for JDK modules added via --add-modules is also interesting, if any JDK 11+ modules still exist to be tested against. I am not talking about non-exported packages, but really about non-root JDK modules.
So, according to the information in this answer, I am actually looking for non-java.* modules among the system modules which do not export at least one package without qualification. In that case, those modules should not be root, and they would be eligible for my test case.
Update: What I am looking for is an equivalent for this in JDK 9:
import javax.xml.bind.JAXBContext;

public class UsesJAXB {
    JAXBContext context;
}
xx> "C:\Program Files\Java\jdk-9.0.4\bin"\javac UsesJAXB.java
UsesJAXB.java:1: error: package javax.xml.bind is not visible
import javax.xml.bind.JAXBContext;
^
(package javax.xml.bind is declared in module java.xml.bind, which is not in the module graph)
1 error
xx> "C:\Program Files\Java\jdk-9.0.4\bin"\javac --add-modules java.xml.bind UsesJAXB.java
See? With --add-modules it builds, without it does not.
I am looking for modules (if any) in JDK 11-18, which would yield the same result when importing their classes in a simple program, i.e. require them to be explicitly added via --add-modules for compilation (not talking about runtime).
You can list all modules of the JDK with:
java --list-modules
Then you can print the module descriptors with:
java --describe-module a.module.name
After filtering these outputs in a little script, here are the modules of my JDK 17 that could qualify:
jdk.charsets
jdk.crypto.cryptoki
jdk.crypto.ec
jdk.editpad
jdk.internal.vm.compiler
jdk.internal.vm.compiler.management
jdk.jcmd
jdk.jdwp.agent
jdk.jlink
jdk.jpackage
jdk.localedata
jdk.zipfs
jdk.charsets, for instance, is a module providing a service.
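The same filtering can be done without a shell script by querying the system modules directly. A rough sketch using the ModuleFinder API (it keeps only non-java.* system modules that export no package without qualification, matching the criterion from the question):

import java.lang.module.ModuleDescriptor;
import java.lang.module.ModuleFinder;

// Sketch: list system modules that are candidates for not being root modules,
// i.e. non-java.* modules with no unqualified exports.
public class NonRootCandidates {
    public static void main(String[] args) {
        ModuleFinder.ofSystem().findAll().stream()
                .map(ref -> ref.descriptor())
                .filter(d -> !d.name().startsWith("java."))
                .filter(d -> d.exports().stream().noneMatch(e -> !e.isQualified()))
                .map(ModuleDescriptor::name)
                .sorted()
                .forEach(System.out::println);
    }
}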
Update after question update
So you are looking for a module that exports a package but is not in the default module graph when compiling a class in the unnamed module. According to the JEP, only java.* modules can qualify.
When I look for modules required, either directly or indirectly, by java.se (which is a root module when it exists), I see all java.* modules of the JDK except java.smartcardio. And due to some unknown magic, java.smartcardio is also in the default graph: I tried to compile a class importing one of its classes and it works without --add-modules.
So I think you are down to using a non-java module that exports no package (like jdk.charsets), importing one of its classes (like sun.nio.cs.ext.ExtendedCharsets), and either:
add --add-exports in addition to --add-modules when compiling your test class so that javac succeeds (as sketched below),
or parse the error message of javac and distinguish between "not in the module graph" and "not exported".
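Analogous to the UsesJAXB example above, such a test class might look like this sketch (using the jdk.charsets class mentioned earlier; the exact flags are an assumption for the unnamed-module case):

// Sketch: a class whose compilation requires jdk.charsets to be added explicitly, e.g.
//   javac --add-modules jdk.charsets --add-exports jdk.charsets/sun.nio.cs.ext=ALL-UNNAMED UsesCharsets.java
import sun.nio.cs.ext.ExtendedCharsets;

public class UsesCharsets {
    ExtendedCharsets provider;
}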

Changed jdeps behavior in OpenJDK 11.0.11 (JDK-8214213)

Summary
Our build pipeline has been broken after some machines were updated from JDK 11.0.10- to JDK 11.0.11+. This happens due to changed jdeps behavior. After some research it became evident that this is likely due to changes introduced with JDK-8214213:
https://mail.openjdk.java.net/pipermail/jdk-updates-dev/2021-April/005860.html
Assuming we are retrieving dependencies for sentry-1.7.25.jar, our usage of jdeps via the CLI is as follows:
jdeps --list-deps -filter:module --multi-release=11 "..\somePath\sentry-1.7.25.jar"
The resulting dependency lists look like this:
11.0.10 and below
java.base
java.logging
java.naming
11.0.11 and above
Error: Missing dependencies: classes not found from the module path and classpath.
To suppress this error, use --ignore-missing-deps to continue.
sentry-1.7.25.jar
io.sentry.event.helper.BasicRemoteAddressResolver -> javax.servlet.http.HttpServletRequest not found
io.sentry.event.helper.ForwardedAddressResolver -> javax.servlet.http.HttpServletRequest not found
io.sentry.event.helper.HttpEventBuilderHelper -> javax.servlet.http.HttpServletRequest not found
io.sentry.event.helper.RemoteAddressResolver -> javax.servlet.http.HttpServletRequest not found
io.sentry.event.interfaces.HttpInterface -> javax.servlet.http.Cookie not found
io.sentry.event.interfaces.HttpInterface -> javax.servlet.http.HttpServletRequest not found
io.sentry.servlet.SentryServletContainerInitializer -> javax.servlet.ServletContainerInitializer not found
io.sentry.servlet.SentryServletContainerInitializer -> javax.servlet.ServletContext not found
io.sentry.servlet.SentryServletContainerInitializer -> javax.servlet.ServletException not found
io.sentry.servlet.SentryServletRequestListener -> javax.servlet.ServletRequest not found
io.sentry.servlet.SentryServletRequestListener -> javax.servlet.ServletRequestEvent not found
io.sentry.servlet.SentryServletRequestListener -> javax.servlet.ServletRequestListener not found
io.sentry.servlet.SentryServletRequestListener -> javax.servlet.http.HttpServletRequest not found
In order to fix this on OpenJDK 11.0.11+ it is necessary to set --ignore-missing-deps when calling jdeps. If done, the output looks correct again:
java.base
java.logging
java.naming
Question
So I am able to produce the same output with jdeps using JDK 11.0.11+ as I was with JDK 11.0.10-. That being said, this output is used to create a custom runtime, and the description of JDK-8214213 explicitly states:
Note that a
custom image is created with the list of modules output by jdeps when
using the --ignore-missing-deps option for a non-modular
application. Such an application, running on the custom image, might
fail at runtime when missing dependence errors are suppressed.
From my understanding this means that if there is a transitive dependency involved, where the dependency of a dependency requires a runtime module that is not required by any of the top-level dependencies, this can lead to a custom runtime incapable of running the application, since the transitive dependency cannot be resolved. In other words, if my application requires dependency A, which requires dependency B and module C, but dependency B also requires module D, then my application is at risk of encountering runtime errors, since my custom runtime is not being provided with module D.
My question now is this, since I am unable to derive it from documentation:
With JDK 11.0.11+ I can only get the same dependency list output if --ignore-missing-deps is used. Does that mean that...
...jdeps was able to resolve transitive dependencies prior to 11.0.11, but cannot do so any longer above said version, e.g. because dependency analysis is done differently internally in jdeps?
...jdeps acted as if it was using --ignore-missing-deps by default prior to 11.0.11, so the default changed and jdeps is now throwing an error on 11.0.11+?
...something else is going on?
The resulting dependency list might be the same simply because there are a lot of libraries, so most modules are used either way. However, I am trying to determine whether
jdeps --list-deps -filter:module --multi-release=11 "..\somePath\sentry-1.7.25.jar" (11.0.10)
and
jdeps --list-deps --ignore-missing-deps -filter:module --multi-release=11 "..\somePath\sentry-1.7.25.jar" (11.0.11)
behave exactly the same, or whether using --ignore-missing-deps introduces a new risk when adding new libraries to our project, as they may at some point require a module that is not part of the current jdeps list.
Bear in mind, to me this is rather a deep dive into OpenJDK specifics, so if there is faulty terminology or problems with my understanding of these scenarios, feel free to point them out and correct them.

Compiling application with Tika with Java 13 - problems loading modules

I'm trying to migrate a Java application that uses Tika from Oracle JDK 1.8 to OpenJDK 13.
My IDE is Eclipse.
I have created the file module-info.java to indicate the required modules for my application.
In order to be able to use Tika classes such as AbstractParser, Detector, etc., I have added requires org.apache.tika.core; in module-info.java.
My code also uses the class org.apache.tika.parser.pdf.PDFParserConfig to extract embedded images:
PDFParserConfig pdfConfig = new PDFParserConfig();
pdfConfig.setExtractInlineImages(true);
context.set(PDFParserConfig.class, pdfConfig);
I get the compilation error:
PDFParserConfig cannot be resolved to a type
Eclipse suggests adding requires org.apache.tika.parsers; to module-info.java (see the Eclipse suggestion screenshot).
When I add this module requirement to module-info.java, the application compiles properly.
That is, at this stage we have included in module-info.java:
module myapp {
    /** others ... */
    requires org.apache.tika.core;
    requires org.apache.tika.parsers;
}
However, when trying to execute the compiled application, we get the error:
Error occurred during initialization of boot layer
java.lang.module.FindException: Unable to derive module descriptor for C:\Users\Admin\.m2\repository\org\apache\tika\tika-parsers\1.24\tika-parsers-1.24.jar
Caused by: java.lang.module.InvalidModuleDescriptorException: Provider class org.apache.tika.parser.onenote.OneNoteParser not in module
Inspecting the project Libraries in Eclipse, I can see that tika-core and tika-parsers (v1.24) are both treated as modular (see the Eclipse Java Build Path screenshot).
In conclusion: If I don't add org.apache.tika.parsers as a required module, the application won't compile, and if I add it I get the runtime error saying org.apache.tika.parser.onenote.OneNoteParser is not in the module.
I have inspected the JAR files for these packages to see the dependencies they have. The core jar seems to be right:
$ jar --file=tika-core-1.24.jar --describe-module
No module descriptor found. Derived automatic module.
org.apache.tika.core#1.24 automatic
requires java.base mandated
contains org.apache.tika
contains org.apache.tika.concurrent
contains org.apache.tika.config
contains org.apache.tika.detect
contains org.apache.tika.embedder
contains org.apache.tika.exception
contains org.apache.tika.extractor
contains org.apache.tika.fork
contains org.apache.tika.io
contains org.apache.tika.language
contains org.apache.tika.language.detect
contains org.apache.tika.language.translate
contains org.apache.tika.metadata
contains org.apache.tika.mime
contains org.apache.tika.parser
contains org.apache.tika.parser.digest
contains org.apache.tika.parser.external
contains org.apache.tika.sax
contains org.apache.tika.sax.xpath
contains org.apache.tika.utils
...but the 'parsers' jar gives an error:
$ jar --file=tika-parsers-1.24.jar --describe-module
Unable to derive module descriptor for: tika-parsers-1.24.jar
Provider class org.apache.tika.parser.onenote.OneNoteParser not in module
Does this mean the jar package for parsers is not well formed?
Is there any workaround for this?
Thank you.
EDIT:
If I try with version 1.24.1, I get the execution error:
Error occurred during initialization of boot layer
java.lang.module.FindException: Unable to derive module descriptor for C:\Users\Admin\.m2\repository\org\apache\tika\tika-parsers\1.24.1\tika-parsers-1.24.1.jar
Caused by: java.lang.module.InvalidModuleDescriptorException: Provider class org.apache.tika.parser.external.CompositeExternalParser not in module
That is, the failing class is CompositeExternalParser instead of OneNoteParser.
Inspecting META-INF/services/org.apache.tika.parser.Parser of tika-parsers-1.24.1.jar, I can see the entry org.apache.tika.parser.external.CompositeExternalParser, but the package does not contain this class.
So, it seems to be an error in this META-INF file. Is this due to an error when building the package and publishing it to Maven Central?
I've found a JIRA issue, TIKA-2929, where they say "Apache Tika needs to be on the Java Classpath, not the module path". I've tried this, but, as explained before, I get a compilation error if I don't add it to the module path and set requires org.apache.tika.parsers;.
This is a hard puzzle...
I ran into the same issues.
I also found the faulty entries in org.apache.tika.parser.Parser (and also org.apache.tika.parser.Detector) in META-INF/services/.
A quick fix is to:
unpack those files,
delete the lines that reference non-existing classes,
and pack them back into the jar.
My project compiled after that.
This is certainly no long-term solution, but since even older versions I tried ran into that problem, it might help out some people.
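To find those faulty entries without manually opening the jar, a small sketch like the following (a hypothetical helper, not part of Tika) lists service-provider entries whose classes are missing from the jar:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.jar.JarFile;

// Sketch: report META-INF/services provider entries that reference classes missing from
// the jar - exactly the situation that breaks automatic module derivation.
public class CheckServices {
    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile(args[0])) {        // e.g. tika-parsers-1.24.1.jar
            jar.stream()
               .filter(e -> e.getName().startsWith("META-INF/services/") && !e.isDirectory())
               .forEach(entry -> {
                   try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                           jar.getInputStream(entry), StandardCharsets.UTF_8))) {
                       reader.lines()
                             .map(String::trim)
                             .filter(line -> !line.isEmpty() && !line.startsWith("#"))
                             .filter(cls -> jar.getEntry(cls.replace('.', '/') + ".class") == null)
                             .forEach(cls -> System.out.println(
                                     entry.getName() + " -> missing class " + cls));
                   } catch (Exception ex) {
                       throw new RuntimeException(ex);
                   }
               });
        }
    }
}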

Java module system and AspectJ

I have two libraries, which I have migrated to Java 10 and the module system.
The first thing that worries me is that I had a lot of errors like:
Error:java: the unnamed module reads package org.aspectj.internal.lang.reflect from both aspectjrt and org.aspectj.weaver. To fix that, I added requires org.aspectj.weaver; to the module-info. Actually, I had to put in a lot of other things which I don't use but which e.g. Spring uses. Only then was I able to compile it.
So in the end, my module-info for the first library looks like this:
module my.lib1 {
    requires spring.context;
    requires spring.core;
    requires spring.context.support;
    requires spring.beans;
    requires org.aspectj.weaver;
    requires slf4j.api;
    requires metrics.core;
    requires com.fasterxml.jackson.databind;
    requires java.validation;
    requires org.hibernate.validator;
    requires javax.el;
    exports my.lib1;
}
For the second library I also had to add a lot of libs that are used by my dependencies, not by me:
module my.lib2 {
    requires org.hibernate.orm.core;
    requires java.sql;
    requires java.persistence;
    requires spring.context;
    requires spring.tx;
    requires spring.orm;
    requires spring.data.jpa;
    requires spring.beans;
    requires HikariCP;
    requires metrics.core;
    requires slf4j.api;
    requires spring.core;
    exports my.lib2;
}
Both libs are compiling now. I put them in my local mvn repo and started third project which depends on this two.
module my.project {
    requires my.lib1;
    requires my.lib2;
}
And now I get the same error... Error:java: the unnamed module reads package org.aspectj.internal.lang.reflect from both aspectjrt and org.aspectj.weaver, but this time adding requires org.aspectj.weaver; doesn't help. I have noticed that when I put only one of the libs in the module (not both together, only lib1 or lib2), it works.
Is it normal that I have to put libs in my module-info which are used not by me but by other dependencies (e.g. shouldn't it be Spring's responsibility to require AspectJ)?
And the most important thing: how do I fix the problem with my project which depends on my two libs?
I have noticed when I put only one of the libs in the module (not both together, but one lib1 or lib2) it works.
Using the module system, a package may only be offered by a single module in your current module graph. This is one of the main reasons for the introduction of the module system: the avoidance of ambiguous dependencies (so-called split packages, as they can occur on the classic class path). And that's why it works with only one of your modules.
In your case, both modules aspectjrt and org.aspectj.weaver are offering the same package org.aspectj.internal.lang.reflect and that's where the error message comes from (the unnamed module reads package org.aspectj.internal.lang.reflect from both aspectjrt and org.aspectj.weaver).
how do I fix the problem with my project which depends on my two libs?
As Andy Guibert wrote on this topic, you could:
unsplit the package,
bundle them in a single package, or
hope that third-party modules become named modules in the future.
AspectJ has been ported to the module system; check the newest version if you are not already using it (you would then use requires org.aspectj.runtime and requires org.aspectj.weaver).
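For example, with a newer AspectJ release the module-info of the first library might look roughly like this (a sketch; only the AspectJ lines change, and it assumes the newer artifacts that declare the org.aspectj.* module names):

// module-info.java for my.lib1 - sketch assuming a newer AspectJ version whose jars
// declare the module names org.aspectj.runtime and org.aspectj.weaver
module my.lib1 {
    requires spring.context;
    requires spring.core;
    requires org.aspectj.runtime;   // instead of the filename-derived 'aspectjrt'
    requires org.aspectj.weaver;
    // ... other requires as before ...
    exports my.lib1;
}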
Additionally, here you can find some more information about the module system.
