How to use jlink with automatic modules - Java

I am having a problem where I cannot use jlink to build a runtime image, and I now roughly know what is wrong: jlink cannot link automatic modules because they have unstable names. Is there a workaround or a fix for this? Here are the relevant parts of my code:
module-info.java
module org.example {
requires MathParser.org.mXparser;
exports org.example;
}
pom.xml
...
<dependency>
<groupId>org.mariuszgromada.math</groupId>
<artifactId>MathParser.org-mXparser</artifactId>
<version>4.4.2</version>
</dependency>
...
Main.java
...
import org.mariuszgromada.math.mxparser.*;
...
When running jlink, it gives this error message:
Required filename-based automodules detected. Please don't publish this project to a public artifact repository!
This question is not a duplicate, because it has not been answered yet: the suggested question did not receive a fix or a workaround. Still, it explains the technical background: What does "Required filename-based automodules detected." warning mean?

No, there is no workaround for your case. jlink doesn't support automatic modules (even stable names will not help). You need to convert your automatic module to an explicit module first. The good news is that you don't have to recompile your library. You can generate a module declaration with jdeps --generate-module-info and then inject it into the JAR. This was described in this question.
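For illustration, here is a minimal sketch of that conversion on the command line. The JAR file name is an assumption based on the Maven coordinates in your pom.xml, and the generated module-info.java should be reviewed before compiling it (jdeps needs the library's own dependencies on the path to produce a complete declaration):
# 1. generate mods/MathParser.org.mXparser/module-info.java from the automatic module
jdeps --generate-module-info mods MathParser.org-mXparser-4.4.2.jar
# 2. compile the generated declaration against the classes in the JAR
javac --patch-module MathParser.org.mXparser=MathParser.org-mXparser-4.4.2.jar mods/MathParser.org.mXparser/module-info.java
# 3. inject the compiled descriptor back into the JAR, turning it into an explicit module
jar uf MathParser.org-mXparser-4.4.2.jar -C mods/MathParser.org.mXparser module-info.class
After that, jlink can consume the patched JAR like any other explicit module.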

Related

same accessible package from two different modules in Lucene leading Eclipse to raise an error [duplicate]

I'm developing on a Maven project (branch platform-bom_brussels-sr7) in Eclipse. When I recently tried switching the Java Build Path for the project to JDK 10, the Eclipse build could no longer find classes such as javax.xml.xpath.XPath, org.w3c.dom.Document, or org.xml.sax.SAXException. It seems only XML-related classes are impacted, mostly from the Maven dependency xml-apis-1.4.01.
Trying a Maven build from Eclipse works without errors. Ctrl-LeftClick on one of the supposedly missing classes finds the class and opens it in the Eclipse editor. It seems only the Eclipse build is impacted.
I tried several things, but none helped. I tried:
Project Clean
Different Eclipse Versions: Oxygen and Photon.
Running Eclipse itself with JDK 8 and JDK 10.
Changing Compiler Compliance level for the project. It builds with compliance level 8 and 10 under JDK 8 build path and fails for both with JDK 10 in build path.
I assume that the project being migrated from Java 1.8 still has no module-info.java. This implies you are compiling code in the "unnamed module".
Code in the unnamed module "reads" all observable named and unnamed modules; in particular it reads module java.xml from the JRE System Library. This module exports packages like javax.xml.xpath.
Additionally, you have xml-apis.jar on the classpath, which contributes another set of packages of the same names (javax.xml.xpath and friends). These are said to be associated with the unnamed module, like your own code.
This situation violates the requirement of "unique visibility" as defined in JLS §7.4.3 (last paragraph). In particular, every qualified type name Q.Id (JLS §6.5.5.2) requires that its prefix Q is a uniquely visible package (I'm disregarding the case of nested types for simplicity). Ergo: the program is illegal and must be rejected by compilers.
This leaves us with one question and two solutions:
(1) Question: Why is javac accepting the program?
(2) Solution: If you add module-info.java to your project, you can control via requires which module your project reads: either requires java.xml; or requires xml.apis; (where "xml.apis" is the automatic module name of "xml-apis-1.4.01.jar"). A minimal sketch follows after solution (3).
(3) Solution: Short of turning your project into a module, you can still avoid the conflict by excluding java.xml from the set of observable modules. On the command line this would be done using --limit-modules. The equivalent in Eclipse is the "Modularity Details" dialog, see also the JDT 4.8 New&Noteworthy (look for Contents tab). Since java.xml is implicitly required via a lot of other default-observable modules, it may be a good idea to push everything except for java.base from right ("Explicitly included modules") to left ("Available modules") (and selectively re-add those modules that your project needs).
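A minimal sketch of solution (2), using my.project as a placeholder module name (pick one of the two requires directives, not both):
module my.project {
// either read the XML APIs from the JDK ...
requires java.xml;
// ... or the automatic module derived from xml-apis-1.4.01.jar
// requires xml.apis;
}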
PS: Eclipse still doesn't provide an ideal error message; instead of "cannot be resolved" it should actually say: "The package javax.xml.xpath is accessible from more than one module: java.xml, <unnamed>".
PPS: Also weird: how come changing the order between the JRE and a JAR on the classpath (such ordering is not a concept supported by javac nor by JEP 261) changes the behavior of the compiler?
EDITs:
Alex Buckley confirmed that the given situation is illegal, despite what javac says. A bug against javac has been raised as JDK-8215739. This bug was acknowledged months before the release of Java 12. As of 2019-06 it has been decided that Java 13 will also ship without a fix; similarly for Java 14. The bug was temporarily scheduled for Java 15, but this plan was dropped on 2020-04-20.
Eclipse error message has been improved to mention the real problem.
In Eclipse 2019-06 the UI used for Solution (3) has been revamped. Up-to-date documentation can be found in the online help.
As of 2022-12 there's yet another perspective on this issue, as described in my other answer. It doesn't invalidate what's said here, but it lets things appear in a different light.
In my case the problem was that xercesImpl : 2.10.0 was a (transitive) dependency. This jar bundles org.w3c.dom.html.HTMLDOMImplementation.
As far as I understand, the org.w3c.dom package then becomes available from two modules, causing the build to fail.
In case one of the dependencies (direct or transitive) has classes in one of the 25 packages exported by the java.xml module, your build will fail.
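As a quick diagnostic, you can list the packages that java.xml exports (and which therefore must not also come from a classpath JAR) directly from the JDK:
java --describe-module java.xml
The output shows the module's requires, exports and contained packages; any dependency that bundles classes in one of the exported packages is a candidate for exclusion.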
Excluding xercesImpl (and also the offenders listed below) in Maven solved the issue for me:
<dependency>
<groupId>xyz</groupId>
<artifactId>xyz</artifactId>
<version>1.0</version>
<exclusions>
<exclusion>
<groupId>xerces</groupId>
<artifactId>xercesImpl</artifactId>
</exclusion>
<exclusion>
<groupId>xml-apis</groupId>
<artifactId>xml-apis</artifactId>
</exclusion>
<exclusion>
...
</exclusion>
</exclusions>
</dependency>
Thanks to Rune Flobakk for giving the hint here: https://bugs.eclipse.org/bugs/show_bug.cgi?id=536928#c73
Other offenders:
batik-ext : 1.9 (bundles org.w3c.dom.Window)
xom : 1.2.5 (bundles org.w3c.dom.UserDataHandler)
stax-api : 1.0.2 (bundles javax.xml.stream.EventFilter)
xml-apis : 1.4.01 (bundles org.w3c.dom.Document)
xml-beans : 2.3.0 (bundles org.w3c.dom.TypeInfo)
While the accepted answer (by myself) is still correct, a further twist of the story was recently brought to my attention:
The original intention may have been to actually support the situation at hand.
See this quote in the original design document "The State of the Module System" (SotMS):
If a package is defined in both a named module and the unnamed module then the package in the unnamed module is ignored.
That document is dated 2016/3/8 08:18, and already at that time was marked "This document is slightly out of date". Moreover, it is not legally binding for any implementation. Still that document has some relevance since what's quoted above is precisely what javac appears to implement (and still implements many years after JDK-8215739 was filed).
In other words, the conflict is not so much a conflict between a first and a second implementation, but seemingly a conflict within Oracle itself: two votes for supporting the situation (SotMS and javac) and only one vote for disallowing it (JLS).
Since Eclipse committers are not inclined to resolve this conflict within Oracle, the recent 2022-12 release of Eclipse has a new compiler option: by adding the following line to a project's .settings/org.eclipse.jdt.core.prefs, a user may opt to ignore JLS in this regard:
org.eclipse.jdt.core.compiler.ignoreUnnamedModuleForSplitPackage=ENABLED
This option puts the decision into the user's hands: do they want JLS semantics or SotMS/javac semantics (in this particular issue)? Still, we were not quite ready to provide a UI option for it, to avoid users making this choice thoughtlessly, without the background information provided here.
Personally, I'm not particularly happy about this situation, as it aggravates the fact that Java is not one, but several languages.
This seems to have been reported as Eclipse Bug 536928. Maybe if everyone were to go vote on it, it would get them to raise the priority.
What happens here is that you have a wildcard import like import org.w3c.dom.*, stating that you want to import all classes from package org.w3c.dom. Now, if there's at least one class in org.w3c.dom provided by a second source, compilation must fail (as pointed out here).
(By the way, the message "... cannot be resolved" is replaced by a more accurate error message "The package org.w3c.dom is accessible from more than one module: <unnamed>, java.xml" in more recent Eclipse versions, see this merged change request by Stephan Herrmann.)
To resolve this problem
Open the "Open Type" dialog (Ctrl+Shift+T).
Enter the complete import, e.g. org.w3c.dom.* or just org.w3c.dom.
Check the entire list for multiple sources. All entries here should contain only something like "jdk-11-...".
Gather all JARs that contain classes you have multiple sources for.
Open the "Dependency Hirarchy" tab from pom.xml.
Search for the JAR file.
Add an exclusion (right-click, or edit the pom.xml manually).
Example
I had this findbugs dependency in my pom.xml:
<dependency>
<groupId>com.google.code.findbugs</groupId>
<artifactId>findbugs</artifactId>
<version>${findbugs.version}</version>
</dependency>
Findbugs has two dependencies that need to be excluded:
<dependency>
<groupId>com.google.code.findbugs</groupId>
<artifactId>findbugs</artifactId>
<version>${findbugs.version}</version>
<exclusions>
<exclusion>
<groupId>xml-apis</groupId>
<artifactId>xml-apis</artifactId>
</exclusion>
<exclusion>
<groupId>jaxen</groupId>
<artifactId>jaxen</artifactId>
</exclusion>
</exclusions>
</dependency>
While Stephan Herrmann's answer is the correct one, I'll post my error and how I solved it, in case it can help others. I had the error The package javax.xml.namespace is accessible from more than one module: <unnamed>, java.xml, and after inspecting the class with the error, it was the javax.xml.namespace.QName import that was complaining. With the "Open Type" dialog, I found out that it was pulled in from stax-api through the eureka client. This solved it for me:
<exclusion>
<groupId>stax</groupId>
<artifactId>stax-api</artifactId>
</exclusion>
I have seen something very similar under Eclipse 4.8.0 and JDK 10. E.g.
import org.w3c.dom.Element;
was failing to compile in Eclipse with: The import org.w3c.dom.Element cannot be resolved
Even so, pressing F3 (Open Declaration) on that import, Eclipse was able to open the interface definition - in this case under xml-apis-1.4.01.jar.
Meanwhile, builds run directly from Maven were working fine.
In this case the fix was to remove this dependency from the pom.xml:
<dependency>
<groupId>xml-apis</groupId>
<artifactId>xml-apis</artifactId>
<version>1.4.01</version>
</dependency>
Then the compile errors in Eclipse melted away. Following F3 again showed the Element interface - now under the java.xml module, under the JRE System Library under the project. Also the Maven build remained fine.
This feels like a problem with Eclipse resolving a class that it finds in both a JDK module and a dependent .jar file.
Interestingly, in a separate environment, this time under Eclipse 4.9.0 and JDK 11, all is fine, with or without the xml-apis:1.4.01 dependency.
This is more of a work-around, but from my experience it can be resolved by going to the "Java Build Path", the "Order and Export" tab, and sending the "Maven Dependencies" to the bottom (so it's below the "JRE System Library").
Thanks for this clue. I was having trouble identifying where the conflicting reference was coming from for org.w3c.dom.Document. I found it easily in Eclipse 2020-12 this way: select org.w3c.dom.Document within the import statement that Eclipse flagged, right-click and choose Open Type Hierarchy, then in the Type Hierarchy view right-click Document at the top and choose Implementors > Workspace to reveal all the JARs in all projects in the workspace which are bringing in org.w3c.dom.Document (or whatever type you have selected that is accessible from more than one module). – MikeOnline
Following the directions above from one of the earlier posts helped us solve our issue.
What we did was replace Document with GenericDocument and Element with GenericElement from batik, and the compile errors are gone. Now we just have to test to make sure the implementation matches what we had under Java 8. Thanks MikeOnline
JDK 9+ brought in changes related to Project Jigsaw. The JDK was broken down into various modules, and some of them (the Java EE, JAXB and XML related ones) are no longer resolved by default. You should add these APIs to your Maven build directly instead of expecting them to be on the JRE classpath; see this SO question.
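For example, if it is the JAXB classes that went missing (java.xml.bind was deprecated for removal in JDK 9 and dropped entirely in JDK 11), adding the standalone API as a regular dependency is the usual fix. This is only a sketch; the version is an example, and a runtime implementation such as org.glassfish.jaxb:jaxb-runtime may be needed as well:
<dependency>
<groupId>javax.xml.bind</groupId>
<artifactId>jaxb-api</artifactId>
<version>2.3.1</version>
</dependency>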

Access `sun.security.x509` in JDK 11 without modules?

(tl;dr at the end)
We have a small method that generates a self-signed SSL certificate, and it obviously depends on sun.security.x509. Currently we are still building it using JDK 8 because of that, even though the rest of the codebase (it's only a small, single library) is built using JDK 11 and run on JVM 11.
Unfortunately there is no replacement in the main JDK, as per the following (and CertificateFactory has little to nothing to do with generating certificates, contrary to what its javadoc states…):
https://bugs.java.com/bugdatabase/view_bug.do?bug_id=JDK-8165481
https://bugs.java.com/bugdatabase/view_bug.do?bug_id=8058778
One option would be to use BouncyCastle, but that's an additional 4 MB that we really don't need, especially for such a small task, so I was pondering ways to access it anyway.
From what I saw, the package and the relevant classes are still there (see sun.security.x509 on GitHub), but when building it (using Maven) I get this error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.1:compile (default-compile) on project: Compilation failure: Compilation failure:
[ERROR] OldSelfSignedCertificateGenerator.java:[20,25] package sun.security.x509 does not exist
[ERROR] OldSelfSignedCertificateGenerator.java:[71,45] cannot find symbol
[ERROR] symbol: class X509CertInfo
[ERROR] location: class OldSelfSignedCertificateGenerator
I searched a bit and added:
<arg>--add-exports</arg><arg>java.base/sun.security.x509=ALL-UNNAMED</arg>
to the maven-compiler-plugin configuration, and it somewhat worked: now I only get a WARNING regarding the sun.security.x509 package:
[WARNING] OldSelfSignedCertificateGenerator.java:[20,25] sun.security.x509.AlgorithmId is internal proprietary API and may be removed in a future release
BUT! Now it seems I have entered (unwillingly!) the module system, and it complains about access to other, basic Java classes (and one more of our dependencies):
[ERROR] CertificateUtil.java:[35,17] package java.util.logging is not visible
(package java.util.logging is declared in module java.logging, but module java.base does not read it)
I tried adding the java.logging module to the exports in the same manner, but without much success. It also seems that I would have to convert both this library and its dependency to the module system, which is not really desired.
The question is somewhat related to How to generate a self-signed certificate using only JDK supported classes?
tl;dr:
Is there a way to compile a library using the sun.security.x509 package under JDK 11 without the module system? Some simple switch?
It turns out that, presumably, it has to do with the fact that builds produced by newer JDK versions (9+) won't be executable under JDK 8:
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>9</source>
<target>9</target>
<release combine.self="override"></release>
<compilerArgs>
<arg>--add-exports</arg><arg>java.base/sun.security.x509=ALL-UNNAMED</arg>
</compilerArgs>
</configuration>
</plugin>
To include sun.security.[somePackage] classes in Gradle you may add:
tasks.withType(AbstractCompile) {
options.compilerArgs += ["--add-exports", "java.base/sun.security.util=ALL-UNNAMED"]
options.compilerArgs += ["--add-exports", "java.base/sun.security.pkcs=ALL-UNNAMED"]
}

Unable to derive module descriptor: Provider {class X} not in module

I am getting this error message when I try to compile my new modularized Java 11 application:
Error occurred during initialization of boot layer
java.lang.module.FindException: Unable to derive module descriptor for C:\Users\inter\.m2\repository\xalan\xalan\2.7.2\xalan-2.7.2.jar
Caused by: java.lang.module.InvalidModuleDescriptorException: Provider class org.apache.bsf.BSFManager not in module
This appears to be an issue from a dependency of a dependency. I can't even find which module is pulling it in so I can update it.
I am using openjdk 11.0.2, IntelliJ 2018.3.4, Maven
Any advice how I can troubleshoot or fix this? I have found very little documentation on this issue.
Xalan
I had a look at their bug tracker, following their index page, and wasn't able to find this reported; I'm also not sure how actively the library is being maintained.
General Explanation
Just to explain what has caused the issue in your code, I would share a screenshot and then try to add details around it.
So within the JAR for version 2.7.2 there are service declarations (META-INF/services), one of which is org.apache.xalan.extensions.bsf.BSFManager. A service file has to name the provider class it stands for, and that class is supposed to be present on the module path for the module configuration to resolve reliably.
In this case, for the xalan module (an automatic module), the listed provider class is not packaged within the dependency itself. (Look at the package org.apache: it does not contain a bsf sub-package, nor the BSFManager class.) Hence the exception you get.
Short term hack
One of the tweaks to get this resolved would be to update the library JAR (patch it) and get rid of the service file if you're not using it, or to add the missing provider class copied from the corresponding artifact. A sketch of the first variant follows below.
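As an illustration of the patching approach (the exact service file name inside xalan-2.7.2.jar is an assumption here; check the listing first):
# list the service declarations bundled in the JAR
unzip -l xalan-2.7.2.jar 'META-INF/services/*'
# drop the service file whose provider class is not packaged in the JAR
zip -d xalan-2.7.2.jar 'META-INF/services/org.apache.xalan.extensions.bsf.BSFManager'
Only do this if you are sure nothing in your application relies on that service.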
If you don't directly depend on this artifact or its parent dependencies, you can let those remain on the --classpath and get resolved into the unnamed module of your application.
Long term solution
An ideal way would be to report this to the maintainers and get it resolved. It depends, though, on how actively they are maintaining it; e.g. the last release of xalan was almost 5 years ago, so you might just want to look for an actively maintained alternative, in my opinion.
I tried to install an update for TestNG in Eclipse:
"Help -> Check for Updates -> deselect all and select the TestNG check box." Then
install the latest version; I installed the version which starts with 7.2.0.
It fixed the issue for me.

How to define qualified exports to unknown modules?

I have a Maven project with two Maven modules A and B. A contains the following Java module definition:
module A {
exports internal.util to B;
exports external.A;
}
B contains the following Java module definition:
module B {
requires A;
exports external.B;
}
When I build the project, I get an error:
[WARNING] module-info.java:[16,106] module not found: B
Module B exists but because Module A is compiled before B and does not depend on it, the compiler has no way of knowing that. Because I configured the compiler to treat warnings as errors (-Werror), the build fails.
Seeing as I want to keep treating warnings as errors, what is the best way to resolve this problem?
Is there a way to hint to the compiler that this module will be declared in the future?
Is there a way to suppress all warnings of this type?
I figured out a workaround by scanning through the JDK 11 source-code: -Xlint:-module. I am still open to a better solution if someone finds one.
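In a Maven build the same flag can be passed through the compiler plugin; a minimal sketch (plugin version omitted):
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<compilerArgs>
<!-- suppress the "module not found" warning for the qualified export -->
<arg>-Xlint:-module</arg>
</compilerArgs>
</configuration>
</plugin>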
UPDATE: An alternative is to use --module-source-path as demonstrated by https://stackoverflow.com/a/53717183/14731
Thank you Alan Bateman for pointing me in this direction!
As I could infer from the question, your scenario is such that modules A and B co-exist and module A does not depend on module B.
From a module-system perspective, since A makes a qualified export to module B, a warning is thrown if module B cannot be resolved on the module path while you build A.
In IntelliJ IDEA I could fix such an issue by adding a dependency on module B to module A, without any declaration changes. Using Maven, though, I can't imagine doing that without a cyclic dependency here; maybe the suggestion by khmarbaise could help.
Note: An alternate way to fix this though could also be to disable such warnings using the command-line arg :
-Xlint:-exports
Or as Alan points out for correction
-Xlint:-module
Thinking out loud: that would be because the warning is the result of a qualified export, yet it is reported as a module not being found.
Useful: you can find the details of these command-line args via the javac command.

warning: unknown enum constant Status.STABLE

In the quest to solve this and somehow that, I was trying to create packages to subdivide main and test classes, and then to make use of the compiler with added modules to execute the unit tests. Not a very good approach, agreed, but just a hypothetical structure for now.
A few open questions as I proceeded further were:
Add a JDK9 based module to the project.
Add JUnit5 to the classpath using IntelliJ's shortcut. (lib folder) [junit-jupiter-api-5.0.0.jar]
Q: Note that it brings along opentest4j-1.0.0.jar to the lib/ folder. Why is that so, and what is that other jar used for?
Add the classes and generate some tests method correspondingly.
Compile the sample project (shared just to draw a picture of the directory structure in use) using the command
javac --module-path lib -d "target" $(find src -name "*.java")
This results in warnings such as:
warning: unknown enum constant Status.STABLE
reason: class file for org.apiguardian.api.API$Status not found
warning: unknown enum constant Status.STABLE
2 warnings
Note:
I find the usage of junit-jupiter suspicious, since if I comment out the code using JUnit and execute the same command, things seem to work fine.
Libraries/Tools used, if that matters:
junit-jupiter-api-5.0.0 with
Java version "9" (build 9+181)
IntelliJ 2017.2.5
Q: What could be the probable cause of such a warning? Moreover, I am unable to find API.Status either in my project or in the classes outside the project.
The compilation warning can simply be ignored. Moreover, it won't appear anymore starting with version 5.1.0 (currently in development). It is all explained in the Release Notes:
In 5.0.1, all artifacts were changed to have an optional instead of a mandatory dependency on the @API Guardian JAR in their published Maven POMs. However, although the Java compiler should ignore missing annotation types, a lot of users have reported that compiling tests without having the @API Guardian JAR on the classpath results in warnings emitted by javac that look like this:
warning: unknown enum constant Status.STABLE
reason: class file for org.apiguardian.api.API$Status not found
To avoid confusion, the JUnit team has decided to make the dependency on the @API Guardian JAR mandatory again.
For reference also see:
Remove compile dependency on apiguardian-api in Maven POMs
Reintroduce compile dependency on apiguardian-api in Maven POMs
1) opentest4j
opentest4j is a transitive dependency of junit-jupiter-api. See the dependency graph:
+--- org.junit.jupiter:junit-jupiter-api:5.0.1
+--- org.opentest4j:opentest4j:1.0.0
\--- org.junit.platform:junit-platform-commons:1.0.1
2) unknown enum constant Status.STABLE
You need to add the following dependency: apiguardian-api.
For example in Gradle, you can do it via:
dependencies {
testCompile 'org.junit.jupiter:junit-jupiter-api:5.0.1'
testRuntime 'org.junit.jupiter:junit-jupiter-engine:5.0.1'
testCompileOnly 'org.apiguardian:apiguardian-api:1.0.0'
}
But overall, the dependency is build-tool-independent, so you can also add it in a plain IDE without Gradle, or with Maven.
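For instance, a Maven equivalent of the Gradle snippet above could look like this (the provided scope roughly corresponds to testCompileOnly; adjust to your build):
<dependency>
<groupId>org.apiguardian</groupId>
<artifactId>apiguardian-api</artifactId>
<version>1.0.0</version>
<scope>provided</scope>
</dependency>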
