I have a project (Java 12) with several Maven dependencies, and now I'm trying to add a module-info file like
module mymodule {
    requires java.net.http;
}
But if I do this, all Maven dependencies (in pom.xml) become invisible to the code, and the compiler throws errors like java: package org.openqa.selenium.safari is not visible
(package org.openqa.selenium.safari is declared in module selenium.safari.driver, but module mymodule does not read it)
Is it possible to make them work together?
The new module-info is not congruent with the information in the pom.xml. Robert wrote a good article about the differences between the two systems:
https://www.sitepoint.com/maven-cannot-generate-module-declaration/
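For example, the compiler error above already tells you the automatic module name that the Selenium artifact gets on the module path, so a declaration along these lines should make that package readable again (only a sketch; each of your other dependencies needs its own requires entry with whatever automatic module name it resolves to):

module mymodule {
    requires java.net.http;
    // automatic module name taken from the compiler error above;
    // repeat for the other Maven dependencies you use directly
    requires selenium.safari.driver;
}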
Related
I'm creating a sample demo application with JavaFX in IntelliJ, but I need to use a library called JavaFaker. I'm using Gradle as the build system, but every time I try to add the library, either as an implementation dependency in the build.gradle file or via IntelliJ's project structure options, the module-info.java file says error: module not found. I've already tried adding it to the modules list, but nothing changes.
module-info.java
module com.example.demo1 {
    requires javafx.controls;
    requires javafx.fxml;
    requires javafaker;

    opens com.example.demo1 to javafx.fxml;
    exports com.example.demo1;
}
build.gradle
plugins {
    id 'java'
    id 'application'
    id 'org.openjfx.javafxplugin' version '0.0.10'
    id 'org.beryx.jlink' version '2.24.1'
}

group 'com.example'
version '1.0-SNAPSHOT'

repositories {
    mavenCentral()
}

ext {
    junitVersion = '5.8.2'
    javaFakerVersion = '1.0.2'
}

sourceCompatibility = '17'
targetCompatibility = '17'

tasks.withType(JavaCompile) {
    options.encoding = 'UTF-8'
}

application {
    mainModule = 'com.example.demo1'
    mainClass = 'com.example.demo1.HelloApplication'
}

javafx {
    version = '17.0.1'
    modules = ['javafx.controls', 'javafx.fxml']
}

dependencies {
    implementation("com.github.javafaker:javafaker:${javaFakerVersion}")
    testImplementation("org.junit.jupiter:junit-jupiter-api:${junitVersion}")
    testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine:${junitVersion}")
}

test {
    useJUnitPlatform()
}

jlink {
    imageZip = project.file("${buildDir}/distributions/app-${javafx.platform.classifier}.zip") as RegularFile
    options = ['--strip-debug', '--compress', '2', '--no-header-files', '--no-man-pages']
    launcher {
        name = 'app'
    }
}

jlinkZip {
    group = 'distribution'
}
error message
> Task :HelloApplication.main() FAILED
Error occurred during initialization of boot layer
java.lang.module.FindException: Module javafaker not found, required by com.example.demo1
I tried for a while to get this to work with Gradle but was unable to. I don't know Gradle well, but unless you do, I don't advise trying it.
Alternate option: use requires static
I didn't try this, but this is suggested in another answer.
Before you try this, see:
What's the difference between requires and requires static in module declaration
It is, IMO, a bit of a hack in this use case. It makes the module optional at runtime, but if the module's code is on the classpath instead of the module path, it can still be used. More information quoted from the linked answer:
A requires static clause expresses a dependency that is optional at
run time. That means at compile time the module system behaves exactly
as described above.
At run time, on the other hand, it mostly ignores requires static
clauses. If it encounters one, it does not resolve it. That means, if
an observable module is only referenced with requires static, it does
not make it into the module graph!
Alternate option: Non-modular project
You can fix this issue by making your project non-modular:
Delete your module-info.java file.
Run your application with JavaFX modules on the module-path.
The org.openjfx.javafxplugin you are already using will help achieve this by specifying the JavaFX modules to be used.
To execute the application directly in the IDE rather than through Gradle, you will need to specify the module options to the VM in the IDE run configuration (see the example after these steps; information on that is in the getting started documentation at openjfx.io).
For packaging, switch to using the badass-runtime-plugin rather than the badass-jlink-plugin. This will package the application via jpackage rather than jlink (which cannot package non-modular applications or applications with automatic modules).
In the application block of your build file, you no longer need to specify the module for your application as you no longer have one.
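For reference, the VM options for running the non-modular application from the IDE typically look something like this (a sketch; the SDK path is an assumption and must be adapted to where the JavaFX SDK lives on your machine):

--module-path /path/to/javafx-sdk-17.0.1/lib --add-modules javafx.controls,javafx.fxml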
While that means that your application is no longer modular, in this case, in my opinion, this is not such a big loss. The dependencies you are using are not well-defined modules, so you can't use jlink to create a package for your application, and you don't have the level of modular encapsulation and definition you would normally receive for fully modular projects.
For more information, see the Getting started instructions at:
https://openjfx.io/openjfx-docs/
Under the sections "Non-Modular with Gradle" for your selected IDE.
Alternate option: Using Maven
It is easy to get this to work with Maven.
Create a new JavaFX project
Choose Maven as your build system instead of Gradle.
Add the javafaker dependency to your pom.xml.
<dependency>
    <groupId>com.github.javafaker</groupId>
    <artifactId>javafaker</artifactId>
    <version>1.0.2</version>
</dependency>
Press the refresh icon in the Maven window to reimport the Maven project into the IDE.
Add the requires clause for the javafaker module into your module-info.java
requires javafaker;
Add the code to use javafaker to your app.
I don't have code to use javafaker, so I could not verify that the last step would work, but give it a try...
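If you just want something quick to confirm the dependency is wired up, a minimal usage sketch (my own, not verified against the original project) could be:

import com.github.javafaker.Faker;

public class FakerDemo {
    public static void main(String[] args) {
        Faker faker = new Faker();
        // prints randomly generated sample data
        System.out.println(faker.name().fullName());
        System.out.println(faker.address().city());
    }
}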
Why you can receive this issue when using Gradle, but not Maven
Looking at the Gradle Documentation section "Using libraries that are not modules":
A third case are traditional libraries that provide no module information at all — for example commons-cli:commons-cli:1.4. Gradle puts such libraries on the classpath instead of the module path. The classpath is then treated as one module (the so called unnamed module) by Java.
This is the case with the javafaker dependency that you are using. It has no module-info.java and does not define the property Automatic-Module-Name in its manifest file (which are the other two cases in the section). Both the other cases result in Gradle putting the library on the module path, but the case you have means that it is on the class path.
This is a problem when you want to access the code from a named module that you define, which you have because you created a module-info.java.
Your module can only find code and resources of modules it requires (which need to be on the module path), so you add requires javafaker to the module-info.java, and get the following when you try to run through the IDE:
java.lang.module.FindException: Module javafaker not found, required by com.example.demo1
So you remove the requires javafaker from the module-info.java as advised by the Gradle documentation I linked and you get the following when you try to compile:
Package 'com.github.javafaker' is declared in module 'javafaker', but module 'com.example.demo1' does not read it
So you must place the library in the module-info to use it, but you can't place the library in the module-info because Gradle puts it on the classpath -> catch-22.
There are workarounds to this such as providing VM arguments to allow access to the unnamed module (which is the classpath), or maybe modifying the module path handling of the Gradle build and/or IDE somehow (I don't know how), but they are pretty ugly.
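To give a rough idea of the first of those workarounds (an untested sketch; it assumes you remove requires javafaker; from module-info.java and leave the library on the classpath), the build.gradle additions might look like:

// Untested sketch: let the named module read the classpath (the unnamed module)
// at both compile time and run time.
compileJava {
    options.compilerArgs += ['--add-reads', 'com.example.demo1=ALL-UNNAMED']
}
run {
    jvmArgs += ['--add-reads', 'com.example.demo1=ALL-UNNAMED']
}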
On the other hand, for this case, Maven acts differently from Gradle: it places the dependent library on the module path, even if it does not have a module-info.java or an Automatic-Module-Name defined. This means that it was (for me) much easier to set up and use.
Incidental advice on module naming
This is not an error, but note: although module names with numbers in them are now allowed due to a change in the module system specification, it is probably best not to put numbers in module names, to prevent the module name and version info from being confused.
I've had a similar issue recently. Adding static to the requires statement helped, however. Maybe this will fix your issue without having to switch to Maven.
So you'd need to add: requires static javafaker;
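In the module-info.java from the question that would look like this (note that the module still has to be observable at compile time; only the run-time resolution becomes optional):

module com.example.demo1 {
    requires javafx.controls;
    requires javafx.fxml;
    // optional at run time; the classes are then loaded from the classpath
    requires static javafaker;

    opens com.example.demo1 to javafx.fxml;
    exports com.example.demo1;
}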
How can we create a Maven test jar with Java 11 modules?
The only way I found is to add a module-info.java file to test/java and change the module name (e.g. append ".test"). Then provide the class in a separate package (e.g. append ".test") and export that package:
module my.module.test {
    requires my.module;
    exports my.module.test;
}
Otherwise the classes are not visible or I get split package issues.
But this isn't really the purpose of the test-jar goal and it limits access to "my.module".
What is the proper way to use test-jar with Java 11 modules? Or should it be avoided?
I am trying to call a non-module class from a module class. I have created a folder structure
moduledemo > allclasses > moduleC > packageC > MyMethods.class
is the path to my module class file
moduledemo > moduleC > packageC > MyMethods.java
and
moduledemo > nomodule > packageD > DemoNoModule.class
is the no module class that I am calling from MyMethods.java
I am able to compile the DemoNoModule file, and I am able to compile MyMethods.java into the allclasses/moduleC folder.
When I run MyMethods I get the error moduleC not found. Can anyone help? I am using the following command to run it:
java --module-path allclasses -m moduleC/packageC.MyMethods
Code for both files follows. Non-module class:
package packageD;

public class DemoNoModule {
    public void showD() {
        System.out.println("this is show of D in No Module");
    }
}
Module class that calls it:
package packageC;

import packageD.*;

public class MyMethods {
    public static void main(String[] s) {
        DemoNoModule d = new DemoNoModule();
        d.showD();
    }
}
Module info in module C
module moduleC {
    exports packageC;
}
On one hand, moduleC (consider improving the naming?) is a named module.
On the other hand, the "no module class", as you term it, is, as Alan stated, simply a class present on the classpath. Classes present on the classpath during execution are part of the unnamed module in JPMS.
Quoting the documentation further:
The unnamed module exports all of its packages. This enables
flexible migration... It does not, however, mean
that code in a named module can access types in the unnamed module. A
named module cannot, in fact, even declare a dependence upon the
unnamed module.
This is intentional to preserve the reliable configuration in the module system. As stated further :
If a package is defined in both a named module and the unnamed module
then the package in the unnamed module is ignored. This preserves
reliable configuration even in the face of the chaos of the class
path, ensuring that every module still reads at most one module
defining a given package.
Still, to make use of a class from the unnamed module in your named module moduleC, you can follow the suggestion of using the --add-reads flag to let a module on the module path read the ALL-UNNAMED module, with the following option:
--add-reads <source-module>=<target-module> // moduleC=ALL-UNNAMED
As a special case, if the <target-module> is ALL-UNNAMED then
readability edges will be added from the source module to all present
and future unnamed modules, including that corresponding to the class
path.
PS: Do take into consideration the highlighted portion of the documentation (above) as you do so.
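Applied to the layout from the question, the run command might look roughly like this (a sketch; it assumes the compiled non-module classes stay under the nomodule folder, which then has to be on the classpath):

java --module-path allclasses -cp nomodule --add-reads moduleC=ALL-UNNAMED -m moduleC/packageC.MyMethods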
Also note that the long-term solution would be to revise your design here: you could move the code in the class DemoNoModule into an explicit module, or package it separately so that it can be used as an automatic module.
Java 9 programs are supposed to be modular; that is how I understood Jigsaw in JDK 9. So, IMHO, you'll have to 'wrap' your packageD in another module and, in the module-info for moduleC, write requires moduleD. Also, moduleD should export packageD.
ALL-UNNAMED is added for backward compatibility, and I suppose it will be removed at some point in Java's evolution.
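A minimal sketch of the wrapping approach suggested above (the module name moduleD is only illustrative):

// module-info.java of the new module that wraps packageD
module moduleD {
    exports packageD;
}

// revised module-info.java of moduleC
module moduleC {
    requires moduleD;
    exports packageC;
}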
I have a project like this:
\---main
    \---src
        \---com.foo
            \---UnnamedStart.java
\---api
    \---src
        \---com.foo.api
            \---ApiInterface.java
        \---module-info.java
\---impl
    \---src
        \---com.foo.impl
            \---ApiInterfaceImpl.java
        \---module-info.java
Implementation of UnnamedStart.java is:
public static void main(String[] args) {
    ServiceLoader<ApiInterface> services = ServiceLoader.load(ApiInterface.class);
    ...
}
Note that main is not modular (its code is in the unnamed module).
api/src/module-info.java is:
module com.foo.api {
    exports com.foo.api;
}
and impl/src/module-info.java is:
update 1.1 - code below updated, see comments; added requires
update 1.2 - code below updated; provides A with B changed to provides B with A (mistake made while creating the question, the original was OK)
module com.foo.impl {
    requires com.foo.api; // added (update 1.1)
    provides com.foo.api.ApiInterface
            with com.foo.impl.ApiInterfaceImpl; // vice versa (update 1.2)
}
When I run my code in UnnamedStart.java I end up with no element in services.
I also tried to create a static method in com.foo.api.ApiInterface:
static List<ApiInterface> getInstances() {
    ServiceLoader<ApiInterface> services = ServiceLoader.load(ApiInterface.class);
    List<ApiInterface> list = new ArrayList<>();
    services.iterator().forEachRemaining(list::add);
    return list;
}
and added the line uses com.foo.api.ApiInterface; in api/src/module-info.java, but it gave the same result (nothing).
The only way I made it work is by migrating main from an unnamed to a named module.
1. How does Java 9 work when an unnamed module tries to interact with a named module?
2. Is it possible to make it work while keeping main as an unnamed module?
update 1.3 - added related project
ServiceLoader::load works as usual, but there are other things to consider.
[Short answer]
1. The unnamed module reads named modules just as a named module does, but a named module cannot access types in the unnamed module.
2. You are trying to launch an application from a non-modular JAR, so you have to explicitly resolve the required modules with --add-modules com.foo.impl.
Note that the required modules have to be in the module graph (e.g. added via --module-path).
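As a concrete sketch (the output directories are assumptions and must be adapted to your build layout; use ; instead of : as the path separator on Windows), launching the non-modular main class could look like:

java --module-path api/out:impl/out --add-modules com.foo.impl -cp main/out com.foo.UnnamedStart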
[More details]
1. There are four different types of modules: built-in platform modules, named modules, automatic modules, and the unnamed module; each of them has a name apart from the unnamed module.
As the documentation states, the unnamed module treats all the other modules as named modules:
All other modules have names, of course, so we will henceforth refer to those as named modules.
The unnamed module reads every other module. [...]
The unnamed module exports all of its packages. [...] It does not, however, mean that code in a named module can access types in the unnamed module. A named module cannot, in fact, even declare a dependence upon the unnamed module.
[...]
If a package is defined in both a named module and the unnamed module then the package in the unnamed module is ignored.
Even an automatic module is indeed named:
An automatic module is a named module that is defined implicitly, since it does not have a module declaration.
2. Second part of this answer
If you compile non-modular code or launch an application from a non-modular JAR, the module system is still in play and because non-modular code does not express any dependencies, it will not resolve modules from the module path.
So if non-modular code depends on artifacts on the module path, you need to add them manually with the --add-modules option. Not necessarily all of them, just those that you directly depend on (the module system will pull in transitive dependencies) - or you can use ALL-MODULE-PATH (check the linked post, it explains this in more detail).
This comment by #nullpointer will be useful:
Also, module resolution still needed the impl module to be resolved during startup. To check which modules are resolved, you could also make use of the --show-module-resolution flag.
Working with Java 9 modules: if I am using java.xml in my code...
1) I will import the XML packages using import statements.
2) If I don't mention that this package is required in the module declaration of my module, will the compilation of my module work?
I would guess no, and on mentioning that the XML module is required in module-info.java, it might work.
So what I am wondering is: isn't that redundancy? Every imported package is implicitly required (unless I need to understand modules even better).
Is there a way to state that all imported packages are required in the module declaration? Otherwise it could be a long list to mention in module-info.java.
First of all, in module-info.java you mention modules, not packages. E.g. java.xml is a module which contains about 25 packages. So, if your module uses 10 packages from the java.xml module, you don't have to repeat that 10 times in module-info.java; you write requires java.xml just once. So that huge list of dependencies is actually not so huge.
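For example, a class can import types from several java.xml packages while the module declaration mentions the module only once (a sketch with made-up names):

// module-info.java
module my.app {
    requires java.xml; // covers javax.xml.parsers, org.w3c.dom, and the other packages of that module
}

// my/app/XmlDemo.java
package my.app;

import javax.xml.parsers.DocumentBuilderFactory; // package from java.xml
import org.w3c.dom.Document;                     // another package from java.xml

public class XmlDemo {
    public static Document newEmptyDocument() throws Exception {
        return DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
    }
}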
If you really want to skip all those declarations, you can just not create a module-info.java (but I don't recommend that). A JAR without a module-info.java that is placed on the module path becomes an automatic module, and an automatic module implicitly reads all other modules.
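If you want to see what name such a plain JAR would get as an automatic module, the jar tool can print the derived module descriptor, e.g. (mylibrary.jar is a placeholder):

jar --describe-module --file=mylibrary.jar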