I added JaCoCo to my project some time ago. The implementation is just as described here:
classpath "org.jacoco:org.jacoco.core:0.8.3"
and I have no problems with that. (Let's call this project 1.)
But whenever I export project 1 to a .jar file, the second project (let's say project 2) that uses the .jar file throws this error at runtime:
java.lang.NoClassDefFoundError: Failed resolution of: Lorg/jacoco/agent/rt/internal_8ff85ea/Offline;
at
I've tracked the error down to the library, but I don't actually want anything related to JaCoCo exported from project 1; it seems pointless to have it inside the jar file.
How can I keep JaCoCo out of the jar file?
Your error is explained in how to prevent jacoco instrumenting production code?. Basically, JaCoCo provides two ways of performing instrumentation: on the fly and offline. That answer also describes the root of your problem (in bold):
One of such specific scenarios - is execution of tests on Android, because there is no way to use Java agent on it. So AFAIK Android Plugin for Gradle, when instructed to measure coverage using JaCoCo, uses offline instrumentation and therefore requires two types of build - with coverage and without for release.
So you need to generate the jar from a release build. The example you used as a baseline actually gives you the solution (with small corrections): disable test coverage for the release build type:
buildTypes {
    release {
        // Your release part: disable test coverage
        testCoverageEnabled false
    }
    debug {
        // Your debug part: enable test coverage using JaCoCo
        testCoverageEnabled true
    }
}
UPDATE: I found this workaround for Maven.
Instrumentation should not be done on every build. You do not want to release instrumented code: first because this is bad practice, and second because the code will not run unless jacocoagent.jar is on the classpath.
Try downloading and including jacocoagent.jar manually on the classpath (I did not have time to test this).
You can create a separate 'Java Library' module in your project and leave the JaCoCo dependencies out of it. Keep only the core logic you want exported in that module. When you build the project, a .jar file will be created in the '{module_name}/build/' folder of your project.
You also need to add this new module as a dependency of your main 'app' module, and add the JaCoCo dependencies to the 'app' module only.
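The split described above can be sketched like this; the module names 'app' and 'corelib' are placeholders, not from the original project:

```groovy
// settings.gradle (hypothetical module names)
include ':app', ':corelib'
```

and in app/build.gradle:

```groovy
dependencies {
    // core logic lives in corelib, which has no JaCoCo configuration of its own
    implementation project(':corelib')
}
```

Building the corelib module then produces a jar under corelib/build/ that contains no JaCoCo classes.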
Let me know if you need further help.
I have a JavaFX application that works as expected. I need to use Apache POI to read and write Excel files. Here are the steps I have taken:
Added the required dependency
implementation 'org.apache.poi:poi-ooxml:5.2.3'
Added the module to module-info.java
requires org.apache.poi.ooxml;
Tried to use the library within a function:
@FXML
private void downloadTemplate() {
    XSSFWorkbook workbook = new XSSFWorkbook();
}
All this is fine with no issues. However, when I try to run the application, I get the following two errors (alternating):
> Task :Start.main() FAILED
Error occurred during initialization of boot layer
java.lang.module.FindException: Module SparseBitSet not found, required by org.apache.poi.ooxml
and
> Task :Start.main() FAILED
Error occurred during initialization of boot layer
java.lang.module.FindException: Module commons.math3 not found, required by org.apache.poi.ooxml
I can, however, clearly see both libraries under 'External Libraries'.
I am using IntelliJ Community Edition 2022.1.2 and running the project using Java 17.0.1. Any help would be highly appreciated.
SparseBitSet is an automatic module: it has no module-info of its own (probably commons-math3 is one as well) and no Automatic-Module-Name entry in its manifest.
Gradle puts libraries without a module-info.class or an Automatic-Module-Name in their manifest on the class path, not the module path, so they won't be treated as modules and the module finder won't find them.
You can:
Hack the Gradle build to allow the modules to be found. (I don't use Gradle, so I have no specific advice on how to do that other than referring to the documentation.)
Hack the library jar which you want to be treated as a module to include a module-info.class or an Automatic-Module-Name in its manifest.
Or, switch to maven, which automatically places automatic modules on the module path.
The easiest way to do this, IMO, is to create a new JavaFX project in IDEA, then add the required dependencies as Maven dependencies and add your code.
Or, as swpalmer suggests in the comments, request that library maintainers update their codebase to make their libraries modular.
And, when you run your app, make sure all jars are on the module path, not the class path.
Or, make your app non-modular by removing the module-info.java from it, then manually place the JavaFX modules on the module-path and add them with the --add-modules switch.
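For the non-modular route, if you launch through Gradle's application plugin, the module path and --add-modules switch can be passed as JVM arguments. A rough sketch, where the SDK path and module list are placeholders you must adjust:

```groovy
// Hypothetical run configuration for a non-modular JavaFX app
run {
    jvmArgs = [
        '--module-path', '/path/to/javafx-sdk-17/lib',  // placeholder: your JavaFX SDK lib folder
        '--add-modules', 'javafx.controls,javafx.fxml'  // placeholder: the JavaFX modules you use
    ]
}
```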
FAQ
Are you SURE that automatic modules are put on the class path by Gradle?
From the Gradle documentation section Building Modules for the Java Module System:
To tell the Java compiler that a Jar is a module, as opposed to a traditional Java library, Gradle needs to place it on the so called module path. It is an alternative to the classpath, which is the traditional way to tell the compiler about compiled dependencies. Gradle will automatically put a Jar of your dependencies on the module path, instead of the classpath, if these three things are true:
java.modularity.inferModulePath is not turned off
We are actually building a module (as opposed to a traditional library), which we expressed by adding the module-info.java file. (Another option is to add the Automatic-Module-Name Jar manifest attribute as described further down.)
The Jar our module depends on is itself a module, which Gradle decides based on the presence of a module-info.class — the compiled version of the module descriptor — in the Jar. (Or, alternatively, the presence of an Automatic-Module-Name attribute in the Jar manifest.)
It is the third point that is key. Java can treat a library with no module-info.class and no Automatic-Module-Name in the Jar manifest as an automatic module if it is on the module path. However, Gradle will, by default, only place libraries which fulfill one of those two conditions on the module path.
Using jewelsea's answer above, I have been able to solve the problem. I am posting the answer here to help anyone else who encounters the problem in future.
So, the overall problem is, as said in the answer above, both SparseBitSet and commons-math3 are automatic modules with no module-info of their own. The solution that worked for me was to convert them into the modules expected by the project. Here are the steps I took:
Use the Gradle plugin 'extra-java-module-info'. The GitHub page didn't show how to apply it in a normal Gradle build file, so here it is:
plugins {
    id 'org.gradlex.extra-java-module-info' version '1.0'
}
Note the names that your application expects for the modules. In my case, from the error messages thrown, they were 'SparseBitSet' and 'commons-math3'
Locate the said libraries on the sidebar under 'external libraries' and note the 'jar' file names. In my case, they were 'commons-math3-3.6.1.jar' and 'SparseBitSet-1.2.jar'.
Add an 'extraJavaModuleInfo' section to your Gradle file and use the parameters as follows: module('jar file name', 'module name expected by your project', 'jar version'):
extraJavaModuleInfo {
    module('commons-math3-3.6.1.jar', 'commons.math3', '3.6.1')
    module('SparseBitSet-1.2.jar', 'SparseBitSet', '1.2')
}
That's it. Try to sync and run your project. Thanks jewelsea.
I've made a lint check in a simple Java-only (Kotlin) module, put it in a jar file, and I plan to upload it to Maven Central as a jar. I started by publishing to Maven local for testing.
I put the required Lint-Registry-v2 entry in the jar's manifest and I'm confident it's there. The lint check itself is there and working in tests, and the IssueRegistry is there. Then I reference this library in my Android project's build.gradle in an implementation clause, so it is on the classpath, and it is seen by Android Studio.
But the lint check still doesn't work and isn't seen by the lint tool: when I try to find it with the Run Inspection By Name... action, it's not there.
I even tried lintChecks 'my.lint.check:version', but that didn't work either.
Is it simply not possible? Do I have to distribute it through an aar on Maven, and is that the only option?
I am trying to learn how to use Gradle to make compiling and packaging my Java programs easier. I already have experience programming in Java, and usually I just use Eclipse to manage this, or I compile my programs manually (using javac in a terminal). However, I am getting to the point where I want to use libraries that seem to be easiest to use and maintain with Gradle.
To understand more about how Gradle works, I went to the Gradle site, which has tutorials for a simple Gradle setup for a Java application (this is the one I found). However, this tutorial didn't really explain how any of it was supposed to work, and it seemed to assume you were following some specific layout for your project. It also included some testing setup to test your program when you build it. And on top of that, it never really explained how running the gradle build command works, or where (and what) the file is that holds the instructions for what happens when you build the program.
So essentially, I am asking if someone can explain the steps to set up a simple Gradle environment that compiles a set of .java files and either puts them in an executable jar or just leaves them as .class files in a separate bin folder. It would also be helpful if you explained what each part does and how I can make changes to add more (like basic dependencies, or maybe adding .exe wrappers around the .jar).
Let's start with a simple question: What basic steps are required to build a Java project? Well, as long as your project is not using some fancy preprocessing or code generation, the first step will probably be the compilation of .java files to .class files. Those .class files (and resources, if provided) may then be packed into a .jar file. It may not seem obvious to you right now, but in general, you also should run (unit) tests when your project is built. Depending on your project, there may be many more possible steps (e.g. publishing, documentation, quality control).
In Gradle, those steps are called tasks. A task can basically do anything that helps you build your software. This often means that a task takes some input files and transforms them into some output files. As an example, a compilation task transforms .java files to .class files. Another task can transform .class files into a .jar file (let's call this the jar task). Since the .class files need to be created before they can be put into a .jar file, the jar task depends on the compilation task. You can express this relation in Gradle, so every time the jar task runs, Gradle will ensure that the compilation task has run beforehand. Tasks do not need to do anything; they may exist just to express such relations (such a task is called a lifecycle task). You may create tasks of a specific task type to reuse functionality; for example, two different tasks may be used to compile production code and test code, but both internally use a compiler and just operate on different sets of input files.
You can create tasks manually, and I definitely encourage you to create some tasks to understand how their relations and their execution work, but quite often you don't need to create tasks in your build scripts, because all the necessary tasks are created by plugins. Gradle plugins are really mighty: they can basically do everything you could do manually in your build script, and there is a plugin for almost everything you might want to do in your build process.
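The task relation described above can be sketched with two hand-written tasks; the task names here are made up purely for illustration:

```groovy
// A "compile"-like task and a "package"-like task, wired together with dependsOn
task compileStuff {
    doLast {
        println 'compiling .java files to .class files'
    }
}

task packageStuff {
    dependsOn compileStuff  // Gradle guarantees compileStuff runs first
    doLast {
        println 'packing .class files into a .jar file'
    }
}
```

Running gradle packageStuff executes compileStuff first and then packageStuff, which is exactly the relation between the real compilation and jar tasks.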
[...] I am asking if someone can explain the steps to take to make a simple gradle environment that simply compiles a set of .java files [...]
The easiest way to compile a Java project using Gradle is to create a file build.gradle with the following content:
plugins {
    id 'java'
}
That's it! Gradle is ready to build your project (just run gradle build). How is this possible? Well, the Java plugin creates all the necessary tasks to compile, test and pack your project and configures them to follow the common conventions. Take a look at the documentation of the Java plugin; it even includes a nice image that shows all the tasks and their dependencies. As shown in the image, the build task depends on all the other tasks and is executed when you call gradle build. You can also call gradle jar or gradle assemble, and Gradle will only run the tasks that are required to build the .jar file.
[...] it seemed to assume that you were following some specific system for the layout of your project.
Yes, Gradle follows an approach called convention over configuration. This means that there is a (somehow common or useful) convention for everything that otherwise would have to be configured manually.
A simple example for this approach is the location where source and resource files are expected. As you can see, this location is not configured in the build script above. However, there is a convention (established by Maven) that .java source files should go into src/main/java for production code and src/test/java for test code. Of course, those paths may be changed, but in most projects you should simply stick to the conventions.
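Under those conventions, a minimal project for the build script above would look roughly like this:

```
project-root/
├── build.gradle
└── src/
    ├── main/
    │   ├── java/        (production .java files)
    │   └── resources/   (production resources)
    └── test/
        └── java/        (test .java files)
```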
It would also be helpful if you explained what each part was doing and how I can make changes to add more stuff (like basic dependencies [...])
Let's simply take a look at the build file in your tutorial:
plugins {
    id 'application'
}

repositories {
    jcenter()
}

dependencies {
    testImplementation 'junit:junit:4.13'
    implementation 'com.google.guava:guava:29.0-jre'
}

application {
    mainClass = 'demo.App'
}
The first block is similar to the example above, but instead of the plugin with the identifier java the plugin with the identifier application is applied. However, this won't change much as the Application plugin internally applies the Java plugin, too.
Now let's take a look at the blocks repositories and dependencies. The dependencies block can be used to register dependencies. You may add dependencies on local .jar files, on other Gradle projects (in multi-project builds) or on external modules. The word external refers to remote repositories that provide libraries that you may use in your project. The lines inside the dependencies block each denote a specific dependency by defining the dependency scope and a module identifier that consists of a group identifier, an artifact identifier and a version (using : as a separator). The dependency scope (also called configuration in Gradle) basically defines where and how a dependency may be used. As an example, the first dependency can only be used in test code (due to the testImplementation scope). Now Gradle knows what library is required to build the project, but it does not know where to get that library. Here the repositories block comes to the rescue, because it can be used to define where Gradle should look for external module dependencies. Most of the time you will mainly use jcenter(), mavenCentral() or google(), however it is also possible to add repositories accessible under custom URLs.
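As an example of that last point, a repositories block can mix a well-known repository with a custom URL; the URL below is a placeholder, not a real repository:

```groovy
repositories {
    mavenCentral()
    maven {
        // hypothetical in-house repository
        url 'https://repo.example.com/releases'
    }
}
```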
The last part applies a configuration that is necessary, because no useful convention can be applied. Gradle simply does not know which class in your project should be used as the main class of your application, so you must define it manually.
Now, thanks to the Application plugin, you may not only build your project using gradle build, but also run your application from Gradle using gradle run, as the task run is one of the tasks created on top of the tasks created by the Java plugin.
I am trying to setup a Maven profile that uses a modified module-info.java for an otherwise unchanged project.
What I can do is:
create an alternative module-info.java in an alternative src directory.
add that source directory depending on a profile via the build-helper-maven-plugin in phase generate-sources
However, I have not yet managed to make it ignore the other (default) module-info.java file, so it reports a conflict (module already exists, etc.).
My question is: how can I add a requires directive to a module-info.java file depending on a Maven profile?
To explain the need for this (maybe strange) request: my build process uses Clover to perform an analysis of test coverage. Clover needs to instrument the library, and the Clover build step appears to require the clover.jar to be visible. The Clover step works if Clover is added to the module-info.java, but I do not want to have it there in the release build. So I would like to move the Clover step to a profile.
Currently, my build setup for a plugin is a bit messy: I'm using the normal IDEA project file to build the plugin locally. When I push it to the repo and Travis-CI builds it, it uses the Maven pom.xml, because for Travis to work, it always has to download the complete IDEA sources.
Although this works, this has several drawbacks:
I need to keep two build mechanisms up to date:
When a new IDEA version is out (every few weeks), I need to change the SDK in Maven and in my IDEA settings
When I add a new library, change resources, etc., I need to do this in both setups as well
I ran into problems when I kept the IDEA Maven plugin turned on, because it saw the pom.xml and interfered with my local build. Turning it off means I cannot download libraries with Maven, which has the feature of tracking dependencies.
I saw that Gradle has an 'idea' plugin and after googling, I got the impression that Gradle is the preferred choice these days. I have seen Best way to add Gradle support to IntelliJ IDEA and I'm sure I can use the answers there to turn my pom.xml into a valid build.gradle.
However, maybe someone else has already done this or can provide a better approach. What I'm looking for is a unified way to build my plugin locally and on Travis-CI.
Some Details
For compiling an IDEA plugin, you need its SDK, which you can access through an installation of IDEA or by downloading the complete package. Locally, I'm using my installation for the SDK. With Travis, my Maven build has a rule to download the tar.gz and extract it.
It turns out that in particular for building an IntelliJ plugin, Gradle seems to have many advantages. This is mainly due to the great IntelliJ plugin for Gradle which makes compiling plugins so much easier. With Gradle, I could turn my >220 lines of Maven build into a few lines of easily readable Gradle code. The main advantages are that
It takes care of downloading and using the correct IDEA SDK while you only have to specify the IDEA version.
It can publish your plugin to your Jetbrains repository and make it instantly available to all users
It fixes items in your plugin.xml, e.g. you can keep one central version number in build.gradle and it will keep plugin.xml up to date, or it can include change notes
It seamlessly integrates with Travis-CI
How to use Gradle with an existing IDEA plugin
Do it manually. It's much easier.
Create an empty build.gradle file
Look at an example and read through the README (there are many build.gradle files from real projects linked at the end) to see what each intellij property does.
Adapt it to your plugin by
Setting the intellij.version you want to build against
Setting your intellij.pluginName
Define where your sources and resources are
Define your plugin version
Define a Gradle wrapper that enables people (and Travis) to build your plugin without having Gradle
Create the gradle wrapper scripts with gradle wrapper
Test and fix your build process locally with ./gradlew assemble
If everything works well, you can push build.gradle, gradlew, gradlew.bat and the gradle-folder to your repo.
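As a rough sketch of what such a build.gradle can look like, assuming the gradle-intellij-plugin; the plugin version, IDEA version, and plugin name below are placeholders, so check the plugin's README for the syntax matching your setup:

```groovy
plugins {
    id 'java'
    id 'org.jetbrains.intellij' version '1.13.3'  // placeholder version
}

version = '0.1.0'  // single source of truth, patched into plugin.xml

intellij {
    version = '2022.1'        // placeholder: IDEA version to build against
    pluginName = 'my-plugin'  // placeholder plugin name
}

patchPluginXml {
    changeNotes = '<ul><li>Initial release</li></ul>'
}
```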
Building with Travis-CI
For Travis you want to use the gradlew script for building. To do so, you need to make it executable in the Travis run. An example can be found here.