How to have multiple configurations of a task in Gradle builds?

I am currently using a plugin to run benchmarks on Java code, namely the jmh-gradle-plugin.
The plugin allows you to conveniently describe JMH configurations with a jmh block:
jmh {
    include = ["List"]
    benchmarkParameters = ["seed": ["1", "2", "3", "4", "5"],
                           "applicationSize": ["100", "1000"],
                           "baseStructureSize": ["100", "1000"]]
    fork = 2
    timeOnIteration = "250ms"
    iterations = 2
    warmup = "250ms"
    warmupIterations = 2
    resultFormat = "CSV"
    benchmarkMode = ["ss"]
    timeUnit = "ms"
    failOnError = true
}
This is useful, but I would like to have different variants of the same task, for instance one where the output is CSV and one where the output is JSON. I know the format can be configured with resultFormat=<format>, but I could not find a way to "duplicate" the task and give each variant its own configuration.
The Gradle documentation has a page about configuring tasks, but it configures a Copy task. I thought I could follow a similar approach and write:
task myJMH(type: me.champeau.gradle.JMHTask) {
    resultFormat = "JSON"
}
But this approach does not work, as I mentioned in this issue. I think it might be that the JMH task is just different: registering a task of that class works, but it's not possible to configure it. I get the following error:
Could not set unknown property 'include' for task ':myJMH' of type me.champeau.gradle.JMHTask.
Similarly, I would like to have various configurations of the shadowJar task, so that I can generate several different jar variants, but I ran into the same problem.

The jmh in your first example is not a task, but an extension. The plugin registers both an extension and a task with the same name; this is actually a common pattern for Gradle plugins.
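To illustrate the pattern (this is not the actual plugin code; the class, task and property names below are made up), a plugin of this shape essentially does the following:
// Minimal sketch of the "extension + task with the same name" pattern (all names illustrative)
class DemoExtension {
    String resultFormat = "CSV"
}
// The plugin registers an extension; a top-level 'demo { ... }' block configures this object.
def demoExt = extensions.create('demo', DemoExtension)
// It also registers a task with the same name that reads the extension at execution time.
tasks.register('demo') {
    doLast {
        println "running with resultFormat=${demoExt.resultFormat}"
    }
}
// This block configures the extension, not the task:
demo {
    resultFormat = 'JSON'
}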
Usually, even if the tasks created by a plugin can be configured through an extension, it is still possible to configure them directly, because they also expose configuration properties. This is the case for the task type ShadowJar, so you can simply create tasks of that type manually:
// Shadowing Test Sources and Dependencies
import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar
task testJar(type: ShadowJar) {
    classifier = 'tests'
    from sourceSets.test.output
    configurations = [project.configurations.testRuntime]
}
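In the same way you can declare several ShadowJar tasks side by side, each with its own configuration, which gives you the different jar variants you asked about. A sketch (task names and selected configurations are just an illustration):
// Uses the same ShadowJar import as above; two independently configured variants of the same task type
task mainFatJar(type: ShadowJar) {
    classifier = 'all'
    from sourceSets.main.output
    configurations = [project.configurations.runtime]
}
task testFatJar(type: ShadowJar) {
    classifier = 'tests'
    from sourceSets.test.output
    configurations = [project.configurations.testRuntime]
}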
Sadly, the task type JMHTask is implemented in such a way that it simply retrieves its configuration from the extension, so every instance will use the same configuration.
However, you may try the following workaround:
Create tasks that reconfigure the extension and wire them to be executed together with the jmh task:
jmh {
    include = ["List"]
    benchmarkParameters = ["seed": ["1", "2", "3", "4", "5"],
                           "applicationSize": ["100", "1000"],
                           "baseStructureSize": ["100", "1000"]]
    fork = 2
    timeOnIteration = "250ms"
    iterations = 2
    warmup = "250ms"
    warmupIterations = 2
    benchmarkMode = ["ss"]
    timeUnit = "ms"
    failOnError = true
}
task jmhCsv {
    doFirst {
        jmh.resultFormat = "CSV"
    }
    finalizedBy 'jmh'
}
task jmhJson {
    doFirst {
        jmh.resultFormat = "JSON"
    }
    finalizedBy 'jmh'
}
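With this in place, running for example gradle jmhJson first flips resultFormat on the jmh extension in its doFirst action and then triggers the actual jmh task through finalizedBy.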
Please note that a task may only be executed once in a build, so this workaround won't work if you want to run different configurations in the same run.

Related

How do we properly create an execution-phase task?

Here is the build script I currently have (see below).
I want to execute the task "unzip_native_os" manually, but it seems to only work at the configuration phase. When I do a test run with this setup it gives me a java.lang.UnsatisfiedLinkError, yet if I change the configuration from "nativesOS" to "runtimeOnly" inside the dependencies block, it works fine. Do I have to explicitly set "applicationDefaultJvmArgs" and insert the library path of the natives, or is there another way? Also, when I need to unzip the "nativesOS" configuration it needs an explicit version; it seems it does not see the platform/BOM.
// build.gradle.kts
val nativesOS: Configuration by configurations.creating {
    this.isTransitive = false
    this.extendsFrom(configurations.runtimeOnly.get())
}
dependencies {
    implementation(platform("org.lwjgl:lwjgl-bom:3.2.3"))
    listOf(
        "", "-assimp", "-openal",
        "-opengl", "-glfw"
    ).map { lib ->
        implementation("org.lwjgl:lwjgl$lib")
        // I give it an explicit version, because it would not work if I unzip this.
        nativesOS("org.lwjgl", "lwjgl$lib", "3.2.3", classifier = LWJGL.lwjglNatives)
    }
    ...
}
// unzip_native_os task, here is the problem.
tasks.register<Copy>("unzip_native_os") {
    this.group = "zip"
    doLast {
        nativesOS.asFileTree.filter { it.name.contains("natives") }.forEach {
            unzipTo(File("$buildDir/libs/natives-os"), it)
        }
    }
}
Edit: Why is this not working? I configure it first and then execute it.
tasks.register<Copy>("unzip_native_os") {
    this.group = "zip"
    val nativesJar = nativesOS.asFileTree.filter { it.name.contains("natives") }.files
    doFirst {
        nativesJar.forEach {
            println(">>>> $it")
            unzipTo(File("$buildDir/libs/natives-os/v2"), it)
        }
    }
}
Edit: I found a possible answer that looks promising, but I have not implemented it yet because I still have some learning to do on building this kind of script plugin/inline plugin. Here's the link: gradle custom task execution phase
Edit: I found an alternative quick solution here: Fix custom tasks in gradle. Want to run it manually via at Execution Phase

Why are there 2 ways to create a task in Gradle?

I'm currently writing a Gradle script to automate some builds. However, it seems that there are two ways to create tasks. Which one should I use, and why are there different ways in the first place?
task copy(type: Copy, group: "Custom", description: "Copies sources to the dest directory") {
    from "src"
    into "dest"
}
vs
tasks.register("gutenTag", Greeting) {
group = 'Welcome'
description = 'Produces a German greeting'
message = 'Guten Tag'
recipient = 'Welt'
}
The first is one of the older methods of adding a task to a build.
The second uses register, which enables task configuration avoidance:
https://docs.gradle.org/current/userguide/task_configuration_avoidance.html
That is, the task is only configured if it is actually needed in the build.
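A quick way to see the difference is to put a println in the configuration block of each style. In the sketch below (task names are arbitrary), the eagerly created task prints its configuration message on every build invocation, while the registered task typically only prints when it is part of the requested build:
// Eagerly created: the configuration closure runs on every build invocation.
task eagerHello {
    println 'configuring eagerHello'
    doLast {
        println 'running eagerHello'
    }
}
// Lazily registered: the configuration action only runs if the task is needed.
tasks.register('lazyHello') {
    println 'configuring lazyHello'
    doLast {
        println 'running lazyHello'
    }
}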

Jenkins: Is there any API to see test reports remotely?

I'm using Jenkins as a CI tool. I used the REST API to build a job remotely, but I don't know how to get the test results remotely as well.
I couldn't be more thankful if anybody knows a solution.
Use the XML or JSON API. On most pages in Jenkins you can add /api/ to the URL and get data in XML, JSON and similar formats. So for a job you can go to <Jenkins URL>/job/<Job Name>/api/xml and get information about the job, its builds, etc. For a build you can go to <Jenkins URL>/job/<Job Name>/<build number>/api/xml and you will get a summary for the build. Note that you can use the lastXXXBuild permalinks in order to get the last successful, stable, failing or completed build, like this: <Jenkins URL>/job/<Job Name>/lastCompletedBuild/api/xml.
Additionally, if you're using any plugin which publishes test results to the build, then for a given job you can go to <Jenkins URL>/job/<Job Name>/lastCompletedBuild/testReport/api/xml and you will get an XML report with the results.
There is a lot more to it; you can control what is exported with the tree parameter and the depth parameter. For a summary go to <Jenkins URL>/api/
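For example, to trim the test report down to just the failing counts and individual cases, you can pass a tree expression such as the following (the exact field names depend on your Jenkins and plugin versions, so treat this as a starting point):
<Jenkins URL>/job/<Job Name>/lastCompletedBuild/testReport/api/json?tree=failCount,suites[cases[className,name,status,errorDetails]]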
Well, if you are using a Jenkins shared library, or decided to permit the security exceptions (a less good approach), then you can access the results from a job and send them out to wherever you like (push vs. pull):
def getCurrentBuildFailedTests() {
    def failedTests = []
    def build = currentBuild.build()
    def action = build.getActions(hudson.tasks.junit.TestResultAction.class)
    if (action) {
        def failures = build.getAction(hudson.tasks.junit.TestResultAction.class).getFailedTests()
        println "${failures.size()} Test Results Found"
        for (def failure in failures) {
            failedTests.add(['name': failure.name, 'url': failure.url, 'details': failure.errorDetails])
        }
    }
    return failedTests
}
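For example, from a scripted pipeline (or a shared library step) you could call it after the JUnit results have been recorded and forward the failures wherever you need them. The fragment below is only an illustration of how the returned list might be consumed:
// Scripted pipeline fragment (illustrative): report failed tests after the results are published.
node {
    // ... checkout, build, junit '**/build/test-results/**/*.xml', etc. ...
    def failed = getCurrentBuildFailedTests()
    failed.each { t ->
        echo "FAILED: ${t.name} -> ${t.url}"
    }
}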

How to order the 'running' of configuration tasks in Gradle?

I've currently got a configuration task being depended on by an execution task that I am calling from the command line, i.e.:
task deployTest(dependsOn: [build, assembleForTest]) << {
    ...
}
This task should essentially grab the files I've assembled in assembleForTest and then deploy them (ssh, etc.)
My assembleForTest code:
task assembleForTest(type: Sync) {
    fileMode = 0775
    from("scripts") {
        include "**/*.cgi"
        filter(org.apache.tools.ant.filters.ReplaceTokens,
               tokens: [programName: programName,
                        version: version,
                        dbServer: dbServerTest,
                        deployDir: deployDirTest])
    }
    from files("scripts/" + programName + ".cgi")
    from files("build/libs/" + programName + "-" + version + ".jar")
    into("build/" + programName)
}
But the problem is: my project gets built AFTER this configuration task assembleForTest has run, i.e. it will try to build after the assembly is finished, which means an outdated (or nonexistent) deployment is attempted.
Gradle has some of the worst documentation I've seen; I have worked with it for a while and I still don't understand the ideal setup.

Wouldn't this solve your problem?
task assembleForTest(type: Sync, dependsOn: build) {
    /* configuration phase, evaluated always and before execution phase of any task */
    ...
}
task deployTest(dependsOn: assembleForTest) << {
    /* execution phase, evaluated only if the task is invoked and after configuration phase for all tasks has been finished */
    ...
}
EDIT: I added comments within the example. Note that the first task is given a configuration block while the second is given an action; the switch is done with the left shift operator (<<). An alternative syntax, especially helpful for combining both phases in one definition, looks as follows:
task plop() {
    // some configuration
    ...
    doLast {
        // some action
        ...
    }
}
If you put a println in place of 'some configuration', it always prints, regardless of which task is invoked, because that part is evaluated in the configuration phase.
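To make that concrete, here is a small self-contained sketch (the task name is arbitrary) that you can drop into a build script: the first message appears on every invocation, e.g. gradle help, while the second only appears when the task itself is run.
task phaseDemo {
    // Configuration phase: printed on every build invocation, even if phaseDemo is not requested.
    println 'phaseDemo: configuration phase'
    doLast {
        // Execution phase: printed only when phaseDemo is actually executed.
        println 'phaseDemo: execution phase'
    }
}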

How do I declare gradle Antlr task output specs to avoid unnecessary rebuilds

I have a typical Antlr 4.5 project with two grammar files: MyLexer.g4 and MyParser.g4. From them, Antlr generates 6 output files: MyLexer.java, MyLexer.tokens, MyParser.java, MyParser.tokens, MyParserBaseListener.java and MyParserListener.java. The gradle tasks are all working correctly so that the output files are all generated, compiled and tested as expected.
The problem is that Gradle sees the 6 target files as always being out of date, so every run or debug session has to regenerate them and therefore has to recompile the main Java project even if none of the source files have changed.
The Gradle task which generates the files has its output spec defined as the folder into which the 6 output files are generated. I think I need a way to define it as the 6 specific files rather than the output folder; I just don't know the syntax to do that.
Here's the pertinent part of my build.gradle file:
ext.antlr4 = [
    antlrSource: "src/main/antlr",
    destinationDir: "src/main/java/com/myantlrquestion/core/antlr/generated",
    grammarpackage: "com.myantlrquestion.core.antlr.generated"
]
task makeAntlrOutputDir << {
    file(antlr4.destinationDir).mkdirs()
}
task compileAntlrGrammars(type: JavaExec, dependsOn: makeAntlrOutputDir) {
    // Grammars are conveniently sorted alphabetically. I assume that will remain true.
    // That ensures that files named *Lexer.g4 are listed and therefore processed before the corresponding *Parser.g4.
    // It matters because the Lexer must be processed first since the Parser needs the .tokens file from the Lexer.
    // Also note that the output file naming convention for combined grammars is slightly different from separate Lexer and Parser grammars.
    def grammars = fileTree(antlr4.antlrSource).include('**/*.g4')
    def target = file("${antlr4.destinationDir}")
    inputs.files grammars
    // TODO: This output spec is incorrect, so this task is never considered up to date.
    // TODO: Tweak the outputs collection so it is correct with combined grammars as well as separate Lexer and Parser grammars.
    outputs.dir target
    main = 'org.antlr.v4.Tool'
    classpath = configurations.antlr4
    // Antlr command line args are at https://theantlrguy.atlassian.net/wiki/display/ANTLR4/ANTLR+Tool+Command+Line+Options
    args = ["-o", target,
            "-lib", target,
            //"-listener", //"-listener" is the default
            //"-no-visitor", //"-no-visitor" is the default
            "-package", antlr4.grammarpackage,
            grammars.files
           ].flatten()
    // include optional description and group (shown by ./gradlew tasks command)
    description = 'Generates Java sources from ANTLR4 grammars.'
    group = 'Build'
}
compileJava {
    dependsOn compileAntlrGrammars
    // this next line isn't technically needed unless the antlr4.destinationDir is not under buildDir, but it doesn't hurt either
    source antlr4.destinationDir
}
task cleanAntlr {
    delete antlr4.destinationDir
}
clean.dependsOn cleanAntlr
I discovered that the problem was not that the target files were out of date, but rather that, due to a bug in the cleanAntlr task, they were being deleted every time any Gradle task was run. The problem was that all of the code in cleanAntlr was being run during Gradle's initialization and configuration phases, even if the cleanAntlr task itself wasn't being executed.
Originally, the task was defined as:
task cleanAntlr {
    delete antlr4.destinationDir
}
clean.dependsOn cleanAntlr
The solution was to define it like this: (Note the "<<" after the task name.)
task cleanAntlr << {
    delete antlr4.destinationDir
}
clean.dependsOn cleanAntlr
... or, for additional clarity, use this more verbose, but functionally equivalent task definition:
task cleanAntlr {
    doLast {
        // Be sure to wrap the execution phase code inside doLast().
        // Otherwise it will run during the initialization or configuration phase, even when an unrelated task is run.
        // It would also run when the NetBeans IDE first loaded the project.
        //println 'Deleting Antlr Directory: ' + antlr4.destinationDir
        delete antlr4.destinationDir
    }
}
clean.dependsOn cleanAntlr
With that bug fixed, the original outputs specification for the compileAntlrGrammars task works correctly. There is no need to specify each individual output file. This is explained quite well in section 15.9.2 of https://gradle.org/docs/current/userguide/more_about_tasks.html.
def grammars = fileTree(antlr4.antlrSource).include('**/*.g4')
def target = file("${antlr4.destinationDir}")
inputs.files grammars
outputs.dir target
Could you please try the following piece of code (placed inside the compileAntlrGrammars task, where target is defined):
def generatedFiles = ['MyLexer.java',] // and so on..
generatedFiles.each { f -> outputs.file("$target/$f") }
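For reference, with the six generated files listed in the question, that per-file output spec might look like this (file names taken from the question; adjust them to match your grammars):
def generatedFiles = ['MyLexer.java', 'MyLexer.tokens',
                      'MyParser.java', 'MyParser.tokens',
                      'MyParserBaseListener.java', 'MyParserListener.java']
// Declare each generated file as an individual task output instead of the whole directory.
generatedFiles.each { f -> outputs.file("$target/$f") }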
