I have a project where Gradle is not copying all test resources to the output dir, and I don't know if I am doing something wrong or if this is a bug in Gradle. I managed to create a simplified test case; the directory structure is as follows:
gradle/ # (contains Gradle wrapper files)
src/
    hello/
        Hello.java
tests/
    hello/
        hello.txt
    foo/
        bar.txt
build.gradle
gradlew
gradlew.bat
The contents of build.gradle are as follows:
apply plugin: 'java'

sourceSets {
    main {
        java.srcDirs = ['src']
        resources.srcDirs = ['src']
        resources.excludes = ['**/*.java']
    }
    test {
        java.srcDirs = ['tests']
        resources.srcDirs = ['tests']
        resources.excludes = ['**/*.java']
    }
}
task staging(type: Copy) {
    from processResources
    from processTestResources { include 'foo/' } // Offending line
    into "${buildDir}/staging"
}

task run(type: JavaExec) {
    dependsOn staging
    // [...]
}
When I run:
./gradlew processTestResources # or just ./gradlew test
Only the resources from tests/foo are copied to the output folder. The resources from tests/hello are not copied.
However, if I comment out the line marked as "Offending line" in the staging task, then all resources are copied.
Is this the expected behaviour? It looks like Gradle is trying to calculate which resources are needed and sees that only tests/foo is necessary for the staging task. But I am not running the staging task; I should be able to run the processTestResources or test tasks and have all test resources copied to the output folder.
Is this a bug in Gradle?
Update:
Here's a link to the relevant issue on GitHub (closed as invalid, since this is not really a bug, as explained in the accepted answer).
The issue is the way Groovy interprets the "offending line":
Without parentheses, Groovy interprets everything after from as the argument.
So it fetches the reference to the processTestResources task, applies the configuration closure to it, and then passes the result to from.
The end result is that you configured processTestResources in-line instead of configuring the from copy spec.
Adding parentheses changes the meaning by clearly indicating that the result of calling from with a task is what is to be configured:
from(processTestResources) { include 'foo/' }
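For reference, a sketch of the full staging task from the question with the fix applied:

```groovy
task staging(type: Copy) {
    from processResources
    // The parentheses make the closure configure the child copy spec
    // returned by from, rather than the processTestResources task itself.
    from(processTestResources) { include 'foo/' }
    into "${buildDir}/staging"
}
```

With this form, running processTestResources (or test) copies all test resources again, because the include filter is now scoped to the staging task only.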
Related
On a new environment, a Gradle build takes quite a while because all dependencies have to be downloaded.
Is there a way to only download the dependencies, in order to speed up the subsequent build?
That way we could for example already prefill a CI build environment.
Edit: Updated for Gradle 6+.
Some notes:
This new approach downloads jars into a folder and then deletes the folder, so having the jars in the Gradle cache is a side effect.
It currently uses the jars configured for the main source set but could be generalized.
Even though it is neither efficient nor elegant, it can be useful if you actually want the jars (and their transitive dependencies): simply comment out the deletion of the runtime folder.
Consider this build.gradle (as an arbitrary, concrete example):
apply plugin: 'java'

dependencies {
    implementation 'org.apache.commons:commons-io:1.3.2'
    implementation 'org.kie.modules:org-apache-commons-lang3:6.2.0.Beta2'
}

repositories {
    jcenter()
}

task getDeps(type: Copy) {
    from sourceSets.main.runtimeClasspath
    into 'runtime/'

    doFirst {
        ant.delete(dir: 'runtime')
        ant.mkdir(dir: 'runtime')
    }

    doLast {
        ant.delete(dir: 'runtime')
    }
}
Example run:
$ find /Users/measter/.gradle/caches -name "commons-io*1.3.2.jar"
$ gradle getDeps
$ find /Users/measter/.gradle/caches -name "commons-io*1.3.2.jar"
/Users/measter/.gradle/caches/modules-2/files-2.1/commons-io/commons-io/1.3.2/[snip]/commons-io-1.3.2.jar
I've found ./gradlew dependencies (as suggested by this user) to be very handy for Docker builds.
You can create a custom task that resolves all the configurations (in doing so, it will also download the dependencies without building the project):
task downloadDependencies {
    doLast {
        configurations.findAll { it.canBeResolved }.each { it.resolve() }
    }
}
Run it with ./gradlew downloadDependencies
My answer favors the Gradle plugins and built-in tasks.
I would use gradle assemble on the command line.
It is a lighter version of gradle build: it assembles the outputs without running checks and tests.
This way, you may reduce the time of your preparations before running or building anything.
Check the link below for the documentation:
https://docs.gradle.org/current/userguide/java_plugin.html#lifecycle_tasks
In general, my recipe when I clone a new repository is:
- gradle assemble
- do some coding
- gradle run (and basically test until done)
- gradle build (to make distributable files)
Note: this last step may need additional configuration for the .jar files produced as outputs (depends on your needs).
I'm attempting to use "HTTPBuilder" within my simple Groovy script. When I use '#Grab' to import the dependency, everything works fine. However, I'd like to keep the jar in a different directory and import it using the classLoader function. I've copied the 'http-builder-0.7.jar' that '#Grab' placed in my grapes directory and pasted it into the same directory my Groovy script runs in (on Windows). I then comment out the '#Grab' statement and include the classLoader call, but get this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException:
startup failed: C:\Groovy Scripts\test.groovy: 9: unable to resolve
class HTTPBuilder
Any ideas why the classLoader wouldn't be working in the script? I printed out the path of the jar when importing with '#Grab' and it's definitely using the one within the grape directory. If I uncomment the '#Grab' statement, it works again. Here's the small script...
//#Grab('org.codehaus.groovy.modules.http-builder:http-builder:0.7')
this.getClass().classLoader.rootLoader.addURL(new File("http-builder-0.7.jar").toURL())
//return new File(groovyx.net.http.HTTPBuilder.class.getProtectionDomain().getCodeSource().getLocation().toURI().getPath());
def http = new HTTPBuilder('http://httpbin.org/get')
As mentioned, you would be wise to use another method, such as Gradle's application plugin.
However, this is one way to do what you're asking.
First, to get the jar and all dependencies, consider the following Gradle build.gradle script:
apply plugin: 'java'

dependencies {
    compile 'org.codehaus.groovy.modules.http-builder:http-builder:0.7'
}

repositories {
    jcenter()
}

clean {
    doLast {
        ant.delete(dir: 'runtime')
    }
}

task getDeps(type: Copy) {
    from sourceSets.main.runtimeClasspath
    into 'runtime/'

    doFirst {
        ant.delete(dir: 'runtime')
        ant.mkdir(dir: 'runtime')
    }
}
If you run gradle getDeps, it will write all of the jars into the runtime directory.
Then, in a Unix terminal (for example), you could set the classpath with this (using wildcard syntax from Java 6+, and assuming the path is the same runtime as above):
export CLASSPATH=.:"/user/foo/some/path/runtime/*"
In the same terminal, this will work:
import groovyx.net.http.*
def http = new HTTPBuilder('http://httpbin.org/get')
println "Ready."
I'm taking over a project and have to assume that the tests worked at some point the way I find them (boiled down to what should not be null):
@Test
public void testCoding() {
    assertNotNull(getClass().getResourceAsStream("/myfile.json"));
    //assertNotNull(MyTest.class.getResourceAsStream("/myfile.json"));
    //assertNotNull(MyTest.class.getClassLoader().getResourceAsStream("myfile.json"));
    //...
}
with myfile.json in src/test/resources/ and the test in src/test/java/some/package/.
I tried putting myfile.json into the same folder as the test and into the src/java folder; I tried with and without a leading /, with /resources, and all the tips I found on SO.
The gradle file is:
apply plugin: 'java'

repositories {
    mavenCentral()
}

dependencies {
    testCompile "junit:junit:$junitVersion"
}
What am I missing?
Update:
I realized that the source code is in the open-source part of our project and that it only fails in Android Studio (AS).
If I right-click the lt-api/src/test/java/ folder and select "Run All Tests", the Gradle console prints out this line:
Executing tasks: [:lt-api:compileJava, :lt-api:testClasses, :mbwlib:compileJava, :mbwlib:testClasses, :bitlib:compileJava, :bitlib:testClasses]
Running
./gradlew clean :lt-api:compileJava :lt-api:testClasses :mbwlib:compileJava :mbwlib:testClasses :bitlib:compileJava :bitlib:testClasses
does not trigger the error, but neither does it run tests. (How can I tell what exactly AS is doing there? I thought AS used Gradle all the way now?)
Running
./gradlew clean :lt-api:test
I get the tests to run (adding a typo results in failing tests), but it doesn't trigger the issue I originally had and that I'd like to understand.
Where should the file be?
$ sudo updatedb
$ locate test-classes
$ locate ungargasse.json
/path/to/project/lt-api/build/resources/test/ungargasse.json
/path/to/project/lt-api/src/test/resources/ungargasse.json
Be sure that myfile.json is in test-classes after you compile the code.
(By default, the files in src/test/resources/ are automatically copied into test-classes during compilation.)
You can use getResourceAsStream without a leading "/" in two cases:
myfile.json is under src/test/resources in the same package path as the test class
src/test/java/some/package/Test.java
src/test/resources/some/package/myfile.json
Test.java and myfile.json are in the same package
src/test/java/some/package/Test.java
src/test/java/some/package/myfile.json
For example
@org.junit.Test
public void testCoding() {
    InputStream resourceAsStream = getClass().getResourceAsStream("myfile.json");
    System.out.println(resourceAsStream);
}
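The lookup rule above can be sketched in plain Java. The resolve method below is a simplified illustration of how Class.getResourceAsStream turns a resource name into an absolute classpath path (the class and file names are hypothetical):

```java
public class ResourceNameDemo {

    // Simplified sketch of Class#getResourceAsStream name resolution:
    // a leading "/" means "from the classpath root"; otherwise the name
    // is resolved relative to the class's package.
    static String resolve(String className, String resource) {
        if (resource.startsWith("/")) {
            return resource.substring(1);
        }
        int lastDot = className.lastIndexOf('.');
        if (lastDot < 0) {
            return resource; // class in the default package
        }
        String packagePath = className.substring(0, lastDot).replace('.', '/');
        return packagePath + "/" + resource;
    }

    public static void main(String[] args) {
        // Absolute: looked up from the classpath root.
        System.out.println(resolve("some.pkg.MyTest", "/myfile.json"));   // myfile.json
        // Relative: looked up next to the class's package.
        System.out.println(resolve("some.pkg.MyTest", "myfile.json"));    // some/pkg/myfile.json
    }
}
```

This is why, without a leading "/", the file must sit in the same package path as the test class.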
I have a gradle project that has an xslt file in resources:
src/main/resources/xslt.sec/sec_report.xslt
At time of build w/ gradle I'd like to use that file to overwrite:
src/test/resources/sec_report.xslt
That way my unit tests are always consuming the latest version, and there is one source of truth for this file. What is the right way to make this happen? Write a shell script and execute it from Gradle, or maybe add it to the build task?
You can add the file's directory as an additional resource directory of your test source set in your build.gradle (include patterns are relative to the source directory, so the full path from the question would never match):
sourceSets {
    test {
        resources {
            srcDir 'src/main/resources/xslt.sec'
        }
    }
}
This copies everything under xslt.sec into the root of the test resources output, next to the files from src/test/resources.
I'm using Spring REST Docs to generate documentation for our API.
I've added everything to build.gradle from the tutorial here: http://docs.spring.io/spring-restdocs/docs/current/reference/html5/
ext {
    snippetsDir = file('build/generated-snippets')
}

test {
    outputs.dir snippetsDir
}

asciidoctor {
    attributes 'snippets': snippetsDir
    inputs.dir snippetsDir
    outputDir "build/asciidoc"
    dependsOn test
    sourceDir 'src/main/asciidoc'
}

jar {
    dependsOn asciidoctor
    from("${asciidoctor.outputDir}/html5") {
        into 'static/docs'
    }
}
After I do gradle build, I can see that files are generated in the build/asciidoc directory and also in build/generated-snippets.
But when I run the bootRun Gradle task from IDEA and try to access localhost:8080/docs/index.html, I get a 404 Not Found. Just as a test, I put an index.html file under the resources/static directory, ran bootRun again, and could then access localhost:8080/index.html.
If I open my .jar file, I can see the static files under the BOOT-INF/classes/static/docs directory, so they are packed into the jar.
Maybe somebody had the same issue?
There are two things that you need to do so that the documentation is served when using bootRun. The first is to copy the generated documentation into a location that's on the classpath used by bootRun:
task copyRestDocs(type: Copy) {
    dependsOn asciidoctor
    from "${asciidoctor.outputDir}/html5"
    into "${sourceSets.main.output.resourcesDir}/static/docs"
}
Note that this new task depends on the asciidoctor task. This ensures that the documentation has been generated before it's copied.
Secondly, the bootRun task must depend on the new copyRestDocs task:
bootRun {
    dependsOn copyRestDocs
}