I'm working on an OSGi bundle using bndtools in Eclipse. I just added a new bundle (apache commons logging) to the cnf/localrepo repository, and our CI server is now failing the build:
[Gradle] - Launching build.
[workspace] $ gradle build
:mybundle : Cannot find /error/com.springsource.org.apache.commons.logging;version=0 Not found in [bnd-cache, Release, Local, Bndtools Hub, /var/lib/jenkins/jobs/myapp/workspace/cnf/nonosgi-repo r/w=true]
Error : com.springsource.org.apache.commons.logging;version=0 Not found in [bnd-cache, Release, Local, Bndtools Hub, /var/lib/jenkins/jobs/myapp/workspace/cnf/nonosgi-repo r/w=true]
Project names lightly obfuscated just for simplicity.
It appears to me that the Gradle plugin doesn't refresh the repository index: if one of my teammates updates from our VCS without refreshing in Eclipse, they get the same error.
I know bndtools has the org.osgi.impl.bundle.repoindex.cli plugin, but I don't know enough about bndtools or gradle to apply it to my project. I also feel as though either (a) the gradle plugin should refresh the repositories on its own or (b) I'm using the repositories incorrectly.
Is it possible to add a task to our build.gradle that refreshes the indexes before a build?
Should we instead move all our dependencies to an online repository so bnd doesn't need to manage the indexes?
What kind of repo is cnf/localrepo? If it is a FileRepo, then you don't need an index: you just put the bundles in a folder/filename layout based on the bundle's bsn/version. If it is an indexed repo, then you must maintain the index and commit it along with the new bundles added to the repo. This is how we manage the Bundle-Hub repo: whenever a new bundle is added, we update the index.
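For illustration only (the bundle version shown is made up, and the exact pattern depends on how the repo is configured), a FileRepo layout typically looks something like:

cnf/localrepo/
    com.springsource.org.apache.commons.logging/
        com.springsource.org.apache.commons.logging-1.1.1.jar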
As for the Gradle plugin, you can write a task to reindex your repo on each build. See https://groups.google.com/forum/#!searchin/bndtools-users/index$20gradle/bndtools-users/OQ0Ns5v0ELo/JOB803lBBwAJ for a discussion of how to do this.
Should we instead move all our dependencies to an online repository so bnd doesn't need to manage the indexes?
Probably, but for various reasons it's easier in the short-term to continue to use our LocalIndexedRepository.
Is it possible to add a task to our build.gradle that refreshes the indexes before a build?
bndtools' Bundle-Hub repository uses the org.osgi.impl.bundle.repoindex.cli plugin to reindex the repository in its build.gradle. Since I'm not fluent in groovy or gradle, I simply copied its code into cnf/localrepo/build.gradle:
repositories {
    mavenCentral()
}

configurations {
    repoindex
}

dependencies {
    repoindex group: 'biz.aQute.bnd', name: 'org.osgi.impl.bundle.repoindex.cli', version: '3.0.0'
}

defaultTasks = [':index']

task('index') {
    /* Bundles to index. */
    def bundles = fileTree(projectDir) {
        include '**/*.jar'
        exclude '**/*-latest.jar'
        exclude '.*/'
    }

    doLast {
        javaexec {
            main = '-jar' // first arg must be the jar
            args configurations.repoindex.singleFile
            args '-n', 'Local' // REPO NAME HERE
            args bundles*.absolutePath
        }.assertNormalExitValue()
    }
}
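With defaultTasks set to :index, running gradle from inside cnf/localrepo regenerates the index. In CI that might look something like the following sketch (it assumes the repo's build.gradle can be run as a standalone build, as Bundle-Hub's can):

(cd cnf/localrepo && gradle)
gradle build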
Thanks to BJ Hargrave for pointing me in the direction of Bundle-Hub's scripts.
Related
I can successfully add a generated OpenAPI client to my project via source sets, but then I have to copy its dependencies into the main build.gradle and resolve conflicts. I think it would be a better design to have the client as a subproject with its own build.gradle.
So I add include = 'build:openapi-java-client' to my settings.gradle and compile project(':build:openapi-java-client') to my dependencies, so that I have the following files:
build.gradle:
plugins {
    id 'java'
    id 'application'
    id "org.openapi.generator" version "4.3.1"
}

repositories {
    jcenter()
}

openApiGenerate {
    generatorName = "java"
    inputSpec = "$rootDir/specs/petstore.yaml".toString()
    outputDir = "$buildDir/openapi-java-client".toString()
    apiPackage = "org.openapi.example.api"
    invokerPackage = "org.openapi.example.invoker"
    modelPackage = "org.openapi.example.model"
    configOptions = [
        dateLibrary: "java8"
    ]
}

dependencies {
    implementation 'com.google.guava:guava:29.0-jre'
    testImplementation 'junit:junit:4.13'
    compile project(':build:openapi-java-client')
}

application {
    mainClassName = 'a.aa.App'
}
and settings.gradle:
rootProject.name = 'simple-java-app'
include = 'build:openapi-java-client'
I execute openApiGenerate in advance; after adding it as a subproject, I do Gradle -> Refresh Gradle Project and Refresh.
Eclipse then shows me a problem:
Could not run phased build action using Gradle distribution 'https://services.gradle.org/distributions/gradle-6.5.1-bin.zip'.
Settings file 'C:\...\simple-java-app\settings.gradle' line: 11
A problem occurred evaluating settings 'simple-java-app'.
Could not set unknown property 'include' for settings 'simple-java-app' of type org.gradle.initialization.DefaultSettings.
I don't know where to go from here, addressing subprojects in subfolders worked just fine when I worked through https://guides.gradle.org/creating-multi-project-builds/ and put greeting-library in a subfolder.
You are trying to make build/ a project when that directory specifically is not meant to be a project directory. It is Gradle's default build output directory, and likely 99% of other Gradle plugins and tools assume as much.
Simply change the output directory to something other than build/:
openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$rootDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
}
Then include the project in your build with the correct syntax:
// settings.gradle
include("openapi-java-client")
However, using the org.openapi.generator plugin seems to generate an invalid build.gradle, since I get the following error:
FAILURE: Build failed with an exception.
* Where:
Build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle' line: 23
* What went wrong:
Could not compile build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle'.
> startup failed:
build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle': 23: unexpected char: '\' # line 23, column 35.
main.java.srcDirs = ['src/main\java']
This obviously won't work the way you wanted, since it appears to be an issue with the Gradle plugin itself. If you just need to include the generated code in your project, then include the generated Java code as part of your main Java source:
openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$buildDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
}

tasks {
    compileJava {
        dependsOn(openApiGenerate)
    }
}

sourceSets {
    main {
        java {
            srcDir(files("${openApiGenerate.outputDir.get()}/src/main"))
        }
    }
}
But with this approach, you'll run into missing imports/dependencies. This plugin doesn't appear to offer a way to generate only the models/POJOs, but after updating the library property to native and manually including a few missing dependencies, it all works:
plugins {
    java
    id("org.openapi.generator") version "5.0.0-beta"
}

repositories {
    mavenCentral()
}

group = "io.mateo.test"

dependencies {
    implementation(platform("com.fasterxml.jackson:jackson-bom:2.11.1"))
    implementation("com.fasterxml.jackson.core:jackson-databind")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-jsr310")
    implementation("org.openapitools:jackson-databind-nullable:0.2.1")
    implementation("com.google.code.findbugs:jsr305:3.0.2")
    implementation("io.swagger:swagger-core:1.6.2")
}

openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$buildDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
    library.set("native")
    configOptions.put("dateLibrary", "java8")
}

tasks {
    compileJava {
        dependsOn(openApiGenerate)
    }
}

sourceSets {
    main {
        java {
            srcDir(files("${openApiGenerate.outputDir.get()}/src/main"))
        }
    }
}
You cannot configure it like this, because build most certainly is an output directory, which would create a circular reference. Better to add a new module and apply the generator plugin in that module. If you can configure another module as the outputDir, that module can then be referenced.
Even if the plugin resides in the root project, the destination needs to be a module.
The point is that the root project always executes, as opposed to module configurations.
I’ve just answered a very similar question. While my answer there is not perfect, I would personally still prefer the approach suggested there – and kind of repeated here:
Suggested Approach
I would keep the builds of the modules that depend on the generated API completely separate from the build that generates the API. The only connection between such builds should be a dependency declaration. That means, you’ll have to manually make sure to build the API generating project first and only build the dependent projects afterwards.
By default, this would mean to also publish the API module before the dependent projects can be built. An alternative to this default would be Gradle composite builds – for example, to allow you to test a newly generated API locally first before publishing it. However, before creating/running the composite build, you would have to manually run the API generating build each time that the OpenAPI document changes.
Example
Let’s say you have project A depending on the generated API. Its Gradle build would contain something like this:
dependencies {
    implementation 'com.example:api:1.0'
}
Of course, the simple-java-app build described in the question would have to be adapted to produce a module with these coordinates:
openApiGenerate {
    // …
    groupId = "com.example"
    id = "api"
    version = "1.0"
}
Before running A’s build, you’d first have to:
1. Run ./gradlew openApiGenerate from your simple-java-app project.
2. Run ./gradlew publish from the simple-java-app/build/openapi-java-client/ directory.
Then A’s build could fetch the published dependency from the publishing repository.
Alternatively, you could drop step 2 locally and run A’s build with an additional Gradle CLI option:
./gradlew --include-build $path_to/simple-java-app/build/openapi-java-client/ …
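Equivalently, the included build can be declared in A’s settings.gradle instead of on the command line; a minimal sketch (the relative path is hypothetical and depends on your layout):

// settings.gradle of project A
includeBuild('../simple-java-app/build/openapi-java-client')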
I have a Gradle project which creates a zip artifact. I define the artifact via artifacts.add('default', zipTask). I add this project to another project via includeBuild and use the zip as dependency (dependencies { myConfiguration 'org.example:testA:+#zip' }).
So far so good. It works.
The problem starts when I add the plugin java to the first project. For some reason it prevents Gradle from finding the zip artifact.
The error is:
Execution failed for task ':doubleZipTask'.
> Could not resolve all files for configuration ':myConfiguration'.
> Could not find testA.zip (project :testA).
Why? How to fix it?
Complete example:
Project testA
settings.gradle:
rootProject.name = 'testA'
build.gradle:
plugins {
    id 'base'
    // Uncomment the line below to break the zip artifact
    //id 'java'
}

group = 'org.test'
version = '0.0.0.1_test'

task zipTask(type: Zip) {
    from './settings.gradle' // just so the zip isn't empty
}

artifacts.add('default', zipTask)
Project testB
settings.gradle:
rootProject.name = 'testB'
// This line may be commented out in some cases and then the artifact should be downloaded from Maven repository.
// For this question it should be always uncommented, though.
includeBuild('../testA')
build.gradle:
plugins {
    id 'base'
}

configurations {
    myConfiguration
}

dependencies {
    myConfiguration 'org.test:testA:0.0.0.+#zip'
}

task doubleZipTask(type: Zip) {
    from configurations.myConfiguration
}
Update 1
I've added some diagnostic code at the end of the build.gradle:
configurations.default.allArtifacts.each() {
    println it.toString() + ' -> name: ' + it.getName() + ', extension: ' + it.getExtension()
}
and in the version with java plugin it prints:
ArchivePublishArtifact_Decorated testA:zip:zip: -> name: testA, extension: zip
org.gradle.api.internal.artifacts.dsl.LazyPublishArtifact#2c6aaa5 -> name: testA, extension: jar
However, I'm not sure if an additional artifact can break something.
It doesn't seem to be a problem when I add a second artifact myself.
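For example, adding something along these lines (a made-up second zip, shown only to illustrate what I mean):

task extraZipTask(type: Zip) {
    archiveClassifier = 'extra' // Gradle 5.1+ property syntax
    from './build.gradle' // just so the zip isn't empty
}

artifacts.add('default', extraZipTask)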
Update 2
Maybe the zip file isn't the best representation of my intentions. After all, I could build the Java-related files in one project and zip them in another.
However, the problem also applies to war files. (The War plugin internally applies the Java plugin, so it cannot be used without it.)
The issue seems to be a bug in Gradle where composite builds and references to artifacts are broken.
Some discussion is here: https://discuss.gradle.org/t/composite-build-cant-use-included-artifact-in-buildsrc-build-gradle/24978
Bug report: https://github.com/gradle/gradle/issues/3768
A workaround would be to move the artifact dependency to a task dependency:
plugins {
    id 'base'
}

configurations {
    myConfiguration
}

dependencies {
}

task doubleZipTask(type: Zip) {
    dependsOn gradle.includedBuild('testA').task(':zipTask')
    from configurations.myConfiguration
}
The following setup should work with Gradle 5.6 (when using another attribute it will likely also work with previous versions). It mostly corresponds to your original setup with the exception of the changes indicated by XXX.
Project testA
settings.gradle:
rootProject.name = 'testA'
build.gradle:
plugins {
    id 'base'
    // Uncomment the line below to break the zip artifact
    //id 'java'
}

group = 'org.test'
version = '0.0.0.1_test'

task zipTask(type: Zip) {
    from './settings.gradle' // just so the zip isn't empty
}

// XXX added an attribute to the configuration
configurations.default.attributes {
    attribute(LibraryElements.LIBRARY_ELEMENTS_ATTRIBUTE,
            project.objects.named(LibraryElements, 'my-zipped-lib'))
}

artifacts.add('default', zipTask)
Project testB
settings.gradle:
rootProject.name = 'testB'
// This line may be commented out in some cases and then the artifact should be downloaded from Maven repository.
// For this question it should be always uncommented, though.
includeBuild('../testA')
build.gradle:
plugins {
    id 'base'
}

configurations {
    // XXX added the same attribute as in the testA project
    myConfiguration {
        attributes {
            attribute(LibraryElements.LIBRARY_ELEMENTS_ATTRIBUTE,
                    project.objects.named(LibraryElements, 'my-zipped-lib'))
        }
    }
}

dependencies {
    myConfiguration 'org.test:testA:0.0.0.+#zip'
}

task doubleZipTask(type: Zip) {
    from configurations.myConfiguration
}
I have tested this setup both with and without the java plugin. I’ve also tested publishing to a Maven repository and letting testB take its dependency from there instead of from the included build of testA. Adding an additional dependency from testB on the JAR artifact of testA (myConfiguration 'org.test:testA:0.0.0.+#jar') worked, too.
Some explanation on what (I believe) is going on: Gradle needs a way of determining automatically which local component/artifact in testA it can use for substituting the external dependency of testB.
Without applying the java plugin, there is only a single component/artifact and my guess is that Gradle then just selects that single one without further ado.
As you saw, by applying the java plugin, another artifact is added to testA. Now which one should Gradle take? One would expect that it would look at the file extension specified on the dependency in testB but that doesn’t seem to be the case. It seems that Gradle isn’t actually working at the artifacts level when substituting dependencies but rather at the module/component level. You might say we only have one component with two artifacts, so selecting the one component should be straightforward. But it seems that we actually have two variants of the same component and Gradle wants to select one of these variants. In your own testB setup, there is no clue given that would tell Gradle which variant to select; so it fails (with an admittedly bad/misleading error message). In my changed testB setup I provide the clue: I tell Gradle that we want the variant that has a specific attribute with the value my-zipped-lib. Since I’ve added the same attribute on the published configuration of testA, Gradle is now able to select the right variant (because there is only one that has the required attribute). The file extension on the dependency is still relevant in a second step: once Gradle has selected the component variant, it still needs to select the right artifact – but only then.
Note that we’re actually working on the rim of what is supported with composite builds today. See also Gradle issue #2529 which states that “projects that publish non-jar artifacts” were not terribly well supported. When I first saw your question I honestly thought we’d be out of luck here … but it seems there’s a way to again step a little closer to the rim ;-)
In the comments, the question came up why adding multiple custom artifacts works while applying the java plugin breaks the build. As I’ve tried to explain above, it’s not a question of multiple artifacts but of multiple component variants. IIUIC, these variants originate from different attributes on configurations. When you don’t add such attributes, then you will not have different components variants. However, the Java plugin does add such attributes which consequently leads to different component variants in your project(/component). If you’re interested, you can see the different attributes by adding something like the following to your build.gradle:
configurations.each { conf ->
    println "Attributes of $conf:"
    conf.attributes.keySet().each { attr ->
        println "\t$attr -> ${conf.attributes.getAttribute(attr)}"
    }
}
Now where and when to add which attributes? That depends on your setup. I wouldn’t blindly add attributes to all configurations in the hope that that’ll solve things magically. While it might work (depending on your setup), it’d certainly not be clean. If your project setup is as complex and/or special as your question suggests, then it’ll probably make sense to think more deeply about which configurations you need and what attributes they should carry. The first sections of the Gradle documentation page that I have linked above are probably a good starting point if you are not intricately familiar with these nuances of configurations in Gradle, yet. And yes, I agree that such logic would best live in a Gradle plugin.
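As a rough sketch only (the attribute value and the configuration names are the hypothetical ones from above), such shared logic could apply the attribute in one place:

// applies the custom attribute to the relevant configurations, if they exist
def zippedLib = project.objects.named(LibraryElements, 'my-zipped-lib')
configurations.matching { it.name in ['default', 'myConfiguration'] }.configureEach {
    attributes.attribute(LibraryElements.LIBRARY_ELEMENTS_ATTRIBUTE, zippedLib)
}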
On a new environment, a Gradle build takes quite a while because all dependencies have to be downloaded.
Is there a way to only download the dependencies, in order to speed up the subsequent build?
That way we could, for example, prefill a CI build environment.
Edit: Updated for Gradle 6+.
Some notes:
This new approach downloads jars into a folder, and then deletes the folder. So the result of having the jars in the Gradle cache is a side-effect.
It currently uses jars configured for the main source-set but could be generalized.
Even though it is neither efficient nor elegant, it can be handy if you actually want the jars (and transitive dependencies): simply comment out the deletion of the runtime folder.
Consider this build.gradle (as an arbitrary, concrete example):
apply plugin: 'java'

dependencies {
    implementation 'org.apache.commons:commons-io:1.3.2'
    implementation 'org.kie.modules:org-apache-commons-lang3:6.2.0.Beta2'
}

repositories {
    jcenter()
}
task getDeps(type: Copy) {
    from sourceSets.main.runtimeClasspath
    into 'runtime/'

    doFirst {
        ant.delete(dir: 'runtime')
        ant.mkdir(dir: 'runtime')
    }

    doLast {
        ant.delete(dir: 'runtime')
    }
}
Example run:
$ find /Users/measter/.gradle/caches -name "commons-io*1.3.2.jar"
$ gradle getDeps
$ find /Users/measter/.gradle/caches -name "commons-io*1.3.2.jar"
/Users/measter/.gradle/caches/modules-2/files-2.1/commons-io/commons-io/1.3.2/[snip]/commons-io-1.3.2.jar
I've found ./gradlew dependencies (as suggested by this user) to be very handy for Docker builds.
You can create a custom task that resolves all the configurations (in doing so, it will also download the dependencies without building the project):
task downloadDependencies {
    doLast {
        configurations.findAll { it.canBeResolved }.each { it.resolve() }
    }
}
Run command ./gradlew downloadDependencies
My answer will favor the gradle plugins and built-in tasks.
I would use "gradle assemble" on the command line.
It is a lighter version of "gradle build": it assembles the outputs without running the checks and tests.
This way, you can reduce the preparation time before running or building anything.
Check the link below for the documentation:
https://docs.gradle.org/current/userguide/java_plugin.html#lifecycle_tasks
In general, this is my recipe when I clone a new repository:
- gradle assemble
- do some coding
- gradle run (and basically test until done)
- gradle build (to make distributable files)
Note: this last step may need additional configuration for the .jar files it outputs (depends on you).
I have a project which uses Gradle to publish our SNAPSHOT artifacts to a remote Maven repository.
When I publish to Maven, a timestamp and build number are appended to the jar name. I am trying to download the latest version, i.e. myjar-1.6.0-20170926.190543-10.jar, from another project, but I am not able to download it unless I remove it from my .gradle cache or restart my workspace.
myjar-1.6.0-20170926.162756-7.jar
myjar-1.6.0-20170926.162756-7.jar.md5
myjar-1.6.0-20170926.162756-7.jar.sha1
myjar-1.6.0-20170926.162756-7.pom
myjar-1.6.0-20170926.162756-7.pom.md5
myjar-1.6.0-20170926.162756-7.pom.sha1
myjar-1.6.0-20170926.182639-8.jar
myjar-1.6.0-20170926.182639-8.jar.md5
myjar-1.6.0-20170926.182639-8.jar.sha1
myjar-1.6.0-20170926.182639-8.pom
myjar-1.6.0-20170926.182639-8.pom.md5
myjar-1.6.0-20170926.182639-8.pom.sha1
myjar-1.6.0-20170926.182748-9.jar
myjar-1.6.0-20170926.182748-9.jar.md5
myjar-1.6.0-20170926.182748-9.jar.sha1
myjar-1.6.0-20170926.182748-9.pom
myjar-1.6.0-20170926.182748-9.pom.md5
myjar-1.6.0-20170926.182748-9.pom.sha1
myjar-1.6.0-20170926.190543-10.jar
myjar-1.6.0-20170926.190543-10.jar.md5
myjar-1.6.0-20170926.190543-10.jar.sha1
myjar-1.6.0-20170926.190543-10.pom
myjar-1.6.0-20170926.190543-10.pom.md5
myjar-1.6.0-20170926.190543-10.pom.sha1
The dependent project has the following:
configurations.all {
    resolutionStrategy.cacheDynamicVersionsFor 0, 'seconds'
    resolutionStrategy.cacheChangingModulesFor 0, 'seconds'
}

compile(group: "com.test", name: "myjar", version: "1.6.0-SNAPSHOT", changing: true)
Also tried with:

compile("com.test:myjar:latest.integration")
But nothing is working. How can I fix this?
You can run the local build with --refresh-dependencies. See the answers to this question How can I force gradle to redownload dependencies? for more information.
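For example:

./gradlew build --refresh-dependencies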
I encountered the same situation as you.
Develop environment: Android Studio 4.0
Building tools: gradle 3.6
Computer: Mac
I'm on the app side, using a jar from the server side. The jar is included by adding a Maven repo and a project dependency in build.gradle, like this:
buildscript {
    repositories {
        mavenLocal()
        maven { url 'http://maven.xxx.com/xxx/xxx/' }
    }
}

dependencies {
    ...
    implementation 'project_group_id:artifact_id:latest.integration'
}
Instead of using ./gradlew build --refresh-dependencies to clear the cache, I went directly to the cache folder /Users/your_user_name/.gradle/caches/modules-2/files-2.1 and deleted the local jar in order to download the updated one.
But it still failed...
Finally we found the cause, and it is weird:
We had included several Maven repos, and one of them contained the same project as the one we needed, but with only one version per version code. It was listed before the Maven repo from which we could download the newest jar, so it blocked the update to the newest version.
We swapped the order of these two repos, and the problem disappeared.
In my case, it had nothing to do with the Maven config or the cache; it was due to the repos included in Gradle. The deeper reason still needs digging.
Hope this helps anyone who meets the same problem.
I have a Gradle multi-module project configured with the Kotlin DSL. I'd like to add publishing to a Maven repository, and I found the maven-publish plugin for it. But it seems to skip the version configured for each project:
MyProject/build.gradle.kts:
subprojects {
    apply {
        plugin("maven-publish")
    }

    configure<PublishingExtension>() {
        publications {
            repositories { ... }
            create<MavenPublication>("myPublication") {
                from(components.getByName("java"))
                logger.lifecycle("test: ${project.group} ${project.name} ${project.version}")
            }
        }
    }
}
MyProject/subproject1/build.gradle.kts:
version = "1.0.0-SNAPSHOT"
gradle publish output:
test: my.project subproject1 unspecified
artifact file does not exist: '.../MyProject/subproject1/build/libs/subproject1.jar'
The file subproject1.jar doesn't exist, but subproject1-1.0.0-SNAPSHOT.jar does. How can I make Gradle pick up the correct version of the module?
I found a similar problem while using the maven-publish plugin:
I was trying to set the repository URL depending on project version as described in the gradle docs here and this answer.
But I found that the version always resolved (as in the question) to the default (un-set) value: unspecified.
So I guess those documentation examples are for a project's build.gradle and not a general gradle script.
Anyway, I believe the problem is due to the timing of the execution of the blocks in the Gradle script: project.version could not be accessed where I wanted it. So I ended up passing the value to the gradlew command with the -P flag.
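A minimal sketch of that workaround (the property name publishVersion and the repository URLs are made up; adapt them to your setup):

// build.gradle.kts
val publishVersion = (findProperty("publishVersion") ?: "unspecified").toString()

configure<PublishingExtension> {
    repositories {
        maven {
            // pick the target repo based on the value passed on the command line
            url = uri(if (publishVersion.endsWith("-SNAPSHOT"))
                "https://example.com/snapshots" else "https://example.com/releases")
        }
    }
}

Invoked with something like ./gradlew publish -PpublishVersion=1.0.0-SNAPSHOT.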
Gradle has a configuration stage and then an execution stage.
Refer to documentation:
https://docs.gradle.org/current/userguide/build_lifecycle.html
https://www.oreilly.com/library/view/gradle-beyond-the/9781449373801/ch03.html
and an apparently similar problem, https://discuss.gradle.org/t/maven-publication-closure-is-evaluated-too-early/19911
About your problem, it may be the same as I have described, or perhaps the reason is simpler:
Looking at the structure of your Gradle file, it does not appear to match the hierarchy specified in the maven-publish documentation. In particular, the repositories {} block should be at the same level as publications {}, not inside it.
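In other words, something shaped like this (a sketch based on your snippet; the actual repository definition is omitted):

configure<PublishingExtension>() {
    repositories {
        // ... your repository definition here ...
    }
    publications {
        create<MavenPublication>("myPublication") {
            from(components.getByName("java"))
        }
    }
}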
Possibly related:
Gradle maven publish plugin config has reference to dynamically created gradle task
Gradle shouldRunAfter not available for a task