How to add OpenAPI client as a subproject? - java

I can successfully add a generated OpenAPI client to my project via source sets. But then I have to copy its dependencies into the main build.gradle and resolve conflicts, so I think it would be a better design to have the client as a subproject with its own build.gradle.
So I add include = 'build:openapi-java-client' to my settings.gradle and compile project(':build:openapi-java-client') to my dependencies, so that I have the following files:
build.gradle:
plugins {
    id 'java'
    id 'application'
    id "org.openapi.generator" version "4.3.1"
}

repositories {
    jcenter()
}

openApiGenerate {
    generatorName = "java"
    inputSpec = "$rootDir/specs/petstore.yaml".toString()
    outputDir = "$buildDir/openapi-java-client".toString()
    apiPackage = "org.openapi.example.api"
    invokerPackage = "org.openapi.example.invoker"
    modelPackage = "org.openapi.example.model"
    configOptions = [
        dateLibrary: "java8"
    ]
}

dependencies {
    implementation 'com.google.guava:guava:29.0-jre'
    testImplementation 'junit:junit:4.13'
    compile project(':build:openapi-java-client')
}

application {
    mainClassName = 'a.aa.App'
}
and settings.gradle:
rootProject.name = 'simple-java-app'
include = 'build:openapi-java-client'
I execute openApiGenerate in advance; after adding it as a subproject, I do Gradle -> Refresh Gradle Project and Refresh.
Eclipse then shows me a problem:
Could not run phased build action using Gradle distribution 'https://services.gradle.org/distributions/gradle-6.5.1-bin.zip'.
Settings file 'C:\...\simple-java-app\settings.gradle' line: 11
A problem occurred evaluating settings 'simple-java-app'.
Could not set unknown property 'include' for settings 'simple-java-app' of type org.gradle.initialization.DefaultSettings.
I don't know where to go from here, addressing subprojects in subfolders worked just fine when I worked through https://guides.gradle.org/creating-multi-project-builds/ and put greeting-library in a subfolder.

You are trying to make build/ a project when that directory specifically is not meant to be a project directory. It's Gradle's default build directory, and likely that of 99% of other Gradle plugins as well.
Simply change the output directory to something other than build/:
openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$rootDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
}
Then include the project in your build with the correct syntax:
// settings.gradle
include("openapi-java-client")
However, the org.openapi.generator plugin seems to generate an invalid build.gradle, since I get the following error:
FAILURE: Build failed with an exception.
* Where:
Build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle' line: 23
* What went wrong:
Could not compile build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle'.
> startup failed:
build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle': 23: unexpected char: '\' # line 23, column 35.
main.java.srcDirs = ['src/main\java']
This obviously won't work the way you wanted, since it appears to be an issue with the Gradle plugin itself. If you just need to include the generated code in your project, then include the generated Java sources as part of your main Java source set:
openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$buildDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
}

tasks {
    compileJava {
        dependsOn(openApiGenerate)
    }
}

sourceSets {
    main {
        java {
            srcDir(files("${openApiGenerate.outputDir.get()}/src/main"))
        }
    }
}
But with this approach, you'll run into missing imports/dependencies. This plugin does not appear to offer a way to generate only the models/POJOs, so after setting the library property to native and adding a few missing dependencies manually, it all works:
plugins {
    java
    id("org.openapi.generator") version "5.0.0-beta"
}

repositories {
    mavenCentral()
}

group = "io.mateo.test"

dependencies {
    implementation(platform("com.fasterxml.jackson:jackson-bom:2.11.1"))
    implementation("com.fasterxml.jackson.core:jackson-databind")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-jsr310")
    implementation("org.openapitools:jackson-databind-nullable:0.2.1")
    implementation("com.google.code.findbugs:jsr305:3.0.2")
    implementation("io.swagger:swagger-core:1.6.2")
}

openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$buildDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
    library.set("native")
    configOptions.put("dateLibrary", "java8")
}

tasks {
    compileJava {
        dependsOn(openApiGenerate)
    }
}

sourceSets {
    main {
        java {
            srcDir(files("${openApiGenerate.outputDir.get()}/src/main"))
        }
    }
}

You cannot configure it like this, because build is most certainly an output directory, which would create a circular reference. Better to add a new module and apply the generator plugin inside that module. If you can configure another module as the outputDir, that module can then be referenced.
Even if the plugin resides in the root project, the destination needs to be a module.
The point is that the root project always executes, as opposed to module configurations.
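As a rough sketch of that layout (the module name and directories here are placeholders, and the generated sources would still need to be wired into the module's source set as shown in the answer above):
// settings.gradle
rootProject.name = 'simple-java-app'
include 'openapi-java-client'

// openapi-java-client/build.gradle
plugins {
    id 'java'
    id 'org.openapi.generator' version '4.3.1'
}

openApiGenerate {
    generatorName = "java"
    inputSpec = "$rootDir/specs/petstore.yaml".toString()
    // generate into this module, not into any project's build/ directory
    outputDir = "$projectDir/generated".toString()
    apiPackage = "org.openapi.example.api"
    invokerPackage = "org.openapi.example.invoker"
    modelPackage = "org.openapi.example.model"
}

// plus whatever dependencies the generated client code itself needs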

I’ve just answered a very similar question. While my answer there is not perfect, I would personally still prefer the approach suggested there – and kind of repeated here:
Suggested Approach
I would keep the builds of the modules that depend on the generated API completely separate from the build that generates the API. The only connection between such builds should be a dependency declaration. That means, you’ll have to manually make sure to build the API generating project first and only build the dependent projects afterwards.
By default, this would mean to also publish the API module before the dependent projects can be built. An alternative to this default would be Gradle composite builds – for example, to allow you to test a newly generated API locally first before publishing it. However, before creating/running the composite build, you would have to manually run the API generating build each time that the OpenAPI document changes.
Example
Let’s say you have project A depending on the generated API. Its Gradle build would contain something like this:
dependencies {
    implementation 'com.example:api:1.0'
}
Of course, the simple-java-app build described in the question would have to be adapted to produce a module with these coordinates:
openApiGenerate {
    // …
    groupId = "com.example"
    id = "api"
    version = "1.0"
}
Before running A’s build, you’d first have to run:
1. ./gradlew openApiGenerate from your simple-java-app project.
2. ./gradlew publish from the simple-java-app/build/openapi-java-client/ directory.
Then A’s build could fetch the published dependency from the publishing repository.
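For A to resolve that dependency, its build also needs to declare the repository the API module was published to; the URL below is only a placeholder:
// A's build.gradle
repositories {
    maven {
        url 'https://repo.example.com/releases'
    }
}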
Alternatively, you could drop step 2 locally and run A’s build with an additional Gradle CLI option:
./gradlew --include-build $path_to/simple-java-app/build/openapi-java-client/ …
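Equivalently, instead of passing the flag on every invocation, the composite could be declared in A's settings.gradle (the path is the same placeholder as above):
// A's settings.gradle
includeBuild 'path/to/simple-java-app/build/openapi-java-client'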

Related

Compile a Groovy script with all its dependencies, which are managed by Gradle, and then run it as a standalone application via the command line

I have a simple groovy script with a single java library dependency:
package com.mrhacki.myApp

import me.tongfei.progressbar.ProgressBar

class Loading {
    static void main(String[] arguments) {
        List list = ["file1", "file2", "file3"]
        for (String x : ProgressBar.wrap(list, "TaskName")) {
            println(x)
        }
    }
}
I'm using gradle to manage the dependencies of the project. The gradle configuration for the project is pretty straightforward too:
plugins {
    id 'groovy'
}

group 'com.mrhacki'
version '1.0-SNAPSHOT'

repositories {
    mavenCentral()
}

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.3.11'
    compile 'me.tongfei:progressbar:0.7.2'
}
If I run the script from the Intellij IDE, the script is executed as expected.
What I would like to do now is to compile the script with this dependency into one single .jar file, so I can distribute it as such, and run the application from any filesystem path, as the script logic will be dependent on the path from which the execution was called.
I've tried a few Gradle fat-jar examples out there, but none have worked for me: the .jar file keeps throwing Could not find or load main class Loading when I try to run it.
If anyone would be so kind as to give a hint or show an example of a Gradle task that would produce a build fitting my described needs, I would be very grateful.
I'm aware of the Groovy module Grape with the @Grab annotation too, but I would leave that as a last resort, since I don't want the users to wait for the dependencies to download and would like to bundle them with the app.
I'm using groovy 2.5.6 and gradle 4.10 for the project
Thanks
You can simply create the fat jar yourself, without any extra plugin, using the jar task. For a simple/small project like yours, it should be straightforward:
jar {
    manifest {
        // required attribute "Main-Class"
        attributes "Main-Class": "com.mrhacki.myApp.Loading"
    }
    // collect (and unzip) dependencies into the fat jar
    from {
        configurations.compile.collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }
}
EDIT: please take the other comments into consideration: if you have more than one external lib, you might run into issues with this solution, so in that case you should go for a solution based on the "shadow" plugin.
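For reference, a minimal sketch of the Shadow plugin route (the plugin version below is an assumption and may need adjusting to match your Gradle version):
plugins {
    id 'groovy'
    id 'com.github.johnrengelman.shadow' version '4.0.4' // version is an assumption
}

jar {
    manifest {
        attributes 'Main-Class': 'com.mrhacki.myApp.Loading'
    }
}
Running ./gradlew shadowJar then produces a single jar (suffixed -all by default) in build/libs containing the compiled classes and all runtime dependencies.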

Dependency on generated source set using Gradle Composite Builds and Intellij IDEA

I build my project (call this project B) and some of its upstream dependency projects with Gradle composite builds. One of these upstream projects (call this project A) has an alternate source set configured to avoid producing warnings on generated code.
This is configured like:
sourceSets {
    generated {
        java {
            srcDir "$buildDir/generated-sources/generated/main/java"
        }
    }
}

dependencies {
    compile sourceSets.generated.compileClasspath
    compile sourceSets.generated.output
}

compileGeneratedJava.options.warnings = false

jar { from sourceSets.generated.output }
This works fine when building with Gradle from the command line. But in IntelliJ IDEA, it imports the two source sets as separate modules: A_main and A_generated. It creates a dependency from B_main on A_main, but not on A_generated.
This results in run-time errors when running from IntelliJ IDEA. (B does not directly use any generated classes from A).
How can this be resolved?
The versions I'm using are:
IntelliJ IDEA: 2017.2.5
Gradle: 4.2.1

How to make gradle download dependencies without actually building things

On a new environment gradle build takes quite a while because all dependencies have to be downloaded.
Is there a way to only download dependencies in order to speed up the following build?
That way we could for example already prefill a CI build environment.
Edit: Updated for Gradle 6+.
Some notes:
- This new approach downloads jars into a folder, and then deletes the folder. So the result of having the jars in the Gradle cache is a side-effect.
- It currently uses the jars configured for the main source set, but could be generalized.
- Even though it is neither efficient nor elegant, it can be useful if you actually want the jars (and transitive dependencies): simply comment out the deletion of the runtime folder.
Consider this build.gradle (as an arbitrary, concrete example):
apply plugin: 'java'

dependencies {
    implementation 'org.apache.commons:commons-io:1.3.2'
    implementation 'org.kie.modules:org-apache-commons-lang3:6.2.0.Beta2'
}

repositories {
    jcenter()
}

task getDeps(type: Copy) {
    from sourceSets.main.runtimeClasspath
    into 'runtime/'

    doFirst {
        ant.delete(dir: 'runtime')
        ant.mkdir(dir: 'runtime')
    }

    doLast {
        ant.delete(dir: 'runtime')
    }
}
Example run:
$ find /Users/measter/.gradle/caches -name "commons-io*1.3.2.jar"
$ gradle getDeps
$ find /Users/measter/.gradle/caches -name "commons-io*1.3.2.jar"
/Users/measter/.gradle/caches/modules-2/files-2.1/commons-io/commons-io/1.3.2/[snip]/commons-io-1.3.2.jar
I've found ./gradlew dependencies (as suggested by this user) to be very handy for Docker builds.
You can create a custom task that resolves all the configurations (in doing so, it will also download the dependencies without building the project):
task downloadDependencies {
    doLast {
        configurations.findAll { it.canBeResolved }.each { it.resolve() }
    }
}
Run command ./gradlew downloadDependencies
My answer will favor the Gradle plugins and built-in tasks.
I would use "gradle assemble" on the command line.
It is a lighter version of "gradle build".
This way, you may reduce the time of your preparations before running or building anything.
Check the link below for the documentation:
https://docs.gradle.org/current/userguide/java_plugin.html#lifecycle_tasks
In general, this is my recipe when I clone a new repository:
- gradle assemble
- do some coding
- gradle run (and basically test until done)
- gradle build (to make distributable files)
Note: this last step may need additional configuration for .jar file outputs (depends on you).

Maven-publish gradle plugin skips the version

I have a Gradle multi-module project configured with the Kotlin DSL. I'd like to add publishing to a Maven repository, and I found the maven-publish plugin for it. But it seems to skip the version configured for each project:
MyProject/build.gradle.kts:
subprojects {
    apply {
        plugin("maven-publish")
    }
    configure<PublishingExtension>() {
        publications {
            repositories { ... }
            create<MavenPublication>("myPublication") {
                from(components.getByName("java"))
                logger.lifecycle("test: ${project.group} ${project.name} ${project.version}")
            }
        }
    }
}
MyProject/subproject1/build.gradle.kts:
version = "1.0.0-SNAPSHOT"
gradle publish output:
test: my.project subproject1 unspecified
artifact file does not exist: '.../MyProject/subproject1/build/libs/subproject1.jar'
The file subproject1.jar doesn't exist, but subproject1-1.0.0-SNAPSHOT.jar does. How can I make Gradle pick up the correct version of the module?
I found a similar problem while using the maven-publish plugin:
I was trying to set the repository URL depending on project version as described in the gradle docs here and this answer.
But I found the version always resolved (as in the question) to the default (un-set) value: unspecified.
So I guess those documentation examples are for a project's build.gradle and not a general gradle script.
Anyway, I believe the problem is due to the timing of the execution of the blocks in the gradle script. The project.version could not be accessed where I wanted it. So I ended up passing a parameter to the gradlew command with the -Pparameter flag.
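For illustration, that workaround looks roughly like this in a Groovy build script (publishVersion is just a property name I made up for the example):
// invoked as: ./gradlew publish -PpublishVersion=1.0.0-SNAPSHOT
version = findProperty('publishVersion') ?: '0.0.1-SNAPSHOT'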
Gradle has a configuration and then an execution stage.
Refer to documentation:
https://docs.gradle.org/current/userguide/build_lifecycle.html
https://www.oreilly.com/library/view/gradle-beyond-the/9781449373801/ch03.html
and an apparently similar problem, https://discuss.gradle.org/t/maven-publication-closure-is-evaluated-too-early/19911
About your problem, it may be the same as I have described, or perhaps the reason is simpler:
Looking at the structure of your Gradle file, it does not appear to match the hierarchy specified in the maven-publish documentation. In particular, the repositories {} block should be at the same level as publications {}, not inside of it.
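That is, the expected nesting looks roughly like this (sketched in the Groovy DSL with a placeholder URL; the same hierarchy applies in the Kotlin DSL):
publishing {
    publications {
        myPublication(MavenPublication) {
            from components.java
        }
    }
    repositories {
        maven {
            url 'https://repo.example.com/snapshots' // placeholder
        }
    }
}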
Possibly related:
Gradle maven publish plugin config has reference to dynamically created gradle task
Gradle shouldRunAfter not available for a task

Choose Directory for Gradle Generated Source Code

I'm using Dagger 2 to generate some source code in my Gradle project. Right now those sources are being generated and added in the ./build/classes/main folder along with all the class files.
How do I choose a folder to separate all the generated .java files to?
How do I include that folder in my gradle Java project, and have IntelliJ view those as sources so I can use them in my project?
It looks like the application plugin only uses a certain set of directories by default, mixing in build flavours to decide which files to compile.
However, I did find an example build script that creates a dagger configuration and manipulates gradle into using it for the generated output and adds it to the classpath. It uses dagger-compiler.
The core of it is:
sourceSets {
    dagger {
        java {
            srcDirs = ['src/dagger/java']
        }
    }
}

configurations {
    compileDagger
}

compileJava {
    description = "dagger annotation processor is loaded automatically from classpath"
    sourceSets.dagger.java.srcDirs*.mkdirs()
    classpath += configurations.compileDagger
    options.compilerArgs += [
        '-s', sourceSets.dagger.java.srcDirs.iterator().next()
    ]
}

clean {
    description = "delete files in generated source directory tree"
    delete fileTree(dir: sourceSets.dagger.java.srcDirs.iterator().next())
}

dependencies {
    ext.daggerVersion = "2.0.1"
    compile(
        "com.google.dagger:dagger:${daggerVersion}",
        "com.google.guava:guava:18.0")
    compileDagger(
        "com.google.dagger:dagger-compiler:${daggerVersion}")
}
Regarding IntelliJ, the idea plugin should automatically add any source sets via the normal generation of the IDEA project, so there should be no additional configuration needed; just regenerate the project.
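If IDEA still treats the directory incorrectly after re-importing, the generated directory can also be registered explicitly through the idea plugin; a small sketch, assuming the idea plugin is applied to the project:
apply plugin: 'idea'

idea {
    module {
        // tell IDEA that this source directory contains generated code
        generatedSourceDirs += file('src/dagger/java')
    }
}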
