This question has been answered before, but the accepted answer doesn't explain much about how this is done in Gradle, and since I can't comment on that solution to ask for more details, I'm asking it here.
I have a Gradle project with several modules, and I want to set up the Javadoc task so that the Javadoc of all modules is combined into a single location where I can browse it.
How can I do this with Gradle? I'm running Gradle 5.5, if the version matters, and I have the following in my build.gradle:
allprojects {
    ext {
        // Convenience method to configure Javadoc
        configureJavadoc = { Object jDocConfig ->
            jDocConfig.options {
                it.author()
                it.encoding = 'UTF-8'
                it.memberLevel = JavadocMemberLevel.PROTECTED
                if (it instanceof StandardJavadocDocletOptions) {
                    def opt = it as StandardJavadocDocletOptions
                    opt.links(
                        "https://docs.example.com/java/"
                    )
                    if (JavaVersion.current().isJava9Compatible()) {
                        opt.addBooleanOption("html5", true)
                        opt.addStringOption("-release", "8")
                    }
                    if (JavaVersion.current().isJava11Compatible()) {
                        opt.addBooleanOption("-no-module-directories", true)
                    }
                }
            }
        }
    }
}
subprojects {
    javadoc {
        destinationDir = file("$rootDir/docs/")
        configureJavadoc(it)
    }
}
I was able to do it with:
def exportedProjects = [
    ":",
    ":module-a",
    ":module-b",
    ":module-c"
]

task allJavadoc(type: Javadoc) {
    source exportedProjects.collect { project(it).sourceSets.main.allJava }
    classpath = files(exportedProjects.collect { project(it).sourceSets.main.compileClasspath })
    destinationDir = file("${buildDir}/docs/javadoc-all")
}
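If you also want the combined output to use the shared Javadoc options, one option is to call the configureJavadoc helper from the allprojects block inside the same task, mirroring what the subprojects block does. This is only a sketch, assuming allJavadoc is defined in the root build.gradle where that helper is visible:

task allJavadoc(type: Javadoc) {
    source exportedProjects.collect { project(it).sourceSets.main.allJava }
    classpath = files(exportedProjects.collect { project(it).sourceSets.main.compileClasspath })
    destinationDir = file("${buildDir}/docs/javadoc-all")
    // reuse the shared options closure defined under allprojects { ext { ... } }
    configureJavadoc(it)
}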
I am using the configuration below in my build.gradle:
plugins {
    id "com.google.protobuf" version "0.8.17"
    id "java"
}
group "de.prerna.aws.tests"
version "1.0-SNAPSHOT"
repositories {
    mavenCentral()
}
ext {
    protobufVersion = "3.18.1"
}
dependencies {
    implementation "com.google.protobuf:protobuf-java:$protobufVersion"
}

sourceSets {
    main {
        proto {
            srcDir 'src/main/proto'
        }
        java {
            // include self written and generated code
            srcDirs 'src/main/java'
        }
    }
}

protobuf {
    protoc {
        artifact = 'com.google.protobuf:protoc:4.0.0-rc-2'
    }
    plugins {
        grpc {
            artifact = "io.grpc:protoc-gen-grpc-java:1.39.0"
        }
    }
    generateProtoTasks.generatedFilesBaseDir = 'generated-sources'
    generateProtoTasks {
        all().each { task ->
            task.plugins { grpc {} }
        }
        ofSourceSet('main')
    }
}
Error
* What went wrong:
Execution failed for task ':processResources'.
> Entry Person.proto is a duplicate but no duplicate handling strategy has been set. Please refer to https://docs.gradle.org/7.2/dsl/org.gradle.api.tasks.Copy.html#org.gradle.api.tasks.Copy:duplicatesStrategy for details.
A variant of BParolini's answer for build.gradle (Groovy DSL):
tasks.withType(Copy) {
    filesMatching("**/*.proto") {
        duplicatesStrategy = DuplicatesStrategy.INCLUDE
    }
}
I could fix this problem by adding the following code to my build.gradle.kts:
tasks {
    withType<Copy> {
        filesMatching("**/*.proto") {
            duplicatesStrategy = DuplicatesStrategy.INCLUDE
        }
    }
}
Extra info: I'm using Gradle 7.3-rc-3 and Java 17.
Unfortunately nobody explains the reasons for this problem, so here are some of my explorations and guesses. Please correct me if you know more.
I found that the following build script code causes this error:
proto { srcDir 'src/main/proto' }
If you look inside the "build/extracted-include-protos" directory, there are original .proto files copied into "build/extracted-include-protos/test" (but not into main).
My guess is that those auto-copied .proto files are normally used as the only sources, but by adding "src/main/proto" as a source directory we hand the compiler tooling a second set of the same files.
Removing this srcDir is not a good idea, because it is required for IDEA to correctly open an included .proto on Ctrl+click (otherwise the extracted copies are opened, which is useless).
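Building on that guess, another workaround is to stop the duplicated copies from being packaged rather than silently including them. A sketch for build.gradle (Groovy DSL), assuming the duplicated .proto files really are identical copies so dropping the extras is safe:

processResources {
    // skip later duplicates instead of failing; DuplicatesStrategy.INCLUDE (as above) would keep them all
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
}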
Recently we had a version mismatch problem with the class org.apache.commons.beanutils.PropertyUtilsBean. We thought the mismatch was just between dependencies that bring in commons-beanutils in versions 1.8 and 1.9.3, but after tracking down and excluding each transitive dependency we were still facing the issue.
It turned out that PropertyUtilsBean was also packed inside commons-digester3-3.2-with-deps instead of being declared as a dependency on commons-beanutils.
Is it possible in Gradle to search all dependencies (including transitive ones) for a specific fully qualified class name? That way we could resolve such problems on the spot.
I tried it, and it is possible using some custom Gradle build logic:
Kotlin DSL
tasks {
    val searchClass by creating {
        doLast {
            configurations.forEach { // check all configurations
                if (it.isCanBeResolved) {
                    try {
                        val classLoader = configToClassloader(it)
                        // replace here class you are looking for
                        val cl = Class.forName("arrow.core.Either", false, classLoader)
                        println("found in Configuration $it")
                        println(cl.protectionDomain.codeSource.location)
                    } catch (e: Exception) {}
                }
            }
        }
    }
}
// Helper function: convert a gradle configuration to ClassLoader
// note: this needs `import java.net.URLClassLoader` at the top of the build script
fun configToClassloader(config: Configuration) =
    URLClassLoader(
        config.files.map {
            it.toURI().toURL()
        }.toTypedArray())
This could be further enhanced by replacing the hard-coded class name with some parameter mechanism.
Sample output:
> Task :searchClass
Configuration configuration ':domain:apiDependenciesMetadata'
file:/Users/abendt/.gradle/caches/modules-2/files-2.1/io.arrow-kt/arrow-core-data/0.9.0/a5b0228eebd5ee2f233f9aa9b9b624a32f84f328/arrow-core-data-0.9.0.jar
Groovy DSL
def configToClassloader(config) {
    return new URLClassLoader(
        *config.files.collect {
            it.toURI().toURL()
        }.toArray())
}
task searchClass {
    doLast {
        configurations.forEach { // check all configurations
            if (it.canBeResolved) {
                try {
                    def classLoader = configToClassloader(it)
                    // replace here class you are looking for
                    def cl = Class.forName("arrow.core.Either", false, classLoader)
                    println("found in Configuration $it")
                    println(cl.protectionDomain.codeSource.location)
                } catch (e) {}
            }
        }
    }
}
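To avoid the hard-coded class name, the task can read it from a project property. This is only a sketch for the Groovy DSL, reusing the configToClassloader helper above; the property name searchClass and the task name searchClassFor are just examples:

// run e.g.: gradle searchClassFor -PsearchClass=org.apache.commons.beanutils.PropertyUtilsBean
task searchClassFor {
    doLast {
        def className = project.findProperty('searchClass') ?: 'arrow.core.Either'
        configurations.findAll { it.canBeResolved }.each { conf ->
            try {
                def cl = Class.forName(className, false, configToClassloader(conf))
                println "found in Configuration $conf"
                println cl.protectionDomain.codeSource.location
            } catch (ignored) {
                // class not found in this configuration, or the configuration failed to resolve
            }
        }
    }
}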
Edit: I have recently created a Gradle Plugin that provides the described tasks: https://plugins.gradle.org/plugin/io.github.redgreencoding.findclass
You could do this
task findJarsForClass {
    doLast {
        def findMe = 'org/apache/commons/beanutils/PropertyUtilsBean.class'
        def matches = configurations.runtime.findAll { f ->
            f.name.endsWith('.jar') &&
                !(zipTree(f).matching { include findMe }.empty)
        }
        println "Found $findMe in ${matches*.name}"
    }
}
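Note that configurations.runtime only exists on older Gradle versions; the runtime configuration was removed in Gradle 7. A sketch of the same idea for newer versions, assuming the java plugin is applied so that runtimeClasspath is available:

task findJarsForClass {
    doLast {
        def findMe = 'org/apache/commons/beanutils/PropertyUtilsBean.class'
        // scan every jar on the resolved runtime classpath for the class file entry
        def matches = configurations.runtimeClasspath.findAll { f ->
            f.name.endsWith('.jar') && !(zipTree(f).matching { include findMe }.empty)
        }
        println "Found $findMe in ${matches*.name}"
    }
}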
Just Ctrl+left-click the imported class name, and your IDE will show you which jar it comes from (Eclipse has that feature, and IntelliJ probably has it as well).
Try using the dependencyInsight task:
gradle -q dependencyInsight --configuration compile --dependency commons-beanutils
Every Gradle project provides the task dependencyInsight to render the so-called dependency insight report from the command line. Given a dependency in the dependency graph, you can identify the selection reason and track down the origin of the dependency selection.
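On current Gradle versions the compile configuration no longer exists, so you would point the report at a resolvable configuration such as runtimeClasspath instead, for example:
gradle -q dependencyInsight --configuration runtimeClasspath --dependency commons-beanutils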
For some reason I have to manually delete the generated folder and run the Gradle task to get updated POJOs. Is this my setup, expected behavior, or a bug? My setup is as follows:
jooq {
    library(sourceSets.main) {
        jdbc {
            driver = 'com.mysql.jdbc.Driver'
            url = 'jdbc:mysql://localhost:3306/library'
            user = 'library'
            password = '123'
            schema = 'library'
        }
        generator {
            name = 'org.jooq.util.DefaultGenerator'
            strategy {
                name = 'org.jooq.util.DefaultGeneratorStrategy'
            }
            database {
                name = 'org.jooq.util.mysql.MySQLDatabase'
                inputSchema = 'library'
            }
            generate {
                daos = true
            }
            target {
                packageName = 'com.example.library.db'
                directory = 'src/main/java'
            }
        }
    }
}
Currently, when you generate the files they're added under the src/main/java folder. This is not a good idea, since it mixes handwritten source and generated files. It's much better to use a separate folder, src/main/generated, and modify the build.gradle in the following way:
def generatedDir = 'src/main/generated'

sourceSets {
    main {
        java {
            srcDirs += [generatedDir]
        }
    }
}

clean.doLast {
    project.file(generatedDir).deleteDir()
}
and change:
target {
    packageName = 'com.example.library.db'
    directory = generatedDir
}
This way you can easily manage the generated classes. All of them will be removed automatically when the clean task is run.
You also need to define a dependency between compileJava and the generator task. It can be done in the following way:
compileJava.dependsOn YOUR_GENERATOR_TASK_NAME
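For example, if the generator task created by your jOOQ plugin were called generateLibraryJooqSchemaSource (a placeholder - the real name depends on the plugin version and configuration name, so check the output of gradle tasks), the wiring would look like this:

// placeholder task name: look up the actual generator task with `gradle tasks`
compileJava.dependsOn 'generateLibraryJooqSchemaSource'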
jOOQ will not delete the files automatically.
I use a third-party Gradle plugin in a lot of projects and would like to add it permanently to my Gradle installation. Currently I need to add the plugin to each build.gradle like so:
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath "com.github.dcendents:android-maven-plugin:1.2"
    }
}
Is there a way to add this plugin to my Gradle installation so that I don't need to include it in every build file?
I do realise it might not be the best practice and can result in unreproducible builds.
This is a hack and not a solution
Here is an updated version which is also able to apply plugins and add Maven repositories. Tested with Gradle 2.10.
Add this plugin to your .gradle/init.gradle:
apply plugin: AddDepPlugin

class AddDepPlugin implements Plugin<Gradle> {
    def addDeps = [
        "org.ensime.gradle": "gradle.plugin.net.coacoas.gradle:ensime-gradle:0.2.2",
        "com.github.dcendents.android-maven": "com.github.dcendents:android-maven-plugin:1.2"]
    def addRepos = ["https://plugins.gradle.org/m2/"]

    void apply(Gradle gradle) {
        def add = 0
        gradle.allprojects { project ->
            plugins.whenPluginAdded { t ->
                if (++add == 1) {
                    project.getBuildScriptSource()
                    def bs = project.getBuildscript()
                    bs.getDependencies()
                    def repo = bs.getRepositories()
                    def ccf = bs.class.getDeclaredField("classpathConfiguration")
                    ccf.setAccessible(true)
                    def cc = ccf.get(bs)
                    addDeps.each { k, v -> cc.dependencies.add(project.dependencies.create(v)) }
                    addRepos.each { k -> repo.maven { -> setUrl(k) } }
                }
                if (add == 8)
                    addDeps.each { k, v ->
                        if (!k.startsWith("x")) project.apply([plugin: k])
                    }
            }
        }
    }
}
On http://ensime.github.io//build_tools/gradle/ I found this alternative solution (this is for the ENSIME plugin):
apply plugin: AddEnsimePlugin

class AddEnsimePlugin implements Plugin<Gradle> {
    def supportedPlugins = [
        'org.gradle.api.plugins.JavaPlugin',
        'org.gradle.api.plugins.ScalaPlugin',
        'jp.leafytree.gradle.AndroidScalaPlugin'
    ]

    void apply(Gradle gradle) {
        def added = false
        gradle.allprojects { project ->
            project.with {
                if (parent == null) {
                    buildscript {
                        repositories {
                            jcenter()
                            maven {
                                name 'JFrog OSS Snapshot Repository'
                                url 'http://oss.jfrog.org/oss-snapshot-local'
                            }
                        }
                        dependencies {
                            classpath 'net.coacoas.gradle:ensime-gradle:0.2.6'
                        }
                    }
                }
                plugins.whenPluginAdded { plugin ->
                    if (!added && supportedPlugins.contains(plugin.class.name)) {
                        rootProject.apply plugin: 'org.ensime.gradle'
                        added = true
                    }
                }
            }
        }
    }
}
It works for me with Gradle 2.12. The other answer also works for me.
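For comparison, the same goal can also be approached with an initscript block in .gradle/init.gradle, which puts the plugin on the init script's classpath and then applies it to every project by class. This is only a rough sketch: com.example:my-plugin:1.0 and com.example.MyPlugin are placeholders for the real artifact coordinates and plugin class, which you would have to look up for the plugin you use.

// .gradle/init.gradle
initscript {
    repositories {
        maven { url 'https://plugins.gradle.org/m2/' }
    }
    dependencies {
        // placeholder coordinates: put the real plugin artifact here
        classpath 'com.example:my-plugin:1.0'
    }
}

allprojects {
    // plugins added via initscript {} are applied by class, not by plugin id
    apply plugin: com.example.MyPlugin
}

Note that, unlike the snippets above, this applies the plugin before the project's own build script runs, which may not suit plugins that require another plugin to be applied first.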
I have read many similar questions where the reply is that the project structure is not ideal, so my questions are based on the following:
I have a main project (ProjA) which needs to include a second project (ProjB) that is not a child project. ProjB has various resource files which need to be copied into the distribution of ProjA.
build.gradle of ProjA
dependencies {
    compile project(":ProjB")
}

distributions {
    main {
        baseName = "Something"
        contents {
            into('bin') { from jar.archivePath }
            into('lib') { from configurations.runtime }
            into('etc') {
                from ('../../projb/src/main/webapp') // Fix me!
            }
        }
    }
}
1.) Ideally ProjB should expose the location of the resource files through a property used by ProjA. How can this be done?
2.) Is this the correct way to do it, given that I have read a lot about cross-project properties not being ideal, or should I be doing something completely different?
I don't know if it helps, but it seems the best way is to do it as follows:
distributions {
    main {
        baseName = "Something"
        contents {
            into('bin') { from jar.archivePath }
            into('lib') { from configurations.runtime }
            into('etc') {
                from project(':projB').file('src/main/webapp')
            }
        }
    }
}
The path must be hardcoded in that case.
A second option is to specify a project property - in general not a very good idea - and use it in the other project. In that case an evaluation order must also be defined, so that projB's build script is evaluated (and the property exists) before projA reads it.
In projB:
ext.resourcesDir = project.file('src/main/webapp2')
and in projA:
evaluationDependsOn(':projB')
and:
distributions {
    main {
        baseName = "Something"
        contents {
            into('bin') { from jar.archivePath }
            into('lib') { from configurations.runtime }
            into('etc') {
                from project(':projB').file('src/main/webapp')
                from project(':projB').resourcesDir
            }
        }
    }
}
Here's a complete example.