Beam: Failed to serialize and deserialize property 'awsCredentialsProvider' - java

I have been using the Beam pipeline examples as a guide in an attempt to load files from S3 for my pipeline. As in the examples, I have defined my own PipelineOptions that also extends S3Options, and I am attempting to use the DefaultAWSCredentialsProviderChain. The code to configure this is:
MyPipelineOptions options = PipelineOptionsFactory.fromArgs(args).as(MyPipelineOptions.class);
options.setAwsCredentialsProvider(new DefaultAWSCredentialsProviderChain());
options.setAwsRegion("us-east-1");
runPipeline(options);
When I run it from IntelliJ it works fine using the Direct Runner, but when I package it as a jar and execute it (also using the Direct Runner) I see:
Exception in thread "main" java.lang.IllegalArgumentException: PipelineOptions specified failed to serialize to JSON.
at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:166)
at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:67)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:299)
at a.b.c.beam.CleanSkeleton.runPipeline(CleanSkeleton.java:69)
at a.b.c.beam.CleanSkeleton.main(CleanSkeleton.java:53)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Unexpected IOException (of type java.io.IOException): Failed to serialize and deserialize property 'awsCredentialsProvider' with value 'com.amazonaws.auth.DefaultAWSCredentialsProviderChain#40f33492'
at com.fasterxml.jackson.databind.JsonMappingException.fromUnexpectedIOE(JsonMappingException.java:338)
at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsBytes(ObjectMapper.java:3247)
at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:163)
... 5 more
I am using gradle to build my jar with the following task:
jar {
    manifest {
        attributes (
            'Main-Class': 'a.b.c.beam.CleanSkeleton'
        )
    }
    from {
        configurations.runtimeClasspath.collect { it.isDirectory() ? it : zipTree(it) }
    }
    from('src') {
        include '/main/resources/*'
    }
    zip64 true
    exclude 'META-INF/*.RSA', 'META-INF/*.SF', 'META-INF/*.DSA'
}

The problem was occurring because, when the fat/uber jar was being created, files in META-INF/services were being overwritten by duplicate files. Specifically, META-INF/services/com.fasterxml.jackson.databind.Module, where a number of Jackson modules needed to be declared, was missing entries. These include org.apache.beam.sdk.io.aws.options.AwsModule and com.fasterxml.jackson.datatype.joda.JodaModule. The code in the DirectRunner instantiates the ObjectMapper like so:
new ObjectMapper()
.registerModules(ObjectMapper.findModules(ReflectHelpers.findClassLoader()));
ObjectMapper::findModules relies on java.util.ServiceLoader which locates services from META-INF/services/ files.
The solution was to use the Gradle Shadow plugin to build the fat/uber jar and configure it to merge the service files:
apply plugin: 'com.github.johnrengelman.shadow'

shadowJar {
    mergeServiceFiles()
    zip64 true
}

Related

How to add OpenAPI client as a subproject?

I can successfully add a generated OpenAPI client to my project via source sets. But then I have to copy dependencies into the main build.gradle and resolve conflicts, so I think it would be a better design to have the client as a subproject with its own build.gradle.
So I add include = 'build:openapi-java-client' to my settings.gradle and compile project(':build:openapi-java-client') to my dependencies, so that I have the following files:
build.gradle:
plugins {
    id 'java'
    id 'application'
    id "org.openapi.generator" version "4.3.1"
}

repositories {
    jcenter()
}

openApiGenerate {
    generatorName = "java"
    inputSpec = "$rootDir/specs/petstore.yaml".toString()
    outputDir = "$buildDir/openapi-java-client".toString()
    apiPackage = "org.openapi.example.api"
    invokerPackage = "org.openapi.example.invoker"
    modelPackage = "org.openapi.example.model"
    configOptions = [
        dateLibrary: "java8"
    ]
}

dependencies {
    implementation 'com.google.guava:guava:29.0-jre'
    testImplementation 'junit:junit:4.13'
    compile project(':build:openapi-java-client')
}

application {
    mainClassName = 'a.aa.App'
}
and settings.gradle:
rootProject.name = 'simple-java-app'
include = 'build:openapi-java-client'
I execute openApiGenerate in advance; after adding it as a subproject, I do Gradle -> Refresh Gradle Project and Refresh.
Eclipse then shows me a problem:
Could not run phased build action using Gradle distribution 'https://services.gradle.org/distributions/gradle-6.5.1-bin.zip'.
Settings file 'C:\...\simple-java-app\settings.gradle' line: 11
A problem occurred evaluating settings 'simple-java-app'.
Could not set unknown property 'include' for settings 'simple-java-app' of type org.gradle.initialization.DefaultSettings.
I don't know where to go from here, addressing subprojects in subfolders worked just fine when I worked through https://guides.gradle.org/creating-multi-project-builds/ and put greeting-library in a subfolder.
You are trying to make build/ a project when that directory specifically is not meant to be a project directory. It is Gradle's default build directory, and most other Gradle plugins treat it the same way.
Simply change the output directory to something other than build/:
openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$rootDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
}
Then include the project in your build with the correct syntax:
// settings.gradle
include("openapi-java-client")
However, using the org.openapi.generator seems to generate an invalid build.gradle since I get the following error:
FAILURE: Build failed with an exception.
* Where:
Build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle' line: 23
* What went wrong:
Could not compile build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle'.
> startup failed:
build file 'C:\Users\fmate\code\example\openapi-java-client\build.gradle': 23: unexpected char: '\' # line 23, column 35.
main.java.srcDirs = ['src/main\java']
This obviously won't work how you wanted it to, since it appears to be an issue with the Gradle plugin itself. If you just need to include the generated code in your project, then include the generated Java code as part of your main Java source:
openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$buildDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
}

tasks {
    compileJava {
        dependsOn(openApiGenerate)
    }
}

sourceSets {
    main {
        java {
            srcDir(files("${openApiGenerate.outputDir.get()}/src/main"))
        }
    }
}
But with this approach, you'll run into missing imports/dependencies. It doesn't appear this plugin offers the ability to generate the models/POJOs only, so after updating the library property to native and including some missing dependencies manually, it all works:
plugins {
    java
    id("org.openapi.generator") version "5.0.0-beta"
}

repositories {
    mavenCentral()
}

group = "io.mateo.test"

dependencies {
    implementation(platform("com.fasterxml.jackson:jackson-bom:2.11.1"))
    implementation("com.fasterxml.jackson.core:jackson-databind")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-jsr310")
    implementation("org.openapitools:jackson-databind-nullable:0.2.1")
    implementation("com.google.code.findbugs:jsr305:3.0.2")
    implementation("io.swagger:swagger-core:1.6.2")
}

openApiGenerate {
    generatorName.set("java")
    inputSpec.set("$rootDir/specs/petstore.json")
    outputDir.set("$buildDir/openapi-java-client")
    apiPackage.set("org.openapi.example.api")
    invokerPackage.set("org.openapi.example.invoker")
    modelPackage.set("org.openapi.example.model")
    library.set("native")
    configOptions.put("dateLibrary", "java8")
}

tasks {
    compileJava {
        dependsOn(openApiGenerate)
    }
}

sourceSets {
    main {
        java {
            srcDir(files("${openApiGenerate.outputDir.get()}/src/main"))
        }
    }
}
You cannot configure it like this, because build most certainly is an output directory, which would create a circular reference. Better to add a new module and apply the generator plugin in that module. If you can configure another module as the outputDir, that module could then be referenced.
Even if the plugin resides in the root project, the destination needs to be a module.
The point is that the root project always executes, in contrast to module configurations.
I’ve just answered a very similar question. While my answer there is not perfect, I would personally still prefer the approach suggested there – and kind of repeated here:
Suggested Approach
I would keep the builds of the modules that depend on the generated API completely separate from the build that generates the API. The only connection between such builds should be a dependency declaration. That means, you’ll have to manually make sure to build the API generating project first and only build the dependent projects afterwards.
By default, this would mean to also publish the API module before the dependent projects can be built. An alternative to this default would be Gradle composite builds – for example, to allow you to test a newly generated API locally first before publishing it. However, before creating/running the composite build, you would have to manually run the API generating build each time that the OpenAPI document changes.
Example
Let’s say you have project A depending on the generated API. Its Gradle build would contain something like this:
dependencies {
    implementation 'com.example:api:1.0'
}
Of course, the simple-java-app build described in the question would have to be adapted to produce a module with these coordinates:
openApiGenerate {
    // …
    groupId = "com.example"
    id = "api"
    version = "1.0"
}
Before running A’s build, you’d first have to run:
1. ./gradlew openApiGenerate from your simple-java-app project.
2. ./gradlew publish from the simple-java-app/build/openapi-java-client/ directory.
Then A’s build could fetch the published dependency from the publishing repository.
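In A’s build you would then also point Gradle at that publishing repository (a sketch; both entries are placeholders for wherever you actually publish the client):
repositories {
    mavenLocal()  // if you publish locally, e.g. with publishToMavenLocal
    maven {
        url 'https://repo.example.com/releases'  // hypothetical remote repository URL
    }
}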
Alternatively, you could drop step 2 locally and run A’s build with an additional Gradle CLI option:
./gradlew --include-build $path_to/simple-java-app/build/openapi-java-client/ …
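Equivalently, instead of passing that CLI option on every invocation, the composite can be declared in A’s settings.gradle (a sketch; adjust the relative path to wherever the generated client build actually lives):
// settings.gradle of project A
includeBuild '../simple-java-app/build/openapi-java-client'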

How to build distribution package in multi-module Gradle project?

I have seen this post Gradle multi project distribution but still have some doubts.
I would like to create the following project layout
root
|--lib-java-module
|--spring-boot-module
|--3PP_A_module # not java
| |-- custom scripts, config
|--3PP_B_module # not java
| |-- custom scripts, config
|--dist-module
As you might have guessed, I want the dist-module to build myapp-dist.tar.gz with libjava.jar, sprintbootapp.jar, 3pp-a.tar, 3pp-b.tar.
myapp-dist.tar.gz
libjava.jar
sprintbootapp.jar
3pp-a.tar
3pp-b.tar.
The 3pp-a-module and the 3pp-b-module only contain some configuration files and startup scripts, no Java or any compiled code. How do I package them individually into tar files (no compression)?
How do I define dependencies from dist-module to the other modules? Is it possible to get the other modules built when the build is triggered from dist-module?
Update:
I set up my test project based on #marco-r's answer and it works, except for packaging the war file. Check out the test project on GitHub: https://github.com/KiranMohan/study-spring-boot.
This is the project setup of interest.
include ':sb-2.1-multi-package', ':sb-2.1-multi-package:hello-rest-lib',
        ':sb-2.1-multi-package:hello-rest-standalone-jar',
        ':sb-2.1-multi-package:hello-rest-war'
include 'sb-2.1-3pp-resources'
include 'sb-2.1-build'
However, adding hello-rest-war to sb-2.1-build.tar.gz fails.
Instead of the war file, it's the dependencies that are getting packaged.
dependencies {
    archivesDeps project(path: ':sb-2.1-3pp-resources', configuration: 'archives')
    javaDeps project(":sb-2.1-multi-package:hello-rest-war")
}
...
task copyJavaDeps(type: Copy) {
    inputs.files(configurations.javaDeps)
    from configurations.javaDeps
    into "${ARCHIVE_DIRECTORY}/lib"
}
...
// create distribution bundle
distributions {
    main {
        contents {
            from ARCHIVE_DIRECTORY
            into "/springapp/multimodule"
        }
    }
}
Contents of the package
springapp/multimodule/lib/classmate-1.4.0.jar
springapp/multimodule/lib/hello-rest-lib-0.0.1-SNAPSHOT.jar
springapp/multimodule/lib/hibernate-validator-6.0.16.Final.jar
...
springapp/multimodule/lib/tomcat-embed-websocket-9.0.17.jar
springapp/multimodule/lib/validation-api-2.0.1.Final.jar
springapp/multimodule/sb-2.1-3pp-resources/config/3pp.json
How do I package the war file (the hello-rest-war module) without all the transitive dependencies?
This is a multiple-question scenario, so I am going to address it in parts.
Since all the 3PP_X_module submodules have the same build requirements, create a build.gradle in each of them that refers to a shared build script containing the common functionality required:
apply from: '../tarArtifact.gradle'
In the parent folder, create the tarArtifact.gradle referred to above, with the functionality to tar the contents of a subfolder (arbitrarily chosen as contents) of the referring subproject:
apply plugin: 'base'

task tarContents(type: Tar) {
    from 'contents'
    archiveName = "${project.name}.tar"
    destinationDir file('build/tar')
}

artifacts {
    archives file: tarContents.archivePath, type: 'tar', builtBy: tarContents
}
Since the archives configuration is wired to the output of tarContents (builtBy: tarContents), the archives configuration can naturally be used to retrieve the desired tar as the output of building this project.
Create in dist-module the following build.gradle file:
apply plugin: 'distribution'

plugins.withType(DistributionPlugin) {
    distTar {
        compression = Compression.GZIP
        extension = 'tar.gz'
    }
}

configurations {
    wholeProjectDist
}

dependencies {
    wholeProjectDist project(path: ':3pp-a-module', configuration: 'archives')
    wholeProjectDist project(path: ':3pp-b-module', configuration: 'archives')
    wholeProjectDist project(':lib-java-module')
    wholeProjectDist project(':spring-boot-module')
}

distributions {
    main {
        contents {
            from configurations.wholeProjectDist
        }
    }
}
This Gradle file includes the following:
Applies the Distribution plugin, so we can generate the final tar.gz file from the artifacts generated by all the other subprojects.
Configures the distTar task (of the DistributionPlugin plugin) to compress any tar it generates using GZIP.
Creates the configuration wholeProjectDist to capture the dependencies of dist-module itself, which we will use with the distribution plugin's tasks.
Declares the dependencies of dist-module as the artifacts output by the sibling subprojects, using the newly created wholeProjectDist.
Configures the distribution plugin's main distribution to have as contents all the files from configurations.wholeProjectDist.
Create a settings.gradle file under dist-module to allow it to access its sibling modules using includeFlat:
includeFlat '3pp-a-module', '3pp-b-module', 'lib-java-module', 'spring-boot-module'
Include in the parent folder a settings.gradle file to include all children submodules (as the root project):
include '3pp-a-module', '3pp-b-module', 'lib-java-module', 'spring-boot-module', 'dist-module'
Build the desired tar.gz files by invoking the gradle command (from the root folder):
gradle :dist-module:distTar
Hope this helps.

Compile a Groovy script with all its dependencies, which are managed by Gradle, and then run it as a standalone application via the command line

I have a simple groovy script with a single java library dependency:
package com.mrhacki.myApp

import me.tongfei.progressbar.ProgressBar

class Loading {

    static void main(String[] arguments) {
        List list = ["file1", "file2", "file3"]
        for (String x : ProgressBar.wrap(list, "TaskName")) {
            println(x)
        }
    }
}
I'm using gradle to manage the dependencies of the project. The gradle configuration for the project is pretty straightforward too:
plugins {
    id 'groovy'
}

group 'com.mrhacki'
version '1.0-SNAPSHOT'

repositories {
    mavenCentral()
}

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.3.11'
    compile 'me.tongfei:progressbar:0.7.2'
}
If I run the script from the IntelliJ IDE, it executes as expected.
What I would like to do now is to compile the script with this dependency into one single .jar file, so I can distribute it as such and run the application from any filesystem path, as the script logic will depend on the path from which the execution was called.
I've tried a few Gradle fat jar examples out there, but none have worked for me, since the .jar keeps throwing Could not find or load main class Loading when I try to run it.
If anyone would be so kind as to give a hint or show an example of a Gradle task that builds what I described, I would be very grateful.
I'm aware of the Groovy module Grape with the @Grab annotation too, but I would leave that as a last resort, since I don't want users to wait for the dependencies to download; I would rather bundle them with the app.
I'm using groovy 2.5.6 and gradle 4.10 for the project
Thanks
You can simply create the fat jar yourself, without any extra plugin, using the jar task. For a simple/small project like yours, it should be straightforward:
jar {
    manifest {
        // required attribute "Main-Class"
        attributes "Main-Class": "com.mrhacki.myApp.Loading"
    }
    // collect (and unzip) dependencies into the fat jar
    from {
        configurations.compile.collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }
}
EDIT: please take the other comments into consideration: if you have more than one external lib you might have issues with this solution, so you should go for a solution with the Shadow plugin in that case.
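For reference, a minimal sketch of that Shadow-based alternative (the plugin version is an assumption; pick one compatible with your Gradle version):
plugins {
    id 'groovy'
    id 'com.github.johnrengelman.shadow' version '4.0.4'
}

shadowJar {
    manifest {
        // Shadow bundles the runtime dependencies; the manifest still needs the entry point
        attributes 'Main-Class': 'com.mrhacki.myApp.Loading'
    }
}
Running ./gradlew shadowJar then produces an -all.jar under build/libs that can be started with java -jar from any directory.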

how to include my test in application jar

I am using a Gradle build for my Java application. My project has an Elasticsearch integration test. The following is my build.gradle:
jar {
    baseName = 'myproject'
    version = 'V.4.0.0'
    manifest {
        attributes 'Main-Class': 'com.myapp.Application'
    }
    from {
        configurations.compile.collect { it.isDirectory() ? it : zipTree(it) }
    }
}
test {
    systemProperties = System.properties
    systemProperty 'tests.security.manager', 'false'
}
When I run gradle build, it executes the tests and creates myproject-V.4.0.0.jar, but when I run
java -cp myproject-V.4.0.0.jar;junit-4.11.jar junit.textui.TestRunner com.myapp.test.testclassname
I get a class not found exception for com.myapp.test.testclassname.
I extracted myproject-V.4.0.0.jar and cannot find the test class in it.
My question is: how can I include the test classes in my application jar as well?
This is deliberate behaviour of Gradle Java projects. A jar is your production artifact: usually you want to test it during the build, but you do not want to ship your tests to production, so I do not recommend doing this. Having said that, there is a way of doing it in Gradle, like this:
task myJar(type: Jar) {
    from { sourceSets.main.output + sourceSets.test.output }
}
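Note that the jar produced above contains only classes, with no manifest and no dependencies. A sketch of a variant that keeps it runnable and self-contained (the task name is made up here, and testRuntimeClasspath assumes a reasonably recent Gradle version):
task myRunnableJar(type: Jar) {
    baseName = 'myproject-with-tests'
    manifest {
        attributes 'Main-Class': 'com.myapp.Application'
    }
    // main and test classes plus everything needed to run the tests (e.g. JUnit)
    from { sourceSets.main.output + sourceSets.test.output }
    from {
        configurations.testRuntimeClasspath.collect { it.isDirectory() ? it : zipTree(it) }
    }
}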

Using Gradle with native dependencies

I am trying to use Sigar in a Gradle project. The Sigar distribution by default provides two types of files:
a JAR that contains classes
some native files (.so, .dylib, .dll)
My purpose is to repackage these files so that I can use them as dependencies deployed and downloaded on-demand from a personal Maven repository.
My first try was to define the dependencies as files, in order to check that my application works as expected before repackaging. Below is the Gradle code I used for my first test, which works:
dependencies {
    compile files("${rootDir}/lib/sigar/sigar.jar")
    runtime fileTree(dir: "${rootDir}/lib/sigar/", exclude: "*.jar")
}
Then I repackaged the Sigar native files into a JAR and renamed the other one to match the naming rules for Maven artifacts, since I want to deploy them to a Maven repository. Below is what I get:
sigar-1.6.4.jar (contains .class files)
sigar-1.6.4-native.jar (contains .dylib, .so, and .dll files at the root)
The next step was to deploy these files to my custom repository. Then I updated my build.gradle as follows:
dependencies {
    compile 'sigar:sigar:1.6.4'
    runtime 'sigar:sigar:1.6.4:native'
}
Unfortunately, when I do a gradle clean build, the new dependencies are fetched but the native libraries can no longer be found at runtime, since I now get the following exception:
Error thrown in postRegister method: rethrowing <java.lang.UnsatisfiedLinkError: org.hyperic.sigar.Sigar.getCpuInfoList()[Lorg/hyperic/sigar/CpuInfo;>
Consequently, I am looking for a solution to fetch and link the native files to my Java app like any other dependency. Any advice, comment, suggestion, help, solution, etc. is welcome ;)
A solution is to define a new Gradle configuration for the native bundle, plus a task that unzips its JAR files into the desired location:
project.ext.set('nativeLibsDir', "$buildDir/libs/natives")

configurations {
    nativeBundle
}

dependencies {
    nativeBundle 'sigar:sigar:1.6.4:native'
}

task extractNativeBundle(type: Sync) {
    from {
        configurations.nativeBundle.collect { zipTree(it) }
    }
    into file(project.nativeLibsDir)
}

dist.dependsOn extractNativeBundle
Then, this location must be put in java.library.path for tasks that depend on native libraries:
systemProperty "java.library.path", project.nativeLibsDir
