Skipping a step on CircleCI Orb build, version 2.1 - java

I am using the CircleCI CI/CD service. I have a basic build config for a Java project with Gradle and Java 1.8, and it works fine.
Here is the source of my .circleci/config.yml file:
executors:
  java1_8:
    docker:
      - image: 'cimg/openjdk:8.0'
orbs:
  gradle: circleci/gradle@2.2.0
version: 2.1
workflows:
  checkout-build-test:
    jobs:
      - gradle/test:
          executor: java1_8
After completing the build, CircleCI uploads artifacts, and this takes a lot of time.
I am looking for a way to skip the "Uploading Artifacts" step.
I could switch to CircleCI version 2.0 if needed, but it would be nice to keep the 2.1 configuration.

I have found how to skip the step; it turned out to be not very hard.
First, Gradle can skip tasks if you add an -x <task-to-skip> command-line parameter, as shown below.
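For example, to verify locally that a task is skipped (using the javaDoc task from this question), the test run can be invoked as:
./gradlew test -x javaDoc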
Second, the CircleCI Gradle orb's test job can be configured with a custom test command.
Here is the documentation: https://circleci.com/developer/orbs/orb/circleci/gradle#jobs-test
I have combined these two features to get a working configuration. So, if I need to skip the javaDoc task, I modify my CircleCI config.yml in the following manner:
executors:
  java1_8:
    docker:
      - image: 'cimg/openjdk:8.0'
orbs:
  gradle: circleci/gradle@2.2.0
version: 2.1
workflows:
  checkout-build-test:
    jobs:
      - gradle/test:
          test_command: test -x javaDoc
          executor: java1_8

Related

Is it possible to use Gradle as a build target in the SAM CLI?

I'm currently building a Java Lambda function as a Gradle project. The application is built using the command gradlew build, which both tests and builds the application JAR.
The definition for my function resembles the following snippet from the SAM template.yml:
ImageResizeLambda:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: thumbor
    Handler: com.nosto.imagevector.resizer.ResizeController::handleRequest
    Runtime: java11
    Events:
      ResizeImage:
        Type: Api
        Properties:
          Path: /quick/{accountId}/{thumbnailVersion}/{imageId}/{hash}/
          Method: GET
    Policies:
      S3ReadPolicy:
        BucketName: !Ref ImageBucket
  Metadata:
    BuildMethod: makefile
As shown here, the build method is defined as makefile. The Makefile inside my Gradle project (which seems really unnecessary) has this bit:
CUR_DIR := $(abspath $(patsubst %/,%,$(dir $(abspath $(lastword $(MAKEFILE_LIST))))))

build-ImageNormalizeLambda:
	cd $(CUR_DIR)/.. && ./gradlew normaliser:clean normaliser:buildZip
	cd $(CUR_DIR) && unzip build/distributions/normaliser.zip -d $(ARTIFACTS_DIR)
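For context, the makefile build method has a simple contract: sam build runs the target named build-<ResourceLogicalId> for each function and expects the deployable files to end up in $(ARTIFACTS_DIR). That also means a single function can be built on its own:
sam build ImageNormalizeLambda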
Because my repository contains multiple functions, there's a Gradle subproject for every function.
While this seemingly works, I've glued it together myself, and it adds yet another DSL to an already-complex build.
Based on this project, I can see some work on a Gradle builder, but I'm unsure how/if this transitive dependency is used: https://github.com/aws/aws-lambda-builders/tree/develop/aws_lambda_builders/workflows/java_gradle
Is there a better way to build a Gradle-based Java Lambda function compared to this approach? The documentation for sam build doesn't go very deep.

IntelliJ cannot find Step Definitions from Feature File

I've set up the following Cucumber skeleton project using the repo from Cucumber:
https://github.com/cucumber/cucumber-java-skeleton
I'm using IntelliJ 2019.1.3 with the latest bundled Cucumber for Java and Gherkin plugins.
I'm using Java: openjdk version "1.8.0_232".
I have a local Gradle install of 6.0.1, and the project's gradlew wrapper is 6.7.1.
I have the following cucumber dependencies specified:
testImplementation 'io.cucumber:cucumber-java:6.9.1'
testImplementation 'io.cucumber:cucumber-junit:6.9.1'
When I run using the command line:
./gradlew test --rerun-tasks --info
...the tests can be run from the RunCucumberTest JUnit runner, but I cannot get the bundled Cucumber and Gherkin plugins to 'see' the steps from the feature file; they all remain grayed out.
Given I'm using Cucumber's own vanilla project, what can I try to resolve the issue of the feature file failing to recognise or run the steps?
Well, as the error you are getting says:
io.cucumber.skeleton.RunCucumberTest > Belly.a few cukes FAILED
    io.cucumber.junit.UndefinedStepException: The step "I wait 1 hour" is undefined. You can implement it using the snippet(s) below:
So add the following snippet to your StepDefinitions.java file:
@When("I wait {int} hour")
public void i_wait_hour(Integer int1) {
    // Write code here that turns the phrase above into concrete actions
    throw new io.cucumber.java.PendingException();
}
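Once the snippet compiles and the project re-syncs, the plugin should resolve the step from the feature file and stop graying it out. A filled-in version could look like this (the belly field and its digest method are hypothetical stand-ins for whatever state your scenario tracks):
@When("I wait {int} hour")
public void i_wait_hour(Integer hours) {
    // Hypothetical implementation: advance the scenario state by the given hours.
    belly.digest(hours);
}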

Surefire results not showing in the Tests tab of GitLab CI/CD

I have a Java Maven application and I am trying to display my JUnit results in GitLab, as shown in the GitLab help here: https://docs.gitlab.com/ee/ci/junit_test_reports.html#viewing-junit-test-reports-on-gitlab
I added the maven-surefire-plugin to the pom's build section and the maven-surefire-report-plugin to its reporting section. I checked that this works, because surefire-report.html is correctly created in my local target directory, with the test results.
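For reference, the relevant pom.xml sections would look roughly like this (the plugin versions are placeholders; use whatever your project already declares):
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>2.22.2</version>
    </plugin>
  </plugins>
</build>

<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-report-plugin</artifactId>
      <version>3.0.0-M4</version>
    </plugin>
  </plugins>
</reporting>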
Then I configured my .gitlab-ci.yml by adding the last lines (the reports section):
image: "maven"
before_script:
- cd gosecuri
stages:
- build
- test
- run
job_build:
stage: build
script:
- mvn compile
job_test:
stage: test
script:
- mvn package
artifacts:
paths:
- gosecuri/target/*.war
expire_in: 1 week
reports:
junit:
- target/surefire-reports/TEST-*.xml
The pipeline succeeds, but in GitLab I get the message "There are no tests to show." in the job's Tests tab, and in the console I have this warning: target/surefire-reports/TEST-*.xml: no matching files
What am I missing? Thanks.
PS: I'm running on the gitlab.com SaaS free plan.
Right after the docs you mentioned, there is an "Enabling the feature" paragraph:
This feature comes with the :junit_pipeline_view feature flag disabled by default.
It looks like the feature is disabled on the public gitlab.com. If you run your own instance of GitLab, you can enable it.
Update: The path to the reports was also incorrect: target/... should be gosecuri/target/..., because artifact paths are resolved relative to the repository root, not the directory the before_script changes into.
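Assuming the same job layout as above, the corrected section would be:
artifacts:
  reports:
    junit:
      - gosecuri/target/surefire-reports/TEST-*.xml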

Run tests in Azure DevOps Build Pipeline

I would like to make a build pipeline in Azure DevOps including tests/code coverage.
For that, I created a very basic Java project:
package main:
  - main class
  - Calculator class
    - add method
package test:
  - CalculatorTest class
    - addTest method
It's very basic, just for me to understand how tests in a pipeline work. I don't use Maven or anything like that; for the tests, I'm using the JUnit framework.
In the Azure DevOps pipeline, I imported my project from GitHub and started to create the pipeline. I started from the starter template, which contains:
trigger:
- master

pool:
  vmImage: 'Ubuntu-16.04'

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
My question is: what do I have to do to run my tests automatically?
I've seen several examples in the Microsoft documentation, but they were always for "complex" projects (with Maven etc.). And as I'm new to Azure DevOps and the YAML file/syntax, I'm lost.
I want to run my tests after each commit and see the results (tests + code coverage) in the pipeline summary, as described here: https://learn.microsoft.com/en-us/azure/devops/pipelines/test/review-continuous-test-results-after-build?view=azure-devops#view-test-results-in-build
Thanks a lot.
PS: For the moment I'm just focusing on tests, but once that's done I would also like to publish build artefacts. I would like confirmation of this:
- task: PublishBuildArtifacts@1
Is that line correct?
EDIT
The line - task: PublishBuildArtifacts@1 seems to work correctly, but I have the following warning:
Directory '/home/vsts/work/1/a' is empty. Nothing will be added to build artifact 'drop'.
What does it mean?
Finally, I used the visual designer (as explained here: https://learn.microsoft.com/en-US/azure/iot-edge/how-to-ci-cd) and added the Maven task.
I upgraded my project to use Maven, which is well integrated with Azure DevOps.
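A minimal YAML sketch of that Maven task (the inputs are from the standard Maven@3 task; the test-results pattern assumes the default Surefire output location):
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'test'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    codeCoverageToolOption: 'JaCoCo'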

Run Dynamodb local as part of a Gradle Java project

I am trying to run DynamoDB Local for testing purposes. I followed the steps Amazon provides for setting it up, and running the jar by itself works fine (link to Amazon's tutorial here). However, the tutorial doesn't cover running the jar within your own project, and I don't want all the other developers to have to grab a jar and run it locally every time they test their code.
That is where my question comes in. I've had a really hard time finding any examples online of how to configure a Gradle project to run the DynamoDB Local server as part of my tests. I found the following Maven example, https://github.com/awslabs/aws-dynamodb-examples/blob/master/src/test/java/com/amazonaws/services/dynamodbv2/DynamoDBLocalFixture.java#L32, and am trying to convert it to Gradle, but I'm getting errors for all of the com.amazonaws.services.dynamodbv2.local import statements they are using. The errors say the resource cannot be found.
I went into their project's pom and put the following into my build.gradle file to emulate it:
//dynamodb local dependencies
testCompile('com.amazonaws:aws-java-sdk-dynamodb:1.10.42')
testCompile('com.amazonaws:aws-java-sdk-cloudwatch:1.10.42')
testCompile('com.amazonaws:aws-java-sdk:1.3.0')
testCompile('com.amazonaws:amazon-kinesis-client:1.6.1')
testCompile('com.amazonaws:amazon-kinesis-connectors:1.1.1')
testCompile('com.amazonaws:dynamodb-streams-kinesis-adapter:1.0.2')
testCompile('com.amazonaws:DynamoDBLocal:1.10.5.1')
The import statements still fail. Here is an example of one that fails:
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
TL;DR
Has anyone managed to get the DynamoDB Local JAR to execute as part of a Gradle project, or does anyone have a link to a good tutorial (it doesn't have to be the tutorial I linked to)?
We have DynamoDB Local working with Gradle. Here's what you need to add to your build.gradle file.
For Gradle 4.x and below:
1) Add to the repositories section:
maven {
    url 'http://dynamodb-local.s3-website-us-west-2.amazonaws.com/release'
}
2) Add to the dependencies section (assuming you're using this for your tests):
testCompile group: 'com.amazonaws', name: 'DynamoDBLocal', version: '1.11.0'
3) These next two steps are the tricky part. First, copy the native files to a directory:
task copyNativeDeps(type: Copy) {
    from (configurations.testCompile) {
        include "*.dylib"
        include "*.so"
        include "*.dll"
    }
    into 'build/libs'
}
4) Then make sure you include this directory (build/libs in our case) in the Java library path like so:
test.dependsOn copyNativeDeps
test.doFirst {
    systemProperty "java.library.path", 'build/libs'
}
Now you should be able to run ./gradlew test and have your tests hit your local DynamoDB.
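For completeness, a minimal smoke test under this setup might look like the following (JUnit 4 and the AWS SDK v1 embedded API; the class name here is made up):
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
import com.amazonaws.services.dynamodbv2.model.ListTablesResult;
import org.junit.Test;
import static org.junit.Assert.assertNotNull;

public class DynamoDBLocalSmokeTest {
    @Test
    public void embeddedDynamoDbStarts() {
        // DynamoDBEmbedded.create() starts an in-process instance; it needs the
        // sqlite4java natives on java.library.path (copied by copyNativeDeps above).
        AmazonDynamoDB client = DynamoDBEmbedded.create().amazonDynamoDB();
        ListTablesResult tables = client.listTables();
        assertNotNull(tables.getTableNames());
    }
}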
For Gradle 5.x, the solution below works:
maven {
    url 'http://dynamodb-local.s3-website-us-west-2.amazonaws.com/release'
}

configurations {
    dynamodb
}

dependencies {
    testImplementation 'com.amazonaws:DynamoDBLocal:1.11.477'
    dynamodb fileTree(dir: 'lib', include: ["*.dylib", "*.so", "*.dll"])
    dynamodb 'com.amazonaws:DynamoDBLocal:1.11.477'
}

task copyNativeDeps(type: Copy) {
    from configurations.dynamodb
    into "$project.buildDir/libs/"
}

test.dependsOn copyNativeDeps
test.doFirst {
    systemProperty "java.library.path", 'build/libs'
}
I ran into the same problem, and first I tried to add sqlite4java.library.path to the Gradle script, as mentioned in the other answers.
That worked from the command line but not when running the tests from the IDE (IntelliJ IDEA), so I finally came up with a simple init method that is called at the beginning of each integration test:
AwsDynamoDbLocalTestUtils.initSqLite();
AmazonDynamoDBLocal amazonDynamoDBLocal = DynamoDBEmbedded.create();
The implementation can be found here: https://github.com/redskap/aws-dynamodb-java-example-local-testing/blob/master/src/test/java/io/redskap/java/aws/dynamodb/example/local/testing/AwsDynamoDbLocalTestUtils.java
I put a whole example on GitHub; it might be helpful: https://github.com/redskap/aws-dynamodb-java-example-local-testing
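The core of that helper, reduced to a sketch (the real class resolves the native-library folder from the classpath instead of hard-coding it, as assumed here):
// Sketch: sqlite4java must know where its native .so/.dylib/.dll files live
// before DynamoDBEmbedded.create() is first called.
public static void initSqLite() {
    if (System.getProperty("sqlite4java.library.path") == null) {
        System.setProperty("sqlite4java.library.path", "build/libs"); // assumed natives dir
    }
}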
In August 2018, Amazon announced a new Docker image with Amazon DynamoDB Local on board. It does not require downloading and running any JARs or adding third-party OS-specific binaries like sqlite4java.
It is as simple as starting a Docker container before the tests:
docker run -p 8000:8000 amazon/dynamodb-local
You can do that manually for local development, as described above, or use it in your CI pipeline. Many CI services provide the ability to start additional containers during the pipeline that can serve as dependencies for your tests. Here is an example for GitLab CI/CD:
test:
  stage: test
  image: openjdk:8-alpine
  services:
    - name: amazon/dynamodb-local
      alias: dynamodb-local
  script:
    - ./gradlew clean test
So, during the test task, DynamoDB will be available at http://dynamodb-local:8000.
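The test code then needs to point its client at that endpoint instead of the real AWS one; with the AWS SDK v1 that could look like this (the factory class name is made up):
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;

public class LocalDynamoClientFactory {
    // "dynamodb-local" is the service alias declared in the CI config above;
    // the region value is arbitrary for a local endpoint.
    public static AmazonDynamoDB create() {
        return AmazonDynamoDBClientBuilder.standard()
                .withEndpointConfiguration(
                        new AwsClientBuilder.EndpointConfiguration("http://dynamodb-local:8000", "us-east-1"))
                .build();
    }
}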
Another, more powerful tool is localstack. It supports a couple dozen AWS services; DynamoDB is one of them. The usage is very similar: you have to start it before running the tests, and it will expose AWS-compatible APIs on the given ports:
test:
  stage: test
  image: openjdk:8-alpine
  services:
    - name: localstack/localstack
      alias: localstack
  script:
    - ./gradlew clean test
The idea is to move all the configuration out of your build tool and tests and provide the dependency externally. Think of it as dependency injection / IoC, but for a whole service rather than a single bean. This way, your code is cleaner and more maintainable. You can see that even in the examples above: you can switch the mock implementation from DynamoDB Local to localstack by simply changing the image part!
The easiest way, in my opinion, is to:
1) Download the JAR from here: http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html#DynamoDBLocal.DownloadingAndRunning
2) Unzip the downloaded folder and add its content to a /libs folder in the project (create the /libs folder first).
3) Finally, add to the build.gradle:
dependencies {
    runtime files('libs/DynamoDBLocal.jar')
}
I didn't want to create a specific configuration for Dynamo for Gradle 6+, so I tweaked the original answer's instructions. Also, this is in the Kotlin Gradle DSL rather than Groovy.
val copyNativeDeps by tasks.creating(Copy::class) {
    from(configurations.testRuntimeClasspath) {
        include("*.dylib")
        include("*.so")
        include("*.dll")
    }
    into("$buildDir/libs")
}

tasks.withType<Test> {
    dependsOn.add(copyNativeDeps)
    doFirst { systemProperty("java.library.path", "$buildDir/libs") }
}
By leveraging the testRuntimeClasspath configuration, Gradle is able to locate the relevant files for you without needing a custom configuration. Obviously, this has the side effect that if your test runtime has many native deps, they will all be copied, which would make the custom-configuration approach more appealing.
