Run tests in Azure DevOps Build Pipeline - Java

I would like to make a build pipeline in Azure DevOps including tests/code coverage.
For that, I created a very basic Java project:
package main:
- main class
- Calculator class
- add method
package test:
- CalculatorTest class
- addTest method
It's very basic, just for me to understand how tests in a pipeline work. I don't use Maven or anything like that. For the tests, I'm using the JUnit framework.
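For reference, a minimal version of the classes described above might look like this (JUnit 4 style; the exact contents are my guess from the description):

// Calculator.java (package main)
package main;

public class Calculator {
    // The single method under test.
    public int add(int a, int b) {
        return a + b;
    }
}

// CalculatorTest.java (package test)
package test;

import static org.junit.Assert.assertEquals;

import org.junit.Test;

import main.Calculator;

public class CalculatorTest {
    @Test
    public void addTest() {
        // 2 + 3 should give 5.
        assertEquals(5, new Calculator().add(2, 3));
    }
}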
In the Azure DevOps pipeline, I imported my project from GitHub and started to create the pipeline. I started from the starter template, which contains:
trigger:
- master

pool:
  vmImage: 'Ubuntu-16.04'

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
My question is:
What do I have to do to run my tests automatically?
I've seen several examples in the Microsoft documentation, but they were always for "complex" projects (like with Maven, etc.). And as I'm new to Azure DevOps and the YAML file/syntax, I'm lost.
I want to run my tests after each commit and see the results (tests + code coverage) in the pipeline summary, as described here: https://learn.microsoft.com/en-us/azure/devops/pipelines/test/review-continuous-test-results-after-build?view=azure-devops#view-test-results-in-build
Thanks a lot.
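(For reference, one way to do this without Maven is to compile and run the tests in a script step, then publish the JUnit XML with the PublishTestResults task. A rough sketch, assuming the JUnit console launcher JAR, junit-platform-console-standalone, is checked in under lib/ - all paths here are illustrative:)

steps:
- script: |
    mkdir -p out
    find src -name '*.java' > sources.txt
    javac -cp lib/junit-platform-console-standalone.jar -d out @sources.txt
    java -jar lib/junit-platform-console-standalone.jar --class-path out --scan-class-path --reports-dir test-reports
  displayName: 'Compile sources and run JUnit tests'

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: 'test-reports/TEST-*.xml'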
PS: For the moment I'm just focusing on tests, but once that's done I would also like to publish build artifacts. I would like confirmation of the following:
- task: PublishBuildArtifacts@1
Is that line correct?
EDIT
The line - task: PublishBuildArtifacts@1 seems to work correctly, but I have the following warning:
Directory '/home/vsts/work/1/a' is empty. Nothing will be added to build artifact 'drop'.
What does it mean?
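That warning means the artifact staging directory ($(Build.ArtifactStagingDirectory), which resolves to /home/vsts/work/1/a on the hosted agent) is empty: nothing was copied into it before the publish step ran. The usual fix is to copy your build output there first; a sketch, where the Contents pattern is illustrative:

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)'
    Contents: '**/*.jar'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'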

Finally, I used the visual designer (as explained here: https://learn.microsoft.com/en-US/azure/iot-edge/how-to-ci-cd) and added the Maven task.
I upgraded my project to use Maven, which is well integrated with Azure DevOps.
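In YAML, the equivalent Maven step would look roughly like this; the Maven@3 task can publish both JUnit results and code coverage to the pipeline summary (the input values below are illustrative for a standard Maven layout):

- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    codeCoverageToolOption: 'JaCoCo'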

Related

Rerun file not found when executing failed feature scenarios in Cucumber

Hi Stack Overflow folks,
The error below is really eating my head, and I am not able to understand what mistake I have made. Hence, I'm looking to the community for help.
Command and error
home:tcr-ui-automation sobhit.sharma$ gradle clean build -Denv=QA '-Dcucumber.options=--tags @target/rerun.txt'

> Task :test
runners.TestRunner STANDARD_OUT
    None of the features at [classpath:features] matched the filters: [@target/rerun.txt]

0 Scenarios
0 Steps
0m0.000s
My Cucumber options are defined as follows:
@RunWith(Cucumber.class)
@CucumberOptions(
    features = "classpath:features",
    glue = "stepDefinations",
    plugin = { "com.cucumber.listener.ExtentCucumberFormatter:",
        "junit:target/cucumber-results.xml",
        "rerun:target/rerun.txt" },
    tags = "@Smoke",
    monochrome = true
)
public class TestRunner {
}
The goal is to re-run the failed scenarios of a feature file.
File structure: (screenshot not included)
When dealing with errors that don't seem to make sense, it often helps to read every part of the command and error message out loud and in detail. Think of it as carefully explaining what you are doing to a co-worker.
For example:
gradle clean build -Denv=QA '-Dcucumber.options=--tags @target/rerun.txt'
Here you are telling Gradle to clean the build environment and then build the project. When doing so, a JVM flag is passed to set the environment to QA, and another JVM flag is passed to tell Cucumber to run only scenarios that are tagged with @target/rerun.txt.
When executing this:
None of the features at [classpath:features] matched the filters: [@target/rerun.txt]
Now Cucumber complains that it looked for features on the classpath, but none were tagged with @target/rerun.txt.
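The fix, assuming the standard Cucumber rerun syntax, is to drop --tags and pass the rerun file directly; the @ prefix tells Cucumber to read a list of previously failed scenario locations from that file instead of treating it as a tag:

gradle clean build -Denv=QA '-Dcucumber.options=@target/rerun.txt'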

Azure Java Functions + DevOps pipeline for different environments

I have an Azure Function project in Java. Unfortunately, Java is not supported very well )-: So, everything is a "little" bit different. Could you point me to a reference example or documentation on how to deploy a function project written in Java to Azure? Everything I find shows just part of the problem, and the parts do not fit together )-:
Java uses the azure-functions-maven-plugin (which is a wrapper around the func core tools)
this plugin prepares a staging folder, which is compressed to a ZIP and deployed as a "package"
unfortunately, this staging folder is named after the function name. So, the ZIP package IS DEPENDENT on the target Azure RESOURCE name.
It seems impossible to build an independent package (ZIP) and deploy it to several different environments (stages/dev/test/prod). Or is it?
It is especially weird when you use a CI/CD pipeline. It is not possible to have one BUILD pipeline and then several DEPLOY pipelines, because the build HAS TO be named (in its internal directories) after the target deployment name - so it is not independent. This goes against the basic principle of having one build and a separate configuration for each environment.
Any idea how to solve this without running several builds? Thank you.
EDIT:
maven "mvn package (+azure-function:package)" prepare build with directories
${projectRoot}/target/azure-functions/${functionResourceName}/...
where
/...
is compressed to final azure package named: ${functionResourceName}.zip
So, the "functionResourceName" is just in the name of ZIP file (+ containing jar with the same name). But ...
... if you try deploy this ZIP to azure function resource with another name - it fails.
Yes, I indeed manually prepare the package (using the Publish Build Artifacts task).
I would like to share my steps to deploy the Java Function package to an Azure Function.
Here are my steps:
In Build Pipeline:
steps:
- task: Maven@3
  displayName: 'Maven pom.xml'
  inputs:
    mavenPomFile: '$(Parameters.mavenPOMFile)'
    options: 'azure-functions:package'

- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
  inputs:
    SourceFolder: '$(system.defaultworkingdirectory)'
    Contents: '**/azure-functions/**'
    TargetFolder: '$(build.artifactstagingdirectory)'
  condition: succeededOrFailed()

- task: ArchiveFiles@2
  displayName: 'Archive $(Build.ArtifactStagingDirectory)/target/azure-functions/kishazureappfunction'
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)/target/azure-functions/kishazureappfunction'
    includeRootFolder: false

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
  condition: succeededOrFailed()
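Note (based on the task defaults as I understand them, worth double-checking): when no archiveFile input is set, ArchiveFiles@2 writes to $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip, so the archive name comes from the build ID rather than the function resource name - which is why the release below can reference a file called 1.zip. To make the name explicit, you could set it yourself, e.g.:

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)/target/azure-functions/kishazureappfunction'
    includeRootFolder: false
    archiveFile: '$(Build.ArtifactStagingDirectory)/functionapp.zip'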
In Release Pipeline:
I use the Azure App Service Deploy task. (For clarity, I converted it to YAML format.)
- task: AzureRmWebAppDeployment@4
  displayName: 'Azure App Service Deploy: kevin1014'
  inputs:
    azureSubscription: kevintest
    appType: functionApp
    WebAppName: kevin1014
    packageForLinux: '$(System.DefaultWorkingDirectory)/_123-Maven-CI/drop/1.zip'
    enableCustomDeployment: true
    DeploymentType: runFromZip
Result:
The Azure Function name and the package name are different, but it deploys to the Azure Function successfully.

Surefire results not showing in the Tests tab of GitLab CI/CD

I have a Java/Maven application, and I am trying to display my JUnit results on GitLab as shown in the GitLab help here: https://docs.gitlab.com/ee/ci/junit_test_reports.html#viewing-junit-test-reports-on-gitlab
I added the maven-surefire-plugin in the POM's build section and the maven-surefire-report-plugin in the POM's reporting section. I checked that it works, because surefire-report.html is correctly created in my local target directory, with the test results.
Then I configured my .gitlab-ci.yml by adding the last lines:
image: "maven"
before_script:
- cd gosecuri
stages:
- build
- test
- run
job_build:
stage: build
script:
- mvn compile
job_test:
stage: test
script:
- mvn package
artifacts:
paths:
- gosecuri/target/*.war
expire_in: 1 week
reports:
junit:
- target/surefire-reports/TEST-*.xml
The pipeline succeeds. In GitLab I have the message "There are no tests to show." in the job's Tests tab, and in the console I have this warning: target/surefire-reports/TEST-*.xml: no matching files
What am I missing? Thanks
PS: I'm running on the gitlab.com SaaS free plan.
Right after the docs you mentioned, there is an "Enabling the feature" paragraph:
This feature comes with the :junit_pipeline_view feature flag disabled by default.
It looks like the feature is disabled on public gitlab.com. If you run your own instance of GitLab, you can enable it.
Update: The path to the reports was incorrect: target/... should be gosecuri/target/...
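Concretely, applying that fix, the job's artifacts block should read as follows (artifact and report paths are resolved from the repository root, even though before_script does cd gosecuri):

  artifacts:
    paths:
      - gosecuri/target/*.war
    expire_in: 1 week
    reports:
      junit:
        - gosecuri/target/surefire-reports/TEST-*.xml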

Run DynamoDB Local as part of a Gradle Java project

I am trying to run DynamoDB Local for testing purposes. I followed the steps Amazon provides for setting it up, and running the jar by itself works fine (link to Amazon's tutorial here). However, the tutorial doesn't cover running the jar within your own project, and I don't want all the other developers to have to grab a jar and run it locally every time they test their code.
That is where my question comes in. I've had a really hard time finding any examples online of how to configure a Gradle project to run the DynamoDB Local server as part of my tests. I found the following Maven example https://github.com/awslabs/aws-dynamodb-examples/blob/master/src/test/java/com/amazonaws/services/dynamodbv2/DynamoDBLocalFixture.java#L32 and am trying to convert it to Gradle, but I'm getting errors for all of the com.amazonaws.services.dynamodbv2.local import statements it uses. The errors say the resource cannot be found.
I went into their project's pom and put the following into my build.gradle file to emulate it.
//dynamodb local dependencies
testCompile('com.amazonaws:aws-java-sdk-dynamodb:1.10.42')
testCompile('com.amazonaws:aws-java-sdk-cloudwatch:1.10.42')
testCompile('com.amazonaws:aws-java-sdk:1.3.0')
testCompile('com.amazonaws:amazon-kinesis-client:1.6.1')
testCompile('com.amazonaws:amazon-kinesis-connectors:1.1.1')
testCompile('com.amazonaws:dynamodb-streams-kinesis-adapter:1.0.2')
testCompile('com.amazonaws:DynamoDBLocal:1.10.5.1')
The import statements still fail. Here is an example of one that fails:
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
TL;DR
Has anyone managed to get the DynamoDB Local JAR to execute as part of a Gradle project, or does anyone have a link to a good tutorial (it doesn't have to be the tutorial I linked to)?
We have DynamoDB Local working with Gradle. Here's what you need to add to your build.gradle file:
For Gradle 4.x and below:
1) Add to the repositories section:
maven {
    url 'http://dynamodb-local.s3-website-us-west-2.amazonaws.com/release'
}
2) Add to the dependencies section (assuming you're using this for your tests):
testCompile group: 'com.amazonaws', name: 'DynamoDBLocal', version: '1.11.0'
3) These next two steps are the tricky part. First, copy the native files to a directory:
task copyNativeDeps(type: Copy) {
    from(configurations.testCompile) {
        include "*.dylib"
        include "*.so"
        include "*.dll"
    }
    into 'build/libs'
}
4) Then make sure you include this directory (build/libs in our case) in the Java library path, like so:
test.dependsOn copyNativeDeps
test.doFirst {
    systemProperty "java.library.path", 'build/libs'
}
Now you should be able to run ./gradlew test and have your tests hit your local DynamoDB.
For Gradle 5.x, the solution below works:
maven {
    url 'http://dynamodb-local.s3-website-us-west-2.amazonaws.com/release'
}

configurations {
    dynamodb
}

dependencies {
    testImplementation 'com.amazonaws:DynamoDBLocal:1.11.477'
    dynamodb fileTree(dir: 'lib', include: ["*.dylib", "*.so", "*.dll"])
    dynamodb 'com.amazonaws:DynamoDBLocal:1.11.477'
}

task copyNativeDeps(type: Copy) {
    from configurations.dynamodb
    into "$project.buildDir/libs/"
}

test.dependsOn copyNativeDeps
test.doFirst {
    systemProperty "java.library.path", 'build/libs'
}
I ran into the same problem. First I tried to add sqlite4java.library.path to the Gradle script, as mentioned in the other comments.
This worked from the command line but not when running the tests from the IDE (IntelliJ IDEA), so I finally came up with a simple init method that is called at the beginning of each integration test:
AwsDynamoDbLocalTestUtils.initSqLite();
AmazonDynamoDBLocal amazonDynamoDBLocal = DynamoDBEmbedded.create();
Implementation can be found here: https://github.com/redskap/aws-dynamodb-java-example-local-testing/blob/master/src/test/java/io/redskap/java/aws/dynamodb/example/local/testing/AwsDynamoDbLocalTestUtils.java
I put a whole example on GitHub; it might be helpful: https://github.com/redskap/aws-dynamodb-java-example-local-testing
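For illustration, a bare-bones JUnit test against the embedded server might look like this (a sketch based on the DynamoDBLocal API as I understand it; class and method names are worth verifying against your SDK version):

import static org.junit.Assert.assertNotNull;

import org.junit.Test;

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
import com.amazonaws.services.dynamodbv2.local.shared.access.AmazonDynamoDBLocal;

public class EmbeddedDynamoDbTest {
    @Test
    public void startsEmbeddedDynamoDb() {
        AwsDynamoDbLocalTestUtils.initSqLite(); // set sqlite4java.library.path first
        // Starts an in-process DynamoDB; no network port is opened.
        AmazonDynamoDBLocal embedded = DynamoDBEmbedded.create();
        try {
            AmazonDynamoDB client = embedded.amazonDynamoDB();
            assertNotNull(client.listTables());
        } finally {
            embedded.shutdown(); // release native sqlite resources
        }
    }
}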
In August 2018, Amazon announced a new Docker image with Amazon DynamoDB Local on board. It does not require downloading and running any JARs, or adding third-party OS-specific binaries like sqlite4java.
It is as simple as starting a Docker container before the tests:
docker run -p 8000:8000 amazon/dynamodb-local
You can do that manually for local development, as described above, or use it in your CI pipeline. Many CI services provide the ability to start additional containers during the pipeline that can provide dependencies for your tests. Here is an example for GitLab CI/CD:
test:
  stage: test
  image: openjdk:8-alpine
  services:
    - name: amazon/dynamodb-local
      alias: dynamodb-local
  script:
    - ./gradlew clean test
So, during the test task, DynamoDB will be available at http://dynamodb-local:8000.
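Your test code then just points the AWS SDK at that endpoint; a minimal sketch with the v1 SDK (the region value is arbitrary for DynamoDB Local):

import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;

public class LocalDynamoDbClientFactory {
    public static AmazonDynamoDB create() {
        // "dynamodb-local" is the service alias defined in the CI config above.
        return AmazonDynamoDBClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "http://dynamodb-local:8000", "us-east-1"))
                .build();
    }
}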
Another, even more powerful tool is localstack. It supports about two dozen AWS services; DynamoDB is one of them. The usage is very similar: you have to start it before running the tests, and it will expose AWS-compatible APIs on the given ports:
test:
  stage: test
  image: openjdk:8-alpine
  services:
    - name: localstack/localstack
      alias: localstack
  script:
    - ./gradlew clean test
The idea is to move all the configuration out of your build tool and tests and provide the dependency externally. Think of it as dependency injection / IoC, but for a whole service rather than a single bean. This way, your code is cleaner and more maintainable. You can see that even in the examples above: you can switch the mock implementation from DynamoDB Local to localstack by simply changing the image part!
The easiest way, in my opinion, is to:
Download the JAR from here:
http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html#DynamoDBLocal.DownloadingAndRunning
Then unzip the downloaded archive and add its contents to a /libs folder in the project (create the /libs folder first).
Finally, add to the build.gradle:
dependencies {
    runtime files('libs/DynamoDBLocal.jar')
}
I didn't want to create a specific Gradle configuration for DynamoDB for Gradle 6+, so I tweaked the original answer's instructions. Also, this is in the Kotlin Gradle DSL rather than Groovy.
val copyNativeDeps by tasks.creating(Copy::class) {
    from(configurations.testRuntimeClasspath) {
        include("*.dylib")
        include("*.so")
        include("*.dll")
    }
    into("$buildDir/libs")
}

tasks.withType<Test> {
    dependsOn.add(copyNativeDeps)
    doFirst { systemProperty("java.library.path", "$buildDir/libs") }
}
By leveraging the testRuntimeClasspath configuration, Gradle is able to locate the relevant files for you without needing a custom configuration. Obviously, this has the side effect that if your test runtime has many native deps, they will all be copied, which would make the custom-configuration approach more ideal.

Can't run Vert.x module for Eclipse project on Windows 7

I can't run a Vert.x module for an Eclipse project on Windows 7.
I have followed the instructions here: http://vertx.io/gradle_dev.html
downloaded the template https://github.com/vert-x/vertx-gradle-template
ran the tests:
cd vertx-gradle-template-master
gradlew.bat test
BUILD SUCCESSFUL
set up the IDE:
gradlew.bat eclipse
BUILD SUCCESSFUL
tried to run the module:
gradlew.bat runMod
I got this:
:collectDeps UP-TO-DATE
:runMod
Module directory build\mods\com.mycompany~my-module~1.0.0-final already exists.
Creating properties on demand (a.k.a. dynamic properties) has been deprecated and is scheduled to be removed in Gradle 2.0. Please read http://gradle.org/docs/current/dsl/org.gradle.api.plugins.ExtraPropertiesExtension.html for information on the replacement for dynamic properties.
Deprecated dynamic property: "args" on "task ':runMod'", value: "[runmod, com.mycompany...".
> Building 50% > :runMod
What should I do about this? I don't understand.
Actually, you got everything working!
The dev guide [0] provides the extra information you need. According to it, Gradle swallows INFO-level messages from the program. If you run it again with '-i', you will see the missing output, which indicates that things are actually working as expected.
Here's what I see when I run ./gradlew runmod -i (note that case doesn't matter; this works the same as 'runMod'), where you can see the log message from the PingVerticle class indicating it's waiting for ping messages:
...Same output as from the question...
PingVerticle started
Succeeded in deploying module
> Building 50% > :runMod
Unfortunately, the documentation is sorely lacking and the PingVerticle will just sit there indefinitely until you actually send a ping message yourself.
[0] http://vertx.io/dev_guide.html
