Jenkins: Is there any API to see test reports remotely? - java

I'm using Jenkins as a CI tool. I used the REST API to trigger a job remotely, but I don't know how to get the test results remotely as well.
I couldn't be more thankful if anybody knows a solution.

Use the XML or JSON API. On most Jenkins pages you can add /api/ to the URL and get the data in XML, JSON, and similar formats. So for a job you can go to <Jenkins URL>/job/<Job Name>/api/xml and get information about the job, its builds, etc. For a build, go to <Jenkins URL>/job/<Job Name>/<build number>/api/xml and you will get a summary of the build. Note that you can use the lastXXXBuild aliases to get the latest successful, stable, failing, or completed build, like this: <Jenkins URL>/job/<Job Name>/lastCompletedBuild/api/xml.
Additionally, if you're using a plugin which publishes test results to the build, then for a given job you can go to <Jenkins URL>/job/<Job Name>/lastCompletedBuild/testReport/api/xml and you will get an XML report with the results.
There is a lot more to it: you can control what is exported with the tree and depth parameters. For a summary, go to <Jenkins URL>/api/.
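For example, here is a minimal Groovy sketch of reading that report over HTTP. The Jenkins URL, job name, and user:apiToken credentials are placeholders, and the exact counters available depend on the plugin that published the results:
// Sketch: fetch the test report of the last completed build as JSON.
def jenkinsUrl = 'https://jenkins.example.com' // placeholder
def jobName = 'my-job'                         // placeholder
def url = new URL("${jenkinsUrl}/job/${jobName}/lastCompletedBuild/testReport/api/json")
def conn = url.openConnection()
// Basic auth with a Jenkins API token; drop this if anonymous read is allowed
conn.setRequestProperty('Authorization', 'Basic ' + 'user:apiToken'.bytes.encodeBase64().toString())
def report = new groovy.json.JsonSlurper().parse(conn.inputStream)
println "Failed: ${report.failCount}, passed: ${report.passCount}, skipped: ${report.skipCount}"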

Well, if you are using a Jenkins shared library, or have decided to permit the security exceptions (a less good approach), then you can access the results from a job and send them out to whatever you like - push vs. pull:
def getCurrentBuildFailedTests() {
    def failedTests = []
    def build = currentBuild.build()
    // getActions returns a non-empty list only when a JUnit result is attached to the build
    def action = build.getActions(hudson.tasks.junit.TestResultAction.class)
    if (action) {
        def failures = build.getAction(hudson.tasks.junit.TestResultAction.class).getFailedTests()
        println "${failures.size()} Test Results Found"
        for (def failure in failures) {
            failedTests.add(['name': failure.name, 'url': failure.url, 'details': failure.errorDetails])
        }
    }
    return failedTests
}
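Called from a pipeline it could look roughly like this (a sketch; it assumes the function above sits in the library's vars directory and that the JUnit plugin has already published results to the build):
node {
    // ... checkout, build, and junit steps here ...
    def failed = getCurrentBuildFailedTests()
    if (failed) {
        echo "Failing tests: ${failed.collect { it.name }.join(', ')}"
        // push the list wherever you like: chat, mail, an HTTP endpoint, ...
    }
}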

Related

Gradle task error: NullPointerException thrown while using ServerEvaluationCall to invoke XQuery module

I am new to Gradle and need to write a task for scheduling a MarkLogic backup, so I want to invoke an XQuery module that uses a config XML for getting the backup details.
I tried this:
task mlBackupTask(type: com.marklogic.gradle.task.ServerEvalTask) {
    def client = hubConfig.newStagingClient()
    println client
    //DatabaseClient client = DatabaseClientFactory.newClient(host, portno, new DatabaseClientFactory.DigestAuthContext(username, password))
    ServerEvaluationCall invoker = client.newServerEval();
    String result = invoker.modulePath("/admin/create-backup.xqy").addVariable("config-name", "dev").evalAs(String.class);
}
I tried both:
hubConfig.newStagingClient()
DatabaseClientFactory.newClient(host, portno, new DatabaseClientFactory.DigestAuthContext(username, password))
Neither works; I just get this error:
Execution failed for task ':mlBackupTask'.
java.lang.NullPointerException (no error message)
Could someone please help out on this?
Start with the docs at https://github.com/marklogic-community/ml-gradle/wiki/Writing-your-own-task. hubConfig.newStagingClient() will only work if you're using DHF, as hubConfig is specific to DHF.
Also, I think based on your code, what you really want is to use MarkLogicTask. The purpose of ServerEvalTask is to allow you to write a single line of JS or XQuery code. It looks like you want to write multiple lines of code, given a DatabaseClient. If so, use MarkLogicTask, and also put your code in a "doLast" block as shown in the docs.
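A rough sketch of that shape, untested and based on the wiki linked above (newClient() is the helper the wiki describes for MarkLogicTask; the module path and variable come from the question):
task mlBackupTask(type: com.marklogic.gradle.task.MarkLogicTask) {
    doLast {
        // Runs at execution time, not at Gradle configuration time
        def client = newClient()
        try {
            String result = client.newServerEval()
                    .modulePath("/admin/create-backup.xqy")
                    .addVariable("config-name", "dev")
                    .evalAs(String.class)
            println result
        } finally {
            client.release()
        }
    }
}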

Best practices for Spring boot testing against authenticated remote system

I have written code that leverages Azure SDK for Blobs in order to interact with the blob storage.
As a clever and dutiful developer, I did not test my code by navigating the live application, but rather created a Spring Boot JUnit test and spent a few hours fixing all my mistakes. I didn't use any kind of mocking; in fact, my problem was using the library the correct way. I ran the code against a live instance of a blob storage and checked that all my Java methods worked as expected.
I am writing here because:
To call it a day, I hardcoded the credentials in my source files. The repository is a company-private repository, so not much harm done: credentials can be rotated, and developers can all access the Azure portal and get the credentials. But I still don't like the idea of pushing credentials into code.
Having these JUnit tests run on Azure DevOps pipelines could be a good idea.
I have known from the very beginning that hardcoding credentials is a worst practice, but this morning I wanted to focus on my task. Now I want to adopt the best practices, so I am asking about redesigning the test structure.
The test code is below.
It creates an ephemeral container and tries to store/retrieve/delete blobs. It uses a GUID to create a unique private workspace, cleared after the test is finished.
@SpringBootTest(classes = FileRepositoryServiceAzureBlobImplTest.class)
@SpringBootConfiguration
@TestConfiguration
@TestPropertySource(properties = {
        "azure-storage-container-name:amlcbackendjunit",
        "azure-storage-connection-string:[not going to post it on Stackoverflow before rotating it]"
})
class FileRepositoryServiceAzureBlobImplTest {

    private static final Resource LOREM_IPSUM = new ClassPathResource("loremipsum.txt", FileRepositoryServiceAzureBlobImplTest.class);

    private FileRepositoryServiceAzureBlobImpl uut;
    private BlobContainerClient blobContainerClient;
    private String loremChecksum;

    @Value("${azure-storage-connection-string}")
    private String azureConnectionString;
    @Value("${azure-storage-container-name}")
    private String azureContainerName;

    @BeforeEach
    void beforeEach() throws IOException {
        String containerName = azureContainerName + "-" + UUID.randomUUID();
        blobContainerClient = new BlobContainerClientBuilder()
                .httpLogOptions(new HttpLogOptions().setApplicationId("az-sp-sb-aml"))
                .clientOptions(new ClientOptions().setApplicationId("az-sp-sb-aml"))
                .connectionString(azureConnectionString)
                .containerName(containerName)
                .buildClient();
        blobContainerClient.create();
        uut = spy(new FileRepositoryServiceAzureBlobImpl(blobContainerClient));
        try (InputStream loremIpsumInputStream = LOREM_IPSUM.getInputStream()) {
            loremChecksum = DigestUtils.sha256Hex(loremIpsumInputStream);
        }
        blobContainerClient
                .getBlobClient("fox.txt")
                .upload(BinaryData.fromString("The quick brown fox jumps over the lazy dog"));
    }

    @AfterEach
    void afterEach() throws IOException {
        blobContainerClient.delete();
    }

    @Test
    void store_ok() {
        String desiredFileName = "loremIpsum.txt";
        FileItemDescriptor output = assertDoesNotThrow(() -> uut.store(LOREM_IPSUM, desiredFileName));
        assertAll(
                () -> assertThat(output, is(notNullValue())),
                () -> assertThat(output, hasProperty("uri", hasToString(Matchers.startsWith("azure-blob://")))),
                () -> assertThat(output, hasProperty("size", equalTo(LOREM_IPSUM.contentLength()))),
                () -> assertThat(output, hasProperty("checksum", equalTo(loremChecksum))),
                () -> {
                    String localPart = substringAfter(output.getUri().toString(), "azure-blob://");
                    assertAll(
                            () -> assertTrue(blobContainerClient.getBlobClient(localPart).exists())
                    );
                }
        );
    }
}
In production (but also in SIT/UAT), the real Spring Boot application will get the configuration from the container environment, including the storage connection string. Yes, for this kind of test I could also avoid using Spring and @TestPropertySource, because I'm not leveraging any bean from the context.
Question
I want to ask how I can amend this test in order to:
Decouple the connection string from the code
Softly ignore the test if for some reason the connection string is not present (e.g. a developer has downloaded the project for the first time and wants to kick-start it) (note 1)
Integrate this test (with a working connection string) into Azure DevOps pipelines, where I can configure virtually any environment variable and such
Here is the build job that runs the tests:
- task: Gradle@2
  displayName: Build with Gradle
  inputs:
    gradleWrapperFile: gradlew
    gradleOptions: -Xmx3072m $(gradleJavaProperties)
    options: -Pci=true -PbuildId=$(Build.BuildId) -PreleaseType=${{parameters.releaseType}}
    jdkVersionOption: 1.11
    jdkArchitectureOption: x64
    publishJUnitResults: true
    sqAnalysisEnabled: true
    sqGradlePluginVersionChoice: specify
    sqGradlePluginVersion: 3.2.0
    testResultsFiles: '$(System.DefaultWorkingDirectory)/build/test-results/**/TEST-*.xml'
    tasks: clean build
Note 1: the live application can be kick-started without the storage connection string. It falls back to a local temporary directory.
The answer is a bit complex to explain, so I did my best.
TL;DR
Note that the original variable names are redacted and YMMV if you try to recreate the example with the exact keys I used
Create a secret pipeline variable containing the connection string, and bury it in the pipeline.
Example name: testStorageAccountConnectionString
Change the Gradle task:
- task: Gradle@3
  displayName: Build with Gradle
  inputs:
    gradleWrapperFile: gradlew
    gradleOptions: -Xmx10240m -XX:+HeapDumpOnOutOfMemoryError -Dfile.encoding=UTF-8 -DAZURE_STORAGE_CONNECTION_STRING=$(AZURE_STORAGE_CONNECTION_STRING)
    options: --build-cache -Pci=true -PgitCommitId=$(Build.SourceVersion) -PbuildId=$(Build.BuildId) -Preckon.stage=${{parameters.versionStage}} -Preckon.scope=${{parameters.versionScope}}
    jdkVersionOption: 1.11
    jdkArchitectureOption: x64
    publishJUnitResults: true
    sqAnalysisEnabled: true
    sqGradlePluginVersionChoice: specify
    sqGradlePluginVersion: 3.2.0
    testResultsFiles: '$(System.DefaultWorkingDirectory)/build/test-results/**/TEST-*.xml'
    tasks: clean build
  env:
    AZURE_STORAGE_CONNECTION_STRING: $(testStorageAccountConnectionString)
Explanation
Spring Boot resolves the placeholder ${azure.storageConnectionString} from an environment variable named AZURE_STORAGE_CONNECTION_STRING. Please read the docs and try it locally first. This means we need to run the test with that environment variable properly set in order to resolve the placeholder.
Gradle can be run with -D to pass a value into the build: -DAZURE_STORAGE_CONNECTION_STRING=$(AZURE_STORAGE_CONNECTION_STRING) hands AZURE_STORAGE_CONNECTION_STRING to the test run with the value of the pipeline variable of the same name (not very imaginative naming).
Azure DevOps pipelines protect secret variables from unwanted access. We created the pipeline variable as secret, so there is another trick to do first.
The task's env attribute sets environment variables for the pipeline container. In this case, we make sure that Gradle runs with AZURE_STORAGE_CONNECTION_STRING set to the value of testStorageAccountConnectionString. env is the only place where the Azure Pipelines agent will resolve and expose the content of a secret variable.
Secrets cannot be retrieved from the web interface any more; Azure Pipelines is designed for this.
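For the "softly ignore" requirement, one way (a sketch of mine, not part of the original answer) is a JUnit 5 assumption at the top of beforeEach(), combined with an empty default in the @Value expression, e.g. ${azure-storage-connection-string:}:
import static org.junit.jupiter.api.Assumptions.assumeTrue;

@BeforeEach
void beforeEach() throws IOException {
    // Mark the test as skipped (not failed) when no connection string is configured,
    // e.g. for a developer who has just cloned the project.
    assumeTrue(azureConnectionString != null && !azureConnectionString.isEmpty(),
            "azure-storage-connection-string not set; skipping live Azure test");
    // ... the rest of the original setup ...
}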

gRPC: the protoc compiler does not generate what I expected

I am starting with gRPC, building a simple Java chat program.
protoc --version prints libprotoc 3.5.1.
The .proto file:
syntax = "proto3";
option java_multiple_files = true;
option java_package = "grpc";
// whihout this Option i get no service
option java_generic_services = true;
option java_outer_classname = "ChatProto";
option objc_class_prefix = "HLW";
package chat;
message ClientPost {
string name = 1;
string value = 2;
}
message ServerReply {
ClientPost back = 1;
}
// The service definition.
service Verbindung {
rpc ChatService (stream ClientPost) returns (stream ServerReply);
}
// file end
1. Why do I need to set the option java_generic_services?
class ChatImpl extends grpc.Verbindung {
    @Override
    public void chatService(RpcController controller, ClientPost request, RpcCallback done) {
        // why do I get this kind of service?
    }
}
2. Why do I get a different class name? It should be VerbindungImplBase.
Expected function:
public void sayHello(HelloRequest req, StreamObserver<HelloReply> responseObserver) { }
What must I do to get this kind of expected service function?
Maybe a wrong protoc compiler / wrong installation / missing parts?
You're likely not running the gRPC code generator. Without the full configuration of how you're running protoc I can't point out too much detail, but you are likely only generating the protobuf messages, plus the generic services enabled via java_generic_services=true.
You shouldn't need java_generic_services=true. Instead, you should generate the messages like you are now, but then also use the grpc-java plugin. There's documentation for when running protoc manually and our main documentation documents the preferred method, using Maven or Gradle plugins.
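For instance, with Gradle the protobuf plugin can wire in the gRPC codegen so protoc runs with the grpc-java plugin automatically. A sketch (plugin and library versions are examples only, not a tested build):
plugins {
    id 'java'
    id 'com.google.protobuf' version '0.8.8' // example version
}
protobuf {
    protoc { artifact = 'com.google.protobuf:protoc:3.5.1' }
    plugins {
        grpc { artifact = 'io.grpc:protoc-gen-grpc-java:1.16.1' } // example version
    }
    generateProtoTasks {
        // attach the grpc plugin to every protoc invocation
        all()*.plugins { grpc {} }
    }
}
dependencies {
    implementation 'io.grpc:grpc-netty:1.16.1'
    implementation 'io.grpc:grpc-protobuf:1.16.1'
    implementation 'io.grpc:grpc-stub:1.16.1'
}
With this, java_generic_services can be dropped and the generator produces VerbindungGrpc with a VerbindungImplBase to extend.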
I have an openSUSE Leap 42.2 system; this version knows nothing about gRPC, so no support from that side.
I got the precompiled protoc; it comes without the needed Java codegen plugin.
I found https://github.com/grpc/grpc-java/blob/master/compiler/README.md:
"Normally you don't need to compile the codegen by yourself, since pre-compiled binaries for common platforms are available on Maven Central."
I found only some .exe files - not useful.
"Change to the compiler directory:"
I have no compiler directory - still trying to find out where I can get one.
NetBeans only has an editor plugin for proto files, so my IDE can't handle gRPC; maybe the Maven plugins for gRPC are helpful for other IDEs.
I expected a full protoc compiler with all needed plugins :-) not an "install the tool" adventure.
At the moment, for me: gRPC - nice idea, but I got an "install gRPC" adventure.

Cannot run Shared Groovy library function

I am in the process of setting up Jenkins pipeline builds and am starting to use the same methods across multiple jobs, so it's time to put these common methods into a shared library.
The first function I have created updates GitHub with the result of some unit tests. I can run this function fine from the command line, but when it comes to using it within my Jenkins build it does not work, and I cannot seem to get any debug output in the Jenkins console.
This is the directory structure of my shared library:
my-project
  src
  vars
    - getCommitId.groovy
    - gitUpdateStatus.groovy
The first function, getCommitId, works fine:
#!/usr/bin/env groovy
def call() {
    commit_id = sh script: 'git rev-parse HEAD', returnStdout: true
    commit_id = commit_id.replaceAll("\\s", "") // Remove whitespace
    return commit_id
}
This returns the correct value.
This is gitUpdateStatus:
#!/usr/bin/env groovy
@Grab(group='org.codehaus.groovy.modules.http-builder', module='http-builder', version='0.7')
import static groovyx.net.http.ContentType.JSON
import static groovyx.net.http.Method.POST
import groovyx.net.http.HTTPBuilder

String targetUrl = 'https://api.github.com/repos/myRepo/'
def http = new HTTPBuilder(targetUrl)
http.request(POST) {
    uri.path = "repo/statuses/12345678"
    requestContentType = JSON
    body = [state: 'success', description: 'Jenkins Unit Tests', target_url: 'http://test.co.uk', context: 'unit tests']
    headers.'Authorization' = "token myOauthTokenHere"
    headers.'User-Agent' = 'Jenkins Status Update'
    headers.Accept = 'application/json'

    response.success = { resp, json ->
        println "GitHub updated successfully! ${resp.status}"
    }
    response.failure = { resp, json ->
        println "GitHub update Failure! ${resp.status} " + json.message
    }
}
I can run this fine via the command line, but I get no output when it runs as part of a Jenkins build.
My Jenkinsfile:
@Library('echo-jenkins-shared') _
node {
    GIT_COMMIT_ID = getGitCommitId()
    echo "GIT COMMIT ID: ${GIT_COMMIT_ID}"
    gitUpdateStatus(GIT_COMMIT_ID)
}
Why would this not work, or could it be converted to use only native Groovy methods?
First off, I would advise you to use a service like https://requestb.in to check whether your code actually performs HTTP calls.
Second, I would recommend NOT using @Grab-based dependencies like HTTPBuilder in Jenkins pipelines, but the http_request plugin instead, downloadable and installable as a .hpi:
https://jenkins.io/doc/pipeline/steps/http_request/
Finally, you can find an example of utility class to perform HTTP requests here:
https://github.com/voyages-sncf-technologies/hesperides-jenkins-lib/blob/master/src/com/vsct/dt/hesperides/jenkins/pipelines/http/HTTPBuilderRequester.groovy
With the rationale behind it explained there: https://github.com/voyages-sncf-technologies/hesperides-jenkins-lib#httprequester
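With the http_request plugin installed, the status update could look roughly like this (a sketch; the URL path, token, and target_url are the question's own placeholder values):
// vars/gitUpdateStatus.groovy - sketch using the HTTP Request plugin's httpRequest step
import groovy.json.JsonOutput

def call(String commitId) {
    def body = JsonOutput.toJson([state: 'success', description: 'Jenkins Unit Tests',
                                  target_url: 'http://test.co.uk', context: 'unit tests'])
    def response = httpRequest(
            url: "https://api.github.com/repos/myRepo/repo/statuses/${commitId}",
            httpMode: 'POST',
            contentType: 'APPLICATION_JSON',
            requestBody: body,
            customHeaders: [[name: 'Authorization', value: 'token myOauthTokenHere', maskValue: true]]
    )
    echo "GitHub responded ${response.status}"
}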

Cannot get API hostname via System property in Java

I recently got the code for writing BDD tests with Cucumber in Java. There is already a Maven project with a couple of tests and a test framework, and I need to continue writing BDD tests using this framework.
I am writing API tests, and when I try to run them I get an error. I found where it fails, but I want to figure out the idea behind doing it this way in the code. Let me share some of it:
The test framework collects the API host name this way:
public class AnyClass {
    private static final String API_HOSTNAME = "hostname";

    private static String getAPIHostName() {
        String apiHostName = System.getProperty(API_HOSTNAME);
        ...
    }
}
When I leave it as is and run the test, I get an error that the host name is empty.
Can you advise what value is expected under the System property key "hostname"?
P.S. I tried to use http://localhost and http://127.0.0.1, where my API is located, instead of assigning the system property, but it cannot find such a host name.
Can you advise what value is expected under the System property key "hostname"?
Yes, I needed to run the tests from the command line with syntax like:
mvn clean verify -Dhostname=http://127.0.0.1:8080
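If the tests should also be runnable without the flag, System.getProperty has a two-argument form with a default; a sketch (the fallback URL is just an example):
// Fall back to a local API when -Dhostname is not supplied
String apiHostName = System.getProperty("hostname", "http://127.0.0.1:8080");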
