I want to speed up a Jenkins job by parallelizing the test stages. The idea is to move each test stage to a separate node. What I found is the parallel keyword. When I used it, I ran into several problems:
The timing seems to be different. I had to adapt several tests.
Mocking seems to have issues in parallel mode.
It also seems that the parallel stages are running on the same machine/node. Is there a way to force the different stages to run on different nodes so that they don't influence each other?
One of the errors I get:
org.mockito.exceptions.misusing.UnfinishedMockingSessionException:
Unfinished mocking session detected.
Previous MockitoSession was not concluded with 'finishMocking()'.
For examples of correct usage see javadoc for MockitoSession class.
The Jenkins file:
pipeline {
    options {
        timeout(time: 120, unit: 'MINUTES')
        buildDiscarder(logRotator(numToKeepStr: env.BRANCH_NAME == 'master' ? '30' : '5'))
        skipDefaultCheckout()
    }
    agent {
        label 'win10'
    }
    stages {
        stage('cleanWs and checkoutCode') {
            steps {
                cleanWs()
                script {
                    checkoutCode()
                }
            }
        }
        stage('build') {
            steps {
                bat 'gradlew showJavaVersion'
                bat 'gradlew printVersion'
                bat 'gradlew compileKotlin compileTestKotlin compileJava compileTestJava compileIntTestJava --parallel'
                bat 'git status'
            }
        }
        stage('tests') {
            parallel {
                stage("unit tests without ui") {
                    environment {
                        // give each build testing its own random port
                        BDP_HTTP_MOCK_PORT = 0
                    }
                    steps {
                        bat 'gradlew test -x :ui:test -x :bdp-ui:test -x :bdp-mock:test -P headless=true -P maxParallelIntegrationTests=3 -P os=win --parallel'
                    }
                }
                stage("mock tests") {
                    environment {
                        // give each build testing its own random port
                        BDP_HTTP_MOCK_PORT = 0
                    }
                    steps {
                        bat 'gradlew :bdp-mock:test -P headless=true -P maxParallelIntegrationTests=3 -P os=win --parallel'
                    }
                }
                stage("unit tests with ui") {
                    environment {
                        // give each build testing its own random port
                        BDP_HTTP_MOCK_PORT = 0
                    }
                    steps {
                        bat 'gradlew :ui:test :bdp-ui:test -P headless=true -P maxParallelIntegrationTests=3 -P os=win --parallel'
                    }
                }
            } // parallel
        } // stage('tests')
    }
}
Several ways of fixing this:
Use only a single slot on the test machines:
Pros: Very simple, foolproof way of avoiding interference between tests
Cons: Using a single slot might be wasteful if the machine could actually handle more
Use locks, and make sure the NODE_NAME is part of each lock's name:
Pros: Fine-grained control over which parts of the pipeline need locking
Cons: Locks won't influence where a stage is scheduled; they just make processes that want the lock wait
Apply a different scheduling strategy, for example Least Load or round robin. This will distribute load better and make clashes less likely:
Pros: Better load balancing
Cons: Doesn't actually prevent clashes
In my opinion, a combination of using locks and changing the scheduler is the right way to do this (unless someone comes up with a better way I don't know about). Locks guarantee that no tests will interfere, and round-robin / Least Load scheduling makes it less likely that a process has to wait for a lock to be released.
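The locking option can be sketched in declarative pipeline with the Lockable Resources plugin's lock step, putting NODE_NAME into the resource name as suggested. A minimal sketch, assuming that plugin is installed; the stage and Gradle invocation are taken from the question, the lock name is illustrative:

```groovy
stage("mock tests") {
    steps {
        // one lock per node: parallel branches that land on the same machine
        // queue up here, while branches on other machines proceed independently
        lock(resource: "bdp-mock-tests-${env.NODE_NAME}") {
            bat 'gradlew :bdp-mock:test -P headless=true -P os=win'
        }
    }
}
```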
Related
I am currently using a plugin to run benchmarks on Java code; specifically, I use jmh-gradle-plugin. The plugin lets you conveniently describe JMH configurations with a jmh block:
jmh {
    include = ["List"]
    benchmarkParameters = ["seed": ["1", "2", "3", "4", "5"],
                           "applicationSize": ["100", "1000"],
                           "baseStructureSize": ["100", "1000"]]
    fork = 2
    timeOnIteration = "250ms"
    iterations = 2
    warmup = "250ms"
    warmupIterations = 2
    resultFormat = "CSV"
    benchmarkMode = ["ss"]
    timeUnit = "ms"
    failOnError = true
}
This is useful, but I would like to have different variants of the same task: for instance, one where the output is CSV and one where it is JSON. I know the format can be configured with resultFormat=<format>, but I could not find a way to "duplicate" the task and give each variant its own configuration.
The Gradle documentation has a page about configuring tasks, but they configure a Copy task. I thought I could follow a similar approach and write:
task myJMH(type: me.champeau.gradle.JMHTask) {
    resultFormat = "JSON"
}
But this approach does not work, as I mentioned in this issue. I think the JMH task is simply different: registering a task of that type works, but it is not possible to configure it. I get the following error:
Could not set unknown property 'include' for task ':myJMH' of type me.champeau.gradle.JMHTask.
Similarly, I would like to have various configurations of the shadowJar task, to be able to generate several different variants of the task, but I had the same problem.
The jmh in your first example is not a task, but an extension. The plugin registers both an extension and a task with the same name. Actually, this is a prevalent pattern for Gradle plugins.
Usually, even if the tasks created by a plugin may be configured using an extension, it is still possible to configure them directly, as they still provide configuration properties. This is the case for the task type ShadowJar, so you can simply create tasks of that type manually:
// Shadowing Test Sources and Dependencies
import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar

task testJar(type: ShadowJar) {
    classifier = 'tests'
    from sourceSets.test.output
    configurations = [project.configurations.testRuntime]
}
Sadly, the task type JMHTask is implemented in a way that it simply retrieves the configuration from the extension, so any instance will use the same configuration.
However, you may try the following workaround:
Create tasks that configure the extension and then wire them to be executed together with the jmh task:
jmh {
    include = ["List"]
    benchmarkParameters = ["seed": ["1", "2", "3", "4", "5"],
                           "applicationSize": ["100", "1000"],
                           "baseStructureSize": ["100", "1000"]]
    fork = 2
    timeOnIteration = "250ms"
    iterations = 2
    warmup = "250ms"
    warmupIterations = 2
    benchmarkMode = ["ss"]
    timeUnit = "ms"
    failOnError = true
}
task jmhCsv {
    doFirst {
        jmh.resultFormat = "CSV"
    }
    finalizedBy 'jmh'
}
task jmhJson {
    doFirst {
        jmh.resultFormat = "JSON"
    }
    finalizedBy 'jmh'
}
Please note that a task may only be executed once in a build, so this workaround won't work if you want to run different configurations in the same run.
I have a Jenkins pipeline:
@Library('sharedLib@master')
import org.foo.point
pipeline {
    agent { label 'slaveone' }
    // agent { label 'master' }
    stages {
        stage('Data Build') {
            steps {
                script {
                    def Point = new point()
                    Point.hello("mememe")
                }
            }
        }
    }
}
which runs a small bit of code in a library called 'jenkins-shared-library/src/sharedLib':
package org.foo

import java.io.File

class point {
    def hello(name) {
        File saveFile = new File("c:/temp/jenkins_log.txt")
        saveFile.write "hello"
    }
}
It runs fine on both 'master' and 'slaveone', but in both cases the 'jenkins_log.txt' file appears on the master. The build log contains this:
Running on slaveone in d:\Jenkins_WorkDir\workspace\mypipeline
How is this code running on slaveone and writing files to master?
Edit: I should also mention that this is my third attempt at doing this. The first one was with Groovy code direct in the pipeline, and the second was using a 'def' type call in the vars directory. Both produced the same behaviour, seemingly oblivious to the agent it was being run on.
I think everything inside the script block (and shared-library Groovy code in general) runs on the master; only Pipeline steps such as bat, sh and writeFile execute on the agent, which is why new File(...) writes to the master's file system. Here I found a workaround: Jenkins Declarative Pipeline, run groovy script on slave agent
Jenkins stores all logs on the master only; that is why you cannot find any log on the nodes.
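A minimal sketch of the workaround idea: pass the enclosing pipeline script object into the library class so it can call Pipeline steps, which run on the current node rather than the master (the steps parameter and file name below are illustrative, not from the original code):

```groovy
package org.foo

class point {
    // 'steps' is the pipeline script object, passed in from the Jenkinsfile
    // as e.g. new point().hello(this, "mememe")
    def hello(steps, name) {
        // writeFile is a Pipeline step, so it writes into the workspace of the
        // agent the enclosing stage runs on, not onto the master
        steps.writeFile file: 'jenkins_log.txt', text: "hello ${name}"
    }
}
```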
I have a pipeline job using Groovy script set up to run multiple tests in "parallel", but I am curious as to how to get the report(s) unified.
I am coding my Selenium tests in Java and using TestNG and Maven.
When I look at the report in target/surefire-reports, the only thing there is the result of the "last" suite that ran.
How can I get a report that combines all of the tests within the Pipeline parallel job?
Example Groovy code:
node() {
    try {
        parallel 'exampleScripts': {
            node() {
                stage('ExampleScripts') {
                    def mvnHome
                    mvnHome = tool 'MAVEN_HOME'
                    env.JAVA_HOME = tool 'JDK-1.8'
                    bat(/"${mvnHome}\bin\mvn" -f "C:\workspace\Company\pom.xml" test -DsuiteXmlFile=ExampleScripts.xml -DenvironmentParam="$ENVIRONMENTPARAM" -DbrowserParam="$BROWSERPARAM" -DdebugParam="false"/)
                } // end stage
            } // end node
        },
        'exampleScripts2': {
            node() {
                stage('ExampleScripts2') {
                    def mvnHome
                    mvnHome = tool 'MAVEN_HOME'
                    env.JAVA_HOME = tool 'JDK-1.8'
                    bat(/"${mvnHome}\bin\mvn" -f "C:\workspace\Company\pom.xml" test -DsuiteXmlFile=ExampleScripts2.xml -DenvironmentParam="$ENVIRONMENTPARAM" -DbrowserParam="$BROWSERPARAM" -DdebugParam="false"/)
                } // end stage
            } // end node
            step([$class: 'Publisher', reportFilenamePattern: 'C:/workspace/Company/target/surefire-reports/testng-results.xml'])
        } // end parallel
There is a little more to this code after this in terms of emailing the test runner the result of the test and such.
This works great, other than the reporting aspect.
I prefer to use ExtentReports because it has an ExtentX server that allows you to report on multiple different test reports.
I used to use ReportNG, but development on that stalled, so I don't recommend it any more. It doesn't let you combine reports anyway.
Other than that, you could use CouchBase or similar JSON database to store test results and then generate your own report from that information.
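One more thing worth checking: both branches in the question point Maven at the same absolute path (C:\workspace\Company\pom.xml), so the second suite overwrites the first suite's target/surefire-reports. If each branch instead runs against the pom in its own node's workspace, Jenkins' built-in junit step can record each branch's results and merge them into a single test report for the build. A minimal sketch (a different technique from ExtentReports; the report pattern assumes Surefire's JUnit-style TEST-*.xml output):

```groovy
node() {
    stage('ExampleScripts') {
        // run Maven against the pom checked out into this node's workspace
        bat(/"${mvnHome}\bin\mvn" test -DsuiteXmlFile=ExampleScripts.xml/)
        // record this branch's results; results recorded the same way by
        // parallel branches are aggregated into one report for the build
        junit 'target/surefire-reports/TEST-*.xml'
    }
}
```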
I'm investigating some unit test failures. The tests pass on an old build server that's been hand-configured (and not documented). I'm trying to run them in a clean virtual machine.
My latest problem is a unit test that creates 10K threads.
for (int i = 0; i < 10000; i++) {
    final Thread thread = new Thread(new Runnable() { ... });
    threads.add(thread);
    thread.start();
}
Well, the max user processes in the clean environment is only 4K.
$ ulimit -u
4096
I was wondering if there's some way for Java to get at that limit. The test really doesn't need 10K threads; it just needs some arbitrarily large number.
You could ask ulimit directly; note that ulimit is a shell builtin, not an executable, so it has to be invoked through a shell:
Runtime.getRuntime().exec(new String[]{"/bin/sh", "-c", "ulimit -u"})
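Since ulimit is a shell builtin, it has to be run through a shell and its output read back. A minimal sketch of reading the per-user process limit (the class name and the choice to map "unlimited" to Integer.MAX_VALUE are my own):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class UlimitCheck {
    public static void main(String[] args) throws Exception {
        // ulimit is a shell builtin, so invoke it via /bin/sh -c
        Process p = new ProcessBuilder("/bin/sh", "-c", "ulimit -u").start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line = r.readLine();
            // "unlimited" is capped to Integer.MAX_VALUE here (an arbitrary choice)
            int maxProcs = "unlimited".equals(line)
                    ? Integer.MAX_VALUE
                    : Integer.parseInt(line.trim());
            System.out.println("max user processes: " + maxProcs);
        }
        p.waitFor();
    }
}
```

The test could then size its thread count from this value instead of hard-coding 10K.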
I am working on a program written in Java which, for some actions, launches external programs using user-configured command lines. Currently it uses Runtime.exec() and does not retain the Process reference (the launched programs are either a text editor or archive utility, so no need for the system in/out/err streams).
There is a minor problem with this, though: when the Java program exits, it doesn't really quit until all the launched programs have exited.
I would greatly prefer it if the launched programs were completely independent of the JVM which launched them.
The target operating system is multiple, with Windows, Linux and Mac being the minimum, but any GUI system with a JVM is really what is desired (hence the user configurability of the actual command lines).
Does anyone know how to make the launched program execute completely independently of the JVM?
Edit in response to a comment
The launch code is as follows. The code may launch an editor positioned at a specific line and column, or it may launch an archive viewer. Quoted values in the configured command line are treated as ECMA-262 encoded, and are decoded and the quotes stripped to form the desired exec parameter.
The launch occurs on the EDT.
static Throwable launch(String cmd, File fil, int lin, int col) throws Throwable {
    String frs[][] = {
        { "$FILE$",   fil.getAbsolutePath().replace('\\','/') },
        { "$LINE$",   (lin>0 ? Integer.toString(lin) : "") },
        { "$COLUMN$", (col>0 ? Integer.toString(col) : "") },
    };
    String[] arr; // array of parsed tokens (exec(cmd) does not handle quoted values)
    cmd = TextUtil.replace(cmd, frs, true, "$$", "$");
    arr = (String[])ArrayUtil.removeNulls(TextUtil.stringComponents(cmd, ' ', -1, true, true, true));
    for(int xa=0; xa<arr.length; xa++) {
        if(TextUtil.isQuoted(arr[xa], true)) {
            arr[xa] = TextDecode.ecma262(TextUtil.stripQuotes(arr[xa]));
        }
    }
    log.println("Launching: " + cmd);
    Runtime.getRuntime().exec(arr);
    return null;
}
This appears to be happening only when the program is launched from my IDE. I am closing this question since the problem exists only in my development environment; it is not a problem in production. From the test program in one of the answers, and further testing I have conducted I am satisfied that it is not a problem that will be seen by any user of the program on any platform.
There is a parent-child relation between your processes, and you have to break it.
For Windows you can try:
Runtime.getRuntime().exec("cmd /c start editor.exe");
For Linux the process seems to run detached anyway; no nohup necessary.
I tried it with gvim, midori and acroread.
import java.io.IOException;

public class Exec {
    public static void main(String[] args) {
        try {
            Runtime.getRuntime().exec("/usr/bin/acroread");
        } catch (IOException e) {
            e.printStackTrace();
        }
        System.out.println("Finished");
    }
}
I think it is not possible to do it with Runtime.exec in a platform-independent way.
For a POSIX-compatible system, background the command under nohup so the shell returns immediately and the child survives the JVM:
Runtime.getRuntime().exec(new String[]{"/bin/sh", "-c", "nohup your command > /dev/null 2>&1 &"});
I have some observations that may help other people facing a similar issue.
When you use Runtime.getRuntime().exec() and then ignore the java.lang.Process handle you get back (as in the code from the original poster), there is a chance that the launched process may hang.
I have faced this issue in a Windows environment and traced the problem to the stdout and stderr streams. If the launched application writes to these streams and their buffers fill up, the launched application may appear to hang when it tries to write to them. The solutions are:
Capture the Process handle and empty out the streams continually - but if you want to terminate the Java application right after launching the process, this is not a feasible solution
Execute the process call as cmd /c <<process>> (this is only for Windows environments)
Suffix the process command to redirect stdout and stderr to nul using 'command > nul 2>&1'
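The first option above (keep the handle and drain the streams) can be sketched like this; the /bin/sh echo command is just a stand-in for the real child process, and the class name is my own:

```java
import java.io.IOException;
import java.io.InputStream;

public class DrainDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        // merge stderr into stdout so a single reader drains both
        Process p = new ProcessBuilder("/bin/sh", "-c", "echo hello")
                .redirectErrorStream(true)
                .start();
        // drain the child's output on a background thread so a full
        // pipe buffer can never block the child
        Thread drainer = new Thread(new Runnable() {
            public void run() {
                try (InputStream in = p.getInputStream()) {
                    byte[] buf = new byte[8192];
                    while (in.read(buf) != -1) { /* discard */ }
                } catch (IOException ignored) { }
            }
        });
        drainer.setDaemon(true); // don't keep the JVM alive for the drainer
        drainer.start();
        System.out.println("exit code: " + p.waitFor());
    }
}
```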
It may help if you post a test section of minimal code needed to reproduce the problem. I tested the following code on Windows and a Linux system.
public class Main {
    /**
     * @param args the command line arguments
     */
    public static void main(String[] args) throws Exception {
        Runtime.getRuntime().exec(args[0]);
    }
}
And tested with the following on Linux:
java -jar JustForTesting.jar /home/monceaux/Desktop/__TMP/test.sh
where test.sh looks like:
#!/bin/bash
ping -i 20 localhost
as well as this on Linux:
java -jar JustForTesting.jar gedit
And tested this on Windows:
java -jar JustForTesting.jar notepad.exe
All of these launched their intended programs, but the Java application had no problems exiting. I have the following versions of Sun's JVM as reported by java -version :
Windows: 1.6.0_13-b03
Linux: 1.6.0_10-b33
I have not had a chance to test on my Mac yet. Perhaps there is some interaction occurring with other code in your project that may not be clear. You may want to try this test app and see what the results are.
You want to launch the program in the background, and separate it from the parent. I'd consider nohup(1).
I suspect this would require an actual process fork. Basically, the C equivalent of what you want is:
pid_t id = fork();
if (id == 0)
    system(command_line);
The problem is you can't do a fork() in pure Java. What I would do is:
Thread t = new Thread(new Runnable() {
    public void run() {
        try {
            Runtime.getRuntime().exec(command);
        } catch (IOException e) {
            // Handle error.
            e.printStackTrace();
        }
    }
});
t.start();
That way the JVM still won't exit, but no GUI and only a limited memory footprint will remain.
I tried everything mentioned here, but without success. The main parent Java process can't quit until its subprocess quits, even with cmd /c start and with streams redirected to nul.
Only one reliable solution for me is this:
try {
    Runtime.getRuntime().exec("psexec -i cmd /c start cmd.cmd");
} catch (Exception e) {
    // handle it
}
I know this is not a clean solution, but this small utility from SysInternals is well proven and very helpful. Here is the link.
One way I can think of is to use Runtime.addShutdownHook to register a thread that kills off all the launched processes (you'd need to retain the Process objects somewhere, of course).
The shutdown hook is only called when the JVM exits so it should work fine.
A little bit of a hack but effective.
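The shutdown-hook idea can be sketched as follows; the class name, the sleep command (a stand-in for the real child), and the synchronized list are my own choices:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ShutdownHookDemo {
    public static void main(String[] args) throws Exception {
        // retain every launched Process so the hook can find them later
        final List<Process> children =
                Collections.synchronizedList(new ArrayList<Process>());
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override public void run() {
                // runs when the JVM exits: kill any child still alive
                synchronized (children) {
                    for (Process p : children) {
                        p.destroy();
                    }
                }
            }
        });
        // launch a long-running child (stand-in for the real program)
        children.add(new ProcessBuilder("/bin/sh", "-c", "sleep 60").start());
        System.out.println("registered hook for " + children.size() + " process(es)");
        // main returns here; the hook fires on JVM exit and destroys the child
    }
}
```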