Parallel Test Execution with Gradle maxParallelForks property - java

I have been searching for an answer for the last few days and have been unable to find one. The closest answer I could find is this, which does not exactly answer the questions I have.
By the way, I have a Selenium test project which is based on Gradle. We build the project on Jenkins and run the tests in 20 concurrent threads. The total number of unique test classes I have is 87, so I expect Gradle to execute at least 5 batches. The test project is built using Cucumber JVM, and Jenkins builds it and triggers the tests against a Selenium Hub. I tried to increase the parallelism of the tests to utilize the grid as much as possible, but the problems started when the number of tests started growing.
When I start the tests from Jenkins, the first batch executes all 20 test processes, and I can see the second batch start with the same number of processes. After the second batch, execution drops back to a single process, and the entire job takes 14 hours to complete, which defeats the purpose of parallel test execution.
Gradle properties:
jvmArgs '-Xms128m', '-Xmx1024m', '-XX:MaxPermSize=128m'
Runtime.runtime.availableProcessors().toString()) as int
maxParallelForks = PropertyUtils.getProperty('test.parallel', '15') as int
forkEvery = PropertyUtils.getProperty('test.forkEvery', '0') as int
CLI:
gradle clean test -Dtest.single=*TestRun --info
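For context, a minimal sketch of how those properties would typically sit inside the test task configuration of build.gradle, assuming PropertyUtils is the project's own helper for reading system properties (it is not a Gradle API):

test {
    jvmArgs '-Xms128m', '-Xmx1024m', '-XX:MaxPermSize=128m'
    // maximum number of worker JVMs Gradle may start at once, overridable via -Dtest.parallel
    maxParallelForks = PropertyUtils.getProperty('test.parallel', '15') as int
    // 0 means no limit, so each worker JVM is reused for any number of test classes
    forkEvery = PropertyUtils.getProperty('test.forkEvery', '0') as int
}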
I have read all the documentation I can possibly find but failed to get an answer. It would be greatly appreciated if someone could help me with these questions:
1. How does Gradle batch the test classes internally? For example, if 20 executors start and tests 1, 2 and 3 finish executing faster than the others, do those three executors get three more test classes, or do they wait for the entire batch to finish executing?
2. Can forkEvery impact how the execution works during parallel testing?
Jenkins log
Successfully started process 'Gradle Test Executor 6'
Successfully started process 'Gradle Test Executor 13'
Successfully started process 'Gradle Test Executor 14'
Successfully started process 'Gradle Test Executor 5'
Successfully started process 'Gradle Test Executor 16'
Successfully started process 'Gradle Test Executor 8'
Successfully started process 'Gradle Test Executor 19'
Successfully started process 'Gradle Test Executor 4'
Successfully started process 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 11'
Successfully started process 'Gradle Test Executor 10'
Successfully started process 'Gradle Test Executor 18'
Successfully started process 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 20'
Successfully started process 'Gradle Test Executor 7'
Successfully started process 'Gradle Test Executor 9'
Successfully started process 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 15'
Successfully started process 'Gradle Test Executor 17'
Successfully started process 'Gradle Test Executor 12'
Gradle Test Executor 13 started executing tests.
Gradle Test Executor 14 started executing tests.
Gradle Test Executor 6 started executing tests.
Gradle Test Executor 5 started executing tests.
Gradle Test Executor 16 started executing tests.
Gradle Test Executor 19 started executing tests.
Gradle Test Executor 8 started executing tests.
Gradle Test Executor 4 started executing tests.
Gradle Test Executor 2 started executing tests.
Gradle Test Executor 10 started executing tests.
Gradle Test Executor 11 started executing tests.
Gradle Test Executor 18 started executing tests.
Gradle Test Executor 1 started executing tests.
Gradle Test Executor 20 started executing tests.
Gradle Test Executor 7 started executing tests.
Gradle Test Executor 3 started executing tests.
Gradle Test Executor 9 started executing tests.
Gradle Test Executor 17 started executing tests.
Gradle Test Executor 15 started executing tests.
Gradle Test Executor 12 started executing tests.

The default value of forkEvery is 0.
According to the documentation, forkEvery is:
The maximum number of test classes to execute in a forked test process. The forked test process will be restarted when this limit is reached. The default value is 0 (no maximum).
So Gradle (and probably JUnit) forks by class, not by the tests within a class. It sounds like a few of the 87 test classes have long-running tests or a large number of tests, and they end up in one forked test process. I would consider setting forkEvery to 1. This ensures that each test class is sent to a new fork. If there is still an issue, you may need to find out which test classes take the most time. Consider splitting these classes up into smaller groups of tests so the tests get spread over each JVM. If it is one test that takes forever, consider redesigning it and possibly creating smaller tests from it.
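As a sketch of that suggestion, reusing the PropertyUtils helper from the question and changing only the forkEvery value:

test {
    maxParallelForks = PropertyUtils.getProperty('test.parallel', '15') as int
    // fork a fresh test JVM for every test class instead of reusing workers indefinitely
    forkEvery = 1
}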
I do not believe that Gradle runs tests in batches. As a worker becomes available, it takes a test class from the queue of remaining tests. You would really have to look at how JUnit works, as I'm sure Gradle simply passes these configurations on to JUnit.

Related

Robot - how to check if the execution of a jar is finished

I would like to run a jar file from a Robot Framework test suite, because this test case is to be included among other test cases.
When the jar is run from the command line it produces the expected output, which can take 10 minutes, and then it ends.
My issue is that I'm not able to detect when the execution of the jar has finished. I tried several keyword combinations; in the last attempt I used the keyword Process Should Be Stopped, as shown below, and the result is that the process is always reported as running.
One of my doubts is: which process is actually running, java itself or the execution of the jar?
*** Settings ***
Library             Process
Library             OperatingSystem
Suite Setup         log    running on ${properties.hostname}
Suite Teardown      Terminate All Processes    kill=True
Variables           C:/Users/theUser/Desktop/CheckOutRegression/Regression/RegressionScripts/config/properties.py

*** Test Cases ***
Check jar execution
    ${data}=      Start Process    java    -jar    da-1.0-SNAPSHOT.jar    importFile1.json    importFile2.zip    cwd=${properties.pathToScripts}    alias=myProc
    ${wait}=      Wait Until Keyword Succeeds    10x    60s    Process Should Be Stopped    myProc
    Log           ${wait}
    ${result}=    Get Process Result    myProc
    Log           ${data.stdout}
Do you know how I can check that the execution of the jar file is finished?

Why are Selenium test cases randomly skipped while running testng.xml in Windows cmd

I have tried to run the Selenium automation test cases using a testng.xml file with multiple cases, but some cases got skipped. TestNG is also not showing a skipped test case count, only the total count and the failure count. I can't find the reason or resolve this issue.
So I need a solution to run the test cases using testng.xml in Windows cmd without any being skipped, using the following command:
java org.testng.TestNG %project_location%\sample_testng.xml
Could anyone please help me with this issue?
Your tests weren't skipped. They were successful.
Tests run: 19 = Total test count.
Failures: 15 = Failed tests.
Skips: 0 = Skipped tests.
Total test count - Failed tests - Skipped tests = Successful tests.
19 - 15 - 0 = 4 Successful tests

Gradle Task Not Running After Test

I have integration tests setup in my build.gradle file as such:
task integrationSetup(dependsOn: jar, type: Exec) {
    workingDir "$projectDir/resources/integration"
    commandLine 'sh', './start_service.sh'
}

task testIntegration(dependsOn: integrationSetup, type: Test) {
    testClassesDirs = sourceSets.testIntegration.output.classesDirs
    classpath = sourceSets.testIntegration.runtimeClasspath
    ignoreFailures = true
}

task integrationTearDown(dependsOn: testIntegration, type: Exec) {
    workingDir "$projectDir/resources/integration"
    commandLine 'sh', './stop_service.sh'
}

testIntegration.mustRunAfter integrationSetup
testIntegration.finalizedBy integrationTearDown
integrationTearDown.mustRunAfter testIntegration
However, since upgrading the Gradle wrapper to version 4+, the tasks no longer execute correctly. The final tear-down never runs and the service keeps running. What has changed between version 3 and 4 to alter this behaviour? It's pretty upsetting that Gradle did this without warning or deprecation notices.
One dumb option is to downgrade the Gradle wrapper version (I can confirm this setup still works on 3.1), but that shouldn't be necessary IMO.
UPDATE: Made some changes per user Opal. However, I still have the issue that if any errors occur during the integration tests, the final tear-down does not run.
> Task :compileTestIntegrationJava
Putting task artifact state for task ':compileTestIntegrationJava' into context took 0.0 secs.
file or directory '/home/project/cleaner/src/testIntegration/java', not found
file or directory '/home/project/cleaner/src/testIntegration/java', not found
Executing task ':compileTestIntegrationJava' (up-to-date check took 0.072 secs) due to:
Output property 'destinationDir' file /home/project/cleaner/build/classes/java/testIntegration has changed.
Output property 'destinationDir' file /home/project/cleaner/build/classes/java/testIntegration/com has been removed.
Output property 'destinationDir' file /home/project/cleaner/build/classes/java/testIntegration/com/project has been removed.
All input files are considered out-of-date for incremental task ':compileTestIntegrationJava'.
file or directory '/home/project/cleaner/src/testIntegration/java', not found
Compiling with JDK Java compiler API.
/home/project/cleaner/src/integration/java/com/project/cleaner/CleansRequestsTests.java:415: error: reached end of file while parsing
}
^
1 error
:compileTestIntegrationJava (Thread[Daemon worker Thread 8,5,main]) completed. Took 0.162 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':compileTestIntegrationJava'.
> Compilation failed; see the compiler error output for details.
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output.
BUILD FAILED in 8s
8 actionable tasks: 8 executed
Stopped 0 worker daemon(s).
In the discussion it turned out that the OP wants to stop the service that is started before the tests run no matter what happens, e.g. even on compilation errors. It can be done with the following script:
ext.integrationTearDown = {
    workingDir "$projectDir/resources/integration"
    commandLine 'sh', './stop_service.sh'
}

task(type: Exec, 'stop_service', integrationTearDown)

gradle.buildFinished {
    exec integrationTearDown
}

testIntegration.dependsOn integrationSetup
testIntegration.finalizedBy stop_service
With this piece of code the service will be stopped after every build, even if it succeeds. To avoid this behaviour, the BuildResult that is passed to buildFinished may be used to determine the required behaviour.
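For example, a sketch of that idea, reusing the stop script from above and assuming the service should only be torn down when the build fails (adjust the condition to whatever behaviour is actually required):

gradle.buildFinished { result ->
    // result is the BuildResult; result.failure is null when the build succeeded
    if (result.failure != null) {
        exec {
            workingDir "$projectDir/resources/integration"
            commandLine 'sh', './stop_service.sh'
        }
    }
}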

Changes in JAR task from Gradle 1.7 to 4.1

I have a project which is currently built using Gradle version 1.7 and I'm trying to move to version 4.1, as builds are much faster and dependencies can be downloaded in parallel. However, I'm seeing some weird behaviour that I don't quite understand. I have a build.gradle file for a couple of subprojects that overrides the main classes task of the Java plugin. In it, an Ant task is run that generates classes in the build directory.
task classes(overwrite: true) {
    inputs.dir project.ext.inputsPath
    outputs.dir "${project.buildDir}/classes/main"
    doLast {
        ant.taskdef(name: 'xmlbean', classname: 'org.apache.xmlbeans.impl.tool.XMLBean', classpath: configurations.compile.asPath)
        ant.xmlbean(srcgendir: "${project.buildDir}/generated-sources/xmlbeans",
                classgendir: "${project.buildDir}/classes/main",
                javasource: '1.5',
                failonerror: true,
                includeAntRuntime: false,
                classpath: project.configurations.compile.asPath) {
            fileset(dir: schemaPath, includes: project.ext.has('inclusionPattern') ? project.ext.inclusionPattern : '*.xsd')
        }
    }
}
This all works as expected and I get classes generated in { project_dir }/build/classes/main.
This is the output I get from the console:
> Task :my-task:classes
Putting task artifact state for task ':my-task:classes' into context took 0.0 secs.
Executing task ':my-task:classes' (up-to-date check took 0.002 secs) due to:
[ant:xmlbean] Time to build schema type system: 0.616 seconds
[ant:xmlbean] Time to generate code: 1.512 seconds
[ant:xmlbean] Compiling 226 source files to E:\Development\my-task\build\classes\main
[ant:xmlbean] 4 warnings
> Task :my-task:classes
[ant:xmlbean] Time to compile code: 6.263 seconds
:my-task:classes (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 8.899 secs.
:my-task:jar (Thread[Task worker for ':' Thread 3,5,main]) started.
> Task :my-task:jar
Putting task artifact state for task ':my-task:jar' into context took 0.0 secs.
Executing task ':my-task:jar' (up-to-date check took 0.004 secs) due to:
Output property 'archivePath' file E:\Development\my-task\build\libs\my-task.jar has changed.
:my-task:jar (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.012 secs.
:my-task:install (Thread[Task worker for ':' Thread 3,5,main]) started.
> Task :my-task:install
The classes task seems to run twice (not sure if this makes any difference), with the part that runs Ant running first. As mentioned earlier, I do get classes generated by the Ant task.
My problem is that the behaviour between Gradle 1.7 and 4.1 seems to have changed (as you would expect it to): for some reason, when the jar task runs, my classes inside build/classes/main are not archived into the jar, and I just get a blank manifest file. How do I get the classes generated by the Ant task into the jar using the default jar task?
Why are you overriding the classes task? The normal approach would be to create an additional task which writes to a new directory and wire it into the DAG, e.g.:
apply plugin: 'java'

task xmlBeanClasses {
    def inclusionPattern = project.ext.has('inclusionPattern') ? project.ext.inclusionPattern : '*.xsd'
    inputs.property 'inclusionPattern', inclusionPattern
    inputs.dir project.ext.inputsPath
    inputs.dir schemaPath
    outputs.dir "$buildDir/classes/xmlbeans"
    outputs.dir "$buildDir/generated-sources/xmlbeans"
    doLast {
        // TODO: generate classes in $buildDir/classes/xmlbeans
    }
}

// wire the new task into the DAG
classes.dependsOn xmlBeanClasses

// add the generated classes dir to the "main" classesDirs
// (this dir will now be included in the jar etc)
sourceSets.main.output.classesDirs.add files("$buildDir/classes/xmlbeans")
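If mutating classesDirs directly does not work on a given Gradle version, a possible alternative (just a sketch; the exact wiring for this build is an assumption) is to register the generated directory as an output of the main source set via SourceSetOutput.dir:

// register the generated classes directory as an output of the main source set,
// built by xmlBeanClasses, so tasks consuming sourceSets.main.output (such as jar) pick it up
sourceSets.main.output.dir "$buildDir/classes/xmlbeans", builtBy: xmlBeanClasses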

Jenkins always shows success - TestNG, Selenium

I'm new to Jenkins and I have a problem with builds. I'm writing UI tests with Selenium, Java and TestNG.
My problem is that Jenkins always shows Finished: SUCCESS even if some tests fail.
===============================================
TestAll
Total tests run: 10, Failures: 1, Skips: 0
===============================================
[SSH] exit-status: 0
TestNG Reports Processing: START
Looking for TestNG results report in workspace using pattern: **/testng-results.xml
Did not find any matching files.
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
No emails were triggered.
Finished: SUCCESS
How can I resolve my problem?
I assume you are building a Maven Project.
To stop a build on test failure, go to the configure page of your project, then go to the build section, and in the "goals & options" line add:
-Dmaven.test.failure.ignore=false
This should stop the build if test failures are found.
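For example, if the existing goals are clean test, the goals line would become something like the following (the goals themselves are just an illustration; the -D flag is the relevant part):

clean test -Dmaven.test.failure.ignore=false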
