This question already has answers here:
How to continue execution when Assertion is failed
(5 answers)
Closed 7 years ago.
I'm using Selenium WebDriver + Java + TestNG for automation. In my test methods there is sometimes more than one assertion.
Suppose there are four assertions and the second one fails; the rest of the method is then not executed.
What I want is: even after the second assertion fails, the code after it should still run, and at the end (after the test method has executed) it should report which of the four assertions failed and mark the test as "Fail".
Is there any way to achieve this with Java + TestNG? (And I would like to put this code in some central place, so that I won't have to add it to every test method.)
If no assertion fails, then no worries. It will execute as usual.
Here's something that you could be looking for:
https://rameshbaskar.wordpress.com/2013/09/11/soft-assertions-using-testng/
import org.testng.annotations.Test;
import org.testng.asserts.SoftAssert;

public class MyTest {
    private final SoftAssert softAssert = new SoftAssert();

    @Test
    public void myTest() {
        softAssert.assertTrue(false);  // failure is recorded, execution continues
        softAssert.assertEquals(1, 2); // this check still runs and is recorded too
        softAssert.assertAll();        // marks the test as failed and lists every recorded failure
    }
}
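To put this in one central place, as the question asks, a common pattern is a small base class that every test class extends. This is only a sketch, assuming TestNG's @BeforeMethod/@AfterMethod lifecycle (SoftAssertBaseTest and the field name softly are made-up names); note that a failure thrown from @AfterMethod is reported as a configuration failure, so many teams prefer calling assertAll() as the last line of each test instead.

import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.asserts.SoftAssert;

public abstract class SoftAssertBaseTest {

    protected SoftAssert softly;

    @BeforeMethod
    public void createSoftAssert() {
        softly = new SoftAssert();   // fresh collector for every test method
    }

    @AfterMethod
    public void verifySoftAssert() {
        softly.assertAll();          // reports every recorded failure at once
    }
}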
This question already has answers here:
How to order feature files in Cucumber test suite?
(6 answers)
Closed 3 years ago.
I have 3 separate feature files and one CucumberRunnerClass. The sequence in which they need to execute is listed below:
Feature files: Login.feature, NavigateCusMngt.feature, AddCustomer.feature
However, when executing, it first runs AddCustomer.feature, then Login.feature and finally NavigateCusMngt.feature.
As a result, AddCustomer.feature is skipped, the system logs in, and then NavigateCusMngt.feature gives errors.
@CucumberOptions(
        features = {"src/test/resources/features/Login.feature", "src/test/resources/features/NavigateCusMngt.feature", "src/test/resources/features/AddCustomer.feature"},
        glue = {"phptravelstestcases"},
        tags = {"~@Ignore"},
        format = {
                "pretty",
                "html:target/cucumber-reports/cucumber-pretty/mercury-tours-RegisterUserTest",
                "json:target/cucumber-reports/json-reports/mercury-tours-RegisterUserTest.json",
                "rerun:target/cucumber-reports/rerun-reports/mercury-tours-RegisterUserTest.txt"
        }
)
Please give me a solution.
The feature files are parsed alphabetically. I named mine with a starting letter in the right order, e.g.
A-Login.feature
B-NavigateCusMngt.feature
C-AddCustomer.feature
It's not ideal in the long run, but it is a workable solution.
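For reference, a sketch of the runner after the rename, assuming the same features folder and glue package as the question and the older cucumber.api packages used elsewhere on this page (OrderedFeaturesRunner is a made-up class name). Pointing at the folder is enough, since the prefixed file names now sort in the desired order.

import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

@RunWith(Cucumber.class)
@CucumberOptions(
        // A-Login.feature, B-NavigateCusMngt.feature and C-AddCustomer.feature
        // are picked up from this folder in alphabetical order.
        features = {"src/test/resources/features"},
        glue = {"phptravelstestcases"},
        tags = {"~@Ignore"}
)
public class OrderedFeaturesRunner {
}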
This question already has answers here:
Conditionally ignoring tests in JUnit 4
(5 answers)
Closed 5 years ago.
I have a JUnit integration test that runs on our build machine and calls an external server. We use a gating build, so code doesn't make it into the main branch unless 100% of the tests pass.
Occasionally the external server goes down for a while and the test fails, but with a definitive exception that I can catch. I really don't want the test to fail the build and block code from getting in, but I also would prefer it not be marked as "passed". So I want to mark it as a warning, ignored, or an indeterminate result. This would be ideal:
@Test
public void someTest()
{
    try
    {
        // do whatever
    }
    catch (ServerDownException theException)
    {
        junit.markThisTestAsIgnored(); // <---- something like this
    }
}
Found it:
throw new AssumptionViolatedException("skipping test because xxx is down");
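A minimal sketch of how that fits the question's example, assuming JUnit 4: Assume.assumeNoException (from org.junit.Assume) throws that exception for you, and most runners then report the test as skipped/ignored rather than failed. ServerDownException and callExternalServer are hypothetical stand-ins for the question's own code.

import org.junit.Assume;
import org.junit.Test;

public class ExternalServerIT {

    // Stand-in for the exception type mentioned in the question.
    static class ServerDownException extends RuntimeException {}

    @Test
    public void someTest() {
        try {
            callExternalServer(); // the real test logic against the external server goes here
        } catch (ServerDownException theException) {
            // Converts the caught exception into a failed assumption:
            // the runner reports the test as skipped/ignored instead of failed.
            Assume.assumeNoException(theException);
        }
    }

    // Hypothetical helper standing in for the call to the external server.
    private void callExternalServer() {
    }
}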
One option is to set a category (a "stress" category, for example).
Check out this link:
https://github.com/junit-team/junit4/wiki/categories
If you want, you can use @Ignore:
https://dzone.com/articles/allowing-junit-tests-pass-test
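For what it's worth, a sketch of the categories idea under JUnit 4's org.junit.experimental.categories (all class names here are hypothetical, and each public class would live in its own file): tag the server-dependent test with a marker category and exclude that category from the suite the gating build runs.

// File: ExternalServerTests.java - marker interface used as the category
public interface ExternalServerTests {}

// File: ServerDependentTest.java - the test that needs the external server
import org.junit.Test;
import org.junit.experimental.categories.Category;

public class ServerDependentTest {
    @Category(ExternalServerTests.class)
    @Test
    public void talksToTheExternalServer() {
        // ... the flaky, server-dependent check ...
    }
}

// File: GatingBuildSuite.java - the suite the gating build runs
import org.junit.experimental.categories.Categories;
import org.junit.experimental.categories.Categories.ExcludeCategory;
import org.junit.runner.RunWith;
import org.junit.runners.Suite.SuiteClasses;

@RunWith(Categories.class)
@ExcludeCategory(ExternalServerTests.class)
@SuiteClasses({ServerDependentTest.class})
public class GatingBuildSuite {
}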
After creating my feature file in Eclipse, I run it as a Cucumber feature. I use the step definitions the console gives me to create the first base of the test file:
@Given("^the input is <(\\d+)> <(\\d+)>$")
These should be output by the console; however, currently it is showing the feature without the step definitions.
Feature: this is a test
this test is to test if this test works right
Scenario: test runs # src/test/resources/Test.feature:4
Given: i have a test
When: i run the test
Then: i have a working test
0 Scenarios
0 Steps
0m0,000s
This feature is just to check if Cucumber is working properly.
The runner:
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

@RunWith(Cucumber.class)
@CucumberOptions(
        monochrome = true,
        dryRun = false,
        format = "pretty",
        features = "src/test/resources/"
)
public class RunCukes {
}
What can be the cause of the console not showing all the info?
TL;DR: the console does not show the step regex for missing steps.
EDIT: added the feature file
Feature: this is a test
this test is to test if this test works right
Scenario: test runs
Given: i have a test
When: i run the test
Then: i have a working test
The problem is in the feature file. Using : after Given, When and Then is the problem. I was able to reproduce your issue with your feature file. But when I removed the : and ran the feature file with the same runner options provided above, I got the regex to implement the missing step definitions.
P.S. I am using IntelliJ, but I don't think it makes a difference.
Feature: this is a test
this test is to test if this test works right
Scenario: test runs # src/test/resources/Test.feature:4
Given i have a test
When i run the test
Then i have a working test
Below is what I got:

Testing started at 19:12 ...
Undefined step: Given i have a test
Undefined step: When i run the test
Undefined step: Then i have a working test

1 Scenarios (1 undefined)
3 Steps (3 undefined)
0m0.000s

You can implement missing steps with the snippets below:

@Given("^i have a test$")
public void i_have_a_test() throws Throwable {
    // Write code here that turns the phrase above into concrete actions
    throw new PendingException();
}

@When("^i run the test$")
public void i_run_the_test() throws Throwable {
    // Write code here that turns the phrase above into concrete actions
    throw new PendingException();
}

@Then("^i have a working test$")
public void i_have_a_working_test() throws Throwable {
    // Write code here that turns the phrase above into concrete actions
    throw new PendingException();
}

1 scenario (0 passed)
3 steps (0 passed)

Process finished with exit code 0
It can happen if your .feature file is invalid somehow. I once had it happen just because I had two || together in the Examples table of my Scenario Outline.
This question already has answers here:
Run single test from a JUnit class using command-line
(4 answers)
Closed 9 years ago.
I am trying to run tests from a separate class where information can be compiled and reported. I am having difficulty running individual tests, however.
I tried:
for (int i = 0; i < testRuns; i++) {
    JUnitCore.runClasses(InternetExplorerTestClass.class, MozillaFirefoxTestClass.class, GoogleChromeTestClass.class);
}
but that limits the control I have over the results and reporting the data.
How do I run a single test from a test suite? Thank you in advance.
It almost looks like you are doing something like a Selenium test? If you use Gradle as your build tool, you can easily run one specific test by using the "include" filter option like so. (You could do something similar with Ant, SBT, or Maven as well). Personally, I think using the build tool to pick the tests to run is more elegant than writing code to run certain classes.
tasks.withType(Test) {
    jvmArgs '-Xms128m', '-Xmx1024m', '-XX:MaxPermSize=128m'
    maxParallelForks = 4
    // System properties passed to tests (if not http://localhost:8001/index.html)
    systemProperties['testProtocol'] = 'http'
    systemProperties['testDomain'] = 'djangofan.github.io'
    systemProperties['testPort'] = 80
    systemProperties['testUri'] = '/html-test-site/site'
    systemProperties['hubUrl'] = 'localhost'
    systemProperties['hubPort'] = '4444'
}

task runParallelTestsInFirefox(type: Test) {
    description = 'Runs all JUnit test classes in parallel threads.'
    include '**/TestHandleCache*.class'
    testReportDir = file("${reporting.baseDir}/ParallelTestsFF")
    testResultsDir = file("${buildDir}/test-results/ParallelTestsFF")
    // System properties passed to tests
    systemProperties['browserType'] = 'firefox'
    // initial browser size and position
    systemProperties['windowXPosition'] = '100'
    systemProperties['windowYPosition'] = '40'
    systemProperties['windowWidth'] = '400'
    systemProperties['windowHeight'] = '600'
}
This is taken from an example project I wrote here.
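If you do want to stay in code, as in the question's JUnitCore loop, here is a minimal sketch using JUnit 4's Request API (GoogleChromeTestClass is the class from the question; the method name testHomePage is a hypothetical placeholder). Request.method runs a single test method and hands back a Result you can inspect for your own reporting.

import org.junit.runner.JUnitCore;
import org.junit.runner.Request;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

public class SingleTestRunner {

    public static void main(String[] args) {
        // Request.method limits the run to one test method of one test class.
        Request request = Request.method(GoogleChromeTestClass.class, "testHomePage");
        Result result = new JUnitCore().run(request);

        System.out.println("Ran " + result.getRunCount() + " test(s), "
                + result.getFailureCount() + " failure(s)");
        for (Failure failure : result.getFailures()) {
            System.out.println(failure.getTestHeader() + ": " + failure.getMessage());
        }
    }
}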
Can anyone help me with my problem?
I test my program with Robotium in JUnit.
My problem is:
When I detect that there is a failure in JUnit, how can I use code to detect that there is a failure in the program, so that I can continue the run if no error occurs? E.g. if there is no error, continue testing; otherwise exit.
I suggest using Java's built-in assertions for your test. To create an assertion:
assert someBoolean : message;
For example:
assert (myValue == 3) : "myValue was " + myValue + ", should have been 3";
Assertions are disabled by default when running your program. To run your program with assertions, run it like this:
java -enableassertions MyClass
Then, if your program is running with this runtime option, whenever an assert is reached, the following happens:
If the boolean is true, the program will continue.
If it is false, an AssertionError is thrown with the specified message.
For example:
int myVar = 5;
assert (myVar == 3) : "myVar is " + myVar + " not 3";
results in
Exception in thread "main" java.lang.AssertionError: myVar is 5 not 3
IF assertions are enabled. Remember: all of that only happens when you enable asserts using -enableassertions or -ea. If you don't, the asserts are skipped.
When I detect that there is a failure in JUnit, how can I use code to detect that there is a failure in the program, so that I can continue the run if no error occurs? E.g. if there is no error, continue testing; otherwise exit.
This doesn't make much sense. If you've got a failure in a JUnit test, that means there is a failure in your program. If no failure occurs, the unit testing will proceed to the next test automatically.
But maybe you are asking if you can do this:
// in some unit test
assert(....); // <<--- this test fails:
// Do something so that the unit test will continue to the next assertion ...
assert(....)
The answer is that you can't do that in any useful way:
A test can only signal that it has failed by terminating with an exception, so that is the only kind of failure the unit test framework can report.
You could write a unit test to catch the exception that an assert(...) or fail(...) call throws and continue to the next assertion. But that would destroy all evidence of the previous failure.
So if you want to be able to do the second assertion despite the first one failing, you need to make them separate test cases.
You might also be asking if there is a way to get the JUnit test runner to abort on the first failed unit test. The answer is yes that it is possible, but how you would do it would depend on the test runner you are using.
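To make the "separate test cases" point concrete, here is a minimal sketch assuming JUnit 4 (all names are made up for illustration): each check gets its own @Test method, so one failure is reported without hiding the other.

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class SeparateAssertionsTest {

    @Test
    public void firstValueIsCorrect() {
        // If this fails, secondValueIsCorrect() still runs and is reported separately.
        assertEquals(3, computeFirstValue());
    }

    @Test
    public void secondValueIsCorrect() {
        assertEquals(7, computeSecondValue());
    }

    // Hypothetical helpers standing in for the code under test.
    private int computeFirstValue() { return 3; }

    private int computeSecondValue() { return 7; }
}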
You can make assertions for a condition to be true or false:
assertTrue(soloObject.waitForActivity("Activity Name"));
Instead of waiting for an activity, you can use all the methods provided by Robotium to make assertions, for example isTextFound("text"), isCheckBoxChecked(index), etc.
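For context, a minimal sketch of how such an assertion usually sits in a Robotium test, assuming the common ActivityInstrumentationTestCase2 setup (MainActivity is a made-up activity under test; older Robotium versions import com.jayway.android.robotium.solo.Solo instead of com.robotium.solo.Solo).

import android.test.ActivityInstrumentationTestCase2;
import com.robotium.solo.Solo;

public class MainActivityTest extends ActivityInstrumentationTestCase2<MainActivity> {

    private Solo solo;

    public MainActivityTest() {
        super(MainActivity.class);
    }

    @Override
    protected void setUp() throws Exception {
        super.setUp();
        solo = new Solo(getInstrumentation(), getActivity());
    }

    public void testActivityIsShown() {
        // Fails this test method immediately if the activity never appears;
        // the rest of the test suite still runs.
        assertTrue("Activity was not shown", solo.waitForActivity("Activity Name"));
    }

    @Override
    protected void tearDown() throws Exception {
        solo.finishOpenedActivities();
        super.tearDown();
    }
}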