Running a Geb test from a Java class

I've recently stumbled across Geb and it looks like a good way to perform integration tests on our web applications. Our platforms are all Java based, and from reading that
"Geb provides first class support for functional web testing via
integration with popular testing frameworks such as ...JUnit,
TestNG..."
I assumed it would be easy to execute a test from a Java class (TestNG test?).
I'm new to Groovy and Geb.
So far I have included geb-testng and groovy in my pom:
<dependency>
    <groupId>org.codehaus.geb</groupId>
    <artifactId>geb-testng</artifactId>
    <version>0.7.0</version>
</dependency>
<dependency>
    <groupId>org.codehaus.groovy</groupId>
    <artifactId>groovy</artifactId>
    <version>1.8.6</version>
</dependency>
... However, I can't find any examples of creating a test and running it from a Java class.
Help appreciated.

Geb is designed for, and can only be used from, Groovy code. This is mainly due to the dynamic nature of its APIs. What you can choose is which test framework to use (JUnit, TestNG, Spock, etc.). As Geb itself is just a library, it can also be used without a test framework, for example to automate an interaction with a website.
If you need to stick to Java, you'll have to use something like Selenium 2 (WebDriver), which is what Geb uses under the covers.
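For illustration, here is a minimal sketch of what that could look like: a plain WebDriver test driven from Java by TestNG. The URL and element names are placeholders, not taken from the question.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class LoginIT {

    private WebDriver driver;

    @BeforeClass
    public void openBrowser() {
        driver = new FirefoxDriver();   // any WebDriver implementation works
    }

    @Test
    public void userCanReachLoginPage() {
        driver.get("http://localhost:8080/login");   // placeholder URL
        driver.findElement(By.name("username")).sendKeys("demo");
        driver.findElement(By.name("password")).sendKeys("secret");
        driver.findElement(By.id("submit")).click();
        Assert.assertTrue(driver.getTitle().contains("Welcome"));
    }

    @AfterClass
    public void closeBrowser() {
        driver.quit();
    }
}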

I stumbled upon this question while searching for a Geb-TestNG example. Here is what is working for me:
import geb.testng.GebTest
import org.testng.annotations.Test

class GroovyYourTestClass extends GebTest {

    @Test
    void "should test something"() {
        to YourPageObject
        // ...
    }
}

The best alternative to Geb if you're using Java is IMHO Selenide (http://selenide.org)
Example from the Quick Start Guide:

import org.openqa.selenium.By;
import static com.codeborne.selenide.Selenide.*;
import static com.codeborne.selenide.Condition.*;

@Test
public void userCanLoginByUsername() {
    open("/login");
    $(By.name("user.name")).setValue("johny");
    $("#submit").click();
    $(".loading_progress").should(disappear); // Waits until element disappears
    $("#username").shouldHave(text("Hello, Johny!")); // Waits until element gets text
}
It's independent of your testing framework and can therefore be used with e.g. JUnit, TestNG, Cucumber, ScalaTest or JBehave.

Integrating Cucumber into a test framework?

I have written a test framework (that supports Selenium).
The way it works is that when I create a test, I only have to extend the base class:
class NewTest extends BaseTest {

    @Test
    public void ABC() {
        this.sel.driver. ....
    }
}
My question is how to integrate Cucumber into my framework, where the framework is the provider of the Selenium driver, db connections, screenshots, etc.
The idea is to do it in a way that does not interfere with Cucumber's normal flow of programming.
Also, I'm using Maven and JUnit.
@Test is for JUnit, not for Cucumber.
Cucumber uses the Gherkin language.
You can check the setup and usage of Cucumber in the tutorial below:
https://www.toolsqa.com/cucumber-tutorial/
My question is how to integrate Cucumber into my framework, where the
framework is the provider of the Selenium driver, db
connections, screenshots, etc.
Just create a package, add your classes (db, Selenium, etc.) to it, and call them from your glue code just as you would in a normal framework; see the sketch below.
Note that Cucumber just provides a flow to run the project from feature files and hooks, so learn Cucumber, do a small proof of concept, and you will understand where you need to call what.
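As an illustration only, here is a rough sketch of what that glue code could look like: Cucumber hooks and step definitions that delegate to the existing framework. The class names (BaseFramework, LoginSteps), the URL and the step wording are hypothetical, and the io.cucumber.java packages assume a recent cucumber-java dependency.

// Hypothetical glue code; BaseFramework stands in for whatever class in your
// framework exposes the Selenium driver, db connections, screenshots, etc.
import io.cucumber.java.After;
import io.cucumber.java.Before;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import org.openqa.selenium.WebDriver;

public class LoginSteps {

    private BaseFramework framework;   // hypothetical framework entry point
    private WebDriver driver;

    @Before
    public void setUp() {
        framework = new BaseFramework();   // the framework still owns driver, db, screenshots
        driver = framework.getDriver();
    }

    @Given("the user opens the login page")
    public void openLoginPage() {
        driver.get("http://localhost:8080/login");   // placeholder URL
    }

    @Then("a screenshot is taken")
    public void takeScreenshot() {
        framework.takeScreenshot("login-page");      // delegate to the framework as before
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}

The feature files and the JUnit runner stay plain Cucumber; only the hooks and step definitions know about your framework.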

Is there an equivalent for DropwizardTestSupport in Micronaut?

I'd like to run my application with JUnit for the integration tests.
For Dropwizard, I've been using DropwizardTestSupport to achieve that.
I'm wondering if there's an equivalent in Micronaut.
Thanks
I cannot say that it is the equivalent of DropwizardTestSupport because I have not used that, but we do have a Micronaut Test library, which is described at https://micronaut-projects.github.io/micronaut-test/latest/guide/index.html. In short, you can annotate your test with @MicronautTest, and that causes useful things to happen, including starting up the app and subjecting your test class to dependency injection.
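As a hedged sketch (not from the original answer), such a test with JUnit 5 might look roughly like this; the exact package of @MicronautTest and the injection annotation (javax vs. jakarta) depend on the micronaut-test and Micronaut versions, and injecting EmbeddedServer assumes the app has an HTTP server module on the classpath:

// Minimal sketch; package names vary by version
// (older releases use io.micronaut.test.annotation.MicronautTest and javax.inject.Inject).
import io.micronaut.runtime.server.EmbeddedServer;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;

@MicronautTest
class ApplicationIT {

    @Inject
    EmbeddedServer server;   // the test class is treated as a bean, so injection works

    @Test
    void applicationStartsForTheTest() {
        Assertions.assertTrue(server.isRunning());
    }
}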
I hope that helps.

Unit test coverage using JaCoCo for test classes written using PowerMock

I am trying to get the code coverage report on the SonarQube dashboard in Jenkins. The code coverage report is coming up but showing only 4.6% coverage. On investigating I found out that the test classes written using PowerMock are getting skipped.
On further investigation I found that "JaCoCo doesn't play well with dynamically modified/created classes (this is the way how powermock works). This is a known limitation we can't currently do anything about".
Is there any workaround for this so that I can get proper code coverage for test classes written using PowerMock too?
Simple answer: no, there isn't.
Long answer: it boils down to these options:
Have a look at this Wiki page by the PowerMock team - maybe "offline instrumentation" works out for you.
Hope that the corresponding bug gets fixed at some point (I wouldn't hold my breath on that).
Get rid of your dependency on PowerMock(ito) - by refactoring and improving your production code.
[ I think I evaluated various coverage tools a long time ago, and there was one commercial one that claims to work even with PowerMock. But I don't recall any specifics. So I am basically saying: there might be a minuscule chance that another, proprietary coverage tool works with PowerMock. ]
I have managed to generate PowerMock coverage with JaCoCo, using powermock-module-javaagent.
Just make sure you put the PowerMock agent after the JaCoCo agent:
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
    <useSystemClassLoader>true</useSystemClassLoader>
    <argLine>${jacocoArgLine} -javaagent:${settings.localRepository}/org/powermock/powermock-module-javaagent/${powermock.version}/powermock-module-javaagent-${powermock.version}.jar -noverify</argLine>
    ...
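For context, ${jacocoArgLine} is assumed here to be a property populated by the jacoco-maven-plugin's prepare-agent goal; a sketch of how that could be configured (the version number is only an example) is:

<plugin>
    <groupId>org.jacoco</groupId>
    <artifactId>jacoco-maven-plugin</artifactId>
    <version>0.8.5</version>
    <executions>
        <execution>
            <goals>
                <goal>prepare-agent</goal>
            </goals>
            <configuration>
                <!-- expose the agent argument under the property name used in the surefire argLine above -->
                <propertyName>jacocoArgLine</propertyName>
            </configuration>
        </execution>
    </executions>
</plugin>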
If you want to see an example, take a look at this project: https://github.com/jfcorugedo/sonar-scanner
Here you can see that Sonar takes into account static methods and new statements mocked by PowerMock.
If you want to mock new statements, make sure you use PowerMockRule instead of PowerMockRunner.
Take a look at this test.
What works for me is to remove this:
@RunWith(PowerMockRunner.class)
and add this in the class:
@Rule
public PowerMockRule rule = new PowerMockRule();
You also need to add the dependency for the PowerMock JUnit 4 rule:
<dependency>
    <groupId>org.powermock</groupId>
    <artifactId>powermock-module-junit4-rule</artifactId>
    <version>2.0.2</version>
    <scope>test</scope>
</dependency>
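Put together, a rule-based test class might look roughly like this sketch (StaticUtils is a hypothetical class under test with a static method, not something from the question; the PowerMock javaagent still has to be on the surefire argLine as shown above):

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.when;

import org.junit.Rule;
import org.junit.Test;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.rule.PowerMockRule;

@PrepareForTest(StaticUtils.class)   // hypothetical class whose static method is mocked
public class StaticUtilsTest {

    @Rule
    public PowerMockRule rule = new PowerMockRule();   // replaces @RunWith(PowerMockRunner.class)

    @Test
    public void mocksStaticMethod() {
        PowerMockito.mockStatic(StaticUtils.class);
        when(StaticUtils.currentUser()).thenReturn("alice");

        assertEquals("alice", StaticUtils.currentUser());
    }
}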
This official page will help you understand it in more detail: PowerMock.

JUnit test report enrichment with JavaDoc

For a customer we need to generate detailed test reports for integration tests, which not only show that everything is green but also what the test did. My colleagues and I are lazy guys and we do not want to hack spreadsheets or text documents.
For that, I am thinking about a way to document the more complex integration tests with JavaDoc comments on each @Test-annotated method and each test class. For the test guys it is a good help to see which requirement, Jira ticket or whatever the test is linked to, and what the test actually tries to do. We want to provide this information to our customer, too.
The big question now is: how can we put the JavaDoc for each method and each test class into the JUnit reports? We use JUnit 4.9 and Maven.
I know that there is a description for each assertXXX(), but we would really need a nice HTML list as the result, or a PDF document which lists all classes and their documentation and, below that, all @Test methods with their description, the testing time, the result and, if failed, the reason why.
Or is there another alternative to generate fancy test reports? (Or should we start an open source project on this!? ;-) )
Update:
I asked another question about how to add a RunListener to Eclipse so that it also reports there when tests are started from Eclipse. The proposed solution with a custom TestRunner is another possibility for reporting the test results. Have a look: How can I use a JUnit RunListener in Eclipse?
One way to achieve this would be to use a custom RunListener, with the caveat that it would be easier to use an annotation rather than JavaDoc. You would need to have a custom annotation such as:
@TestDoc(text="tests for XXX-342, fixes customer issue blahblah")
@Test
public void testForReallyBigThings() {
    // stuff
}
RunListener listens to test events, such as test start, test end, test failure, test success etc.:
public class RunListener {
    public void testRunStarted(Description description) throws Exception {}
    public void testRunFinished(Result result) throws Exception {}
    public void testStarted(Description description) throws Exception {}
    public void testFinished(Description description) throws Exception {}
    public void testFailure(Failure failure) throws Exception {}
    public void testAssumptionFailure(Failure failure) {}
    public void testIgnored(Description description) throws Exception {}
}
Description contains the list of annotations applied to the test method, so using the example above you can get the TestDoc annotation using:
description.getAnnotation(TestDoc.class);
and extract the text as normal.
You can then use the RunListener to generate the files you want, with the text specific to this test, whether the test passed, failed or was ignored, the time taken, etc. This would be your custom report.
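As a rough sketch (the report format and output path are arbitrary choices, not part of the original answer), the annotation and the listener could fit together like this:

import java.io.FileWriter;
import java.io.IOException;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.junit.runner.Description;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

// The custom annotation: RUNTIME retention so the listener can read it via reflection
// (it would normally live in its own file).
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface TestDoc {
    String text();
}

// Listener that appends one line per test to a simple text report.
public class MyResultListener extends RunListener {

    private final FileWriter out;

    public MyResultListener() throws IOException {
        out = new FileWriter("target/test-doc-report.txt", true);   // arbitrary location
    }

    @Override
    public void testStarted(Description description) throws Exception {
        TestDoc doc = description.getAnnotation(TestDoc.class);
        String text = (doc != null) ? doc.text() : "(no documentation)";
        out.write(description.getClassName() + "#" + description.getMethodName()
                + " - " + text + System.lineSeparator());
        out.flush();
    }

    @Override
    public void testFailure(Failure failure) throws Exception {
        out.write("    FAILED: " + failure.getMessage() + System.lineSeparator());
        out.flush();
    }
}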
Then, in surefire, you can specify a custom listener, using:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.10</version>
    <configuration>
        <properties>
            <property>
                <name>listener</name>
                <value>com.mycompany.MyResultListener,com.mycompany.MyResultListener2</value>
            </property>
        </properties>
    </configuration>
</plugin>
This is from the Maven Surefire Plugin documentation: Using JUnit, Using custom listeners and reporters.
This solution has the disadvantage that you don't have the flexibility of JavaDoc as far as carriage returns and formatting are concerned, but it does have the advantage that the documentation is in one specific place, the TestDoc annotation.
Have you looked at Maven Surefire reports?
You can generate an HTML report from your JUnit tests.
http://maven.apache.org/plugins/maven-surefire-report-plugin/
I'm not sure how customizable it is, though, but it's a good starting point; a minimal configuration sketch follows below.
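For example, an assumed minimal reporting section in the pom (the version number is only illustrative) could look like this:

<reporting>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-report-plugin</artifactId>
            <version>2.10</version>
        </plugin>
    </plugins>
</reporting>

Running mvn site then produces target/site/surefire-report.html.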
I also know that TestNG (an alternative to JUnit) has some report-generating capabilities:
http://testng.org/doc/documentation-main.html#logging-junitreports
I would also recommend log4j:
http://logging.apache.org/log4j/1.2/manual.html
You can use jt-report, an excellent framework for test reporting.
I have created a program using TestNG and iText which outputs the test results in a nice PDF report. You can put a description of your test in the @Test annotation, and that can be included in the PDF report as well. It provides the run times of the tests and of the entire suite. It is currently being used to test web apps with Selenium, but that part could be ignored. It also allows you to run multiple test suites in one run, and if tests fail, it lets you re-run only those tests without re-running the entire suite; those results will be appended to the original results PDF. If you are interested, the source link is below. I wouldn't mind this becoming an open source project, as I have a good start on it, though I'm not sure how to go about doing that. Here's a screenshot.
So I figured out how to create a project on SourceForge. Here's the link: sourceforge link
As mentioned above, Maven is definitely the way to go. It makes life really easy. You can create a Maven project pretty easily using the m2eclipse plugin. Once that is done, just run these commands:
cd <project_dir_where_you_have_pom_file>
mvn site:site
This command will create the stylesheets for you. In the same directory, run:
mvn surefire-report:report
This will run the test cases and convert the output to HTML. You can find the output in target/site/surefire-report.html.
Since I cannot upload an image I can't show you the output here, but all the test cases (written in JUnit) are shown in the HTML, along with other meta information such as the total number of test cases run, how many were successful, the time taken, etc.
You can go a step further and give the exact versions of the plugins to use, like:
mvn org.apache.maven.plugins:maven-site-plugin:3.0:site org.apache.maven.plugins:maven-surefire-report-plugin:2.10:report
Maybe it is worth taking a look at "executable specification" / BDD tools like FIT/FitNesse, Concordion, Cucumber, JBehave, etc.
With this practice you will not only be able to satisfy the customer's requirement formally, you will also be able to bring transparency to a new level.
Briefly speaking, all these tools allow you (or, better, the customer) to define scenarios using natural language or tables, define bindings of the natural-language constructs to real code, and run these scenarios to see whether they succeed or fail. Effectively you will have a "live" spec which shows what is already working as expected and what is not.
See a good discussion of these tools:
What are the differences between BDD frameworks for Java?
