Automated Testing using ReadyAPI (SoapUI NG) - java

I'm just getting started with ReadyAPI (SoapUI NG) to do automated testing for our Spring-based project. Using the ReadyAPI documentation I successfully tested REST URI calls with the steps below:
Created a project: File > New Project > Create a Project using REST URI
It creates a project structure like this: REST Project > URI > createAccount > Request 1
In the Request window I added my parameters using XML/JSON.
After running the request I get the desired response.
I also added an assertion, which also gives the desired results.
I did all this using the ReadyAPI documentation.
My questions are below:
How do I achieve automated testing using ReadyAPI (SoapUI NG)?
Which features of ReadyAPI are generally used, and how do I use them effectively?
What are the differences between SoapUI NG, LoadUI NG, SoapUI and ReadyAPI? (I think these are just different versions, but I'm not sure.)

Let me try to answer them to the best of my ability.
How do I achieve automated testing using ReadyAPI (SoapUI NG)?
ReadyAPI and SoapUI are themselves the automation tools. For manually testing APIs there are other tools, such as Swagger. However, if you are planning to automate the flow of your whole project and want some sort of architecture/framework that lets you do a lot more than just execute the tests in one go, you can drive your tests from a CI tool like Jenkins.
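For example, a Jenkins job usually drives a ReadyAPI/SoapUI project through the bundled command-line test runner. Here is a rough sketch of kicking that runner off from Java; the installation path, project file name and report folder are assumptions you would replace with your own:

    // Invokes the ReadyAPI/SoapUI command-line test runner so the project built in
    // the GUI can run unattended from a Jenkins job (or any other CI tool).
    public class ReadyApiCiRunner {

        public static void main(String[] args) throws Exception {
            ProcessBuilder pb = new ProcessBuilder(
                    "/opt/SmartBear/ReadyAPI/bin/testrunner.sh", // path to the bundled CLI (assumed)
                    "-r",                              // print a summary report
                    "-j",                              // export JUnit-style XML reports that Jenkins can read
                    "-ftarget/soapui-reports",         // folder for the report files (assumed)
                    "rest-project-soapui-project.xml"); // your exported project file (assumed name)
            pb.inheritIO();                            // stream the runner's output to the build log
            int exitCode = pb.start().waitFor();
            if (exitCode != 0) {
                throw new IllegalStateException("ReadyAPI test run failed with exit code " + exitCode);
            }
        }
    }

The -j flag produces JUnit-compatible XML, which Jenkins can pick up with its standard test-report publisher.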
Which features of ReadyAPI are generally used, and how do I use them effectively?
One of the most talked-about features of ReadyAPI is data-driven testing. If you follow this approach, you will use test steps such as DataGen, DataSource, DataSink, Property Transfer, Groovy assertions, the Groovy Script step and the JDBC step. Those are the steps used most widely in ReadyAPI projects.
What are the differences between SoapUI NG, LoadUI NG, SoapUI and ReadyAPI? (I think these are just different versions, but I'm not sure.)
ReadyAPI is a collection of different API testing solutions: performance testing (LoadUI), security testing (Secure) and functional testing (SoapUI). When you install ReadyAPI, you install all the solutions together, and you can then choose to buy a licence for each of those solutions separately.
Hope that answers your question.

Related

How to combine WebDriver, Appium and SOAP in one framework

I need to put website testing, mobile testing and web-service testing in a single framework, meaning a single framework will perform all these types of testing.
As the configuration is defined in an environment file (a .properties file), the framework should get ready to perform the relevant testing.
Environment used:
TestNG, Selenium Grid,
Windows, Appium, WebDriver, etc.
Can anyone please provide me a guideline?
Integration of API testing in the framework:
+ Appium utility
+ Selenium utility
- API
  - RestAPIService using HttpClient
  - ResponseParser - ObjectMapper
  - DataSource similar to the UI framework
+ TestClass
  - Here you can call your Appium/Selenium/API methods
For the API integration you can create a RestService class with wrapper methods getRequest(url, header), postRequest(url, header), putRequest(url, header) and deleteRequest(url, header). For response parsing you can use any available JSON parser, and the data can come from the same source you are using in the UI automation. A sketch of such a wrapper is shown below.
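As a rough illustration, here is a minimal sketch of such a wrapper, assuming Apache HttpClient 4.x; the class name and the extra body parameter on the POST method are illustrative, not a fixed contract:

    import java.io.IOException;
    import java.util.Map;

    import org.apache.http.client.methods.HttpDelete;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.client.methods.HttpPost;
    import org.apache.http.client.methods.HttpUriRequest;
    import org.apache.http.entity.ContentType;
    import org.apache.http.entity.StringEntity;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.util.EntityUtils;

    /** Thin wrapper around Apache HttpClient so test classes never touch HTTP plumbing directly. */
    public class RestApiService {

        private final CloseableHttpClient client = HttpClients.createDefault();

        /** Executes a GET and returns the response body as a string. */
        public String getRequest(String url, Map<String, String> headers) throws IOException {
            return execute(new HttpGet(url), headers);
        }

        /** Executes a POST with a JSON body (the body parameter is an addition to the signatures above). */
        public String postRequest(String url, Map<String, String> headers, String jsonBody) throws IOException {
            HttpPost post = new HttpPost(url);
            post.setEntity(new StringEntity(jsonBody, ContentType.APPLICATION_JSON));
            return execute(post, headers);
        }

        /** Executes a DELETE; putRequest would follow the same pattern as postRequest. */
        public String deleteRequest(String url, Map<String, String> headers) throws IOException {
            return execute(new HttpDelete(url), headers);
        }

        private String execute(HttpUriRequest request, Map<String, String> headers) throws IOException {
            headers.forEach(request::setHeader);
            return EntityUtils.toString(client.execute(request).getEntity());
        }
    }

The returned JSON string can then be mapped to POJOs by the ResponseParser (for example with Jackson's ObjectMapper), keeping the test classes free of parsing code.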
You can use all of those libraries and create a wrapper around them.
Create interfaces which use the libraries and implement them in your classes.
Since everything you mentioned is an open-source JAR, it should not be a problem to integrate them.
You can check here! This is an open-source framework and they have integrated Appium, Jersey client and Selenium. If you are facing a specific issue, please reply and I will try to reproduce it from my end :)

How to get code coverage using Postman tests

We have REST services created in RESTEasy and running on a WildFly server. We are running Postman test cases to test the REST URLs.
Is there a way to get code coverage of the services when we execute the Postman test suite?
We use SonarQube to analyse the code coverage.
I think not; a similar question was asked here:
Generate Sonar code coverage report from Postman tests
The original poster commented further down:
In fact, after a bit of googling, as a work-around we could use a remote JaCoCo agent hooked into the Java application server. We'll try to run the JaCoCo Maven goals before and after the test execution in order to generate the JaCoCo coverage report. See: link. I'll update the post if we make some progress.
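To make that work-around a bit more concrete, here is a small sketch (under the assumption that the application-server JVM was started with the JaCoCo agent in output=tcpserver mode on port 6300) that dumps the remote execution data after the Postman run, using JaCoCo's org.jacoco.core tools; the host, port and file paths are assumptions:

    import java.io.File;

    import org.jacoco.core.tools.ExecDumpClient;
    import org.jacoco.core.tools.ExecFileLoader;

    /**
     * Connects to a remote JaCoCo agent running in tcpserver mode and dumps the
     * collected execution data to a local .exec file. The jacoco:report Maven goal
     * can then turn that file into a report that SonarQube imports.
     */
    public class JacocoRemoteDump {

        public static void main(String[] args) throws Exception {
            ExecDumpClient client = new ExecDumpClient();
            client.setDump(true);   // request a dump of the current coverage data
            client.setReset(false); // keep the counters on the server after dumping

            // Host and port must match the -javaagent options of the WildFly JVM.
            ExecFileLoader loader = client.dump("localhost", 6300);
            loader.save(new File("target/jacoco-postman.exec"), false);
        }
    }

Run the Postman/Newman suite first, then this dump, then the report goal, so the .exec file reflects exactly what the Postman tests exercised.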
Also, Newman seems to have a ticket about it:
https://github.com/postmanlabs/newman/issues/408
This might help, though.
Karate is the answer to your problem, provided you are willing to switch to another testing framework.
Here is the link to the demo example which has code coverage working: https://github.com/intuit/karate/tree/master/karate-demo#code-coverage-using-jacoco. Since Karate is a JVM implementation it is straightforward, and I recommend you keep the Karate tests in the same Maven module (or equivalent) as the easiest option. Otherwise it is possible, just harder - you will need to fiddle with a Maven profile, etc., or do some instrumentation-synchronization gymnastics.
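For illustration, with Karate 0.9.x the JUnit 4 runner you would keep in that same Maven module looks roughly like this; the classpath location of the feature files is an assumption:

    import com.intuit.karate.KarateOptions;
    import com.intuit.karate.junit4.Karate;
    import org.junit.runner.RunWith;

    // Placing this runner in src/test/java of the same module as the service code
    // means the surefire + jacoco-maven-plugin combination records coverage while
    // the Karate feature files exercise the API.
    @RunWith(Karate.class)
    @KarateOptions(features = "classpath:demo") // location of your .feature files (assumed)
    public class ApiCoverageRunner {
        // no code needed; the feature files drive the tests
    }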
I guess if you already have a lot of tests in Postman, the advice here may not be practical. But I'm posting this answer for the benefit of others who will come across this question in the future.
If you are lucky, you may be able to quickly port your tests to Karate using the experimental converter built into the UI: https://github.com/intuit/karate/wiki/Karate-UI#postman-import
Perhaps you can contribute to making that feature prod-ready.
There is nothing available that provides code coverage for Postman tests.
In the end we chose REST Assured and started replacing all the Postman tests.
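For anyone weighing that switch, a ported Postman request ends up as an ordinary JUnit test with REST Assured, so it runs inside the same Maven/JaCoCo build as the rest of the code. A minimal sketch (the base URI, endpoint and payload are made up):

    import org.junit.Test;

    import static io.restassured.RestAssured.given;
    import static org.hamcrest.Matchers.equalTo;

    public class AccountApiTest {

        @Test
        public void createAccountReturnsTheNewAccount() {
            given()
                .baseUri("http://localhost:8080")     // assumed local deployment of the service
                .contentType("application/json")
                .body("{\"name\": \"test-account\"}") // illustrative payload
            .when()
                .post("/accounts")                    // hypothetical endpoint
            .then()
                .statusCode(201)
                .body("name", equalTo("test-account"));
        }
    }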

REST API documentation generation from unit tests

I want to automatically document my REST API. I know there are many tools for that, but I want to generate the documentation from my unit tests.
The reason for this is that I want the documentation to mirror what is tested and what is not. Nevertheless, the documentation should be as rich as documentation generated by, say, Swagger.
I have already found two projects with this approach, doctester and testdoc4j. Neither satisfies my needs: the resulting documentation does not aggregate the happy path and the error paths.
What tools do you use, and can you suggest any good ones?
Cheers.
Edit:
There is a difference between documenting the API contract, defined in the interface, and documenting the test scenarios. If my documentation only includes the tested endpoints and paths, I am able to define my interface and hand out only the portions I have tested.
This means I am able to define an interface with, say, ten endpoints. After implementing basic functionality with corresponding tests, I can release this part with documentation. Endpoints that are not yet stable or implemented are not included, which prevents users from using them.
Perhaps you want a BDD framework? E.g.:
Cucumber
FitNesse
JBehave
I recently did some research on the same topic and decided to use the free version of Miredot because it was the only tool that fulfilled my requirements:
Does not need extra annotations; all information is extracted from the JavaDoc
Can handle JAX-RS as well as Spring annotations
Easy Maven integration
Miredot automatically generates HTML-based documentation when you run mvn test
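To illustrate the "no extra annotations" point, a plain JAX-RS resource with ordinary JavaDoc is already enough input for this kind of JavaDoc-driven generator; the resource and DTO below are made-up examples:

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    /**
     * Account lookup operations. The JavaDoc on the class, the method and the
     * parameters is the only documentation input needed; no tool-specific
     * annotations are required.
     */
    @Path("/accounts")
    public class AccountResource {

        /** Minimal DTO used for illustration. */
        public static class Account {
            public long id;
            public String name;
        }

        /**
         * Returns the account with the given id.
         *
         * @param id the technical account id
         * @return the account details
         */
        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_JSON)
        public Account getAccount(@PathParam("id") long id) {
            Account account = new Account(); // placeholder lookup for the sketch
            account.id = id;
            account.name = "example";
            return account;
        }
    }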
Swagger is a beautiful option. It's a project on GitHub, has Maven integration and loads of other options to keep it flexible.
Integration guide: swagger-core wiki
More Information: here
Not sure if you have already found something for this, but Spring RestDocs does exactly what you are asking about here.
https://spring.io/projects/spring-restdocs
I'm curious about other tools you may have run across in other languages, too.
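A minimal sketch of how Spring REST Docs looks with Spring Boot's test support (the snippet name is an assumption); only endpoints that have a passing test produce documentation snippets, which matches the requirement above:

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.autoconfigure.restdocs.AutoConfigureRestDocs;
    import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest;
    import org.springframework.test.context.junit4.SpringRunner;
    import org.springframework.test.web.servlet.MockMvc;

    import static org.springframework.restdocs.mockmvc.MockMvcRestDocumentation.document;
    import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
    import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

    // Each passing test writes AsciiDoc snippets (request/response examples) to
    // target/generated-snippets, so only tested endpoints end up in the docs.
    @RunWith(SpringRunner.class)
    @WebMvcTest
    @AutoConfigureRestDocs
    public class AccountDocumentationTest {

        @Autowired
        private MockMvc mockMvc;

        @Test
        public void documentGetAccount() throws Exception {
            mockMvc.perform(get("/accounts/42"))   // hypothetical endpoint
                   .andExpect(status().isOk())
                   .andDo(document("get-account")); // name of the snippet directory
        }
    }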

Can I generate a regression test suite automatically?

We're looking at an upgrade from Zimbra 6 to Zimbra 7, and we want to avoid regressions in the account-lifecycle software we wrote to integrate with it. Since most things are documented by a WSDL file, we were thinking of just using that to test.
What I'm wondering is: is there a way to use an API definition and two target servers to populate a test suite?
Take a look at SoapUI. It is an open source product that can do some automated testing based on a WSDL.
I am using a combination of the three tools below to solve the kind of problem you have mentioned:
Model-based testing
Define a graphical model and generate MBT-based code using yEd, and use GraphWalker to generate the tests dynamically by walking the model
Spock testing framework + spring-ws
I've used SoapUI extensively, but it is not very flexible for code-generation-related tests. It is, however, superb for creating the tests once, parameterizing them and maintaining them over time by checking the SoapUI project in to version control.

How to develop a Nightly Builder

I was told to create a tool like a nightly builder for a JUnit project. It's a client-server project with an Oracle database. The tests are based on QTP. There is also a test interface written in C#. The tester can click on the interface to choose which tests to run and gets a report from each test. So I have to make this procedure automated. What tools should I use?
Thanks in advance
Best regards
Have you considered using CruiseControl or a similar tool? We use it at my work and it was easy to get up and running with JUnit and/or TestNG. Other tools to consider are Buildbot, Continuum, Hudson, etc. (Go to Google and type "cruisecontrol vs" to see a bunch of other auto-builder tools.) Then see how they handle nightly builds... here's a reference for CruiseControl.
You should use Quartz. In the Quartz scheduling XML file you can configure it to build your project. In your project you should have the JUnit test cases execute whenever a build happens. This way you can achieve a daily build process. If your project already uses the Spring Framework, you can use the Spring job-scheduler helper library too (it's a wrapper around Quartz).
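For reference, the equivalent programmatic setup with the Quartz 2.x API looks roughly like this; the cron expression and the body of the job are assumptions, and in practice the job would shell out to your build script, which runs the JUnit suite:

    import org.quartz.Job;
    import org.quartz.JobBuilder;
    import org.quartz.JobDetail;
    import org.quartz.JobExecutionContext;
    import org.quartz.Scheduler;
    import org.quartz.Trigger;
    import org.quartz.TriggerBuilder;
    import org.quartz.impl.StdSchedulerFactory;

    import static org.quartz.CronScheduleBuilder.cronSchedule;

    // Fires a "nightly build" job at 02:00 every day.
    public class NightlyBuildScheduler {

        public static class BuildJob implements Job {
            @Override
            public void execute(JobExecutionContext context) {
                // Placeholder: invoke your build/test command here, e.g. via ProcessBuilder.
                System.out.println("Kicking off the nightly build...");
            }
        }

        public static void main(String[] args) throws Exception {
            JobDetail job = JobBuilder.newJob(BuildJob.class)
                    .withIdentity("nightlyBuild")
                    .build();

            Trigger trigger = TriggerBuilder.newTrigger()
                    .withIdentity("nightlyTrigger")
                    .withSchedule(cronSchedule("0 0 2 * * ?")) // every day at 02:00 (assumed schedule)
                    .build();

            Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
            scheduler.scheduleJob(job, trigger);
            scheduler.start();
        }
    }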
Ideally you should use Hudson to manage the daily builds, but I am not sure whether your organization uses it or not.
Hope this helps.
