Capture request & response from Charles Proxy using Selenium WebDriver - Java

I am trying to automate a video-stream player application, something like Netflix.
What I do now:
I have to tune to a couple of programs/channels manually and check the response in Charles Proxy to verify that the stream comes from the right source, and to validate bitrate, response time, manifest information, etc.
What I need to do:
I want to automate the above process. I have searched the web, where people mention getting the responses for certain browsers (Firefox/Chrome). We can get the responses from Firefox using the web console, I guess.
I want to write an automation script that is generic enough that I can run the tests on different web browsers and compare the responses against one source, i.e. Charles Proxy.
Below is what I am thinking of:
Cucumber (my test cases) + Selenium WebDriver (to automate things) + Charles Proxy (to validate responses)
Kindly help me if anyone has done this before. It is sort of interesting and challenging. Your help is highly appreciated.
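Not from the original thread, but one way to wire the pieces together: Charles Proxy listens on localhost:8888 by default, and Selenium can point any browser at an HTTP proxy via its capabilities, so everything the test does shows up in Charles. A minimal sketch, assuming Charles is running locally on its default port and using a hypothetical player URL:

    import org.openqa.selenium.Proxy;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.openqa.selenium.chrome.ChromeOptions;
    import org.openqa.selenium.remote.CapabilityType;

    public class CharlesProxyTest {
        public static void main(String[] args) {
            // Route all browser traffic through Charles (default port 8888).
            Proxy proxy = new Proxy();
            proxy.setHttpProxy("localhost:8888");
            proxy.setSslProxy("localhost:8888");

            ChromeOptions options = new ChromeOptions();
            options.setCapability(CapabilityType.PROXY, proxy);

            WebDriver driver = new ChromeDriver(options);
            driver.get("https://player.example.com"); // hypothetical player URL
            // ... tune to a channel through the page's UI here ...
            driver.quit();
        }
    }

The same Proxy object works with FirefoxOptions, which keeps the script browser-generic. Validating what Charles captured (bitrate, manifest, etc.) is a separate step, e.g. saving the Charles session after the run and asserting on it from the Cucumber step definitions.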

Related

Tracking site XHR with Java

I tried to use HtmlUnit to send POST requests to communicate with a server and hit a small problem: the target .php URL changes from time to time
(www123.example.net -> www345.example.net, etc.). The only way to get the new address is to open the site, check its XHR requests, find the one that goes to www???.example.net, and then use that address to send POSTs.
So the question is: is there a way to track XHR using HtmlUnit or any other Java library?
If you really need help you have to show your problem in more detail: provide some info about the web site you are requesting, show your code, and try to explain what you expect and what goes wrong. Without these details we can only guess.
It looks like you should think of HtmlUnit more as a browser you can control from Java than as a way to do simple HTTP requests. Have a look at the simple samples on the HtmlUnit web site (the one at the bottom is for you).
Try something like this (the same steps a user of an ordinary browser performs; see the sketch after this list):
* open the url/page
* fill the various form fields
* find the submit button and click it
* use the resulting page content
Usually HtmlUnit does all of this in the background for you.
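For the original question about spotting the rotating XHR endpoint: HtmlUnit can also report every request it issues (XHR included) through a WebConnectionWrapper. A sketch combining both ideas, with a hypothetical entry page and form-field names:

    import java.io.IOException;
    import com.gargoylesoftware.htmlunit.WebClient;
    import com.gargoylesoftware.htmlunit.WebRequest;
    import com.gargoylesoftware.htmlunit.WebResponse;
    import com.gargoylesoftware.htmlunit.html.HtmlForm;
    import com.gargoylesoftware.htmlunit.html.HtmlPage;
    import com.gargoylesoftware.htmlunit.html.HtmlSubmitInput;
    import com.gargoylesoftware.htmlunit.util.WebConnectionWrapper;

    public class XhrTracker {
        public static void main(String[] args) throws IOException {
            try (WebClient webClient = new WebClient()) {
                // Log every request HtmlUnit issues, XHR included, so the
                // current www???.example.net host can be picked out.
                new WebConnectionWrapper(webClient) {
                    @Override
                    public WebResponse getResponse(WebRequest request) throws IOException {
                        System.out.println(request.getHttpMethod() + " " + request.getUrl());
                        return super.getResponse(request);
                    }
                };

                // The same steps a user of an ordinary browser performs:
                HtmlPage page = webClient.getPage("https://www.example.net/"); // hypothetical entry page
                HtmlForm form = page.getForms().get(0);
                form.getInputByName("login").setValueAttribute("user"); // hypothetical field name
                HtmlSubmitInput submit = form.getInputByName("submit"); // hypothetical button name
                HtmlPage result = submit.click();
                System.out.println(result.asNormalizedText());
            }
        }
    }

Instantiating the WebConnectionWrapper is enough - its constructor installs it on the WebClient - so every subsequent request, including the XHRs fired by page scripts, passes through getResponse().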

HTTP GET request - what data is actually sent?

I'm currently building a web spider with Java and Apache Commons. I'm crawling basic Google search queries like https://google.com/search?q=word&hl=en
Somehow, after about 60 queries I get blocked; it seems they recognize me as a bot, and I get a 503 Service Unavailable response.
Now the important part:
If I visit the same site with Firefox/Chrome, I get the desired result.
If I make a GET request from my application using the same HTTP headers (user-agent, cookies, cache, etc.), I am still blocked.
HOW does Google know whether I'm connecting via my application or the Chrome browser, when the only information available is the IP and the HTTP headers? (Maybe I'm wrong?)
Are there more parameters that identify my app? Something that Google sees and I don't?
(Maybe important: I'm using Chrome Developer Tools and httpbin.org to compare the headers of the browser and the application.)
Thanks a lot
Since you have not specified how quickly you send the 60 queries, I am assuming a high rate, and this is why Google is blocking you. Several times I have rapidly done Google searches from Chrome; it asks for a captcha after a while and then blocks soon after.
Please see the API on Custom Search and this post about the terms of service: Replacement for Google API
FAQ on blocked searches: Google FAQ
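Rate-limiting alone won't defeat Google's bot detection, and scraping results may violate the terms of service linked above, but for any crawler the first fix is pacing. A sketch with Apache HttpClient 4.x (the ten-second delay is an arbitrary guess, not a documented limit):

    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.util.EntityUtils;

    public class ThrottledCrawler {
        public static void main(String[] args) throws Exception {
            try (CloseableHttpClient client = HttpClients.createDefault()) {
                String[] words = {"foo", "bar", "baz"}; // hypothetical query terms
                for (String word : words) {
                    HttpGet get = new HttpGet("https://google.com/search?q=" + word + "&hl=en");
                    try (CloseableHttpResponse response = client.execute(get)) {
                        System.out.println(word + " -> " + response.getStatusLine().getStatusCode());
                        EntityUtils.consume(response.getEntity()); // release the connection
                    }
                    Thread.sleep(10_000); // pause between queries to keep the request rate low
                }
            }
        }
    }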

USSD app automation testing

I am very new to automation and API testing. My current project is USSD-based, for banking. I currently test manually by giving inputs in a browser simulator and reading the console output (ssh into the server). The basic function of the API is to generate POST requests with all the parameters and send them to the respective bank. I am looking to automate this process and validate the request sent and the response received. I was thinking of using Selenium for the browser automation, but I have no idea how I would extract the response from the terminal to validate it.
Please suggest how I should go about this and whether this is the right process. If there is a better way to handle it, please suggest that too.
If you're testing a REST API, you should use REST Assured for automated testing (and the Postman tool for manual testing). If you need to test end to end, then Selenium is the right tool.
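A minimal REST Assured sketch of what that looks like, hitting the API directly instead of going through the browser simulator; the endpoint, parameter names, and response field are all hypothetical stand-ins:

    import static io.restassured.RestAssured.given;
    import static org.hamcrest.Matchers.equalTo;

    import org.junit.Test;

    public class UssdApiTest {
        @Test
        public void postReachesBankAndSucceeds() {
            given()
                .formParam("msisdn", "254700000000") // hypothetical parameter names
                .formParam("input", "*123#")
            .when()
                .post("https://bank.example.com/ussd") // hypothetical endpoint
            .then()
                .statusCode(200)
                .body("status", equalTo("OK")); // hypothetical JSON response field
        }
    }

This validates the request/response pair without ssh-ing into the server at all; the console output only matters when the API itself gives you nothing to assert on.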

How to write a Java program that loads a URL like a browser and logs the result from an HTML div or from the HTTP request/response?

I am planning to write a Java program where I have the URL of website x, to which I will append the numbers 1 to 100, and I will get a result from the website for each one.
Should I write it using HTTP requests and responses, or would a plain Java program where the URL is just a string do?
If I get the result as rendered in the browser, how do I get the values from a div and write them to a text file? I guess the other option is to get them via the response.
All you need is a programmatic browser which submits the request and gets you the response.
You can study the HTTP request and response objects in the TCP/IP protocol stack and implement your own, but instead of reinventing the wheel, you can use the Apache HttpComponents project, which has all of this implemented:
Apache Http Components
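A sketch of the HttpComponents (HttpClient 4.x) route: loop over the appended numbers, fetch each page, and append the body to a text file. The URL pattern is a hypothetical stand-in, and pulling one div's text out of the body would take an HTML parser such as jsoup on top of this:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;
    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.util.EntityUtils;

    public class NumberedFetcher {
        public static void main(String[] args) throws IOException {
            try (CloseableHttpClient client = HttpClients.createDefault()) {
                for (int i = 1; i <= 100; i++) {
                    HttpGet get = new HttpGet("https://x.example.com/page?id=" + i); // hypothetical URL pattern
                    try (CloseableHttpResponse response = client.execute(get)) {
                        String body = EntityUtils.toString(response.getEntity(), StandardCharsets.UTF_8);
                        // Append the raw response body to a text file.
                        Files.write(Paths.get("results.txt"),
                                (body + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
                    }
                }
            }
        }
    }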
I'm not sure you will be able to control the browser using only Java. Even if you know where the browser's exe file is installed, you will not be able to use its handle to control it (no pointers in Java; different process, different memory area, etc.). Sure, you could write a DLL and then use it with JNI, but the final result would not be multi-platform...
Another possible approach would be to inject some keypresses, but then you would be blind to the browser's response (you would have to do some ugly screen capture).
I don't think it is an easy task, so if I were you I would look on the web for some ready-made DLL or library to control the browser.
I know that Selenium does some kind of browser control (http://docs.seleniumhq.org/); a sketch of that route follows.
My 5 cents in 5 minutes.
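If the values only appear after the page renders, the Selenium route looks roughly like this; the URL pattern and the div id "result" are hypothetical:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class DivScraper {
        public static void main(String[] args) throws IOException {
            WebDriver driver = new ChromeDriver(); // drives a real browser
            try {
                for (int i = 1; i <= 100; i++) {
                    driver.get("https://x.example.com/page?id=" + i); // hypothetical URL pattern
                    // "result" is a hypothetical div id; use the real page's locator.
                    String text = driver.findElement(By.id("result")).getText();
                    Files.write(Paths.get("results.txt"),
                            (text + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                            StandardOpenOption.CREATE, StandardOpenOption.APPEND);
                }
            } finally {
                driver.quit();
            }
        }
    }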

Flex file upload with HTTPS and JAAS?

We're trying to upload a file from a Flex client to a Java EE app:
* in a full HTTPS environment
* Java EE server is JBoss 5
* using BlazeDS 'Custom' authentication (username and password are entered through a Flex form)
* using BlazeDS per-session authentication
With regular AMF calls, we can access the user principal and use the role mechanism.
However, in our upload servlet, we have no access to the user principal:
request.getUserPrincipal() // returns null
How can we fix this?
A while ago a guy commented on a blog post of mine that HTTPS + Flex + Firefox doesn't work:
"Have you tried uploading a file in Firefox via HTTPS? Well, don't bother, it can't be done! Adobe blames it on Firefox and puts their head in the sand. Read the teeth-gnashing and ridiculous claims of Adobe here:
http://bugs.adobe.com/jira/browse/FP-201
Ultimately they threw up their hands and said it couldn't be fixed and, although saying 'We understand that this is a serious issue and are committed to resolving it', suggested that you either:
1) send the file to your server in a different way, or
2) find another form of authentication."
This may no longer be the case - register and see if the linked bug is still unresolved.
Also - this might not be your exact issue (at least not yet) - I'm just giving pointers.
From your post, and since I haven't used BlazeDS, I can't tell whether you're running into this issue specifically, but it sounds to me like you are.
Take a look at your server logs, or try using a web debugger like Fiddler (you can tweak it to reveal HTTPS traffic in clear text), and you'll see that Flash blocks custom HTTP auth headers with FileReference.upload(). Why it does, I've no idea, but there's no workaround I know of, other than crafting something of your own manually.
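One common shape for "crafting something manually" - an assumption on my part, not something from this thread - is to issue a one-time token at login over AMF, append it to the upload URL as a query parameter (which Flash does not strip, unlike auth headers), and check it in the servlet. A minimal server-side sketch, where TokenStore is a hypothetical lookup you would have to implement:

    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class UploadServlet extends HttpServlet {
        @Override
        protected void doPost(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            // FileReference.upload() drops custom auth headers, so authenticate
            // via a token passed as a query parameter instead of the principal.
            String token = request.getParameter("uploadToken"); // hypothetical parameter
            String user = TokenStore.userFor(token);            // hypothetical token-to-user lookup
            if (user == null) {
                response.sendError(HttpServletResponse.SC_UNAUTHORIZED);
                return;
            }
            // ... handle the multipart upload on behalf of 'user' ...
        }
    }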
