How to get AJAX-added content on a web page using Selenium WebDriver? (Java)

I have written some code to scrape information from a website, using Selenium WebDriver. My problem is this: one of the pages initially displays only ten entries, and I am able to scrape those ten. More entries are displayed only after clicking a "load more" link at the bottom of the page. Whenever I click the load more link, the remaining entries are loaded onto the page without refreshing the whole page (probably an AJAX update), and I am unable to scrape these newly loaded entries. Please help.
This is the code I have written:
WebDriver driver = new HtmlUnitDriver(BrowserVersion.FIREFOX_3_6);
driver.get("some site url");
Thread.sleep(500);
driver.findElement(By.xpath("//*[@id=\"username\"]")).sendKeys("user name");
driver.findElement(By.xpath("//*[@id=\"password\"]")).sendKeys("password");
driver.findElement(By.xpath("//*[@id=\"login\"]//div/button")).click();
Thread.sleep(200);
if (driver.getPageSource().contains("Hi " + un)) {
    driver.get("http://www.somesite/m/searches/Loads/new");
    driver.findElement(By.xpath("//*[@id=\"Criteria_PostingAge\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_PostingAge\"]")).sendKeys("12");
    driver.findElement(By.xpath("//*[@id=\"Criteria_Origin_RawValue\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_Origin_RawValue\"]")).sendKeys(orgn[paraCount]);
    driver.findElement(By.xpath("//*[@id=\"Criteria_Destination_RawValue\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_Destination_RawValue\"]")).sendKeys(destn[paraCount]);
    date = new Date();
    calendar = Calendar.getInstance();
    calendar.setTime(date);
    driver.findElement(By.xpath("//*[@id=\"Criteria_PickupFrom\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_PickupFrom\"]")).sendKeys(pickupDtFmt.format(date));
    calendar.add(Calendar.DAY_OF_MONTH, 1);
    date = calendar.getTime();
    driver.findElement(By.xpath("//*[@id=\"Criteria_PickupTo\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_PickupTo\"]")).sendKeys(pickupDtFmt.format(date));
    driver.findElements(By.xpath("//*[@id=\"search-entry\"]//div/input")).get(11).click();
    Thread.sleep(2000);
    // The code up to here gets me the first ten entries.
    // In order to get more entries I need to click the load more link.
    driver.findElement(By.xpath("//*[@id=\"loadMore\"]")).click();
    Thread.sleep(3000);
    WebElement myDynamicElement = (new WebDriverWait(driver, 30))
            .until(ExpectedConditions.presenceOfElementLocated(By.xpath("//*[@id=\"search-results\"]/div[2]/ul/li[12]")));
}
After clicking the load more anchor link I expected to get ten more entries, i.e. 20 in total. But I am still getting only the same 10 entries that were present when the page first loaded.
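A sketch of one likely fix, assuming the result rows live under the ul from the XPath above: wait until the row count actually grows after clicking load more, instead of sleeping a fixed time (numberOfElementsToBeMoreThan requires a reasonably recent Selenium; on older versions a custom ExpectedCondition does the same thing):

```java
import java.util.List;
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Row locator inferred from the XPath in the question.
By resultRows = By.xpath("//*[@id=\"search-results\"]/div[2]/ul/li");
int before = driver.findElements(resultRows).size();

driver.findElement(By.xpath("//*[@id=\"loadMore\"]")).click();

// Wait until the AJAX update has appended more rows than were present
// before the click, instead of sleeping a fixed amount of time.
new WebDriverWait(driver, 30).until(
        ExpectedConditions.numberOfElementsToBeMoreThan(resultRows, before));

List<WebElement> rows = driver.findElements(resultRows);
```

Also note that HtmlUnitDriver's JavaScript engine is much weaker than a real browser's; make sure JavaScript is enabled on it (there is a constructor flag for this), and if the AJAX request still never fires, a real browser driver is the next thing to try.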

I bet you would get a different result if you switched to using GhostDriver instead. There is no reason to use HtmlUnitDriver when GhostDriver can do the same job and, being backed by a real headless WebKit browser (PhantomJS), handles JavaScript much closer to the way a real browser does.
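For reference, using GhostDriver from Java means instantiating PhantomJSDriver in place of HtmlUnitDriver (a sketch; the phantomjs binary path below is an assumption and must point at your own installation):

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.phantomjs.PhantomJSDriver;
import org.openqa.selenium.phantomjs.PhantomJSDriverService;
import org.openqa.selenium.remote.DesiredCapabilities;

public class GhostDriverExample {
    public static void main(String[] args) {
        DesiredCapabilities caps = new DesiredCapabilities();
        // Hypothetical path -- point this at your own phantomjs binary.
        caps.setCapability(PhantomJSDriverService.PHANTOMJS_EXECUTABLE_PATH_PROPERTY,
                "/usr/local/bin/phantomjs");
        WebDriver driver = new PhantomJSDriver(caps);
        driver.get("some site url");
        // ... same login and scraping logic as above ...
        driver.quit();
    }
}
```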

Related

How to make a WebDriverWait if page changes its content asynchronously (without page reloading)?

I'm writing a test with Selenium WebDriver (Java), loading https://cloud.google.com in the driver.
I start by finding the search input field and calling sendKeys("search phrase \n"). After that the page starts changing its content, and I'm trying to intercept these changes with WebDriverWait:
// First wait: for the page to start changing its content (the search input icon disappears).
new WebDriverWait(driver, 30).until(ExpectedConditions.invisibilityOf(searchInputFieldIcon));
// Second wait: for the new hyperlink to appear (it shows up in the search results
// after the page updates asynchronously, without reloading).
new WebDriverWait(driver, 30)
        .until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//a[@href='https://cloud.google.com/products/calculator']")));
The point is that the wait doesn't wait for 30 seconds for the element to show up; the code just throws an exception immediately:
org.openqa.selenium.NoSuchElementException:
no such element: Unable to locate element: {"method":"xpath","selector":"//a[@href='https://cloud.google.com/products/calculator']"}
Any help will be much appreciated!
Please check the attached screenshot. The href there is different from the one you used in your code.
You can use the code below:
wait.until(ExpectedConditions.visibilityOfElementLocated(By.linkText("Google Cloud Platform Pricing ")));
To locate the first search result you can use the following XPath:
//a[contains(text(),'Google Cloud Platform Pricing')]
Checking your XPath
You can check whether your XPath is correct from the browser itself:
Go to DevTools (Ctrl + Shift + I).
In the 'Elements' tab, press Ctrl + F.
Input the XPath that you want to check.
It will show you whether the XPath is valid and how many web elements it locates.
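As a rough offline check, the same kind of XPath can also be evaluated with the JDK's built-in XPath engine against a well-formed snippet of the markup (a sketch for illustration only; the HTML fragment below is made up, and the browser's Elements-tab search remains the authoritative test, since real pages are rarely valid XML):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class XPathCheck {
    /** Evaluate an XPath against an XML string and return the number of matches. */
    static int countMatches(String xml, String xpath) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        NodeList hits = (NodeList) XPathFactory.newInstance().newXPath()
                .evaluate(xpath, doc, XPathConstants.NODESET);
        return hits.getLength();
    }

    public static void main(String[] args) throws Exception {
        // Made-up fragment standing in for the page's search results.
        String html = "<div><a href='https://cloud.google.com/products/calculator'>"
                + "Google Cloud Platform Pricing</a></div>";
        System.out.println(countMatches(html,
                "//a[contains(text(),'Google Cloud Platform Pricing')]"));
    }
}
```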

How to access newly opened page in same tab using selenium webdriver

I am unable to access the newly opened page after submitting a form in the same tab (not a new tab). I am a newbie with Selenium WebDriver; please help me.
In fact, I accessed the first page and filled in the form successfully, then clicked a submit button and proceeded to the next page, which opened in the same tab. But then I failed to access that newly opened page.
I also used a 10-second explicit wait to allow time for the form submission and the new page to open, but it isn't working. I didn't use getWindowHandles() because I am not comfortable using it. Would getWindowHandles() even work here?
The only way I am able to access the new page is by separately calling navigate().to() after clicking the submit button. But is that a good approach, rather than letting the click itself carry control over to the new page?
I used the following explicit wait and then accessed a button on the newly opened page, but it isn't working:
WebDriverWait WaitVar = new WebDriverWait (driver, 10);
WaitVar.until(ExpectedConditions.visibilityOfElementLocated(By.id("BTNCustomQuestionFinalStep")));
driver.findElement(By.id("BTNCustomQuestionFinalStep")).click();
Without the explicit wait I got the following error:
no such element: Unable to locate element
After adding the explicit wait I got the following error:
Expected condition failed: waiting for visibility of element located by By.id: BTNCustomQuestionFinalStep (tried for 10 second(s) with 500 milliseconds interval)
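getWindowHandles() is only needed for new windows or tabs; a form submitted into the same tab is just a new page, so navigate().to() should not be necessary either. A common pattern (a sketch; the submit-button locator is hypothetical, only BTNCustomQuestionFinalStep comes from the question) is to wait for an element of the old page to go stale before waiting for the new page's element:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Hypothetical locator for the form's submit button.
WebElement submit = driver.findElement(By.id("submitButton"));
submit.click();

WebDriverWait wait = new WebDriverWait(driver, 10);
// The old page has been unloaded once the clicked element becomes stale.
wait.until(ExpectedConditions.stalenessOf(submit));
// Now wait for the element that only exists on the new page.
wait.until(ExpectedConditions.elementToBeClickable(By.id("BTNCustomQuestionFinalStep")))
    .click();
```

If the wait still times out, check whether BTNCustomQuestionFinalStep sits inside an iframe on the new page; if so, driver.switchTo().frame(...) is needed before locating it.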

Load new elements and take them with Selenium and HtmlUnitDriver

I'm trying to crawl a site which has a title and comments. When the page loads, 40 comments are rendered; after clicking the "Load comments" button 40 new comments appear, and so on. I want to load all the comments first, and then collect them all.
The problem is that I'm only getting the first 40. This is my code:
WebDriver driver = new HtmlUnitDriver();
driver.get("http://www.website.com");
String title = driver.findElement(By.className("title")).getText();
while(driver.findElements(By.className("load-comments")).isDisplayed() || !driver.findElement(By.className("expand-loading")).isDisplayed()){
Thread.sleep(500);
if(!driver.findElements(By.className("loading")).isDisplayed()){
driver.findElements(By.className("load-comments")).click();
}
}
List<WebElement> comments = (List<WebElement>) driver.findElements(By.className("comment"));
for(WebElement comm:comments){
System.out.print(comm.getText());
}
So if I need all 150 comments, in this situation I'm only getting the first 40 that are visible when the page loads.
I tried options.addArguments("--headless"); with ChromeDriver(options); but it's very slow.
PS: The load-comments button is hidden once all comments are loaded, and the loading element is displayed while the page is loading new comments.
The website you provided didn't display any comments; please provide the exact web address so the elements can be located.
Why are you using driver.findElements instead of driver.findElement for the isDisplayed() condition? findElements returns a list, so this logic will give you a compilation error.
You need to add an argument to set the screen resolution when running headless. I suggest you get it working in Chrome with a UI first, then check headless.
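Putting those points together, the loop from the question could be reshaped roughly like this (a sketch; the class names come from the question, findElement is used so that isDisplayed() compiles, the explicit window size is the usual workaround for headless Chrome's small default viewport, and the findElement calls assume the elements stay in the DOM, so you may want to guard them with findElements().isEmpty() checks):

```java
import java.util.List;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

ChromeOptions options = new ChromeOptions();
options.addArguments("--headless");
options.addArguments("--window-size=1920,1080"); // headless needs an explicit resolution
WebDriver driver = new ChromeDriver(options);
driver.get("http://www.website.com");

WebDriverWait wait = new WebDriverWait(driver, 10);
// Keep clicking "load comments" while the button is still shown.
while (driver.findElement(By.className("load-comments")).isDisplayed()) {
    if (!driver.findElement(By.className("loading")).isDisplayed()) {
        driver.findElement(By.className("load-comments")).click();
        // Wait for the spinner to disappear instead of sleeping blindly.
        wait.until(ExpectedConditions.invisibilityOfElementLocated(By.className("loading")));
    }
}
List<WebElement> comments = driver.findElements(By.className("comment"));
```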

Selenium script too slow with new FirefoxDriver()

I am doing automation on a particular website (say xyz.com). When I open the URL manually, it lands me on a login page as expected, and I am able to log in there as well.
However, when I automate the scenario by creating a new instance of Firefox using new FirefoxDriver(), the login page opens quickly, but when I click the login button it takes almost 2 minutes to navigate to the homepage.
I tried using a new profile but it didn't help.
I am using Selenium 2.44.0 on a Mac with Java (Eclipse).
Please help.
I had the same problem with Selenium. What I ended up doing was making WebDriver wait until the page title changed (to the homepage title) using ExpectedConditions:
WebDriverWait wait = new WebDriverWait(driver, 15);
wait.until(ExpectedConditions.titleContains(": My Expected Page title"));
I would suggest you have a look here:
driver.wait() throws IllegalMonitorStateException
Wait for page load in Selenium
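The second link boils down to waiting for document.readyState; with Selenium 3+ a lambda can be passed straight to WebDriverWait (a sketch, and note this only covers the initial page load, not AJAX updates that happen afterwards):

```java
import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.support.ui.WebDriverWait;

// Block for up to 120 seconds until the browser reports the document fully loaded.
new WebDriverWait(driver, 120).until(
        d -> ((JavascriptExecutor) d)
                .executeScript("return document.readyState")
                .equals("complete"));
```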

Selenium WebDriver: not able to find the element on the 2nd page

I am using Java, Firefox, and Firebug.
I am not able to locate elements on the second page. For example, if I log in to Gmail, I am not able to locate and click the Sent items link, or any other button, on the next page.
I tried XPath (both absolute and relative), but every time I get an error that the element was not found.
With my code I am able to log in successfully, but as soon as the next page loads I get an "Element not Found" error.
Please suggest a solution.
Unless you tell WebDriver to wait until the element on the 2nd page is loaded, it will simply try to click the element as soon as it is able to run. This is bad because your element might not be loaded yet while WebDriver is already trying to click it... TIMEOUT mayhem ensues...
Try the following: use the WebDriverWait class to make WebDriver wait for the element on the page to be loaded before attempting to click it:
WebDriverWait wait = new WebDriverWait(driver, 100);
WebElement element = wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath("your xpath")));
element.click();
The '100' in WebDriverWait(driver, 100) is the maximum number of seconds you want WebDriver to keep attempting to locate the element before it times out.
I agree with CODEBLACK's answer. You can also go for an implicit wait, which makes Selenium wait implicitly for a given period of time whenever it looks up an element.
Try the following:
driver.manage().timeouts().implicitlyWait(20, TimeUnit.SECONDS);
You can specify the time as per your convenience.
Best of luck!
