For the example program on the webpage
http://www.qaautomation.net/?p=263
I carry out the following steps:
1. Run the program with the driver.close() line of code commented out.
2. The program opens a Firefox browser, then searches for the term "qa automation" on Google.
3. Once the "Test passed." message has been printed to the screen (in the console), go to the Google search results page.
4. Using the browser menu, go to Tools/Web Developer/Page Source.
5. On the page source page, search for the term "qaautomation.net".
6. Quit the Firefox application.
7. Open Firefox and a browser window manually, i.e. not using the program.
8. Go to google.com and search manually for "qa automation".
9. When the results page has loaded, carry out Steps 4 and 5 above.
The search in Step 5 finds no matches for "qaautomation.net", but the same search in Step 9 does. Why is this? Both page sources appear to derive from the same webpage. Any help on this would be appreciated.
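For what it's worth, roughly the programmatic counterpart of Steps 4 and 5 would be something like this (just a sketch, assuming the example program's driver is still open on the results page):
// check the page source the driver sees for the term, mirroring Steps 4 and 5
String source = driver.getPageSource();
System.out.println("Source contains qaautomation.net: " + source.contains("qaautomation.net"));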
Instead of searching through the HTML source, I would suggest that you just scrape the page. This method will let you target more specifically where you are seeing the text: in a link href vs. text on the page, etc.
The code below demonstrates two different places that you can look for the "qaautomation.net" text. The first is in the heading link href and the second is in the citation link that occurs immediately below the heading href.
WebDriver driver = new FirefoxDriver();
driver.manage().window().maximize();
driver.get("http://www.google.com");
driver.findElement(By.id("lst-ib")).sendKeys("qa automation\n"); // need the \n to simulate hitting ENTER to start the search
WebDriverWait wait = new WebDriverWait(driver, 10);
wait.until(ExpectedConditions.titleContains("Google Search")); // need to wait until the results page loads
List<WebElement> links = driver.findElements(By.cssSelector("h3.r > a"));
// System.out.println("links: " + links.size()); // for debugging
for (WebElement link : links)
{
    // looking for qaautomation.net in the link href
    if (link.getAttribute("href").contains("qaautomation.net"))
    {
        System.out.println("PASS: found in href");
    }
}
List<WebElement> cites = driver.findElements(By.tagName("cite"));
// System.out.println("cites: " + cites.size()); // for debugging
for (WebElement cite : cites)
{
    // looking for qaautomation.net in the citation (text below the heading link)
    if (cite.getText().contains("qaautomation.net"))
    {
        System.out.println("PASS: found in cite");
    }
}
System.out.println("DONE");
Related
I am doing automation using Selenium WebDriver (Java) on a search engine, BookMyCrop (http://www.bookmycrop.com). Here, I searched for a crop but I am not able to click on the desired search result. Please help me with it.
Code :
WebElement search = driver.findElement(By.xpath("//*[@id=\"search_keyword\"]"));
search.sendKeys("59825");
search.sendKeys(Keys.ENTER);
driver.findElement(By.partialLinkText("Cashew")).click();
------My 1st try-------------
//WebElement link = driver.findElement(By.xpath("\"//div[@id = 'Links']/a[3]\""));
//link.click();
------My 2nd try-------------
//List<WebElement> find = driver.findElements(By.xpath("/html/body/section[2]/div[2]/div/div/div/div[1]"));
//find.get(1).click();
You can use the css selector based on class names: ".product-block.inner-product-block" and get the list of all the search results.
Then click on whatever index you want to click.
I am not using an IDE for this but it would look something like this:
driver.findElements(By.cssSelector(".product-block.inner-product-block")).get(0).click();
As said, you can try with the css selector ".product-block.inner-product-block".
Then:
get the List of WebElements
loop over it
inside the loop, get the text of each element (or its innerText attribute)
cross-check whether it is the required one with a simple if condition
if so, click on that web element and break out of the loop
If this locator does not give the required info, try another locator, say $$("a h3") for the veg names.
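A rough sketch of that loop, assuming the ".product-block.inner-product-block" selector above, the driver from the question, and "Cashew" (from the question) as the text to cross-check:
List<WebElement> results = driver.findElements(By.cssSelector(".product-block.inner-product-block"));
for (WebElement result : results)
{
    // get the visible text (the innerText attribute could also be read via getAttribute("innerText"))
    String text = result.getText();
    // cross-check whether this is the required result
    if (text.contains("Cashew"))
    {
        result.click(); // click the matching result
        break;          // and stop looping
    }
}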
The below code worked for me. It navigates to the correct link:
WebDriver driver = new ChromeDriver();
driver.manage().timeouts().setScriptTimeout(20, TimeUnit.SECONDS);
driver.manage().window().maximize();
driver.get("http://www.bookmycrop.com");
WebElement search = driver.findElement(By.xpath("//*[@id=\"search_keyword\"]"));
search.sendKeys("59825");
search.sendKeys(Keys.ENTER);
driver.findElement(By.partialLinkText("Cashew")).click();
I'm coding a test with Selenium WebDriver (Java), opening https://cloud.google.com with the driver.
I start by finding the search input field and calling sendKeys("search phrase \n"). After that the page starts changing its content, and I'm trying to intercept these changes with WebDriverWait:
// first Wait - is to wait while the page starts changing its content by removing the Google search icon
new WebDriverWait(driver, 30).until(ExpectedConditions.invisibilityOf(searchInputFieldIcon));
// second Wait - I'm waiting for a new hyperlink to appear (this hyperlink appears in the search results after the content is reloaded asynchronously, without a page reload)
new WebDriverWait(driver, 30)
        .until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//a[@href='https://cloud.google.com/products/calculator']")));
The point is that the Wait doesn't wait for 30 seconds for the element to show up. The code just throws an exception:
org.openqa.selenium.NoSuchElementException:
no such element: Unable to locate element: {"method":"xpath","selector":"//a[@href='https://cloud.google.com/products/calculator']"}
Any help will be much appreciated!
Please check the attached screenshot. Here the href link is different from the one you have used in your code.
You can use the below code:
wait.until(ExpectedConditions.visibilityOfElementLocated(By.linkText("Google Cloud Platform Pricing ")));
To locate the first search result you can use the following xpath:
//a[contains(text(),'Google Cloud Platform Pricing')]
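For example, a rough sketch of using that xpath with an explicit wait (the 30-second timeout is just carried over from the question):
// wait for the first matching result to become visible, then click it
WebDriverWait wait = new WebDriverWait(driver, 30);
WebElement firstResult = wait.until(ExpectedConditions.visibilityOfElementLocated(
        By.xpath("//a[contains(text(),'Google Cloud Platform Pricing')]")));
firstResult.click();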
Checking your xpath
You can check whether your xpath is correct or not from the browser itself.
Go to DevTools (Ctrl + Shift + I)
In the 'Elements' tab, press Ctrl + F
Input the xpath that you want to check
And it will show you whether it is correct and how many web-elements can be located from it.
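The same kind of check can also be done from the test itself; a small sketch, assuming the driver is already on the results page:
// count how many elements the xpath locates, mirroring the DevTools check
List<WebElement> matches = driver.findElements(By.xpath("//a[contains(text(),'Google Cloud Platform Pricing')]"));
System.out.println("Elements located by the xpath: " + matches.size());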
I'm trying to crawl a site which has a title and comments. When the page is loaded, 40 comments are rendered, but after clicking the "Load comments" button there are 40 new comments, and so on. I want to load all the comments first, and then take them all.
The problem is that I'm getting only the first 40. This is my code:
WebDriver driver = new HtmlUnitDriver();
driver.get("www.website.com");
String title = driver.findElement(By.className("title")).getText();
while (driver.findElements(By.className("load-comments")).isDisplayed() || !driver.findElement(By.className("expand-loading")).isDisplayed()) {
    Thread.sleep(500);
    if (!driver.findElements(By.className("loading")).isDisplayed()) {
        driver.findElements(By.className("load-comments")).click();
    }
}
List<WebElement> comments = (List<WebElement>) driver.findElements(By.className("comment"));
for (WebElement comm : comments) {
    System.out.print(comm.getText());
}
So, if I need all 150 comments, in this situation I'm getting only the first 40 that are visible when the page is loaded.
I tried options.addArguments("--headless"); with ChromeDriver(options); but it's very slow.
PS: The load-comments button is hidden when all comments are loaded and the loading element is displayed if the page is loading new comments.
The website you provided didn't display any comments. Please provide the exact web address to locate the elements.
Why are you using "driver.findElements" instead of "driver.findElement" for the isDisplayed() condition? This logic will give you a compilation error.
You need to add an argument to set the screen resolution while using headless. I suggest you go with Chrome with the UI first, then check headless.
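A minimal sketch of that suggestion (the 1920x1080 resolution is just an assumed example):
// headless Chrome with an explicit window size, since the default headless
// resolution can differ from what you get with the UI
ChromeOptions options = new ChromeOptions();
options.addArguments("--headless");
options.addArguments("--window-size=1920,1080");
WebDriver driver = new ChromeDriver(options);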
Please check out the elements of this website.
It has a form with 2 text inputs and 1 submit button.
I don't know which of those 2 inputs is actually used when the user types in a URL there.
But when I tried this (using FirefoxDriver) to get the element:
WebElement textfieldURL = driver.findElement(By.id("ping-url")); // even ping-box not working
The result is "unable to locate the element".
Then I change my code to this :
driver.switchTo().frame(driver.findElement(By.className("ping-iframe")));
WebElement textfieldURL = driver.findElement(By.id("ping-url")); // even ping-box not working
The result is still "unable to locate the element".
Any clues?
You haven't mentioned the exception which you are facing. As your input tag is present under an iframe, you need to first switch into the frame and then perform actions:
driver.switchTo().frame(driver.findElement(By.className("ping-iframe")));
//or you can use frame index as well
driver.switchTo().frame(0);
Your element is available with the id ping-box. Try the following complete code:
System.setProperty("webdriver.gecko.driver","D:/Application/geckodriver.exe");
WebDriver driver = new FirefoxDriver();
driver.manage().window().maximize();
driver.get("https://www.twingly.com/ping");
driver.manage().timeouts().implicitlyWait(20, TimeUnit.SECONDS);
driver.switchTo().frame(driver.findElement(By.className("ping-iframe")));
driver.findElement(By.id("ping-box")).sendKeys("http://www.google.com");
driver.findElement(By.id("ping-button")).click();
Same is working for me.
My question is simple: using Selenium, how do you keep on clicking links when each hyperlink opens up in a new page, a new window, or in the same web page?
For example I have following links on a webpage:
Log in
Sign up
Forgot Password?
Signup with us
follow this link
Home
Terms
Privacy Policy
Here is the small snippet of code that I have written to click these mentioned links on webpage:
List<WebElement> elements = driver.findElements(By.tagName("a"));
//clicking all links
for (WebElement el : elements){
    System.out.println("Link getting clicked: " + el.getText());
    el.click();
    driver.navigate().back();
}
As you can see I am trying to get links and trying to click them one by one. However, I am getting an error after the first click itself. Console says: "org.openqa.selenium.StaleElementReferenceException: Element is no longer attached to the DOM".
I am pretty sure I know the cause, as the links are getting opened in the same webpage, but I am doing a back navigation, which is not helping me at all.
Any thoughts / suggestions?
This could be due to program execution advancing to the call to "driver.navigate().back();" before the page has loaded.
Try introducing an implicit wait, which tells WebDriver to "poll the DOM for a certain amount of time when trying to find an element or elements if they are not immediately available".
e.g. When you create your web driver try:
WebDriver driver = new FirefoxDriver();
driver.manage().timeouts().implicitlyWait(2, TimeUnit.SECONDS);
You have to re-find the element whenever the page is reloaded before you can interact with it. For your code, please try modifying it to this:
driver.manage().timeouts().implicitlyWait(3000, TimeUnit.MILLISECONDS);
List<WebElement> elements = driver.findElements(By.tagName("a"));
//clicking all links
for (int i = 0; i < elements.size(); i++){
    WebElement el = driver.findElements(By.tagName("a")).get(i);
    System.out.println("Link getting clicked: " + el.getText());
    el.click();
    driver.navigate().back();
}