How to click links randomly in a webpage using Java and Selenium - java

http://www.toysrus.com/family/index.jsp?categoryId=2535588&sr=1&origkw=watches
The above web page lists many products; we need to click the product links randomly using Java and Selenium. Please help me out!

List<WebElement> links = driver.findElements(By.cssSelector("a.prodtitle"));
links.get(new Random().nextInt(links.size())).click();
Try this

The answer already posted will only click one link at random. If you eventually want to click every link randomly, then you will need to keep some sort of record of what links you have visited already. You will also need to go back after visiting a page.
List<WebElement> links = driver.findElements(By.cssSelector("a.prodtitle"));
List<WebElement> visited = new ArrayList<>();
WebElement random = links.get(new Random().nextInt(links.size()));
if (!visited.contains(random))
{
    random.click();
    visited.add(random);
    driver.navigate().back();
}
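A rough sketch of that idea as a full loop, re-finding the links on every pass because elements located before driver.navigate().back() go stale, and tracking visited links by href rather than by WebElement. The a.prodtitle locator is taken from the answer above; everything else (including the assumption that the product count stays constant) is mine:
// Needed imports: java.util.ArrayList, java.util.List, java.util.Random,
// org.openqa.selenium.By, org.openqa.selenium.WebElement
List<String> visitedHrefs = new ArrayList<>();
Random random = new Random();
// Count the product links once; assumes the list does not change between visits.
int total = driver.findElements(By.cssSelector("a.prodtitle")).size();
while (visitedHrefs.size() < total) {
    // Re-find the links on every pass: elements found before navigate().back() are stale.
    List<WebElement> links = driver.findElements(By.cssSelector("a.prodtitle"));
    WebElement candidate = links.get(random.nextInt(links.size()));
    String href = candidate.getAttribute("href");
    if (!visitedHrefs.contains(href)) {
        candidate.click();
        visitedHrefs.add(href);
        driver.navigate().back();
    }
}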

Related

Selenium webdriver (java) - Click on desired link of some search results

I am doing automation using Selenium WebDriver (Java) on a search engine, BookMyCrop (http://www.bookmycrop.com). Here, I searched for a crop but I am not able to click on the desired search result. Please help me with it.
Code :
WebElement search = driver.findElement(By.xpath("//*[@id=\"search_keyword\"]"));
search.sendKeys("59825");
search.sendKeys(Keys.ENTER);
driver.findElement(By.partialLinkText("Cashew")).click();
------My 1st try-------------
//WebElement link = driver.findElement(By.xpath("//div[@id = 'Links']/a[3]"));
//link.click();
------My 2nd try-------------
//List<WebElement> find = driver.findElements(By.xpath("/html/body/section[2]/div[2]/div/div/div/div[1]"));
//find.get(1).click();
}
}
You can use a CSS selector based on the class names, ".product-block.inner-product-block", and get the list of all the search results.
Then click on whichever index you want.
I am not using an IDE for this but it would look something like this:
driver.findElements(By.cssSelector(".product-block.inner-product-block")).get(0).click();
As said, you can try the CSS selector ".product-block.inner-product-block".
Then:
get the List of WebElements
loop over it
inside the loop, get the text of each element (or its innerText attribute)
cross-check with a simple if condition whether it is the required one
if so, click that web element and break out of the loop
If this locator is not giving the required info, try another locator, say $$("a h3") for the veg names. A rough sketch of this loop follows below.
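A sketch of that loop, assuming the ".product-block.inner-product-block" selector from this thread; matching on the text "Cashew" is an assumption taken from the question:
// Assumes "driver" is an initialized WebDriver.
List<WebElement> results = driver.findElements(By.cssSelector(".product-block.inner-product-block"));
for (WebElement result : results) {
    String text = result.getText();                 // or result.getAttribute("innerText")
    if (text != null && text.contains("Cashew")) {  // cross-check that this is the required result
        result.click();                             // click the matching result
        break;                                      // and stop the loop
    }
}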
The below code worked for me. It navigates to the correct link:
WebDriver driver = new ChromeDriver();
driver.manage().timeouts().setScriptTimeout(20, TimeUnit.SECONDS);
driver.manage().window().maximize();
driver.get("http://www.bookmycrop.com");
WebElement search = driver.findElement(By.xpath("//*[@id=\"search_keyword\"]"));
search.sendKeys("59825");
search.sendKeys(Keys.ENTER);
driver.findElement(By.partialLinkText("Cashew")).click();

I am getting this error: "stale element reference: element is not attached to the page document".

I am trying to add multiple products into the cart:
WebElement ele = driver.findElement(By.xpath("//li[text()='Grocery ']"));
//Creating object of an Actions class
Actions action = new Actions(driver);
//Performing the mouse hover action on the target element.
action.moveToElement(ele).perform();
Thread.sleep(3000);
driver.findElement(By.xpath("//li[@title='staples']")).click();
Thread.sleep(5000);
List<WebElement> products = driver.findElements(By.xpath("//div[@class='product-grid-img']"));
for(int i=0;i<products.size();i++)
{
    products.get(i).click();
    Thread.sleep(3000);
    driver.findElement(By.xpath("(//span[text()='ADD TO CART'])[1]")).click();
    driver.navigate().back();
    driver.navigate().refresh();
}
As we can see, clicking on a product opens a new page, so to get back you navigate to the previous page explicitly and refresh it.
When you leave the original page or refresh it, the elements Selenium collected on that page are no longer valid, i.e. they become stale.
You can read about this issue here or at any other online resource.
To make your code work you will have to locate the product again.
Something like this should work:
List<WebElement> products = driver.findElements(By.xpath("//div[@class='product-grid-img']"));
for(int i=0;i<products.size();i++)
{
    // Re-locate the products after every navigation so the references are not stale.
    products = driver.findElements(By.xpath("//div[@class='product-grid-img']"));
    products.get(i).click();
    Thread.sleep(3000);
    driver.findElement(By.xpath("(//span[text()='ADD TO CART'])[1]")).click();
    driver.navigate().back();
    driver.navigate().refresh();
    Thread.sleep(3000);
}
Every time you navigate anywhere, your browser constructs a map of sorts of the page in memory. This is called the DOM. When you navigate somewhere else, the DOM is replaced with a new DOM for the new page. The browser does not keep a history of the DOMs you have visited, so when you navigate back to somewhere you have already been, the browser has to construct the DOM again.
Let's look at the relevant parts of your code:
List<WebElement> products = driver.findElements(By.xpath("//div[@class='product-grid-img']"));
Find a bunch of elements in the current DOM.
for(int i=0;i<products.size();i++)
{
products.get(i).click();
The first time through, this navigates to another page, so all the elements you just found above are discarded and the DOM is rebuilt.
The second time through the loop, you are trying to use an element that is no longer part of the current DOM, which throws the StaleElementReferenceException.
Thread.sleep(3000);
driver.findElement(By.xpath("(//span[text()='ADD TO CART'])[1]")).click();
driver.navigate().back();
Navigate again somewhere. Another rebuild of the DOM.
driver.navigate().refresh();
Will not help.
}
You will need to restructure your loop to find the element each time. Perhaps something like:
for(int i=0;i<products.size();i++)
{
// You will have to find the correct XPath here! (XPath positions are 1-based, hence i + 1.)
WebElement product = driver.findElement(By.xpath("(//div[@class='product-grid-img'])[" + (i + 1) + "]"));
product.click();
Thread.sleep(3000); // You might want to use a proper wait here.
driver.findElement(By.xpath("(//span[text()='ADD TO CART'])[1]")).click();
driver.navigate().back();
}
StaleElementReferenceExceptions are caused by going to a new page and then accessing old elements that are no longer valid.
You could do what others did and re-grab the elements each time, or, if the products are inside <a> tags, you could collect all the hrefs with .getAttribute("href") and then simply driver.get() each of them, removing the need for driver.navigate().back().
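A hedged sketch of that href approach, reusing the product locator from the question but assuming each product tile contains an <a> element with the product URL (verify the actual markup): collect the URLs first, then driver.get() each one, so nothing ever goes stale and no back-navigation is needed.
// Needed imports: java.util.ArrayList, java.util.List
List<WebElement> productLinks = driver.findElements(By.xpath("//div[@class='product-grid-img']//a"));
List<String> urls = new ArrayList<>();
for (WebElement link : productLinks) {
    urls.add(link.getAttribute("href"));   // collect every product URL up front
}
for (String url : urls) {
    driver.get(url);                       // open the product page directly
    driver.findElement(By.xpath("(//span[text()='ADD TO CART'])[1]")).click();
}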
I just searched for your query in Google to check, with the intention of learning more about what exactly the question and topic are, and under "People also ask" I got this:
(Q) How do you fix "stale element reference: element is not attached to the page document" in Selenium?
Solution: The element in the DOM is not found because your page is not entirely loaded when Selenium is searching for it. To solve that, you can put an explicit wait condition that tells Selenium to wait until the element is available to be clicked on.
https://www.google.com/search?q=his+error+%22stale+element+reference%3A+element+is+not+attached+to+the+page+document%22%2C&sourceid=chrome&ie=UTF-8
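A hedged sketch of that explicit-wait advice, using the 'ADD TO CART' locator from the question; the 10-second timeout is an assumption, and the Duration-based constructor is the Selenium 4 form (Selenium 3 takes a long number of seconds instead):
// Needed imports: java.time.Duration, org.openqa.selenium.support.ui.WebDriverWait,
// org.openqa.selenium.support.ui.ExpectedConditions
WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));
WebElement addToCart = wait.until(
        ExpectedConditions.elementToBeClickable(By.xpath("(//span[text()='ADD TO CART'])[1]")));
addToCart.click();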

Load new elements and take them with Selenium and HtmlUnitDriver

I'm trying to crawl a site which has a title and comments. When the page is loaded, 40 comments are rendered; after clicking the "Load comments" button there are 40 more comments, and so on. I want to load all the comments first and then collect them all.
The problem is that I'm only getting the first 40. This is my code:
WebDriver driver = new HtmlUnitDriver();
driver.get(www.website.com);
String title = driver.findElement(By.className("title")).getText();
while(driver.findElements(By.className("load-comments")).isDisplayed() || !driver.findElement(By.className("expand-loading")).isDisplayed()){
Thread.sleep(500);
if(!driver.findElements(By.className("loading")).isDisplayed()){
driver.findElements(By.className("load-comments")).click();
}
}
List<WebElement> comments = (List<WebElement>) driver.findElements(By.className("comment"));
for(WebElement comm:comments){
System.out.print(comm.getText());
}
So, if I need all 150 comments, in this situation I'm only getting the first 40 that are visible when the page is loaded.
I tried options.addArguments("--headless"); with ChromeDriver(options); but it's very slow.
PS: The load-comments button is hidden when all comments are loaded, and the loading element is displayed while the page is loading new comments.
The website you provided didn't display any comments. Please provide the exact web address so the elements can be located.
Why are you using driver.findElements instead of driver.findElement for the isDisplayed() condition? This logic will give you a compilation error; a corrected sketch of the loop follows below.
You need to add an argument to set the screen resolution when running headless. I suggest you go with Chrome with the UI first, then check headless.
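For what it's worth, here is a rough sketch of the loop using findElement (singular) so isDisplayed() can actually be called, with the class names taken from the question; the explicit wait on the loading spinner and its timeout are assumptions:
// Needed imports: java.time.Duration, org.openqa.selenium.support.ui.WebDriverWait,
// org.openqa.selenium.support.ui.ExpectedConditions
WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));
// Keep clicking "load comments" while the button is still present and visible.
while (!driver.findElements(By.className("load-comments")).isEmpty()
        && driver.findElement(By.className("load-comments")).isDisplayed()) {
    // Wait for the loading indicator to disappear before clicking again.
    wait.until(ExpectedConditions.invisibilityOfElementLocated(By.className("loading")));
    driver.findElement(By.className("load-comments")).click();
}
List<WebElement> comments = driver.findElements(By.className("comment"));
for (WebElement comm : comments) {
    System.out.println(comm.getText());
}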

Selenium Java how to return the driver back to the previous page

I have two links on pageA. When I click the 1st link, it redirects to another page called pageB, does some jobs, and returns back to pageA. From there it should click the 2nd link, but instead it says the page has reloaded and no cache is available.
//List of all tickets
for(WebElement ticket: ticketList){
    List<WebElement> ticketCells = ticket.findElements(By.tagName("td"));
    if(ticketCells.get(4).getText().equalsIgnoreCase("Some Text")){
        ticketCells.get(2).click(); //Redirects to pageB
        .....
        do some job
        .......
        //Finally clicking on the 'SAVE & BACK' button which should return to previous
        //page and pick the 2nd ticket from the list of all tickets (1st for loop)
        driver.findElement(By.id("save&back")).click();
    }
}
Here, though it does go back to the previous page (pageA), it is unable to pick the 2nd element from the for loop for the next operation.
Any thoughts on how to make it work?
I think that what you want is the equivalent of pressing the Back button of your browser, right?
If so, try this:
webDriver.navigate().back();
If you want to navigate back to the previous page in Java Selenium, you can try
webDriver.navigate().back();
In Python Selenium, the equivalent is
driver.back();
And if you want to go back by running JavaScript in the browser (shown here with Python's execute_script), you can try
driver.execute_script("window.history.go(-1)");
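Since this thread is about Java, the JavaScript-history variant above would look roughly like this in Java, using the standard JavascriptExecutor cast; the rest of the setup is assumed to be in place:
// Needed import: org.openqa.selenium.JavascriptExecutor
((JavascriptExecutor) driver).executeScript("window.history.go(-1)");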
@try-catch-finally explained it very clearly. The below code is exactly what you need to handle the error.
for(int i = 0; i < 2; i++){
    // Re-find the tickets after every navigation so the references are not stale.
    List<WebElement> ticketList = driver.findElements(selector);
    WebElement ticket = ticketList.get(i);
    List<WebElement> ticketCells = ticket.findElements(By.tagName("td"));
    if(ticketCells.get(4).getText().equalsIgnoreCase("Some Text")){
        ticketCells.get(2).click(); //Redirects to pageB
        .....
        do some job
        .......
        //Finally clicking on the 'SAVE & BACK' button which should return to previous
        //page and pick the 2nd ticket from the list of all tickets (1st for loop)
        driver.findElement(By.id("save&back")).click();
    }
}

Clicking Web Links in Selenium Webdriver

My question is simple: using Selenium, how do you keep clicking links when each hyperlink opens in a new page, a new window, or the same web page?
For example I have following links on a webpage:
Log in
Sign up
Forgot Password?
Signup with us
follow this link
Home
Terms
Privacy Policy
Here is the small snippet of code that I have written to click these mentioned links on webpage:
List<WebElement> elements = driver.findElements(By.tagName("a"));
//clicking all links
for (WebElement el : elements){
    System.out.println("Link getting clicked: " + el.getText());
    el.click();
    driver.navigate().back();
}
As you can see I am trying to get links and trying to click them one by one. However, I am getting an error after the first click itself. Console says: "org.openqa.selenium.StaleElementReferenceException: Element is no longer attached to the DOM".
I am pretty sure I know the cause, as the links are getting opened on the same webpage, but I am doing a back navigation, which is not helping me at all.
Any thoughts / suggestions?
This could be due to program execution advancing to the call to "driver.navigate().back();" before the page has loaded.
Try introducing an implicit wait, which tells the WebDriver to "poll the DOM for a certain amount of time when trying to find an element or elements if they are not immediately available".
e.g. When you create your web driver try:
WebDriver driver = new FirefoxDriver();
driver.manage().timeouts().implicitlyWait(2, TimeUnit.SECONDS);
You have to re-find the element whenever a page is reloaded before you can interact with it. For your code, please try modifying it to this:
driver.manage().timeouts().implicitlyWait(3000, TimeUnit.MILLISECONDS);
List<WebElement> elements = driver.findElements(By.tagName("a"));
//clicking all links
for (int i=0; i<elements.size(); i++){
    WebElement el = driver.findElements(By.tagName("a")).get(i);
    System.out.println("Link getting clicked: " + el.getText());
    el.click();
    driver.navigate().back();
}
