I'm trying to crawl a site that has a title and comments. When the page loads, 40 comments are rendered; after clicking the "Load comments" button, 40 more appear, and so on. I want to load all the comments first and then scrape them all.
The problem is that I'm only getting the first 40. This is my code:
WebDriver driver = new HtmlUnitDriver();
driver.get("http://www.website.com");
String title = driver.findElement(By.className("title")).getText();
while(driver.findElements(By.className("load-comments")).isDisplayed() || !driver.findElement(By.className("expand-loading")).isDisplayed()){
    Thread.sleep(500);
    if(!driver.findElements(By.className("loading")).isDisplayed()){
        driver.findElements(By.className("load-comments")).click();
    }
}
List<WebElement> comments = (List<WebElement>) driver.findElements(By.className("comment"));
for(WebElement comm : comments){
    System.out.print(comm.getText());
}
So if I need all 150 comments, in this situation I'm only getting the first 40 that are visible when the page first loads.
I tried options.addArguments("--headless"); with ChromeDriver(options);, but it's very slow.
PS: The load-comments button is hidden once all comments are loaded, and the loading element is displayed while the page is loading new comments.
The website you provided didn't display any comments. Please provide the exact web address so we can locate the elements.
Why are you using driver.findElements instead of driver.findElement for the isDisplayed condition? findElements returns a List<WebElement>, which has no isDisplayed() method, so this logic will give you a compilation error.
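A corrected sketch of the loop, using findElement plus an explicit wait instead of Thread.sleep. The class names are taken from your question, so treat the behaviour of "load-comments" and "loading" as assumptions about your page, not working code:
// Sketch only: assumes the "load-comments" button disappears once everything
// is loaded and a "loading" spinner is shown while new comments arrive.
WebDriverWait wait = new WebDriverWait(driver, 10);
while (!driver.findElements(By.className("load-comments")).isEmpty()
        && driver.findElement(By.className("load-comments")).isDisplayed()) {
    driver.findElement(By.className("load-comments")).click();
    // wait for the spinner to go away before the next click
    wait.until(ExpectedConditions.invisibilityOfElementLocated(By.className("loading")));
}
List<WebElement> comments = driver.findElements(By.className("comment"));
for (WebElement comm : comments) {
    System.out.println(comm.getText());
}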
You need to add an argument to set the screen resolution when running headless. I suggest you get it working in Chrome with the UI first, then check headless.
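For example, something along these lines (the window size is just a common default, adjust as needed):
ChromeOptions options = new ChromeOptions();
// headless Chrome starts with a small viewport, so pin the window size explicitly
options.addArguments("--headless", "--window-size=1920,1080");
WebDriver driver = new ChromeDriver(options);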
I'm writing a test with Selenium WebDriver (Java), opening https://cloud.google.com in the driver.
I start by finding the search input field and calling sendKeys("search phrase \n"). After that the page starts changing its content, and I'm trying to intercept these changes with WebDriverWait:
// first Wait: wait for the page to start changing its content by removing the search icon
new WebDriverWait(driver, 30).until(ExpectedConditions.invisibilityOf(searchInputFieldIcon));
// second Wait: wait for a new hyperlink to appear (it shows up in the search results after the page content is updated asynchronously, without a full reload)
new WebDriverWait(driver, 30)
    .until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//a[@href='https://cloud.google.com/products/calculator']")));
The point is that the Wait doesn't wait 30 seconds for the element to show up. The code just throws an exception:
org.openqa.selenium.NoSuchElementException:
no such element: Unable to locate element: {"method":"xpath","selector":"//a[@href='https://cloud.google.com/products/calculator']"}
Any help will be much appreciated!
Please check the attached screenshot. The href there is different from the one you used in your code.
You can use the code below:
wait.until(ExpectedConditions.visibilityOfElementLocated(By.linkText("Google Cloud Platform Pricing")));
To locate the first search result you can use the following xpath:
//a[contains(text(),'Google Cloud Platform Pricing')]
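For instance, combined with an explicit wait (a sketch; the 30-second timeout is just an example value):
WebDriverWait wait = new WebDriverWait(driver, 30);
// wait until the first search result link is visible, then click it
WebElement firstResult = wait.until(ExpectedConditions.visibilityOfElementLocated(
        By.xpath("//a[contains(text(),'Google Cloud Platform Pricing')]")));
firstResult.click();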
Checking your xpath
You can check whether your xpath is correct or not from the browser itself.
Go to DevTools (Ctrl + Shift + I)
In the 'Elements' tab, press Ctrl + F
Input the xpath that you want to check
It will show you whether the xpath is valid and how many web elements it matches.
I am a beginner in Selenium and I would like to click a file submission field.
I have already written code to connect to the page, click the buttons, etc. (everything works, my driver is fine),
but it has been impossible to click on the add-file control.
I looked on the internet for how to do it; I added waits, tried iterating over the frames, and used JavaScript for the hidden class... I tried all the buttons in the field and it doesn't detect them.
Add File
Source code
Thread.sleep(2000);
JavascriptExecutor js = (JavascriptExecutor) driver;
js.executeScript("window.scrollBy(0,1000)");
WebDriverWait wait = new WebDriverWait(driver, 60); // 1 minute
wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//*[@id=\"yui_3_17_2_1_1584634673387_348\"]/div[1]/div[1]/a")));
org.openqa.selenium.NoSuchElementException: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="yui_3_17_2_1_1584634673387_348"]/div[1]/div[1]/a"}
Do you have any ideas?
The primary issue I observed looking at the code is the incorrect locator in your explicit wait condition. It can be replaced with the code below:
wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//*[@id='yui_3_17_2_1_1584634673387_348']/div[1]/div[1]/a")));
Basic XPath syntax for reference, though it has its variations:
//tagname[@attribute='value']
Additional advice: though I am not sure about the application you are automating, the ID is likely to change. Based on the DOM structure you provided in the link above, I would change the locator to something along the lines of:
wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//a[@role='button'][@title='Add..']")));
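One more thing worth trying for file uploads: instead of clicking the styled button, Selenium can usually send the file path straight to the underlying input element of type "file". A sketch, where the locator and path are assumptions for illustration:
// find the real <input type="file"> behind the visible "Add File" button (locator is a guess)
WebElement fileInput = driver.findElement(By.cssSelector("input[type='file']"));
// sendKeys with an absolute path uploads the file without opening the OS file dialog
fileInput.sendKeys("C:\\path\\to\\file.txt");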
I didn't find any useful info about my problem; sorry if this is a repeat.
For example, I want to click the "Mobile site" link at the bottom of the main page of http://www.bbc.com/. Normally I do something like this to click my button:
driver.getMouse(driver.findElement(By.id("blq-footer-mobile"))).click();
but now I need to simulate the activity of a real user:
1. I need to scroll the page to the bottom
2. move the cursor onto the link
3. click it
I really tried everything I found on the internet, but nothing worked.
WebDriver simulates user interactions with web applications using native browser APIs, so as long as you are using the pure WebDriver API, you are already simulating a natural user. You don't need to scroll explicitly; WebDriver will do that for you. If it isn't scrolling, that's a bug, so please report it accordingly. As for your question, here is code that works:
WebDriver driver = new FirefoxDriver();
driver.get("http://www.bbc.com/");
WebElement element = driver.findElement(By.id("blq-footer-mobile"));
element.click();
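If you do want to drive the mouse explicitly (scroll to the element, hover over it, then click), the Actions class covers all three steps in one chain. A sketch:
WebElement link = driver.findElement(By.id("blq-footer-mobile"));
// moveToElement scrolls the element into view and hovers over it before the click
new Actions(driver)
        .moveToElement(link)
        .click()
        .perform();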
The Mobile site link on the above website will just take you to the UK website of the BBC.
That means a click on the Mobile site link on http://www.bbc.com/ will actually lead you to http://www.bbc.co.uk/, where the page remains the same with just the URL changed.
If you really want to experiment with the Mobile site link, use this URL: http://www.bbc.co.uk/
You can try the following code:
WebDriver driver = new FirefoxDriver();
driver.get("http://www.bbc.co.uk/");
new WebDriverWait(driver,30).until(ExpectedConditions.visibilityOfElementLocated(By.id("blq-footer-mobile"))).click();
This will wait for the element's visibility and click it, which will take you to the actual mobile site of the BBC.
My question is simple: using Selenium, how do you keep clicking links when each hyperlink opens in a new page, a new window, or the same page?
For example I have following links on a webpage:
Log in
Sign up
Forgot Password?
Signup with us
follow this link
Home
Terms
Privacy Policy
Here is the small snippet of code that I have written to click these links on the webpage:
List<WebElement> elements = driver.findElements(By.tagName("a"));
// clicking all links
for (WebElement el : elements){
    System.out.println("Link getting clicked" + el.getText());
    el.click();
    driver.navigate().back();
}
As you can see, I am trying to get the links and click them one by one. However, I am getting an error after the first click itself. The console says: "org.openqa.selenium.StaleElementReferenceException: Element is no longer attached to the DOM".
I am pretty sure I know the cause: the links are getting opened in the same webpage, and the back navigation I do is not helping me at all.
Any thoughts / suggestions?
This could be due to program execution advancing to the call to "driver.navigate().back();" before the page has loaded.
Try introducing an implicit wait, which tells the "WebDriver to poll the DOM for a certain amount of time when trying to find an element or elements if they are not immediately available".
E.g., when you create your web driver, try:
WebDriver driver = new FirefoxDriver();
driver.manage().timeouts().implicitlyWait(2, TimeUnit.SECONDS);
You have to re-find an element whenever the page is reloaded before you can interact with it. For your code, please try modifying it to this:
driver.manage().timeouts().implicitlyWait(3000, TimeUnit.MILLISECONDS);
List<WebElement> elements = driver.findElements(By.tagName("a"));
// clicking all links
for (int i = 0; i < elements.size(); i++){
    WebElement el = driver.findElements(By.tagName("a")).get(i);
    System.out.println("Link getting clicked" + el.getText());
    el.click();
    driver.navigate().back();
}
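Another way to sidestep stale references entirely is to collect the href values up front and navigate to each URL directly instead of clicking and going back. A sketch, assuming the links carry plain href attributes:
List<WebElement> links = driver.findElements(By.tagName("a"));
List<String> urls = new ArrayList<>();
for (WebElement link : links) {
    String href = link.getAttribute("href");
    if (href != null) {
        urls.add(href);
    }
}
// visiting by URL means no element reference can go stale
for (String url : urls) {
    System.out.println("Visiting: " + url);
    driver.get(url);
}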
I have written code to scrape some information from a website, using Selenium WebDriver. My problem is this: one page shows some information, but initially only ten entries are displayed, and I am able to scrape those ten. More entries are displayed only after clicking the "load more" link at the bottom of the page. Whenever I click "load more", the remaining entries are loaded onto the page without refreshing the whole page (probably an AJAX update), and I am unable to scrape these newly loaded entries. Please help.
This is the code I have written:
WebDriver driver = new HtmlUnitDriver(BrowserVersion.FIREFOX_3_6);
driver.get("some site url");
Thread.sleep(500);
driver.findElement(By.xpath("//*[@id=\"username\"]")).sendKeys("user name");
driver.findElement(By.xpath("//*[@id=\"password\"]")).sendKeys("password");
driver.findElement(By.xpath("//*[@id=\"login\"]//div/button")).click();
Thread.sleep(200);
if(driver.getPageSource().contains("Hi "+un)) {
    driver.get("http://www.somesite/m/searches/Loads/new");
    driver.findElement(By.xpath("//*[@id=\"Criteria_PostingAge\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_PostingAge\"]")).sendKeys("12");
    driver.findElement(By.xpath("//*[@id=\"Criteria_Origin_RawValue\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_Origin_RawValue\"]")).sendKeys(orgn[paraCount]);
    driver.findElement(By.xpath("//*[@id=\"Criteria_Destination_RawValue\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_Destination_RawValue\"]")).sendKeys(destn[paraCount]);
    date = new Date();
    calendar = Calendar.getInstance();
    calendar.setTime(date);
    driver.findElement(By.xpath("//*[@id=\"Criteria_PickupFrom\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_PickupFrom\"]")).sendKeys(pickupDtFmt.format(date));
    calendar.add(Calendar.DAY_OF_MONTH, 1);
    date = calendar.getTime();
    driver.findElement(By.xpath("//*[@id=\"Criteria_PickupTo\"]")).clear();
    driver.findElement(By.xpath("//*[@id=\"Criteria_PickupTo\"]")).sendKeys(pickupDtFmt.format(date));
    driver.findElements(By.xpath("//*[@id=\"search-entry\"]//div/input")).get(11).click();
    Thread.sleep(2000);
    // The code up to here gets me the first ten entries.
    // In order to get more entries I need to click the load more link.
    driver.findElement(By.xpath("//*[@id=\"loadMore\"]")).click();
    Thread.sleep(3000);
    WebElement myDynamicElement = (new WebDriverWait(driver, 30))
        .until(ExpectedConditions.presenceOfElementLocated(By.xpath("//*[@id=\"search-results\"]/div[2]/ul/li[12]")));
Actually, after clicking the load more link I was supposed to get ten more entries, so 20 in total. But I am getting only the same 10 entries I got when the page first loaded.
I bet you would get a different result if you switched to using GhostDriver instead. There is no reason to use HtmlUnitDriver when GhostDriver can do the same and is probably more cutting-edge technology.
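Whichever driver you use, it may also help to wait for the number of result rows to actually grow after the click instead of sleeping for a fixed time. A sketch, assuming each result is an li under the search-results list as in your xpaths, and a Selenium version recent enough to have ExpectedConditions.numberOfElementsToBeMoreThan:
By resultRows = By.xpath("//*[@id=\"search-results\"]/div[2]/ul/li");
int before = driver.findElements(resultRows).size();
driver.findElement(By.xpath("//*[@id=\"loadMore\"]")).click();
// block until more rows than before are present, up to 30 seconds
new WebDriverWait(driver, 30)
        .until(ExpectedConditions.numberOfElementsToBeMoreThan(resultRows, before));
System.out.println("Now have " + driver.findElements(resultRows).size() + " entries");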