I'm trying to take a screenshot of a specific element using AShot.
Unfortunately, I'm not getting the element screenshot.
When I sent my code to someone else who ran it, the element screenshot was taken correctly.
I don't understand why it is not working on my computer.
The code is:
public class ScreenShot
{
    WebDriver driver;

    @BeforeClass
    public void StartSession()
    {
        WebDriverManager.chromedriver().setup();
        driver = new ChromeDriver();
        driver.manage().window().maximize();
        driver.get("https://www.google.com/");
    }

    @Test
    public void VerifyScreenShot()
    {
        TakeScreenShot();
        //CompareScreenShot();
    }

    @Step("take element screen shot")
    public void TakeScreenShot()
    {
        try
        {
            // find the element you want to picture
            WebElement imageElement = driver.findElement(By.id("hplogo"));
            // take the element screenshot
            Screenshot imageScreenShot = new AShot().coordsProvider(new WebDriverCoordsProvider()).takeScreenshot(driver, imageElement);
            // save the element picture in the project folder, under the img folder
            ImageIO.write(imageScreenShot.getImage(), "png", new File("./img/glogo.png"));
        }
        catch (Exception e)
        {
            System.out.println("Error writing image file: " + e);
        }
    }

    @AfterClass
    public void EndSession()
    {
        driver.quit();
    }
}
I found the solution.
I needed to change my computer's display scaling to 100% and then take the element screenshot.
Then the element image was as I expected.
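If changing the OS display scaling is not convenient, another approach is to tell AShot about the device pixel ratio instead, so the element coordinates line up even at 125% or 150% scaling. A minimal sketch, assuming your AShot version provides ShootingStrategies.scaling and that the driver is already on google.com (the 1.25f value below is just an example for 125% scaling):

import java.io.File;
import javax.imageio.ImageIO;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

import ru.yandex.qatools.ashot.AShot;
import ru.yandex.qatools.ashot.Screenshot;
import ru.yandex.qatools.ashot.shooting.ShootingStrategies;

public class ScaledElementShot {
    // Sketch only: 'driver' is assumed to be an already started driver on google.com,
    // and 1.25f is an example value for 125% display scaling - adjust to your setup.
    public static void captureLogo(WebDriver driver) throws Exception {
        WebElement logo = driver.findElement(By.id("hplogo"));
        Screenshot shot = new AShot()
                .shootingStrategy(ShootingStrategies.scaling(1.25f))
                .takeScreenshot(driver, logo);
        ImageIO.write(shot.getImage(), "png", new File("./img/glogo.png"));
    }
}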
Related
I am trying to test the GeeksForGeeks UI. When I click the Tutorials dropdown, then select Languages and select Java, it links to a new page and the following error occurs: org.openqa.selenium.StaleElementReferenceException. How can I solve this issue? I have tried all the possible solutions from Stack Overflow.
public class SeleniumTest {
    public static WebDriver driver;

    @BeforeClass
    public static void setupClass() {
        System.setProperty("webdriver.chrome.driver", "driver/chromedriver.exe");
    }

    @Before
    public void setup() {
        driver = new ChromeDriver();
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(5));
    }

    @After
    public void after() {
        driver.close();
    }

    @Test
    public void testGeeksForGeeksR() throws InterruptedException {
        driver.get("https://www.geeksforgeeks.org/");
        WebElement tutorialsMenu = driver.findElement(By.className("header-main__list-item"));
        tutorialsMenu.click();
        List<WebElement> tutorialsList = tutorialsMenu.findElements(By.tagName("li"));
        for (WebElement li : tutorialsList) {
            if (li.getText().equals("Languages")) {
                li.click();
                List<WebElement> languages = driver.findElements(By.tagName("a"));
                for (WebElement a : languages) {
                    if (a.getText().equals("Java")) {
                        WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(20));
                        wait.until(ExpectedConditions.elementToBeClickable(a));
                        a.click();
                        WebElement title = driver.findElement(By.className("entry-title"));
                        assertEquals("Java Programming Language", title.getText());
                    }
                }
            }
        }
        Thread.sleep(6000);
    }
}
Solution:
@Test
public void testGeeksForGeeksR() throws InterruptedException {
    driver.get("https://www.geeksforgeeks.org/");
    WebElement tutorialsMenu = driver.findElement(By.className("header-main__list-item"));
    tutorialsMenu.click();
    List<WebElement> tutorialsList = tutorialsMenu.findElements(By.tagName("li"));
    WebElement javaLanguage = null;
    for (WebElement li : tutorialsList) {
        if (li.getText().equals("Languages")) {
            li.click();
            List<WebElement> languages = driver.findElements(By.tagName("a"));
            for (WebElement a : languages) {
                if (a.getText().equals("Java")) {
                    javaLanguage = a;
                    break;
                }
            }
        }
    }
    javaLanguage.click();
    driver.switchTo().activeElement();
    WebElement title = driver.findElement(By.className("entry-title"));
    assertEquals("Java Programming Language", title.getText());
    Thread.sleep(3000);
}
After clicking on the a element with the text Java, the Java Programming Language page is opened.
At this point all the element references collected on the previous page become stale.
Generally, each Selenium WebElement object is actually a reference (pointer) to a physical web element on the page.
So when you open another web page, or refresh the existing page (reloading the web elements there), all the references to web elements on the previous page are no longer valid.
In Selenium terminology this situation is called a stale element.
Getting back to your specific code flow:
It looks like your goal here is to open the Java Programming Language page. If so, all you are missing is to exit the loop once that page is opened and finish the test.
In case you want to continue opening other tutorials from the menu on the main page, you will have to navigate back from the page you opened and then re-find all the elements you want to use there, as in the sketch below.
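A rough sketch of that second option, assuming the same locators (here the header-main__list-item class from your test) still match after navigating back; the point is to re-find the elements on every pass instead of reusing references from a previous page load:

import java.util.List;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

public class MenuWalkSketch {
    // Sketch only: visit every menu entry, re-finding the elements after each
    // navigation so no stale reference from the previous page load is reused.
    public static void visitAllMenuEntries(WebDriver driver) {
        int count = driver.findElements(By.cssSelector(".header-main__list-item li")).size();
        for (int i = 0; i < count; i++) {
            // re-collect the items on every pass; the old list went stale after navigating
            List<WebElement> items = driver.findElements(By.cssSelector(".header-main__list-item li"));
            items.get(i).click();
            // ... assert on the opened page here ...
            driver.navigate().back(); // going back invalidates the 'items' references again
        }
    }
}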
I am trying to automate the following actions:
Launching https://www.flipkart.com > Click on Mobiles > Mouse hover on Electronics and then click on Mi.
I am getting Exception in thread "main" StaleElementReferenceException: element is not attached to the page document in the mibutton() method.
Please refer to the error details section.
HTML Code
(screenshot of the Mi button markup, not included here)
Base class:
public class Base {
    static WebDriver driver;

    public void setupBrowser(String browser, String url) {
        String currDir = System.getProperty("user.dir");
        if (browser.equalsIgnoreCase("chrome")) {
            System.setProperty("webdriver.chrome.driver", currDir + "\\drivers\\chromedriver.exe");
            driver = new ChromeDriver();
        }
        else if (browser.equalsIgnoreCase("firefox")) {
            System.setProperty("webdriver.gecko.driver", currDir + "\\drivers\\geckodriver.exe");
            driver = new FirefoxDriver();
        }
        else if (browser.equalsIgnoreCase("edge")) {
            System.setProperty("webdriver.edge.driver", currDir + "\\drivers\\msedgedriver.exe");
            driver = new EdgeDriver();
        }
        else {
            System.out.println("Valid browser not found, therefore quitting");
            System.exit(0);
        }
        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(10, TimeUnit.SECONDS);
        if (!url.isEmpty())
            driver.get(url);
        else
            driver.get("about:blank");
    }

    public void closeBrowser() {
        driver.close();
    }
}
Page class:
public class pagetest extends Base {
    Actions action;

    public void closebtn() {
        driver.findElement(By.cssSelector("button._2doB4z")).click();
    }

    public void mibutton() {
        WebElement mobiles = driver.findElement(By.xpath("//div[text()='Mobiles']"));
        action = new Actions(driver);
        action.moveToElement(mobiles).click().perform();
        WebElement electronicsmenu = driver.findElement(By.xpath("//span[text()='Electronics']"));
        action.moveToElement(electronicsmenu).click().perform();
        List<WebElement> value = driver.findElements(By.xpath("//div[@class='_1kidPb']/div[@class='_1QrT3s']//a"));
        for (WebElement elem : value) {
            if (elem.getText().equals("Mi")) {
                elem.click();
            }
        }
        WebElement label = driver.findElement(By.xpath("//p[text()='Latest from MI : ']"));
        System.out.println("The label 'Latest from MI' is present : " + label.isEnabled());
    }

    public static void main(String[] args) {
        pagetest obj = new pagetest();
        obj.setupBrowser("chrome", "https://www.flipkart.com/");
        obj.closebtn();
        obj.mibutton();
    }
}
Error Details
Exception in thread "main" org.openqa.selenium.StaleElementReferenceException: stale element reference: element is not attached to the page document
(Session info: chrome=89.0.4389.90)
For documentation on this error, please visit: https://www.seleniumhq.org/exceptions/stale_element_reference.html
Never find elements and then immediately try to click on them. While the page is loading the elements are created, but after a few milliseconds they can change their attributes, size and/or location. This exception means that you found the element, but by the time you used it you had "lost" it.
In order to avoid StaleElementReferenceException, use a WebDriverWait object. You can wait for an element to be visible / clickable, as well as other conditions. The best option is to wait for the element to be clickable:
WebDriverWait wait = new WebDriverWait(driver, 20, 10);
wait.until(ExpectedConditions.elementToBeClickable(By.id("elementID")));
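If the element can still go stale between finding it and clicking it (for example because the page re-renders while it loads), a complementary pattern is to re-locate it by its locator on every attempt. A rough, generic sketch, not tied to this specific page:

import org.openqa.selenium.By;
import org.openqa.selenium.StaleElementReferenceException;
import org.openqa.selenium.WebDriver;

public class ClickWithRetry {
    // Sketch only: re-find the element by its locator on each attempt instead of
    // holding on to a WebElement reference the page may already have replaced.
    public static void clickFresh(WebDriver driver, By locator, int attempts) {
        for (int i = 0; i < attempts; i++) {
            try {
                driver.findElement(locator).click();
                return;
            } catch (StaleElementReferenceException e) {
                // the DOM was re-rendered between findElement and click; try again
            }
        }
        throw new StaleElementReferenceException("Element kept going stale: " + locator);
    }
}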
I am unable to click on the images present in the carousel shown in the snapshot below. I tried a lot but failed, though I was able to navigate to other images by clicking on the right-arrow navigation button. The image carousel is present below the Featured Vehicles header text (on the website) and has navigation buttons on both the left and right.
Site link: https://ryder.com/used-trucks
Below is one of the approaches that I tried; the code follows. Please help me with this as soon as possible.
public void click_Image_Carousel_To_Open_ProductDetailsPage() throws InterruptedException {
    JavascriptExecutor js = (JavascriptExecutor) driver;
    js.executeScript("arguments[0].scrollIntoView();", driver.findElement(By.xpath("//h2[contains(text(),'FEATURED VEHICLES')]")));
    WebElement ele1 = null;
    String image_link = null;
    List<WebElement> image_Carousel_Links_list = driver.findElements(By.xpath("(//div[@class='photo']/a)"));
    System.out.println("Size :" + image_Carousel_Links_list.size());
    WebElement image_Carousel_Next_Btn = driver.findElement(By.xpath("//BUTTON[@class='slick-next slick-arrow'][text()='Next']"));
    for (int i = 1; i < image_Carousel_Links_list.size(); i++) {
        System.out.println(i + ") Image links :" + driver.findElement(By.xpath("(//div[@class='photo']/a)[" + i + "]")).getAttribute("href"));
        System.out.println(" Element : (//div[@class='photo']/a)[" + i + "]");
        ele1 = driver.findElement(By.xpath("(//div[@class='photo']/a)[" + i + "]"));
        // this loop rediscovers the elements to avoid a stale element exception
        for (int k = 0; k < 500; k++) {
            image_Carousel_Links_list = driver.findElements(By.xpath("(//div[@class='photo']/a)"));
            if (driver.findElement(By.xpath("(//div[@class='photo']/a)[" + i + "]")).isDisplayed()) {
                ele1 = driver.findElement(By.xpath("(//div[@class='photo']/a)[" + i + "]"));
                break;
            }
            click_ImageCarousel_NextButton(image_Carousel_Next_Btn);
            Thread.sleep(300);
        }
        if (ele1.isDisplayed()) {
            ele1.click();
            System.out.println(i + ") Clicked on the image present in the image carousel :" + ele1.getAttribute("href"));
            Thread.sleep(3000);
            driver.navigate().back();
            Thread.sleep(4000);
        }
        else {
            System.out.println(" Image not found in image carousel");
        }
    }
}

private void click_ImageCarousel_NextButton(WebElement image_Carousel_Next_Btn) {
    image_Carousel_Next_Btn.click();
}
I'm running a series of automated GUI tests using Selenium in Java. These tests regularly take screenshots using:
public static void takeScreenshot(String screenshotPathAndName, WebDriver driver) {
    File scrFile = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
    try {
        FileUtils.copyFile(scrFile, new File(screenshotPathAndName));
    } catch (Exception e) {
        e.printStackTrace();
    }
}
This works excellently in Chrome and IE; however, in Firefox I keep getting large pieces of whitespace under the screenshots. I suspect that the whitespace is actually part of the page itself, but normally hidden from view in the browser (the scrollbar stops before the whitespace). I did a quick test with
driver.get("http://stackoverflow.com/");
takeScreenshot("D:\\TestRuns\\stackoverflow.png", driver);
and found that when using the Firefox driver the entire page is captured in the screenshot, while with the Chrome driver only what's shown in the browser is captured.
Is there any way to force the Firefox driver to take a screenshot containing ONLY what can actually be seen in the browser (what an actual user would see)?
Based on answers from this question I was able to add 4 lines of code to just crop the image down to the browser size. This does solve my problem, although it would have been nicer if it could be solved through the driver instead of cropping after the screenshot has been taken.
public static void takeScreenshot(String screenshotPathAndName, WebDriver driver) {
    File scrFile = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
    try {
        int height = driver.manage().window().getSize().getHeight();
        BufferedImage img = ImageIO.read(scrFile);
        BufferedImage dest = img.getSubimage(0, 0, img.getWidth(), height);
        ImageIO.write(dest, "png", scrFile);
        FileUtils.copyFile(scrFile, new File(screenshotPathAndName));
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Try this:
private static void snapshotBrowser(TakesScreenshot driver, String screenSnapshotName, File browserFile) {
    try {
        File scrFile = driver.getScreenshotAs(OutputType.FILE);
        log.info("PNG browser snapshot file name: \"{}\"", browserFile.toURI().toString());
        FileUtils.deleteQuietly(browserFile);
        FileUtils.moveFile(scrFile, browserFile);
    } catch (Exception e) {
        log.error("Could not create browser snapshot: " + screenSnapshotName, e);
    }
}
I want to do the following:
I want to fetch and display all links on a webpage.
After displaying them, I want to click each link one by one.
I'm able to do point 1 using a foreach loop, but I'm not able to do point 2.
Here is the code:
public class OpenAllLinks {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        driver.get("http://bing.com");
        List<WebElement> demovar = driver.findElements(By.tagName("a"));
        System.out.println(demovar.size());
        for (WebElement var : demovar) {
            System.out.println(var.getText()); // text present between the anchor tags
            System.out.println(var.getAttribute("href"));
        }
        for (WebElement var : demovar) {
            var.click();
        }
    }
}
When the first link is clicked, the browser loads the respective page. Hence the other links that you captured on the first page are no longer available.
If the intent is to navigate to every link's target, then store the target locations and navigate to them, like this:
driver.get("<some site>");
List<WebElement> links=driver.findElements(By.tagName("a"))
ArrayList<String> targets = new ArrayList<String>();
//collect targets locations
for (WebElement link : links) {
targets.add(link.getAttribute("href"));
}
for (WebElement target : targets) {
driver.get(target);
//do what is needed in the target
}
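One caveat to the snippet above (an assumption about typical pages, not part of the original answer): some a tags have no href at all, and driver.get(null) would fail, so it can be worth skipping those entries while collecting:

for (WebElement link : links) {
    String href = link.getAttribute("href");
    if (href != null && !href.isEmpty()) { // skip anchors without a usable target
        targets.add(href);
    }
}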
static WebDriver driver = null;

public static void main(String[] args) throws IOException {
    System.setProperty("webdriver.chrome.driver", "D:\\softwaretesting\\broswer driver\\chromedriver.exe");
    driver = new ChromeDriver(); // assign to the static field so verifyLinkActive() can take screenshots
    driver.manage().timeouts().implicitlyWait(20, TimeUnit.SECONDS);
    //driver.manage().window().maximize();
    driver.get("http://google.com/");
    List<WebElement> links = driver.findElements(By.tagName("a"));
    System.out.println("Total links are " + links.size());
    for (int i = 0; i < links.size(); i++) {
        WebElement ele = links.get(i);
        String url = ele.getAttribute("href");
        verifyLinkActive(url);
    }
}

public static void verifyLinkActive(String linkUrl) {
    try {
        URL url = new URL(linkUrl);
        HttpURLConnection httpURLConnect = (HttpURLConnection) url.openConnection();
        httpURLConnect.setConnectTimeout(3000);
        httpURLConnect.connect();
        if (httpURLConnect.getResponseCode() == 200) {
            System.out.println(linkUrl + " - " + httpURLConnect.getResponseMessage());
            File src = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
            FileUtils.copyFile(src, new File("D://screenshort//Spiritualbridge//" + System.currentTimeMillis() + ".png"));
        }
        if (httpURLConnect.getResponseCode() == HttpURLConnection.HTTP_NOT_FOUND) {
            System.out.println(linkUrl + " - " + httpURLConnect.getResponseMessage() + " - " + HttpURLConnection.HTTP_NOT_FOUND);
        }
    } catch (Exception e) {
        // links with null or malformed hrefs are ignored
    }
}
That happens because, when a link is clicked, the browser navigates to a new page where it cannot find the next element in your list to click. Please try the code below, which navigates to each link (I have used the code by @deepak above and modified it to fit your need):
WebDriver driver = new FirefoxDriver();
driver.manage().window().maximize();
driver.manage().timeouts().implicitlyWait(10, TimeUnit.SECONDS);
driver.get("http://bing.com");
List<WebElement> demovar = driver.findElements(By.tagName("a"));
System.out.println(demovar.size());
ArrayList<String> hrefs = new ArrayList<String>(); // list for storing all href values of the 'a' tags
for (WebElement var : demovar) {
    System.out.println(var.getText()); // text present between the anchor tags
    System.out.println(var.getAttribute("href"));
    hrefs.add(var.getAttribute("href"));
    System.out.println("*************************************");
}
// navigate to each link
int i = 0;
for (String href : hrefs) {
    driver.navigate().to(href);
    System.out.println((++i) + ": navigated to URL with href: " + href);
    Thread.sleep(3000); // to check whether the navigation is happening properly
    System.out.println("+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++");
}