I have a web app running in IE. When I close the application by clicking the "X" in the top-right corner of IE, a Struts action is called (via JavaScript's window.onunload) that deletes a record from a table in the database.
But I have found this to be unreliable: sometimes the record is not deleted when I close IE. In those cases the Struts action was apparently never called.
More interesting: when I have Fiddler open, the record is deleted correctly every time I close IE, with no failures.
Thanks a lot.
You cannot rely on the onunload event.
Think about it from the web browser's perspective: should it wait until every open page has finished processing onunload, even if that takes 30 seconds? The browser probably does fire onunload reliably on all open tabs, but it only waits a very short time for the handlers to complete. That is why you see inconsistent behaviour. (Fiddler most likely "fixes" it only because the request is handed off almost instantly to the local proxy before the page is torn down.)
If you really have to do this, take a look at onbeforeunload, but its implementation is inconsistent across browsers; IIRC, Opera doesn't implement it.
I am developing a Spring MVC application and ran into an interesting case.
To make it easier to explain, take the Stack Overflow buttons at the top of the page as an example (I mean the Questions, Tags, Users, Badges, and Unanswered buttons).
In my app I have similar buttons. When the user clicks any of them, an AJAX call is made with the proper arguments; the server runs SQL queries and returns the results.
Now assume there is a crazy user like me who keeps clicking those buttons without a break. Each click makes an AJAX call, and whichever call finishes shows up on the front end. So even if the user clicks the Tags button last, its data may appear and then be overwritten by the earlier click on Questions, whose query took longer to return. How can I fix that? (I want the Tags data to be shown, since it is the user's last click.)
Also, once the user has clicked Questions and then Tags, I no longer need to run the SQL query for the Questions button. Is there some way to stop processing that query?
Thanks
The best way to handle this is through the user interface: if the user takes an action (clicking a button) that requires significant processing on the back end, your UI should prevent other actions on the page from sending further requests until the original one completes.
Visually, you could disable or gray out the other elements and make it obvious that work is in progress (spinners, progress bars, etc.).
On the server side, since each HTTP request is independent, it would be cumbersome and difficult to add logic that detects whether the user making the current request has another request still being processed.
You probably need the help of cookies: the first time the action is performed, write a cookie, and check that cookie before you process each subsequent request.
You cannot simply disable a link or button in the UI and hope the user cannot trigger it; it can always be done in other ways, so an additional server-side check is a must.
(I haven't read your post completely, but this follows from what I understood of the first answer...)
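A session attribute can play the same role as the cookie. Here is a minimal Spring MVC sketch of the idea, with every name invented for illustration: each click sends an incrementing sequence number, the server remembers the newest one it has seen, and a result whose query finished after a newer click arrived is simply discarded. Note that this does not cancel the SQL query that is already running; it only keeps stale results from reaching the page.

import javax.servlet.http.HttpSession;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;

// Illustrative sketch only; "tab", "seq" and runExpensiveQuery() are assumed names.
@Controller
public class TabDataController {

    @RequestMapping("/tabData")
    @ResponseBody
    public String tabData(@RequestParam("tab") String tab,
                          @RequestParam("seq") long seq,
                          HttpSession session) {
        session.setAttribute("latestSeq", seq);   // remember the newest click

        String result = runExpensiveQuery(tab);   // the slow SQL work

        Long latest = (Long) session.getAttribute("latestSeq");
        if (latest != null && latest > seq) {
            return "";                            // a newer click arrived; drop this result
        }
        return result;
    }

    private String runExpensiveQuery(String tab) {
        return "...";                             // placeholder for the real data access
    }
}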
I had a similar problem, and I tackled it this way.
I hand-coded the AJAX calls (as opposed to using jQuery etc.).
I had a single global XMLHttpRequest:
var xhr = new XMLHttpRequest();
When the user clicked something that needed an AJAX call, I aborted the previous call if one was still running:
if (xhr.readyState !== 0 && xhr.readyState !== 4)
    xhr.abort();
Then create a new instance of XHR, and do your business.
xhr = new XMLHttpRequest();
xhr.open("GET", myUrl, true);
// attach the callback function, then call xhr.send()
I have a small Vaadin v8 application that has several input fields (combo boxes, select groups, etc.). The content of most of these is determined by the value chosen in the first ComboBox. However, when I select something in it, all the others stay blank until I click one of them, at which point they all update. This is not the desired behaviour, and I assume it is caused by the server side being up to date but the client-side view not being refreshed (even when I add requestRepaint() in the first ComboBox's ValueChangeListener).
There must be some way to force Vaadin to fetch and display the data I want, even if no other components are clicked?
EDIT
I'm not allowed to post answers to my own question so soon, so I'm putting it here temporarily:
I found that there's a JavaScript call that syncs the client and server:
myComponent.getApplication().getMainWindow().executeJavaScript("javascript:vaadin.forceSync();");
The only problem I have now is that the ValueChangeListener on one of my combo boxes still only fires when I click another combo box (or the same one twice). It's the weirdest thing, because the second combo box fires its event perfectly when it is loaded.
Is the first ComboBox in "immediate" mode?
If not, it probably should be : component.setImmediate(true).
See https://vaadin.com/book/-/page/components.selection.html
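For illustration, a minimal sketch of what that looks like, assuming a pre-Vaadin-7 style API where setImmediate() and addListener() are available (the field names and values are invented):

import com.vaadin.data.Property;
import com.vaadin.ui.ComboBox;

final ComboBox country = new ComboBox("Country");
final ComboBox city = new ComboBox("City");
country.setImmediate(true); // send the selection to the server as soon as it changes
country.addListener(new Property.ValueChangeListener() {
    public void valueChange(Property.ValueChangeEvent event) {
        // repopulate the dependent field right away instead of on the next client request
        city.removeAllItems();
        city.addItem("City for " + event.getProperty().getValue());
    }
});

With setImmediate(true), the value change reaches the server on selection, so dependent fields can be updated in the same round trip.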
I had the same problem; here is how it can be done in version 8.0.5 (from 2017):
@Push
public class WebUi extends UI {
    @Override
    protected void init(VaadinRequest request) {
        // build the UI content here as usual
    }

    public void fireComponentUpdated() {
        getUI().push(); // push pending server-side changes to the browser
    }
}
There is a hack you can use if you have set a data source for your components that forces Vaadin to re-render them. I use this for updating tables that contain dynamic data:
yourcomponent.setContainerDataSource(yourcomponent.getContainerDataSource());
Did you requestRepaint on the correct components?
Keep in mind that requestRepaint marks the component as dirty but doesn't mean it will be repainted - a client can only be refreshed when it makes a request to the server.
See this thread https://vaadin.com/forum/-/message_boards/view_message/231271 for more information about your options (it deals with UI refreshes due to background thread processing).
In Vaadin 7 it is enough to put this line in the main UI.init(VaadinRequest) method:
UI.getCurrent().setPollInterval( 1000 );
if you want your UI to refresh (in this case) every second. This instructs the UI to poll the server for changes at the given interval.
Beware: the extra server traffic might be a problem if a lot of users run the application at the same time.
In Vaadin 6 you will have to play with ProgressIndicator (which can be invisible if you want) to achieve something similar to what UI.getCurrent().setPollInterval(int) does in Vaadin 7.
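For reference, the usual Vaadin 6 trick looks roughly like this (a sketch only; the component is added purely so that the client keeps polling the server, and "mainWindow" stands in for your application's main Window):

import com.vaadin.ui.ProgressIndicator;

ProgressIndicator poller = new ProgressIndicator();
poller.setPollingInterval(1000); // the client asks the server for changes every second
poller.setIndeterminate(true);
poller.setWidth("0px");          // shrink it away instead of setVisible(false),
poller.setHeight("0px");         // since a non-visible component is not rendered and would not poll
mainWindow.addComponent(poller);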
I have created a search bar on GAE, similar to Facebook's, that shows names and pictures as you type into it. For example, typing "jo" would bring up a drop-down with "John Smith" and "Michael Jordan". This works really well on the development server, and even works well in production when the user searches immediately after loading the page. But if I wait on the page, say 30 seconds to a minute, and then try to use the search bar, it takes a very long time to show the results, sometimes as long as 15 seconds, when it used to be immediate. Can someone explain what is going on here? Is there any way I can keep this request hot? Also, if I search for "jo", wait for the results, delete "jo" and then type another search, my results are shown immediately. This makes me think that something is shutting down if it is not kept active.
My search bar uses the jQuery autocomplete widget and fetches the results with jQuery AJAX GET requests.
Check whether it is actually starting new instances: go to the Dashboard and, in the chart section, select "Instances" instead of "Requests/Second" in the combo box.
This typically happens when your application gets little traffic: GAE then has to "warm up" by starting a new JVM, and that takes a few seconds.
More here about the "Loading Request".
The workaround is to use the Always On mode for your application. Note that this is a premium (paid) feature.
Here is the thing: my web app has loads of popups, and my boss wants them closed on session expiry, because when the session expires and a user presses refresh on a popup, he is shown the logon page, logs on, and is directed to the dashboard. A dashboard screen inside a popup is totally uncool. Here is where Google got me:
Have JavaScript close the popup onload, and generate this onload script into the response only when the session has expired (checking session expiry from the JSP and including the onload script conditionally).
Do you think this is a good way to do it? What is the best practice for this scenario?
P.S.: I am not allowed to use AJAX.
In a past life, I made a popup manager object that kept track of which windows were open. You should probably make one of these if you haven't already. Then you can use setTimeout to call a function after so many minutes (or whatever time you want) have gone by. That function checks for recent activity (probably via AJAX) and closes the popups if you determine that the session has expired. If not, it calls setTimeout again with the new time, properly adjusted for the most recent activity.
^^ Written before the AJAX edit.
Well, since you can't use AJAX, can you put something in the url that will tell you it's a popup? Then you'll know not to show the login screen when the user hits reload.
The best way would be an XMLHttpRequest that checks the login state and closes the popups if required; do this periodically.
Astute readers (meaning everyone) will notice that this is an AJAX request, but if you phrase it that way it might get accepted, as whoever dictated that you 'aren't allowed to use AJAX' is clearly an idiot.
An alternative way to implement modal dialogs in a web application is to:
Model the dialog in a DIV, default styled to display: none;
On desired action, inject/append the Modal dialog DIV into the page source
Reset the CSS display so the modal dialog DIV is visible, overlaid on top of the page by setting the CSS z-index property
Make the modal dialog disappear upon either successful execution or the user cancelling out
Because the modal dialog is part of the page source, it will disappear when the session times out. This approach doesn't spawn supporting windows that can be orphaned, which is what the poster is trying to address, and it fits the requirement of not using AJAX.
You can code these by hand, but I don't really recommend it because of having to support various browsers. I suggest looking at the Yahoo User Interface library. You can tailor it to suit your needs (i.e. only modal dialogs), and it would support AJAX if the requirements change down the road.
Beware of spawning modal dialogs from modal dialogs.
If your boss is asking you to achieve this without using AJAX, then you're in trouble. He should understand that the only connection a browser has to the server (without refreshing the page) is JavaScript, which is what he understands to be AJAX.
The best way to do this is to set up a script on the pages that asks the server whether the user is still logged in every 30 seconds or so:
setInterval(function() {
    $.get("loggedin.php", function(result) {
        if (!result.isLoggedIn)
            window.close();
    });
}, 30000);
This script assumes you're using the jQuery framework for rapid development of JavaScript solutions. It also relies on JSON (JavaScript Object Notation) to test the return value from the loggedin.php file.
Bottom line, you need to use AJAX. Tell your boss there is no other way. If he still doesn't get it, ask him to balance his checkbook without using math.
In theory, you could avoid AJAX by using a hidden flash widget...
But more practically, AJAX is the 'right' solution, and I think you will have to talk to your boss, determine where this 'no AJAX' rule came from, and convince him that AJAX is the best way to solve this problem.
Does he think AJAX would take too much time to implement? If so, you should prove him wrong. Does he think it will be hard to maintain? If so, show how simple the code will be and how widely used the common AJAX libraries are. If your boss is reasonable, then his goal is whatever is best for the product, and you should be able to reason with him.
Here's what I do:
selenium.click("link=mylink");
selenium.waitForPageToLoad(60000);
// do something, then navigate to a different page
// (window focus is never changed in-between)
selenium.click("link=mylink");
selenium.waitForPageToLoad(60000);
The link "mylink" does exist, the first invocation of click() always works. But the second click() sometimes seems to work, sometimes not.
It looks like the click() event is not triggered at all, because the page doesn't even start to load. Unfortunately this behaviour is underterministic.
Here's what I already tried:
Setting a longer timeout
=> did not help
Waiting for an element to be present after loading one page
=> doesn't work either, since the page does not even start to load
For now I ended up invoking click() twice, so:
selenium.click("link=mylink");
selenium.waitForPageToLoad(60000);
// do something, then navigate to a different page
// (window focus is never changed in-between)
selenium.click("link=mylink");
selenium.click("link=mylink");
selenium.waitForPageToLoad(60000);
That will work, but it's not a really nice solution. I've also seen a suggestion in another forum to write something like a 'clickAndWaitWithRetry':
try {
    super.click("link=mylink");
    super.waitForPageToLoad(60000);
}
catch (SeleniumException e) {
    super.click("link=mylink");
    super.waitForPageToLoad(60000);
}
But I think that is also not a proper solution....
Any ideas/explanations why the click() event is sometimes not triggered?
Sometimes, seemingly at random, Selenium just doesn't like to click certain anchor tags. I am not sure what causes it, but it happens. I find that in those cases, with a troublesome link, instead of doing
selenium.click(...)
do
selenium.fireEvent( locator, 'click' );
As others have stated above me, I have specifically had issues with anchor tags that appear as follows:
<a href="javascript:...." >
I've done Selenium for a while, and I have really developed a dislike for waitForPageToLoad(). You might consider always just waiting for the element in question to exist, as in the sketch below.
I find that this approach resolves most of the weird issues I run into like this one. The other possibility is that some JavaScript is preventing the link from doing anything the first time it is clicked. That seems unlikely, but it's worth a double-check.
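For example, something along these lines; it is only a sketch, and the helper name, the locator on the next page, and the 250 ms poll interval are all made up:

import com.thoughtworks.selenium.Selenium;

// Poll for an element of the target page instead of relying on waitForPageToLoad.
private void waitForElementPresent(Selenium selenium, String locator, long timeoutMillis)
        throws InterruptedException {
    long deadline = System.currentTimeMillis() + timeoutMillis;
    while (!selenium.isElementPresent(locator)) {
        if (System.currentTimeMillis() > deadline) {
            throw new RuntimeException("Timed out waiting for " + locator);
        }
        Thread.sleep(250); // check again in a quarter of a second
    }
}

// usage: click, then wait for something that only exists on the next page
selenium.click("link=mylink");
waitForElementPresent(selenium, "id=someElementOnTheNextPage", 60000);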
I just tried WebDriver (Selenium 2.0) and found that WebElement#sendKeys(Keys.ENTER) works.
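In case it is useful, the raw WebDriver version of that looks roughly like this (the link text comes from the question; the URL is a placeholder):

import org.openqa.selenium.By;
import org.openqa.selenium.Keys;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;

WebDriver driver = new FirefoxDriver();
driver.get("http://example.com/page-with-the-link"); // placeholder URL
WebElement link = driver.findElement(By.linkText("mylink"));
link.sendKeys(Keys.ENTER); // activate the link via the keyboard instead of click()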
Try selenium.pause before the selenium.click command. We tried everything above, but none of it seemed to resolve our problem. Finally a magic selenium.pause solved the problem for me.
Hope this solves your problem as well.
I am running into this issue now as well. From my usage, the following seems to be the most consistent:
#browser.click(selector, {:wait_for => :page})
Not exactly sure why that would be. But it seems that if you do:
#browser.click(selector)
[maybe some stuff here too]
#browser.wait_for(:wait_for => :page)
Then you could end up waiting for a page that has already been loaded (i.e. you end up waiting forever).
I dug into the Selenium source code and found this nugget:
def click(locator, options={})
  remote_control_command "click", [locator,]
  wait_for options
end

...

# Waits for a new page to load.
#
# Selenium constantly keeps track of new pages loading, and sets a
# "newPageLoaded" flag when it first notices a page load. Running
# any other Selenium command after turns the flag to false. Hence,
# if you want to wait for a page to load, you must wait immediately
# after a Selenium command that caused a page-load.
#
# * 'timeout_in_seconds' is a timeout in seconds, after which this
#   command will return with an error
def wait_for_page(timeout_in_seconds=nil)
  remote_control_command "waitForPageToLoad",
                         [actual_timeout_in_milliseconds(timeout_in_seconds),]
end
alias_method :wait_for_page_to_load, :wait_for_page
Basically, this is doing the following:
#browser.click(selector)
#browser.wait_for(:wait_for => :page)
However, as the comment states, the first thing necessary is to use the :wait_for command immediately after.
And of course... switching the order puts you into the same wait forever state.
#browser.wait_for(:wait_for => :page)
#browser.click(selector)
Without knowing all the details of Selenium, it seems as though Selenium needs to register the :wait_for trigger when it is passed as an option with click. Otherwise, you could end up waiting forever if you somehow tell Selenium to wait the very instant before :wait_for is called.
This one will work here:
selenium.waitForPageToLoad("60000");
selenium.click("link= my link");
I had the same problem with Selenium 1.0.12 and Firefox 5.0; I managed to make the automated tests work this way:
I removed all "AndWait" commands (sometimes they hang the test/browser)
I added a pause before the click
I added a waitForVisible after the click (usually I wait for the next HTML control I want to interact with on the next page).
It goes like this:
waitForVisible OK
pause 1000
click OK
waitForVisible link=Go
pause 1000
click Go
etc...
It seems that the "waitForVisible" passes too soon, i.e. before the event handlers are attached to the control (so clicking the control has no effect). If you wait for 1 second, that is enough time for the click handlers to be attached and become active.
The page has not finished loading when you click on it. Check for different elements on the page to be sure that it has loaded.
Also, wait for the link to appear and be visible before you click on it.
Make sure you are increasing the timeout in the correct place. The lines you posted are:
selenium.click("link=mylink");
selenium.waitForPageToLoad(60000);
This wait is for the page that loads after the click. But the problem you describe is that it fails when trying to perform the click itself. So make sure to increase the wait before that click:
selenium.click("link=mylink");
selenium.waitForPageToLoad(60000);
// do something, then navigate to a different page
// (window focus is never changed in-between)
// after the last click in these steps:
selenium.waitForPageToLoad(60000);
// anything else that happened after that
selenium.click("link=mylink");
selenium.waitForPageToLoad(60000);
If you're using FireFox, make sure you're using 3.6 or later.
WaitForPageToLoad uses the JavaScript 'readyState' variable, but Firefox only supported this from 3.6 onwards. Earlier versions simply don't wait.
(see org.openqa.selenium.internal.seleniumemulation.WaitForPageToLoad)
I am having the same issue :( with Selenium IDE 1.0.10, PHPUnit 3.5, and Selenium RC server 1.0.3.
EDITED:
The culprit seems to be the browser, FF 3.6.13; after upgrading to FF 3.6.14 all my errors are gone. My tests are working like a charm :).
Selenium IDE 1.0.10
PHPUnit: 3.5.6
Selenium Server: selenium-2.0b1 (selenium-2.0b2 is buggy)
selenium.click("link=Continue to this website (not recommended).");
Thread.sleep(5000);
I've been having the same issue and found that I have to get the text of the link first. I know it's not the ideal way to do it, but fortunately my links are uniquely named.
C# code:
var text = Selenium.GetText(myLocator);
Selenium.Click("link=" + text);
Try this:
selenium.fireEvent(ID, "keypress");