GWT - Prevent PDF file from being cached in IE? - java

I have a GWT web app (which requires a login) that calls a servlet (running in Tomcat) to generate a PDF file. The servlet returns a URL to the file, and the app opens the PDF in an iFrame:
Frame frame = new Frame(reportUrl);
frame.show();
Upon closing the frame (or the browser), a request is made to delete the file that was generated server-side. Now here is where the problem lies: if I log out of the web application and open a new tab in IE, the URL that was used to display the PDF shows up in the history. Ideally that file would no longer be accessible, since it has been deleted on the server and the user is no longer logged in; however, the PDF still displays in the new tab. I assume this is because the PDF file is being cached.
I am unable to reproduce this behavior in Chrome, so I assume the file either isn't being cached in Chrome, or Chrome just handles things a little differently. Long story short: how do I make sure the file/URL is no longer accessible once the user logs out of the web app?

Theoretically this is impossible, as you can't remotely clear the client-side cache. The user may also have used wget or the like to download the file, so you can't assume the information will be 100% inaccessible after the session has expired.
That being said, caching hints in the HTTP response headers can, to some extent, steer what a (well-behaved) client caches. As always, implementations differ across browsers. You can set the HTTP headers either directly from your servlet in Java, or you can add them in Apache HTTPD, for instance for all PDF downloads.
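For illustration, here is a minimal sketch of how those headers might be set directly from a servlet before streaming the PDF (the class name and the streaming details are placeholders, not the asker's actual code):

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class PdfReportServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Tell the browser and any intermediaries not to store the response.
        resp.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
        resp.setHeader("Pragma", "no-cache"); // HTTP/1.0 fallback for older clients
        resp.setDateHeader("Expires", 0);     // a date in the past: already expired

        resp.setContentType("application/pdf");
        // ... stream the generated PDF to resp.getOutputStream() ...
    }
}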
There are many resources on cache headers in HTTP, here's a good one: http://www.mobify.com/blog/beginners-guide-to-http-cache-headers/
I've also seen that HTTPS connections cause IE to be much stricter about what it caches; I'm not sure if that is relevant, or an option, for you.

Related

Is there any way to encrypt form data header values which can be seen in network resources?

I can see my credentials in the network tab of the browser. I don't know if that is supposed to happen or if it is a browser problem, but I don't want this for my website (Ruby on Rails or Java).
HTTPS does encrypt the whole request, including the headers. The reason you can see them, I believe, is that you are looking with the tools of the very same browser in which the request was generated; the browser, by virtue of being the originator of the information, naturally gets to peek at it. Try to sniff the packets outside the browser and you will find the headers are encrypted.

JavaScript and CSS files are cached by the browser by default?

As per my understanding, these files are cached by the browser by default; developers don't have to set specific headers to enable caching of them. Right?
If so, how does the browser know to enable caching only for specific types of files, and not for servlet/JSP calls? For example, if I make a call to the same servlet, the request goes to the server side and a fresh page is served (not the browser-cached page).

Best practice to update web application on client side (GWT)

Now I have a problem:
The user opens the web app page and gets the JavaScript (ModuleName.nocache.js).
Then I update the client side (RPC requests, views, etc.).
The user didn't close the web app tab in the browser and didn't refresh the page. He clicks somewhere and gets a random explosion: for example, RPC doesn't work, servlets have moved somewhere, and there may be many errors, or none.
Now I want to implement this scenario:
The user must have a cookie attribute with the web app version.
On each request I read it, and in the response force him to update the page (I don't know how).
If the user's request can't be delivered, GWT forces him to update the page (I don't know how).
But I think there must be a best-practice-way to solve this problem.
Catch IncompatibleRemoteServiceExceptions and StatusCodeExceptions in your AsyncCallbacks. The first one tells you the client-side code is not compatible with the server-side code; the second can tell you that there is no longer an RPC servlet there (look for a 404 status code).
You can then show a message to the user prompting him to reload the page (this is what Google Groups does, for example).
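For example, a minimal sketch of such an AsyncCallback (the alert wording is made up):

import com.google.gwt.user.client.Window;
import com.google.gwt.user.client.rpc.AsyncCallback;
import com.google.gwt.user.client.rpc.IncompatibleRemoteServiceException;
import com.google.gwt.user.client.rpc.StatusCodeException;

AsyncCallback<String> callback = new AsyncCallback<String>() {
    @Override
    public void onSuccess(String result) {
        // normal result handling
    }

    @Override
    public void onFailure(Throwable caught) {
        if (caught instanceof IncompatibleRemoteServiceException) {
            // Client code was compiled against an older version of the service.
            Window.alert("A new version is available, please reload the page.");
        } else if (caught instanceof StatusCodeException
                && ((StatusCodeException) caught).getStatusCode() == 404) {
            // The RPC servlet is no longer there (e.g. it moved in a redeploy).
            Window.alert("The application was updated, please reload the page.");
        }
    }
};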
That said, there are some ways to mitigate this if the changes are relatively small: you can keep the old serialization policy files around server-side so the server can process requests from different client versions. The changes have to be somewhat backwards-compatible, though.
You could then detect the client version on the server side (either by keeping a list of the latest serialization policy files and checking whether the client is using one of them or an older one, or by using a request header or cookie) and include something in the response (a response header or cookie) telling it there's a new version.
Or you could regularly poll the server (obviously not using RPC though) for the latest version of the app.
Another approach:
When your app loads, get com.google.gwt.core.client.GWT.getPermutationStrongName().
If the value saved in your LocalStorage or cookies is empty, save the permutation strong name there and finish the flow; if not, go to the next step.
If the saved value is different from the current one, force the client to re-request everything from the server (if you want or need to "break" the async process, you can use GQuery promises), then replace the value in your LocalStorage or cookies with the new GWT permutation id.
You can do this every time your app loads (it's not a big deal), and you will know when the version on your server has changed.
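A rough sketch of that flow in Java, assuming the LocalStorage variant (the storage key name is made up):

import com.google.gwt.core.client.GWT;
import com.google.gwt.storage.client.Storage;
import com.google.gwt.user.client.Window;

// Call this early in onModuleLoad().
private void checkPermutation() {
    Storage storage = Storage.getLocalStorageIfSupported();
    if (storage == null) {
        return; // no LocalStorage; fall back to cookies if you need this path
    }
    String current = GWT.getPermutationStrongName();
    String saved = storage.getItem("gwt-permutation");
    if (saved == null) {
        storage.setItem("gwt-permutation", current); // first load: remember it
    } else if (!saved.equals(current)) {
        storage.setItem("gwt-permutation", current); // server was updated
        Window.Location.reload();                    // force a clean reload
    }
}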

Changes in HTML not reflected as long as I am using proxy

I shifted from Eclipse to JDeveloper. I had a weird problem that I was able to solve but not able to understand.
Whenever I made any changes to HTML in JDeveloper's web projects, the changes were not reflected when I ran the HTML again. The old web pages kept coming up in the web browser: same source code, same CSS/JS. I found that as long as there were proxy settings in my web browser, the changes were not reflected; but if I switched off the proxy, the changes made to the HTML were reflected, i.e. the web pages were displayed with the changes made since last time.
By proxy settings I mean the setting found at:
Windows -> Start Menu -> Internet Options -> Connections -> LAN Settings -> Proxy Server
I tried the resulting URL in Google Chrome, Firefox, and Internet Explorer. As long as the web browser was using the proxy, the changes made to the HTML were not shown when running it again.
In Eclipse Juno I simply had to clean Tomcat's directory to get changes reflected.
Can anyone explain why this happens?
Web servers return HTTP headers with every response, and usually those headers specify how long the response can be cached for. Proxy servers read those headers and, whenever they see the same request again, decide whether to propagate that request to the server or to simply return a cached copy of the response.
You can modify your server's configuration so that it tells the proxy server not to cache pages. However, some proxy servers are misconfigured or broken, and will cache pages that they are not supposed to cache.
For those cases, one ugly solution that works is to give your JS and CSS files new names whenever you change them. For example, if your index.html file includes index.css and index.js and you make a change to index.js, you can save the changed file as index.2.js and change the tag in your index.html file to point to index.2.js from then on.
That's a bit drastic, but it works. A simpler solution to start with is to refresh your page using Shift-F5 rather than just F5 (in your browser). That tells the browser to force a refresh of all cached pages, whenever possible.
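If you want to automate the renaming trick rather than doing it by hand, one option (my own sketch, not something from the answer) is to derive the versioned name from a hash of the file's content at build time, so any change produces a brand-new URL that no proxy can have cached:

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.security.MessageDigest;

public class AssetVersioner {
    // Turns "index.js" into something like "index.2f4a.js".
    public static String versionedName(Path asset) throws Exception {
        byte[] digest = MessageDigest.getInstance("MD5")
                .digest(Files.readAllBytes(asset));
        String hash = String.format("%02x%02x", digest[0], digest[1]);
        String name = asset.getFileName().toString();
        int dot = name.lastIndexOf('.');
        return name.substring(0, dot) + "." + hash + name.substring(dot);
    }

    public static void main(String[] args) throws Exception {
        // Reference the printed name from index.html instead of index.js.
        System.out.println(versionedName(Paths.get("index.js")));
    }
}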
This seems tied to your proxy server type. There are several kinds of proxy server, one of which is a caching proxy server. When many users are connected to it, it stores static pages locally so it can answer repeated requests from clients (you) without going back to the origin. When you change the proxy, it most likely sends you an up-to-date copy because it no longer has you as an active client, or because you are a new user to it.
I would assume that the new software you are using is serving pre-cached pages, whereas Eclipse Juno was generating the pages on the fly in real time, bypassing the cached copies.

The Case of The Mysterious Content-Disposition Header

Our product includes a Flash application that is loaded by SWFObject. For one customer, when accessing this SWF via HTTPS (but not HTTP), Flash Player will not load it.
I asked the customer to go directly to the URL of the SWF file (rather than the wrapper page):
When he does so via HTTP, the SWF loads in the browser.
When he does so via HTTPS, IE7 presents him with a 'save file' dialog box. This implies that a "Content-Disposition: attachment" header is present in the response. That would also explain why the SWF isn't loading in Flash Player: as a security measure, it will not play SWFs served with that header.
So, I have a couple of things I'm trying to figure out:
How can I be certain that a Content-Disposition header is being sent by the server (rather than it being a strange artifact of IE7)? The user only has IE7 at his disposal, and cannot use Firefox, Chrome, etc. IE7 doesn't include the handy 'Network' tab that's present in IE9's developer tools.
Assuming that the header is present, how is it getting there? They are running Tomcat 6. The SWF is being served by Tomcat's default servlet. The header appears to be present if the HTTPS connector is used, but not if the HTTP connector is used. The Tomcat configuration is stock except for enabling the HTTPS connector.
On a side note, I don't trust Flash's cache clearing. On my machine under IE9, the SWF is often served from cache even after I explicitly clear the browser cache and Flash Player's stored data: I don't see any request for it in Fiddler or in Tomcat's access logs, but the SWF loads in the browser. Am I missing something here? Could the customer be accessing some bogus cached version of the SWF?
Edit: Apparently the 'clear cache' command in the developer tools doesn't really clear the cache. Using the standard method yielded the expected results.
Edit 2: Tracing within Tomcat indicates that the Content-Disposition header is not set. I don't know for certain that it's not being received by the browser, but AFAIK the browser is connecting directly to Tomcat. This seems like an odd browser-side behavior.
The issue had to do with presence of the following headers in the response:
Cache-Control: no-cache
Pragma: no-cache
These were being sent by Tomcat because the page was protected by a security constraint (configured in conf/web.xml). These headers caused IE7 to act just as if a 'Content-Disposition: attachment' header were present.
My solution was to have the customer add the following configuration to Tomcat's conf/context.xml:
<Valve className="org.apache.catalina.authenticator.BasicAuthenticator" securePagesWithPragma="false" />
This replaces the headers with:
Cache-Control: private
...which should still fulfill the goal of preventing proxies from caching the page, while working around IE's issues. This was based on the solution found here:
http://daveharris.wordpress.com/2007/07/09/how-to-configure-cache-control-in-tomcat/
However, that very similar solution suppressed the headers entirely. Details of these attributes can be found in the Tomcat docs here:
http://tomcat.apache.org/tomcat-6.0-doc/config/valve.html#Basic_Authenticator_Valve/Attributes
You should be able to log the outgoing HTTP responses on the server side before encryption, use the null cipher, or provide the RSA keys to Wireshark and inspect the headers in a packet capture.
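For the first option, one way (my suggestion, not the answerer's; the pattern and file names are arbitrary) is Tomcat's standard AccessLogValve, whose %{...}o codes log outgoing response headers. In conf/server.xml:

<Valve className="org.apache.catalina.valves.AccessLogValve"
       directory="logs" prefix="header_trace." suffix=".log"
       pattern='%h %t "%r" %s %{Content-Disposition}o %{Cache-Control}o %{Pragma}o' />

This records, per request, exactly which Content-Disposition, Cache-Control, and Pragma values Tomcat sent before TLS encryption, which in this case would have shown that the culprit was the no-cache pair rather than Content-Disposition.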
