As I understand it, files are cached by the browser by default. Developers don't have to set specific
headers to enable caching of these files. Right?
If yes, how does the browser know that it has to enable caching only for specific types of files and not for servlet/JSP calls? For example, if I make a call to the same servlet,
the request goes to the server side and a fresh page is served (not the browser-cached page).
I'm currently running into issues with accessing HTTP resources (which are all local) while using the XForms Filter for Orbeon (in accordance with http://doc.orbeon.com/xforms/filter.html).
I am using a Java Servlet to process the data, and I'm trying to use the xf:submission element to access the Servlet in order to get/post the data required. The Java App and Orbeon wars are both deployed in the same tomcat instance, with all of the session handling etc set up as described in the link above.
Initially, the Servlet forwards the request to a JSP with the XForms implementation, which loads fine. The page is then meant to request the data and display it. However, it doesn't do this, and the Tomcat localhost access logs show no requests were made.
I know both the Java code and the XForms implementation are correct, as I have written two JSPs containing the Java code from the servlet (and then call the JSPs directly from the xf:submission instead of via the HTTP requests) and it works perfectly.
<xf:submission id="post-results-submission"
               ref="instance('categories-instance')"
               resource="http://localhost:8082/EmbeddedTesting/questionnaire"
               method="post"
               serialization="application/xml"
               mediatype="application/xml"
               replace=""/>
<xf:submission id="get-data-submission"
               ref="instance('response-instance')"
               resource="http://localhost:8082/EmbeddedTesting/questionnaire"
               method="post"
               serialization="application/xml"
               mediatype="application/xml"
               replace="instance"
               instance="categories-instance"/>
These are the submission elements for accessing the HTTP resources.
Is there a reason these aren't being called at all (as the logs show), and if so, is it possible to fix them?
Also to note: I tested the servlet itself using the Advanced REST Chrome app and through the Orbeon Form Builder HTTP actions (then clicking test), and both worked fine. It just won't work here for some reason. I've also made sure that the licence (for Orbeon Forms PE) is in WEB-INF/resources/config and that it is still valid.
Here is a link to my XForms, uploaded to dropbox as an XML file so it can be previewed on dropbox: https://www.dropbox.com/s/aq4zx39ohjulcbx/index.xml?dl=0
I'm not sure if this is something I can actually do outside of Form Runner/Builder, so any help would be appreciated!
I have a GWT Web App (which requires a login) that calls a method on the Servlet (running in Tomcat) to generate a PDF file. The Servlet returns a URL to the file and opens the PDF in an iFrame.
Frame frame = new Frame(reportUrl);
frame.show();
Upon closing the frame (or browser), a request is made to delete the file that was generated server side. Now here is where the problem lies. If I log out of the web application, and open a new tab in IE, it shows the URL in the history that was used to display the PDF. Ideally this file is no longer accessible since it has been deleted on the server, and the user is no longer logged in, however the PDF still displays in the new tab. I assume this is because the PDF file is being cached.
I am unable to reproduce this behavior in Chrome, so I assume the file either isn't being cached in Chrome, or Chrome just handles things a little differently. Long story short, how do I make sure the file/url is no longer accessible once the user logs out of the web app?
Theoretically this is impossible, as you can't remotely clear the client-side cache. The user may also have used wget or similar to download the file, so you can't assume the information will be 100% inaccessible after the session has expired.
That being said, caching hints in the HTTP response headers can, to some extent, steer what a (well-behaved) client caches. As always, implementation differs across browsers. You can set HTTP headers either from your servlet directly in Java, or you can add them from, for instance, Apache HTTPD, specifying cache headers for all PDF downloads.
There are many resources on cache headers in HTTP, here's a good one: http://www.mobify.com/blog/beginners-guide-to-http-cache-headers/
I've also seen that HTTPS connections cause IE to be much stricter in what it caches, not sure if that is relevant/an option for you.
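As a minimal sketch of the servlet-side approach: these are the header values a PDF-serving servlet could set (via `response.setHeader(...)` in `doGet`) to discourage caching. The class and method names here are illustrative, not from the question's code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: the cache-defeating headers a PDF servlet would set on the
// response, e.g. via response.setHeader(name, value) in doGet().
public class PdfCacheHeaders {

    static Map<String, String> noCacheHeaders() {
        Map<String, String> headers = new LinkedHashMap<>();
        // Ask the browser and intermediate proxies not to store the file.
        headers.put("Cache-Control", "no-cache, no-store, must-revalidate");
        headers.put("Pragma", "no-cache"); // fallback for legacy HTTP/1.0 clients
        headers.put("Expires", "0");       // mark the response as already expired
        return headers;
    }

    public static void main(String[] args) {
        noCacheHeaders().forEach((k, v) -> System.out.println(k + ": " + v));
    }
}
```

Even with these headers set, the caveat above still applies: a misbehaving client may keep a copy anyway, so treat this as a hint, not a guarantee.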
I shifted from Eclipse to JDeveloper and ran into a weird problem that I was able to solve but not able to understand.
Whenever I made changes to HTML in JDeveloper's web projects, the changes were not reflected when I ran the HTML again. The old web pages kept coming up in the web browser: same source code, same CSS/JS. I found that as long as there were proxy settings in my web browser, the changes were not reflected; but if I switched off the proxy, the changes made to the HTML were reflected, i.e. the web page was displayed with the changes made since last time.
By proxy settings, I mean the proxy setting placed at the following:
Window -> Start Menu -> internet options -> Connections -> LAN Settings -> Proxy Server
I have tried to open the resulting URL in Google Chrome, Firefox, and Internet Explorer. As long as the web browser was using the proxy, the changes made to the HTML were not shown when running it again.
In Eclipse Juno I simply had to clean Tomcat's directory to get changes reflected.
Can anyone explain why this happens?
Web servers return HTTP headers with every response, and usually those headers specify how long the response can be cached for. Proxy servers read those headers and make a decision whenever they see the same request again -- whether to propagate that request to the server again, or to simply return the cached copy of the response.
You can modify your server's configuration so that it tells the proxy server not to cache pages. However, some proxy servers are misconfigured or broken, and will cache pages that they are not supposed to cache.
For those cases, one ugly solution that works is to give your JS and CSS files new names whenever you change them. For example, if your index.html file includes index.css and index.js, and you make a change to index.js, you can save the changed file as index.2.js and change the tag in your index.html file to point to index.2.js from now on.
That's a bit drastic, but it works. A simpler solution to start with is to refresh your page using Shift-F5 rather than just F5 (in your browser). That tells the browser to force a refresh of all cached pages, whenever possible.
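The renaming trick can be automated from the server side by appending a version token to each static resource URL, so every deployment looks like a brand-new file to browsers and proxies. This is a sketch; the version value here is a hypothetical build timestamp, and in practice it might come from a build property or VCS revision:

```java
// Sketch: cache-busting by appending a version token to static resource
// URLs, so browsers and proxies treat each deployment as a new file.
public class CacheBuster {
    // Hypothetical build timestamp; replace with a real build/VCS marker.
    static final String VERSION = "20240115";

    static String versioned(String path) {
        return path + "?v=" + VERSION; // e.g. /js/index.js?v=20240115
    }

    public static void main(String[] args) {
        System.out.println(versioned("/css/index.css"));
        System.out.println(versioned("/js/index.js"));
    }
}
```

Note that some caching proxies ignore or refuse to cache URLs with query strings, so renaming the file itself (index.2.js) remains the most reliable variant of this trick.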
This seems tied to your proxy server type. There are several proxy server types, one of which is a caching proxy server, which, when many users are connected to it, stores static pages locally on the server for repeated requests from the client (you). When you change the proxy, it is most likely just sending you an updated copy because it no longer has you as an active client, or because you are a new user.
I would assume that the new software you are using is serving pre-cached pages, whereas Eclipse Juno was generating screens on the fly in real time, bypassing the caching proxy.
I have a few identical JS and CSS files that I load from different JSPs.
If the CSS and JS files are already loaded, they should not be loaded again. But those files are being reloaded, each with a 200 status code in the Firebug console.
They should return a 304 (Not Modified) code instead.
I can hit the URL directly. I don't want to load the JS and CSS dynamically; I just load the JSP content.
The CSS and JS loading is done through relative paths. I couldn't copy the code here, because it is not allowed.
If any additional information is required, please let me know.
Assuming that you are referencing the CSS and JS files from your JSPs as static references in <script> and <link> tags, you can avoid the scenario where the browser is provided with these files on every request by setting HTTP headers when you first serve them.
Without appropriate expiration times or "freshness" guarantees for resources, browsers will typically end up issuing a request for a fresh copy of the resource every time, even though the resource may be present in the browser cache. Note that if the browser cache is disabled, then you, as a web programmer, cannot do anything about the browser's behavior unless you get your users to enable the browser cache; until then, the browser will request fresh copies of resources every single time. These requests may be served by a proxy cache, but it is not guaranteed that such a cache exists.
Setting resource expiration times or freshness guarantees in Java web applications can be accomplished by either of the following:
Configuring the HTTP server, or the web application server (if that is possible), to set Expires and Cache-Control headers in the HTTP responses generated for the CSS and JS files or MIME types. The configuration changes required vary from server to server and are too extensive to list here.
Writing a servlet filter for your web application that intercepts all responses for URLs ending in *.js and *.css and adds Expires and Cache-Control headers to those responses. This option is less preferable, especially if you wish to have the same amount of configurability that is afforded by web and application servers; you will need to provide any configuration options via init parameters for the filter in your web.xml file. Also, changes to the configuration will require a restart of the application containing the filter.
If you need to understand the values to be specified for the Expires and Cache-Control headers, then I would recommend that you read the "Caching Tutorial for Web Authors and Webmasters", which provides details on how to use the Expires and Cache-Control headers and how they affect caching behavior.
Suggested values for the expiration times specified in the Expires header, or for the max-age value in the Cache-Control header, range from a few minutes to a few hours, depending on how often you change the site content. You should also avoid specifying Cache-Control headers with no-cache and no-store values, as they would cause the browser to request the same resource again on the next attempt.
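As a sketch of the header values such a filter would produce, assuming an arbitrary one-hour freshness window (in a real filter's `doFilter` you would set these via `httpResponse.setHeader(...)` for requests ending in `.js` or `.css`; the class and constant names here are illustrative):

```java
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

// Sketch of the Expires / Cache-Control values a *.js / *.css servlet
// filter would set on each response before passing it down the chain.
public class StaticCacheHeaders {
    static final long MAX_AGE_SECONDS = 3600; // one hour; tune to your release cadence

    static String cacheControl() {
        // "public" allows shared (proxy) caches to store the resource too.
        return "public, max-age=" + MAX_AGE_SECONDS;
    }

    // The Expires header requires an RFC 1123 date.
    static String expires(ZonedDateTime now) {
        return now.plusSeconds(MAX_AGE_SECONDS)
                  .format(DateTimeFormatter.RFC_1123_DATE_TIME);
    }

    public static void main(String[] args) {
        ZonedDateTime now = ZonedDateTime.now(ZoneOffset.UTC);
        System.out.println("Cache-Control: " + cacheControl());
        System.out.println("Expires: " + expires(now));
    }
}
```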
I have a Java applet that needs to make HTTP requests to the server. A quick experiment showed that the session ID cookie from the browser is not sent along, so I'll have to set it manually.
Currently, I see two possible solutions:
somehow get hold of the cookie data from within the applet
pass the cookie information into the applet's constructor via JavaScript
I'd prefer the first solution, but I wasn't able to find a working approach. All the information I found (mostly netscape.javascript.JSObject) was outdated (my 1.5 VM does not have this class).
Any great ideas or resources I have not yet found?
Are you sure your JVM doesn't contain this class? You should look in $JAVA_HOME/lib/plugin.jar. I found it in my 1.5.0_14 installation.
If you are generating the page dynamically, you don't necessarily need to use JavaScript to communicate the session ID. As long as you know it server-side, you can place it into an applet parameter tag in the generated HTML which contains the <applet> tag.
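A sketch of that approach, assuming the generated page passes the session ID in a hypothetical `jsessionid` applet parameter (e.g. a param whose value is filled from `session.getId()` server-side). Inside the applet you would read it with `getParameter("jsessionid")` and re-attach it as a cookie on each request:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch: inside the applet, re-attach the container's session cookie
// manually, since the browser's cookies are not sent automatically.
// In a real applet, sessionId would come from getParameter("jsessionid").
public class SessionAwareRequest {

    static HttpURLConnection openWithSession(URL url, String sessionId)
            throws IOException {
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // JSESSIONID is the default Tomcat session cookie name.
        conn.setRequestProperty("Cookie", "JSESSIONID=" + sessionId);
        return conn;
    }
}
```

The server then associates the applet's requests with the same session as the page that embedded it, as long as the session has not expired in the meantime.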
Nowadays all browsers support HttpOnly cookies. If cookies are deliberately set by the server as HttpOnly in the appropriate response headers, applets may not be able to access them.