How to perform file upload monitoring with Play! framework - java

Is it possible to monitor file uploads, somehow, with Play! framework? Also, if the file is big (e.g. 500+ MB), would it be possible to save the received bytes into a temporary file instead of keeping them in memory? (see update below)
Note: there is no code to show, as these are open questions and I cannot seem to find the answers with Google.
Thanks!
** Update **
(I almost forgot about this question.) Well, apparently, uploaded files are stored in temporary files, and they are passed to the controller action not as a byte array (or similar) but as a Java File object.
But even in a RESTful environment, file monitoring can be achieved.
** Update 2 **
Is there a way to get early event listeners on incoming HTTP requests? This could allow for monitoring request data transfer.

Large requests and temp files
Play! already stores large HTTP requests in temp files named after UUIDs (thus reducing the server's memory footprint). Once the request is done, the file is deleted.
Monitoring uploads in Play!
Play! uses the (awesome) Netty project for its HTTP server stack (and also on the client side, if you're considering Async HTTP Client).
Netty is:
asynchronous
event-driven
100% HTTP
Given Play!'s stack, you should be able to implement your "upload progress bar" or something similar. Actually, Async HTTP Client already provides progress listeners for file uploads and resumable downloads (see the quick start guide).
But the play.server package doesn't seem to provide such functionality or an extension point.
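There is no public hook for this, but conceptually a Netty upstream handler inserted into the pipeline could count the bytes of a chunked upload as they arrive. A rough sketch against the Netty 3.x API (the generation Play! 1.x ships with); whether and where you could actually register it in Play!'s pipeline is exactly the missing extension point:

    import org.jboss.netty.channel.ChannelHandlerContext;
    import org.jboss.netty.channel.MessageEvent;
    import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
    import org.jboss.netty.handler.codec.http.HttpChunk;
    import org.jboss.netty.handler.codec.http.HttpHeaders;
    import org.jboss.netty.handler.codec.http.HttpMessage;

    // Hypothetical handler: tracks upload progress for chunked HTTP requests.
    public class UploadProgressHandler extends SimpleChannelUpstreamHandler {

        private long expected = -1;
        private long received = 0;

        @Override
        public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) throws Exception {
            Object msg = e.getMessage();
            if (msg instanceof HttpMessage) {
                // Content-Length announces the total size (if the client sent it)
                expected = HttpHeaders.getContentLength((HttpMessage) msg, -1);
                received = 0;
            } else if (msg instanceof HttpChunk) {
                received += ((HttpChunk) msg).getContent().readableBytes();
                if (expected > 0) {
                    System.out.printf("upload progress: %d%%%n", received * 100 / expected);
                }
            }
            // hand the message on to the rest of the pipeline (Play!'s own handler)
            super.messageReceived(ctx, e);
        }
    }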
Monitoring uploads anyway
I think Play! is meant to be run behind a "real" HTTP server in reverse proxy mode (like nginx or lighttpd).
So you'd be better off using an upload progress module for one of those servers (like the HttpUploadProgressModule for nginx) than messing with Play!'s HTTP stack.

Related

Server Side Caching

I have a standalone WildFly 9.0.2 and I want to cache on the server side the responses for certain requests.
Some of the requests are available for all users (visitors), others should be available only to authenticated users.
I do not understand from the documentation how to do this.
Can you point me to a tutorial or manual that implements this functionality?
I started WildFly using the default Infinispan configuration found in standalone\configuration\standalone.xml.
Then I modified the response object to carry caching information in its headers, hoping it would work like JAX-RS, where the headers are checked and caching happens automatically.
    // e.g. inside a servlet, setting cache headers on the current response
    protected void doGet(HttpServletRequest request, final HttpServletResponse response) {
        long current = System.currentTimeMillis();
        long expires = current + 86400000L;   // now + 24 hours, in milliseconds
        // allow any cache (public) to keep the response for one day
        response.setHeader("Cache-Control", "no-transform, max-age=" + 86400 + ", public");
        response.addDateHeader("Expires", expires);
        response.addDateHeader("Last-Modified", current);
    }
That unfortunately did not work on the server side (though it did work for my web application, which reads the cache headers properly and re-uses its local cache).
When I tried to view the Infinispan settings from the administration panel at http://127.0.0.1:9990, I get an exception and cannot proceed.
Thank you in advance for your help.
There is no standalone Java servlet server that does response caching the way you anticipated. The headers you set in the response will be interpreted by the browser (which does cache) or by intermediate proxies, which might cache as well. Specialized caching proxies include Varnish and NGINX; these are also called edge proxies.
Crafting a library that enables a standalone server to cache the way you want seems possible: the normal request flow could be intercepted by a servlet Filter. I don't know of any public library that does something like that.
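A very rough sketch of what such a Filter could look like (purely illustrative: no expiry, no size limit, no handling of response headers or authenticated users, and the class name is made up):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.io.PrintWriter;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;

    import javax.servlet.*;
    import javax.servlet.http.*;

    // Naive sketch: buffer the body of GET responses and serve repeated
    // requests for the same URI from an in-memory map.
    public class NaiveResponseCacheFilter implements Filter {

        private final ConcurrentMap<String, byte[]> cache = new ConcurrentHashMap<>();

        @Override
        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            HttpServletRequest request = (HttpServletRequest) req;
            HttpServletResponse response = (HttpServletResponse) res;

            if (!"GET".equals(request.getMethod())) {
                chain.doFilter(req, res);                      // only cache GETs
                return;
            }

            String key = request.getRequestURI();
            byte[] cached = cache.get(key);
            if (cached != null) {
                response.getOutputStream().write(cached);      // cache hit
                return;
            }

            // Cache miss: let the application render, but capture the bytes.
            final ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            final PrintWriter bufferedWriter =
                    new PrintWriter(new OutputStreamWriter(buffer, response.getCharacterEncoding()));
            HttpServletResponse capturing = new HttpServletResponseWrapper(response) {
                @Override
                public ServletOutputStream getOutputStream() {
                    return new ServletOutputStream() {
                        @Override public void write(int b) { buffer.write(b); }
                        @Override public boolean isReady() { return true; }
                        @Override public void setWriteListener(WriteListener listener) { }
                    };
                }
                @Override
                public PrintWriter getWriter() { return bufferedWriter; }
            };

            chain.doFilter(request, capturing);
            bufferedWriter.flush();

            byte[] body = buffer.toByteArray();
            cache.put(key, body);
            response.getOutputStream().write(body);            // forward to the client
        }

        @Override public void init(FilterConfig filterConfig) { }
        @Override public void destroy() { }
    }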
If you want to cache inside the application, the normal thing to do is to use a caching library, like EHCache, cache2k, Google Guava Cache and others.
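For example, with Guava's cache (the class and method names here are made up for illustration), a size-bounded object cache looks roughly like this:

    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.TimeUnit;

    import com.google.common.cache.Cache;
    import com.google.common.cache.CacheBuilder;

    public class ProductCache {

        // at most 10,000 entries, each kept for 10 minutes after it was written
        private final Cache<String, String> cache = CacheBuilder.newBuilder()
                .maximumSize(10_000)
                .expireAfterWrite(10, TimeUnit.MINUTES)
                .build();

        public String productJson(final String id) throws ExecutionException {
            // compute on a miss, return the cached value on a hit
            return cache.get(id, () -> loadFromDatabase(id));
        }

        private String loadFromDatabase(String id) {
            return "{\"id\": \"" + id + "\"}";   // stand-in for the real data access
        }
    }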
In your specific case I would recommend that you become familiar with a caching proxy server like NGINX and put it in front of your application. That is, let's say, the "industry standard". Doing HTTP response caching inside the Java server is not desirable, for a couple of reasons:
In case of a cache hit, the response from the proxy is faster and the Java server is not hit
You can scale, by putting more caching proxies in front of your application
The Java heap is not a good fit to cache a large amount of data. Where should it go? There are caches that do overflow to disk. This needs complex setup, as well as a caching proxy in front of your application
For debugging and transparency it is better that the server generates a fresh answer when a request is sent to it
I always recommend doing caching inside the application, too. However, we do it on the Java object level. The cache size is limited, so the heap stays small. A lot of the cached objects inside the application are used for many different responses, so object caching operates at a finer level than HTTP response caching.
Only in some special cases do we do something similar to HTTP response caching inside the application, too. This is used to compress or recompress some images and CSS resources that are requested very often. There is some potential for this to be generally useful; maybe we will open source it.
Hope that helps.

Restful Server Response triggered Via Client

This question might sound a bit abstract, already answered (though my search didn't turn up a convenient answer), or not specific at all, but I will try to provide as much information as I can.
I am building a mobile application which will gather and send sensory data to a remote server. The remote server will collect all these data in a MySQL database and make computations (not the MySQL database itself, but another process/program). What I want to know is:
After some updates in the database, is it doable to send a response from a RESTful server to a certain client (probably the one who did the last update), using something like "a background thread"? Or should this be done via a socket connection between server and client?
Some remarks:
I am using Java EE, Spring MVC with Hibernate and Tomcat (because I am familiar with the environment, though I would be using it in a more asynchronous manner).
I thought this would be a convenient way because the SQL schema is not very complicated and security and authentication are not needed (it's a prototype).
Also there is a front-end webpage that will have to visualize these data, so such a back-end system would look like a good option for getting the job done fast.
Lastly I saw this solution :
Is there a way to 'listen' for a database event and update a page in real time?
My issue is that, besides the page, I want to update the client side with messages from the RESTful server.
If all of the above is unnecessary and a simpler client-server application would prove better and less complex, please feel free to tell me.
Thank you in advance.
Generally you should upload your data to a resource on the server (e.g. POST /widgets), and the server should immediately return a 201 Created or (if creation is too slow and needs to happen later) a 202 Accepted status. There are several approaches after that happens, each with its merits:
Polling - The server's response includes a location which the client can then proceed to poll until a change happens (e.g. check for an update every second). This is the easiest approach and quite efficient if you use HTTP caching effectively and the average number of checks is relatively low. (A minimal sketch of this approach follows after this list.)
Push notification - The server sends a push notification when the change happens, the report is generated, etc. Obviously this requires you to store the client's details and their notification requirements. This is probably the cleanest approach and also easy to scale. In the case of Android (and also iOS) you have free push notifications available via Google Cloud Messaging.
Set up a persistent connection between client and server, e.g. using a WebSocket or a low-level TCP connection. This should yield the fastest response times, but it will probably drain the phone's battery, be harder to scale on the server, and be more complex to code.
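A hedged sketch of the polling approach in Spring MVC (Spring 4 style; the controller name, URLs, and job handling are made up for illustration): the POST returns 202 Accepted plus a Location header to poll, and a GET on that location reports whether the server-side computation has produced a result yet.

    import java.util.Map;
    import java.util.UUID;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ConcurrentHashMap;

    import org.springframework.http.HttpHeaders;
    import org.springframework.http.HttpStatus;
    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestBody;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class SensorDataController {

        // jobId -> result; absent while the computation is still running
        private final Map<String, String> results = new ConcurrentHashMap<>();

        @RequestMapping(value = "/measurements", method = RequestMethod.POST)
        public ResponseEntity<Void> upload(@RequestBody String payload) {
            final String jobId = UUID.randomUUID().toString();

            // stand-in for the real computation that runs after the update
            CompletableFuture.runAsync(() -> results.put(jobId, "processed " + payload.length() + " chars"));

            HttpHeaders headers = new HttpHeaders();
            headers.add("Location", "/measurements/" + jobId);
            return new ResponseEntity<>(headers, HttpStatus.ACCEPTED);   // 202 Accepted
        }

        @RequestMapping(value = "/measurements/{jobId}", method = RequestMethod.GET)
        public ResponseEntity<String> poll(@PathVariable("jobId") String jobId) {
            String result = results.get(jobId);
            if (result == null) {
                return new ResponseEntity<>("PENDING", HttpStatus.OK);   // client keeps polling
            }
            return new ResponseEntity<>(result, HttpStatus.OK);
        }
    }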

Client server java application: send large file using SOAP and AXIS2

I have to send millions of data items over the network using SOAP web services (java2wsdl) between a Java client and server. So I tried serializing the objects into a file and then sending it to the server.
But the problem is that serialization generates a very large file, which causes memory problems in the Java application.
Since the file is very big, I tried to split it into small ones. The problem is that I then have to send n files between the client and the server, which consumes a lot of time, while the objective is to optimize the processing time.
Do you have any suggestions for optimizing the processing time while avoiding "out of memory" errors?
Web services aren't designed primarily as a large file transfer mechanism. For that, dedicated file transfer protocols do a better job, e.g. handling partial transfers, error recovery, etc.
Try this solution
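Whatever transport you end up with, the key to avoiding OutOfMemoryError is to stream the data in bounded chunks rather than building one huge serialized blob in memory. A minimal, JDK-only sketch of chunked streaming over HTTP (the URL and file name are placeholders):

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ChunkedUpload {
        public static void main(String[] args) throws IOException {
            File payload = new File("objects.bin");   // the (large) serialized data
            HttpURLConnection conn =
                    (HttpURLConnection) new URL("http://server.example.com/upload").openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setChunkedStreamingMode(64 * 1024);  // never buffers the whole body

            try (InputStream in = new FileInputStream(payload);
                 OutputStream out = conn.getOutputStream()) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);       // constant memory use
                }
            }
            System.out.println("Server answered: " + conn.getResponseCode());
        }
    }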

Sending large data over a java web service

I have a Java web service that returns a large amount of data. Is there a standard way to stream a response rather than trying to return a huge chunk of data at once?
This problem is analogous to the older problem of bringing back large RSS feeds. You can do it by parameterizing the request, e.g. http://host/myservice?start=0&count=100, or by including next/prev URLs in the response itself.
The latter approach has a lot of advantages. I'll search for a link that describes it and post it here if I find one.
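For illustration, a hedged JAX-RS 2.0 sketch of the parameterized approach, with an in-memory list standing in for the real data and a "next" link included in the response (the resource path and names are made up):

    import java.util.ArrayList;
    import java.util.List;

    import javax.ws.rs.DefaultValue;
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.QueryParam;
    import javax.ws.rs.core.GenericEntity;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;

    @Path("/myservice")
    public class PagedResource {

        // stand-in for the real, large data set
        private static final List<String> ALL_ITEMS = new ArrayList<>();
        static {
            for (int i = 0; i < 100_000; i++) {
                ALL_ITEMS.add("item-" + i);
            }
        }

        @GET
        @Produces(MediaType.APPLICATION_JSON)
        public Response page(@QueryParam("start") @DefaultValue("0") int start,
                             @QueryParam("count") @DefaultValue("100") int count) {
            int from = Math.min(start, ALL_ITEMS.size());
            int to = Math.min(start + count, ALL_ITEMS.size());
            List<String> pageItems = ALL_ITEMS.subList(from, to);

            // the client follows the "next" link until it gets an empty page
            String nextUrl = String.format("/myservice?start=%d&count=%d", to, count);
            return Response.ok(new GenericEntity<List<String>>(pageItems) { })
                    .link(nextUrl, "next")
                    .build();
        }
    }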
I would look into a Comet-like approach.
From Wikipedia:
Comet is a web application model in which a long-held HTTP request
allows a web server to push data to a browser, without the browser
explicitly requesting it.
Basically, rather than sending the large data all at once, allow your web server to push data at its own pace and according to your needs.
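As a hedged illustration of the idea with the standard Servlet 3.0 async API (the scheduled write simply simulates data becoming available later; the servlet name and URL are made up):

    import java.io.IOException;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    import javax.servlet.AsyncContext;
    import javax.servlet.ServletException;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Long-polling sketch: the request is held open and the server writes a
    // piece of data when it becomes available, instead of returning everything at once.
    @WebServlet(urlPatterns = "/data", asyncSupported = true)
    public class CometLikeServlet extends HttpServlet {

        private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            final AsyncContext async = req.startAsync();
            async.setTimeout(30_000);                       // give up after 30 seconds

            // pretend the next chunk of data becomes available two seconds later
            scheduler.schedule(() -> {
                try {
                    async.getResponse().getWriter().write("{\"chunk\": 1}");
                } catch (IOException ignored) {
                } finally {
                    async.complete();                       // release the held request
                }
            }, 2, TimeUnit.SECONDS);
        }
    }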
A web service might not be a good method for data transfer.
If I were you, I would set up another service such as FTP or SFTP.
The server puts the data at a specific path on the FTP server and sends the path information to the client in the web service response.

File upload by multiple users

I am uploading a file to a web server using a file upload API. It works well for a single user; if multiple users upload files simultaneously, how can I improve my code using threads?
What type of web server are you using? Typically web servers process separate requests on separate threads, so you shouldn't have to do anything special; your web service code will be inherently multi-threaded.
According to the Servlet API, each request is supposed to be processed by a single thread, so you shouldn't have any issues.
However, if you're trying to maximize the number of users your server can service, you might take a look at advanced connectors for Tomcat or whatever container you are using:
http://tomcat.apache.org/tomcat-6.0-doc/aio.html
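Independent of the connector, it is worth making sure the upload handling itself has no shared mutable state between requests, e.g. each upload goes to its own target file. A hedged Servlet 3.0 sketch (the paths, form field name, and class name are made up):

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.UUID;

    import javax.servlet.ServletException;
    import javax.servlet.annotation.MultipartConfig;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.Part;

    // The container already runs each request on its own thread, so the main
    // thing to get right is avoiding shared state: every upload gets a unique
    // target file instead of a fixed path.
    @WebServlet("/upload")
    @MultipartConfig
    public class ConcurrentSafeUploadServlet extends HttpServlet {

        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            Part filePart = req.getPart("file");                    // form field named "file"
            Path target = Paths.get("/tmp/uploads", UUID.randomUUID() + ".bin");
            Files.createDirectories(target.getParent());
            try (InputStream in = filePart.getInputStream()) {
                Files.copy(in, target);                             // no two requests share a file
            }
            resp.getWriter().write("stored as " + target.getFileName());
        }
    }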
