I'm looking for a way to handle my data while the multipart file is being transferred. I've tried servlets and Spring Boot, but both only hand control to the POST handler after all files have been fully uploaded. Is there a way to process the data without waiting for the transfer to be done?
thanks
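One possible approach (a minimal sketch, not tested against your setup): the Apache Commons FileUpload streaming API lets a servlet iterate over the multipart parts while they are still being received, instead of waiting for the container to buffer the whole request. With Spring Boot you would likely also need to turn off the built-in multipart handling (e.g. spring.servlet.multipart.enabled=false) so the raw stream reaches the servlet; the class and field names below are placeholders.

```java
import java.io.IOException;
import java.io.InputStream;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.fileupload.FileItemIterator;
import org.apache.commons.fileupload.FileItemStream;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

public class StreamingUploadServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        try {
            // Iterate over the multipart parts as they arrive on the wire,
            // instead of waiting for the whole request to be buffered.
            FileItemIterator parts = new ServletFileUpload().getItemIterator(request);
            while (parts.hasNext()) {
                FileItemStream part = parts.next();
                if (!part.isFormField()) {
                    try (InputStream stream = part.openStream()) {
                        byte[] buffer = new byte[8192];
                        int read;
                        while ((read = stream.read(buffer)) != -1) {
                            // handle this chunk while the upload is still in progress
                        }
                    }
                }
            }
        } catch (FileUploadException e) {
            throw new ServletException(e);
        }
    }
}
```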
I am developing a web application in Java Spring where I want the user to be able to upload a CSV file from the front-end and then see the real-time progress of the importing process; after importing, they should be able to search individual entries from the imported data.
The importing process would consist of actually uploading the file (sending it via REST API POST request) and then reading it and saving its contents to a database so the user would be able to search from this data.
How could I show the real-time progress of this process? I found a tutorial for jQuery that shows the progress of the amount of data uploaded/transferred, but since most of the work is done while processing the uploaded file, I would prefer a solution where, before processing the lines, I determine the number of lines in the file and the user then sees a live message like:
Lines processed: 1 out of 10000
It could update incrementally, but since a single line is processed quite quickly, showing every single line count is not that important.
Either way, the question is: what's the easiest way to send these messages from a Spring REST API to the client?
I found a solution myself and used Web Sockets for that.
I used this approach from the Spring documentation:
https://spring.io/guides/gs/messaging-stomp-websocket/
It lets you send a message for each processed line to a front-end listener (once the WebSocket topic/connection has been established). I ended up using a different approach for the data import itself, a batch insert, so per-line messages weren't available to me, but WebSockets are capable of doing that.
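For anyone looking for a concrete starting point, here is a minimal sketch of the sending side, assuming a STOMP WebSocket endpoint is already configured as in the linked guide; the /topic/import-progress destination and the class names are my own placeholders.

```java
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Service;

@Service
public class ImportProgressNotifier {

    private final SimpMessagingTemplate messagingTemplate;

    public ImportProgressNotifier(SimpMessagingTemplate messagingTemplate) {
        this.messagingTemplate = messagingTemplate;
    }

    // Called from the import loop; the client subscribes to /topic/import-progress.
    public void reportProgress(long processed, long total) {
        messagingTemplate.convertAndSend("/topic/import-progress",
                "Lines processed: " + processed + " out of " + total);
    }
}
```

The import loop would call reportProgress(lineNumber, totalLines) after each line (or every N lines), and the front end subscribes to /topic/import-progress to display the message.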
In my Java web application I have to process an Excel file uploaded by the user. There are two ways to process it: first as a File, second as an InputStream.
I think the InputStream approach will consume a lot of memory.
Is there any possible risk if I first save the user-uploaded file as .xls or .xlsx and then process it?
What are the pros & cons of both approaches?
The best way to handle web application files is to process them after the upload has completed and the file has been saved on your server.
Streaming file processing should be avoided because HTTP is designed around a request/response model; you shouldn't make the web client wait until you finish processing the file.
The best thing to do is upload the file to a directory and send the web client an upload-success message, possibly with a link where the end user can check the results later.
Then have a scheduled task process the files in the upload directory and post the results to the results page, as in the sketch below.
This way the web application has no unnecessary delays and stays scalable.
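A minimal sketch of that setup, assuming Spring Boot with @EnableScheduling; the paths, endpoint and timing below are placeholders:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
class UploadController {

    private static final Path UPLOAD_DIR = Paths.get("uploads");

    @PostMapping("/upload")
    public String upload(@RequestParam("file") MultipartFile file) throws IOException {
        // Save the file and answer immediately; the heavy work happens later.
        Files.createDirectories(UPLOAD_DIR);
        Path target = UPLOAD_DIR.resolve(file.getOriginalFilename());
        Files.copy(file.getInputStream(), target, StandardCopyOption.REPLACE_EXISTING);
        return "Upload received; results will appear on the results page.";
    }
}

@Component
class UploadProcessor {

    // Runs every minute and processes whatever has accumulated in the directory.
    @Scheduled(fixedDelay = 60_000)
    public void processPendingUploads() throws IOException {
        try (DirectoryStream<Path> files = Files.newDirectoryStream(Paths.get("uploads"))) {
            for (Path file : files) {
                // parse the file, store the results, then move or delete it
            }
        }
    }
}
```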
I have a problem and I don't know how to resolve it...
I have 2 .dbf files with ~10000 records (geospatial information) that I want to read from a JSP page. Every time I read 200 records I have to send them to a servlet in some format (I haven't decided which yet).
The servlet must save every record as a Document object in Google App Engine (the limit of 200 records is specified by the App Engine APIs).
I can't upload the file to the server and read it server-side because some AWT classes are not supported by App Engine, so I tried to read the files client-side and send the parsed records to the server, but I don't know how to do this.
Does someone have a solution to this problem?
A JSP, once compiled, is nothing but a servlet. That being said, you say:
I can't upload the file to the server and read it server-side because some AWT classes are not supported by App Engine
Once you finish reading 200 records, why not insert them into App Engine from the JSP itself? The JSP will run on the server either way.
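A rough sketch of the server-side batching, where Record, the record source and saveBatch() are hypothetical placeholders for your DBF parsing code and the App Engine document insert:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchImporter {

    private static final int BATCH_SIZE = 200;

    /** Placeholder for one parsed DBF record. */
    static class Record { }

    public void importRecords(Iterable<Record> records) {
        // Collect records and flush every 200 to stay within the API limit.
        List<Record> batch = new ArrayList<>(BATCH_SIZE);
        for (Record record : records) {
            batch.add(record);
            if (batch.size() == BATCH_SIZE) {
                saveBatch(batch);
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            saveBatch(batch); // flush the final partial batch
        }
    }

    private void saveBatch(List<Record> batch) {
        // Hypothetical: insert these 200-or-fewer records as Documents
        // using the App Engine API of your choice.
    }
}
```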
How can we reduce the time spent uploading a file from a JSP to a servlet in Apache Tomcat, put it in a queue, and start something else in the meantime while the upload is going on?
You can make the call asynchronous using @Asynchronous.
The server should be able to handle any (reasonable) number of simultaneous requests: just let the upload happen and do something else!
Now, doing that on the "same" page as the upload is not possible if you are using a plain old HTTP POST with a form. Instead, you'll have to use one of those Flash-based upload tools, or maybe there is a JavaScript one you can use that sends the upload some other way (XMLHttpRequest?). Then the upload can run while the page itself stays functional.
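As a sketch of the "put it in a queue" part (assuming Servlet 3.0 multipart support; the names and paths are placeholders): accept the upload, hand the saved file to a worker queue, and respond immediately so the client can move on.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

@WebServlet("/upload")
@MultipartConfig
public class QueuedUploadServlet extends HttpServlet {

    // Single worker thread: uploaded files are queued and processed one by one.
    private final ExecutorService workQueue = Executors.newSingleThreadExecutor();

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Part filePart = req.getPart("file");
        File saved = File.createTempFile("upload-", ".dat");
        Files.copy(filePart.getInputStream(), saved.toPath(),
                StandardCopyOption.REPLACE_EXISTING);

        // Hand the saved file to the queue and respond right away,
        // so the client is not blocked while processing runs.
        workQueue.submit(() -> process(saved));
        resp.getWriter().println("Upload accepted, processing in background");
    }

    private void process(File file) {
        // ... long-running work on the uploaded file ...
    }

    @Override
    public void destroy() {
        workQueue.shutdown();
    }
}
```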
I have a few questions related to web technologies. From reading the Apache and Netty documentation I could not figure out a few things about downloading a large file with an HTTP multipart/POST request.
Is it possible to send an HTTP request asking to download a file in smaller parts (chunks)?
How can I download a large file in multiple parts?
Please correct me if I have misunderstood the term 'multipart' itself. I know a lot of people have faced this problem, where the application (client) downloads a file in smaller portions, so that when a network outage happens it does not need to download the whole file from the beginning again, especially when the file is not a media file.
Thanks.
Multipart refers to encoding multiple documents in one body, see this for the definition. For HTTP, a multipart upload allows the client to send multiple documents with one POST request, for example uploading an image and form fields in a single request.
Multipart does not refer to downloading a document in multiple chunks.
You can use HTTP range requests to resume the download if a network outage occurs.
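A minimal sketch of resuming a download with a Range header (the URL and file name are placeholders):

```java
import java.io.InputStream;
import java.io.RandomAccessFile;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ResumableDownload {

    public static void main(String[] args) throws Exception {
        Path target = Paths.get("large-file.bin");
        long alreadyDownloaded = Files.exists(target) ? Files.size(target) : 0;

        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://example.com/large-file.bin").openConnection();
        // Ask the server for the remaining bytes only.
        conn.setRequestProperty("Range", "bytes=" + alreadyDownloaded + "-");

        // 206 Partial Content means the server honoured the range;
        // 200 means it ignored it and is sending the whole file again.
        boolean resuming = conn.getResponseCode() == HttpURLConnection.HTTP_PARTIAL;

        try (InputStream in = conn.getInputStream();
             RandomAccessFile out = new RandomAccessFile(target.toFile(), "rw")) {
            out.seek(resuming ? alreadyDownloaded : 0);
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}
```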