Saving Large file using GWT + GAE + S3 path - java

I'm building an app that will store large video files on the server so that users can view them later. For my app I'm using GWT + GAE/J, and to store the files I would like to use an S3 account. But as we know, you can upload at most 10 MB to GAE. I have asked this kind of question before, and the answer I accepted only works if the file is under 10 MB. The solution KevMo suggested uploads the whole file to the server, but what if my file is 20 MB or 100 MB? Is it possible to divide the file into 10 MB pieces, send them to GAE, and then assemble those pieces on the S3 server?
Here is a picture of what I'm trying to accomplish: http://www.freeimagehosting.net/uploads/b49fdee149.jpg
Thanks

Why not have your GWT client upload the video directly to S3? You can have your App Engine code create the authentication token or password or whatever S3 calls it, and then your GWT client would send the file straight there. If need be, it could pass back whatever metadata your App Engine code needs (file size, name, whatever).
See this question for more info on giving users permission to upload to S3:
PS - obviously this doesn't work quite as well if you are doing some kind of processing on the video in your App Engine code before uploading it to S3.
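
For what it's worth, here is a minimal sketch of the signing piece, assuming the AWS SDK for Java is usable in your environment; the bucket name, key and expiry below are placeholders. The App Engine side creates a short-lived PUT URL and the GWT client uploads the file straight to it, so the 10 MB request limit never applies:

    import java.net.URL;
    import java.util.Date;

    import com.amazonaws.HttpMethod;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;

    public class PresignedUploadUrlFactory {

        // Placeholder bucket name; substitute your own.
        private static final String BUCKET = "my-video-bucket";

        /** Returns a short-lived URL the GWT client can PUT the video to directly. */
        public static URL createUploadUrl(String objectKey) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // URL expires in 15 minutes; the client never sees your AWS credentials.
            Date expiration = new Date(System.currentTimeMillis() + 15 * 60 * 1000L);

            GeneratePresignedUrlRequest request =
                    new GeneratePresignedUrlRequest(BUCKET, objectKey)
                            .withMethod(HttpMethod.PUT)
                            .withExpiration(expiration);

            return s3.generatePresignedUrl(request);
        }
    }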

I'd highly recommend developing your own transfer control instead. It will likely take the same amount of time.

Related

Need help to store many documents from a Java web app

I have a little issue. I have to generate thousands of files from a web application. I will then put them into a zip, send it by mail, and delete them. I was thinking about storing them on the JBoss server, but I'm not a big fan of this solution.
Any idea of a cleaner solution?
If you send the files as an attachment, the files are stored on the mail server and there's no need to store them anywhere else. On the other hand, you might run into size limitations. If your files are too big to be attached directly to an email, you might consider a storage service like AWS S3.
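
A rough sketch of that approach with JavaMail (the class and method names here are made up, and it assumes the combined size of the documents fits in memory): build the zip in a ByteArrayOutputStream and attach it, so nothing ever touches the JBoss server's disk:

    import java.io.ByteArrayOutputStream;
    import java.util.Map;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    import javax.activation.DataHandler;
    import javax.mail.Message;
    import javax.mail.Session;
    import javax.mail.internet.InternetAddress;
    import javax.mail.internet.MimeBodyPart;
    import javax.mail.internet.MimeMessage;
    import javax.mail.internet.MimeMultipart;
    import javax.mail.util.ByteArrayDataSource;

    public class ZipAndMail {

        /** Zips the generated documents in memory and attaches the archive to a mail message. */
        public static MimeMessage buildMessage(Session session, String to,
                                               Map<String, byte[]> documents) throws Exception {
            ByteArrayOutputStream zipBytes = new ByteArrayOutputStream();
            try (ZipOutputStream zip = new ZipOutputStream(zipBytes)) {
                for (Map.Entry<String, byte[]> doc : documents.entrySet()) {
                    zip.putNextEntry(new ZipEntry(doc.getKey()));
                    zip.write(doc.getValue());
                    zip.closeEntry();
                }
            }

            MimeBodyPart text = new MimeBodyPart();
            text.setText("Your documents are attached.");

            MimeBodyPart attachment = new MimeBodyPart();
            attachment.setDataHandler(new DataHandler(
                    new ByteArrayDataSource(zipBytes.toByteArray(), "application/zip")));
            attachment.setFileName("documents.zip");

            MimeMultipart body = new MimeMultipart();
            body.addBodyPart(text);
            body.addBodyPart(attachment);

            MimeMessage message = new MimeMessage(session);
            message.addRecipient(Message.RecipientType.TO, new InternetAddress(to));
            message.setSubject("Generated documents");
            message.setContent(body);
            return message;
        }
    }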

Android Google Drive SDK: Saving to App Folder

For my current project, I would like to allow a user to create an SQLite database file and have them enter some content. Then the user has the option to sign in to their Google Drive account and upload this file. Afterwards, when the user makes further edits, the new database file is uploaded to replace the old file. Finally, if the user has multiple devices, the database should be downloaded from Google Drive and replace the existing file stored on the device.
Currently, I have successfully set up Google Drive SDK authentication and I can sign in to the app with my account.
My main question is: how do I upload an SQLite database file to the APP FOLDER when the user presses a sync button? (This method should be called when the user needs to sync.)
Your question is a bit broad, but I'll try to send you in the right direction.
First you have to decide whether to use the REST API or GDAA. Both will accomplish the same thing (actually GDAA's functionality is a bit narrower at the moment, but it will do for your situation).
The big difference is that GDAA will handle online/offline states for you, whereas with the REST API you have to implement some kind of non-UI-thread (sync service) synchronization yourself. Also, there are latency issues you must be aware of when using GDAA.
Next, the process of uploading an SQLite database is the same as for any other binary data stream.
Grab the 'xxx.db' file, make an output stream (or byte[] buffer), create a GooDrive file with title + mimetype metadata, push the stream into its content and send it on its merry way. The only difference between a standard folder and the app folder is the parent of the file.
You get back an ID you can subsequently use to download the file to the device. Or you can search by metadata (title, in your case) to get this ID. Again, the content comes back as an input stream and you dump it into an 'xxx.db' file on your device.
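
A rough sketch of the upload side using GDAA (names like mGoogleApiClient and the 'xxx.db' path are assumptions; the snippet is meant to live inside an Activity with a GoogleApiClient connected under Drive.SCOPE_APPFOLDER):

    // Sketch only. Runs inside an Activity (for getDatabasePath());
    // mGoogleApiClient is a connected GoogleApiClient built with Drive.SCOPE_APPFOLDER.
    // Classes come from com.google.android.gms.drive and com.google.android.gms.common.api.
    private void uploadDatabaseToAppFolder() {
        Drive.DriveApi.newDriveContents(mGoogleApiClient).setResultCallback(
                new ResultCallback<DriveApi.DriveContentsResult>() {
                    @Override
                    public void onResult(DriveApi.DriveContentsResult result) {
                        if (!result.getStatus().isSuccess()) {
                            return; // could not open new contents
                        }
                        DriveContents contents = result.getDriveContents();
                        try (InputStream in = new FileInputStream(getDatabasePath("xxx.db"));
                             OutputStream out = contents.getOutputStream()) {
                            byte[] buffer = new byte[8192];
                            int n;
                            while ((n = in.read(buffer)) != -1) {
                                out.write(buffer, 0, n);
                            }
                        } catch (IOException e) {
                            return;
                        }
                        MetadataChangeSet changeSet = new MetadataChangeSet.Builder()
                                .setTitle("xxx.db")
                                .setMimeType("application/x-sqlite3")
                                .build();
                        // Parenting the file to the app folder (instead of the root folder)
                        // is the only difference from a regular upload.
                        Drive.DriveApi.getAppFolder(mGoogleApiClient)
                                .createFile(mGoogleApiClient, changeSet, contents);
                    }
                });
    }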
The second portion of your question deals with multiple devices. None of the APIs will notify you about a change in GooDrive, so you must implement one of two strategies:
1/ Polling (ouch), preferably in a sync service with the sync intervals the system gives you.
2/ A GCM message broadcast to the devices / users who are interested (not trivial, but efficient ... and sexy).
Another pitfall you must be aware of when using multiple devices with GDAA is described in SO 29030110 and SO 22874657.
In case you decide to play with the two APIs, I maintain basic CRUD implementation demos for both the REST and GDAA approaches. The GDAADemo project also has an option to work with the app folder.
Good Luck

Where to upload files in Tomcat?

I have a web application developed using Servlet and JSP, and we will be hosting it on Daily Razor private Tomcat hosting soon.
But now we have a problem. We have a form where users can upload files to the server, files like images and PDFs. We are not sure where we should save these files on the server. I have seen a lot of Stack Overflow answers telling the user to use a path like "C:/Upload/..", but this is a real product, so that is not going to work.
I contacted the hosting company about this matter and all they said is that they will give me the FTP login details once I purchase the system; no word about where to upload the files.
I also thought about uploading to Amazon S3, but we have to create folders "dynamically" for each user and subfolders for their uploaded content, therefore I am not so sure about S3. Apart from that, I believe S3 will drain my wallet.
Any advice about the upload location in Tomcat, or an alternative, will be really appreciated.
But now we have a problem. We have a form where users can upload files to the server, files like images and PDFs. We are not sure where we should save these files on the server.
If you're talking about long-term storage then there's no correct answer here. You'd just need to find a place on the server where your Tomcat user has permissions to write and configure the application to put them there.
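
As a rough sketch (the directory and the system property name below are placeholders), a Servlet 3.1 handler that writes uploads to a configurable directory outside the webapp could look something like this:

    import java.io.File;
    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    import javax.servlet.ServletException;
    import javax.servlet.annotation.MultipartConfig;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.Part;

    @WebServlet("/upload")
    @MultipartConfig(maxFileSize = 10 * 1024 * 1024) // adjust the size cap as needed
    public class UploadServlet extends HttpServlet {

        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // Read the target directory from configuration rather than hard-coding a path.
            Path uploadDir = Paths.get(System.getProperty("app.upload.dir", "/home/youruser/uploads"));
            Files.createDirectories(uploadDir);

            Part filePart = req.getPart("file"); // name of the <input type="file"> field
            // Strip any client-supplied path components before using the file name.
            String fileName = new File(filePart.getSubmittedFileName()).getName();

            try (InputStream in = filePart.getInputStream()) {
                Files.copy(in, uploadDir.resolve(fileName), StandardCopyOption.REPLACE_EXISTING);
            }
            resp.getWriter().println("Stored " + fileName);
        }
    }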
I contacted the hosting company about this matter and all they said is that they will give me the FTP login details once I purchase the system; no word about where to upload the files.
Once you have FTP details, you should be able to connect to the server and look at the file system. Presumably this would allow you to see the path where your application will be deployed. From there, it's just a matter of picking a location where you'd want to save the files.
I'm not familiar with that hosting provider, so I can't really say more.
I also thought about uploading to Amazon S3, but we have to create folders "dynamically" for each user and subfolders for their uploaded content, therefore I am not so sure about S3. Apart from that, I believe S3 will drain my wallet.
This can also be a good option, depending on your needs. Price is definitely something to evaluate. Demand and bandwidth on your hosting environment are others. If you have high demand on these resources, putting the files on AWS will take the load off your server and not eat up the bandwidth allocation from your hosting provider.
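
On the S3 concern specifically: S3 has no real folders, so "creating folders dynamically" per user is just a matter of prefixing the object key. A minimal sketch, assuming the AWS SDK for Java (bucket name and key layout are made up):

    import java.io.File;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    public class UserUploadStore {

        private static final String BUCKET = "my-app-uploads"; // placeholder

        private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        /** Stores a file under a per-user "folder", e.g. uploads/42/photo.jpg. */
        public void store(long userId, File file) {
            String key = "uploads/" + userId + "/" + file.getName();
            s3.putObject(BUCKET, key, file); // the key prefix acts as the folder
        }
    }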
You could also store your uploads in a database. I'm not personally a fan of this, but some people do it and I hear it works fine for them.

How to upload a file by giving a path in the URL

Is it possible to upload a file in the Play! framework only by giving a path in the URL?
For example I would like to call:
www.mywebsite.com/upload_PATH
This is quite important for me, because I would like to upload and process a lot of data. Manually selecting 1000 files is too time-consuming, and I want to write a program that will do it for me :-) I'm using Play with Java.
If upload_PATH is a local file path on your system, it is not a good way to go.
You should write a Play action where you can upload a file, as it is done in this example.
Then, you should write an HTTP client (started by a main Java method) which goes through your files and uploads them by calling the Play action. You can use the Apache HttpClient library for writing the client part.
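
A rough sketch of that client part with Apache HttpClient (the endpoint URL and the "file" field name are assumptions; they must match whatever your Play action expects):

    import java.io.File;

    import org.apache.http.HttpEntity;
    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpPost;
    import org.apache.http.entity.mime.MultipartEntityBuilder;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;

    public class BulkUploader {

        public static void main(String[] args) throws Exception {
            String uploadUrl = "http://www.mywebsite.com/upload"; // hypothetical Play action
            File[] files = new File(args[0]).listFiles();         // directory to push
            if (files == null) {
                return;
            }

            try (CloseableHttpClient client = HttpClients.createDefault()) {
                for (File f : files) {
                    if (!f.isFile()) {
                        continue;
                    }
                    HttpPost post = new HttpPost(uploadUrl);
                    HttpEntity entity = MultipartEntityBuilder.create()
                            .addBinaryBody("file", f) // field name must match the Play action
                            .build();
                    post.setEntity(entity);

                    try (CloseableHttpResponse response = client.execute(post)) {
                        System.out.println(f.getName() + " -> " + response.getStatusLine());
                    }
                }
            }
        }
    }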
As nico_ekito wrote, www.mywebsite.com/upload_from_local_path won't work.
In the case of a huge number of files, you can create a temporary folder on the remote server and upload your files with FTP. Then in your app you'll only need an action for post-upload processing, e.g. it can check whether a file is valid, move it to the calculated destination, and register it in the database if required.
Another possibility is using some Flash/AJAX multi-uploader, for example SWFUpload (I don't know it; it's just the first hit from a search engine). This approach is better if you are going to give the upload option to people you don't want to give any FTP access to.
Finally, you can mix the solutions: use an uploader instead of FTP and post-process the new items remotely afterwards.

Java or PHP Applet to download groups of files from Amazon S3

I have a website that provides a photo service for clients. I want to use Amazon S3 as the storage space for all the photos, but am having trouble interacting with the S3 buckets. What I need to do is give my customers access to all of their photos sitting in their S3 bucket. I'd like to give them a visual display of all the images and then allow them to select a group, or all, of the photos for download. I'm assuming a Java applet is needed to handle this interaction. Does anyone know of a Java-based downloader that will interface with S3, or could anyone possibly build one?
We've also thought about zipping all, or a group, of the files that reside on S3, but can't figure out how to zip files while they're on S3.
Any help is much appreciated!
I suppose that by "Java-based downloader" you mean web services. Honestly, I don't know much about the Amazon S3 web services, but I do know they exist. What I'd advise is to first get to know the Amazon web services and find out whether there is a web method to download the files (my bet is that it has to be a bit-by-bit download).
Concerning a Java applet, you would only need one if you want to upload files, because of the existing security restrictions.
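
For what it's worth, a plain server-side approach (no applet) can list each customer's photos and hand the browser time-limited download links. A rough sketch with the AWS SDK for Java, where the bucket name and per-customer prefix layout are assumptions:

    import java.net.URL;
    import java.util.ArrayList;
    import java.util.Date;
    import java.util.List;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.ListObjectsV2Request;
    import com.amazonaws.services.s3.model.ListObjectsV2Result;
    import com.amazonaws.services.s3.model.S3ObjectSummary;

    public class CustomerPhotoLinks {

        private static final String BUCKET = "my-photo-bucket"; // placeholder

        private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        /** Lists everything under the customer's prefix and returns one-hour download links. */
        public List<URL> downloadLinks(String customerId) {
            ListObjectsV2Result listing = s3.listObjectsV2(
                    new ListObjectsV2Request()
                            .withBucketName(BUCKET)
                            .withPrefix(customerId + "/"));

            Date expiration = new Date(System.currentTimeMillis() + 60 * 60 * 1000L);
            List<URL> urls = new ArrayList<>();
            for (S3ObjectSummary object : listing.getObjectSummaries()) {
                urls.add(s3.generatePresignedUrl(BUCKET, object.getKey(), expiration));
            }
            return urls;
        }
    }

Zipping on S3 itself isn't something S3 does for you; the archive would have to be built by code running on a server (downloading the objects, zipping, and re-uploading or streaming the result).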
