I know it sounds strange. I have a PHP file on my server which is responsible for handling uploads, and a Java program on my desktop sends files to this PHP file. I want to be able to cancel the upload to the server whenever I want, so I am looking for some kind of stream or mechanism to stop an ongoing upload.
I tried to use PIDs to stop the PHP script, but the PHP file doesn't start running until the client has finished uploading.
I want to run the savePid() function before the upload starts, so I can get the PID and kill the script whenever I want.
<?php
include('func.php');
savePid(); // run this before the upload starts
$in  = fopen('php://input', 'rb');   // raw request body as a stream
$out = fopen('pipeupload.txt', 'wb');
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}
fclose($in);
fclose($out);
?>
I know this won't work; I'm just looking for a way to stop an ongoing upload process.
It might be possible on the server side.
Example:
You have 20 images, each more than 2 MB in size. Your Java program on the desktop sends an array of URLs of the images to be uploaded by the PHP file, as you described. In the PHP file you have to make a small change: before every image upload, check a flag in the database (create a table with a new field image_cancel, or add this field to your existing table, with a default value of false). If image_cancel is true, stop execution and exit. Then create a new script, cancel_image.php, whose code simply sets the image_cancel field to true.
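From the Java program on the desktop, cancelling then just means calling that script. A minimal sketch (the host is a placeholder; cancel_image.php is the script named above and is assumed to set image_cancel to true):

import java.net.HttpURLConnection;
import java.net.URL;

public class CancelUpload {
    // Asks the cancel_image.php script to set the image_cancel flag to true,
    // so the PHP upload loop stops before the next image.
    public static void cancel() throws Exception {
        URL url = new URL("http://example.com/cancel_image.php"); // placeholder host
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.getResponseCode(); // any 2xx response means the flag was set
        conn.disconnect();
    }
}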
This is not an easy thing to do.
There are a couple of options I can think of, but neither is easy. The easier one first.
First Method:
You should have 2 PHP scripts: one for handling the upload (which you have now) and another that tells the client to abort. The client uploads to the upload script as it does now, and also regularly checks whether an abort has been posted at the notify script. Once the client sees the abort, it stops the rest of the upload, so the file never reaches the PHP script handling the upload. This method requires the client to honor the abort; a client that ignores it can still upload the full file.
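A minimal sketch of such a client in Java, assuming a hypothetical notify.php that returns the word "abort" when the upload should stop (URLs are placeholders, and in practice you would poll every few seconds rather than on every chunk):

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;

public class AbortableUpload {
    // Returns true if the (assumed) notify script currently says "abort".
    static boolean abortRequested() throws Exception {
        URL url = new URL("http://example.com/notify.php"); // assumed notify script
        try (Scanner s = new Scanner(url.openStream())) {
            return s.hasNext() && "abort".equals(s.next());
        }
    }

    public static void upload(String file) throws Exception {
        URL uploadUrl = new URL("http://example.com/upload.php"); // assumed upload handler
        HttpURLConnection conn = (HttpURLConnection) uploadUrl.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setChunkedStreamingMode(8192);   // stream chunks instead of buffering the whole file
        try (InputStream in = new FileInputStream(file)) {
            OutputStream out = conn.getOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                if (abortRequested()) {       // check the notify script between chunks
                    conn.disconnect();        // drop the connection so the upload never completes
                    return;
                }
                out.write(buf, 0, n);
            }
            out.close();                      // finish the request body
        }
        conn.getResponseCode();               // let the server finish handling the upload
    }
}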
Second Method:
You should write your own file handling module or application instead of relying on the normal web server. You could also write an Apache module that hooks into file upload (if your web server is Apache). But this method is more complex and will not work on web servers you do not host yourself: it requires patching the web server or installing an application that works at the system level, and I don't think any administrator will allow this on their server.
Related
I'm new to Java. I'm trying to load some CSVs into Postgres using Java.
My requirement is as follows:
I'm transferring a zip file from the local server via SFTP to the server where I would like to run loading.jar.
Usually, once all the CSVs are zipped and transferred to the server, we run loading.jar to unzip them and load the data into Postgres. But what I'm looking at is this: what if we run loading.jar on the remote server first and sftp.jar on the local server later, and loading.jar must frequently check for the status 'RECEIVED' in a driving table, which is updated as soon as the file has been transferred successfully by sftp.jar?
How can we achieve this? How do I build a loading.jar which frequently checks for the status in a table and picks the file up for loading? Sample code would help me out.
Thanks in advance
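A minimal sketch of that polling loop, assuming a hypothetical driving table file_transfer(file_name, status), the Postgres JDBC driver on the classpath, and placeholder connection details (none of these names come from your setup, so adjust them):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class LoadingJob {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; replace with your own.
        String url = "jdbc:postgresql://localhost:5432/mydb";
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            while (true) {
                try (PreparedStatement ps = con.prepareStatement(
                        "SELECT file_name FROM file_transfer WHERE status = 'RECEIVED'");
                     ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        String zipFile = rs.getString("file_name");
                        // unzip zipFile and load the CSVs into Postgres here,
                        // then update the row (e.g. status = 'LOADED') so it
                        // is not picked up again on the next pass
                    }
                }
                Thread.sleep(60_000); // check the driving table again in a minute
            }
        }
    }
}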
I have a scheduler job which every X minutes checks whether there is a file on an SFTP server, downloads it, parses it, and uploads a status file saying that the file was downloaded successfully. If the file is not downloaded and parsed successfully, we don't upload the status file.
The status file is used by a 3rd party application: if the status file exists on the SFTP server, it starts doing some other job; if there is no status file, it does not start that job.
The problem starts with multiple server instances running the same scheduler job. I can't figure out the best way to ensure that all servers have successfully downloaded the file before telling the 3rd party app, via the status file, that it can start its job.
The only way I can communicate with this 3rd party app is through the status file.
Some solutions:
Before, we were running the scheduler job on only one server and had a shared disk between the servers for these files. This is not an option anymore.
I was thinking of uploading a status file in the wrong format (so that the 3rd party app doesn't start its job) containing a server id, as confirmation that this server has downloaded the file. All other servers would also add their ids to the same file. The first server to find that every server id appears at least 3 times (3x server1, 3x server2, 3x server3) would then rewrite the status file in the correct format so the 3rd party app can start its job. In theory problems could happen if the file mentions server1 and server2 three times but server3 not at all (all servers have the same cron expression, e.g. every 2 minutes).
Use some configuration that defines the number of servers which need to download the file, and based on that config I could check whether all of them have written their ids into the fake status file. The problem is that if I add a new server I need to update the config file.
I guess this is a common problem; is there some pattern or algorithm for it?
If I were you, I would try to create an "interface file" between the 3rd party app and the SFTP servers. The "interface file" would be updated periodically (every X minutes). Only when all of the status files are ready would the "interface file" be marked as "ready", and that "interface file" is what the 3rd party app looks at.
Hope it helps with your problem
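One way to sketch that idea in Java, assuming JSch for SFTP, a shared /inbound directory on the SFTP server, and hypothetical per-server marker files named <serverId>.done (all of these names are assumptions, not from the question); the real status file is only uploaded once every expected server has left its marker:

import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
import java.io.ByteArrayInputStream;
import java.util.HashSet;
import java.util.Set;

public class StatusAggregator {
    // Each server uploads its own <serverId>.done marker after a successful
    // download and parse. This periodic check publishes the real status file
    // only when every expected marker is present, so the 3rd party app never
    // starts early. How you obtain expectedServers (config, database, service
    // registry) is left open.
    public static void checkAndPublish(Set<String> expectedServers) throws Exception {
        JSch jsch = new JSch();
        Session session = jsch.getSession("user", "sftp.example.com", 22); // placeholder credentials
        session.setPassword("password");
        session.setConfig("StrictHostKeyChecking", "no");
        session.connect();
        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        try {
            Set<String> seen = new HashSet<>();
            for (Object entry : sftp.ls("/inbound")) {
                String name = ((ChannelSftp.LsEntry) entry).getFilename();
                if (name.endsWith(".done")) {
                    seen.add(name.substring(0, name.length() - ".done".length()));
                }
            }
            if (seen.containsAll(expectedServers)) {
                // every server reported success: publish the real status file
                sftp.put(new ByteArrayInputStream("ready".getBytes()), "/inbound/status.ok");
            }
        } finally {
            sftp.disconnect();
            session.disconnect();
        }
    }
}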
I have a file that changes every day for my website, and I want to make a program that takes it from the FTP server and downloads it to either my phone or my laptop.
How would I go about making this, and what code do I use?
A few options not already mentioned:
If you have access to the ftp server, a cron job could email the file to you each day.
You could create a web page populated with the data, using JSP, PHP, a servlet, etc. to read the file from disk.
If you're familiar with rsync, you could simply rsync the file to your laptop.
Sync the file using Dropbox.
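If you would rather script the pull yourself in Java, here is a minimal sketch using Apache Commons Net; the host, credentials, and paths are placeholders. Run it once a day from cron or Task Scheduler:

import java.io.FileOutputStream;
import java.io.OutputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class DailyDownload {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");              // placeholder host
        ftp.login("user", "password");               // placeholder credentials
        ftp.enterLocalPassiveMode();
        ftp.setFileType(FTP.BINARY_FILE_TYPE);       // avoid corrupting binary files
        try (OutputStream out = new FileOutputStream("daily-file.dat")) {
            ftp.retrieveFile("/path/on/server/daily-file.dat", out); // placeholder paths
        }
        ftp.logout();
        ftp.disconnect();
    }
}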
I have a Spring web application which triggers a SAS job on a remote Linux server; the SAS job generates a result file on the remote server when it finishes. I need to display the result in my Spring application, so I want to create a listener for directory changes on the server.
I have been looking at the java.nio library, but it looks like it only works for local directories. Any ideas other than repeatedly pinging the server through SSH? Thanks!
You might use FTP from org.apache.commons.net.ftp.
Using FTP (or any other Java FTP library), you only need to check the contents of the remote directory.
If the directory is supposed to always be empty, then when the first file appears your process can be triggered.
If the directory is not always empty, you might need to implement something to track which files are new and which are not.
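A minimal polling sketch with Commons Net, assuming the SAS job drops its result into a known remote directory reachable over FTP (host, credentials, and directory are placeholders); call it on a schedule, e.g. with Spring's @Scheduled, and compare the names against those you have already seen:

import java.util.ArrayList;
import java.util.List;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

public class ResultWatcher {
    // Lists the files currently in the remote result directory.
    public static List<String> listResults() throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("remote.example.com");           // placeholder host
        ftp.login("user", "password");               // placeholder credentials
        ftp.enterLocalPassiveMode();
        List<String> names = new ArrayList<>();
        for (FTPFile f : ftp.listFiles("/sas/results")) { // placeholder directory
            if (f.isFile()) {
                names.add(f.getName());
            }
        }
        ftp.logout();
        ftp.disconnect();
        return names;
    }
}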
Please let me know if you need further assistance.
I have created an applet which creates a file when it runs. But when I run my applet via the server, it fails.
Is there any possible way to create a file on the server with an applet?
EDIT: I am creating a sound-recording applet which works fine when I run it in the browser locally. It actually creates a file of the recorded sound, but when I run the same applet from the server, it does not create the file. Is it because the server does not allow you to do so?
Is there any possible solution so that the file can be created?
File objects always point to a location (which may not exist) on the client machine.
To store something on a server, you need some server-side functionality to accept the bytes and create a (server-side) file. That might be done with PHP, servlets/JSP, ASP, etc. Once the server side is set up to accept the bytes, the applet can connect to it and push the sound recording through.
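A minimal sketch of the applet-side push, assuming the server-side handler lives at a placeholder URL (saveRecording.php here is hypothetical) and simply writes the posted bytes to a file on the server:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RecordingUploader {
    // POSTs the already-recorded bytes to the server-side handler that creates the file.
    public static int upload(byte[] recording) throws Exception {
        URL url = new URL("http://example.com/saveRecording.php"); // placeholder endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(recording);
        }
        return conn.getResponseCode(); // 200 means the handler accepted and stored the bytes
    }
}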
Java applets run on the client machine. Once you invoke the page containing an applet, the applet is downloaded to the client's machine and runs there. Hence it does not get direct access to the server's file system.