I am using the com.enterprisedt.net.ftp jar to FTP files to a remote location. My code looks like this:
try {
    try {
        ftp.mkdir(actDirToSearch);
    } catch (Exception e) {
        Log.addInLog(Log.ERR, e.getMessage());
    }
    ftp.chdir(actDirToSearch);
    try {
        ftp.put(tarStream, fileName);
    } catch (Exception ex) {
        throw new FTPException(ex.getMessage());
    }
} catch (FTPException e) {
    throw e;
} catch (IOException e) {
    throw e;
} finally {
    try {
        if (ftp != null) {
            ftp.quit();
        }
    } catch (Exception e) {
        // ignore errors while closing the connection
    }
}
I am using this code to upload a tar.gz file to two different remote machines, running RHEL 5.4 and 6. Sometimes the upload succeeds, but sometimes the tar.gz file is corrupt after uploading, with a smaller size on the remote machine. While debugging I noticed that if I pause at the if (ftp != null) line and then execute ftp.quit() a little later, the transfer always succeeds. I have looked through the FTP library code and found no separate thread doing the transfer; everything executes serially. My question is why this tar.gz file is getting corrupted, and why it succeeds while debugging.
vsftpd is running on both machines. Also, FTPing the file manually from a terminal succeeds. The Java version is 1.6.
Check your FTP settings -- most FTP implementations have a setting for the type of file: text, binary, or letting the implementation determine for itself what type it is.
In response to the comment:
I am not familiar with the FTP protocol in great detail, but I know that FTP clients normally have a setting for "text" or "binary" files, and an option for the client to determine which kind of file is being transferred by looking at the first bytes in it. I expect this is something that is communicated by the client to the server, so that the server can do things like translate end-of-line to a set of bytes that is correct for the server's particular OS.
I gather you are not setting that, since you don't seem to know about it, and it could cause these symptoms. Read your library's documentation and look for this. You refer to an 'ftp' object; look at the documentation (or the source) for that library and figure out whether there is a way to set an option for text or binary.
You can also use a hex editor to look at the bytes in the source and result files, to see whether there's a pattern in the corruption. Do the bytes look all right until they reach a place where the source had an end-of-line character, even though it's a binary file? Is the server stripping off the 8th bit? (FTP goes back to the days of commonly used 7-bit ASCII, after all.)
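To see why text mode can mangle a binary file like a tar.gz, consider what an ASCII-mode transfer may do: rewrite every bare LF (0x0A) as CRLF. The sketch below simulates that translation locally; the asciiTranslate helper is my own illustration of the mechanism, not the library's actual code. With edtFTPj, forcing binary mode before the upload (something like ftp.setType(FTPTransferType.BINARY), if I read that API correctly) should avoid the problem.

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPOutputStream;

public class AsciiModeDemo {
    // Hypothetical helper: mimics an ASCII-mode transfer that rewrites
    // every bare LF (0x0A) as CRLF (0x0D 0x0A).
    static byte[] asciiTranslate(byte[] data) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte b : data) {
            if (b == '\n') {
                out.write('\r');
            }
            out.write(b);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        // Build a small gzip payload; its compressed bytes are opaque
        // binary data and may well contain 0x0A somewhere.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        GZIPOutputStream gz = new GZIPOutputStream(buf);
        for (int i = 0; i < 1000; i++) {
            gz.write(("line " + i + "\n").getBytes("US-ASCII"));
        }
        gz.close();
        byte[] original = buf.toByteArray();
        byte[] corrupted = asciiTranslate(original);
        System.out.println("original size:       " + original.length);
        System.out.println("after 'ascii' mode:  " + corrupted.length);
        // Any size change means the archive's checksum no longer matches
        // and gunzip will reject the file.
    }
}
```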
I am trying to download an EXE file from my website. Just for example, I went to rarlab's site and downloaded myself a fresh copy of their 64-bit WinRAR release (we all know what that is).
Anyway, I uploaded the 64-bit "setup" EXE file to the root folder of my site, where it is easily reachable for download (for testing purposes). Going to my site through any browser, I can successfully download the "setup" file from rarlabs AND execute it like any other EXE file.
Now, this is the confusing part I cannot for the love of God figure out. Using this simple code, which I ripped off some other Stack Overflow answer, I can download any EXE file from a given URL.
The code...
public static void downloadEXE(URL url, String file) throws IOException {
    InputStream in = url.openStream();
    try {
        FileOutputStream fos = new FileOutputStream(new File(file));
        try {
            byte[] buffer = new byte[1024]; // buffer for a portion of data from the connection
            int length;
            while ((length = in.read(buffer)) > -1) {
                fos.write(buffer, 0, length);
            }
        } finally {
            fos.close();
        }
    } finally {
        in.close();
    }
}
Where the URL and FILE arguments are...
URL url = new URL("http://www.website.com/winrar-x64-540.exe");
String file = "c:\\Users\\..\\Documents\\winrar-x64-540.exe";
Yes, this downloads the file from my site into my documents folder without any errors, but when I run it I get this error...
LINK TO ERROR IMAGE
Not stopping there, I decided to try using this exact same code to download the same 64-bit "setup" EXE file, EXCEPT this time from the official site, rarlabs. You can take my word for it that I used the correct URL, because this time I not only downloaded the EXE file using this code, but was also able to run it successfully!
That left me suspecting there is something wrong with my site, not the EXE file. I should also mention a very IMPORTANT discovery that will most likely help: I compared the properties of the WORKING EXE file with the BROKEN, diseased one. The size of the working one was 2.07 MB, whilst the broken one was a whopping 375 bytes! The broken EXE didn't match the properties of the working EXE AT ALL.
Comparing the two files, and given that I can download and launch my file successfully through any browser, I'm left thinking that my site is not at fault and the code is.
Please ask me any questions you need to figure out the problem. Let me know where I need to be specific. Thanks all. :)
You are probably not downloading anything useful; maybe the server just rejects your request and you store that error response as a file.
What is the size of the file downloaded by the Java code? Is it the same as the file you expect?
Try opening it in a text editor and see whether it is actually an EXE (random binary garbage, not HTML; you'll know the difference).
If the server rejects your request and returns HTML, you probably need to set something in the request (cookies, which you may need to log in to obtain; the User-Agent property; other things). Look at what the communication looks like when done from a browser: run Firebug or something similar, activate the network tab, and see what the requests look like.
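That text-editor check can also be done programmatically (a sketch; the helper names are mine). A Windows executable starts with the two bytes "MZ", while an HTML error page typically starts with something like "<!DOCTYPE" or "<html":

```java
import java.io.FileInputStream;
import java.io.IOException;

public class FileSniffer {
    // True if the data starts with the "MZ" header every Windows EXE has.
    static boolean looksLikeExe(byte[] head) {
        return head.length >= 2 && head[0] == 'M' && head[1] == 'Z';
    }

    // True if the data looks like an HTML page (e.g. a server error page)
    // rather than a binary download.
    static boolean looksLikeHtml(byte[] head) {
        String s = new String(head).trim().toLowerCase();
        return s.startsWith("<!doctype") || s.startsWith("<html");
    }

    // Read the first n bytes of a downloaded file for inspection.
    static byte[] head(String file, int n) throws IOException {
        FileInputStream in = new FileInputStream(file);
        try {
            byte[] buf = new byte[n];
            int read = in.read(buf);
            byte[] out = new byte[Math.max(read, 0)];
            System.arraycopy(buf, 0, out, 0, out.length);
            return out;
        } finally {
            in.close();
        }
    }
}
```

A 375-byte "EXE" that fails looksLikeExe but passes looksLikeHtml would confirm that an error page was saved instead of the real file.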
I know there is no standard Java mechanism for testing whether a file is open.
Can anyone please tell me whether there is a library, or some other way, to achieve this?
This is my scenario: say I've opened a docx file manually. When I try to upload that file, it doesn't get uploaded because the file is open. So if someone tries to upload a file that is open in another application, the app needs to display a message saying "The file is open".
Using Java
If your client is a Java client (which you don't say in your question), you could try the following code from this answer:
Using the Apache Commons IO library...
boolean isFileUnlocked = false;
try {
    org.apache.commons.io.FileUtils.touch(yourFile);
    isFileUnlocked = true;
} catch (IOException e) {
    isFileUnlocked = false;
}

if (isFileUnlocked) {
    // Do stuff you need to do with a file that is NOT locked.
} else {
    // Do stuff you need to do with a file that IS locked.
}
BTW: This is the very first answer when you search for "java file is open" on Google.
Using JavaScript
However, I assume that you don't have a Java client, only a Java server, and that you want to upload a file via a webpage (so you have an HTML + JS client). I assume this because you don't tell us.
JavaScript doesn't allow you to write to the disk, since that would be a security issue (I think Chrome supports some kind of JavaScript File API that allows writing to disk, but none of the other browsers do). So you cannot test this via JavaScript.
In general
The general way to test whether a file is open is to test whether there is a lock on it. That does not guarantee the file hasn't been opened without a lock, but it should be sufficient.
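If plain Java NIO is an option, that lock test can be sketched with FileChannel.tryLock (a sketch, with the caveat above: on most Unix systems locks are advisory, so a file merely open in an editor may not register as locked; Windows is stricter about open files):

```java
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

public class LockProbe {
    // True if we could momentarily acquire an exclusive lock on the file,
    // i.e. nothing else holds a lock that the OS enforces.
    static boolean isUnlocked(String path) {
        try {
            RandomAccessFile raf = new RandomAccessFile(path, "rw");
            try {
                FileChannel channel = raf.getChannel();
                FileLock lock = channel.tryLock();
                if (lock == null) {
                    return false; // another process holds the lock
                }
                lock.release();
                return true;
            } finally {
                raf.close();
            }
        } catch (Exception e) {
            return false; // cannot even open for writing -> treat as locked
        }
    }
}
```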
I am trying to make an app that uses a bunch of text files as the basis for most of its actions; these text files can be updated from a web server.
Currently my app is able to download a batch of text files as a zipped archive, but I was wondering whether there is a way to check if I already have the contents of the zip file before downloading it.
What I have now is: download and unzip, followed by a line-by-line check to see whether the current files differ from the recently downloaded files.
This seems very inefficient, but I do not know of any other way.
If anybody has any suggestions and can either give a small example or point me to one I would greatly appreciate it.
To assemble what BackSlash and the others already said in the comments, one possible solution could be to:
1. Create a hash of the file when the file is created (good) or after download (bad)
2. Store this hash somewhere (e.g. inside the filename: instructions-d41d8cd98f00b204e9800998ecf8427e.zip)
3. Client: query the server with that hash string
4. Server: check the transmitted hash against the hash of the newest version
5. Server: respond accordingly (e.g. by using the HTTP built-in 304 response)
6. Client: act upon the response of the server
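Computing such a hash on the client is straightforward with java.security.MessageDigest. A minimal sketch (MD5 is used here only because the example filename above carries an MD5 digest; any algorithm MessageDigest supports works the same way):

```java
import java.security.MessageDigest;

public class HashUtil {
    // Hex-encoded MD5 digest of the given bytes.
    static String md5Hex(byte[] data) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5");
        byte[] digest = md.digest(data);
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // The MD5 of empty input is the well-known digest that appears
        // in the example filename above.
        System.out.println(md5Hex(new byte[0]));
        // -> d41d8cd98f00b204e9800998ecf8427e
    }
}
```

For a real zip file you would feed the file's bytes through MessageDigest.update() in chunks instead of hashing one array.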
I have written and tested Java code that uses Jsch to transfer files. The transfer from the source computer, HERE, to a destination computer, THERE, works flawlessly. Using the same unmodified code, the transfer from HERE to another computer, PROBLEM, works about half the time. The problem is that the code will, on a random file, hang on the write or close, with no exception thrown even after an extremely long timeout. (The next use of the channel, after that extremely long timeout, does cause an exception.) The Linux "scp" command (openssh-clients) copies the same set of files flawlessly, both from HERE to THERE and from HERE to PROBLEM.
I assume there is an imperfection in the transmission or reception of the files that openssh's scp has been designed to detect and work around. Any suggestions as to how to proceed?
Detail(s):
methods used for write/close
OutputStream fos = put(String rp3);
fos.close();
Is there a means similar to Unix alarm/SIGALRM to interrupt the write/close so that the attempt can be retried?
Is there a session setConfig parameter that instructs Jsch to be more fault tolerant? Where are these documented?
Should I switch to another Java implementation of scp?
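The alarm/SIGALRM idea can be approximated in plain Java by running the blocking write/close on a worker thread and bounding it with a Future timeout (a sketch; the Runnable stands in for the actual Jsch put/close calls):

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TimeoutDemo {
    // Run a blocking task, giving up after the specified timeout --
    // similar in spirit to alarm()/SIGALRM.
    static boolean runWithTimeout(Runnable task, long millis)
            throws InterruptedException {
        ExecutorService exec = Executors.newSingleThreadExecutor();
        Future<?> future = exec.submit(task);
        try {
            future.get(millis, TimeUnit.MILLISECONDS);
            return true;             // completed in time
        } catch (TimeoutException e) {
            future.cancel(true);     // interrupt the hung transfer
            return false;            // caller can retry
        } catch (ExecutionException e) {
            return false;            // the transfer itself failed
        } finally {
            exec.shutdownNow();
        }
    }
}
```

One caveat: Future.cancel(true) only interrupts the thread, and a write stuck in native socket I/O may ignore the interrupt; a socket-level timeout on the session (Jsch's Session.setTimeout(ms), if I read that API correctly) is usually the more reliable lever.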
I'm currently using commons-net library for FTP client in my app. I have to download from remote server some files, by some criteria based on the file name. This is a very simplified and reduced version of my actual code (because I do some checks and catch all possible exceptions), but the essence is there:
// ftp is an FTPClient object
// ...
files = ftp.listFiles();
for (FTPFile ftpFile : files) {
    String name = ftpFile.getName();
    if (conformsCriteria(name)) {
        String path = outDirectory + File.separatorChar + name;
        os = new FileOutputStream(path);
        boolean ok = ftp.retrieveFile(name, os);
        os.close();
        // ok is false when the download did not complete successfully
    }
}
Now, what I noticed is that when I run this code, wait a few seconds, and then pull out the network cable, the output directory contains some "empty" files (size = 0 KB) in addition to the files actually downloaded (size > 0 KB). The empty files first led me to believe this method works somewhat asynchronously; but then again, since some files are fully downloaded, it could still be a serialized download. Also, the retrieveFile() function returns, quoting the documentation:
True if successfully completed, false if not
What I need is a serialized download, because I need to log every unsuccessful download.
Browsing through the commons-net source, I saw that, if I'm not wrong, a new Socket is created for each retrieveFile() call.
I'm pretty confused about this, so if someone could explain what is actually happening, and either offer a solution with this library or recommend some other FTP Java library that supports blocking download per file, that would be nice.
Thanks.
You could just use the java.net.URLConnection class that has been present forever. It should know how to handle FTP URLs just fine. Here is a simple example that should give the blocking behavior that you are looking for.
The caveat is that you have to manage the input/output streams yourself, but this should be pretty simple.
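A sketch of that approach (the method name is mine): URLConnection's built-in FTP handler accepts ftp:// URLs, and the read loop below blocks until the stream is exhausted, which gives the per-file serialization asked for.

```java
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.net.URLConnection;

public class UrlDownload {
    // Blocks until the whole resource has been copied to the local path,
    // or throws if the transfer fails partway through.
    static void download(String urlString, String localPath) throws Exception {
        URLConnection conn = new URL(urlString).openConnection();
        InputStream in = conn.getInputStream();
        try {
            OutputStream out = new FileOutputStream(localPath);
            try {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) > -1) {
                    out.write(buf, 0, n);
                }
            } finally {
                out.close();
            }
        } finally {
            in.close();
        }
    }
}
```

For FTP the URL takes the form ftp://user:password@host/path/file; a thrown exception (rather than a boolean return) is then the signal to log an unsuccessful download.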
OK, to briefly answer this so as not to confuse people who might see this question:
Yes, commons-net FTP works as I thought it would; that is, the retrieveFile() method blocks until the download is finished.
It was (of course) my own "mistake" in the code that led me to think otherwise.