A Problem while Retrieving a File from an FTPS Server - java

I am working on an application that stores, retrieves, and deletes files on a server. To identify the files on the server I use an index file (a serialized hash map). Every time I perform an upload I update the "index" file and upload it to the server along with the other files.
For a delete operation I first retrieve the "index" file, delete the listed files from the server based on it, update the "index" file, and upload it to the server again.
The upload operation works fine, but during the delete operation I get a "java.io.EOFException" when I try to retrieve the "index" file.
I use the following code to download the "index" file from the FTPS server:
// download index file
if (service.retrFile("INDEX", "") == service.OK) {
    try {
        ObjectInputStream objIn = new ObjectInputStream(new FileInputStream("INDEX"));
        try {
            Map<String, FileData> filesUploaded = (HashMap<String, FileData>) objIn.readObject();
        } catch (ClassNotFoundException ex) {
            ex.printStackTrace();
        }
        objIn.close();
    } catch (IOException ex) {
        ex.printStackTrace();
    }
}
Where "service.ok" returns '0' if it is successfully connected to FTPS server
and "FileData" contains information about file(attributes).
Same code i am using while performing uploading operation. there it is working fine with no exception. but while performing delete operation when i am retrieving "index" file i am getting exception on the statement :
Map filesUploaded = (HashMap) objIn.readObject();
The exception is:
SEVERE: null
java.io.EOFException
at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2298)
at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2767)
at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:798)
at java.io.ObjectInputStream.<init>(ObjectInputStream.java:298)
at com.pixelvault.gui.DeleteServerFilesDialog.startDeleting(DeleteServerFilesDialog.java:447)
I have checked that the FTPS server connections are properly closed after each operation.
I cannot figure out where I am going wrong.
Any suggestions that help me overcome this problem are appreciated.
I am using org.apache.commons.net.ftp, and "retrFile" is a method I wrote for retrieving files from the server.
Here is the code for "retrFile":
FTPSClient ftp;

public int retrFile(String filename, String savePath) {
    if (!connected) {
        return ERR;
    }
    FileOutputStream fout = null;
    InputStream bin = null;
    try {
        ftp.enterLocalPassiveMode();
        fout = new FileOutputStream(savePath + filename);
        bin = ftp.retrieveFileStream(filename);
        if (bin == null) {
            fout.close();
            return ERR;
        }
        byte[] b = new byte[ftp.getBufferSize()];
        int bytesRead = 0;
        while ((bytesRead = bin.read(b, 0, b.length)) != -1) {
            fout.write(b, 0, bytesRead);
        }
        ftp.completePendingCommand();
        fout.close();
    } catch (FTPConnectionClosedException ex) {
        ex.printStackTrace();
        connected = false;
        return NOT_CONNECTED;
    } catch (IOException ex) {
        ex.printStackTrace();
        return ERR;
    } finally {
        try {
            fout.close();
        } catch (IOException ex) {
            ex.printStackTrace();
            return ERR;
        }
        try {
            if (bin != null) {
                bin.close();
            }
        } catch (IOException ex) {
            ex.printStackTrace();
            return ERR;
        }
    }
    return OK;
}

Are you sure the INDEX file is correctly downloaded?
Is it present in the filesystem when the application is closed?
What FTP library are you using? I only know commons.net from Apache, and I don't recognize the "retrFile" method. Could it be threaded, so that the file is not completely downloaded when the readObject statement is executed?
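If the download is the suspect, a quick way to rule it out is to check the local file before handing it to ObjectInputStream. A minimal sketch, assuming the INDEX file is downloaded into the working directory as in the question; an empty or missing file is exactly what produces an EOFException in the ObjectInputStream constructor:

// Sketch: verify the downloaded INDEX file before deserializing it.
File index = new File("INDEX");
if (!index.isFile() || index.length() == 0) {
    // The download failed or produced an empty file; readObject would
    // throw EOFException here, so bail out instead.
    System.err.println("INDEX file missing or empty: " + index.length() + " bytes");
} else {
    try (ObjectInputStream objIn = new ObjectInputStream(new FileInputStream(index))) {
        Object obj = objIn.readObject();
        System.out.println("Deserialized: " + obj.getClass());
    } catch (IOException | ClassNotFoundException ex) {
        ex.printStackTrace();
    }
}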

Related

Spring Boot file upload error when trying a bigger file

I have this upload method:
try {
    Files.createDirectories(filesPath);
} catch (IOException e1) {
    e1.printStackTrace();
    return null;
}
for (MultipartFile file : Arrays.asList(files)) {
    try {
        // Get the file and save it somewhere
        byte[] bytes = file.getBytes();
        Path path = Paths.get(filesPath + File.separator + file.getOriginalFilename());
        Files.write(path, bytes);
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
}
It works well, but when I try to upload a bigger file, around 1.5 GB, I get this error:
Invalid string length
How can I fix it?
First you need to adjust two properties if you want to allow uploads of such big files.
spring.servlet.multipart.max-file-size=-1
spring.servlet.multipart.max-request-size=-1
Then it is better to use MultipartFile.getInputStream() to read the contents of the file.
You might also use IOUtils.copy() from Apache Commons IO to simplify your job.
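For illustration, a minimal sketch of a streaming version of the loop above, assuming filesPath is a java.nio.file.Path (as the createDirectories call suggests) and files is the same MultipartFile array; it streams each upload to disk instead of buffering it in a byte array. IOUtils.copy() onto a FileOutputStream would achieve the same thing with Commons IO.

for (MultipartFile file : files) {
    // Resolve the target path under the upload directory.
    Path target = filesPath.resolve(file.getOriginalFilename());
    try (InputStream in = file.getInputStream()) {
        // Stream the upload straight to disk instead of loading it into memory.
        Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
}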

How to find the directory for FileOutputStream in an Android app?

I have an app which gets data from an Arduino via Bluetooth. The data are written into various ArrayLists and saved to files afterwards. Here is the code:
public boolean SaveValues(ArrayList<String> arrayList1, ArrayList<String> arrayList2, String valueType, String timeStart, String timeStop) {
    String filename = "File_" + valueType + "_" + timeStart + " bis " + timeStop;
    FileOutputStream outputStream = null;
    if (arrayList1.size() == 0)
        return false;
    try {
        outputStream = openFileOutput(filename, Context.MODE_PRIVATE);
        for (int i = 0; i < arrayList1.size(); i++) {
            outputStream.write(arrayList1.get(i).getBytes());
            outputStream.write("\n".getBytes());
            outputStream.write(arrayList2.get(i).getBytes());
            outputStream.write("\n".getBytes());
        }
        outputStream.close();
        return true;
    } catch (FileNotFoundException e) {
        e.printStackTrace();
        return false;
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    } finally {
        if (outputStream != null) {
            try {
                outputStream.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
I also save the valueType, timeStart, and timeStop in a database so I can find the files later. My problem is that somehow the table where I save these values got deleted, so I am not able to find the files from within the app. I would like to find the files on the phone itself. Actually, I just need the filenames so I can open the files in the app, because they contain data I need. So, where can I find these files?
I have tried searching the phone's files. The files, and all the other app data, should be saved in Android/data/package-name, but I can't find the package name in Android/data. I also tried to search for "File_" in the files, because that's what every one of my filenames starts with, but no files are found.
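For what it's worth, openFileOutput() writes to the app's private internal storage rather than to Android/data on shared storage, so the files are easiest to list from inside the app itself. A minimal sketch, assuming it runs inside an Activity (or any other Context), that prints every saved file whose name starts with "File_":

// List the app's private files written with openFileOutput().
// getFilesDir() typically resolves to /data/data/<package-name>/files.
File filesDir = getFilesDir();
for (String name : fileList()) {
    if (name.startsWith("File_")) {
        File f = new File(filesDir, name);
        System.out.println(name + " (" + f.length() + " bytes)");
    }
}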

EOFException caused by empty file

I created a new file, roomChecker, which is empty. Now when I read it, it throws an EOFException, which is undesirable. Instead, I want it to detect that the file is empty and then run the two functions that are inside the if (roomFeed.size() == 0) branch. I could put those calls in the EOFException catch clause, but that is not what I want, because then they would run every time the file is read to its end. When the file has some data, it should do what is specified in the else branch.
File fileChecker = new File("roomChecker.ser");
if (!fileChecker.exists()) {
    try {
        fileChecker.createNewFile();
    } catch (IOException e) {
        e.printStackTrace();
        System.out.println("Unable to create new File");
    }
}
try (FileInputStream fis = new FileInputStream("roomChecker.ser"); ObjectInputStream ois = new ObjectInputStream(fis)) {
    roomFeed = (List<roomChecker>) ois.readObject();
    System.out.println("End of read");
    if (roomFeed.size() == 0) {
        System.out.println("your in null if statement");
        defaultRoomList();
        uploadAvailableRooms();
    } else {
        for (int i = 0; i < roomNumber.size(); i++) {
            for (int j = 0; j < roomFeed.size(); i++) {
                if ((roomNumber.get(i)).equals(roomFeed.get(i).getRoomNumSearch())) {
                    System.out.println("Reach Dead End for now");
                } else {
                    defaultRoomList();
                    uploadAvailableRooms();
                }
            }
        }
    }
} catch (IOException ioe) {
    ioe.printStackTrace();
} catch (ClassNotFoundException e) {
    e.printStackTrace();
}
All this:
if (!fileChecker.exists()) {
    try {
        fileChecker.createNewFile();
    } catch (IOException e) {
        e.printStackTrace();
        System.out.println("Unable to create new File");
    }
}
is a complete waste of time, and it is one of two possible causes for your empty file problem. Creating a file just so you can open it and then get a different problem instead of coping correctly with the original problem of the file not being there isn't a rational strategy. Instead, you should do this:
if (fileChecker.isFile() && fileChecker.length() == 0) {
    // file is zero length: bail out
}
and, in the following code, this:
try (FileInputStream fis = new FileInputStream(fileChecker); ObjectInputStream ois = new ObjectInputStream(fis)) {
    // ...
} catch (FileNotFoundException exc) {
    // no such file ...
}
// other catch blocks as before.
Of course you can still get EOFException if you read the file to its end, or if the file is incomplete, and you still need to handle that.
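Putting the answer's two suggestions together, a minimal sketch, assuming the same roomFeed, defaultRoomList(), and uploadAvailableRooms() from the question:

File fileChecker = new File("roomChecker.ser");

// Treat a missing or zero-length file the same way as an empty room list,
// instead of creating an empty file and then failing inside readObject().
if (!fileChecker.isFile() || fileChecker.length() == 0) {
    defaultRoomList();
    uploadAvailableRooms();
} else {
    try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(fileChecker))) {
        roomFeed = (List<roomChecker>) ois.readObject();
        // ... proceed with the non-empty case as in the original else branch
    } catch (IOException | ClassNotFoundException e) {
        e.printStackTrace();
    }
}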

Does IOUtils.copy block until writing is finished?

Here's my situation: I'm using IOUtils to copy a file. The next thing I do is send a JSON message to another program to say, "You can download the copy". The problem is about 25% of the time the other program gets an error saying "Received unexpected EOF downloading artifact".
Every time this error occurs, if I try again manually, the error doesn't occur. My theory is that IOUtils.copy doesn't block and the OS is still writing the file to the FS while the other program tries to download it. Is there a way to force IOUtils.copy or other functionally equivalent code to block until the OS has finished writing the file? Or is my theory incorrect? Here's the code I'm using:
private boolean archiveArtifact(String archivePath, String deployId, Artifact artifact) {
    InputStream inputStream = null;
    FileOutputStream fileOutputStream = null;
    boolean successful = true;
    try {
        File archiveDir = new File(archivePath);
        File deployDir = new File(archiveDir, deployId);
        if (!deployDir.exists()) {
            deployDir.mkdirs();
        }
        URLConnection connection = new URL(artifact.getJenkinsUrl()).openConnection();
        inputStream = connection.getInputStream();
        File output = new File(deployDir, artifact.getFileName());
        fileOutputStream = new FileOutputStream(output);
        IOUtils.copy(inputStream, fileOutputStream);
    } catch (IOException e) {
        successful = false;
        logger.error(e.getMessage(), e);
    } finally {
        try {
            if (fileOutputStream != null) {
                fileOutputStream.close();
            }
        } catch (IOException e) {
            successful = false;
            logger.error(e.getMessage(), e);
        }
        try {
            if (inputStream != null) {
                inputStream.close();
            }
        } catch (IOException e) {
            successful = false;
            logger.error(e.getMessage(), e);
        }
    }
    return successful;
}
It might be worth noting that I'm copying this to an NFS mount. Keep in mind I don't really know anything about NFS. This is CentOS release 5.9 (Final).
Your current code only ensures that the file content is passed to the operating system for writing; it does not guarantee that it is actually written to the disk.
To be certain that the file is actually written to disk you can call sync() on the FileDescriptor:
fileOutputStream.flush();
fileOutputStream.getFD().sync();
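For illustration, a sketch of how this could fit into the finally block of the method above; flush and sync happen before close, so the JSON notification is only sent after the data has been forced out:

try {
    if (fileOutputStream != null) {
        // Push buffered bytes to the OS, then force them to the storage device
        // before announcing that the copy is ready to download.
        fileOutputStream.flush();
        fileOutputStream.getFD().sync();
        fileOutputStream.close();
    }
} catch (IOException e) {
    successful = false;
    logger.error(e.getMessage(), e);
}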

Reading zip archive from database

I have a zip file which I store in the database as a BLOB field. When I download it again, the zip file is corrupted; I can open it only with 7-Zip. The file is fine before I upload it to the DB and while it is in the DB. When I retrieve the file from the database as a BLOB and try to unzip it on Unix, I get this error:
Archive: test.zip
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile directory in one of test.zip or
test.zip.zip, and cannot find test.zip.ZIP, period.
Here is the code that retrieves the zip from the database:
public oracle.sql.BLOB GetBlob(Connection myConn,
        CallableStatement cstmt) throws Exception {
    String strSql = null;
    BLOB tempBlob = null;
    try {
        strSql = .... // Here is the SQL procedure which I call to retrieve the BLOB field.
        cstmt = myConn.prepareCall(strSql);
        cstmt.registerOutParameter(1, OracleTypes.BLOB);
        cstmt.setLong(2, request_id);
        cstmt.execute();
        tempBlob = (oracle.sql.BLOB) cstmt.getObject(1);
        int bufsize = tempBlob.getBufferSize();
    } catch (Exception e) {
        e.printStackTrace();
        throw e;
    }
    return tempBlob;
}
Here is the reading code:
oracle.sql.BLOB tempBlob = null;
Connection myConn = null;
CallableStatement cstmt = null;
try {
    myConn = DBHelper.getConnection();
    if (null == myConn)
        throw new SQLException();
    tempBlob = GetBlob(myConn, cstmt);
    int bufsize = tempBlob.getBufferSize();
    InputStream in = tempBlob.getBinaryStream();
    int length = 0;
    byte buf[] = new byte[bufsize];
    while ((in != null) && ((length = in.read(buf)) != -1)) {
        out.write(buf, 0, length);
    }
    in.close();
    // out.flush();
    // out.close();
} catch (Exception e) {
    e.printStackTrace();
} finally {
    if (null != myConn) {
        try {
            myConn.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    if (cstmt != null) {
        try {
            cstmt.close();
        } catch (SQLException e) {
        }
    }
}
Could somebody help me?
Thanks in advance.
Compare the files before and after. The difference should give you some hint what is going wrong.
Possible culprits are:
Missing bytes at the end
converted bytes
messed up order of bytes
I'd expect that looking at the first 10 bytes, the last 10 bytes, and the total number of bytes should be sufficient to give you a good idea of what is going on.
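For illustration, a minimal sketch of such a comparison, assuming the original file and the downloaded copy are available locally under the hypothetical names original.zip and downloaded.zip:

// Compare length, first 10 bytes and last 10 bytes of the two files.
// Reads both files fully into memory, which is fine for a quick check.
byte[] a = Files.readAllBytes(Paths.get("original.zip"));
byte[] b = Files.readAllBytes(Paths.get("downloaded.zip"));
System.out.println("lengths: " + a.length + " vs " + b.length);
System.out.println("first 10: "
        + Arrays.toString(Arrays.copyOfRange(a, 0, Math.min(10, a.length))) + " vs "
        + Arrays.toString(Arrays.copyOfRange(b, 0, Math.min(10, b.length))));
System.out.println("last 10: "
        + Arrays.toString(Arrays.copyOfRange(a, Math.max(0, a.length - 10), a.length)) + " vs "
        + Arrays.toString(Arrays.copyOfRange(b, Math.max(0, b.length - 10), b.length)));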
