Not able to open my uploaded file from server? - java

I am hosting a website on a Tomcat server. The application uses Struts 1.1 and Spring for all its operations. I have a page that is used for uploading files to the server.
When a user uploads a file, the upload succeeds, but retrieving it gives a 404 error. I checked the location over SSH and the uploaded file is present there. I have been scratching my head over this problem for the past 4 days with no solution. It works properly without any problems on my local machine; the problem only appears in the deployment.
An important note: over SSH, if I move the file to some other location and then put it back in its original location, I am able to retrieve the file! I don't know why, but I can't do this for every file uploaded by users. So I modified my code so that the file is uploaded to a temp location first and then moved to the correct location, but even this is not working.
FileOutputStream outputStream = null;
FormFile formFile = null;
String tempFilePath = getServlet().getServletContext().getRealPath("/")
        + "uploads" + System.getProperty("file.separator") + "temp";
try
{
    formFile = uploadForm.getFile();
    boolean errorflag = false;
    if(formFile.getFileSize() > 10660000)
    {
        request.setAttribute("error", "File size cannot exceed 10MB!");
        errorflag = true;
    }
    else
    {
        errorflag = validateFileUpload(request, formFile, errorflag);
    }
    if(errorflag)
    {
        return gotoKnowledgeSharingPage(mapping, request, actionHelper,
                session, userid, instid);
    }
    File folder = new File(tempFilePath);
    if(!folder.exists())
    {
        folder.mkdir();
    }
    outputStream = new FileOutputStream(
            new File(tempFilePath, formFile.getFileName()));
    outputStream.write(formFile.getFileData());
}
finally
{
    if(outputStream != null)
    {
        outputStream.flush();
        outputStream.close();
    }
}
String finalFilePath = getServlet().getServletContext().getRealPath("/")
        + "uploads" + System.getProperty("file.separator")
        + session.getAttribute("userid");
        //+ System.getProperty("file.separator")
        //+ formFile.getFileName();
File oldPath = new File(tempFilePath
        + System.getProperty("file.separator") + formFile.getFileName());
// Move file to new directory
File newPath = new File(finalFilePath);
if(!newPath.exists())
{
    newPath.mkdir();
}
boolean success = oldPath.renameTo(
        new File(finalFilePath, formFile.getFileName()));
if(success)
{
    actionHelper.insertIntoUploadTable(userid, knowledgeForm,
            formFile.getFileName());
}
else
{
    if(oldPath.exists())
    {
        oldPath.delete();
    }
}
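For reference, the move step could also be written with java.nio, which throws an exception instead of renameTo() quietly returning false; this is only a sketch, not the deployed code:

// Sketch: move the uploaded file with java.nio so a failure surfaces
// as an IOException instead of a silent "false" from renameTo().
// Requires java.nio.file.Files, Path, Paths and StandardCopyOption.
Path source = Paths.get(tempFilePath, formFile.getFileName());
Path target = Paths.get(finalFilePath, formFile.getFileName());
Files.createDirectories(target.getParent());
Files.move(source, target, StandardCopyOption.REPLACE_EXISTING);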

Related

the action can't be completed because the file is open in openjdk platform binary spring boot

Hey, I'm trying to delete an image (file) but I can't :(
This is how I upload the image:
try {
    List<String> imagesPaths = new ArrayList<>();
    for (String image : imagesBytes)
    {
        String base64Image = image.split(",")[1];
        byte[] imageByte = javax.xml.bind.DatatypeConverter.parseBase64Binary(base64Image);
        String folder = "C:/images/" + LoggedInUser.UserId();
        File newDirectory = new File(folder);
        if (!newDirectory.exists())
        {
            newDirectory.mkdirs();
        }
        long timeMilli = new Date().getTime();
        String imageType = image.substring("data:image/".length(), image.indexOf(";base64"));
        String path = timeMilli + "." + imageType;
        Files.write(Paths.get(folder, path), imageByte);
        String newPath = LoggedInUser.UserId() + "/" + path;
        imagesPaths.add(newPath);
    }
    logger.debug("uploadImages() in ImageService Ended by " + LoggedInUser.UserName());
    return new ResponseEntity<>(imagesPaths, HttpStatus.OK);
} catch (Exception e) {
    throw new ApiRequestException(e.getMessage());
}
And this is how I delete it:
List<ImageJpa> images = imageRepository.findByStatus(Image.UNUSED.status);
System.out.println(images.size() + ": Images are not used");
images.forEach(image -> {
    String imagePath = "C:/images/" + image.getPath();
    System.out.println(imagePath);
    File imagePathFile = new File(imagePath);
    if (imagePathFile.exists())
    {
        boolean isDeleted = imagePathFile.delete();
        if (isDeleted)
        {
            imageRepository.deleteById(image.getId());
            System.out.println("Deleted the file: " + imagePathFile.getName());
        } else {
            System.out.println("Failed to delete the file: " + imagePathFile.getName());
        }
    } else {
        System.out.println("Already Deleted");
    }
});
I always get "Failed to delete the file ...".
Note: the image does get deleted if I re-run the project or close and reopen the IDE.
The problem was in another function :\
I was opening the files after saving them without closing the stream afterwards.
This is what was missing in the file-reading code:
inputStream.close();
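In other words, the stream that read the saved image was never closed, so the OS kept the file locked and delete() failed. A minimal sketch of the fix, assuming a plain FileInputStream (the actual reading code is not shown above):

// Sketch: try-with-resources closes the stream even if reading throws,
// so the file is not left locked and can be deleted later.
try (InputStream inputStream = new FileInputStream(imagePathFile)) {
    byte[] bytes = inputStream.readAllBytes();
    // ... use the bytes ...
}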

Validate Excel file is downloaded in Selenium Web driver Java

I'm able to download an Excel file by clicking the Download button in the DOM; after that I want to verify that the downloaded file is the same one.
AutoIt is not allowed in the project.
I have tried the code below for verification locally, but if I push this code to the repo, the user path will change and the code will fail.
String filePath = "C:\\Users\\Dhananjay\\Downloads";
String fileName = "report.xlsx";
File targetFile = new File(filePath, fileName);
if (targetFile.exists())
{
    System.out.println("File is verified");
} else {
    System.out.println("File not downloaded");
}
String userProfile = System.getProperty("user.home"); returns the %USERPROFILE% variable,
so you can use String filepath = System.getProperty("user.home") + "\\Downloads";
It works even on Linux.
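A small sketch of the same idea without hard-coding the path separator, assuming the browser saves to the default Downloads folder and the file name from the question:

// Sketch: resolve the Downloads folder relative to the user's home directory.
Path downloadsDir = Paths.get(System.getProperty("user.home"), "Downloads");
File targetFile = downloadsDir.resolve("report.xlsx").toFile();
System.out.println(targetFile.exists() ? "File is verified" : "File not downloaded");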
I have found a way to validate it against the local path, and it's a generic one:
File folder = new File(System.getProperty("user.home") + "\\Downloads");
File[] listOfFiles = folder.listFiles();
boolean found = false;
File f = null;
for (File listOfFile : listOfFiles) {
    if (listOfFile.isFile()) {
        String fileName = listOfFile.getName();
        System.out.println("File " + fileName);
        if (fileName.matches("5MB.zip")) {
            f = listOfFile;
            found = true;
        }
    }
}
Assert.assertTrue("Downloaded document is not found", found);
f.deleteOnExit();
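One caveat worth noting: the browser may still be writing the file when this runs, so a short poll before asserting makes the check less flaky. A sketch, with an arbitrary 10-second timeout and the report.xlsx name assumed from the question:

// Sketch: wait up to ~10 seconds for the downloaded file to appear.
File expected = new File(System.getProperty("user.home") + "\\Downloads", "report.xlsx");
long deadline = System.currentTimeMillis() + 10_000;
while (!expected.exists() && System.currentTimeMillis() < deadline) {
    try {
        Thread.sleep(500);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        break;
    }
}
Assert.assertTrue("Downloaded document is not found", expected.exists());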

Download entire FTP directory in Java (Apache Net Commons)

I am trying to recursively iterate through the entire root directory that I arrive at after logging in to the FTP server.
I am able to connect; all I really want to do from there is recurse through the entire structure, download each file and folder, and keep the same structure as on the FTP server. What I have so far is a working download method; it goes to the server and gets my entire structure of files, which is brilliant, except that it fails on the first attempt and then works the second time around. The error I get is as follows:
java.io.FileNotFoundException: output-directory\test\testFile.png
(The system cannot find the path specified)
I managed to implement upload functionality for a directory that I have locally, but can't quite get downloading to work; after numerous attempts I really need some help.
public static void download(String filename, String base)
{
    File basedir = new File(base);
    basedir.mkdirs();
    try
    {
        FTPFile[] ftpFiles = ftpClient.listFiles();
        for (FTPFile file : ftpFiles)
        {
            if (!file.getName().equals(".") && !file.getName().equals("..")) {
                // If dealing with a directory, change to it and call the function again
                if (file.isDirectory())
                {
                    // Change working directory to this directory.
                    ftpClient.changeWorkingDirectory(file.getName());
                    // Recursive call to this method.
                    download(ftpClient.printWorkingDirectory(), base);
                    // Create the directory locally - in the right place
                    File newDir = new File(base + "/" + ftpClient.printWorkingDirectory());
                    newDir.mkdirs();
                    // Come back out to the parent level.
                    ftpClient.changeToParentDirectory();
                }
                else
                {
                    ftpClient.setFileType(FTPClient.BINARY_FILE_TYPE);
                    String remoteFile1 = ftpClient.printWorkingDirectory() + "/" + file.getName();
                    File downloadFile1 = new File(base + "/" + ftpClient.printWorkingDirectory() + "/" + file.getName());
                    OutputStream outputStream1 = new BufferedOutputStream(new FileOutputStream(downloadFile1));
                    boolean success = ftpClient.retrieveFile(remoteFile1, outputStream1);
                    outputStream1.close();
                }
            }
        }
    }
    catch(IOException ex)
    {
        System.out.println(ex);
    }
}
Your problem (well, your current problem after we got rid of the . and .. and you got past the binary issue) is that you are doing the recursion step before calling newDir.mkdirs().
So suppose you have a tree like
.
..
someDir
    .
    ..
    someFile.txt
someOtherDir
    .
    ..
    someOtherFile.png
What you do is skip the dot files, see that someDir is a directory, then immediately go inside it, skip its dot files, and see someFile.txt, and process it. You have not created someDir locally as yet, so you get an exception.
Your exception handler does not stop execution, so control goes back to the upper level of the recursion. At this point it creates the directory.
So next time you run your program, the local someDir directory is already created from the previous run, and you see no problem.
Basically, you should change your code to:
if (file.isDirectory())
{
    // Change working directory to this directory.
    ftpClient.changeWorkingDirectory(file.getName());
    // Create the directory locally - in the right place
    File newDir = new File(base + "/" + ftpClient.printWorkingDirectory());
    newDir.mkdirs();
    // Recursive call to this method.
    download(ftpClient.printWorkingDirectory(), base);
    // Come back out to the parent level.
    ftpClient.changeToParentDirectory();
}
A complete standalone code to download all files recursively from an FTP folder:
private static void downloadFolder(
    FTPClient ftpClient, String remotePath, String localPath) throws IOException
{
    System.out.println("Downloading folder " + remotePath + " to " + localPath);

    FTPFile[] remoteFiles = ftpClient.listFiles(remotePath);
    for (FTPFile remoteFile : remoteFiles)
    {
        if (!remoteFile.getName().equals(".") && !remoteFile.getName().equals(".."))
        {
            String remoteFilePath = remotePath + "/" + remoteFile.getName();
            String localFilePath = localPath + "/" + remoteFile.getName();
            if (remoteFile.isDirectory())
            {
                new File(localFilePath).mkdirs();
                downloadFolder(ftpClient, remoteFilePath, localFilePath);
            }
            else
            {
                System.out.println("Downloading file " + remoteFilePath + " to " + localFilePath);
                OutputStream outputStream =
                    new BufferedOutputStream(new FileOutputStream(localFilePath));
                if (!ftpClient.retrieveFile(remoteFilePath, outputStream))
                {
                    System.out.println("Failed to download file " + remoteFilePath);
                }
                outputStream.close();
            }
        }
    }
}
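For completeness, a sketch of how the method above might be called; the host, credentials and paths are placeholders, not values from the question:

// Sketch: connect, log in, switch to binary mode, then download recursively.
FTPClient ftpClient = new FTPClient();
ftpClient.connect("ftp.example.com");
ftpClient.login("user", "password");
ftpClient.enterLocalPassiveMode();
ftpClient.setFileType(FTPClient.BINARY_FILE_TYPE);
downloadFolder(ftpClient, "/remote/root", "output-directory");
ftpClient.logout();
ftpClient.disconnect();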

How to preserve date modified when retrieving file using Apache FTPClient?

I am using org.apache.commons.net.ftp.FTPClient for retrieving files from an FTP server. It is crucial that I preserve the last-modified timestamp on the file when it is saved on my machine. Does anyone have a suggestion for how to solve this?
This is how I solved it:
public boolean retrieveFile(String path, String filename, long lastModified) throws IOException {
    File localFile = new File(path + "/" + filename);
    OutputStream outputStream = new FileOutputStream(localFile);
    boolean success = client.retrieveFile(filename, outputStream);
    outputStream.close();
    localFile.setLastModified(lastModified);
    return success;
}
I wish the Apache-team would implement this feature.
This is how you can use it:
List<FTPFile> ftpFiles = Arrays.asList(client.listFiles());
for (FTPFile file : ftpFiles) {
    retrieveFile("/tmp", file.getName(), file.getTimestamp().getTimeInMillis());
}
You can modify the timestamp after downloading the file.
The timestamp can be retrieved through the LIST command, or the (non-standard) MDTM command.
You can see how to modify the timestamp here: http://www.mkyong.com/java/how-to-change-the-file-last-modified-date-in-java/
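For illustration, a minimal sketch of applying a retrieved timestamp to the local file; the path and the remoteFile variable are placeholders:

// Sketch: set the local file's last-modified time (milliseconds since the epoch)
// to the value obtained from the FTP listing or MDTM.
File downloaded = new File("/tmp/file.zip");
long remoteTimeMillis = remoteFile.getTimestamp().getTimeInMillis();
if (!downloaded.setLastModified(remoteTimeMillis)) {
    System.out.println("Could not update last-modified time");
}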
When downloading a list of files, like all files returned by FTPClient.mlistDir or FTPClient.listFiles, use the timestamp returned with the listing to update the timestamp of the local downloaded files:
String remotePath = "/remote/path";
String localPath = "C:\\local\\path";

FTPFile[] remoteFiles = ftpClient.mlistDir(remotePath);
for (FTPFile remoteFile : remoteFiles) {
    File localFile = new File(localPath + "\\" + remoteFile.getName());
    OutputStream outputStream = new BufferedOutputStream(new FileOutputStream(localFile));
    if (ftpClient.retrieveFile(remotePath + "/" + remoteFile.getName(), outputStream))
    {
        System.out.println("File " + remoteFile.getName() + " downloaded successfully.");
    }
    outputStream.close();

    localFile.setLastModified(remoteFile.getTimestamp().getTimeInMillis());
}
When downloading a single specific file only, use FTPClient.mdtmFile to retrieve the remote file timestamp and update the timestamp of the downloaded local file accordingly:
File localFile = new File("C:\\local\\path\\file.zip");

FTPFile remoteFile = ftpClient.mdtmFile("/remote/path/file.zip");
if (remoteFile != null)
{
    OutputStream outputStream = new BufferedOutputStream(new FileOutputStream(localFile));
    if (ftpClient.retrieveFile(remoteFile.getName(), outputStream))
    {
        System.out.println("File downloaded successfully.");
    }
    outputStream.close();

    localFile.setLastModified(remoteFile.getTimestamp().getTimeInMillis());
}

Downloaded zip file has 0 bytes as size

I have written a Java web application that allows a user to download files from a server. These files are quite large and so are zipped together before download.
It works like this:
1. The user gets a list of files that match his/her criteria
2. If the user likes a file and wants to download he/she selects it by checking a checkbox
3. The user then clicks "download"
4. The files are then zipped and stored on a server
5. The user is then presented with a page which contains a link to the downloadable zip file
6. However, on downloading the zip file, the file that is downloaded is 0 bytes in size
I have checked the remote server and the zip file is being created properly; all that is left is to serve the file to the user somehow. Can you see where I might be going wrong, or suggest a better way to serve the zip file?
The code that creates the link is:
<%
    String zipFileURL = (String) request.getAttribute("zipFileURL");
%>
<p><a href="<%= zipFileURL %>">Zip File Link</a></p>
The code that creates the zipFileURL variable is:
public static String zipFiles(ArrayList<String> fileList, String contextRootPath) {
    // time-stamping
    Date date = new Date();
    Timestamp timeStamp = new Timestamp(date.getTime());
    Iterator fileListIterator = fileList.iterator();
    String zipFileURL = "";
    try {
        String ZIP_LOC = contextRootPath + "WEB-INF" + SEP + "TempZipFiles" + SEP;
        BufferedInputStream origin = null;
        zipFileURL = ZIP_LOC
                + "FITS." + timeStamp.toString().replaceAll(":", ".").replaceAll(" ", ".") + ".zip";
        FileOutputStream dest = new FileOutputStream(ZIP_LOC
                + "FITS." + timeStamp.toString().replaceAll(":", ".").replaceAll(" ", ".") + ".zip");
        ZipOutputStream out = new ZipOutputStream(new BufferedOutputStream(dest));
        // out.setMethod(ZipOutputStream.DEFLATED);
        byte data[] = new byte[BUFFER];
        while (fileListIterator.hasNext()) {
            String fileName = (String) fileListIterator.next();
            System.out.println("Adding: " + fileName);
            FileInputStream fi = new FileInputStream(fileName);
            origin = new BufferedInputStream(fi, BUFFER);
            ZipEntry entry = new ZipEntry(fileName);
            out.putNextEntry(entry);
            int count;
            while ((count = origin.read(data, 0, BUFFER)) != -1) {
                out.write(data, 0, count);
            }
            origin.close();
        }
        out.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return zipFileURL;
}
A URL cannot access any files (directly) under WEB-INF. I'd suggest using a servlet to return the file from whatever location it was saved to.
I'd also suggest saving the file outside the context of your webapp.
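A rough sketch of such a servlet, assuming a Servlet 3.x container; the URL pattern, parameter name and lookup logic are made up for illustration, and in real code the requested name must be validated before building the path:

// Hypothetical download servlet: streams a previously created zip back to the browser.
@WebServlet("/downloadZip")
public class ZipDownloadServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        File zipFile = new File(getServletContext().getRealPath("/WEB-INF/TempZipFiles"),
                request.getParameter("name"));
        response.setContentType("application/zip");
        response.setHeader("Content-Disposition",
                "attachment; filename=\"" + zipFile.getName() + "\"");
        response.setContentLength((int) zipFile.length());
        try (InputStream in = new FileInputStream(zipFile);
             OutputStream out = response.getOutputStream()) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        }
    }
}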
