String dirWay = "C:\\Project";
int daysBack = 7;
File directory = new File(dirWay);
if (directory.exists()) {
    File[] listFiles = directory.listFiles();
    long purgeTime = System.currentTimeMillis() - (daysBack * 24 * 60 * 60 * 1000);
    for (File listFile : listFiles) {
        if (listFile.lastModified() < purgeTime) {
            if (!listFile.delete()) {
                System.err.println("Unable to delete file: " + listFile);
            } else {
                System.out.println(listFile);
            }
        }
    }
}
This works only for files directly in the Project folder. But the Project folder contains sub-folders and files, and each sub-folder contains further folders and files.
How can I check all folders recursively, check each file's last-modified date, and delete the file if it is more than 7 days old?
For example, I have the directory C:/Project/JavaIdea/...
If the files in the JavaIdea folder are older than 7 days, I need to delete all of those files and the JavaIdea folder too.
The easiest way is probably to use Files.walkFileTree. Files.walk gives you a Stream<Path>, but it yields each folder before its contents, so you cannot tell whether a folder has become empty until after its files have been processed. A FileVisitor, on the other hand, fires two events per folder - one before its entries are visited and one after.
Something that could work (untested), where start is the root Path (e.g. Paths.get(dirWay)) and daysBack comes from the question:
// requires: java.io.IOException, java.nio.file.*, java.nio.file.attribute.BasicFileAttributes,
//           java.util.ArrayDeque, java.util.Deque, java.util.concurrent.atomic.LongAdder
Deque<LongAdder> counts = new ArrayDeque<>(); // counting # of surviving entries per directory level
long purgeTime = System.currentTimeMillis() - daysBack * 24L * 60 * 60 * 1000;

Files.walkFileTree(start, new SimpleFileVisitor<Path>() {
    @Override
    public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) {
        counts.push(new LongAdder());
        return FileVisitResult.CONTINUE;
    }

    @Override
    public FileVisitResult postVisitDirectory(Path dir, IOException exc) throws IOException {
        long remaining = counts.pop().sum();
        if (remaining == 0 && !dir.equals(start)) {
            // the directory is empty now, delete it (but keep the root itself)
            Files.delete(dir);
        } else if (!counts.isEmpty()) {
            // the directory survives, so its parent is not empty either
            counts.peek().increment();
        }
        return FileVisitResult.CONTINUE;
    }

    @Override
    public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
        if (attrs.lastModifiedTime().toMillis() < purgeTime) {
            // old enough, delete it
            Files.delete(file);
        } else {
            counts.peek().increment();
        }
        return FileVisitResult.CONTINUE;
    }

    @Override
    public FileVisitResult visitFileFailed(Path file, IOException exc) {
        // could not inspect the entry, so count it as surviving
        if (!counts.isEmpty()) {
            counts.peek().increment();
        }
        return FileVisitResult.CONTINUE;
    }
});
I am trying to move the folder 114229494 from one path to another, and I want to replace the existing folder at the destination path (D:\SampleP2), but I am getting a DirectoryNotEmptyException with the code below.
Since I don't want to change the name of the folder, I used D:\SampleP2\114229494 as the destination path.
There are some images inside the folder.
Please help me figure out what is wrong with this code.
import java.io.File;
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class MoveFiles {
    private String source_path = "D:\\sampleP1\\114229494";
    private String destination_path = "D:\\SampleP2\\114229494";

    public void movefolder() {
        File source = new File(source_path);
        File destination = new File(destination_path);
        Path path1 = FileSystems.getDefault().getPath(source_path);
        Path path2 = FileSystems.getDefault().getPath(destination_path);
        System.out.println(source);
        System.out.println(destination);
        try {
            Files.move(path1, path2, StandardCopyOption.REPLACE_EXISTING);
        } catch (IOException e) {
            e.printStackTrace();
            System.out.println("there is file in uploads");
        }
    }
}
If path1 is a directory and path2 already exists, you cannot use Files.move(path1, path2) or Files.move(path1, path2, StandardCopyOption.REPLACE_EXISTING); these raise either FileAlreadyExistsException or DirectoryNotEmptyException.
A reliable move for the case where the destination directory already exists needs to traverse the source directory tree, move each file to the same relative location under the destination, and clean up the source folders as they are emptied.
The Files.walkFileTree method handles the traversal; add a suitable FileVisitor that moves / merges the files from path1 into path2:
public static void move(final Path source, final Path dest) throws IOException {
FileVisitor<Path> visitor = new FileVisitor<>() {
public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) throws IOException {
final Path target = dest.resolve(source.relativize(dir));
System.out.println("Files.createDirectories("+target+")");
Files.createDirectories(target);
return FileVisitResult.CONTINUE;
}
public FileVisitResult visitFile(Path p, BasicFileAttributes attrs) throws IOException {
final Path targ = dest.resolve(source.relativize(p));
System.out.println("Files.move("+p+", "+targ+", StandardCopyOption.REPLACE_EXISTING)");
Files.move(p, targ, StandardCopyOption.REPLACE_EXISTING);
return FileVisitResult.CONTINUE;
}
public FileVisitResult postVisitDirectory(Path dir, IOException exc) throws IOException {
// Dir should be empty by now or there is coding error:
System.out.println("Files.deleteIfExists("+dir+")");
if (!Files.deleteIfExists(dir))
throw new IOException("Failed to delete src dir: "+dir);
return FileVisitResult.CONTINUE;
}
public FileVisitResult visitFileFailed(Path file, IOException exc) throws IOException {
throw exc;
}
};
Files.walkFileTree(source, visitor);
}
Note that the above works for files or folders - Files.walkFileTree handles calling the appropriate callbacks.
The move could be made significantly quicker by calling Files.move(subDir, targetSubDir) whenever a sub-directory is not found in the target - replacing the createDirectories / deleteIfExists pair and avoiding the need to move every file underneath that subtree. I will leave a full implementation as an exercise for the reader.
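As a rough, untested sketch of that shortcut (it assumes source and destination live on the same file store, since Files.move cannot move a non-empty directory across stores), preVisitDirectory in the visitor above could become:
public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) throws IOException {
    final Path target = dest.resolve(source.relativize(dir));
    if (Files.notExists(target)) {
        // move the whole sub-tree in one rename and skip descending into it;
        // postVisitDirectory is then not invoked for this (already moved) directory
        Files.move(dir, target);
        return FileVisitResult.SKIP_SUBTREE;
    }
    Files.createDirectories(target);
    return FileVisitResult.CONTINUE;
}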
I need to list the directories in C:\ and get the size of each one.
I am trying with this code:
int[] count = {0};
try {
Files.walkFileTree(Paths.get(dir.getPath()), new HashSet<FileVisitOption>(Arrays.asList(FileVisitOption.FOLLOW_LINKS)),
Integer.MAX_VALUE, new SimpleFileVisitor<Path>() {
@Override
public FileVisitResult visitFile(Path file , BasicFileAttributes attrs) throws IOException {
System.out.printf("Visiting file %s\n", file);
++count[0];
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult visitFileFailed(Path file , IOException e) throws IOException {
System.err.printf("Visiting failed for %s\n", file);
return FileVisitResult.SKIP_SUBTREE;
}
@Override
public FileVisitResult preVisitDirectory(Path dir , BasicFileAttributes attrs) throws IOException {
System.out.printf("About to visit directory %s\n", dir);
return FileVisitResult.CONTINUE;
}
});
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
The problem is that this visits every single file on the disk, so it takes a very long time. I tried different FileVisitResult options but couldn't get the desired result. I only need the first-level directories and the disk space each one uses.
Thanks in advance for any help.
Folders do not have a (significant) size of their own. Usually, when a folder's size is discussed, what is meant is the sum of the sizes of all files under that folder and its sub-folders. Computing that for C:\ takes a very long time no matter what (try right-clicking your Program Files folder and opening Properties). Remove the per-file logging and try again, then compare the time your code takes with the time Windows Explorer needs to calculate the same size.
In any case, Apache's commons-io has a one-liner:
long size = FileUtils.sizeOfDirectory(folder);
And Java-8 has another "pure" one-liner:
long size = Files.walk(Paths.get("C:\\"))
.filter(p -> p.toFile().isFile())
.mapToLong(p -> p.toFile().length())
.sum();
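To get what the question actually asks for - the size of each first-level directory under a root - the two ideas can be combined. The following is an untested sketch (Java 8+, run inside a method that declares throws IOException; imports from java.io, java.nio.file and java.util.stream assumed). If some entry under a directory cannot be read, Files.walk fails with an UncheckedIOException and the sketch reports that directory's size as -1 rather than skipping the unreadable entry:
Path root = Paths.get("C:\\");
try (Stream<Path> firstLevel = Files.list(root)) {
    firstLevel.filter(Files::isDirectory).forEach(dir -> {
        long size;
        try (Stream<Path> walk = Files.walk(dir)) {
            size = walk.filter(p -> p.toFile().isFile())
                       .mapToLong(p -> p.toFile().length())
                       .sum();
        } catch (IOException | UncheckedIOException e) {
            size = -1; // some entry under dir could not be read
        }
        System.out.println(dir + " : " + size + " bytes");
    });
}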
I have folders with multiple files. This is how the organization is.
Folder_1
- Folder_1_File_1
- Folder_1_File_2
- Folder_1_File_3
- Folder_1_File_4
...
Folder_2
- Folder_2_File_1
- Folder_2_File_2
- Folder_2_File_3
- Folder_2_File_4
...
Folder_3
- Folder_3_File_1
- Folder_3_File_2
- Folder_3_File_3
- Folder_3_File_4
...
I have to open each folder, read each file, perform some computations on all the files of that folder and display the result, then move on to the next folder and do the same. Below is my code. Is there an easier method to do it? My code is really slow and sometimes it doesn't even work.
public void listFilesForFolder_Test(final File folder) {
for (final File fileEntry : folder.listFiles()) {
if (fileEntry.isDirectory()) {
listFilesForFolder_Test(fileEntry);
} else {
Test_filenames.add(fileEntry.getAbsolutePath());
Test_filename.add(fileEntry.getName());
// String content = FileUtils.readFileToString(file);
// read file and perform computations on it.
}
}
}
Java 7 introduced the visitor-based Files.walkFileTree() API, which is a lot easier to master:
File file = new File("/base/folder");
Files.walkFileTree(file.toPath(), new SimpleFileVisitor<Path>() {
@Override
public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
// do your thing here
return FileVisitResult.CONTINUE;
}
});
This code walks the given path tree and gives you the regular files (not the folders) it contains:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.logging.Level;
import java.util.logging.Logger;

public class ReadRecurssivelyFromFolder {
    public static void main(String[] args) {
        try {
            Files.walk(Paths.get("/home/rafik/test"))
                 .filter(Files::isRegularFile)
                 .forEach(System.out::println);
        } catch (IOException ex) {
            Logger.getLogger(ReadRecurssivelyFromFolder.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
Then you can perform your calculation based on the given paths.
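For example, a sketch (untested, placed inside the same try block as above; java.util.Map, java.util.List and java.util.stream.Collectors assumed) that groups the regular files by their parent folder, so the per-folder computation from the question can run over each group:
Map<Path, List<Path>> filesByFolder = Files.walk(Paths.get("/home/rafik/test"))
        .filter(Files::isRegularFile)
        .collect(Collectors.groupingBy(Path::getParent));

filesByFolder.forEach((folder, files) -> {
    // read each file in 'files', run your computation, then print the result for this folder
    System.out.println(folder + " -> " + files.size() + " files");
});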
I have a service that will be populating a directory, and I need to copy that directory to another directory periodically. The source will keep being populated.
The directory will be quite large when copied, so I only want to add new files at the destination, or overwrite files that aren't the same (e.g. a file size mismatch or a different modification date).
Is there a simple way to do this? I'm aware of FileUtils, but it's unclear to me whether it always overwrites all the files, and what "merge" means here, specifically whether it skips files that already match.
Files.walkFileTree and the other methods of Files can do it:
public void copyTree(Path source,
Path destination)
throws IOException {
Files.walkFileTree(source,
new SimpleFileVisitor<Path>() {
@Override
public FileVisitResult preVisitDirectory(Path dir,
BasicFileAttributes attr)
throws IOException {
Path destPath = destination.resolve(source.relativize(dir));
Files.createDirectories(destPath);
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult visitFile(Path file,
                                 BasicFileAttributes attr)
        throws IOException {
    Path destPath = destination.resolve(source.relativize(file));
    // check existence first: getLastModifiedTime throws if destPath does not exist yet
    if (Files.notExists(destPath) ||
            Files.getLastModifiedTime(file)
                 .compareTo(Files.getLastModifiedTime(destPath)) > 0) {
        Files.copy(file, destPath,
                   StandardCopyOption.COPY_ATTRIBUTES,
                   StandardCopyOption.REPLACE_EXISTING);
    }
    return FileVisitResult.CONTINUE;
}
});
}
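A usage example (the paths below are placeholders for illustration, not taken from the question):
// hypothetical source and destination; adjust to your own layout
copyTree(Paths.get("D:\\incoming"), Paths.get("D:\\mirror"));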
Hi, I would like to process files inside many sub-directories using Java. Pseudo code would be:
while(mainDir.hasMoreDirectory())
{
getFilesFromCurrentDirectory()
passThoseFilesAsArgumentToProcess()
}
I am currently using the following code
public void list(File file) {
System.out.println(file.getName());
File[] children = file.listFiles();
for (File child : children) {
list(child);
}
}
The above code just lists files. Alternatively, I could store the files and directories in a list and then process them in another loop, but I am not able to come up with what I want as shown in the pseudo code. I am new to working with files and directories, please help. Thanks in advance.
If you are using Java 7, you can harness the enhanced functionality of NIO in the form of the Files.walkFileTree method. Traversing the file system has never been easier in Java.
There is a short tutorial on its usage in the Oracle Java tutorials.
It implements the visitor pattern so you don't need to worry about the traversal algorithm itself, only specify what you want to do with each entry.
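As a minimal, untested sketch of what that looks like for this question (the path and the println body are placeholders for your own directory and processing; the usual java.nio.file imports and a surrounding method that declares throws IOException are assumed):
Files.walkFileTree(Paths.get("/path/to/mainDir"), new SimpleFileVisitor<Path>() {
    @Override
    public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
        // invoked once for every regular file in the tree
        System.out.println("Processing " + file);
        return FileVisitResult.CONTINUE;
    }
});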
When traversing a directory tree in Java 7, use the Paths and Files functionality. They not only make reading directories and files easier, they're also way faster than the "old" File approach.
Assume you have two directories, mainDir and otherDir, and you want to walk through all directories of mainDir down to its leaves. For each entry in mainDir (file, sub-directory, symbolic link, ...) you want to compare that entry and its attributes (size, modification time, ...) against the entry at the same position in otherDir.
Then this would be your code:
public final void test() throws IOException, InterruptedException {
final Path mainDir = Paths.get("absolute path to your main directory to read from");
final Path otherDir = Paths.get("absolute path to your other directory to compare");
// Walk thru mainDir directory
Files.walkFileTree(mainDir, new FileVisitor<Path>() {
@Override
public FileVisitResult preVisitDirectory(Path path,
BasicFileAttributes atts) throws IOException {
return visitFile(path, atts);
}
@Override
public FileVisitResult visitFile(Path path, BasicFileAttributes mainAtts)
throws IOException {
// I've seen two implementations on windows and MacOSX. One has passed the relative path, one the absolute path.
// This works in both cases
Path relativePath = mainDir.relativize(mainDir.resolve(path));
BasicFileAttributes otherAtts = Files.readAttributes(otherDir.resolve(relativePath), BasicFileAttributes.class);
// Do your comparison logic here:
compareEntries(mainDir, otherDir, relativePath, mainAtts, otherAtts);
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult postVisitDirectory(Path path,
IOException exc) throws IOException {
// nothing to do after a directory has been visited
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult visitFileFailed(Path path, IOException exc)
throws IOException {
exc.printStackTrace();
// If the root directory has failed it makes no sense to continue
return (path.equals(mainDir))? FileVisitResult.TERMINATE:FileVisitResult.CONTINUE;
}
});
}
What it does not do:
Find entries that exist in otherDir but not in mainDir.
Path and BasicFileAttributes are not Serializable, so there's no easy way to do this walk on two different machines.
A method like this returns a List of all the files recursively contained within a directory. You can either operate on the returned List or replace the rtn.add calls with your own processing.
Beware that this method doesn't have anything to stop it getting stuck in circular symlinks.
public static List<File> getFilesRecursive(File s)
{
    ArrayList<File> rtn = new ArrayList<File>();
    File[] contents = s.listFiles();
    if (contents == null) {
        return rtn; // s is not a directory, or it is not readable
    }
    for (int i = 0; i < contents.length; i++)
    {
        if (contents[i].isDirectory()) {
            rtn.addAll(getFilesRecursive(contents[i]));
        } else {
            rtn.add(contents[i]);
        }
    }
    return rtn;
}
Maybe this piece of code helps you:
public void traverse(String path) {
File root = new File(path);
File[] list = root.listFiles();
if (list == null) return;
for (File file : list) {
if (file.isDirectory()) {
traverse(file.getAbsolutePath());
System.out.println("Directory: " + file.getAbsoluteFile());
} else {
System.out.println("File: " + file.getAbsoluteFile());
}
}
}
Will the following do?
public void list(File file) {
File[] children = file.listFiles();
if (children != null) {
process(children);
for (File child : children) {
if (child.isDirectory()) {
list(child);
}
}
} else {
process(new File[]{file});
}
}
private void process(File[] children) {
for (File child : children) {
if (child.isFile()) {
// process normal file
}
}
}
private static List<File> allFiles = new ArrayList<File>();
private static void processFiles(String rootDirectory) {
File rootDir = new File(rootDirectory);
if (rootDir.exists()) {
traverseDirectories(rootDir);
}
}
private static void traverseDirectories(File file) {
// add all files and directories to list.
allFiles.add(file);
if (file.isDirectory()) {
File[] fileList = file.listFiles();
if (fileList != null) { // listFiles() returns null for unreadable directories
    for (File fileHandle : fileList) {
        traverseDirectories(fileHandle);
    }
}
} else {
// call to process file
System.out.println("Call to process file " + file.getAbsolutePath());
}
}