I finally got my Files.walk working. Now my question is whether there is any way to identify if a file collected into the list comes from a subfolder or from the main folder, because there is a delete function for these files, but the user should not be able to delete files from the subfolders.
private static List<FileInfo> listBackupFilesInLocalDir(String localPath, Predicate<String> fileNamePredicate) {
    try (Stream<Path> files = Files.walk(Paths.get(localPath))) {
        return files.filter(p -> fileNamePredicate.test(p.getFileName().toString()))
                    .map(p -> new FileInfo(p.getFileName().toString(), p.toFile().length()))
                    .sorted()
                    .collect(toList());
    } catch (IOException e) {
        log.error("Error listing directories", e);
        throw new RuntimeException(e);
    }
}
This is the function which finds and collects all the files. Is there some sort of filter I need, or is it even possible to do what I want?
deleteLocalFile.addClickListener(event -> {
    try {
        Files.delete(Paths.get(this.localStorage, String.valueOf(localFilesComboBox.getValue())));
    } catch (IOException e) {
        UI.getCurrent().access(() -> Notification.show(e.getMessage(), Notification.Type.ERROR_MESSAGE));
    }
    UI.getCurrent().access(() -> {
        localFilesComboBox.removeAllItems();
        localFilesComboBox.addItems(listBackupFiles());
    });
});
The above is the delete method, and what I want is simply something like

if (fromFolderA) {
    // deny delete
}

or something similar.
Okay, so you want to be able to delete files, and only files, in the main folder, and not files in the subfolders. Thus you need a list of the files in the main folder. You can get this by checking the URL of the FileInfo objects returned by your listBackupFilesInLocalDir method, in the following manner:
public ArrayList<FileInfo> filesInMainFolder(String mainPath, ArrayList<FileInfo> files) {
    ArrayList<FileInfo> res = new ArrayList<>();
    for (FileInfo info : files) {
        String url = info.getUrl().toString();
        // Get the path of the file for which we have file information
        url = url.substring(0, url.lastIndexOf('/'));
        // Is the file directly in the main folder?
        if (url.equals(mainPath) && !info.isDirectory()) {
            res.add(info);
        }
    }
    return res;
}
The method should be fairly easy to follow. An option I have not included here is the getPath() method on URL, because I am not 100% certain how it works. If it gets you the directory path, use that instead: drop the conversion of the URL to a string and simply use info.getUrl().getPath().
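Alternatively, you can filter by parent directory directly in the stream, before paths are mapped to FileInfo objects. A minimal sketch (returning plain Paths for brevity; the hypothetical class and method names are just for illustration):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MainFolderFilter {
    // Keeps only regular files whose parent is mainDir itself,
    // so anything inside a subfolder is excluded.
    public static List<Path> filesInMainFolder(Path mainDir) throws IOException {
        try (Stream<Path> files = Files.walk(mainDir)) {
            return files.filter(Files::isRegularFile)
                        .filter(p -> mainDir.equals(p.getParent()))
                        .collect(Collectors.toList());
        }
    }
}
```

Comparing Path objects avoids the pitfalls of comparing URL strings (trailing slashes, encoding).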
To know at which depth you are relative to localPath, you could count the number of elements in the path. Something like this:
int mainFolderDepth = Paths.get(localPath).toRealPath().getNameCount();
// ... and in your stream (note that toRealPath() throws IOException):
int folderDepth = p.toRealPath().getNameCount();
if (!Files.isDirectory(p)) folderDepth--; // don't count the file name itself
if (folderDepth != mainFolderDepth) { /* not in the main folder */ }
Alternatively, in your file walk, make sure you don't enter subfolders if you want to ignore them by setting the maxDepth argument to 1.
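A minimal sketch of the maxDepth approach, assuming the deletable backup files sit directly in localPath (the class and method names here are illustrative, not from the question):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class TopLevelListing {
    // maxDepth = 1 restricts the walk to localPath's direct children,
    // so files inside subfolders never reach the delete list at all.
    public static List<String> listTopLevelFiles(String localPath,
                                                 Predicate<String> namePredicate) throws IOException {
        try (Stream<Path> files = Files.walk(Paths.get(localPath), 1)) {
            return files.filter(Files::isRegularFile)
                        .map(p -> p.getFileName().toString())
                        .filter(namePredicate)
                        .sorted()
                        .collect(Collectors.toList());
        }
    }
}
```

This sidesteps the whole "which folder did it come from" question: subfolder contents are simply never collected.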
I have a plugin. Within this plugin I have a view that creates some markers so that I can open a file and navigate automatically to a selected line. However, whereas this method has worked for me previously, it now only ever returns null as my IFile.
Here is my method for creating the markers. Note that this method is not located within the controlling class of the FXML file; it is located in another, external file.
public static String openAbsoluteFileInEclipseEditor(String absoluteLocationP, int lineNumberP) {
    File absolute = new File(absoluteLocationP);
    if (absolute.isFile() && absolute.exists()) {
        if (Globals.testing) {
            try {
                Desktop.getDesktop().open(absolute);
                return null;
            } catch (IOException e) {
                ErrorHandling.reportErrors(e);
                return "";
            }
        } else {
            IWorkbenchWindow window = PlatformUI.getWorkbench().getActiveWorkbenchWindow();
            IWorkbenchPage page = window.getActivePage();
            IWorkspace workspace = ResourcesPlugin.getWorkspace();
            try {
                if (lineNumberP != 0) {
                    IPath location = Path.fromOSString(absolute.getAbsolutePath());
                    System.out.println("location " + location);
                    IFile iFile = workspace.getRoot().getFileForLocation(location);
                    System.out.println("iFile " + iFile);
                    IMarker marker = iFile.createMarker(IMarker.TEXT);
                    marker.setAttribute(IMarker.LINE_NUMBER, lineNumberP);
                    IDE.openEditor(page, marker);
                    marker.delete();
                } else {
                    IFileStore fileStore = EFS.getLocalFileSystem().getStore(absolute.toURI());
                    IDE.openEditorOnFileStore(page, fileStore);
                }
                return null;
            } catch (PartInitException e) {
                ErrorHandling.reportErrors(e);
                return "";
            } catch (CoreException e) {
                ErrorHandling.reportErrors(e);
                return "";
            }
        }
    } else {
        return "File not found";
    }
}
Here are the two printed values that you can see in the middle of the method:
location C:/SoftwareAG_Workspaces/workspace105/HOSPITAL/Natural-Libraries/HOSPITAL/Programs/XX021P01.NSP
iFile null
Can anyone point out why it might no longer work and why it only returns null? If possible, could you suggest an alternative method that will work? The file does exist at that location; I have made sure of that.
Thanks in advance.
The Javadoc for getFileForLocation says:
This method returns null when the given file system location is not
under the location of any existing project in the workspace.
So is that location in the current workspace, and in a valid project?
The Javadoc also says:
The result will also omit resources that are explicitly excluded from
the workspace according to existing resource filters.
So check any resource filters.
Finally the Javadoc says:
Warning: This method ignores linked resources and their children.
Since linked resources may overlap other resources, a unique mapping
from a file system location to a single resource is not guaranteed.
To find all resources for a given location, including linked
resources, use the method findFilesForLocation.
So check for linked resources.
I have two methods, one to write to a file and a second to rename it:
public void writeToFile(File file, String content, boolean isLastLine) {
    Optional<File> optionalFile = Optional.ofNullable(file);
    if (!isLastLine)
        content += System.lineSeparator();
    try {
        Files.write(
                optionalFile.orElseThrow(() -> new RuntimeException("File couldn't be found")).toPath(),
                content.getBytes(),
                StandardOpenOption.APPEND, StandardOpenOption.SYNC);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
public void renameFile(File fileToRename, String newFileName) {
    Optional<File> optionalFile = Optional.ofNullable(fileToRename);
    File finalBikFileName = new File(newFileName);
    if (!optionalFile.orElseThrow(() -> new RuntimeException("File couldn't be found or doesn't exist")).renameTo(finalBikFileName)) {
        throw new RuntimeException("File couldn't be saved - already exists or some other issue");
    }
}
This is a normal class in an application deployed on WildFly. I have tested it in many ways. If I comment out the write function, then the rename function works properly. But if I first write something to the file and then try to rename it, I get "action cannot be completed because the file is open in another program". I also can't touch the file in Windows Explorer: I can't rename or delete it. What could be the reason? How can I unlock it?
1) Are different threads (or server requests) calling the writeToFile and renameFile methods, or are both methods called one after the other on the same thread/request?
2) How much data (content.length, I mean) are you writing? I just want to make sure the SYNC is done before the RENAME.
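Whatever the answers, one thing worth trying is replacing File.renameTo with java.nio's Files.move: renameTo only returns false on failure, while Files.move throws a descriptive exception, so a lingering open handle shows up as a FileSystemException with a cause instead of a silent failure. A sketch with the same null-check semantics as the original (the class name is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Optional;

public class Renamer {
    public static void renameFile(Path fileToRename, String newFileName) {
        Path source = Optional.ofNullable(fileToRename)
                .orElseThrow(() -> new RuntimeException("File couldn't be found or doesn't exist"));
        try {
            // Unlike renameTo's bare boolean, Files.move surfaces the
            // underlying cause (e.g. a handle still held on the file).
            Files.move(source, source.resolveSibling(newFileName));
        } catch (IOException e) {
            throw new RuntimeException("File couldn't be renamed", e);
        }
    }
}
```

At minimum the exception message should tell you which process or handle is blocking the rename.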
Let's have a Maven project with resources in the following structure:
src/main/resources
  dir1
    subdir1
      files...
    subdir2
      files...
  dir2
Although I found many answers on how to list file resources, I am stuck finding a way to list all direct subdirectories of a directory (or path) in resources that works both when the app is run from the IDE (IntelliJ) and from a jar.
The goal is to get names of subdirectories of directory "dir1":
subdir1, subdir2
I tried
Thread.currentThread().getContextClassLoader().getResourceAsStream("dir1");
and to use the returned InputStream (containing the names of the subdirectories), but it does not work from a jar. It works from IntelliJ.
I also tried to use the Spring framework, but
PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
Resource[] resources = resolver.getResources("dir1/*");
does not work from a jar either: the resources array is empty. It works from IntelliJ.
I tried to modify to
Resource[] resources = resolver.getResources("dir1/*/");
and it works from the jar, but does not work from IntelliJ.
When I make it work from the jar, I break the other way, and vice versa. My last idea is to use
Resource[] resources = resolver.getResources("dir1/**");
to get all resources under the directory and then keep only the ones that are required. Is there a better (preferably not hacky) way?
I finally ended up with this (using Spring):
public class ResourceScanner {
    private PathMatchingResourcePatternResolver resourcePatternResolver;

    public ResourceScanner() {
        this.resourcePatternResolver = new PathMatchingResourcePatternResolver();
    }

    public Resource getResource(String path) {
        path = path.replace("\\", "/");
        return resourcePatternResolver.getResource(path);
    }

    public Resource[] getResources(String path) throws IOException {
        path = path.replace("\\", "/");
        return resourcePatternResolver.getResources(path);
    }

    public Resource[] getResourcesIn(String path) throws IOException {
        // Get root dir URI
        Resource root = getResource(path);
        String rootUri = root.getURI().toString();
        // Search all resources under the root dir
        path = (path.endsWith("/")) ? path + "**" : path + "/**";
        // Filter only direct children
        return Arrays.stream(getResources(path)).filter(resource -> {
            try {
                String uri = resource.getURI().toString();
                boolean isChild = uri.length() > rootUri.length() && !uri.equals(rootUri + "/");
                if (isChild) {
                    boolean isDirInside = uri.indexOf("/", rootUri.length() + 1) == uri.length() - 1;
                    boolean isFileInside = uri.indexOf("/", rootUri.length() + 1) == -1;
                    return isDirInside || isFileInside;
                }
                return false;
            } catch (IOException e) {
                return false;
            }
        }).toArray(Resource[]::new);
    }

    public String[] getResourcesNamesIn(String path) throws IOException {
        // Get root dir URI
        Resource root = getResource(path);
        String rootUri = URLDecoder.decode(root.getURI().toString().endsWith("/")
                ? root.getURI().toString()
                : root.getURI().toString() + "/", "UTF-8");
        // Get direct children names
        return Arrays.stream(getResourcesIn(path)).map(resource -> {
            try {
                String uri = URLDecoder.decode(resource.getURI().toString(), "UTF-8");
                boolean isFile = uri.indexOf("/", rootUri.length()) == -1;
                if (isFile) {
                    return uri.substring(rootUri.length());
                } else {
                    return uri.substring(rootUri.length(), uri.indexOf("/", rootUri.length() + 1));
                }
            } catch (IOException e) {
                return null;
            }
        }).toArray(String[]::new);
    }
}
The problem with a jar is that there are no folders as such; there are entries, and folders are just entries with a trailing slash in their names. This is probably why dir1/*/ returns the folders from the jar but not from the IDE, where the folder name does not end with /. For a similar reason, dir1/* works in the IDE but not in the jar since again, in the jar, the folder name ends with a slash.
Spring handles resources based on files; if you try to get an empty folder as a resource it will fail, and an empty folder normally is not added to the jar anyway. Also, looking for folders is not that common, since the content is in files, and the only useful information is which files a folder contains.
The easiest way to achieve this is to use the dir1/** pattern, since it will list all the files, and folders with files, under the dir1 directory.
"Detecting" whether the program is being executed from a jar file, in order to choose a different strategy to list the folders, can also be done, but that solution is even more hacky.
Here is an example using the first option, the dir1/** pattern (just in case it is needed):
String folderName = "dir1";
String folderPath = String.format("%s/**", folderName);
Resource[] resources = resolver.getResources(folderPath);
Map<String, Resource> resourceByURI = Arrays.stream(resources)
        .collect(Collectors.toMap(resource -> {
            try {
                return resource.getURI().toString();
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }, Function.identity()));
The resourceByURI is a map containing the Resource, identified by its URI.
Resource folder = resolver.getResource(folderName);
int folderLength = folder.getURI().toString().length();
Map<String, String> subdirectoryByName = resourceByURI.keySet().stream()
        .filter(name -> name.length() > folderLength
                && name.indexOf("/", folderLength + 1) == name.length() - 1)
        .collect(Collectors.toMap(Function.identity(), name -> name.substring(folderLength,
                name.indexOf("/", folderLength + 1))));
Then you can get the URI of the folder to calculate the offset of the path (similar to what Spring does), check that each name is longer than the current folder's (to discard the folder itself in the jar case) and that the name contains a single / at the end, and finally collect into a Map again, identified by the URI and containing the names of the folders. The collection of folder names is subdirectoryByName.values().
It seems like the following works:
public String[] getResourcesNames(String path) {
    try {
        URL url = getClassLoader().getResource(path);
        if (url == null) {
            return null;
        }
        URI uri = url.toURI();
        if (uri.getScheme().equals("jar")) { // Run from jar
            try (FileSystem fileSystem = FileSystems.newFileSystem(uri, Collections.emptyMap())) {
                Path resourcePath = fileSystem.getPath(path);
                // Get all contents of the resource (skip the resource itself);
                // if an entry is a directory, remove the trailing /
                List<String> resourcesNames =
                        Files.walk(resourcePath, 1)
                             .skip(1)
                             .map(p -> {
                                 String name = p.getFileName().toString();
                                 if (name.endsWith("/")) {
                                     name = name.substring(0, name.length() - 1);
                                 }
                                 return name;
                             })
                             .sorted()
                             .collect(Collectors.toList());
                return resourcesNames.toArray(new String[resourcesNames.size()]);
            }
        } else { // Run from IDE
            File resource = new File(uri);
            return resource.list();
        }
    } catch (IOException | URISyntaxException e) {
        return null;
    }
}

private ClassLoader getClassLoader() {
    return Thread.currentThread().getContextClassLoader();
}
Thanks Lubik for your first post; it works very well with Spring Boot 2.1, whether run from the IDE or from a jar.
I needed to copy and read some resource files from a jar (or the IDE in dev), returning Resource[] instead, and then copy them wherever you want on disk via resource.getInputStream():
for (Resource resource : mapResources) {
    Path destPath = someFolder.resolve(resource.getFilename());
    Files.copy(resource.getInputStream(), destPath);
}
How can I get all subfolders of some folder? I am using JDK 8 and nio.
For example, for the folder "Designs.ipj" the method should return {"Workspace", "Library1"}.
Thank you in advance!
List<Path> subfolders = Files.walk(folderPath, 1)
        .filter(Files::isDirectory)
        .collect(Collectors.toList());
It will contain folderPath itself and all subfolders at depth 1 (note that Files.walk throws IOException). If you need only the subfolders, just remove the first element:
subfolders.remove(0);
You have to read all the items in a folder and filter out the directories, repeating this process as many times as needed.
To do this you could use listFiles():
File folder = new File("your/path");
Stack<File> stack = new Stack<File>();
Stack<File> folders = new Stack<File>();
stack.push(folder);
while (!stack.isEmpty())
{
    File child = stack.pop();
    File[] listFiles = child.listFiles();
    folders.push(child);
    for (File file : listFiles)
    {
        if (file.isDirectory())
        {
            stack.push(file);
        }
    }
}
see
Getting the filenames of all files in a folder
A simple recursive function would also work, just make sure to be wary of infinite loops.
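A sketch of such a recursive function (the class name is illustrative; note the comment about symlinks, which are the usual source of the infinite loops mentioned above):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class RecursiveFolders {
    // Recursively collects every subdirectory beneath root.
    // Caution: following symlinked directories can loop forever;
    // guard against that if your tree may contain links.
    public static List<File> collectFolders(File root) {
        List<File> result = new ArrayList<>();
        File[] children = root.listFiles();
        if (children == null) {
            return result; // not a directory, or an I/O error occurred
        }
        for (File child : children) {
            if (child.isDirectory()) {
                result.add(child);
                result.addAll(collectFolders(child));
            }
        }
        return result;
    }
}
```

The null check matters: listFiles() returns null rather than throwing when the path is not a readable directory.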
However, I am a bit more partial to DirectoryStream. It allows you to create a filter so that you only add the items that fit your specifications.
DirectoryStream.Filter<Path> visibleFilter = new DirectoryStream.Filter<Path>()
{
    @Override
    public boolean accept(Path file)
    {
        return Files.isDirectory(file);
    }
};
try (DirectoryStream<Path> stream = Files.newDirectoryStream(directory.toPath(), visibleFilter))
{
    for (Path file : stream)
    {
        folders.push(file.toFile());
    }
}
catch (IOException e)
{
    e.printStackTrace();
}
When I start my application I create a temp folder:
public static File createTempDir(String name) throws IOException {
    File tempDir = File.createTempFile(name, "");
    if (!(tempDir.delete())) {
        throw new IOException("could not delete " + tempDir.getAbsolutePath());
    }
    if (!(tempDir.mkdir())) {
        throw new IOException("could not create " + tempDir.getAbsolutePath());
    }
    tempDir.deleteOnExit();
    return tempDir;
}
During a session a user might load a file. As a result, the old temp dir is deleted and a new one is created based on the ID of the loaded file.
During the load, where the old temp dir is deleted, I sometimes get:
java.io.IOException: Unable to delete file:
Here is how the old temp folder is deleted:
public void cleanup(String tmpPath) {
    File tmpFolder = new File(tmpPath);
    if (tmpFolder.isDirectory()) {
        try {
            FileUtils.deleteDirectory(tmpFolder);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
where FileUtils is: org.apache.commons.io.FileUtils. Typically the content of the temp folder is:
mytempfolder_uuid
|-> mysubfolder
|-> myImage.jpg
And the error is:
java.io.IOException: Unable to delete file: C:\Users\xxx\AppData\Local\Temp\mytempfolder_uuid\mysubfolder\myImage.jpg
I have tried to debug the application, and before the delete operation is executed I verified that the above image is actually located in the specified folder.
The nasty thing is that it only happens sometimes. I have made sure not to have the folder or files in the temp folder open in any other application. Any ideas or suggestions?
You cannot delete files which are open and you can't delete a directory which contains a file. You have to ensure all files in the directory are closed.
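A minimal illustration of the principle, independent of the application above: the try-with-resources block guarantees the output stream is closed before the delete is attempted, which is exactly what Windows requires before releasing the file.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class CloseBeforeDelete {
    public static boolean writeAndDelete(Path file) throws IOException {
        // try-with-resources closes the handle when the block exits;
        // deleting while the stream were still open would fail on Windows.
        try (OutputStream out = Files.newOutputStream(file)) {
            out.write("temp data".getBytes());
        }
        return Files.deleteIfExists(file);
    }
}
```

If the delete still fails after every stream is provably closed, some other process (antivirus scanners are a common culprit on Windows temp directories) is holding the file.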
I'd suggest you use the Guava library. It has a method Files.createTempDir() that does exactly what you seem to need:
Atomically creates a new directory somewhere beneath the system's
temporary directory (as defined by the java.io.tmpdir system
property), and returns its name. Use this method instead of
File.createTempFile(String, String) when you wish to create a
directory, not a regular file. A common pitfall is to call
createTempFile, delete the file and create a directory in its place,
but this leads to a race condition which can be exploited to create
security vulnerabilities, especially when executable files are to be
written into the directory. This method assumes that the temporary
volume is writable, has free inodes and free blocks, and that it will
not be called thousands of times per second.
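If you would rather avoid the extra dependency, java.nio has offered the same atomic behaviour since Java 7 via Files.createTempDirectory. A minimal sketch:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempDirExample {
    // Atomically creates a fresh directory under java.io.tmpdir,
    // avoiding the createTempFile-delete-mkdir race entirely.
    public static Path createTempDir(String prefix) throws IOException {
        return Files.createTempDirectory(prefix);
    }
}
```

This is a drop-in replacement for the createTempDir method in the question, minus the deleteOnExit call (which does not work for non-empty directories anyway).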
Try deleting the files in the temp folder before deleting the folder itself. Try something like:
private boolean deleteFolder(File path) {
    if (path.exists()) {
        File[] files = path.listFiles();
        for (File f : files) {
            if (f.isDirectory()) {
                deleteFolder(f);
            } else {
                f.delete();
            }
        }
    }
    return path.delete();
}
Also, using deleteOnExit is not a very good idea...
Cheers!
public static boolean deleteDir(String path)
{
    java.io.File dir = new java.io.File(path);
    if (dir.isDirectory())
    {
        String[] filesList = dir.list();
        for (String s : filesList)
        {
            java.io.File entry = new java.io.File(dir, s);
            // Recurse into subdirectories; a plain delete() fails on non-empty ones
            boolean success = entry.isDirectory() ? deleteDir(entry.getPath()) : entry.delete();
            if (!success)
            {
                return false;
            }
        }
    }
    return dir.delete();
}
and then you can use it like: deleteDir("C:\\MyFolder\\subFolder\\")