Java Files.readAllLines() is failing to close the file stream - java

I am trying to use Files.readAllLines() to read all the lines of a file into a Scala List.
In one case, I need to delete the file once it has been read into a List. But the problem is that the readAllLines method doesn't appear to be closing the file stream afterwards, and because of that I am not able to delete the file after reading it.
This is how I am reading the file:
Files.readAllLines(Paths.get(tempFile)).asScala.toList
And a few lines later, I am doing this to delete it:
Files.delete(Paths.get(tempFile))
Any ideas or suggestions as to how this can be prevented?
Edit:
The exception message, as requested:
The process cannot access the file because it is being used by another process
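For reference, here is the same sequence as a minimal, self-contained Java sketch (the path is a made-up stand-in for tempFile; the original code additionally converts the result with .asScala.toList):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;

    public class ReadThenDeleteDemo {
        public static void main(String[] args) throws IOException {
            // Hypothetical path standing in for `tempFile` from the question.
            Path path = Paths.get("C:/tmp/data.txt");

            // Read every line into memory.
            List<String> lines = Files.readAllLines(path);

            // ... work with `lines` ...

            // A few lines later, delete the file. This is the call that fails with
            // "The process cannot access the file because it is being used by another process".
            Files.delete(path);
        }
    }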

Related

Is it faster to read a Zip file while unzipping it (in a stream) or wait for it complete unzipping and then read it?

I am currently reading a CSV file from a zip file. I have 2 options to read it.
Read the CSV while it is being unzipped line by line by streaming it (ZipInputStream and openCSV)
Unzip the entire CSV file first, and then go back and read the entire thing.
Which one would be faster? I am going to run some tests, but I was wondering if anyone already knows which approach is generally more efficient. Thanks!
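Not an answer on timing, but for concreteness, here is a minimal sketch of option 1 using only the standard library's ZipInputStream (the file names are made up, and openCSV is left out to keep the example self-contained):

    import java.io.BufferedReader;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;

    public class ZipCsvStreamDemo {
        public static void main(String[] args) throws IOException {
            try (ZipInputStream zis = new ZipInputStream(new FileInputStream("data.zip"))) {
                ZipEntry entry;
                while ((entry = zis.getNextEntry()) != null) {
                    if (!entry.getName().endsWith(".csv")) {
                        continue; // skip non-CSV entries
                    }
                    // Do not close this reader: that would close the underlying ZipInputStream.
                    BufferedReader reader =
                            new BufferedReader(new InputStreamReader(zis, StandardCharsets.UTF_8));
                    String line;
                    while ((line = reader.readLine()) != null) {
                        // Each line is available as soon as it has been decompressed,
                        // without writing the unzipped CSV to disk first.
                        System.out.println(line);
                    }
                }
            }
        }
    }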

Java Servlet 3.0 File Upload to input stream - without intermediate folders or files being created

I don't know how to do this, or whether it is possible or wise, so any form of answer that points me to a library, example or reasoning will be helpful.
I need to upload and process some Java XML files (actually, XSLT files - XML Excel files).
I don't want to store the file on the server and then invoke processing on it. Instead, I want to stream the file in and process it as a stream.
I also want to be able to process multipart file uploads, but still process that as an input stream.
I am expressly trying to avoid creating a file on disk for this.
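One possible direction (a sketch only; the servlet path and the part name "file" are assumptions for illustration) is the Servlet 3.0 multipart API, which exposes each uploaded part as an InputStream:

    import java.io.IOException;
    import java.io.InputStream;

    import javax.servlet.ServletException;
    import javax.servlet.annotation.MultipartConfig;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.Part;

    @WebServlet("/upload")
    @MultipartConfig
    public class StreamingUploadServlet extends HttpServlet {
        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            Part part = req.getPart("file"); // one part of the multipart request
            try (InputStream in = part.getInputStream()) {
                // Feed the stream directly into the XML/XSLT processing,
                // e.g. via javax.xml.transform.stream.StreamSource.
                // Note: depending on @MultipartConfig's fileSizeThreshold and the
                // container, the part may still be buffered to a temporary file.
            }
        }
    }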

Updating a Jar file with updates in input Excel

I need to create a Jar file which will read an Excel file and display, as output, the existing data and the updated data.
This file needs to keep on running and displaying the Excel data as output. Any update that has been done on the Excel recently needs to be reflected in the output, along with the previous data.
I know how to create a Jar file; I am also able to read an Excel file using Apache POI.
I just need an idea of how, on each run, any updated values in the Excel file can be displayed.
Do we need to implement threading/synchronization? If so, how?
Synchronization only works inside your Java process. Assuming that an external process creates/updates the Excel file, synchronization will not help you.
The best chance you have is to listen for file-system changes to the Excel file (see the WatchService class) and access the file after it has been changed.
To avoid (or at least minimize) file-access conflicts, I would open the file, copy the data to memory and then immediately close the file.
Alternatively you could copy the file and then operate on the copy. In both cases conflicts can still occur if the program writing the Excel file tries to perform changes while you are accessing it.
Potential errors are a locked file or inconsistent data.
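A minimal sketch of that approach (the directory and file names are made up; the WatchService watches the containing directory, and the file is copied into memory right after each change):

    import java.io.IOException;
    import java.nio.file.FileSystems;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardWatchEventKinds;
    import java.nio.file.WatchEvent;
    import java.nio.file.WatchKey;
    import java.nio.file.WatchService;

    public class ExcelChangeWatcher {
        public static void main(String[] args) throws IOException, InterruptedException {
            Path dir = Paths.get("C:/data");             // hypothetical directory
            Path excel = dir.resolve("input.xlsx");      // hypothetical Excel file

            try (WatchService watcher = FileSystems.getDefault().newWatchService()) {
                // A WatchService watches directories, not individual files.
                dir.register(watcher, StandardWatchEventKinds.ENTRY_MODIFY,
                        StandardWatchEventKinds.ENTRY_CREATE);
                while (true) {
                    WatchKey key = watcher.take();       // blocks until something changes
                    for (WatchEvent<?> event : key.pollEvents()) {
                        Path changed = (Path) event.context();
                        if (changed.equals(excel.getFileName())) {
                            // Open, copy the data into memory, and close again as fast as possible.
                            byte[] snapshot = Files.readAllBytes(excel);
                            // ... hand `snapshot` to Apache POI, e.g. via a ByteArrayInputStream
                        }
                    }
                    key.reset();                         // re-arm the key for further events
                }
            }
        }
    }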

When does BufferedWriter create file?

Using BufferedWriter.write() when is a file created?
I know from the docs that when the buffer is filled it will flush to the file. Does this mean that:
every time the buffer is filled, an incomplete file will appear on my file system?
or that the file is only created when the BufferedWriter is closed?
My concern is that I am writing files to a directory using a BufferedWriter and another process is polling the directory for new files and reading them. I do not want an incomplete file to be created and be read by the other process.
Using BufferedWriter.write() when is a file created?
Never. BufferedWriter itself just writes to another Writer. Now if you're using a FileOutputStream or a FileWriter (where the first would probably be wrapped in an OutputStreamWriter) the file is created (or opened for write if it already exists) when you construct the object, i.e. before you've actually written any data.
My concern is that I am writing files to a directory using a BufferedWriter and another process is polling the directory for new files and reading them. I do not want an incomplete file to be created and be read by the other process.
One typical way of handling this is to write to a staging area and then rename the file into the correct place, which is usually an atomic operation. Or even write the file into the correct directory, but with a file extension which the polling process won't spot - and then rename the file to the final filename afterwards.
BufferedWriter doesn't create a file as Jon Skeet said. And you cannot guarantee that another process won't read an incomplete file when it is being written to disk. But there are two things you can do:
Lock the file so that the other process cannot read it before writing is complete. There are several questions concerning file locking in Java on this site (search for "[java] lock file").
Create the file with another filename (i.e. use an extension that is not being looked for by the other process) and rename it when writing is finished, as sketched below.
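A minimal sketch of the rename approach (the directory and file names are made up; the polling process is assumed to ignore the .part extension):

    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class StagedFileWriter {
        public static void main(String[] args) throws IOException {
            Path targetDir = Paths.get("/data/incoming");        // hypothetical directory
            Path staged = targetDir.resolve("report.csv.part");  // name the poller ignores
            Path finalFile = targetDir.resolve("report.csv");    // name the poller picks up

            // Write and fully close the file under the staging name first.
            try (BufferedWriter writer = Files.newBufferedWriter(staged, StandardCharsets.UTF_8)) {
                writer.write("first line");
                writer.newLine();
                writer.write("second line");
            }

            // Then rename it into place. On the same file system this is usually atomic,
            // so the polling process never sees a half-written file.
            Files.move(staged, finalFile, StandardCopyOption.ATOMIC_MOVE);
        }
    }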

Read a file with java while it is saved manually

I have a question concerning java and file input/output.
For a specific task, I have to transfer a file (Excel, to be precise) while it is open.
Imagine the following scenario:
An Excel file is opened and used by one user. From time to time the file is saved manually by the user. Now I want to write a Java program which reads the file and transfers it over a socket every 30 seconds. No problem so far. My question: what happens if the user saves the document at the exact moment my program wants to read the file? Could this cause any trouble?
I don't know if it matters, but I'm using a BufferedInputStream to read the file.
My question: what happens if the user saves the document at the exact moment my program wants to read the file? Could this cause any trouble?
Yes.
One or more of the following things could happen depending on your platform, and the way that the Excel file is saved.
If Excel uses locking, then either Excel or the program trying to read the file could get an error saying that the file is in use.
If Excel does a rewrite in place and doesn't lock the file, then the program trying to read the file could see a truncated Excel file.
If Excel writes a new file and renames it, the program trying to read the file could see a state where the file apparently does not exist.
It could work.
In short, the program doing the reading needs to be very defensive ...
I don't know if it matters, but I'm using a BufferedInputStream to read the file.
That's irrelevant I think.
AFAIK, the behaviour will depend on your underlying filesystem / operating system. A Unix system typically keeps an "un-named" copy of the file being read and starts creating a new file for the "being written" new copy, using inode trickery. An old Windows system would likely reply that the file cannot be written to because it is locked. I don't know about modern Windows systems.
What you can do, I think, is to always check the state of the file before you do anything with it. As has been said in earlier posts, it all depends on the underlying platform, and you should employ a lot of defensive programming...
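As one illustration of that defensive style (a sketch under the assumption that a failed or locked read should simply be retried a few times; the path and the retry numbers are made up):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class DefensiveFileReader {

        // Try to snapshot the file, retrying if it is locked or being saved right now.
        static byte[] readWithRetries(Path file, int attempts, long delayMillis)
                throws IOException, InterruptedException {
            IOException last = null;
            for (int i = 0; i < attempts; i++) {
                try {
                    // The file may briefly "disappear" during a save-and-rename.
                    if (Files.exists(file)) {
                        return Files.readAllBytes(file);
                    }
                } catch (IOException e) {
                    last = e; // e.g. the file is locked by Excel on Windows
                }
                Thread.sleep(delayMillis);
            }
            throw last != null ? last : new IOException("File not readable: " + file);
        }

        public static void main(String[] args) throws Exception {
            // Read the whole file into memory, then send it over the socket.
            byte[] data = readWithRetries(Paths.get("C:/shared/report.xlsx"), 5, 500);
            // ... write `data` to the socket every 30 seconds
        }
    }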
