I have some data stored in a file. I have to read some data from the file, do something with it, write the result to a new file, and then read, calculate, and write again, and so on.
My problem is that I have a method for reading bytes from the file, and every time this method is called I open the file, read, and then close it. The same thing happens when writing.
I think this is why my app is very slow, since opening and closing files surely takes some time.
For reading I'm using the RandomAccessFile class, and FileWriter for writing. Is there any way to keep both files open from the first read and write, and only close them at the end?
It won't be the opening and closing of the file that causes the slowness. However, reading and writing files can be slow, especially when writing to the SD card. Just opening the file should be pretty quick.
Also, for writing, make sure you use a buffered writer somewhere in the chain. It will greatly increase your speed if you aren't writing the whole file as one big block. If you're reading the entire file, you should use a buffered reader as well.
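As a rough illustration of what the question asks for, here is a minimal sketch that opens both files once, reuses them for every read-calculate-write iteration, and only closes them at the end. The file names, chunk size, and process step are placeholders, not taken from the question.

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.io.RandomAccessFile;

public class ReadCalculateWrite {
    public static void main(String[] args) throws IOException {
        // Open both files once; they stay open for the whole loop.
        try (RandomAccessFile in = new RandomAccessFile("input.dat", "r");
             BufferedWriter out = new BufferedWriter(new FileWriter("output.dat"))) {

            byte[] chunk = new byte[4096];              // placeholder chunk size
            int read;
            while ((read = in.read(chunk)) != -1) {
                out.write(process(chunk, read));        // hypothetical calculation step
            }
        } // both files are closed here, exactly once
    }

    private static String process(byte[] data, int length) {
        // Stand-in for the real calculation.
        return new String(data, 0, length);
    }
}
```

The BufferedWriter does the batching mentioned above, so individual write calls don't each hit the disk.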
I am currently reading a CSV file from a zip file. I have 2 options to read it.
Stream the CSV and read it line by line while it is being unzipped (ZipInputStream and openCSV; a rough sketch is shown below).
Unzip the entire CSV file first, and then go back and read the entire thing.
Which one would be faster? I am going to perform some tests but I was wondering if anyone already knew logistically which is more efficient. Thanks!
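For reference, option 1 would look roughly like this. This is only a sketch: it uses a plain BufferedReader over the ZipInputStream instead of openCSV, and the zip and entry names are made up.

```java
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class StreamCsvFromZip {
    public static void main(String[] args) throws IOException {
        try (ZipInputStream zip = new ZipInputStream(new FileInputStream("data.zip"))) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                if (!entry.getName().endsWith(".csv")) {
                    continue;
                }
                // Read the CSV line by line as it is decompressed; nothing is written to disk.
                // Note: don't close this reader here, or it would close the ZipInputStream too.
                BufferedReader reader =
                        new BufferedReader(new InputStreamReader(zip, StandardCharsets.UTF_8));
                String line;
                while ((line = reader.readLine()) != null) {
                    // handle one CSV line
                }
            }
        }
    }
}
```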
I am trying to use Files.readAllLines() to read all the lines of a file into a Scala List.
In one case, I need to delete the file once it has been read into a List. The problem is that the readAllLines method doesn't appear to close the file stream afterwards, and because of that I am not able to delete the file after reading it.
This is how I am reading the file:
Files.readAllLines(Paths.get(tempFile)).asScala.toList
And after a few lines, I am doing this to delete it.
Files.delete(Paths.get(tempFile))
Any ideas or suggestions as to how this can be prevented?
Edit:
The exception message as requested.
The process cannot access the file because it is being used by another process
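For reference, this is roughly the same read-then-delete sequence with the file handle scoped explicitly by try-with-resources, written in plain Java for illustration. It is only a sketch; whether it avoids the error depends on what is actually holding the handle (on Windows it may be another process entirely).

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ReadThenDelete {
    static List<String> readAndDelete(String tempFile) throws IOException {
        Path path = Paths.get(tempFile);
        List<String> lines;
        try (Stream<String> stream = Files.lines(path)) {
            lines = stream.collect(Collectors.toList());
        } // the underlying file handle is released here
        Files.delete(path);
        return lines;
    }
}
```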
I have a file which I need to upload to a service, and parse into relevant data. The parser and the uploader both require an InputStream. Ought I to open the file twice? I could save the file to a String, but having many of these files in memory is concerning.
EDIT: Thought I should make it clear that the parsing and uploading are entirely separate processes.
Since you are parsing it anyway, it would be most efficient to load the file into a string. Parse it into indexes into that string; you will save memory and can upload the string whenever you want. This is probably the most effective approach in terms of memory, though maybe not processing time.
A reply to one of the comments above:
"Separate processes" does not mean different threads or operating-system processes, just that they do not need each other to operate.
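Building on that suggestion, one way to open the file only once is to read it into memory and hand each consumer its own in-memory stream. The sketch below uses a byte array rather than a String, since both the parser and the uploader want an InputStream; the file name and the parse/upload calls are placeholders.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ShareFileBytes {
    public static void main(String[] args) throws IOException {
        // Read the file from disk exactly once.
        byte[] bytes = Files.readAllBytes(Paths.get("upload.dat"));   // placeholder name

        // Each consumer gets its own independent stream over the same bytes.
        try (InputStream forParser = new ByteArrayInputStream(bytes);
             InputStream forUploader = new ByteArrayInputStream(bytes)) {
            parse(forParser);      // hypothetical parser call
            upload(forUploader);   // hypothetical uploader call
        }
    }

    private static void parse(InputStream in) { /* stand-in for the real parser */ }

    private static void upload(InputStream in) { /* stand-in for the real uploader */ }
}
```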
Using BufferedWriter.write(), when is a file created?
I know from the docs that when the buffer is filled it will be flushed to the file. Does this mean that:
every time the buffer is filled, an incomplete file will appear on my file system?
or that the file is only created when the BufferedWriter is closed?
My concern is that I am writing files to a directory using a BufferedWriter and another process is polling the directory for new files and reading them. I do not want an incomplete file to be created and be read by the other process.
Using BufferedWriter.write(), when is a file created?
Never. BufferedWriter itself just writes to another Writer. Now, if you're using a FileOutputStream or a FileWriter (where the first would probably be wrapped in an OutputStreamWriter), the file is created (or opened for writing if it already exists) when you construct the object, i.e. before you've actually written any data.
My concern is that I am writing files to a directory using a BufferedWriter and another process is polling the directory for new files and reading them. I do not want an incomplete file to be created and be read by the other process.
One typical way of handling this is to write to a staging area and then rename the file into the correct place, which is usually an atomic operation. Or even write the file into the correct directory, but with a file extension which the polling process won't spot - and then rename the file to the final filename afterwards.
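A minimal sketch of the staging-plus-rename approach might look like this. The directory and file names are illustrative, and ATOMIC_MOVE isn't supported on every file system (Files.move then throws AtomicMoveNotSupportedException).

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class StagedWrite {
    public static void main(String[] args) throws IOException {
        Path staging = Paths.get("outbox/data.csv.tmp");   // extension the poller ignores
        Path target  = Paths.get("outbox/data.csv");       // name the poller looks for

        try (BufferedWriter writer = Files.newBufferedWriter(staging, StandardCharsets.UTF_8)) {
            writer.write("complete contents go here");
        }

        // Only now does a file with the final name appear, and it is already complete.
        Files.move(staging, target, StandardCopyOption.ATOMIC_MOVE);
    }
}
```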
As Jon Skeet said, BufferedWriter doesn't create a file, and you cannot guarantee that another process won't read an incomplete file while it is being written to disk. But there are two things you can do:
Lock the file so that the other process cannot read it before writing is complete. There are several questions concerning file locking in Java on this site (search for "[java] lock file"); a rough sketch of this option follows below.
Create the file with another filename (i.e. use an extension that the other process is not looking for) and rename it when writing is finished.
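As an illustration of the first option, here is a minimal sketch using FileChannel and FileLock. The file name and contents are placeholders, and note that on many platforms the lock is only advisory, so the reading process must also try to acquire a lock.

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;
import java.nio.charset.StandardCharsets;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class LockedWrite {
    public static void main(String[] args) throws IOException {
        // Hold an exclusive lock for the duration of the write.
        try (FileChannel channel = FileChannel.open(Paths.get("outbox/data.csv"),
                     StandardOpenOption.CREATE, StandardOpenOption.WRITE);
             FileLock lock = channel.lock()) {
            channel.write(ByteBuffer.wrap(
                    "complete contents go here".getBytes(StandardCharsets.UTF_8)));
        } // lock released and channel closed here
    }
}
```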
I have a question concerning Java and file input/output.
For a specific task, I have to transfer a file (an Excel file, to be precise) while it is open.
Imagine the following scenario:
An Excel file is open and in use by one user. From time to time the user saves the file manually. Now I want to write a Java program which reads the file and transfers it over a socket every 30 seconds. No problem so far. My question: what happens if the user saves the document at the exact moment my program wants to read the file? Could this cause any trouble?
I don't know if it matters, but I'm using a BufferedInputStream to read the file.
My question: what happens if the user saves the document at the exact moment my program wants to read the file? Could this cause any trouble?
Yes.
One or more of the following things could happen, depending on your platform and on the way the Excel file is saved.
If Excel uses locking, then either Excel or the program trying to read the file could get an error saying that the file is in use.
If Excel does a rewrite in place and doesn't lock the file, then the program trying to read the file could see a truncated Excel file.
If Excel writes a new file and renames it, the program trying to read the file could see a state where the file apparently does not exist.
It could work.
In short, the program doing the reading needs to be very defensive; a rough sketch of what that might look like is at the end of this answer.
I don't know if it matters, but I'm using a BufferedInputStream to read the file.
That's irrelevant, I think.
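What "defensive" means in practice is a judgment call. One common pattern, sketched below under the assumption that a simple retry-and-recheck is acceptable, is to retry on I/O errors and to discard a read if the file's size changed while it was being read; it is not a guarantee against every failure mode.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class DefensiveRead {
    static byte[] readWithRetry(Path file, int maxAttempts) throws InterruptedException {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                long sizeBefore = Files.size(file);
                byte[] data = Files.readAllBytes(file);
                // If the size is unchanged, assume no save happened mid-read.
                if (data.length == sizeBefore && Files.size(file) == sizeBefore) {
                    return data;
                }
            } catch (IOException e) {
                // File locked, missing, or mid-rename; fall through and retry.
            }
            Thread.sleep(500);   // wait before the next attempt
        }
        return null;   // caller decides what to do after repeated failures
    }
}
```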
AFAIK, the behaviour will depend on your underlying filesystem / operating system. A Unix system typically keeps an "unnamed" copy of the file being read and starts creating a new file for the "being written" copy, using inode trickery. An old Windows system would likely reply that the file cannot be written to because it is locked. I don't know about modern Windows systems.
What you can do, I think, is always check the state of the file before you do anything with it. As has been said in some earlier posts, it all depends on the underlying platform, and you should employ a lot of defensive programming...