I'm trying to transfer the contents of a file filein, starting at position 1300 (in uint8 pieces), into fileto, using the transferFrom function of the FileChannel obtained from a RandomAccessFile.
n = 1300; % start offset in bytes, per the question
fromfile = java.io.RandomAccessFile(ifile, 'rw');
fromchannel = fromfile.getChannel();
tofile = java.io.RandomAccessFile(ofile, 'rw');
tochannel = tofile.getChannel();
tochannel.transferFrom(fromchannel, n, fromfile.length() - n)
tochannel.close();
fromchannel.close();
fromfile.close();
tofile.close();
My output file is just empty, though.
Does anyone know what I'm doing wrong?
Edit 1:
I've changed
tochannel.transferFrom(fromchannel,n,fromfile.length()-n)
to
fromchannel.transferTo(n,fromfile.length()-n,tochannel)
But now the output file is mostly correct, except that it has a long run of 00 bytes where the header in the original file was. Why is that?
You want to use transferTo, I believe:
fromchannel.transferTo(n, fromfile.length() - n, tochannel)
transferFrom(src, n, count) starts writing at position n in the output file, while transferTo(n, count, target) starts reading at position n in the input file. In fact, since your output file was empty, position n was past its end, and transferFrom transfers nothing in that case, which is why the file stayed empty.
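Here's a minimal runnable sketch in plain Java (the file names and the offset n = 1300 come from the question; the loop allows for transferTo moving fewer bytes than requested):

import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;

public class TailCopy {
    public static void main(String[] args) throws Exception {
        long n = 1300; // start offset in the source file
        try (RandomAccessFile fromFile = new RandomAccessFile("filein", "r");
             RandomAccessFile toFile = new RandomAccessFile("fileto", "rw");
             FileChannel fromChannel = fromFile.getChannel();
             FileChannel toChannel = toFile.getChannel()) {
            long position = n;
            long remaining = fromFile.length() - n;
            // transferTo reads from 'position' in the SOURCE channel;
            // transferFrom(src, pos, count) would write at 'pos' in the
            // DESTINATION instead, transferring nothing if pos is past
            // the destination's current size.
            while (remaining > 0) {
                long moved = fromChannel.transferTo(position, remaining, toChannel);
                position += moved;
                remaining -= moved;
            }
        }
    }
}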
Related
I have a .txt file that will be accessed by many users, possibly at the same time (or close to it), and because of that I need a way to modify that txt file without creating a temporary file. I haven't found an answer or solution to this. So far, I've only found this approach:
Take existing file -> modify something -> write it to a new file (temp file) -> delete the old file.
But this approach is no good to me; I need something like: Take existing file -> modify it -> save it.
Is this possible? I'm really sorry if this question already exists; I tried searching Stack Overflow and read through the Oracle docs, but I haven't found a solution that suits my needs.
EDIT:
After modification, the file would stay the same size as before. For example, imagine a list of students where each student can have the value 1 or 0 (passed or failed the exam).
So in this case I would just need to update one character per row in the file (that is, per student). Example:
Lee Jackson 0 -> Lee Jackson 0
Bob White 0 -> would become -> Bob White 1
Jessica Woo 1 -> Jessica Woo 1
In the example above we have a file with 3 records, one below the other, and I need to update the 2nd record while the 1st and 3rd stay the same, all without creating a new file.
Here's a potential approach using RandomAccessFile. The idea is to use readLine to read the file as strings, while remembering the position in the file so you can go back and write a new line over the old one. It's still risky if anything in the text encoding changes the byte length, because that could overwrite the line break, for example.
void modifyFile(String file) throws IOException {
    try (RandomAccessFile raf = new RandomAccessFile(file, "rw")) {
        long beforeLine = raf.getFilePointer();
        String line;
        while ((line = raf.readLine()) != null) {
            // edit the line while keeping its length identical
            if (line.endsWith("0")) {
                line = line.substring(0, line.length() - 1) + "1";
            }
            // go back to the beginning of the line
            raf.seek(beforeLine);
            // overwrite the bytes of that line
            raf.write(line.getBytes());
            // advance past the line break
            String ignored = raf.readLine();
            // and remember that position again
            beforeLine = raf.getFilePointer();
        }
    }
}
Handling String encoding correctly is tricky in this case. If the file isn't in the encoding used by readLine() and getBytes(), you could work around that by doing
// file is in "iso-1234" encoding which is made up.
// reinterpret the byte as the correct encoding first
line = new String(line.getBytes("ISO-8859-1"), "iso-1234");
... modify line
// when writing use the expected encoding
raf.write(line.getBytes("iso-1234"));
See How to read UTF8 encoded file using RandomAccessFile?
Try storing the changes you want to make in RAM (a string, or a linked list of strings). Read the file into a linked list of strings (one per line of the file), write a function to merge the string you want to insert into that list, and then rewrite the file entirely by writing out every line from the list. Here's what I mean in pseudocode; the order is important here.
By reading the whole file in first and writing it back immediately after merging the input, we minimize the window in which other users can interfere.
String lineYouWantToWrite = yourInput
LinkedList<String> list = new LinkedList<String>()
while (file has another line)
    list.add(file's next line)
set your string at whatever index of list you want
write list to file line by line, file's first line = list[0], and so on
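A hedged Java version of that pseudocode (the path, the record index, and the replacement line are placeholders):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.LinkedList;
import java.util.List;

static void replaceRecord(Path path, int index, String newLine) throws IOException {
    // read every line into memory
    List<String> lines = new LinkedList<>(Files.readAllLines(path, StandardCharsets.UTF_8));
    // merge your string at the index you want
    lines.set(index, newLine);
    // rewrite the file line by line (truncates and rewrites in place)
    Files.write(path, lines, StandardCharsets.UTF_8);
}

// usage, e.g. to mark the second student as passed:
// replaceRecord(Paths.get("students.txt"), 1, "Bob White 1");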
I have a .CSV file containing 100 000 records. I need to parse through a set of records and then delete them, then parse the next set, and so on until the end. How do I do that? A code snippet would be very helpful.
I tried, but I am not able to delete records and reuse the same CSV file with only the remaining records left.
This cannot be done efficiently, since CSV is a sequential file format. Say you have
"some text", "adsf"
"more text", "adfgagqwe"
"even more text", "adsfasdf"
...
and you want to remove the second line:
"some text", "adsf"
"even more text", "adsfasdf"
...
you need to move up all subsequent lines (which in your case can be 100 000 ...), which involves reading them at their old location and writing them to the new one. That is, deleting the first of 100 000 lines involves reading and writing 99 999 lines of text, which will take a while ...
It is therefore worthwhile to consider alternatives. For instance, if you are trying to process a file and want to keep track of how far you got, it is far more efficient to store the line number (or offset in bytes) you were at, and leave the input file intact. This also prevents corrupting the file if your program crashes while deleting lines. Another approach is to first split the file into many small files (perhaps 1000 lines each), process each file in its entirety, and then delete the file.
However, if you truly must delete lines from a CSV file, the most robust way is to read the entire file, write all records you want to keep to a new file, delete the original file, and finally rename the new file to the original file.
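As a hedged sketch of that last approach (shouldKeep is a hypothetical predicate deciding which records survive):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

static void deleteRecords(Path source) throws IOException {
    Path tmp = source.resolveSibling(source.getFileName() + ".tmp");
    try (BufferedReader in = Files.newBufferedReader(source);
         BufferedWriter out = Files.newBufferedWriter(tmp)) {
        String line;
        while ((line = in.readLine()) != null) {
            if (shouldKeep(line)) { // hypothetical predicate
                out.write(line);
                out.newLine();
            }
        }
    }
    // replace the original with the filtered copy
    Files.move(tmp, source, StandardCopyOption.REPLACE_EXISTING);
}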
You cannot edit or delete existing data in the middle of a file. Ideally you should generate a new file for your output. In your case, once you reach the point where you want to delete the existing data, you can create a new file, copy the remaining lines into it, and use this new file as input.
code:
File infile = new File("C:\\MyInputFile.txt");
File outfile = new File("C:\\MyOutputFile.txt");
FileInputStream instream = new FileInputStream(infile);
FileOutputStream outstream = new FileOutputStream(outfile);
byte[] buffer = new byte[1024];
int length;
/* copy the contents from the input stream to the
 * output stream using read and write methods
 */
while ((length = instream.read(buffer)) > 0) {
    outstream.write(buffer, 0, length);
}
// close the input/output file streams
instream.close();
outstream.close();
The code below is tested and working: it erases a line in an existing CSV file in place by overwriting its contents with spaces while keeping the commas as column separators. To delete particular rows, you will have to collect the row numbers you want to blank and seek to each one.
File f = new File(System.getProperty("user.home") + "/Desktop/c.csv");
RandomAccessFile ra = new RandomAccessFile(f, "rw");
ra.seek(0); // position of the line to blank (here, the first line)
long p = ra.getFilePointer();
byte[] b = ra.readLine().getBytes();
for (int i = 0; i < b.length; i++) {
    if (b[i] != 44) { // 44 = comma, 32 = space
        b[i] = 32;    // replace everything except commas with spaces
    }
}
ra.seek(p);  // go back to the initial pointer of the line
ra.write(b); // write a blank line with commas as column separators
ra.close();
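A hedged extension of the same idea, in case you need to blank several specific rows rather than just the first line (it assumes ASCII content, since RandomAccessFile.readLine and String.getBytes can disagree on non-ASCII bytes):

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.Set;

static void blankRows(File f, Set<Integer> rowsToBlank) throws IOException {
    try (RandomAccessFile ra = new RandomAccessFile(f, "rw")) {
        long lineStart = ra.getFilePointer();
        String line;
        int row = 0; // 0-based row number
        while ((line = ra.readLine()) != null) {
            if (rowsToBlank.contains(row)) {
                byte[] b = line.getBytes();
                for (int i = 0; i < b.length; i++) {
                    if (b[i] != ',') b[i] = (byte) ' '; // keep the column separators
                }
                long nextLine = ra.getFilePointer();
                ra.seek(lineStart); // back to the start of this line
                ra.write(b);        // overwrite it in place
                ra.seek(nextLine);  // resume at the next line
            }
            lineStart = ra.getFilePointer();
            row++;
        }
    }
}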
We have an issue unzipping bz2 files in Java, whereby the input stream thinks it's finished after reading ~3% of the file.
We would welcome any suggestions for how to decompress and read large bz2 files which have to be processed line by line.
Here are the details of what we have done so far:
For example, a bz2 file is 2.09 GB in size and uncompressed it is 24.9 GB
The code below only reads 343,800 lines of the actual ~10 million lines the file contains.
Modifying the code to decompress the bz2 into a text file (FileInputStream straight into the CompressorInputStream) results in a file of ~190 MB - irrespective of the size of the bz2 file.
I have tried setting a buffer value of 2048 bytes, but this has no effect on the outcome.
We have executed the code on Windows 64 bit and Linux/CentOS both with the same outcome.
Could the buffered reader come to an empty, "null" line and cause the code to exit the while-loop?
import org.apache.commons.compress.compressors.*;
import java.io.*;
...
CompressorInputStream is = new CompressorStreamFactory()
.createCompressorInputStream(
new BufferedInputStream(
new FileInputStream(filePath)));
int lineNumber = 0;
String line;
BufferedReader br = new BufferedReader(new InputStreamReader(is));
while ((line = br.readLine()) != null) {
    this.processLine(line, ++lineNumber);
}
Even this code, which forces an exception when the end of the stream is reached, has exactly the same result:
byte[] buffer = new byte[1024];
int len = 1;
while (len == 1) { // len never changes: loop until read() returns -1,
                   // at which point write(buffer, 0, -1) throws
    out.write(buffer, 0, is.read(buffer));
    out.flush();
}
There is nothing obviously wrong with your code; it should work. This means the problem must be elsewhere.
Try to enable logging (i.e. print the lines as you process them). Make sure there are no gaps in the input (maybe write the lines to a new file and do a diff). Use bzip2 --test to make sure the input file isn't corrupt. Check whether it always fails at the same line (maybe the input contains odd characters or binary data?).
The issue lies with the bz2 files: they were created using a version of Hadoop which includes bad block headers inside the files.
Current Java solutions stumble over this, while others ignore it or handle it somehow.
Will look for a solution/workaround.
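One workaround worth trying, though this is an assumption on my part and not confirmed in the thread: Hadoop commonly writes bz2 output as several concatenated streams, and commons-compress stops after the first stream unless told otherwise, which would match reading only a few percent of the file. BZip2CompressorInputStream takes a decompressConcatenated flag for exactly this:

import org.apache.commons.compress.compressors.bzip2.BZip2CompressorInputStream;
import java.io.*;

try (BufferedReader br = new BufferedReader(new InputStreamReader(
        new BZip2CompressorInputStream(
                new BufferedInputStream(new FileInputStream(filePath)),
                true)))) { // true = keep reading across concatenated streams
    String line;
    int lineNumber = 0;
    while ((line = br.readLine()) != null) {
        this.processLine(line, ++lineNumber); // same handler as in the question
    }
}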
This question already has answers here:
How do I read / convert an InputStream into a String in Java?
(62 answers)
Closed 8 years ago.
I want to read a file directly into a String, without storing the file locally. I used to do this in an old project, but I don't have the source code anymore. I used to be able to get the source of my website this way.
However, I don't remember if I did it by "InputStream to String array of lines to String", or if I directly read it into a String.
Was there a function for this, or am I remembering wrong?
(Note: this function would be the PHP equivalent of file_get_contents($path))
You need to use InputStreamReader to convert from a binary input stream to a Reader which is appropriate for reading text.
After that, you need to read to the end of the reader.
Personally I'd do all this with Guava, which has convenience methods for this sort of thing, e.g. CharStreams.toString(Readable).
When you create the InputStreamReader, make sure you supply the appropriate character encoding - if you don't, you'll get junk text out (just like trying to play an MP3 file as if it were a WAV, for example).
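A hedged sketch of that approach (the URL is a placeholder, and the charset should be whatever the site actually serves):

import com.google.common.io.CharStreams;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.URL;
import java.nio.charset.StandardCharsets;

static String fetch(String address) throws IOException {
    // wrap the binary stream in a Reader with an explicit encoding,
    // then drain the whole Reader into a String with Guava
    try (Reader reader = new InputStreamReader(
            new URL(address).openStream(), StandardCharsets.UTF_8)) {
        return CharStreams.toString(reader);
    }
}

// String html = fetch("https://example.com/");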
Check out apache-commons-io, and for your use case FileUtils.readFileToString(File file)
(it should not be too hard to get a File from the path).
You can use the library or have a look at the code, as it is open source.
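For example (the path is a placeholder; passing an explicit encoding avoids platform-default surprises):

import org.apache.commons.io.FileUtils;
import java.io.File;

// read the whole file into a String in one call
String contents = FileUtils.readFileToString(new File("/path/to/file.txt"), "UTF-8");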
There is no direct way to read a File into a String.
But there is a quick alternative - read the File into a Byte array and convert it into a String.
Untested:
File f = new File("/foo/bar");
InputStream fStream = new FileInputStream(f);
ByteArrayOutputStream bStream = new ByteArrayOutputStream();
for (int data = fStream.read(); data > -1; data = fStream.read()) {
    bStream.write(data);
}
fStream.close();
String theResult = new String(bStream.toByteArray(), "UTF-8");
This problem seems to happen inconsistently. We are using a java applet to download a file from our site, which we store temporarily on the client's machine.
Here is the code that we are using to save the file:
URL targetUrl = new URL(urlForFile);
InputStream content = (InputStream)targetUrl.getContent();
BufferedInputStream buffered = new BufferedInputStream(content);
File savedFile = File.createTempFile("temp",".dat");
FileOutputStream fos = new FileOutputStream(savedFile);
int letter;
while ((letter = buffered.read()) != -1)
    fos.write(letter);
fos.close();
Later, I try to access that file by using:
ObjectInputStream keyInStream = new ObjectInputStream(new FileInputStream(savedFile));
Most of the time it works without a problem, but every once in a while we get the error:
java.io.StreamCorruptedException: invalid stream header: 0D0A0D0A
which makes me believe that it isn't saving the file correctly.
I'm guessing that the operations you've done with getContent and BufferedInputStream have treated the file like an ASCII file, converting newlines or carriage returns into carriage return + newline (0x0D0A), which has confused ObjectInputStream (which expects serialized data objects).
If you are using an FTP URL, the transfer may be occurring in ASCII mode.
Try appending ";type=I" to the end of your URL.
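For example, a hypothetical tweak to the question's code:

// force binary ("image") transfer mode for FTP URLs
URL targetUrl = new URL(urlForFile + ";type=i");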
Why are you using ObjectInputStream to read it?
As per the javadoc:
An ObjectInputStream deserializes primitive data and objects previously written using an ObjectOutputStream.
Probably the error comes from the fact you didn't write it with ObjectOutputStream.
Try reading it with FileInputStream only.
Here's a sample for binary (although not the most efficient way).
Here's another one used for text files.
There are 3 big problems in your sample code:
You're not treating the input as raw bytes
You're needlessly pulling the entire object into memory at once
You're doing multiple method calls for every single byte read and written -- use the array-based read/write!
Here's a redo:
URL targetUrl = new URL(urlForFile);
InputStream is = targetUrl.openStream();
File savedFile = File.createTempFile("temp", ".dat");
FileOutputStream fos = new FileOutputStream(savedFile);
int count;
byte[] buff = new byte[16 * 1024];
while ((count = is.read(buff)) != -1) {
    fos.write(buff, 0, count);
}
fos.close();
is.close();
You could also step back from the code and check to see if the file on your client is the same as the file on the server. If you get both files on an XP machine, you should be able to use the FC utility to do a compare (check FC's help if you need to run this as a binary compare as there is a switch for that). If you're on Unix, I don't know the file compare program, but I'm sure there's something.
If the files are identical, then you're looking at a problem with the code that reads the file.
If the files are not identical, focus on the code that writes your file.
Good luck!