I have a servlet that reads a log file, and my JSP calls the servlet at some interval. I want to add functionality to my servlet so that it reads the file starting from the line after the last line of the previous read.
I have the following code, but it's not working:
File file = new File("D:\\graph\\temp.log");
FileReader fr = new FileReader(file);
LineNumberReader lnr = new LineNumberReader(fr);
lnr.setLineNumber(count); // count is the variable keeping track of the number of lines previously read
Does it need some extra code? Or is there any other alternative to do the same thing?
From the documentation:
Note however, that setLineNumber(int) does not actually change the
current position in the stream; it only changes the value that will be
returned by getLineNumber().
Sounds to me like you'll have to start from the beginning and call readLine in a loop until getLineNumber() reaches count.
Something like this:
File file = new File("D:\\graph\\temp.log");
FileReader fr = new FileReader(file);
LineNumberReader lineReader = new LineNumberReader(fr);
// skip the lines you don't need
while (lineReader.getLineNumber() < count && lineReader.readLine() != null) {
    // keep reading; the null check stops the loop if the file has fewer than count lines
}
// begin processing input here
Have you looked at RandomAccessFile? If you keep a count of the bytes read, the following code will help you.
RandomAccessFile raf = new RandomAccessFile(fileName, "r");
byte[] buf = new byte[1024];
long count = offset;   // count keeps track of the byte offset reached so far
raf.seek(offset);      // jump to where the previous read stopped
int bytesRead;
while ((bytesRead = raf.read(buf)) != -1) {
    count += bytesRead;
    System.out.print(new String(buf, 0, bytesRead));
}
raf.close();
// store 'count' and pass it in as 'offset' on the next call to resume reading
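For what it's worth, a minimal sketch of how this byte offset could be carried between servlet requests (the servlet wiring, field name and response handling here are illustrative assumptions, not part of the answer above):
// Fragment of the servlet class; the offset survives between requests in a field.
// uses java.util.concurrent.atomic.AtomicLong, java.io.RandomAccessFile
private final AtomicLong offset = new AtomicLong(0);

protected void doGet(HttpServletRequest req, HttpServletResponse resp)
        throws ServletException, IOException {
    try (RandomAccessFile raf = new RandomAccessFile("D:\\graph\\temp.log", "r")) {
        raf.seek(offset.get());                  // resume where the last request stopped
        StringBuilder newLines = new StringBuilder();
        String line;
        while ((line = raf.readLine()) != null) {
            newLines.append(line).append('\n');
        }
        offset.set(raf.getFilePointer());        // remember the new position
        resp.getWriter().print(newLines);
    }
}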
I have a big CSV (12 GB), so I can't read it into memory, and I need only the first 100 rows and then to save it back (truncate it). Does Java have such an API?
The other answers create a new file from the original file. As I understand it, you want to truncate the original file instead. You can do that quite easily using RandomAccessFile:
try (RandomAccessFile file = new RandomAccessFile(FILE, "rw")) {
for (int i = 0; i < N && file.readLine() != null; i++)
; // just keep reading
file.setLength(file.getFilePointer());
}
The caveat is that this will truncate after N lines, which is not necessarily the same thing as N rows, because CSV files can have rows that span multiple lines. For example, here is one CSV record that has a name, address, and phone number, and spans multiple lines:
Joe Bloggs, "1 Acacia Avenue,
Naboo Town,
Naboo", 01-234 56789
If you are sure all your rows only span one line, then the above code will work. But if there is any possibility that your CSV rows may span multiple lines, then you should first parse the file with a suitable CSV reader to find out how many lines you need to retain before you truncate the file. OpenCSV makes this quite easy:
final long numLines;
try (CSVReader csvReader = new CSVReader(new FileReader(FILE))) {
csvReader.skip(N); // Skips N rows, not lines
numLines = csvReader.getLinesRead(); // Gives number of lines, not rows
}
try (RandomAccessFile file = new RandomAccessFile(FILE, "rw")) {
for (int i = 0; i < numLines && file.readLine() != null; i++)
; // just keep reading
file.setLength(file.getFilePointer());
}
You should stream the file: read it line by line.
For example:
CSVReader reader = new CSVReader(new FileReader("myfile.csv"));
String[] nextLine;
// readNext() reads the next line from the buffer and converts it to a string array
while ((nextLine = reader.readNext()) != null) {
    System.out.println(Arrays.toString(nextLine)); // print the fields, not the array reference
}
If you need just a hundred lines, reading just that small portion of the file into memory would be really quick and cheap. You could use the Standard Library file APIs to achieve this quite easily:
val firstHundredLines = File("test.csv").useLines { lines ->
lines.take(100).joinToString(separator = System.lineSeparator())
}
File("test.csv").writeText(firstHundredLines)
Possible solution
File file = new File(fileName);
// collect first N lines
String newContent = null;
try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
newContent = reader.lines().limit(N).collect(Collectors.joining(System.lineSeparator()));
}
// replace original file with collected content
Files.write(file.toPath(), newContent.getBytes(), StandardOpenOption.TRUNCATE_EXISTING);
I'm trying to delete a line of text from a text file without copying to a temporary file. I am trying to do this by using a PrintWriter and a Scanner that traverse the file at the same time: the writer writes what the Scanner reads, overwriting each line with the same content, until it gets to the line that I wish to delete. Then I advance the Scanner but not the writer, and continue as before. Here is the code.
But first, the parameters: my file names are numbers, so this would read 1.txt or 2.txt, etc., and f specifies the file name (it is converted to a String in the File constructor). The int n is the index of the line that I want to delete.
public void deleteLine(int f, int n){
try{
Scanner reader = new Scanner(new File(f+".txt"));
PrintWriter writer = new PrintWriter(new FileWriter(new File(f+".txt")),false);
for(int w=0; w<n; w++)
writer.write(reader.nextLine());
reader.nextLine();
while(reader.hasNextLine())
writer.write(reader.nextLine());
} catch(Exception e){
System.err.println("Enjoy the stack trace!");
e.printStackTrace();
}
}
It gives me strange errors. It says "NoSuchElementException" and "no line found" in the stack trace. It points to different lines; it seems that any of the nextLine() calls can do this. Is it possible to delete a line this way? If so, what am I doing wrong? If not, why? (BTW, just in case you'd want this, the text file is about 500 lines. I don't know if that counts as large or even matters, though.)
As others have pointed out, you might be better off using a temporary file, if there's the slightest risk that your program crashes midway:
public static void removeNthLine(String f, int toRemove) throws IOException {
File tmp = File.createTempFile("tmp", "");
BufferedReader br = new BufferedReader(new FileReader(f));
BufferedWriter bw = new BufferedWriter(new FileWriter(tmp));
for (int i = 0; i < toRemove; i++)
bw.write(String.format("%s%n", br.readLine()));
br.readLine();
String l;
while (null != (l = br.readLine()))
bw.write(String.format("%s%n", l));
br.close();
bw.close();
File oldFile = new File(f);
if (oldFile.delete())
tmp.renameTo(oldFile);
}
(Beware of the sloppy treatment of encodings, new-line characters and exception handling.)
However, I don't like answering questions with "I won't tell you how, because you shouldn't do it anyway.". (In some other situation for instance, you may be working with a file that's larger than half your hard drive!) So here goes:
You need to use a RandomAccessFile instead. Using this class you can both read and write to the file using the same object:
public static void removeNthLine(String f, int toRemove) throws IOException {
RandomAccessFile raf = new RandomAccessFile(f, "rw");
// Leave the n first lines unchanged.
for (int i = 0; i < toRemove; i++)
raf.readLine();
// Shift remaining lines upwards.
long writePos = raf.getFilePointer();
raf.readLine();
long readPos = raf.getFilePointer();
byte[] buf = new byte[1024];
int n;
while (-1 != (n = raf.read(buf))) {
raf.seek(writePos);
raf.write(buf, 0, n);
readPos += n;
writePos += n;
raf.seek(readPos);
}
raf.setLength(writePos);
raf.close();
}
You cannot do it this way. FileWriter can only write to a file sequentially from the start (or append to its end), not write into the middle of it; you need RandomAccessFile if you want to write in the middle. What you do now is overwrite the file the moment you open the writer (it gets emptied, which is why you get the exception). You can create the FileWriter with the append flag set to true, but that way you would append to the file rather than write in the middle of it.
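For reference, a minimal sketch of the append-flag variant mentioned above (the file name is just an example):
// uses java.io.FileWriter, java.io.IOException
try (FileWriter writer = new FileWriter("1.txt", true)) { // true = append instead of truncate
    writer.write("new last line" + System.lineSeparator());
} catch (IOException e) {
    e.printStackTrace();
}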
I'd really recommend to write to a new file and then rename it at the end.
@shelley: you can't do what you are trying to do, and what's more, you shouldn't. You should read the file and write to a temporary file, for several reasons: for one, it's actually possible to do it this way (as opposed to what you're trying to do), and for another, if the process gets interrupted you can bail out without losing the original file. Now, you could update a specific location of a file using a RandomAccessFile, but this is usually done (in my experience) when you are dealing with fixed-size records rather than typical text files.
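To illustrate that last point, here is a minimal sketch of overwriting a single fixed-size record in place with RandomAccessFile (the record size, index and file name are made up for the example):
// uses java.io.RandomAccessFile
int recordSize = 32;                       // every record is exactly 32 bytes
int recordIndex = 5;                       // zero-based index of the record to replace
byte[] newRecord = new byte[recordSize];   // fill with the new 32-byte record

try (RandomAccessFile raf = new RandomAccessFile("records.dat", "rw")) {
    raf.seek((long) recordIndex * recordSize); // jump straight to where the record starts
    raf.write(newRecord);                      // overwrite it without touching the rest of the file
}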
For some reason my program is overwriting the file and not adding to it.
This is the method that I am using to create the file and name it:
public void filenameMethod() throws IOException{
System.out.println("Input the name of the file");
InputStreamReader isr = new InputStreamReader(System.in);
BufferedReader br = new BufferedReader(isr);
filename = br.readLine();
raf = new RandomAccessFile(filename, "rw");
}
I'm using this method to take in the input from the user; it sets the values to variables that are then written to a file:
public void inputMethod() throws IOException{
InputStreamReader isr = new InputStreamReader(System.in);
BufferedReader br = new BufferedReader(isr);
System.out.println("Input Carname, ID, Existing Mileage, Gas Cost, Number of Days, Rate, Total Charge, Discount, Tax, Net Charge and Return Milage");
String tokenString;
tokenString = br.readLine();
StringTokenizer st;
st = new StringTokenizer(tokenString);
carName = st.nextToken();
id = Integer.parseInt(st.nextToken());
existingMileage = Integer.parseInt(st.nextToken());
gasCost = Integer.parseInt(st.nextToken());
ndays = Integer.parseInt(st.nextToken());
rate = Integer.parseInt(st.nextToken());
totalCharge = Integer.parseInt(st.nextToken());
discount = Integer.parseInt(st.nextToken());
tax = Integer.parseInt(st.nextToken());
netCharge = Integer.parseInt(st.nextToken());
returnMileage = Integer.parseInt(st.nextToken());
}
I am then using this method to write them to a file:
public void fileWriterMethod() throws IOException{
raf.writeInt(id);
raf.writeInt(existingMileage);
raf.writeInt(gasCost);
raf.writeInt(ndays);
raf.writeInt(rate);
raf.writeInt(totalCharge);
raf.writeInt(discount);
raf.writeInt(tax);
raf.writeInt(netCharge);
raf.writeInt(returnMileage);
raf.writeBytes(carName + "\r\n");
//Closing the stream
raf.close();
}
I don't understand why this is happening, can anyone help me?
seek to the end of your file before you start writing, for example:
File f = new File(filename);
long fileLength = f.length();
RandomAccessFile raf = new RandomAccessFile(filename, "rw");
raf.seek(fileLength);
raf.writeInt(id);
...
Relevant javadoc.
By default, RandomAccessFile starts writing at the start of the file and will overwrite existing data. To write to the end of the file, you need to skip to the end first, as follows:
raf.skipBytes( (int)raf.length() );
When you open RandomAccessFile its pointer points to the beginning of the file. If you want to move to specific position you have to use method seek(). In your case you have to move to the end of file, i.e. seek(fileLength):
File f = new File(filename);
long fileLength = f.length();
RandomAccessFile raf = new RandomAccessFile(f, "rw");
raf.seek(fileLength);
// now write your bytes
If you want to append to the file, you need to seek(long) to the end. Something like,
raf.seek(raf.length());
From the linked Javadoc,
Sets the file-pointer offset, measured from the beginning of this file, at which the next read or write occurs. The offset may be set beyond the end of the file. Setting the offset beyond the end of the file does not change the file length. The file length will change only by writing after the offset has been set beyond the end of the file.
You'll need to set the file pointer to the end of the file before you start writing, it defaults to the beginning.
raf.seek(raf.length());
placed before any of the write operations should do the trick.
I need to do two processes on a file: first, count the number of lines and compare the count with a value; second, read through the file line by line and do validations. Only if the first process passes do I need to run the second one.
I read the same file using FTP. When I try to create a different input stream, FTP is busy reading the current file. I create the second stream like this:
(is1 = ftp.getFile(feedFileName);)
Below is the remaining code:
InputStream is = null;
LineNumberReader lin = null;
LineNumberReader lin1 = null;
is = ftp.getFile(feedFileName);
lin = new LineNumberReader(new InputStreamReader(is));
So can I just use something like below?
is1 = is;
Will both streams have the file contents from start to finish, or will the second one have nothing left as soon as the first stream has been read?
So is the only option left to create a new FTP object to read the stream separately?
It can, but you would need to "rewind" the InputStream: first call the mark() method on it, and later reset(). Here are the docs: http://docs.oracle.com/javase/6/docs/api/java/io/InputStream.html#reset()
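A minimal sketch of that idea, assuming the FTP stream is wrapped in a BufferedInputStream (which does support marking) and that the whole file fits within the mark's read-ahead limit:
// uses java.io.BufferedInputStream, java.io.InputStreamReader, java.io.LineNumberReader
InputStream is = new BufferedInputStream(ftp.getFile(feedFileName));
is.mark(10 * 1024 * 1024);   // read-ahead limit must cover the whole file (10 MB assumed here)

LineNumberReader lin = new LineNumberReader(new InputStreamReader(is));
while (lin.readLine() != null) {
    // first pass: count the lines
}

is.reset();                  // rewind to the mark for the second pass
LineNumberReader lin1 = new LineNumberReader(new InputStreamReader(is));
// second pass: validate line by line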
After you are done with the LineNumberReader, close the InputStream is. Then re-request the file from FTP, it will not be busy then anymore. You cannot 'just' read from the same InputStream, as that one is probably exhausted by the time the LineNumberReader is done. Furthermore, not all InputStreams support the mark() and reset() methods.
However, I'd suggest that doing the second process only when the first one succeeds might not be the right way. As you're streaming the data anyway, why not stream it into a temporary data structure, count the lines, and then operate on that same data structure?
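For example, a rough sketch of that approach, assuming the file is small enough to keep its lines in memory (expectedLineCount is an illustrative name for the value being compared against):
// uses java.io.BufferedReader, java.io.InputStreamReader, java.util.ArrayList, java.util.List
List<String> lines = new ArrayList<>();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(ftp.getFile(feedFileName)))) {
    String line;
    while ((line = reader.readLine()) != null) {
        lines.add(line);               // read the stream once, keeping every line
    }
}

if (lines.size() >= expectedLineCount) { // first check: the line count
    for (String line : lines) {
        // second step: validate each line
    }
}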
If your file is not big, you can save the data to a String, like:
StringBuilder sb = new StringBuilder();
byte[] buffer = new byte[1024];
int len;
while ((len = is.read(buffer)) != -1)
    sb.append(new String(buffer, 0, len)); // StringBuilder has no append(byte[], ...), so convert first
String data = sb.toString();
Then you can do further things with the String, like:
int lineNumber = data.split("\n").length;
I am creating a text file which will be connected to some server. This text file will receive its contents from the server; it will receive some text data continuously. To limit the file size, I am checking the number of lines in the file, and if it exceeds the mark I am clearing the file content so that the server will write from the beginning again.
Below is the code I have used to do this :
LineNumberReader myReader = new LineNumberReader( new FileReader(new File("mnt/sdcard/abc.txt")));
while(true) {
while(myReader.readLine() != null) {
counter ++;
}
if(counter > 100 ) {
PrintWriter writer = new PrintWriter("/mnt/sdcard/abc.txt");
writer.print("");
writer.close();
writer = null;
counter = 0;
}
}
But after I clear the contents of the file, my counter is not increasing, even though the file has some data in it. I think that after the reading is done I have to set my myReader back to some initial state? If so, how do I reset it so that readLine() starts from the beginning again?
Shouldn't you close myReader before writing to the file??
LineNumberReader myReader = new LineNumberReader( new FileReader(new File("mnt/sdcard/abc.txt")));
while(true)
{
while(myReader.readLine() != null) {
counter++;
}
if(counter > 100 )
{
//CLOSE myReader
myReader.close();
PrintWriter writer = new PrintWriter("/mnt/sdcard/abc.txt");
writer.print("");
writer.close();
writer = null;
counter = 0;
//REOPEN myReader
myReader = new LineNumberReader( new FileReader(new File("mnt/sdcard/abc.txt")));
}
}
Shouldn't you make sure that changes to the file done by the server and changes to the file done by this loop are synchronized??
Can you show how and where counter is declared and what other code might be modifying it? It is a matter of guessing without seeing that. Meanwhile, maybe you can consider not reading the file content all the time and instead use the file size to determine whether you should clear it:
long limit= .... //add your limit in bytes
long fileSize = new File("mnt/sdcard/abc.txt").length();
if (fileSize > limit){
//clean the file
}
Please also note what was mentioned in the other answers regarding closing the file, or trying to clear it while it is open and the server is writing to it.
Issue a myReader.reset() after clearing the contents (note that reset() only works if you previously called mark() on the reader).