Write multiple string lines to a file - Java

I want to write multiple lines to a file. For example:
first line: a b c d e f g h i
second line: j k l m n o p q r
third line: s t u v w x y z 1
But the code I wrote cannot do so; it prints everything on one line every time I try to write to it. Here's my code:
FileOutputStream write = new FileOutputStream("file.txt");
PrintStream print = new PrintStream(write);
BufferedReader in = new BufferedReader(new FileReader(data));
String read;
while ((read = in.readLine()) != null) {
    String[] splited = read.split("\n");
    for (int z = 0; z < splited.length; z++) {
        print.print(splited[z] + " ");
    }
}
print.println();
How can I fix this?

You need to move print.println() inside the while loop.
The correct way is as follows:
while ((read = in.readLine()) != null) {
    String[] splited = read.split("\n");
    for (int z = 0; z < splited.length; z++) {
        print.print(splited[z] + " ");
    }
    print.println(); // correct place for println
}

First of all, readLine() reads one line at a time, up to (but not including) the newline, so there is no need to call split("\n"); it will always return an array containing just the line itself.
Second, you never write a newline to the output file, so all the lines end up joined into one. You can simply change print.print(splited[z] + " "); to print.print(splited[z] + "\n");, or call print.println() once per input line.
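Putting both points together, the copy loop could be reduced to something like this (a minimal sketch, keeping your variable names):
while ((read = in.readLine()) != null) {
    print.println(read); // println appends the line separator for you
}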

I have altered your code in a few ways to improve readability and performance.
I am now using a BufferedWriter instead of a PrintStream; BufferedWriter has a convenient method for writing newlines.
I am using try-with-resources. This is a Java 7+ feature that automatically closes connections, streams, and buffers for you. In your current code I don't see you closing either the reader or the writer. For such a small program this is not a big deal, but it is good practice to close them.
I used a for loop instead of a while loop. This is a nice trick I picked up that saves me from defining read in a higher scope than I need it (scope-creep).
try (BufferedWriter writer = new BufferedWriter(new FileWriter("file.txt"));
     BufferedReader reader = new BufferedReader(new FileReader(data))) {
    for (String read = reader.readLine(); read != null; read = reader.readLine()) {
        writer.append(read);
        writer.newLine();
    }
}

Related

How to read every second line from a file in Java

Can someone tell me how to read every second line from a file in Java?
BufferedReader br = new BufferedReader(new FileReader(file));
String line = br.readLine();
while (line != null) {
    // Do something ..
    line = br.readLine();
}
br.close();
One simple way would be to just maintain a counter of number of lines read:
int count = 0;
String line;
while ((line = br.readLine()) != null) {
    if (count % 2 == 0) {
        // do something with this line
    }
    ++count;
}
But this still technically reads every line in the file, only choosing to process every other line. If you really only want to read every second line, then something like RandomAccessFile might be necessary.
You can do it in Java 8 fashion with very few lines:
static final int FIRST_LINE = 1;
Stream<String> lines = Files.lines(path);
String secondLine = lines.limit(2).skip(FIRST_LINE).collect(Collectors.joining("\n"));
First you stream your file's lines
You keep only the first two lines
You skip the first line
Note: in Java 8, when using Files.lines(), you are supposed to close the stream afterwards or use it in a try-with-resources block.
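For example, the same idea wrapped in try-with-resources so the stream is closed (a minimal sketch):
try (Stream<String> lines = Files.lines(path)) {
    String secondLine = lines.limit(2)
                             .skip(FIRST_LINE)
                             .collect(Collectors.joining("\n"));
}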
This is similar to @Tim Biegeleisen's approach, but I thought I would show an alternative that gets every other line using a boolean instead of a counter:
boolean skipOddLine = true;
String line;
while ((line = br.readLine()) != null) {
    if (skipOddLine = !skipOddLine) {
        // Use the String line here
    }
}
This will toggle the boolean value on every loop iteration, skipping every odd line. If you want to skip every even line instead, just change the initial value to boolean skipOddLine = false;.
Note: This approach only works if you do not need to extend functionality to skip every 3rd line for example, where an approach like Tim's would be easier to modify. It also has the downside of being harder to read than the modulo approach.
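For reference, with the counter approach, processing every third line is just a change of modulus (a small sketch):
int count = 0;
String line;
while ((line = br.readLine()) != null) {
    if (count % 3 == 0) { // lines 0, 3, 6, ...
        // do something with this line
    }
    ++count;
}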
This may also help you do it well:
You can use try-with-resources.
You can use the Java 8 Stream API.
You can use a stream Supplier to obtain the stream object again and again.
I have added comments to help you follow the code.
try (BufferedReader reader =
         new BufferedReader(
             new InputStreamReader(
                 new ByteArrayInputStream(x.getBytes()),
                 "UTF-8"))) { // this will help you read files in various languages
    Supplier<Stream<String>> fileContentStream = reader::lines; // this lets you obtain the stream object again and again
    if (FilenameUtils.getExtension(x.getOriginalFilename()).equals("txt")) { // filter on the file extension
        String secondLine =
            fileContentStream
                .get()
                .limit(2)
                .skip(1) // you can skip any number of lines with this
                .collect(Collectors.joining("\n"));
    } else if (FilenameUtils.getExtension(x.getOriginalFilename()).equals("pdf")) {
        // handle pdf files here
    }
} catch (Exception ex) {
    // exception handling
}

How to read from a huge file and write to a new file in Java

What I am doing is reading one file line by line, formatting every line, and then writing it to a new file. The problem is that the file is huge, nearly 178 MB, and I always get the error message: IO console updater error, java heap space. Here is my code:
public class fileFormat {
    public static void main(String[] args) throws IOException {
        String strLine;
        FileInputStream fstream = new FileInputStream("train_final.txt");
        BufferedReader reader = new BufferedReader(new InputStreamReader(fstream));
        BufferedWriter writer = new BufferedWriter(new FileWriter("newOUTPUT.txt"));
        while ((strLine = reader.readLine()) != null) {
            List<String> numberBox = new ArrayList<String>();
            StringTokenizer st = new StringTokenizer(strLine);
            while (st.hasMoreTokens()) {
                numberBox.add(st.nextToken());
            }
            for (int i = 1; i < numberBox.size(); i++) {
                String head = numberBox.get(0);
                String tail = numberBox.get(i);
                String line = head + " " + tail;
                System.out.println(line);
                writer.write(line);
                writer.newLine();
            }
            numberBox.clear();
        }
        reader.close();
        writer.close();
    }
}
How can I avoid this error message? Moreover, I have set the VM preference: -xms1024m
Remove the line
System.out.println(line);
This works around the failing console updater, which otherwise runs out of memory.
The program looks okay. I suspect the problem is that you run this inside of Eclipse, and System.out is collected by Eclipse in memory (to be displayed in that Console window).
System.out.println(line);
Try to run it outside of Eclipse, change Eclipse settings to pipe System.out somewhere, or remove the line.
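Another option would be to redirect System.out to a file from inside the program itself, so the console never accumulates the output (a small sketch; the log file name is hypothetical):
// Send standard output to a file instead of the Eclipse console.
System.setOut(new PrintStream(new FileOutputStream("stdout.log")));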
This part of the code:
for (int i = 1; i < numberBox.size(); i++) {
    String head = numberBox.get(0);
    String tail = numberBox.get(i);
    String line = head + " " + tail;
    System.out.println(line);
    writer.write(line);
    writer.newLine();
}
Can be translated to:
String head = numberBox.get(0);
for (int i = 1; i < numberBox.size(); i++) {
    String tail = numberBox.get(i);
    System.out.print(head);
    System.out.print(" ");
    System.out.println(tail);
    writer.write(head);
    writer.write(" ");
    writer.write(tail);
    writer.newLine();
}
This may add a little code duplication, but it avoids creating a lot of temporary String objects.
Also, if you merge this for loop with the loop constructing numberBox, you won't need the numberBox structure at all; see the sketch below.
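A minimal sketch of that merged loop (assuming the same reader and writer as in the original code):
while ((strLine = reader.readLine()) != null) {
    StringTokenizer st = new StringTokenizer(strLine);
    if (!st.hasMoreTokens()) {
        continue; // nothing on this line
    }
    String head = st.nextToken();
    while (st.hasMoreTokens()) {
        writer.write(head);
        writer.write(" ");
        writer.write(st.nextToken());
        writer.newLine();
    }
}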
If you read the whole file at once it will fill up the heap, so the better option is to read the file in chunks. See my code below. It starts reading from the line offset given as an argument and returns the end offset; you pass in the number of lines to be read.
Please remember: you can use any collection to store the lines that are read, and clear the collection before calling this method again to read the next chunk.
FileInputStream fis = new FileInputStream(file);
InputStreamReader streamReader = new InputStreamReader(fis, "UTF-8");
LineNumberReader reader = new LineNumberReader(streamReader);
// Call this method repeatedly until the end of the file is reached.
public int getParsedLines(LineNumberReader reader, int iLineNumber_Start, int iNumberOfLinesToBeRead) {
    int iLineNumber_End = 0;
    int iReadUptoLines = iLineNumber_Start + iNumberOfLinesToBeRead;
    try {
        reader.mark(iLineNumber_Start);
        reader.setLineNumber(iLineNumber_Start);
        do {
            String str = reader.readLine();
            if (str == null) {
                break;
            }
            // your code
            iLineNumber_End = reader.getLineNumber();
        } while (iLineNumber_End != iReadUptoLines);
    } catch (Exception ex) {
        // exception handling
    }
    return iLineNumber_End;
}
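A small usage sketch of the above (my own, with a hypothetical chunk size of 1000 lines), calling the method in a loop rather than recursively:
int start = 0;
final int chunk = 1000;
while (true) {
    int end = getParsedLines(reader, start, chunk);
    // process the lines collected for this chunk, then clear the collection
    if (end < start + chunk) {
        break; // fewer lines were read than requested, so the end of the file was reached
    }
    start = end;
}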

NullPointerException when trying to read a file line by line (Java)?

I'm trying to read a file line by line, but every time I run my program I get a NullPointerException at the line spaceIndex = line.indexOf(" ");, which obviously means that line is null. HOWEVER, I know for a fact that the file I'm using has exactly 7 lines (even if I print the value of numLines, I get 7), and yet I still get a NullPointerException when I try to read a line into my string.
// File file = some File I take in after clicking a JButton
Charset charset = Charset.forName("US-ASCII");
try (BufferedReader reader = Files.newBufferedReader(file.toPath(), charset)) {
    String line = "";
    int spaceIndex;
    int numLines = 0;
    while (reader.readLine() != null) numLines++;
    for (int i = 0; i < numLines; i++) {
        line = reader.readLine();
        spaceIndex = line.indexOf(" ");
        System.out.println(spaceIndex);
    }
}
PS: (I'm not actually using this code to print the index of the space, I replaced the code in my loop since there's a lot of it and it would make it longer to read)
If I'm going about reading the lines the wrong way, it would be great if someone could suggest another way, since every way I've tried so far gives me the same exception. Thanks.
By the time you start your for loop, the reader is already at the end of the file
(from the while loop).
Therefore, readLine() will always return null.
You should get rid of the for loop and do all of your work in the while loop as you first read the file.
You have two options.
First, you could count the number of lines in the file this way:
LineNumberReader lnr = new LineNumberReader(new FileReader(new File("File1")));
lnr.skip(Long.MAX_VALUE);
System.out.println(lnr.getLineNumber());
Then read the file right after:
while((line = reader.readLine())!=null)
{
spaceIndex = line.indexOf(" ");
System.out.println(spaceIndex);
}
This first option is an alternative (and, in my opinion, cooler) way of doing this.
The second option (and probably the more sensible one) is to do it all at once in the while loop:
while((line = reader.readLine())!=null)
{
numLines++;
spaceIndex = line.indexOf(" ");
System.out.println(spaceIndex);
}

Efficiently parsing and writing to a file in Java

I have a file that has no new line characters. I want a new line character every 160 characters.
To do this, I read the file, and then do:
String newLine = "";
int lineSize = 160;
char[] line = new char[lineSize];
while (rEntrada.read(line) > 0) {
newLine = new String(line);
String parsedLine = parseLine(newLine, date);
fw.write(parsedLine);
}
where parseLine takes care of some extra parsing of the line. My main question is whether doing a new String inside a while loop is inefficient or not recommended, and whether you see anything else in this code that could be done better. I'm really trying to get better at coding!
Thanks!
Try this.
First, read the single line:
FileReader r = new FileReader(new File("<file-name>"));
// A buffered reader is fast
BufferedReader reader = new BufferedReader(r);
String line = reader.readLine();
// Also use try-catch blocks!
Now iterate over the string and insert a \n at every 160th position.
StringBuilder sb = new StringBuilder();
int counter = 0;
for (int i = 0; i < line.length(); i++) {
    sb.append(line.charAt(i));
    counter++;
    if (counter == 160) {
        sb.append("\n");
        counter = 0;
    }
}
line = sb.toString();
Now you could write this line to the file.
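For example (a sketch; the output file name is hypothetical):
try (BufferedWriter writer = new BufferedWriter(new FileWriter("output.txt"))) {
    writer.write(line);
}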
It looks good to me. The only inefficiency I can see is that parseLine might be written better, possibly being passed line instead of newLine; it depends on what parseLine actually does.
Take a look at StringBuffer and see if it isn't usable in this case.
StringBuffer API Documentation
StringBuilder may also be of interest if you're not multi-threaded.

Question about Java File Reader

I'm having some problems with the FileReader class.
How do I specify an offset in the lines it goes through, and how do I tell it when to stop?
Let's say I want it to go through each line in a .txt file, but only lines 100-200 and then stop?
How would I do this? Right now I'm using readLine(), but I don't think there's a way to specify an offset with that.
Any fast help is VERY appreciated. Thanks.
You can't. FileReader reads a character at a time or a line at a time. Obviously you can write your own code, extending or wrapping it, to skip the unneeded lines.
An aside: be CAREFUL using FileReader or FileWriter - they use the platform's default character set. If you want to force a character set, use OutputStreamWriter or InputStreamReader. Example:
Writer w = new FileWriter(file) can be replaced by
Writer w = new OutputStreamWriter(new FileOutputStream(file),"UTF-8"); <=== see how I can set the character set.
An alternative: If you have FIXED-WIDTH text, then look at RandomAccessFile which lets you seek to any position. This doesn't help you much unless you have fixed width text or an index to skip to a line. But it is handy :)
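For instance, with fixed-width records a sketch like this would jump straight to line 100 (the file name and record length below are hypothetical):
try (RandomAccessFile raf = new RandomAccessFile("data.txt", "r")) {
    int recordLength = 80;        // fixed number of bytes per line, including the line separator
    raf.seek(99L * recordLength); // position at the start of line 100 (lines counted from 1)
    for (int lineNo = 100; lineNo <= 200; lineNo++) {
        String line = raf.readLine();
        // process line
    }
}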
Read all the lines but use another variable to count which line you are on. Call continue if you are on a line that you don't want to process (say, before the 100th line) and break when you will not want to process any more lines (after the 200th line).
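A minimal sketch of that counting idea (the file name and variable names are my own):
BufferedReader in = new BufferedReader(new FileReader("myfile.txt"));
String str;
int lineNumber = 0;
while ((str = in.readLine()) != null) {
    lineNumber++;
    if (lineNumber < 100) continue; // not yet at line 100
    if (lineNumber > 200) break;    // past line 200, stop reading
    // process str
}
in.close();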
There is no way to tell the reader to read only certain lines; you can just use a counter to do it.
try {
    BufferedReader in = new BufferedReader(new FileReader("infilename"));
    String str;
    int lineNumber = 0;
    while ((str = in.readLine()) != null) {
        lineNumber++;
        if (lineNumber >= 100 && lineNumber <= 200) {
            System.out.println("Line " + lineNumber + ": " + str);
        }
    }
    in.close();
} catch (IOException e) { }
BufferedReader in = new BufferedReader(new FileReader("foo.in"));
for(int i=0;i<100;i++,in.readLine()){}
String line101 = in.readLine();
