This is quite a simple task and I have done it many times. But at the moment I am stuck on this trivial piece of code.
import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;
public class Test {

    private static Scanner scan;

    public static void main(String[] args) throws FileNotFoundException {
        // TODO Auto-generated method stub
        File file = null;
        switch (1) {
        case 1:
            file = new File("W:\\Umesoft Evobus\\From AQUA\\Aqua data_ All\\20090101-20090630_datenabzug_tilde.txt");
            break;
        case 2:
            file = new File("W:\\Umesoft Evobus\\From AQUA\\Aqua data_ All\\20090701-20091231_datenabzug_tilde.txt");
            break;
        }
        scan = new Scanner(file);
        String x = scan.nextLine();
        System.out.println(x);
    }
}
When I try to read the first file, I get a NoSuchElementException. When I try to read the second file, I have no issues. Both files are from the same source and have the same format. I am sure there are no issues with the input files; the first line in both files is identical.
Can someone explain this situation?
The above program is just for testing; hence I have used a switch statement to select the file.
In the actual program, a set of files is selected by the user, and every time this file gets skipped. The input files are data files generated by another program. They are very similar to CSV files, but the delimiter used here is ~ for some reason. They cannot be empty, because even in the worst case they would contain headers.
Screenshot of the file contents in Notepad++:
Output for file 1:
Exception in thread "main" java.util.NoSuchElementException: No line found
at java.util.Scanner.nextLine(Unknown Source)
at controller.Test.main(Test.java:24)
Output for file 2:
Weltherstellercode~FIN~Fahrzeug_Baumuster~~Motoridentnummer~Getriebe_Identifizierungsnummer~Produktionsdatum~Produktionsnummer_Fzg~Erstzulassungsdatum~Reparaturdatum~Fahrzeug_Laufleistung_in_km~Interne_VEGA_Antragsnummer~TGA~Fehlerort~~Fehlerart~~Reparaturart~~Hauptschadensteil~Reparaturland_(G&K)~~Reparaturbetrieb_(G&K)~~Mitteilungstext~Gutschriftsdatum_(Summe)~Anzahl_Beanstandungen~Gesamtkosten~Lohnkosten~Materialkosten~Summe_DH+NK~Anzahl_Arbeitswerte_(Gutgeschrieben)
String line = "";
BufferedReader br = new BufferedReader(new FileReader("path"));

while ((line = br.readLine()) != null) {
    System.out.println(line);
}
I changed the previous code to use a BufferedReader, since
BufferedReader has a significantly larger buffer than Scanner.
Use BufferedReader if you want to read long strings from a stream, and
use Scanner if you want to parse a specific type of token from a stream.
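For contrast, a minimal token-oriented sketch with Scanner; the ~ delimiter is borrowed from the data format described in the question, and "path" is the same placeholder as above:

// Token-by-token read with Scanner; throws FileNotFoundException if the file is missing
Scanner sc = new Scanner(new File("path"));
sc.useDelimiter("[~\\r\\n]+");        // split on ~ and on line breaks
while (sc.hasNext()) {
    System.out.println(sc.next());    // one field per iteration
}
sc.close();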
The following answer from a different post worked:
https://stackoverflow.com/a/35173548/6234625
scan = new Scanner(file, "UTF-8");
I had to specify the encoding for the Scanner.
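For context, this is how the fix slots into the test program at the top; the guess that the first file carried a byte order mark or a non-default encoding is mine, not something verified in the question:

// Explicit charset instead of the platform default
scan = new Scanner(file, "UTF-8");
String x = scan.nextLine();   // now finds the first line
System.out.println(x);
scan.close();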
Thanks to everyone who tried to help me. Thanks especially to @Abhisheik and @Priyamal.
Related
I've been trying to code a quiz game in JavaFX where I store the questions in a text file, then generate a random number and use it to fetch the line with that number from the text file and read it into an array.
After looking online, I can only seem to find how to read a text file line by line rather than a specific line. I use the following code to read the text file, but I am unsure where to go from there.
File file = new File("/Users/administrator/Desktop/Short Questions.txt");
FileReader fileReader = new FileReader(file);
BufferedReader bufferedReader = new BufferedReader(fileReader);
String line;
This may help you. You need to change the file path to match your file location.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class Test {

    public static void main(String[] args) throws IOException {
        BufferedReader bufferedReader = new BufferedReader(new FileReader("C:\\Users\\everestek22\\Desktop\\Invoice.txt"));

        // Read every line into a String array
        // (a readLine() loop would work just as well)
        String[] strArray = bufferedReader.lines().toArray(String[]::new);
        bufferedReader.close();

        for (String s : strArray) {
            System.out.println(s);
        }
    }
}
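To tie this back to the question, picking a random question from that array could then look like this (a sketch; the array is assumed to be non-empty):

// Needs: import java.util.Random;
Random random = new Random();
int index = random.nextInt(strArray.length);   // 0 .. strArray.length - 1
System.out.println(strArray[index]);           // one randomly chosen line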
Don't bother trying to read specific lines from the file; just read all the lines from the file, then look up your question by index in the resulting list.
List<String> questions = Files.readAllLines(
Paths.get("<your file path>")
);
Then you could choose a question at random:
Random random = new Random(42);
int randomQuestionIndex = random.nextInt(questions.size());
String randomQuestion = questions.get(randomQuestionIndex);
Using 42 as the seed for the random number generator makes the random sequence repeatable, which is good for testing. To have it truly pseudo-random, remove the seed (i.e. just new Random()).
If the structure of the data you wish to read is complex, then use a helper library such as Jackson to store and retrieve the data as serialized JSON objects. If it is even more complex, then a database can be used.
If you have a really large file and you know the position in the file of each specific thing you wish to read, then you can use a random access file for lookup. For example, if all the questions in the file are exactly the same length and you know how many questions are stored there, then a random access file might be used fairly easily. But, from your description of what you need to do, this is likely not the case, and the simpler solution of reading everything rather than using a random access file is better.
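Purely for illustration, a minimal sketch of that fixed-length-record idea; the record length, the padding assumption, the file name, and the method name are all mine, not something from the question:

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public class FixedRecordLookup {

    // Assumed: every question is padded to exactly RECORD_LEN bytes when the file is written
    private static final int RECORD_LEN = 128;

    static String readQuestion(String path, int index) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
            byte[] record = new byte[RECORD_LEN];
            raf.seek((long) index * RECORD_LEN);   // jump straight to the record
            raf.readFully(record);
            return new String(record, StandardCharsets.UTF_8).trim();
        }
    }
}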
I'm trying to read from a file and it's not working correctly. I've looked at many examples on here, and the method I'm using is borrowed from an answer to someone else's question. I'm aware you can use BufferedReader, but I'd like to stick with what I know and am learning in my classes right now. When I run this code I just get a blank line, but the file contains 4 lines of information.
import java.io.*;
import java.util.Scanner;
import java.lang.StringBuilder;
import java.io.File;
import java.io.FileInputStream;

public class fileWriting {

    public static void main(String[] args) throws IOException {
        // Set everything up to read & write files
        // Create new file(s)
        File accInfo = new File("accountInfo.txt");

        // Create FileWriter
        Scanner in = new Scanner(new FileReader("accountInfo.txt"));
        String fileString = "";

        // Read from text file to update current information into program
        StringBuilder sb = new StringBuilder();
        while (in.hasNext()) {
            sb.append(in.next());
        }
        in.close();

        fileString = sb.toString();
        System.out.println(fileString);
    }
}
My file contains the following text:
name: Howard
chequing: 0
savings: 0
credit: 0
One of the advantages of using something like BufferedReader over using Scanner is that you will get an exception if the read fails for any reason. That’s a good thing—you want to know when and why your program failed, rather than having to guess.
Scanner does not throw an exception. Instead, you have to check manually:
if (in.ioException() != null) {
    throw in.ioException();
}
Such a check probably belongs near the end of your program, after the while loop. That won’t make your program work, but it should tell you what went wrong, so you can fix the problem.
Of course, you should also verify that accountInfo.txt actually has some text in it.
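A minimal sketch of where such a check could go in the posted program, using the same variable names as in the question:

while (in.hasNext()) {
    sb.append(in.next());
}

// Scanner swallows IOExceptions; surface the last one, if any, before giving up
if (in.ioException() != null) {
    throw in.ioException();
}
in.close();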
I have two different ways to read the file, but I am not sure how to proceed with converting the text to a string and then an if-then statement like...
if string contains ":"
true string = "string"
false string = ,,"string"
package test;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;

public class ReadStringFromFileLineByLine {

    public static void main(String[] args) {
        try {
            File file = new File("foo.txt");
            FileReader fileReader = new FileReader(file);
            BufferedReader bufferedReader = new BufferedReader(fileReader);
            StringBuffer stringBuffer = new StringBuffer();
            String line;
            String trim;

            while ((line = bufferedReader.readLine()) != null) {
                stringBuffer.append(line);
                stringBuffer.append("\n");
            }
            fileReader.close();

            System.out.println("Contents of file:");
            System.out.println(stringBuffer.toString());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
But I don't believe I am using the trim command appropriately
Your question doesn't really communicate the intent of the program clearly. What exactly are you trying to do? If your file is text-based, there is no "conversion to String" needed. Also, "save the file as an output" isn't clear either. Do you want to save a new file, overwrite the existing file, or append to the existing file? All of these scenarios are handled differently. Taking this in parts:
First point: Your Scantest class works. Given a file foo.txt in the project folder, the class will print out the contents of the file.
Second point: Your class ReadStringFromFileLineByLine works with my own foo.txt just like the first class. So there might be something wrong with your test.txt file. This is probably the most important thing when testing (making all conditions equal). If the conditions for testing are not equal, the tests will most likely be inconclusive (which is what I suspect happened in your case).
Third point: None of your classes attempted to make any modifications to the obtained strings or made modifications to the file. If you were to write to a file, you have to consider the following: Append vs. Overwrite. All it takes is the use of a simple boolean value:
FileWriter fw = new FileWriter(file.getAbsoluteFile()); // overwrites contents of file
FileWriter fw = new FileWriter(file.getAbsoluteFile(), true); // appends to file
The FileWriter single-argument constructor calls the two-argument constructor, passing false to it. Therefore, the FileWriter overwrites instead of appends. This is important because, if you handle the file line by line, it is possible that at the end your file will contain only the last line you "modified." If you choose to append, the new String will be added to the end of the file, so this is not good either. If you want to process a file line by line, make modifications to any given line, AND save the line to the same file, your best option is to use RandomAccessFile. This class allows you to write 'X' number of characters starting at a given offset. In this case, the "offset" is the "address" of the current line; put simply, the offset is equal to the number of characters already processed. So for the first line the offset is 0, for line 2 it is the number of characters in line 1, and so forth.
I can add this as an update if you need it, but I did not see anything in your code that attempted to change the file in any way. I was just going by your title.
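For illustration, a rough sketch of that RandomAccessFile idea, tied to the ":" check from the question; the file name, the uppercase transformation, and the requirement that the replacement has the same byte length as the original line are all assumptions:

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public class LineRewriteSketch {

    public static void main(String[] args) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile("foo.txt", "rw")) {
            long offset = 0;                               // byte offset where the current line starts
            String line;
            while ((line = raf.readLine()) != null) {
                long nextLineStart = raf.getFilePointer(); // where the next line begins
                if (line.contains(":")) {
                    // Overwrite the line in place; only safe if the replacement
                    // has exactly the same number of bytes as the original line
                    String replacement = line.toUpperCase();
                    raf.seek(offset);
                    raf.write(replacement.getBytes(StandardCharsets.ISO_8859_1));
                    raf.seek(nextLineStart);               // resume reading after the rewritten line
                }
                offset = nextLineStart;
            }
        }
    }
}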
I have a file, and I know that the file will always contain only one word.
So what would be the most efficient way to read this file?
Do I have to create an input stream reader even for small files, or are there other options available?
Well, something's got to convert bytes to characters.
Personally, I'd suggest using Guava, which will allow you to write something like this:
String text = Files.toString(new File("..."), Charsets.UTF_8);
Obviously Guava contains much more than just this. It wouldn't be worth it for this single method, but it's a positive treasure trove of utility classes. Guava and Joda Time are two libraries I couldn't do without :)
Use Scanner
File file = new File("filename");
Scanner sc = new Scanner(file);
System.out.println(sc.next()); //it will give you the first word
If you have an int, float, etc. as the first word, you can use the corresponding method, such as nextInt(), nextFloat(), and so on.
By "efficient", do you mean performance-wise or code simplicity (lazy programmer)?
If it is the second, then nothing I know beats:
String fileContent = org.apache.commons.io.FileUtils.readFileToString(new File("/your/file/name.txt"), "UTF-8");
- Use InputStream and Scanner for reading the file.
E.g.:
import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;

public class Pass {

    public static void main(String[] args) {
        File f = new File("E:\\karo.txt");
        Scanner scan;
        try {
            scan = new Scanner(f);
            while (scan.hasNextLine()) {
                System.out.println(scan.nextLine());
            }
        } catch (FileNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
- Guava Library handles this beautifully and efficiently.
Use the BufferedReader and FileReader classes. Only two lines of code suffice to read a one-word/one-line file.
BufferedReader br = new BufferedReader(new FileReader("Demo.txt"));
System.out.println(br.readLine());
Here is a small program to do so. An empty file will cause 'null' to be printed as output.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class SmallFileReader {

    public static void main(String[] args) throws IOException {
        BufferedReader br = new BufferedReader(new FileReader("Demo.txt"));
        System.out.println(br.readLine());
    }
}
I have a program to pull the source code of a webpage and save it to a .txt file. It works if done with just one page at a time, but when I go through a loop of, say, 100 pages, all of a sudden each page source starts to get cut off between 1/4 and 3/4 of the way through (seems to be arbitrary). Any ideas on why, or how I would go about solving this?
My initial thought was that the loop is going too fast for the Java (I am running this Java from a PHP script), but then I figured it technically shouldn't be going to the next item until the current one has finished anyway.
Here is the code I'm using:
import java.io.*;
import java.net.URL;

public class selectout {

    public static BufferedReader read(String url) throws Exception {
        return new BufferedReader(
                new InputStreamReader(
                        new URL(url).openStream()));
    }

    public static void main(String[] args) throws Exception {
        BufferedReader reader = read(args[0]);
        String line = reader.readLine();
        String thenum = args[1];

        FileWriter fstream = new FileWriter(thenum + ".txt");
        BufferedWriter out = new BufferedWriter(fstream);

        while (line != null) {
            out.write(line);
            out.newLine();
            //System.out.println(line);
            line = reader.readLine();
        }
    }
}
The PHP is a basic mysql_query with a while(fetch_assoc) loop that grabs the URL from the database, then runs system("java -jar crawl.jar $url $filename");
Then it fopens and freads the new file, and finally saves the source to the database (after escaping strings and such).
You need to close your output streams after you finish writing each file. After your while loop, call out.close(); and fstream.close();
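A minimal sketch of the end of main with the streams closed, using the same variable names as the posted code:

while (line != null) {
    out.write(line);
    out.newLine();
    line = reader.readLine();
}

out.close();     // flushes the BufferedWriter and closes the underlying FileWriter (fstream)
reader.close();  // releases the connection's input stream as well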
You must flush the stream and close it. In the code above the data is buffered in out (the BufferedWriter), so that is the stream to flush and close:
finally { // error handling ignored in my example
    out.flush();   // push whatever is still buffered to disk
    out.close();   // also closes the underlying FileWriter (fstream)
}