I have a file in which every line contains strings like this:
usual,proper,complete,1,convenient,convenient,nonprob,recommended,recommend
I want to replace every word with a code, like this:
1000, 100, 110, 110, 111, 001, 111, 111, 1000
Here is the code that I used, but it is still incomplete:
public class Codage {
BufferedReader in;
public Codage() {
try {
in = new BufferedReader(new FileReader("nursery.txt"));
FileOutputStream fos2 = new FileOutputStream("nursery.txt");
DataOutputStream output = new DataOutputStream(fos2);
String str;
while (null != ((str = in.readLine()))) {
String delims = ",";
String[] tokens = str.split(delims);
int tokenCount = tokens.length;
for (int j = 0; j < tokenCount; j++) {
if (tokens[j].equals("usual")) {
tokens[j] = "1000";
output.writeChars(tokens[j]);
}
//continue the other cases
}
System.out.print(str);
}
in.close();
} catch (IOException e) {
System.out.println("There was a problem:" + e);
}
}
public static void main(String[] args) {
Codage c = new Codage();
}
}
My code replaces the values incorrectly.
First of all, the code you wrote here is not working: when you open an output stream to the exact file you are trying to read from, it empties the source file, so the statement in.readLine() always returns null. If this is your real code, that alone may be the problem.
I assume you know that you should separate the file you are reading from and the one you are writing to. That is, when you open nursery.txt for reading, you should open an output stream to a temp file called nursery.tmp in the same path, and after the processing is finished you can delete nursery.txt and rename nursery.tmp to nursery.txt.
Also, if I were you, I wouldn't do the job with an if-else structure. It seems that you have unique keys like:
usual, proper, complete, convenient, convenient, nonprob, recommended, recommend
So maybe it is more convenient to use a map structure to look up the replacement value:
usual, proper, complete, convenient, convenient, nonprob, recommended, recommend, ...
1000, 100, 110, 110, 111, 001, 111, 111, ...
But these are just guesses; you know best how to manage your business logic.
After that, I think it is a better idea to build the output as String lines and write them line by line to nursery.tmp:
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.util.HashMap;
import java.util.Iterator;

public class Codage {

    private BufferedReader in;
    private BufferedWriter out;
    private HashMap<String, String> replacingValuesByKeys = new HashMap<String, String>();

    public Codage() {
        initialize();
    }

    private void initialize() {
        // I assumed that you have a rule that a key like "proper" always goes to "100".
        // Initialize the map between keys and replacement values:
        replacingValuesByKeys.put("usual", "1000");
        replacingValuesByKeys.put("proper", "100");
        replacingValuesByKeys.put("complete", "110");
        replacingValuesByKeys.put("convenient", "110");
        replacingValuesByKeys.put("nonprob", "111");
        replacingValuesByKeys.put("recommended", "001");
        replacingValuesByKeys.put("recommend", "1000");
    }

    public void doReplacementInFile() {
        try {
            in = new BufferedReader(new FileReader("c:/nursery.txt"));
            out = new BufferedWriter(new FileWriter("c:/nursery.tmp"));
            String str = in.readLine();
            while (null != str) {
                Iterator<String> it = replacingValuesByKeys.keySet().iterator();
                while (it.hasNext()) {
                    String toBeReplaced = it.next();
                    String replacementValue = replacingValuesByKeys.get(toBeReplaced);
                    // \b is a word boundary: you have both "recommend" and "recommended",
                    // and we do not want to replace the [recommend] part of "recommended".
                    str = str.replaceAll("\\b" + toBeReplaced + "\\b", replacementValue);
                }
                // Write the fully replaced line to the temp file:
                out.append(str);
                out.newLine();
                // Do not forget to read the next line:
                str = in.readLine();
            }
        } catch (IOException e) {
            System.out.println("There was a problem: " + e);
        } finally {
            try {
                if (in != null) {
                    in.close();
                }
                if (out != null) {
                    out.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        File f = new File("c:/nursery.txt");
        f.delete();
        File f2 = new File("c:/nursery.tmp");
        f2.renameTo(new File("c:/nursery.txt"));
    }

    public static void main(String[] args) {
        Codage c = new Codage();
        c.doReplacementInFile();
    }
}
Hope these snippets are helpful.
Good luck.
Related
So I have a text file that looks like this...
4234
Bob
6858
Joe
I am trying to read the file with Java and insert the data into an array. I want to separate the data at each empty line. Here is the code that I have come up with to solve the issue, but I am not quite there.
public class Main {
public static void main(String[] args) {
// This name is used when saving the file
BufferedReader input;
String inputLine;
try {
input = new BufferedReader(new FileReader("test.txt"));
while ((inputLine = input.readLine()) != null) {
System.out.println(Arrays.toString(inputLine.split(" ")));
}
} catch (IOException e) {
System.out.println(e.getMessage());
System.exit(1);
}
}
}
The issue that I am coming across is that the output from the code above looks something like this
[4234]
[Bob]
[]
[6858]
[Joe]
The outcome that I would like to achieve, and for the life of me can't think of how to accomplish, is
[4234, Bob]
[6858, Joe]
I feel like, as with many things, it is a relatively simple code change; I am just not sure what that change is.
You need:
2D array
Logic to keep track of where you are in the array (your position)
Logic to tell whether a line is a number or a string
This sounds like homework :) so I won't be solving it; I will just help a bit.
String[][] myData = define your 2D array;
//You need to create a consumer. This is what will take the String line, figure out where to put it into your 2D array.
Consumer<String> processLine = (line) -> {
if(StringUtils.isNumeric(line)){
//Put into array[counter][1]
}
else{
//its a String
//Put into array[counter][0]
}
};
The try/catch below opens the file, reads its lines, goes over each one in order (forEachOrdered) while ignoring all empty lines, and sends each line to your processLine consumer.
try (Stream<String> lines = Files.lines(Paths.get("C:/example.txt"), Charset.defaultCharset())) {
lines.filter(line -> !line.isEmpty()).forEachOrdered(processLine);
}
catch (Exception e){
//Handle Exception
}
This uses Apache Commons StringUtils: http://commons.apache.org/proper/commons-lang/apidocs/org/apache/commons/lang3/StringUtils.html
If you don't want to use any external libs, you can probably do
Integer.parseInt(line) <-- if that throws an exception, it's not a number
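Putting those pieces together without any external libraries, a minimal sketch could look like the following. It assumes the file is called test.txt, that a number line always comes directly before its name, and it uses a List of two-element arrays instead of a fixed-size 2D array, since the number of pairs isn't known up front:

import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Stream;

public class PairReader {

    // Plain-JDK stand-in for StringUtils.isNumeric: if parseInt throws, it is not a number.
    private static boolean isNumeric(String s) {
        try {
            Integer.parseInt(s);
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        List<String[]> pairs = new ArrayList<>(); // each entry is one [number, name] pair
        String[] current = new String[2];

        try (Stream<String> lines = Files.lines(Paths.get("test.txt"), Charset.defaultCharset())) {
            lines.filter(line -> !line.trim().isEmpty()).forEachOrdered(line -> {
                if (isNumeric(line.trim())) {
                    current[0] = line.trim();                         // the number/id
                } else {
                    current[1] = line.trim();                         // the name completes the pair
                    pairs.add(new String[] { current[0], current[1] });
                }
            });
        } catch (IOException e) {
            e.printStackTrace();
        }

        for (String[] pair : pairs) {
            System.out.println(Arrays.toString(pair));                // e.g. [4234, Bob]
        }
    }
}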
Your way of reading the file is not the most convenient in this case; Scanner would have eased all this work. However, if you insist on using BufferedReader and FileReader, it's going to be a bit verbose, boilerplate, and even ugly code, something like this:
public class Main {
public static void main(String[] args) {
// This name is used when saving the file
BufferedReader input;
String inputLine;
String answer = "";
try {
input = new BufferedReader(new FileReader("path\\to\\your\\test.txt"));
while ((inputLine = input.readLine()) != null) {
answer = answer + "[" + inputLine + ", ";
while ((inputLine = input.readLine()) != null && !inputLine.equals("")) {
answer += inputLine;
}
answer += "]";
System.out.println(answer);
answer = "";
}
} catch (IOException e) {
System.out.println(e.getMessage());
System.exit(1);
}
}
}
This code, with test.txt containing:
4234
Bob
6858
Joe
4234
John
5352
Martin
will output:
[4234, Bob]
[6858, Joe]
[4234, John]
[5352, Martin]
I don't know if it's an actual requirement for you to use arrays of strings, but the better way in the long run is to create a class.
class Person {
public String id;
public String name;
public String toString() { return String.format("[%s, %s]", id, name); }
}
(note: It's a bad idea to actually make the fields public, but this makes the code shorter. You should probably use getters and setters).
Now you can create Persons while reading the file.
List<Person> allInFile = new ArrayList<>();
try (BufferedReader reader = new BufferedReader(new FileReader("path\\to\\your\\test.txt"))) {
    String line = reader.readLine();
    while (line != null) {
        line = line.trim();
        // ignore empty lines
        if (line.length() == 0) {
            line = reader.readLine();
            continue;
        }
        // this is an id; create a person and assign id
        Person person = new Person();
        person.id = line;
        // read the consecutive field, which is the name
        person.name = reader.readLine();
        // add the person to the list
        allInFile.add(person);
        // read the next id (or null at the end of the file)
        line = reader.readLine();
    }
}
allInFile.forEach(System.out::println);
Lots of improvements to be done on this, but the main point is to put the two data points into a class.
Try this code:
It works only when the file contains a number followed by a name; otherwise the pair would come out in a different format.
pair: [number, string]
public static void main(String[] args) {
BufferedReader input;
String inputLine;
List<String> pair = new ArrayList<String>();
List<String> list = new ArrayList<String>();
try {
input = new BufferedReader(new FileReader("Test.txt"));
while ((inputLine = input.readLine()) != null) {
if (!inputLine.isEmpty()) {
pair.add(inputLine);
}
if (pair.size() == 2) {
list.add(pair.toString());
pair.clear();
}
}
for (String s : list) {
System.out.println(s);
}
} catch (IOException e) {
System.out.println(e.getMessage());
System.exit(1);
}
}
After looking at the answers posted by my fellow Stack Overflow members, I figured out that there was a very simple way of solving this issue: using Scanner rather than BufferedReader. I am not sure why I didn't think of this before, but hindsight is 20/20. Anyway, the code below is what I used to solve my issue.
public static void main(String[] args) {
ArrayList<String> test = new ArrayList<>();
File file = new File("test.txt");
try {
Scanner sc = new Scanner(file);
while (sc.hasNextLine()) {
test.add(sc.next()); // The id
test.add(sc.next()); // The name
}
sc.close();
System.out.println(test.toString());
} catch (FileNotFoundException e) {
e.printStackTrace();
}
}
All this is doing is reading each piece of data, skipping the blanks, and adding it to an ArrayList for later processing. Remember K.I.S.S. (Keep It Simple, Stupid): no need to overcomplicate anything.
My code is too slow
How can I make my code more efficient? Currently the code needs several minutes to read the file, which is way too long. Can this be done faster? There is no stack trace, because the code works, just too slowly.
Thanks!
The Problem Code:
private void list(){
String strLine2="";
wwwdf2 = new StringBuffer();
InputStream fis2 = this.getResources().openRawResource(R.raw.list);
BufferedReader br2 = new BufferedReader(new InputStreamReader(fis2));
if(fis2 != null) {
try {
LineNumberReader lnr = new LineNumberReader(br2);
String linenumber = String.valueOf(lnr);
int i=0;
while (i!=1) {
strLine2 = br2.readLine();
wwwdf2.append(strLine2 + "\n");
String contains = String.valueOf(wwwdf2);
if(contains.contains("itisdonecomplet")){
i++;
}
}
// Toast.makeText(getApplicationContext(), strLine2, Toast.LENGTH_LONG).show();
Toast.makeText(getApplicationContext(), wwwdf2, Toast.LENGTH_LONG).show();
} catch (IOException e) {
e.printStackTrace();
}
}
}
Use StringBuilder instead of StringBuffer.
StringBuffer is synchronized, and you don't need that.
Don't use String.valueOf, which builds a string, negating the value of using a StringBuffer/Builder. You are building a string from the whole buffer, checking it, discarding the string, and then constructing nearly the same string again.
Use if (wwwdf2.indexOf("itisdonecomplet") >= 0) instead, which avoids creating the string.
But this will still be fairly slow: although you would no longer be constructing a string, you are still searching through the whole buffer each time.
You can make this a lot faster by only searching the very end of the string. For example, you could use wwwdf2.indexOf("itisdonecomplet", Math.max(0, wwwdf2.length() - strLine2.length() - "itisdonecomplet".length())).
Although, as blackapps points out in a comment, you could simply check if strLine2 contains that string.
Don't use string concatenation inside a call to append: make two separate calls.
wwwdf2.append(strLine2);
wwwdf2.append("\n");
You don't check if you reach the end of the file. Check if strLine2 is null, and break the loop if it is.
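Putting those points together, the loop could look roughly like the sketch below, based on your method. It is untested and assumes, as the comment above suggests, that the marker "itisdonecomplet" always appears within a single line:

private void list() {
    wwwdf2 = new StringBuilder();                                    // StringBuilder, not StringBuffer
    InputStream fis2 = this.getResources().openRawResource(R.raw.list);
    BufferedReader br2 = new BufferedReader(new InputStreamReader(fis2));
    try {
        String strLine2;
        while ((strLine2 = br2.readLine()) != null) {                // stop at end of file
            wwwdf2.append(strLine2);                                 // two appends, no concatenation
            wwwdf2.append("\n");
            if (strLine2.contains("itisdonecomplet")) {              // check only the line just read
                break;
            }
        }
        Toast.makeText(getApplicationContext(), wwwdf2, Toast.LENGTH_LONG).show();
    } catch (IOException e) {
        e.printStackTrace();
    }
}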
My new created code (my test device is a Samsung S8):
private void list(){
String strLine2="";
wwwdf2 = new StringBuilder();
InputStream fis2 = this.getResources().openRawResource(R.raw.list);
BufferedReader br2 = new BufferedReader(new InputStreamReader(fis2));
if(fis2 != null) {
try {
LineNumberReader lnr = new LineNumberReader(br2);
String linenumber = String.valueOf(lnr);
int i=0;
while (i!=1) {
strLine2 = br2.readLine();
wwwdf2.append(strLine2);
wwwdf2.append("\n");
if (wwwdf2.indexOf("itisdonecomplet") >= 0){
i++;
}
}
// Toast.makeText(getApplicationContext(), strLine2, Toast.LENGTH_LONG).show();
Toast.makeText(getApplicationContext(), wwwdf2, Toast.LENGTH_LONG).show();
} catch (IOException e) {
e.printStackTrace();
}
}
}
I'm a beginner at Java and still learning, so please excuse my question if it sounds stupid.
I've been stuck on a straightforward problem I was given:
I'm supposed to read a text file and store the values of the text file in different variables. My text file looks like:
foo.txt
Directory_path=C:\University
school_name=SyracuseUni
I want to store the directory path and school_name in new variables, say
var_one = C:\University
and var_two = SyracuseUni
I was able to split it, but only into a single string.
public static void main(String[] args) throws IOException {
try {
BufferedReader br = new BufferedReader(new FileReader("C:\\foo.txt"));
String strLine = null;
String var_one = null;
String var_two = null;
while ((strLine = br.readLine()) != null) {
String[] parts = strLine.split("=");
String parameter = parts[1];
System.out.println(parameter);
}
}
catch (IOException e) {
e.printStackTrace();
}
}
This gives me output like this, which isn't how I want it:
C:\University
SyracuseUni
I will appreciate it if anyone can guide me towards the right approach. Thanks, all.
There is already a simple way to deal with such files using the java.util.Properties class. This could be overkill if you are simply trying to learn how to read a file.
public static void main(String[] args) {
String myVar1 = null;
String myVar2 = null;
Properties prop = new Properties();
try (InputStream input = new FileInputStream("pathToYourFile")) {
prop.load(input);
myVar1 = prop.getProperty("Directory_path");
myVar2 = prop.getProperty("school_name");
} catch (IOException ex) {
//Handle exception
}
}
Something simple would be using Java Properties. You could also store the values in a map. If you really insisted on filling two separate variables, you could count how many lines you have gone through in your while loop, or check the key itself, and use switch/case to determine which variable to fill (see the sketch after the map example below).
public static void main(String[] args) throws IOException {
try {
BufferedReader br = new BufferedReader(new FileReader("C:\\foo.txt"));
String strLine = null;
HashMap<String, String> map = new HashMap<String, String>();
while ((strLine = br.readLine()) != null) {
String[] parts = strLine.split("=");
map.put(parts[0], parts[1]);
}
for (Entry<String, String> entry : map.entrySet()) {
String key = entry.getKey();
String value = entry.getValue();
System.out.println(key + " = " + value);
}
}
catch (IOException e) {
e.printStackTrace();
}
}
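And if you do want the two separate variables (var_one and var_two from the question), a minimal sketch of that route, keyed on the names in foo.txt rather than on a line counter, could look like this:

public static void main(String[] args) {
    String var_one = null; // Directory_path
    String var_two = null; // school_name
    try (BufferedReader br = new BufferedReader(new FileReader("C:\\foo.txt"))) {
        String strLine;
        while ((strLine = br.readLine()) != null) {
            String[] parts = strLine.split("=", 2);    // limit 2 keeps any '=' inside the value
            if (parts.length < 2) {
                continue;                              // skip blank or malformed lines
            }
            switch (parts[0]) {
                case "Directory_path":
                    var_one = parts[1];
                    break;
                case "school_name":
                    var_two = parts[1];
                    break;
                default:
                    break;                             // ignore unknown keys
            }
        }
        System.out.println(var_one);
        System.out.println(var_two);
    } catch (IOException e) {
        e.printStackTrace();
    }
}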
I wrote a program that generates random numbers into two text files and random letters into a third, according to two constants. Now I need to read from each text file, line by line, and put the lines together. The problem is that the suggestion found here doesn't really help my situation: when I try that approach it just reads all the lines until it's done, without allowing me the option to pause, go to a different file, etc.
Ideally I would like to find some way to read just the next line, and then later go to the line after that. Maybe some kind of variable to hold my place in the reading, or something.
public static void mergeProductCodesToFile(String prefixFile,
String inlineFile,
String suffixFile,
String productFile) throws IOException
{
try (BufferedReader br = new BufferedReader(new FileReader(prefixFile)))
{
String line;
while ((line = br.readLine()) != null)
{
try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(productFile, true))))
{
out.print(line); //This will print the next digit to the right
}
catch (FileNotFoundException e)
{
System.err.println("File error: " + e.getMessage());
}
}
}
}
EDIT: The digits are being created as shown below. Basically, constants tell it how many digits to create in each line and how many lines to create. Now I need to combine these together without deleting anything from either text file.
public static void writeRandomCodesToFile(String codeFile,
char fromChar, char toChar,
int numberOfCharactersPerCode,
int numberOfCodesToGenerate) throws IOException
{
for (int i = 1; i <= PRODUCT_COUNT; i++)
{
int I = 0;
if (codeFile.equals("inline.txt"))
{
for (I = 1; I <= CHARACTERS_PER_CODE; I++)
{
int digit = (int)(Math.random() * 10);
try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(codeFile, true))))
{
out.print(digit); //This will print the next digit to the right
}
catch (FileNotFoundException e)
{
System.err.println("File error: " + e.getMessage());
System.exit(1);
}
}
}
if (codeFile.equals("prefix.txt") || codeFile.equals("suffix.txt"))
{
for (I = 1; I <= CHARACTERS_PER_CODE; I++)
{
Random r = new Random();
char digit = (char)(r.nextInt(26) + 'a');
digit = Character.toUpperCase(digit);
try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(codeFile, true))))
{
out.print(digit);
}
catch (FileNotFoundException e)
{
System.err.println("File error: " + e.getMessage());
System.exit(1);
}
}
}
//This will take the text file to the next line
if (I >= CHARACTERS_PER_CODE)
{
{
Random r = new Random();
char digit = (char)(r.nextInt(26) + 'a');
try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(codeFile, true))))
{
out.println(""); //This will return a new line for the next loop
}
catch (FileNotFoundException e)
{
System.err.println("File error: " + e.getMessage());
System.exit(1);
}
}
}
}
System.out.println(codeFile + " was successfully created.");
}// end writeRandomCodesToFile()
Staying respectful of your code, it would be something like this:
public static void mergeProductCodesToFile(String prefixFile, String inlineFile, String suffixFile, String productFile) throws IOException {
try (BufferedReader prefixReader = new BufferedReader(new FileReader(prefixFile));
BufferedReader inlineReader = new BufferedReader(new FileReader(inlineFile));
BufferedReader suffixReader = new BufferedReader(new FileReader(suffixFile))) {
StringBuilder line = new StringBuilder();
String prefix, inline, suffix;
while ((prefix = prefixReader.readLine()) != null) {
//assuming that nothing fails and the files are equal in # of lines.
inline = inlineReader.readLine();
suffix = suffixReader.readLine();
line.append(prefix).append(inline).append(suffix).append("\r\n");
// write it
...
}
} finally {/*close writers*/}
}
Some exceptions may be thrown.
I hope you don't implement it all in one single method.
You can make use of iterators too, or a very simple reader class (method).
I wouldn't use a List to load the data unless I could guarantee that the files are small and that I can spare the memory.
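For illustration, a hypothetical minimal reader class along those lines, here called LineSource (the name is made up), which hands back one line at a time and remembers its own position, so the caller never has to load whole files:

import java.io.BufferedReader;
import java.io.Closeable;
import java.io.FileReader;
import java.io.IOException;

class LineSource implements Closeable {
    private final BufferedReader reader;

    LineSource(String fileName) throws IOException {
        this.reader = new BufferedReader(new FileReader(fileName));
    }

    // Returns the next line, or null when the file is exhausted.
    String nextLine() throws IOException {
        return reader.readLine();
    }

    @Override
    public void close() throws IOException {
        reader.close();
    }
}

It could then be used to interleave the three inputs without holding them in memory:

try (LineSource prefixes = new LineSource(prefixFile);
     LineSource inlines = new LineSource(inlineFile);
     LineSource suffixes = new LineSource(suffixFile)) {
    String prefix;
    while ((prefix = prefixes.nextLine()) != null) {
        // assumes the three files have the same number of lines, as above
        String code = prefix + inlines.nextLine() + suffixes.nextLine();
        // append code plus a line separator to productFile here
    }
}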
My approach, as we discussed, stores the data and then interleaves it. Like Sergio said in his answer, make sure memory isn't a problem in terms of the size of the files and how much memory the data structures will use.
//the main method we're working on
public static void mergeProductCodesToFile(String prefixFile,
String inlineFile,
String suffixFile,
String productFile) throws IOException
{
try {
List<String> prefix = read(prefixFile);
List<String> inline = read(inlineFile);
List<String> suffix = read(suffixFile);
String fileText = interleave(prefix, inline, suffix);
//write the single string to file however you want
} catch (...) {...}//do your error handling...
}
//helper methods and some static variables
private static Scanner reader;//I just prefer scanner. Use whatever you want.
private static StringBuilder sb;
private static List<String> read(String filename) throws IOException
{
List<String> list = new ArrayList<String>();
try (Scanner reader = new Scanner(new File(filename)))
{
while(reader.hasNext())
{ list.add(reader.nextLine()); }
} catch (...) {...}//catch errors...
    return list;
}
//I'm going to build the whole file in one string, but you could also have this method return one line at a time (something like an iterator) and output it to the file to avoid creating the massive string
private static String interleave(List<String> one, List<String> two, List<String> three)
{
sb = new StringBuilder();
for (int i = 0; i < one.size(); i++)//notice no checking on size equality of words or the lists. you might want this
{
sb.append(one.get(i)).append(two.get(i)).append(three.get(i)).append("\n");
}
return sb.toString();
}
Obviously there is still something to be desired in terms of memory and performance, and there are ways to make this slightly more extensible to other situations, but it's a good starting point. With C#, I could more easily use an iterator to make interleave yield one line at a time, potentially saving memory. Just a different idea!
Sample data in the CSV file:
##Troubleshooting DHCP Configuration
#Module 3: Point-to-Point Protocol (PPP)
##Configuring HDLC Encapsulation
Hardware is HD64570
So I want to get the lines as
#Troubleshooting DHCP Configuration
Module 3: Point-to-Point Protocol (PPP)
#Configuring HDLC Encapsulation
Hardware is HD64570
I have written sample code:
public class ReadCSV {
public static BufferedReader br = null;
public static void main(String[] args) {
ReadCSV obj = new ReadCSV();
obj.run();
}
public void run() {
String sCurrentLine;
try {
br = new BufferedReader(new FileReader("D:\\compare\\Genre_Subgenre.csv"));
try {
while ((sCurrentLine = br.readLine()) != null) {
if(sCurrentLine.charAt(0) == '#'){
System.out.println(sCurrentLine);
}
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
I am getting the below error:
##Troubleshooting DHCP Configuration
#Module 3: Point-to-Point Protocol (PPP)
##Configuring HDLC Encapsulation
Exception in thread "main" java.lang.StringIndexOutOfBoundsException: String index out of range: 0
at java.lang.String.charAt(Unknown Source)
at example.ReadCSV.main(ReadCSV.java:19)
Please suggest how I can do this.
Steps:
Read the CSV file line by line
Use line.replaceFirst("#", "") to remove the first # from each line
Write the modified lines to an output stream (file or String), whichever suits you (see the sketch below)
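A minimal sketch of those steps, reading the D:\compare\Genre_Subgenre.csv from the question and writing to a separate output file (the output name is just an example):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class RemoveFirstHash {
    public static void main(String[] args) {
        try (BufferedReader br = new BufferedReader(new FileReader("D:\\compare\\Genre_Subgenre.csv"));
             BufferedWriter bw = new BufferedWriter(new FileWriter("D:\\compare\\Genre_Subgenre_out.csv"))) {
            String line;
            while ((line = br.readLine()) != null) {               // 1. read line by line
                bw.write(line.replaceFirst("#", ""));              // 2. drop the first '#', if any
                bw.newLine();                                      // 3. write the modified line out
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Note that, unlike charAt(0), replaceFirst is safe on the empty lines that caused the StringIndexOutOfBoundsException.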
If the variable s contains the content of the CSV file as String
s = s.replace("##", "#");
will replace all the occurrences of "##" with "#".
You need something like String line=buffer.readLine()
Check the first character of the line with line.charAt(0)=='#'
Get the new String with String newLine=line.substring(1)
This is a rather trivial question. Rather than do the work for you, I'll outline the steps that you need to take without gifting you the answer.
Read in a file line by line
Take the first line and check if the first character of this line is a # - If it is, create a substring of this line excluding the first character ( or use fileLine.replaceFirst("#", ""); )
Store this line somewhere in an array like data structure or simply replace the current variable with the edited one ( fileLine = fileLine.replaceFirst("#", ""); )
Repeat until no more lines left from file.
If you want to apply these changes to the file, simply overwrite the old file with the new lines (e.g. using a stream writer with its second parameter set to false will overwrite).
Make an attempt and show us what you have tried, people will be more likely to help if they believe you have attempted the problem yourself thoroughly first.
package stackoverflow.q_25054783;
import java.util.Arrays;
public class RemoveHash {
public static void main(String[] args) {
String [] strArray = new String [3];
strArray[0] = "##Troubleshooting DHCP Configuration";
strArray[1] = "#Module 3: Point-to-Point Protocol (PPP)";
strArray[2] = "##Configuring HDLC Encapsulation";
System.out.println("Original array: " + Arrays.toString(strArray));
for (int i = 0; i < strArray.length; i++) {
strArray[i] = strArray[i].replaceFirst("#", "");
}
System.out.println("Updated array: " + Arrays.toString(strArray));
}
}
//Output:
//Original array: [##Troubleshooting DHCP Configuration, #Module 3: Point-to-Point Protocol (PPP), ##Configuring HDLC Encapsulation]
//Updated array: [#Troubleshooting DHCP Configuration, Module 3: Point-to-Point Protocol (PPP), #Configuring HDLC Encapsulation]
OpenCSV reads the CSV file line by line and gives you an array of strings, where each string is one comma-separated value, right? Thus, you are operating on a string.
You want to remove the '#' symbol from the beginning of the string (if it is there). Correct?
Then this should do it:
CSVReader reader = new CSVReader(new FileReader("yourfile.csv"));
String [] nextLine;
while ((nextLine = reader.readNext()) != null) {
if (nextLine[0].charAt(0) == '#') {
nextLine[0] = nextLine[0].substring(1, nextLine[0].length());
}
}
Replacing the first '#' symbol on each of the lines in the CSV file:
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class StripLeadingHash {

    private static List<String> getFileContentWithoutLeadingHash(File f) {
        try (BufferedReader input = new BufferedReader(
                new InputStreamReader(new FileInputStream(f), StandardCharsets.UTF_8))) {
            List<String> lines = new ArrayList<String>();
            for (String line = input.readLine(); line != null; line = input.readLine()) {
                // only strip a leading '#'; lines such as "Hardware is HD64570" stay untouched
                lines.add(line.startsWith("#") ? line.substring(1) : line);
            }
            return lines;
        } catch (IOException e) {
            e.printStackTrace();
            System.exit(1);
            return null;
        }
    }

    private static void writeFile(List<String> lines, File f) {
        try (BufferedWriter bw = new BufferedWriter(
                new OutputStreamWriter(new FileOutputStream(f), StandardCharsets.UTF_8))) {
            for (String line : lines) {
                bw.write(line);
                bw.newLine();
            }
            bw.flush();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        File f = new File("file/path");
        List<String> lines = getFileContentWithoutLeadingHash(f);
        f.delete();
        writeFile(lines, f);
    }
}