Writing back to a csv with Java using super csv - java

I've been working on this code for quite some time and just want a simple heads up if I'm heading down a dead end. The point I'm at now is to match identical cells from different .csv files and copy one row into another csv file. The question really is: would it be possible to write at specific lines? For example, if the 2 cells match at row 50, I wish to write back onto row 50. I'm assuming that I would maybe extract everything to a HashMap, write it in there, then write back to the .csv file? Is there an easier way?
For example, I have one csv that has person details, and the other has property details of where the actual person lives. I wish to copy the property details into the person csv, as well as match them up with the correct person details. Hope this makes sense.
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;
import org.supercsv.io.CsvListWriter;
import org.supercsv.io.CsvMapReader;
import org.supercsv.io.ICsvListWriter;
import org.supercsv.prefs.CsvPreference;

public class Old {
    public static void main(String[] args) throws IOException {
        List<String[]> cols;
        List<String[]> cols1;
        CsvMapReader reader = new CsvMapReader(new FileReader("file1.csv"), CsvPreference.EXCEL_PREFERENCE);
        try {
            cols = readFile("file1.csv");
            cols1 = readFile("file2.csv"); // was misspelled "fiel2.csv"
            String[] headers = reader.getCSVHeader(true);
            headers = header(cols1, headers); // was missing the closing ");"
        } catch (IOException e) {
            e.printStackTrace();
            return;
        } finally {
            reader.close();
        }
        // compare the first cell of each row in file1 against each row in file2
        for (int j = 1; j < cols.size(); j++) {
            for (int i = 1; i < cols1.size(); i++) {
                if (cols.get(j)[0].equals(cols1.get(i)[0])) {
                    // matching rows found; the merged row would be written back here
                }
            }
        }
    }

    private static List<String[]> readFile(String fileName) throws IOException {
        List<String[]> values = new ArrayList<String[]>();
        Scanner s = new Scanner(new File(fileName));
        while (s.hasNextLine()) {
            values.add(s.nextLine().split(","));
        }
        s.close();
        return values;
    }

    public static void csvWriter(String fileName, String[] nameMapping) throws FileNotFoundException {
        ICsvListWriter writer = new CsvListWriter(new PrintWriter(fileName), CsvPreference.STANDARD_PREFERENCE);
        try {
            writer.writeHeader(nameMapping);
            writer.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static String[] header(List<String[]> cols1, String[] headers) {
        List<String> list = new ArrayList<String>();
        for (String h : headers) {
            list.add(h);
        }
        // append every header cell from the second file's first row;
        // the original loop tested equals(null), which can never be true
        for (String cell : cols1.get(0)) {
            list.add(cell);
        }
        return list.toArray(new String[list.size()]);
    }
}

Just be careful if you read all of the addresses and person details into memory first (as Thomas has suggested) - if you're only dealing with small CSV files then it's fine, but you may run out of memory if you're dealing with larger files.
As an alternative, I've put together an example that reads the addresses in first, then writes the combined person/address details while it reads in the person details.
Just a few things to note:
I've used CsvMapReader and CsvMapWriter because you were - this meant I've had to use a Map containing a Map for storing the addresses. Using CsvBeanReader/CsvBeanWriter would make this a bit more elegant.
The code from your question doesn't actually use Super CSV to read the CSV (you're using Scanner and String.split()). You'll run into issues if your CSV contains commas in the data (which is quite possible with addresses), so it's a lot safer to use Super CSV, which will handle escaped commas for you.
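To see why plain String.split() is unsafe here, a quoted field containing an embedded comma splits into the wrong number of columns (a standalone sketch; the address value is made up):

```java
public class SplitPitfallDemo {
    public static void main(String[] args) {
        // a legal CSV row whose quoted field contains an embedded comma
        String row = "1,\"42 Main St, Springfield\",USA";

        // naive splitting breaks the quoted field apart
        String[] parts = row.split(",");
        System.out.println(parts.length); // prints "4", not the 3 real columns
    }
}
```

A CSV-aware parser like Super CSV recognises the quotes and returns the three real fields.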
Example:
package example;
import java.io.StringReader;
import java.io.StringWriter;
import java.util.HashMap;
import java.util.Map;
import org.supercsv.io.CsvMapReader;
import org.supercsv.io.CsvMapWriter;
import org.supercsv.io.ICsvMapReader;
import org.supercsv.io.ICsvMapWriter;
import org.supercsv.prefs.CsvPreference;
public class CombiningPersonAndAddress {

    private static final String PERSON_CSV = "id,firstName,lastName\n"
        + "1,philip,fry\n2,amy,wong\n3,hubert,farnsworth";

    private static final String ADDRESS_CSV = "personId,address,country\n"
        + "1,address 1,USA\n2,address 2,UK\n3,address 3,AUS";

    private static final String[] COMBINED_HEADER = new String[] { "id",
        "firstName", "lastName", "address", "country" };

    public static void main(String[] args) throws Exception {
        ICsvMapReader personReader = null;
        ICsvMapReader addressReader = null;
        ICsvMapWriter combinedWriter = null;
        final StringWriter output = new StringWriter();
        try {
            // set up the readers/writer
            personReader = new CsvMapReader(new StringReader(PERSON_CSV),
                CsvPreference.STANDARD_PREFERENCE);
            addressReader = new CsvMapReader(new StringReader(ADDRESS_CSV),
                CsvPreference.STANDARD_PREFERENCE);
            combinedWriter = new CsvMapWriter(output,
                CsvPreference.STANDARD_PREFERENCE);

            // map of personId -> address (inner map is address details)
            final Map<String, Map<String, String>> addresses =
                new HashMap<String, Map<String, String>>();

            // read in all of the addresses
            Map<String, String> address;
            final String[] addressHeader = addressReader.getCSVHeader(true);
            while ((address = addressReader.read(addressHeader)) != null) {
                final String personId = address.get("personId");
                addresses.put(personId, address);
            }

            // write the header
            combinedWriter.writeHeader(COMBINED_HEADER);

            // read each person
            Map<String, String> person;
            final String[] personHeader = personReader.getCSVHeader(true);
            while ((person = personReader.read(personHeader)) != null) {
                // copy address details to person if they exist
                final String personId = person.get("id");
                final Map<String, String> personAddress = addresses.get(personId);
                if (personAddress != null) {
                    person.putAll(personAddress);
                }
                // write the combined details
                combinedWriter.write(person, COMBINED_HEADER);
            }
        } finally {
            personReader.close();
            addressReader.close();
            combinedWriter.close();
        }
        // print the output
        System.out.println(output);
    }
}
Output:
id,firstName,lastName,address,country
1,philip,fry,address 1,USA
2,amy,wong,address 2,UK
3,hubert,farnsworth,address 3,AUS

From your comment, it seems like you have the following situation:
File 1 contains persons
File 2 contains addresses
You then want to match persons and addresses by some key (one or more fields) and write the combination back to a CSV file.
Thus the simplest approach might be something like this:
//use a LinkedHashMap to preserve the order of the persons as found in file 1
Map<PersonKey, String[]> persons = new LinkedHashMap<>();
//fill in the persons from file 1 here
Map<PersonKey, String[]> addresses = new HashMap<>();
//fill in the addresses from file 2 here
List<String[]> outputLines = new ArrayList<>(persons.size());
for( Map.Entry<PersonKey, String[]> personEntry: persons.entrySet() ) {
String[] person = personEntry.getValue();
String[] address = addresses.get( personEntry.getKey() );
//merge the two arrays and put them into outputLines
}
//write outputLines to a file
Note that PersonKey might just be a String or a wrapper object (Integer etc.) if you can match persons and addresses by one field. If you have more fields, you might need a custom PersonKey object with equals() and hashCode() properly overridden.
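A sketch of such a multi-field PersonKey (the field names here are hypothetical, not from the question) might look like this; the essential part is that equals() and hashCode() agree, so a freshly constructed key finds the map entry stored under an equal key:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

// Hypothetical composite key: persons and addresses match on two fields.
public final class PersonKey {
    private final String lastName;
    private final String postcode;

    public PersonKey(String lastName, String postcode) {
        this.lastName = lastName;
        this.postcode = postcode;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof PersonKey)) return false;
        PersonKey other = (PersonKey) o;
        return lastName.equals(other.lastName) && postcode.equals(other.postcode);
    }

    @Override
    public int hashCode() {
        // must be consistent with equals(): equal keys yield equal hash codes
        return Objects.hash(lastName, postcode);
    }

    public static void main(String[] args) {
        Map<PersonKey, String> addresses = new HashMap<>();
        addresses.put(new PersonKey("fry", "10001"), "address 1");
        // a separately constructed but equal key finds the same entry
        System.out.println(addresses.get(new PersonKey("fry", "10001"))); // prints "address 1"
    }
}
```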

Related

Method put of hashMap overwrites values of already stored data

I use a hashMap to store data (certificate details) which is read from a file.
The key and value is stored in the hashMap but after calling the put method, ALL values have the value of the last added entry.
I guess it is also related to
hashmap.get() returning wrong values even though they are all correct in the map
but I don't see my error:
HashMap<String, String[]> certDataMap = new HashMap<String, String[]>();
String line="";
String bankName = "", validTill = "", fingerPrint = "";
File certDat = new File(certDataFile);
int cntEntries=0;
String[] data = {"dummy", "dummy"};
if (certDat.exists()) {
try {
Scanner scanner = new Scanner(certDat);
while (scanner.hasNextLine()) {
line=scanner.nextLine();
bankName=line.split("\\|")[0];
validTill=line.split("\\|")[1];
fingerPrint=line.split("\\|")[2];
logger.debug("line: {} bankName: {} validTill: {} fingerPrint: {}",line, bankName, validTill, fingerPrint);
data[0]=validTill;
data[1]=fingerPrint;
certDataMap.put(bankName, data);
debugCertMap();
cntEntries++;
}
scanner.close();
logger.debug("{} read from {}", cntEntries, certDataFile);
} catch (IOException e) {
logger.error(certDataFile,e);
}
} else
logger.error(certDataFile+" not found! New file will be created if certificates were downloaded");
The problem was the declaration of string array data outside the loop as mentioned by Jonathan:
while (scanner.hasNextLine()) {
line=scanner.nextLine();
bankName=line.split("\\|")[0];
validTill=line.split("\\|")[1];
fingerPrint=line.split("\\|")[2];
logger.debug("line: {} bankName: {} validTill: {} fingerPrint: {}",line, bankName, validTill, fingerPrint);
String[] data = {validTill, fingerPrint};
certDataMap.put(bankName, data);
debugCertMap();
cntEntries++;
An array variable is actually a reference, and you are reusing the same object data for each line. Use a new object for each entry.
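A minimal standalone demonstration of that aliasing (not the poster's code; the values are made up): both map entries end up pointing at the one shared array, so the last write wins everywhere:

```java
import java.util.HashMap;
import java.util.Map;

public class SharedArrayDemo {
    public static void main(String[] args) {
        Map<String, String[]> map = new HashMap<>();
        String[] data = {"dummy", "dummy"};

        data[0] = "2024-01-01";
        map.put("bankA", data);   // stores a reference, not a copy

        data[0] = "2025-06-30";
        map.put("bankB", data);   // same reference again

        // both entries now show the last values written
        System.out.println(map.get("bankA")[0]); // prints "2025-06-30"
    }
}
```

Allocating a fresh array inside the loop, as in the accepted fix, gives each map entry its own object.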
Yes, you use the same object String[] data = {"dummy", "dummy"};, where data is the reference to the array.
But look at your code: all of this could be done much more simply, avoiding these problems.
Create data holder class, that represents single line from the file:
public static final class Data {
    private final String bankName;
    private final String validTill;
    private final String fingerPrint;

    public Data(String[] line) {
        bankName = line[0];
        validTill = line[1];
        fingerPrint = line[2];
    }

    // getter needed by the Collectors.toMap() call below
    public String getBankName() {
        return bankName;
    }
}
And provide a method that accept Path and retrieve file content with required format:
public static Map<String, Data> read(Path path) throws IOException {
return Files.lines(path)
.map(line -> new Data(line.split("\\|")))
.collect(Collectors.toMap(Data::getBankName, Function.identity()));
}
That's all!

Printing matching information from 2 files in Java

I am trying to write a program that checks two files and prints the common contents from both the files.
Example of the file 1 content would be:
James 1
Cody 2
John 3
Example of the file 2 content would be:
1 Computer Science
2 Chemistry
3 Physics
So the final output printed on the console would be:
James Computer Science
Cody Chemistry
John Physics
Here is what I have so far in my code:
public class Filereader {
public static void main(String[] args) throws Exception {
File file = new File("file.txt");
File file2 = new File("file2.txt");
BufferedReader reader = new BufferedReader(new FileReader(file));
BufferedReader reader2 = new BufferedReader(new FileReader(file2));
String st, st2;
while ((st = reader.readLine()) != null) {
System.out.println(st);
}
while ((st2 = reader2.readLine()) != null) {
System.out.println(st2);
}
reader.close();
reader2.close();
}
}
I am having trouble in figuring out how to match the file contents, and print only the student name and their major by matching the student id in each of the file. Thanks for all the help.
You can build on the other answers and create an object for each file, like tables in a database.
public class Person{
Long id;
String name;
//getters and setters
}
public class Course{
Long id;
String name;
//getters and setters
}
Then you have more control over your columns and it is simpler to use.
Further, you would use an ArrayList<Person> and an ArrayList<Course>, and the relation can be a variable inside your objects, like a courseId field in the Person class.
if (person.getCourseId().equals(course.getId())) {
...
}
Then, if the match is on the first number in the files, use person.getId().equals(course.getId()). (With Long ids, use equals() rather than ==, since == compares references.)
P.S.: Do not use split(" ") on the whole line in your case, because some values contain spaces, e.g. 1 Computer Science.
What you want is to organize your text file data into maps, then merge their data. This will work even if your data is mixed, not in order.
public class Filereader {
public static void main(String[] args) throws Exception {
File file = new File("file.txt");
File file2 = new File("file2.txt");
BufferedReader reader = new BufferedReader(new FileReader(file));
BufferedReader reader2 = new BufferedReader(new FileReader(file2));
String st, st2;
Map<Integer, String> nameMap = new LinkedHashMap<>();
Map<Integer, String> majorMap = new LinkedHashMap<>();
while ((st = reader.readLine()) != null) {
System.out.println(st);
String[] parts = st.split(" "); // Here you got ["James", "1"]
String name = parts[0];
Integer id = Integer.parseInt(parts[1]);
nameMap.put(id, name);
}
while ((st2 = reader2.readLine()) != null) {
System.out.println(st2);
String[] parts = st2.split(" ");
String name = parts[1];
Integer id = Integer.parseInt(parts[0]);
majorMap.put(id, name);
}
reader.close();
reader2.close();
// Combine and print
nameMap.keySet().stream().forEach(id -> {
System.out.println(nameMap.get(id) + " " + majorMap.get(id));
});
}
}
You should read these files at the same time in sequence. This is easy to accomplish with a single while statement.
while ((st = reader.readLine()) != null && (st2 = reader2.readLine()) != null) {
// print both st and st2
}
The way your code is written now, it reads one file at a time, printing data to the console from each individual file. If you want to meld the results together, you have to combine the output of the files in a single loop.
Given that one file may be odd-sized while you still have numbers to correlate across, or that the numbers may come in a nonsequential order, you may want to store the results in a data structure instead, like a List, since you know the specific index of each of these values and where they should fit in.
Combining the NIO Files and Stream API, it's a little simpler:
public static void main(String[] args) throws Exception {
Map<String, List<String[]>> f1 = Files
.lines(Paths.get("file1"))
.map(line -> line.split(" "))
.collect(Collectors.groupingBy(arr -> arr[1]));
Map<String, List<String[]>> f2 = Files
.lines(Paths.get("file2"))
.map(line -> line.split(" "))
.collect(Collectors.groupingBy(arr -> arr[0]));
Stream.concat(f1.keySet().stream(), f2.keySet().stream())
.distinct()
.map(key -> f1.get(key).get(0)[0] + " " + f2.get(key).get(0)[1])
.forEach(System.out::println);
}
As can easily be noticed in the code, there are assumptions of valid data and of consistency between the two files. If this doesn't hold, you may need to first run a filter to exclude entries missing in either file:
Stream.concat(f1.keySet().stream(), f2.keySet().stream())
.filter(key -> f1.containsKey(key) && f2.containsKey(key))
.distinct()
...
If you change the order such that the number comes first in both files, you can read both files into a HashMap then create a Set of common keys. Then loop through the set of common keys and grab the associated value from each Hashmap to print:
My solution is verbose but I wrote it that way so that you can see exactly what's happening.
import java.util.Set;
import java.util.HashSet;
import java.util.Map;
import java.util.HashMap;
import java.io.File;
import java.util.Scanner;
class J {
public static Map<String, String> fileToMap(File file) throws Exception {
// TODO - Make sure the file exists before opening it
// Scans the input file
Scanner scanner = new Scanner(file);
// Create the map
Map<String, String> map = new HashMap<>();
String line;
String name;
String code;
String[] parts = new String[2];
// Scan line by line
while (scanner.hasNextLine()) {
// Get next line
line = scanner.nextLine();
// TODO - Make sure the string has at least 1 space
// Split line at the first space only (limit 2 keeps the rest together)
parts = line.split(" ", 2);
// Get the class code and string val
code = parts[0];
name = parts[1];
// Insert into map
map.put(code, name);
}
// Close input stream
scanner.close();
// Give the map back
return map;
}
public static Set<String> commonKeys(Map<String, String> nameMap,
Map<String, String> classMap) {
Set<String> commonSet = new HashSet<>();
// Get a set of keys for both maps
Set<String> nameSet = nameMap.keySet();
Set<String> classSet = classMap.keySet();
// Loop through one set
for (String key : nameSet) {
// Make sure the other set has it
if (classSet.contains(key)) {
commonSet.add(key);
}
}
return commonSet;
}
public static Map<String, String> joinByKey(Map<String, String> namesMap,
Map<String, String> classMap,
Set<String> commonKeys) {
Map<String, String> map = new HashMap<String, String>();
// Loop through common keys
for (String key : commonKeys) {
// TODO - check for nulls if get() returns nothing
// Fetch the associated value from each map
map.put(namesMap.get(key), classMap.get(key));
}
return map;
}
public static void main(String[] args) throws Exception {
// Surround in try catch
File names = new File("names.txt");
File classes = new File("classes.txt");
Map<String, String> nameMap = fileToMap(names);
Map<String, String> classMap = fileToMap(classes);
Set<String> commonKeys = commonKeys(nameMap, classMap);
Map<String, String> nameToClass = joinByKey(nameMap, classMap, commonKeys);
System.out.println(nameToClass);
}
}
names.txt
1 James
2 Cody
3 John
5 Max
classes.txt
1 Computer Science
2 Chemistry
3 Physics
4 Biology
Output:
{Cody=Chemistry, James=Computer Science, John=Physics}
Notes:
I added keys in classes.txt and names.txt that purposely did not match so you see that it does not come up in the output. That is because the key never makes it into the commonKeys set. So, they never get inserted into the joined map.
You can loop through the HashMap if you want by calling map.entrySet()
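Looping over the joined map with entrySet() looks like this (a minimal, self-contained sketch; a LinkedHashMap is used here only so the print order is predictable):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EntrySetDemo {
    public static void main(String[] args) {
        // LinkedHashMap preserves insertion order for predictable printing
        Map<String, String> nameToClass = new LinkedHashMap<>();
        nameToClass.put("James", "Computer Science");
        nameToClass.put("Cody", "Chemistry");

        // entrySet() yields each key/value pair without a second lookup
        for (Map.Entry<String, String> entry : nameToClass.entrySet()) {
            System.out.println(entry.getKey() + " " + entry.getValue());
        }
    }
}
```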

How can I unscramble a list of words using a HashMap?

I will be given two files which I need to read into my program. One file will be a list of real words, while the other will be a list of those same words out of order. I need to output the scrambled words in alphabetical order with the real words printed next to them, and I need to do this using a HashMap. My issue is that I can print out the scrambled word and one real word next to it, but in some cases there may be more than one real word for each jumbled word.
For example, my program can do this:
cta cat
stpo post
but I need it to be able to do this:
cta cat
stpo post stop
What changes do I need to make to my code to be able to have more than one dictionary word for each scrambled word? Thank you for your help. My code is below:
import java.io.*;
import java.util.*;
public class Project5
{
public static void main (String[] args) throws Exception
{
BufferedReader dictionaryList = new BufferedReader( new FileReader( args[0] ) );
BufferedReader scrambleList = new BufferedReader( new FileReader( args[1] ) );
HashMap<String, String> dWordMap = new HashMap<String, String>();
while (dictionaryList.ready())
{
String word = dictionaryList.readLine();
dWordMap.put(createKey(word), word);
}
dictionaryList.close();
ArrayList<String> scrambledList = new ArrayList<String>();
while (scrambleList.ready())
{
String scrambledWord = scrambleList.readLine();
scrambledList.add(scrambledWord);
}
scrambleList.close();
Collections.sort(scrambledList);
for (String words : scrambledList)
{
String dictionaryWord = dWordMap.get(createKey(words));
System.out.println(words + " " + dictionaryWord);
}
}
private static String createKey(String word)
{
char[] characterWord = word.toCharArray();
Arrays.sort(characterWord);
return new String(characterWord);
}
}
You need to do several changes. The biggest one is that dWordMap can't hold just one String - it needs to hold the list of words that are found in the scrambled words file.
The next change is being able to manipulate that list. I've added a sample solution which is untested but should give you a good place to start from.
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.*;

public class Projects {
    public static void main(String[] args) throws Exception {
        BufferedReader dictionaryList = new BufferedReader(new FileReader(args[0]));
        BufferedReader scrambleList = new BufferedReader(new FileReader(args[1]));
        Map<String, List<String>> dWordMap = new HashMap<>();
        while (dictionaryList.ready()) {
            String word = dictionaryList.readLine();
            // group dictionary words under their sorted-letter key
            dWordMap.computeIfAbsent(createKey(word), k -> new ArrayList<>()).add(word);
        }
        dictionaryList.close();
        List<String> scrambledWords = new ArrayList<>();
        while (scrambleList.ready()) {
            scrambledWords.add(scrambleList.readLine());
        }
        scrambleList.close();
        Collections.sort(scrambledWords);
        for (String scrambled : scrambledWords) {
            // look up all dictionary words sharing this scrambled word's key
            List<String> words = dWordMap.getOrDefault(createKey(scrambled), new ArrayList<>());
            Collections.sort(words);
            System.out.println(scrambled + " " + concatList(words, " "));
        }
    }

    private static String createKey(String word) {
        char[] characterWord = word.toCharArray();
        Arrays.sort(characterWord);
        return new String(characterWord);
    }

    private static String concatList(List<String> list, String delimiter) {
        StringJoiner joiner = new StringJoiner(delimiter);
        list.forEach(joiner::add);
        return joiner.toString();
    }
}
There are a few other changes I would have made - the first is to put the calls to dictionaryList.close(); and scrambleList.close(); in a finally part of a try...catch clause, to make sure that the resources are freed at the end no matter what happens. You can also consider using Java 8's Streams to make the code more up to date. I'll be happy to give some more tips if this doesn't fit your needs or you have any more questions. Good luck!
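A related option since Java 7 is try-with-resources, which calls close() automatically even when an exception is thrown. A minimal sketch, using a StringReader in place of the file readers so it runs without any files:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class TryWithResourcesDemo {
    public static void main(String[] args) throws IOException {
        // StringReader stands in for the FileReaders used in the answer
        try (BufferedReader dictionaryList = new BufferedReader(new StringReader("cat\npost"))) {
            System.out.println(dictionaryList.readLine()); // prints "cat"
        } // dictionaryList.close() runs automatically here, even on exceptions
    }
}
```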
If you want to record the list of dictionary words that are anagrams of each scrambled word then you will need to have a map to a list:
Map<String, List<String>> anagrams = new HashMap<>();
Then, for each scrambled word, you add a list of dictionary words to the map:
anagrams.put(scrambled, allAnagrams(scrambled));
Where allAnagrams would look like:
private List<String> allAnagrams(String scrambled) {
List<String> anagrams = new ArrayList<>();
for (String word: dictionary) {
if (isAnagram(word, scrambled))
anagrams.add(word);
}
Collections.sort(anagrams);
return anagrams;
}
Note that if you have Java 8 and are familiar with streams, then this could be:
private List<String> allAnagrams(String scrambled) {
return dictionary.stream()
.filter(word -> isAnagram(scrambled, word))
.sorted()
.collect(Collectors.toList());
}
To improve upon @sprinter's Map<String, List<String>> example:
private final Map<String, List<String>> lookup = new HashMap<>();
public List<String> getList(String word) {
//can also make #computeIfAbsent use an "initializer" for the key
return lookup.computeIfAbsent(word, k -> new ArrayList<>());
}
Then it's simple to interact with:
List<String> words = getList("tspo"); //spot, post, stop, etc...
You can do the unscrambling from there, and could go even further if you wanted to save space and find a way to index the key as a specific list of characters (so that sotp and tpos would do only one lookup).

Reading a csv file into a HashMap<String, ArrayList<Integer>>

I've been trying to make a Java program in which a tab-delimited csv file is read line by line, and the first column (which is a string) is added as a key to a hash map while the second column (an integer) is its value.
In the input file, there are duplicate keys but with different values so I was going to add the value to the existing key to form an ArrayList of values.
I can't figure out the best way of doing this and was wondering if anyone could help?
Thanks
EDIT: sorry guys, here's where I've got to with the code so far.
I should add that the first column is the value and the second column is the key.
public class WordNet {
private final HashMap<String, ArrayList<Integer>> words;
private final static String LEXICAL_UNITS_FILE = "wordnet_data/wn_s.csv";
public WordNet() throws FileNotFoundException, IOException {
words = new HashMap<>();
readLexicalUnitsFile();
}
private void readLexicalUnitsFile() throws FileNotFoundException, IOException{
BufferedReader in = new BufferedReader(new FileReader(LEXICAL_UNITS_FILE));
String line;
while ((line = in.readLine()) != null) {
String columns[] = line.split("\t");
if (!words.containsKey(columns[1])) {
words.put(columns[1], new ArrayList<>());
}
}
in.close();
}
}
You are close
String columns[] = line.split("\t");
if (!words.containsKey(columns[1])) {
words.put(columns[1], new ArrayList<>());
}
should be
String columns[] = line.split("\t");
String key = columns[0]; // enhance readability of code below
List<Integer> list = words.get(key); // try to fetch the list
if (list == null) // check if the key is defined
{ // if not
list = new ArrayList<>(); // create a new list
words.put(key,list); // and add it to the map
}
list.add(Integer.valueOf(columns[1])); // in either case, add the value to the list
In response to the OP's comment/question
... the final line just adds the integer to the list but not to the hashmap, does something need to be added after that?
After the statement
List<Integer> list = words.get(key);
there are two possibilities. If list is non-null, then it is a reference to (not a copy of) the list that is already in the map.
If list is null, then we know the map does not contain the given key. In that case we create a new empty list, set the variable list as a reference to the newly created list, and then add the list to the map for the key.
In either case, when we reach
list.add(new Integer(columns[1]));
the variable list contains a reference to an ArrayList that is already in the map, either the one that was there before, or one we just creatd and added. We just add the value to it.
I should add the first column is the value and the second column is the key.
You could replace the ArrayList declaration with a List declaration, but that is not very problematic.
Anyway, this is not tested, but the logic should be something like:
while ((line = in.readLine()) != null) {
String columns[] = line.split("\t");
ArrayList<Integer> valueForCurrentLine = words.get(columns[1]);
// you instantiate and put the arrayList once
if (valueForCurrentLine==null){
valueForCurrentLine = new ArrayList<Integer>();
words.put(columns[1],valueForCurrentLine);
}
valueForCurrentLine.add(Integer.parseInt(columns[0])); // first column is the value
}
Upvote to Jim Garrison's answer above. Here's a little more... (Yes, you should check/mark his answer as the one that solved it)
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class WordNet {
private final Map<String, List<Integer>> words;
private final static String LEXICAL_UNITS_FILE = "src/net/bwillard/practice/code/wn_s.csv";
/**
*
* @throws FileNotFoundException
* @throws IOException
*/
public WordNet() throws FileNotFoundException, IOException {
words = new HashMap<>();
readLexicalUnitsFile();
}
/**
*
* @throws FileNotFoundException
* @throws IOException
*/
private void readLexicalUnitsFile() throws FileNotFoundException, IOException {
BufferedReader in = new BufferedReader(new FileReader(LEXICAL_UNITS_FILE));
String line;
while ((line = in.readLine()) != null) {
String columns[] = line.split("\t");
String key = columns[0];
int valueInt;
List<Integer> valueList;
try {
valueInt = Integer.parseInt(columns[1]);
} catch (NumberFormatException e) {
System.out.println(e);
continue;
}
if (words.containsKey(key)) {
valueList = words.get(key);
} else {
valueList = new ArrayList<>();
words.put(key, valueList);
}
valueList.add(valueInt);
}
in.close();
}
//You can test this file by running it as a standalone app....
public static void main(String[] args) {
try {
WordNet wn = new WordNet();
for (String k : wn.words.keySet()) {
System.out.println(k + " " + wn.words.get(k));
}
} catch (IOException e) {
e.printStackTrace();
}
}
}

Reading from file and splitting the data in Java

I'm trying to read data from a .txt file. The format looks like this:
ABC, John, 123
DEF, Mark, 456
GHI, Mary, 789
I am trying to get rid of the commas and put the data into an array or structure (structure most likely).
This is the code I used to to extract each item:
package prerequisiteChecker;
import java.util.*;
import java.io.*;
public class TestUnit {
public static void main(String[]args){
try {
FileInputStream fstream = new FileInputStream("courses.txt");
DataInputStream in = new DataInputStream(fstream);
BufferedReader br = new BufferedReader(new InputStreamReader(in));
String strLine;
while ((strLine = br.readLine()) != null) {
String[] splitOut = strLine.split(", ");
for (String token : splitOut)
System.out.println(token);
}
in.close();
} catch (Exception e){
System.err.println("Error: " + e.getMessage());
}
}
}
At one point I had a print line in the "while" loop to see if the items would be split. They were. Now I'm just at a loss on what to do next. I'm trying to place each grouping into one structure. For example: ID - ABC. First Name - John. Room - 123.
I have a few books on Java at home and tried looking around the web. There is so much out there, and none of it seemed to lead me in the right direction.
Thanks.
Michael
create a class that looks something like this:
class structure {
public String data1;
public String data2;
public String data3;
}
This will form the basic data structure that you can use to hold the kind of data you mentioned in your question. Now, you might want to follow proper object-oriented practice, like declaring all your fields as private and writing getters and setters. You can find more on that here: http://java.dzone.com/articles/getter-setter-use-or-not-use-0
Now, just outside your while loop, create an ArrayList like this: ArrayList<structure> list = new ArrayList<structure>(); This will be used to hold all the different rows of data that you will parse.
Now, in your while loop do something like this:
structure item = new structure();//create a new instance for each row in the text file.
item.data1 = splitOut[0];
item.data2 = splitOut[1];
item.data3 = splitOut[2];
list.add(item);
This will take the data parsed from each row and put it in a new instance of the data structure you declared, one instance per row. Finally, each item is inserted into the ArrayList using list.add(item) as shown in the code above.
I would create a nice structure to store your information. I'm not sure if how you want to access the data, but here's a nice example. I'll go off of what you previously put. Please note that I only made the variables public because they're final. They cannot change once you make the Course. If you want the course mutable, create getters and setters and change the instance variables to private. After, you can use the list to retrieve any course you'd like.
package prerequisiteChecker;
import java.util.*;
import java.io.*;
public class TestUnit {
public static void main(String[] args) {
try {
FileInputStream fstream = new FileInputStream("courses.txt");
// use DataInputStream to read binary NOT text
// DataInputStream in = new DataInputStream(fstream);
BufferedReader br = new BufferedReader(new InputStreamReader(fstream));
String strLine;
List<Course> courses = new LinkedList<Course>();
while ((strLine = br.readLine()) != null) {
String[] splitOut = strLine.split(", ");
if (splitOut.length == 3) {
courses.add(new Course(splitOut[0], splitOut[1],
splitOut[2]));
} else {
System.out.println("Invalid class: " + strLine);
}
}
br.close();
} catch (Exception e) {
System.err.println("Error: " + e.getMessage());
}
}
public static class Course {
public final String _id;
public final String _name;
public final String _room;
public Course(String id, String name, String room) {
_id = id;
_name = name;
_room = room;
}
}
}
public class File_ReaderWriter {
private static class Structure{
public String data;
}
public static void main(String[] args) throws IOException{
String allDataString;
FileInputStream fileReader = new FileInputStream ("read_data_file.txt");
DataInputStream in = new DataInputStream(fileReader);
BufferedReader bufferReader = new BufferedReader(new InputStreamReader(in));
String[] arrayString = {"ID - ", " NAME - ", " ROOM - "};
int recordNumber = 0;
Structure[] structure = new Structure[10];
for (int i = 0; i < 10; i++)
structure[i] = new Structure();
while((allDataString = bufferReader.readLine()) != null){
String[] splitOut = allDataString.split(", ");
structure[recordNumber].data = "";
for (int i = 0; i < arrayString.length; i++){
structure[recordNumber].data += arrayString[i] + splitOut[i];
}
recordNumber++;
}
bufferReader.close();
for (int i = 0; i < recordNumber; i++){
System.out.println(structure[i].data);
}
}
}
I modified your code above. It works; try it, and if you have any questions then ask.
