This is some code that I found to help with reading in a 2D array, but the problem I am having is that it will only work when reading a list of numbers structured like:
73
56
30
75
80
etc.
What I want is to be able to read multiple lines that are structured like this:
1,0,1,1,0,1,0,1,0,1
1,0,0,1,0,0,0,1,0,1
1,1,0,1,0,1,0,1,1,1
Essentially, I want to import each line as a row of the array, with the rows laid out in the text file the same way they are in the array.
Everything I have read says to use scan.useDelimiter(","); but everywhere I try to use it, the program goes straight to the catch block that prints "Error converting number". If anyone can help I would greatly appreciate it. I also saw some information about using split with the BufferedReader, but I don't know which would be better to use, why, or how.
String filename = "res/test.txt"; // Finds the file you want to test.
try {
    FileReader ConnectionToFile = new FileReader(filename);
    BufferedReader read = new BufferedReader(ConnectionToFile);
    Scanner scan = new Scanner(read);
    int[][] Spaces = new int[10][10];
    int counter = 0;
    try {
        while (scan.hasNext() && counter < 10)
        {
            for (int i = 0; i < 10; i++)
            {
                counter = counter + 1;
                for (int m = 0; m < 10; m++)
                {
                    Spaces[i][m] = scan.nextInt();
                }
            }
        }
        for (int i = 0; i < 10; i++)
        {
            // Prints out the arrays to the console (not needed in final)
            System.out.println("Array" + (i + 1) + " is: " + Spaces[i][0] + ", " + Spaces[i][1] + ", " + Spaces[i][2] + ", " + Spaces[i][3] + ", " + Spaces[i][4] + ", " + Spaces[i][5] + ", " + Spaces[i][6] + ", " + Spaces[i][7] + ", " + Spaces[i][8] + ", " + Spaces[i][9]);
        }
    }
    catch (InputMismatchException e)
    {
        System.out.println("Error converting number");
    }
    scan.close();
    read.close();
}
catch (IOException e)
{
    System.out.println("IO-Error open/close of file " + filename);
}
}
Here is my code:
public static int[][] readArray(String path) throws IOException {
    // 1,0,1,1,0,1,0,1,0,1
    int[][] result = new int[3][10];
    BufferedReader reader = new BufferedReader(new FileReader(path));
    String line = null;
    Scanner scanner = null;
    line = reader.readLine();
    if (line == null) {
        return result;
    }
    String pattern = createPattern(line);
    int lineNumber = 0;
    MatchResult temp = null;
    while (line != null) {
        scanner = new Scanner(line);
        scanner.findInLine(pattern);
        temp = scanner.match();
        int count = temp.groupCount();
        // Each capture group holds one number of the current line.
        for (int i = 1; i <= count; i++) {
            result[lineNumber][i - 1] = Integer.parseInt(temp.group(i));
        }
        lineNumber++;
        scanner.close();
        line = reader.readLine();
    }
    return result;
}

// Builds a regex like (\d+),(\d+),... that mirrors the structure of the first line.
public static String createPattern(String line) {
    char[] chars = line.toCharArray();
    StringBuilder pattern = new StringBuilder();
    for (char c : chars) {
        if (',' == c) {
            pattern.append(',');
        } else {
            pattern.append("(\\d+)");
        }
    }
    return pattern.toString();
}
The following code snippet might be helpful. The basic idea is to read each line and parse out the CSV. Please be advised that CSV parsing is generally hard and usually requires a specialized library (such as CSVReader). However, the issue at hand is relatively straightforward.
try {
    String line = "";
    int rowNumber = 0;
    while (scan.hasNextLine()) {
        line = scan.nextLine();
        String[] elements = line.split(",");   // split takes a String (regex), not a char
        int elementCount = 0;
        for (String element : elements) {
            int elementValue = Integer.parseInt(element);
            spaces[rowNumber][elementCount] = elementValue;
            elementCount++;
        }
        rowNumber++;
    }
} // you know what goes afterwards
Since the file is read line by line, parse each line using "," as the delimiter.
So here you just create a new Scanner object for each line and pass it "," as the delimiter.
The code looks like this, in the first for loop:
for (int i = 0; i < 10; i++)
{
    Scanner newScan = new Scanner(scan.nextLine()).useDelimiter(",");
    counter = counter + 1;
    for (int m = 0; m < 10; m++)
    {
        Spaces[i][m] = newScan.nextInt();
    }
}
Use the useDelimiter method in Scanner to set the delimiter to "," instead of the default whitespace.
As per the sample input given, each row of the 2D array begins on a new line, so instead of using just ",", multiple delimiters have to be specified.
Example:
scan.useDelimiter(",|\\r\\n");
This sets the delimiter to both "," and carriage return + new line characters.
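For example, a rough sketch of the question's read loop with that combined delimiter (only an illustration; it assumes the same scan and Spaces variables from the question, and uses \r?\n so plain Unix line endings also match):

scan.useDelimiter(",|\\r?\\n");              // commas and line breaks both separate tokens
for (int i = 0; i < 10 && scan.hasNextInt(); i++) {
    for (int m = 0; m < 10 && scan.hasNextInt(); m++) {
        Spaces[i][m] = scan.nextInt();       // each token is one cell of the 10x10 grid
    }
}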
Why use a scanner for a file? You already have a BufferedReader:
FileReader fileReader = new FileReader(filename);
BufferedReader reader = new BufferedReader(fileReader);
Now you can read the file line by line. The tricky bit is that you want an array of int:
int[][] spaces = new int[10][10];
String line = null;
int row = 0;
while ((line = reader.readLine()) != null)
{
    String[] array = line.split(",");
    for (int i = 0; i < array.length; i++)
    {
        spaces[row][i] = Integer.parseInt(array[i]);
    }
    row++;
}
The other approach is using a Scanner for the individual lines:
while ((line = reader.readLine()) != null)
{
    Scanner s = new Scanner(line).useDelimiter(",");  // useDelimiter takes a String pattern, not a char
    int col = 0;
    while (s.hasNextInt())
    {
        spaces[row][col] = s.nextInt();
        col++;
    }
    row++;
}
The other thing worth noting is that you're using an int[10][10]; this requires you to know the length of the file in advance. A List<int[]> would remove this requirement.
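For illustration, a minimal sketch of that List<int[]> approach, assuming a BufferedReader named reader over the same comma-separated file (imports: java.util.List, java.util.ArrayList):

List<int[]> rows = new ArrayList<>();        // grows with the file, no fixed 10x10 assumption
String line;
while ((line = reader.readLine()) != null) {
    String[] parts = line.split(",");
    int[] row = new int[parts.length];
    for (int i = 0; i < parts.length; i++) {
        row[i] = Integer.parseInt(parts[i].trim());
    }
    rows.add(row);
}
int[][] spaces = rows.toArray(new int[0][]); // convert back to int[][] if you still need one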
I am trying to find the top k words in a "data" text file, but I cannot remove the stopwords contained in "stop.txt". Should I add the stopwords manually one by one, or is there a way to read the stop.txt file and remove those words from data.txt?
try {
    System.out.println("Enter value of 'k' words:: ");
    Scanner in = new Scanner(System.in);
    int n = in.nextInt();
    w = new String[n];
    r = new int[n];
    Set<String> stopWords = new LinkedHashSet<String>();
    BufferedReader SW = new BufferedReader(new FileReader("stop.txt"));
    for (String line; (line = SW.readLine()) != null;)
        stopWords.add(line.trim());
    SW.close();
    FileReader fr = new FileReader("data.txt");
    BufferedReader br = new BufferedReader(fr);
    String text = "";
    String sz = null;
    while ((sz = br.readLine()) != null) {
        text = text.concat(sz);
    }
    String[] words = text.split(" ");
    String[] uniqueLabels;
    int count = 0;
    uniqueLabels = getUniqLabels(words);
    for (int j = 0; j < n; j++) {
        r[j] = 0;
    }
    for (String l : uniqueLabels)
    {
        if ("".equals(l) || null == l)
        {
            break;
        }
        for (String s : words)
        {
            if (l.equals(s))
            {
                count++;
            }
        }
        for (int i = 0; i < n; i++) {
            if (count > r[i]) {
                r[i] = count;
                w[i] = l;
                break;
            }
        }
        count = 0;
    }
    display(n);
} catch (Exception e) {
    System.err.println("ERR " + e.getMessage());
}
Read the file contents with:
List<String> stopwords = Files.readAllLines(Paths.get("english_stopwords.txt"));
Then use this for removing stop words:
ArrayList<String> allWords =
    Stream.of(original.toLowerCase().split(" "))
          .collect(Collectors.toCollection(ArrayList<String>::new));
allWords.removeAll(stopwords);
String result = allWords.stream().collect(Collectors.joining(" "));
Removing Stopwords from a String in Java
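Put together for the files in this question, a sketch might look like the following (assumptions: stop.txt has one stopword per line, the words in data.txt are whitespace-separated, and a stream filter is used instead of removeAll):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class RemoveStopwords {
    public static void main(String[] args) throws IOException {
        // Read the stopword list, one word per line.
        List<String> stopwords = Files.readAllLines(Paths.get("stop.txt"));

        // Read the whole data file into one string.
        String original = String.join(" ", Files.readAllLines(Paths.get("data.txt")));

        // Split into words, drop the stopwords, and join the rest back together.
        String result = Stream.of(original.toLowerCase().split("\\s+"))
                .filter(w -> !stopwords.contains(w))
                .collect(Collectors.joining(" "));

        System.out.println(result);
    }
}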
I have a text file containing thousands of Strings (3,272) and I want to put each one into a slot of an array so that I can sort them. I have the sorting part done; I just need help putting each line into the array. This is what I have tried, but it only prints the last item from the text file.
public static void main(String[] args) throws IOException
{
    FileReader fileText = new FileReader("test.txt");
    BufferedReader scan = new BufferedReader(fileText);
    String line;
    String[] word = new String[3272];
    Comparator<String> com = new ComImpl();
    while ((line = scan.readLine()) != null)
    {
        for (int i = 0; i < word.length; i++)
        {
            word[i] = line;
        }
    }
    Arrays.parallelSort(word, com);
    for (String i : word)
    {
        System.out.println(i);
    }
}
Each time you read a line, you assign it to all of the elements of word. This is why word only ends up with the last line of the file.
Replace the while loop with the following code.
int next = 0;
while ((line = scan.readLine()) != null) word[next++] = line;
Try this.
Files.readAllLines(Paths.get("test.txt"))
     .parallelStream()
     .sorted(new ComImpl())
     .forEachOrdered(System.out::println);   // forEachOrdered keeps the sorted order on a parallel stream
For the given text file (text.txt), compute how many times each word appears in the file. The output of the program should be another text file containing, on each line, a word followed by the number of times it appears in the original file. After you finish, change the program so that the words in the output file are sorted alphabetically. Do not use maps; use only basic arrays. The problem is that my program only displays the one word that I enter from the keyboard, but how can I display counts for all the words in the text file, not just one? Thanks.
package worddata;

import java.io.IOException;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.*;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;

class WordData {

    public FileReader fr = null;
    public BufferedReader br = null;
    public String[] stringArray;
    public int counLine = 0;
    public int arrayLength;
    public String s = "";
    public String stringLine = "";
    public String filename = "";
    public String wordname = "";

    public WordData() {
        try {
            Scanner scan = new Scanner(System.in);
            System.out.println("Please enter the filename: ");
            filename = scan.nextLine();
            Scanner scan2 = new Scanner(System.in);
            System.out.println("Please enter a word: ");
            wordname = scan.nextLine();
            fr = new FileReader(filename);
            br = new BufferedReader(fr);
            while ((s = br.readLine()) != null) {
                stringLine = stringLine + s;
                //System.out.println(s);
                stringLine = stringLine + " ";
                counLine++;
            }
            stringArray = stringLine.split(" ");
            arrayLength = stringArray.length;
            for (int i = 0; i < arrayLength; i++) {
                int c = 1;
                for (int j = i + 1; j < arrayLength; j++) {
                    if (stringArray[i].equalsIgnoreCase(stringArray[j])) {
                        c++;
                        for (int j2 = j; j2 < arrayLength; j2++) {
                            stringArray[j2] = stringArray[j2 + 1];
                            arrayLength = arrayLength - 1;
                        }
                        if (stringArray[i].equalsIgnoreCase(wordname)) {
                            System.out.println("The word " + wordname + " is present " + c + " times in the specified file.");
                        }
                    }
                }
            }
            System.out.println("Total number of lines: " + counLine);
            fr.close();
            br.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) throws IOException {
        Scanner scan = new Scanner(System.in);
        OutputStream out = new FileOutputStream("output.txt");
        System.out.println("Please enter the filename: ");
        String filename = scan.nextLine();
        System.out.println("Please enter a word: ");
        String wordname = scan.nextLine();
        int count = 0;
        try (LineNumberReader r = new LineNumberReader(new FileReader(filename))) {
            String line;
            while ((line = r.readLine()) != null) {
                for (String element : line.split(" ")) {
                    if (element.equalsIgnoreCase(wordname)) {
                        count++;
                        System.out.println("Word found at line " + r.getLineNumber());
                    }
                }
            }
        }
        FileReader fileReader = new FileReader(filename);
        BufferedReader bufferedReader = new BufferedReader(fileReader);
        StringBuffer stringBuffer = new StringBuffer();
        String line;
        while ((line = bufferedReader.readLine()) != null) {
            stringBuffer.append(line);
            stringBuffer.append("\n");
        }
        fileReader.close();
        System.out.println("The word " + stringBuffer.toString() + " appears " + count + " times.");
        int i;
        List<String> ls = new ArrayList<String>();
        for (i = 1; i <= 1000; i++) {
            String str = null;
            str = +i + ":- The word " + wordname + " was found " + count + " times";
            ls.add(str);
        }
        String listString = "";
        for (String s : ls) {
            listString += s + "\n";
        }
        FileWriter writer = null;
        try {
            writer = new FileWriter("final.txt");
            writer.write(listString);
            writer.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The code below does something like what you want, I think.
It does the following:
read the contents from the input.txt file
strip combining diacritical marks from the text
make it one string of words by removing the line breaks
split the text up into words using space as the delimiter
The lambda maps all the words to lowercase, then trims whitespace and drops all empty entries, then it...
loops over all words and computes their word counts in the HashMap
then we sort the Map based on the count value in reverse order to get the highest-counted words first
then write them to a StringBuilder to format each entry like "word : count\n" and write it to a text file
final String content = new String(Files.readAllBytes(Paths.get("<PATH TO YOUR PLACE>/input.txt")));
final List<String> words = Arrays.asList(content.replaceAll("[\\p{InCombiningDiacriticalMarks}]", "").replace("\n", " ").split(" "));
final Map<String, Integer> wordlist = new HashMap<>();

words.stream()
     .map(String::toLowerCase)
     .map(String::trim)
     .filter(s -> !s.isEmpty())
     .forEach(s -> {
         wordlist.computeIfPresent(s, (s1, integer) -> ++integer);
         wordlist.putIfAbsent(s, 1);
     });

final StringBuilder sb = new StringBuilder();
wordlist.entrySet()
        .stream()
        .sorted(Map.Entry.comparingByValue(Collections.reverseOrder()))
        .collect(Collectors.toMap(
                Map.Entry::getKey,
                Map.Entry::getValue,
                (e1, e2) -> e1,
                LinkedHashMap::new
        )).forEach((s, integer) -> sb.append(s).append(" : ").append(integer).append("\n"));

Files.write(Paths.get("<PATH TO YOUR PLACE>/output.txt"), sb.toString().getBytes());
Hope it helps :-)
Note: the <PATH TO YOUR PLACE> needs to be replaced by the fully qualified path to your text file with words.
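Since the assignment says not to use maps, here is a rough arrays-only sketch of just the counting step (an illustration, not a full solution; it assumes the file has already been read and split into a String[] words as in the question, and it leaves out the alphabetical sorting and writing to the output file):

// Count each distinct word using two parallel arrays instead of a map.
String[] unique = new String[words.length];
int[] counts = new int[words.length];
int distinct = 0;

for (String w : words) {
    int found = -1;
    for (int i = 0; i < distinct; i++) {          // look for the word among those seen so far
        if (unique[i].equalsIgnoreCase(w)) {
            found = i;
            break;
        }
    }
    if (found >= 0) {
        counts[found]++;                           // seen before: bump its count
    } else {
        unique[distinct] = w;                      // new word: remember it
        counts[distinct] = 1;
        distinct++;
    }
}

for (int i = 0; i < distinct; i++) {
    System.out.println(unique[i] + " " + counts[i]);
}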
I am going to ask again about counting words and how to store them in an array. So far, all I have got is this:
Scanner sc = new Scanner(System.in);
int count;

void readFile() {
    System.out.println("Gi navnet til filen: ");
    String filNavn = sc.next();
    try {
        File k = new File(filNavn);
        Scanner sc2 = new Scanner(k);
        count = 0;
        while (sc2.hasNext()) {
            count++;
            sc2.next();
        }
        Scanner sc3 = new Scanner(k);
        String a[] = new String[count];
        for (int i = 0; i < count; i++) {
            a[i] = sc3.next();
            if (i == count - 1) {
                System.out.print(a[i] + "\n");
            } else {
                System.out.print(a[i] + " ");
            }
        }
        System.out.println("Number of words: " + count);
    } catch (FileNotFoundException e) {
My code works, but my question is: is there a simpler way to do this? And the other question is, how do I count the unique words out of the total words in a given file without using a HashMap or an ArrayList?
Here's a simpler way to go about it:
// Requires org.apache.commons.lang3.ArrayUtils (Apache Commons Lang) on the classpath.
public static void main(String[] args) throws IOException {
    File f = new File(filename);                 // filename as in the question
    BufferedReader br = new BufferedReader(new InputStreamReader(new FileInputStream(f)));
    String line = null;
    String[] res = new String[0];                // start empty and grow as lines are read
    while ((line = br.readLine()) != null) {
        String[] tokens = line.split("\\s+");    // split each line on whitespace
        res = ArrayUtils.addAll(res, tokens);    // append this line's words to the array
    }
    br.close();
    System.out.println("Number of words: " + res.length);
}
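For the second question (counting unique words without a HashMap or an ArrayList), one simple option is to sort the word array and count how many entries differ from their predecessor; a sketch, assuming the words are already in a String[] a as in the question's code (needs java.util.Arrays):

Arrays.sort(a);                              // identical words become neighbours after sorting
int unique = a.length > 0 ? 1 : 0;
for (int i = 1; i < a.length; i++) {
    if (!a[i].equals(a[i - 1])) {            // a new word starts wherever the value changes
        unique++;
    }
}
System.out.println("Unique words: " + unique + " out of " + a.length);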