The following Java code reads multiple files one after another, serially, and works well so far. (The files are JSON; at this step they are only stored in a String/StringBuilder without parsing.)
for (int fileIndex = 0; fileIndex < numberOfFiles; fileIndex++) {
    BufferedReader br = new BufferedReader(new FileReader("Files/file" + fileIndex + ".json"));
    try {
        StringBuilder sb = new StringBuilder();
        String line = br.readLine();
        while (line != null) {
            sb.append(line);
            sb.append(System.lineSeparator());
            line = br.readLine();
        }
        String contentJSON = sb.toString();
    } finally {
        br.close();
    }
}
How can I read those files in parallel using threads?
I could not adapt the above code to multithreading and got errors every time.
I've not tested this code directly (as I don't have a bunch of files to read), but the basic idea would be to do something like...
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.logging.Level;
import java.util.logging.Logger;

public class Test {

    public static void main(String[] args) {
        new Test();
    }

    public Test() {
        try {
            int numberOfFiles = 10;
            ExecutorService service = Executors.newFixedThreadPool(10);
            List<ReadWorker> workers = new ArrayList<>(numberOfFiles);
            for (int fileIndex = 0; fileIndex < numberOfFiles; fileIndex++) {
                workers.add(new ReadWorker(fileIndex));
            }
            List<Future<String>> results = service.invokeAll(workers);
            for (Future<String> result : results) {
                try {
                    String value = result.get();
                } catch (ExecutionException ex) {
                    Logger.getLogger(Test.class.getName()).log(Level.SEVERE, null, ex);
                }
            }
            service.shutdown(); // shut the pool down so the JVM can exit
        } catch (InterruptedException ex) {
            Logger.getLogger(Test.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    public class ReadWorker implements Callable<String> {

        private int fileIndex;

        public ReadWorker(int fileIndex) {
            this.fileIndex = fileIndex;
        }

        @Override
        public String call() throws Exception {
            try (BufferedReader br = new BufferedReader(new FileReader("Files/file" + fileIndex + ".json"))) {
                StringBuilder sb = new StringBuilder();
                String line = br.readLine();
                while (line != null) {
                    sb.append(line);
                    sb.append(System.lineSeparator());
                    line = br.readLine();
                }
                return sb.toString();
            }
        }
    }
}
This will basically execute a series of Callables and wait for them all to complete, at which point you can read the results (or errors).
See the Executors trail for more details.
Tested and verified version...
So, I dumped a series of files into the Files folder at the root of my working directory and modified the above example to list all the files in that directory and read them...
import java.io.BufferedReader;
import java.io.File;
import java.io.FileFilter;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.logging.Level;
import java.util.logging.Logger;

public class Test {

    public static void main(String[] args) {
        new Test();
    }

    public Test() {
        File files[] = new File("Files").listFiles(new FileFilter() {
            @Override
            public boolean accept(File pathname) {
                return pathname.getName().toLowerCase().endsWith(".svg");
            }
        });
        try {
            int numberOfFiles = files.length;
            ExecutorService service = Executors.newFixedThreadPool(20);
            List<ReadWorker> workers = new ArrayList<>(numberOfFiles);
            for (File file : files) {
                workers.add(new ReadWorker(file));
            }
            System.out.println("Execute...");
            List<Future<String>> results = service.invokeAll(workers);
            System.out.println("Results...");
            for (Future<String> result : results) {
                try {
                    String value = result.get();
                    System.out.println(value);
                } catch (ExecutionException ex) {
                    Logger.getLogger(Test.class.getName()).log(Level.SEVERE, null, ex);
                }
            }
            service.shutdownNow();
        } catch (InterruptedException ex) {
            Logger.getLogger(Test.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    public class ReadWorker implements Callable<String> {

        private File file;

        public ReadWorker(File file) {
            this.file = file;
        }

        @Override
        public String call() throws Exception {
            System.out.println("Reading " + file);
            try (BufferedReader br = new BufferedReader(new FileReader(file))) {
                StringBuilder sb = new StringBuilder();
                String line = br.readLine();
                while (line != null) {
                    sb.append(line);
                    sb.append(System.lineSeparator());
                    line = br.readLine();
                }
                return sb.toString();
            }
        }
    }
}
And this works just fine and I have no issue.
java.io.FileNotFoundException: Files\file0.json is a localised issue you are going to have to solve. Does file0.json actually exist? Does it exist in the Files directory? Is the Files directory in the root of the working directory when the program is executed?
None of these issues can be solved by us, as we don't have access to your environment.
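If it helps to verify, here is a small diagnostic sketch (my own, not from the original answer; the class name PathCheck and the count of ten files are assumptions) that prints the working directory the JVM actually sees and whether each expected file exists there:
import java.io.File;

public class PathCheck {
    public static void main(String[] args) {
        // Where is the program actually running from?
        System.out.println("Working directory: " + System.getProperty("user.dir"));
        for (int fileIndex = 0; fileIndex < 10; fileIndex++) {
            File file = new File("Files/file" + fileIndex + ".json");
            System.out.println(file.getAbsolutePath() + " exists = " + file.exists());
        }
    }
}
Running this from the same launch configuration as the real program usually makes it obvious whether the relative path Files/ resolves where you think it does.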
Test #3
I then renamed all the files in my Files directory to file{x}.json using...
File files[] = new File("Files").listFiles(new FileFilter() {
    @Override
    public boolean accept(File pathname) {
        return pathname.getName().toLowerCase().endsWith(".svg");
    }
});
for (int index = 0; index < files.length; index++) {
    File source = files[index];
    File target = new File(source.getParent(), "file" + index + ".json");
    source.renameTo(target);
}
And then modified the example slightly to include a File#exists report...
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.logging.Level;
import java.util.logging.Logger;

public class Test {

    public static void main(String[] args) {
        new Test();
    }

    public Test() {
        try {
            int numberOfFiles = 10;
            ExecutorService service = Executors.newFixedThreadPool(20);
            List<ReadWorker> workers = new ArrayList<>(numberOfFiles);
            for (int index = 0; index < numberOfFiles; index++) {
                workers.add(new ReadWorker(index));
            }
            System.out.println("Execute...");
            List<Future<String>> results = service.invokeAll(workers);
            System.out.println("Results...");
            for (Future<String> result : results) {
                try {
                    String value = result.get();
                    System.out.println(value);
                } catch (ExecutionException ex) {
                    Logger.getLogger(Test.class.getName()).log(Level.SEVERE, null, ex);
                }
            }
            service.shutdownNow();
        } catch (InterruptedException ex) {
            Logger.getLogger(Test.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    public class ReadWorker implements Callable<String> {

        private int fileIndex;

        public ReadWorker(int fileIndex) {
            this.fileIndex = fileIndex;
        }

        @Override
        public String call() throws Exception {
            System.out.println("Reading " + fileIndex);
            File file = new File("Files/file" + fileIndex + ".json");
            System.out.println("File " + fileIndex + " exists = " + file.exists());
            try (BufferedReader br = new BufferedReader(new FileReader(file))) {
                StringBuilder sb = new StringBuilder();
                String line = br.readLine();
                while (line != null) {
                    sb.append(line);
                    sb.append(System.lineSeparator());
                    line = br.readLine();
                }
                return sb.toString();
            } finally {
                System.out.println("All done here");
            }
        }
    }
}
Which prints
Execute...
Reading 8
Reading 1
Reading 2
Reading 4
Reading 6
Reading 9
Reading 3
Reading 7
Reading 0
Reading 5
File 8 exists = true
File 1 exists = true
File 5 exists = true
File 4 exists = true
File 9 exists = true
File 2 exists = true
File 0 exists = true
File 3 exists = true
File 7 exists = true
File 6 exists = true
All done here
All done here
All done here
All done here
All done here
All done here
All done here
All done here
All done here
All done here
Results...
// I won't bore you with the results, as it's a lot of pointless text
which all worked without issues
Related
Here is a code snippet from my main Java function:
try (MultiFileReader multiReader = new MultiFileReader(inputs)) {
    PriorityQueue<WordEntry> words = new PriorityQueue<>();
    for (BufferedReader reader : multiReader.getReaders()) {
        String word = reader.readLine();
        if (word != null) {
            words.add(new WordEntry(word, reader));
        }
    }
}
Here is how I get my BufferedReader readers from another Java file:
public List<BufferedReader> getReaders() {
    return Collections.unmodifiableList(readers);
}
But for some reason, when I compile my code, here is what I get:
The error happens exactly at the line where I wrote String word = reader.readLine();, and what's weird is that reader.readLine() is not null; in fact, multiReader.getReaders() returns a list of 100 objects (readers for files read from a directory). I would like some help solving that issue.
I've shown where the issue is; now let me give a broader view of my code. To run it, compile everything under the src/ directory with javac *.java, then run java MergeShards shards/ sorted.txt, provided that shards/ is present under src/ and contains .txt files (in my scenario).
This is MergeShards.java where I have my main function:
import java.io.BufferedReader;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Objects;
import java.util.PriorityQueue;
import java.util.stream.Collectors;

public final class MergeShards {

    public static void main(String[] args) throws Exception {
        if (args.length != 2) {
            System.out.println("Usage: MergeShards [input folder] [output file]");
            return;
        }
        List<Path> inputs = Files.walk(Path.of(args[0]), 1).skip(1).collect(Collectors.toList());
        Path outputPath = Path.of(args[1]);
        try (MultiFileReader multiReader = new MultiFileReader(inputs)) {
            PriorityQueue<WordEntry> words = new PriorityQueue<>();
            for (BufferedReader reader : multiReader.getReaders()) {
                String word = reader.readLine();
                if (word != null) {
                    words.add(new WordEntry(word, reader));
                }
            }
            try (Writer writer = Files.newBufferedWriter(outputPath)) {
                while (!words.isEmpty()) {
                    WordEntry entry = words.poll();
                    writer.write(entry.word);
                    writer.write(System.lineSeparator());
                    String word = entry.reader.readLine();
                    if (word != null) {
                        words.add(new WordEntry(word, entry.reader));
                    }
                }
            }
        }
    }

    private static final class WordEntry implements Comparable<WordEntry> {

        private final String word;
        private final BufferedReader reader;

        private WordEntry(String word, BufferedReader reader) {
            this.word = Objects.requireNonNull(word);
            this.reader = Objects.requireNonNull(reader);
        }

        @Override
        public int compareTo(WordEntry other) {
            return word.compareTo(other.word);
        }
    }
}
This is my MultiFileReader.java file:
import java.io.BufferedReader;
import java.io.Closeable;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public final class MultiFileReader implements Closeable {

    private final List<BufferedReader> readers;

    public MultiFileReader(List<Path> paths) {
        readers = new ArrayList<>(paths.size());
        try {
            for (Path path : paths) {
                readers.add(Files.newBufferedReader(path));
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            close();
        }
    }

    public List<BufferedReader> getReaders() {
        return Collections.unmodifiableList(readers);
    }

    @Override
    public void close() {
        for (BufferedReader reader : readers) {
            try {
                reader.close();
            } catch (Exception ignored) {
            }
        }
    }
}
The finally block in your constructor closes all of your readers. Remove that.
public MultiFileReader(List<Path> paths) {
    readers = new ArrayList<>(paths.size());
    try {
        for (Path path : paths) {
            readers.add(Files.newBufferedReader(path));
        }
    } catch (IOException e) {
        e.printStackTrace();
    } /* Not this. finally {
        close();
    } */
}
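If you would rather not swallow the IOException either, a stricter variant (my sketch, not the original answer's code) closes whatever was opened so far and rethrows, so a half-constructed MultiFileReader never leaks open readers. Note it changes the constructor signature, so callers must handle or propagate IOException (main already declares throws Exception):
public MultiFileReader(List<Path> paths) throws IOException {
    readers = new ArrayList<>(paths.size());
    try {
        for (Path path : paths) {
            readers.add(Files.newBufferedReader(path));
        }
    } catch (IOException e) {
        close();   // clean up the readers opened before the failure
        throw e;   // let the caller's try-with-resources see the error
    }
}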
Every time the user logs in, I want to read only up to the SECOND-TO-LAST LINE of the file. What changes do I need to make to the code below to do that?
public static boolean User(String usid) {
    boolean a = false;
    try {
        String acc = usid;
        File file = new File("C:\\Temp\\logs\\bank.log");
        Scanner myReader = new Scanner(file);
        while (myReader.hasNextLine()) {
            String data = myReader.nextLine();
            String[] substrings = data.split("[:]");
            if (substrings[5].contains(acc) && substrings[4].contains("Login Successful for user")) {
                a = true;
            } else {
                a = false;
            }
        }
        myReader.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return a;
}
Could anyone please guide me on what changes I need to make to the above code so it reads only up to the second-to-last line of the file? [NOTE: the contents of this file keep growing whenever a user logs in or out.]
Try this; you can modify it as per your requirements.
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class Test {

    public static boolean User(String usid) {
        boolean a = false;
        String fileName = "c://lines.txt";
        List<String> list = new ArrayList<>();
        String acc = usid;
        try (BufferedReader br = Files.newBufferedReader(Paths.get(fileName))) {
            // br returns as stream and convert it into a List
            list = br.lines().collect(Collectors.toList());
            for (int i = 0; i < list.size() - 1; i++) {
                String data = list.get(i);
                String[] substrings = data.split("[:]");
                if (substrings[5].contains(acc) && substrings[4].contains("Login Successful for user")) {
                    a = true;
                } else {
                    a = false;
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return a;
    }
}
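One design note: collecting every line into a List keeps the whole log in memory. If the file grows large, a one-line lookahead gives the same "stop before the last line" behaviour while streaming. The sketch below is only my illustration of that idea (it reuses the imports, path, and field names from the answer above):
public static boolean User(String usid) {
    boolean a = false;
    String fileName = "c://lines.txt";
    String acc = usid;
    try (BufferedReader br = Files.newBufferedReader(Paths.get(fileName))) {
        String current = br.readLine();
        String next;
        // Stop when 'current' is the last line: it is never processed.
        while (current != null && (next = br.readLine()) != null) {
            String[] substrings = current.split("[:]");
            if (substrings[5].contains(acc) && substrings[4].contains("Login Successful for user")) {
                a = true;
            } else {
                a = false;
            }
            current = next;
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return a;
}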
This is already answered at Link
Here is my code
package sequentialFilePractice;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.logging.Level;
import java.util.logging.Logger;

public class ReadFile {

    static String line = "";

    ReadFile() throws FileNotFoundException {
        readTheFile();
        CSVtoArrayList();
    }

    public String readTheFile() throws FileNotFoundException {
        String csvFile = "H:\\S6\\AH Computing\\Java Practice\\test.csv";
        BufferedReader br = null;
        String cvsSplitBy = ",";
        try {
            br = new BufferedReader(new FileReader(csvFile));
            while ((line = br.readLine()) != null) {
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (br != null) {
                try {
                    br.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        return line;
    }

    public static ArrayList<String> CSVtoArrayList() {
        ArrayList<String> splitCSV = new ArrayList<>();
        if (line != null) {
            String[] splitData = line.split("\\s*,\\s*");
            for (int i = 0; i < splitData.length; i++) {
                if (!(splitData[i] == null) || !(splitData[i].length() == 0)) {
                    splitCSV.add(splitData[i].trim());
                }
            }
        }
        for (int j = 0; j < splitCSV.size(); j++) {
            System.out.println(splitCSV.get(j));
        }
        return splitCSV;
    }

    public static void main(String[] args) throws IOException {
        ReadFile f = new ReadFile();
    }
}
The code compiles and the file exists. I can print line and it prints the contents of the file; however, when I print the ArrayList, nothing is output, so the data has not been copied. This is my first use of sequential files in Java.
Do you HAVE to read the file manually? If not, you should check out http://opencsv.sourceforge.net/; it allows you to read a CSV directly into a List<String[]> instead of having to deal with the admin of looping, splitting each line, and building a list.
In essence, reducing your code to:
CSVReader reader = new CSVReader(new FileReader("yourfile.csv"));
List myEntries = reader.readAll();
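Hedging on the exact opencsv version (the com.opencsv package applies to opencsv 3.x and later, older releases used au.com.bytecode.opencsv, and newer readAll() signatures may also declare a CsvException, so the exceptions are kept broad here), a slightly fuller sketch with try-with-resources and a typed list might look like this:
import com.opencsv.CSVReader;
import java.io.FileReader;
import java.util.List;

public class CsvExample {
    public static void main(String[] args) throws Exception {
        // try-with-resources closes the CSVReader and the underlying FileReader
        try (CSVReader reader = new CSVReader(new FileReader("yourfile.csv"))) {
            List<String[]> myEntries = reader.readAll();
            for (String[] row : myEntries) {
                System.out.println(String.join(" | ", row));
            }
        }
    }
}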
I need to create a text file that combines any number of text files from the same folder. They need to be accessed via the arguments of my main method, so that it looks for the filenames I write. The last file name should be the destination file.
So far my code creates a new file named after the last string I enter, but it is an empty file. I suspect that my BufferedReader usage is not doing what it should, but I'm at a loss. Here is my code: first a driver class and then the actual program. Thanks so much for any help you're able to provide!
public class Driver {

    public static void main(String[] args) {
        CatFiles cat = new CatFiles(args);
        cat.bookCombiner();
    }
}
This is where it goes wrong.
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.util.ArrayList;

public class CatFiles {

    private String[] files;

    public CatFiles(String[] files) {
        this.files = files;
    }

    public String getDest() {
        String destination = null;
        for (int i = 0; i < files.length; i++) {
            destination = files[i];
        }
        return destination;
    }

    public void bookCombiner() {
        BufferedReader reader = null;
        try {
            FileWriter writer = new FileWriter(getDest());
            for (int i = 0; i < files.length - 1; i++) {
                File file = new File(files[i]);
                String line = null;
                reader = new BufferedReader(new FileReader(file));
                if ((line = reader.readLine()) != null) {
                    writer.write(files.length - 1);
                }
            }
            writer.close();
        } catch (Exception e) {
            System.out.println(e);
        } finally {
            try {
                reader.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
You never use writer to write line. Change:
if ((line = reader.readLine()) != null) {
    writer.write(files.length - 1);
}
to
while ((line = reader.readLine()) != null) {
    writer.write(line);
}
A couple of issues:
You are using if ((line = reader.readLine()) != null) instead of while, so at most one write happens per input file.
You are using writer.write(files.length - 1); to write to the destination file; Writer.write(int) writes a single character with that code, not the text. It should be writer.write(line);
You probably also have an issue in your getDest() method: right now it just loops to return the last element of the files[] array, which is equivalent to:
return files[files.length - 1];
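Putting those fixes together, a corrected bookCombiner might look like the sketch below (my rewrite, not the original poster's code; try-with-resources closes each reader and the writer even when something fails):
public void bookCombiner() {
    try (FileWriter writer = new FileWriter(getDest())) {
        // Every argument except the last is an input file; the last is the destination.
        for (int i = 0; i < files.length - 1; i++) {
            try (BufferedReader reader = new BufferedReader(new FileReader(files[i]))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    writer.write(line);
                    writer.write(System.lineSeparator());
                }
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}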
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.Charset;
import java.util.ArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class Test6 implements Runnable {

    private File file;
    private int totalNumberOfFiles = 0;
    private static int nextFile = -1;
    private static ArrayList<String> allFilesArrayList = new ArrayList<String>();
    private static ExecutorService executorService = null;

    public Test6(File file) {
        this.file = file;
    }

    private String readFileToString(String fileAddress) {
        FileInputStream stream = null;
        MappedByteBuffer bb = null;
        String stringFromFile = "";
        try {
            stream = new FileInputStream(new File(fileAddress));
            FileChannel fc = stream.getChannel();
            bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, fc.size());
            /* Instead of using default, pass in a decoder. */
            stringFromFile = Charset.defaultCharset().decode(bb).toString();
        } catch (IOException e) {
            System.out.println("readFileToString IOException");
            e.printStackTrace();
        } finally {
            try {
                stream.close();
            } catch (IOException e) {
                System.out.println("readFileToString IOException");
                e.printStackTrace();
            }
        }
        return stringFromFile;
    }

    private void toFile(String message, String fileName) {
        try {
            FileWriter fstream = new FileWriter("C:/Users/Nomi/Desktop/Workspace2/Test6/TestWritten/" + fileName);
            System.out.println("printing to file: ".concat(fileName));
            BufferedWriter out = new BufferedWriter(fstream);
            out.write(message);
            out.close();
        } catch (Exception e) {
            System.out.println("toFile() Exception");
            System.err.println("Error: " + e.getMessage());
        }
    }

    // private void listFilesForFolder(final File fileOrFolder) {
    //     String temp = "";
    //     if (fileOrFolder.isDirectory()) {
    //         for (final File fileEntry : fileOrFolder.listFiles()) {
    //             if (fileEntry.isFile()) {
    //                 temp = fileEntry.getName();
    //                 toFile(readFileToString(temp), "Copy".concat(temp));
    //             }
    //         }
    //     }
    //     if (fileOrFolder.isFile()) {
    //         temp = fileOrFolder.getName();
    //         toFile(readFileToString(temp), "Copy".concat(temp));
    //     }
    // }

    public void getAllFilesInArrayList(final File fileOrFolder) {
        String temp = "";
        System.out.println("getAllFilesInArrayList fileOrFolder.getAbsolutePath()" + fileOrFolder.getAbsolutePath());
        if (fileOrFolder.isDirectory()) {
            for (final File fileEntry : fileOrFolder.listFiles()) {
                if (fileEntry.isFile()) {
                    temp = fileEntry.getAbsolutePath();
                    allFilesArrayList.add(temp);
                }
            }
        }
        if (fileOrFolder.isFile()) {
            temp = fileOrFolder.getAbsolutePath();
            allFilesArrayList.add(temp);
        }
        totalNumberOfFiles = allFilesArrayList.size();
        for (int i = 0; i < allFilesArrayList.size(); i++) {
            System.out.println("getAllFilesInArrayList path: " + allFilesArrayList.get(i));
        }
    }

    public synchronized String getNextFile() {
        nextFile++;
        if (nextFile < allFilesArrayList.size()) {
            // File tempFile = new File(allFilesArrayList.get(nextFile));
            return allFilesArrayList.get(nextFile);
        } else {
            return null;
        }
    }

    @Override
    public void run() {
        getAllFilesInArrayList(file);
        executorService = Executors.newFixedThreadPool(allFilesArrayList.size());
        while (nextFile < totalNumberOfFiles) {
            String tempGetFile = getNextFile();
            File tempFile = new File(allFilesArrayList.get(nextFile));
            toFile(readFileToString(tempFile.getAbsolutePath()), "Copy".concat(tempFile.getName()));
        }
    }

    public static void main(String[] args) {
        Test6 test6 = new Test6(new File("C:/Users/Nomi/Desktop/Workspace2/Test6/Test Files/"));
        Thread thread = new Thread(test6);
        thread.start();
        // executorService.execute(test6);
        // test6.listFilesForFolder(new File("C:/Users/Nomi/Desktop/Workspace2/Test6/"));
    }
}
The program's doing what's expected: it goes into the folder, grabs a file, reads it into a string, and then writes the contents to a new file.
I would like to do this multithreaded: if the folder has N files, I want N threads. I would also like to use the Executor framework if possible. I'm thinking there could be a method along these lines:
public synchronized void getAllFilesInArrayList() {
    return nextFile;
}
So each new thread could pick the next file.
Thank you for your help.
Error:
Exception in thread "Thread-0" java.lang.IllegalArgumentException
at java.util.concurrent.ThreadPoolExecutor.<init>(ThreadPoolExecutor.java:589)
at java.util.concurrent.ThreadPoolExecutor.<init>(ThreadPoolExecutor.java:480)
at java.util.concurrent.Executors.newFixedThreadPool(Executors.java:59)
at Test6.run(Test6.java:112)
at java.lang.Thread.run(Thread.java:662)
Firstly, your approach to the problem will result in more synchronization and race condition worries than seems necessary. A simple strategy to keep your threads from racing would be this:
1) Have a dispatcher thread read all the file names in your directory.
2) For each file, have the dispatcher thread spawn a worker thread and hand off the file reference
3) Have the worker thread process the file
4) Make sure you have some sane naming convention for your output file names so that you don't get threads overwriting each other.
As for using an executor, a ThreadPoolExecutor would probably work well. Go take a look at the javadoc: http://docs.oracle.com/javase/1.5.0/docs/api/java/util/concurrent/ThreadPoolExecutor.html
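As a rough sketch of that strategy (my own illustration, not tested against the poster's environment; it reuses the folder paths from the question and Files.readString/writeString from Java 11), the dispatcher can simply be the main thread that lists the files and submits one copy task per file to an ExecutorService:
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class CopyDispatcher {

    public static void main(String[] args) throws Exception {
        File sourceDir = new File("C:/Users/Nomi/Desktop/Workspace2/Test6/Test Files/");
        File targetDir = new File("C:/Users/Nomi/Desktop/Workspace2/Test6/TestWritten/");
        File[] files = sourceDir.listFiles(File::isFile);
        if (files == null || files.length == 0) {
            return; // nothing to do, and avoids creating a zero-sized pool
        }
        // Dispatcher: one worker task per file, run on a bounded pool.
        ExecutorService service = Executors.newFixedThreadPool(files.length);
        for (File file : files) {
            service.submit(() -> {
                try {
                    // Worker: read the file and write a copy under a distinct name.
                    String content = Files.readString(file.toPath());
                    Path target = new File(targetDir, "Copy" + file.getName()).toPath();
                    Files.writeString(target, content);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            });
        }
        service.shutdown();
        service.awaitTermination(1, TimeUnit.MINUTES);
    }
}
Note the early return when the directory is empty: Executors.newFixedThreadPool throws IllegalArgumentException when asked for zero or fewer threads, which is the exception shown in the stack trace above.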