I am trying to read three files using threads and then pass the content on to a writer class that writes it to another file. The thread associated with the first file (which has line breaks in it) returns after every line break. Can anyone please tell me why this is happening? I am pasting the code of my reader class below.
package filehandling;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;

public class FileReading extends Thread {
    BufferedReader fis;
    int count = 0;
    FileWriting fw;
    String str1, str2;
    String ssr;

    public FileReading(String str) throws IOException {
        //getting filename
        File f = new File(str);
        String strin;
        strin = f.getName();
        System.out.println(".." + strin);
        //splitting filename to get the initial name
        String stra[] = new String[2];
        stra = strin.split("\\.");
        str1 = stra[0];
        str2 = stra[1];
        System.out.println("extension name :" + str2);
        System.out.println("filename :" + str1);
        //associating file to input stream
        fis = new BufferedReader(new FileReader(f));
    }

    public void run() {
        try {
            while ((ssr = fis.readLine()) != null) {
                //file contents
                System.out.println(ssr);
                //writer thread
                fw = new FileWriting(str1, ssr);
                fw.start();
                //assigning thread time to read, else next thread comes in
                join(1000);
            }
        } catch (Exception e) {
            System.out.println("exception : " + e.getMessage() + e.getStackTrace());
        }
    }
}
There is no sense in starting multiple threads within the process of reading a single file. Reading from a file data stream offers no opportunity for parallel execution. But you can read three independent files in parallel by using three threads, each running an ordinary read loop.
There is another misconception: you seem to think that you have to assign time to the threads. That's wrong; you don't need to think about that, and you can't do it the way you tried. When you start three threads, each of them reading and writing a file, all of them will go to sleep when no data are available and proceed when new data arrive. The operating system will assign CPU time to them appropriately.
Since you haven't shown the writing part, I can't give you a code example for your exact task, but here is a simple example of reading three files in parallel and getting their contents back as Strings:
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class ReadFile implements Callable<String> {
    public static void main(String[] args) throws InterruptedException {
        // enter the three file names here
        String[] file = {
            "",
            "",
            ""};
        ExecutorService executorService = Executors.newFixedThreadPool(file.length);
        List<Callable<String>> jobs = new ArrayList<>(file.length);
        for (String f : file) jobs.add(new ReadFile(f));
        List<Future<String>> all = executorService.invokeAll(jobs);
        executorService.shutdown(); // let the pool threads end so the JVM can exit
        System.out.println("all files have been read in");
        int i = 0;
        for (Future<String> f : all) {
            i++;
            System.out.println("file " + i + ": ");
            try {
                String contents = f.get();
                System.out.println(contents);
            } catch (ExecutionException ex) {
                ex.getCause().printStackTrace();
            }
        }
    }

    final String fileName;

    public ReadFile(String file) {
        fileName = file;
    }

    public String call() throws Exception {
        String newLine = System.getProperty("line.separator");
        StringBuilder sb = new StringBuilder();
        try (BufferedReader r = new BufferedReader(new FileReader(fileName))) {
            for (;;) {
                String line = r.readLine();
                if (line == null) break;
                sb.append(line).append(newLine);
            }
        }
        return sb.toString();
    }
}
The line executorService.invokeAll will invoke the call methods of all provided ReadFile instances, each of them in another thread. These threads read their files in a loop, becoming blocked whenever the I/O system has no new data for them, giving the other threads a chance to proceed. However, don't expect three threads to run significantly faster than one thread processing the files one after another. The limiting factor is the I/O speed of your hard drive/SSD/etc. Multiple threads have the best chance of giving you more speed when the files lie on different devices.
Things will look different if the threads are not just reading or writing but also performing some computations.
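If the writing side turns out to be a plain line-by-line copy, a rough sketch of one such job that could be submitted to the same ExecutorService might look like the following. This is only an assumption-based sketch: the output name inputName + ".out" is made up, and your real FileWriting logic may differ.

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.util.concurrent.Callable;

public class CopyFile implements Callable<Void> {
    final String inputName;

    public CopyFile(String inputName) {
        this.inputName = inputName;
    }

    public Void call() throws IOException {
        // one job per file: read a line, write a line; no join() or sleep() needed
        try (BufferedReader in = new BufferedReader(new FileReader(inputName));
             BufferedWriter out = new BufferedWriter(new FileWriter(inputName + ".out"))) {
            String line;
            while ((line = in.readLine()) != null) {
                out.write(line);
                out.newLine();
            }
        }
        return null;
    }
}

Each job owns its own pair of files, so the three jobs never have to coordinate with each other.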
Related
I have a simple project where I created a Store with customers, products and employees. Each is represented by a class, of course, and I also have a CSV file for each of them so I can load data from it and save data to it.
I'm facing an issue where the file reading/writing works, but not really. For example, I can save each file individually, so if I create a new customer, I save it to the list and then to the file. The issue is that once I do the same for another class (i.e. if I create a new employee) and save again, the customer object I saw in the CSV earlier is deleted. BUT, once I add a new object again, that same object reappears. Hope you can somehow understand; here is a more detailed view (screenshots showed these steps):
customer.csv is empty
Creating a new customer
Created and saved to CSV
Now, if I go to the other menu and click on "Save all data", that jon snow customer object will be gone. Then if I create a new customer, it will be added to the CSV file, along with the jon snow I added earlier. So why is it gone in the first place?
So here is the whole file reader/writer code I'm using:
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileWriter;
import java.util.ArrayList;
import java.io.IOException;
import java.util.List;
import java.util.Scanner;

class CSV {

    static void CreateFile(String filename) { //Create new file
        try {
            File fileToCreate = new File(filename);
            if (fileToCreate.createNewFile()) {
                System.out.println("File created sucessfully: " + fileToCreate.getName());
            }
        } catch (IOException e) {
            System.out.println("Cannot create file!");
        }
    }

    static void ReadFile(String path_and_filename) {
        try {
            File fileToRead = new File(path_and_filename);
            Scanner myReader = new Scanner(fileToRead);
            System.out.println("Reading file " + path_and_filename + " :");
            while (myReader.hasNextLine()) {
                String data = myReader.nextLine();
                System.out.println(data);
            }
            myReader.close();
            System.out.println();
        } catch (FileNotFoundException e) {
            System.out.println("There is no such file " + "\"path_and_filename\"" + ".\n");
        }
    }

    // The StringBuilder in Java represents a mutable sequence of characters.
    // Java's built in String class is not mutable.
    static void saveArrayListToFile(List<Output> listToSave, String fileName, String sep) throws Exception {
        StringBuilder ans = new StringBuilder();
        for (Output record : listToSave) {
            ans.append(record.createOutput());
            ans.append(sep);
        }
        saveStringToFile(ans.toString(), fileName);
        System.out.println("\nData saved to " + fileName);
    }

    static void saveArrayListToFile1(ArrayList<String> listToSave, String fileName, String sep) {
        StringBuilder ans = new StringBuilder();
        for (Object record : listToSave) {
            ans.append(record.toString());
            ans.append(sep);
        }
        saveStringToFile(ans.toString(), fileName);
        System.out.println("\nList was saved to file " + fileName + "\n");
    }

    static void saveStringToFile(String data, String fileName) {
        BufferedWriter bufferedWriter = null;
        try {
            bufferedWriter = new BufferedWriter(
                    new FileWriter(fileName, false));
            bufferedWriter.write(data);
        } catch (IOException e) {
            System.out.println("Cannot write to file");
        } finally {
            try {
                bufferedWriter.close();
            } catch (IOException e) {
                System.out.println("Cannot write to file");
            }
        }
    }
}
When I'm creating a new customer, I call it from a menu and it looks like this:
switch (selection) {
    case 1:
        try {
            System.out.println("You're registering as a new customer");
            String custID = ObjectIDs.generateID();
            System.out.println("Enter first name:");
            String firstName = sc.next();
            System.out.println("Enter last name:");
            String lastName = sc.next();
            st.newCustomer(custID, firstName, lastName);
            st.saveCustomersList();
        } catch (Exception e) {
            e.printStackTrace();
        }
        break;
The saveCustomersList() function is this:
@SuppressWarnings("unchecked")
void saveCustomersList() throws Exception {
    CSV.saveArrayListToFile((List<Output>)(List<?>) customers, CUSTOMERS_FILE_PATH, "\n");
}
And then that function calls saveArrayListToFile() to save it.
The behavior is the same with the Product and Employee classes, so I randomly chose to show how it acts when creating a new Product.
I hope I added enough information. If needed, I can paste more code in but I already feel it's very cluttered. Hopefully it's ok.
Thank you very much :)
At the moment it's hard to say, as one can only hypothesise about what happens when you click on "Save all data". There are some weird things (what are saveArrayListToFile and saveArrayListToFile1? Why does one declare an exception? When are these called?).
Having said that, look at the actual file-writing method, saveStringToFile; it says:
bufferedWriter = new BufferedWriter(new FileWriter(fileName,false));
This false there means 'do not append to the file, rewrite it from scratch'. So each time you call it, the file contents are discarded and replaced with what you pass to the method. So my somewhat educated guess would be:
1. You save customer one to the file (the file gets cleared, customer 1 is written) and append the customer to a list of customers (that's my guess).
2. You save customer two to the file (the file gets cleared, so only customer 2 is saved), and you add it to the list of customers (do you?).
3. Then you choose 'save all', which takes the list of customers and saves them in one go, in a single call to the method. The file is cleared, and all customers are written.
But it's all guessing. Try creating a minimal, reproducible example.
In addition to pafau k.'s answer, I would like to add some things that I, at least, would do differently...
First of all:
Things that can cause errors or unexpected behaviour:
Everything below is in saveStringToFile
As already pointed out, there is the initialization of the BufferedWriter: it should be initialized like this:
bufferedWriter = new BufferedWriter(new FileWriter(filename, true));
This puts the FileWriter into append mode. (Note that you cannot simply drop the boolean second argument: new FileWriter(filename) overwrites the file by default, so the explicit true is needed for appending.)
If for some reason the creation of the BufferedWriter fails, bufferedWriter will still be null. That means you will be surprised by a NullPointerException in your finally block. To prevent this, first of all add a check in your finally block:
if (bufferedWriter != null) {
    // Close your bufferedWriter in here
}
Also, if you run into an error you will likely be presented with the same error message twice ("Cannot write to file" is printed in both the catch and the finally block). A try-with-resources version that avoids the null check entirely is sketched below.
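For completeness, a minimal sketch of saveStringToFile using try-with-resources (assuming Java 7 or newer; the boolean passed to FileWriter still decides append vs. overwrite):

static void saveStringToFile(String data, String fileName) {
    // try-with-resources closes the writer automatically, even on exceptions,
    // so no finally block and no null check are needed
    try (BufferedWriter bufferedWriter = new BufferedWriter(
            new FileWriter(fileName, true))) { // true = append instead of overwrite
        bufferedWriter.write(data);
    } catch (IOException e) {
        System.out.println("Cannot write to file: " + e.getMessage());
    }
}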
Now cosmetics:
Things that I would write differently for aesthetic reasons:
Java methods (and static "methods") always start with a lowercase letter :)
This means it should be, for example, public static void createFile() or static void readFile()
Variables and method parameters do not contain separators like _; instead, to make a name more readable, you start with a lowercase letter and capitalize the first letter of each following word, e.g. String thisIsAVeryLongVariableWithALotOfSeparations = "Foo";
The generic type in saveArrayListToFile1() works like a placeholder. Since you declare the parameter as ArrayList<String> listToSave, you don't need the Object loop variable (or the toString() call) in the following for-loop; you can simply write:
for (String record : listToSave) {
    ans.append(record);
    ans.append(sep);
}
I hope this fixes all errors or complications. :)
PrintWriter works (it writes to the external file) until I add the line that says Thread.sleep(100);. Then the code still compiles fine, and it continues writing to the console, but it won't print to the external file, and I can't figure out why.
import java.io.*;
import java.io.PrintWriter;
import java.io.File;
import javax.swing.*;

public class RecordMouse {
    public static void main(String[] args) throws InterruptedException {
        String line = "";
        // string for filename
        String filename = System.currentTimeMillis() + "out.txt";
        // create file
        File file = new File(filename);
        // create writer
        PrintWriter printWriter = null;
        try {
            printWriter = new PrintWriter(file);
            while (true) {
                //Thread.sleep(100);
                System.out.println(System.currentTimeMillis() + " hi \n");
                printWriter.println(System.currentTimeMillis() + " hi");
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } finally {
            if (printWriter != null) {
                printWriter.close();
            }
        }
    }
}
The difference sleep makes in your case is that it slows down the frequency of writes, so it takes a while until the writes get flushed to the file. By removing the sleep you cause the flush to happen much earlier, because the PrintWriter's internal buffer fills up almost immediately. (Also note that with the endless while(true) loop, the printWriter.close() in the finally block is never reached, so you only ever see data that was flushed by a full buffer.) Change the sleep time to something smaller (like 5 instead of 100), or wait a little longer, and you will see the file being written to eventually.
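If you want every line to reach the file promptly regardless of timing, a minimal stand-alone sketch is to construct the PrintWriter over an OutputStream with auto-flush enabled (or to call flush() yourself after each println). The file name and loop count here are just illustrative:

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.PrintWriter;

public class FlushDemo {
    public static void main(String[] args) throws FileNotFoundException, InterruptedException {
        File file = new File(System.currentTimeMillis() + "out.txt");
        // second argument true = auto-flush on every println/printf/format call
        try (PrintWriter printWriter = new PrintWriter(new FileOutputStream(file), true)) {
            for (int i = 0; i < 10; i++) {
                Thread.sleep(100);
                printWriter.println(System.currentTimeMillis() + " hi"); // reaches the file immediately
            }
        }
    }
}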
I'm writing a wrapper program in Java that is supposed to pass arguments to other processes by writing to their standard input streams and reading the response from their standard output streams. However, when the String I try to pass in is too large, PrintWriter.print simply blocks. No error, it just freezes. Is there a good workaround for this?
Relevant code
import java.io.OutputStreamWriter;
import java.io.PrintWriter;

public class Wrapper {
    PrintWriter writer;
    Process process;

    public Wrapper(String command) {
        start(command);
    }

    public void call(String args) {
        writer.println(args); // Blocks here
        writer.flush();
        // Other code
    }

    public void start(String command) {
        try {
            ProcessBuilder pb = new ProcessBuilder(command.split(" "));
            pb.redirectErrorStream(true);
            process = pb.start();
            // STDIN of the process.
            writer = new PrintWriter(new OutputStreamWriter(process.getOutputStream(), "UTF-8"));
        } catch (Exception e) {
            e.printStackTrace();
            System.out.println("Process ended catastrophically.");
        }
    }
}
If I try using
writer.print(args);
writer.print("\n");
it can handle a larger string before freezing, but still ultimately locks up.
Is there maybe a buffered-stream way to fix this? Does print block on the process's stream not having enough space, or something?
Update
In response to some answers and comments, I've included more information.
Operating System is Windows 7
BufferedWriter slows the run time, but didn't stop it from blocking eventually.
Strings could get very long, as large as 100,000 characters
The process's input is consumed, but line by line, i.e. with Scanner.nextLine();
Test code
import java.io.IOException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeoutException;
import ProcessRunner.Wrapper;
public class test {
public static void main(String[] args){
System.out.println("Building...");
Wrapper w = new Wrapper("java echo");
System.out.println("Calling...");
String market = "aaaaaa";
for(int i = 0; i < 1000; i++){
try {
System.out.println(w.call(market, 1000));
} catch (InterruptedException | ExecutionException
| TimeoutException e) {
System.out.println("Timed out");
}
market = market + market;
System.out.println("Size = " + market.length());
}
System.out.println("Stopping...");
try {
w.stop();
} catch (IOException e) {
e.printStackTrace();
System.out.println("Stop failed :(");
}
}
}
Test Process:
You have to first compile this file, and make sure the .class is in the same folder as the test .class file
import java.util.Scanner;

public class echo {
    public static void main(String[] args) {
        while (true) {
            Scanner stdIn = new Scanner(System.in);
            System.out.println(stdIn.nextLine());
        }
    }
}
I suspect that what is happening here is that the external process is writing to its standard output. Since your Java code doesn't read it, it eventually fills the external process's standard out (or err) pipe. That blocks the external process, which means it stops reading from its input pipe ... and your Java process freezes once that pipe fills up too.
If this is the problem, then using a buffered writer won't fix it. You either need to read the external process's output or redirect it to a file (e.g. "/dev/null" on Linux).
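A minimal sketch of that idea, loosely based on the Wrapper.start() from the question (assuming Java 8+; whether you print, collect, or discard the child's output is up to you):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;

public class DrainingWrapper {
    PrintWriter writer;
    Process process;

    public void start(String command) throws IOException {
        ProcessBuilder pb = new ProcessBuilder(command.split(" "));
        pb.redirectErrorStream(true);
        // Alternative (Java 7+): pb.redirectOutput(ProcessBuilder.Redirect.INHERIT);
        // dumps the child's output to the parent's console and needs no reader thread.
        process = pb.start();
        writer = new PrintWriter(new OutputStreamWriter(process.getOutputStream(), "UTF-8"));

        // Drain the child's stdout in a background thread so its pipe can never fill up
        Thread drainer = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(process.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(line); // or collect the lines for later use
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        drainer.setDaemon(true);
        drainer.start();
    }
}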
Writing to any pipe or socket by any means in java.io blocks if the peer reads more slowly than you write.
There is nothing you can do about that.
Recently, I wrote a simple client-server program for file transfer over standard TCP sockets. The average throughput was around 2.2 Mbps over a WiFi channel. My question is:
Is it possible to transfer a large file (say 5 GB) over multiple data I/O streams, so that each stream transfers several parts of the same file in parallel (different threads could be used for this purpose)? The file parts could then be re-assembled at the receiving end.
I tried to split a small file and transferred it over a DataOutputStream. The first segment works fine, but I don't know how to read a file input stream selectively (I also tried the mark() and reset() methods for selective reading, but to no avail).
Here is my code (for testing purposes, I have redirected the output to a FileOutputStream):
public static void main(String[] args) {
    // TODO Auto-generated method stub
    final File myFile = new File("/home/evinish/Documents/Android/testPicture.jpg");
    long N = myFile.length();
    try {
        FileInputStream in = new FileInputStream(myFile);
        FileOutputStream f0 = new FileOutputStream("/home/evinish/Documents/Android/File1.jpg");
        FileOutputStream f1 = new FileOutputStream("/home/evinish/Documents/Android/File2.jpg");
        FileOutputStream f2 = new FileOutputStream("/home/evinish/Documents/Android/File3.jpg");
        byte[] buffer = new byte[4096];
        int i = 1, noofbytes;
        long acc = 0;
        while (acc <= (N / 3)) {
            noofbytes = in.read(buffer, 0, 4096);
            f0.write(buffer, 0, noofbytes);
            acc = i * noofbytes;
            i++;
        }
        f0.close();
I got the first segment of my file (this can be copied to a DataOutputStream in one thread). Can anyone suggest how to read the remaining parts of the file (after the first N/3 bytes) in segments of N/3, so that three streams can be used in three threads concurrently?
Here is the code to merge the file segments at the receiving end:
package com.mergefilespackage;

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.Closeable;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class MergeFiles {

    /**
     * @param args
     */
    public static void main(String[] args) throws Exception {
        // TODO Auto-generated method stub
        IOCopier.joinFiles(new File("/home/evinish/Documents/Android/File1.jpg"), new File[] {
                new File("/home/evinish/Documents/Android/File2.jpg"),
                new File("/home/evinish/Documents/Android/File3.jpg") });
    }
}

class IOCopier {
    public static void joinFiles(File destination, File[] sources)
            throws IOException {
        OutputStream output = null;
        try {
            output = createAppendableStream(destination);
            for (File source : sources) {
                appendFile(output, source);
            }
        } finally {
            IOUtils.closeQuietly(output);
        }
    }

    private static BufferedOutputStream createAppendableStream(File destination)
            throws FileNotFoundException {
        return new BufferedOutputStream(new FileOutputStream(destination, true));
    }

    private static void appendFile(OutputStream output, File source)
            throws IOException {
        InputStream input = null;
        try {
            input = new BufferedInputStream(new FileInputStream(source));
            IOUtils.copy(input, output);
        } finally {
            IOUtils.closeQuietly(input);
        }
    }
}

class IOUtils {
    private static final int BUFFER_SIZE = 1024 * 4;

    public static long copy(InputStream input, OutputStream output)
            throws IOException {
        byte[] buffer = new byte[BUFFER_SIZE];
        long count = 0;
        int n = 0;
        while (-1 != (n = input.read(buffer))) {
            output.write(buffer, 0, n);
            count += n;
        }
        return count;
    }

    public static void closeQuietly(Closeable output) {
        try {
            if (output != null) {
                output.close();
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}
Any help would be highly appreciated! Thanks in advance!
You can't get any more speed over the same link with more sockets. Each socket sends a certain number of packets, each of a certain size. As you double the number of sockets, the number of packets per second per socket is halved, and then decreases even more due to collisions, overhead, and contention. Packets start to bump, jumble, and otherwise panic. The OS cannot handle the pandemonium of lost ACKs, and the WiFi card struggles to transmit at such a rate; it is losing its low-level acks as well. As packets get lost, a desperate TCP stack dials down its transmit rate. Even if the rate could recover thanks to a signal improvement, it is now stuck at the lower speed due to silly window syndrome or another form of TCP deadlock.
Any gains WiFi can squeeze out of wider carrier bands, MIMO, or multiple paths have already been realized, even with one socket. You can't push it any farther.
Now, wait. We're quite below WiFi speed, aren't we? Of course, we need to use buffering!
Make sure you create BufferedWriter and BufferedReader objects from your socket's getInputStream or getOutputStream methods. Then write to/read from those buffers. Your speed may increase somewhat.
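For byte-oriented file data, the analogous wrappers are BufferedInputStream and BufferedOutputStream. A minimal sketch of the sending side (host, port, file name, and the 64 KB buffer size are placeholder choices):

import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.net.Socket;

public class BufferedSender {
    public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket("192.168.0.10", 5000);               // placeholder host/port
             FileInputStream file = new FileInputStream("bigfile.bin");      // placeholder file
             BufferedOutputStream out = new BufferedOutputStream(socket.getOutputStream(), 64 * 1024)) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = file.read(buffer)) != -1) {
                out.write(buffer, 0, n); // buffered: actual socket writes happen in larger chunks
            }
            out.flush(); // push out the final partial buffer
        }
    }
}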
You could get the byte array from the FileInputStream and split it every 10 KB (every 10,000 bytes or so).
Then send these parts through the streams in order.
On the server you can put the arrays back together and read the file from this one large byte array.
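To address the read-the-remaining-part sub-question directly, a minimal sketch using RandomAccessFile.seek() follows (the file paths are taken from the question; each of the three copySegment calls could run in its own thread, or write to its own DataOutputStream instead of a file):

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;

public class SegmentReader {
    // Copies 'length' bytes starting at 'offset' from the source file into 'destPath'.
    static void copySegment(String sourcePath, long offset, long length, String destPath) throws IOException {
        try (RandomAccessFile source = new RandomAccessFile(sourcePath, "r");
             FileOutputStream dest = new FileOutputStream(destPath)) {
            source.seek(offset);                    // jump straight to the segment start
            byte[] buffer = new byte[4096];
            long remaining = length;
            while (remaining > 0) {
                int n = source.read(buffer, 0, (int) Math.min(buffer.length, remaining));
                if (n == -1) break;                 // end of file reached early
                dest.write(buffer, 0, n);
                remaining -= n;
            }
        }
    }

    public static void main(String[] args) throws IOException {
        String src = "/home/evinish/Documents/Android/testPicture.jpg";
        long N = new java.io.File(src).length();
        long part = N / 3;
        // three segments; the last one also takes the remainder N - 2*part
        copySegment(src, 0, part, "/home/evinish/Documents/Android/File1.jpg");
        copySegment(src, part, part, "/home/evinish/Documents/Android/File2.jpg");
        copySegment(src, 2 * part, N - 2 * part, "/home/evinish/Documents/Android/File3.jpg");
    }
}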
I have an executable jar that runs a Java Swing application with an internal SQLite db.
Users sometimes click the jar more than once by mistake, which starts multiple instances and locks the db.
I'd like to prevent this behavior.
What can I do?
Thank you very much.
You need some kind of synchronization mechanism.
Either you need to code it yourself, or you can create a Java WebStart configuration for your application, where Java WebStart can handle the "only one invocation" through the Single Instance Service (which you must call explicitly in your code).
See http://docs.oracle.com/javase/6/docs/technotes/guides/javaws/developersguide/examples.html#SingleInstanceService for an example.
The first instance accessing the db should acquire a lock of some sort on the db, and all further instances should first check whether such a lock already exists. If there is one: "I am not the first, show a warning/error, quit." If there is none: "I am the first, get the lock, proceed."
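A minimal sketch of that lock idea using a lock file and java.nio file locking (the lock-file name is just an example; the OS releases the lock automatically when the JVM exits):

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.channels.FileLock;

public class SingleInstanceGuard {
    private static RandomAccessFile lockFile;
    private static FileLock lock;

    // Returns true if this is the first instance; false if another instance holds the lock.
    static boolean tryAcquire() {
        try {
            File file = new File(System.getProperty("user.home"), ".myswingapp.lock"); // example name
            lockFile = new RandomAccessFile(file, "rw");
            lock = lockFile.getChannel().tryLock();
            return lock != null;
        } catch (IOException e) {
            return false; // be conservative: treat failure as "already running"
        }
    }

    public static void main(String[] args) {
        if (!tryAcquire()) {
            javax.swing.JOptionPane.showMessageDialog(null, "The application is already running.");
            return;
        }
        // ... start the Swing application here ...
    }
}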
You can use jps or JNI (the latter needs a separate implementation per platform). Attached is the jps-based code to check for a running instance of the Java application; you can modify it to be more OO.
Using a file, socket, or registry entry as a lock is not perfect, since there is a fair chance that a mishap will leave your application unable to start at all (for example, another program occupying the same port).
import java.io.*;

public class TestRun {
    public TestRun() {}

    public static void main(String args[]) {
        String jpsApp = "jps -mlvV";
        int count = 0;
        try {
            Process p = Runtime.getRuntime().exec(jpsApp);
            //parse the result to check if TestRun is running
            InputStream is = p.getInputStream();
            BufferedReader reader = new BufferedReader(new InputStreamReader(is));
            String line = null;
            while ((line = reader.readLine()) != null) {
                System.out.println();
                System.out.println(line);
                String[] pair = line.split(" ");
                if (pair.length >= 2) {
                    System.out.println("name is " + pair[1]);
                    if (pair[1].trim().indexOf("TestRun") > -1) {
                        count++;
                        System.out.println("count is " + count);
                    }
                }
            }
            //it is running, just exit the second instance
            if (count > 1) {
                System.out.println("Has run a application!");
                return;
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}