JavaFX concurrency freezing UI - java

I am looking into JavaFX now, specifically into concurrency. As homework I decided to write an app, one of whose functions is to read a text file. For reference I used the Core Java bonus chapter, which is not in the book but is available on Horstmann's website (v1ch13fx.uitask.TaskDemo). After finishing I tried it out and hit an issue: the whole UI freezes while the file is being read, for anywhere between a few seconds and a few minutes. Unless what I learned was just nonsense, shouldn't doing the reading on a separate thread from the UI prevent this from happening?
Can anyone who understands this better have a look and tell me whether there is some problem with the code itself (I tried the same task with the app written by Horstmann and the same issue appears), and advise me how I could deal with this issue?
No matter what I do the issue is the same; I'm on my second day of looking for sources online. Is it an issue with the code itself, or with the concept of the task? And in the end this runs on a damn powerful machine, which makes the headache even bigger.
public static void read(Stage stage, TextArea textArea, Label status, MenuItem open, MenuItem clear) {
    if (task != null) return;
    FileChooser chooser = new FileChooser();
    chooser.setInitialDirectory(new File(".."));
    chooser.getExtensionFilters().addAll(new FileChooser.ExtensionFilter("Text files", "*.txt"));
    File file = chooser.showOpenDialog(stage);
    if (file == null) return;
    textArea.clear();
    task = new Task<>() {
        public Integer call() {
            int lines = 0;
            try (BufferedReader br = new BufferedReader(new FileReader(file))) {
                String line;
                while ((line = br.readLine()) != null) {
                    final String current = line;
                    Platform.runLater(() -> textArea.appendText(current + " \n"));
                    lines++;
                    updateMessage(lines + " lines read.");
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            return lines;
        }
    };
    execute.execute(task);
}

Joop Eggen is correct. Your runLater calls are happening so fast that they clog up the JavaFX event queue, so JavaFX has no chance to do its normal painting and input handling.
A simple workaround is to put Thread.sleep(20); in your loop.
A better workaround, which won't slow down your reading of the file, is to make your own buffer and limit how often you update the TextArea:
task = new Task<>() {
    public Integer call() {
        int lines = 0;
        Collection<String> linesBuffer = new ArrayList<>(100_000);
        long timeOfLastAppend = System.nanoTime();
        try (BufferedReader br = new BufferedReader(new FileReader(file))) {
            String line;
            while ((line = br.readLine()) != null) {
                linesBuffer.add(line);
                // Do not update the TextArea more than 10 times per second
                // (that is, every 100 million nanoseconds).
                long time = System.nanoTime();
                if (time - timeOfLastAppend >= 100_000_000L) {
                    timeOfLastAppend = time;
                    String text = String.join(" \n", linesBuffer) + " \n";
                    Platform.runLater(() -> textArea.appendText(text));
                    linesBuffer.clear();
                }
                lines++;
                updateMessage(lines + " lines read.");
            }
            if (!linesBuffer.isEmpty()) {
                String text = String.join(" \n", linesBuffer) + " \n";
                Platform.runLater(() -> textArea.appendText(text));
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return lines;
    }
};
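The buffer-and-flush idea above is not JavaFX-specific, and it is easier to reason about when the timing logic is isolated from the UI. Here is a rough sketch of that logic in plain Java; the `ThrottledBatcher` class and the injectable clock are inventions for illustration (not part of the original code), with the clock passed in so the throttling can be exercised deterministically:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.LongSupplier;

/** Accumulates items and hands out batches at most once per interval. */
class ThrottledBatcher<T> {
    private final List<T> buffer = new ArrayList<>();
    private final long intervalNanos;
    private final LongSupplier clock;          // injectable, so tests can fake time
    private final Consumer<List<T>> flushAction;
    private long lastFlush;

    ThrottledBatcher(long intervalNanos, LongSupplier clock, Consumer<List<T>> flushAction) {
        this.intervalNanos = intervalNanos;
        this.clock = clock;
        this.flushAction = flushAction;
        this.lastFlush = clock.getAsLong();
    }

    /** Buffer one item; flush only if the interval has elapsed since the last flush. */
    void add(T item) {
        buffer.add(item);
        long now = clock.getAsLong();
        if (now - lastFlush >= intervalNanos) {
            lastFlush = now;
            flush();
        }
    }

    /** Hand the buffered items (if any) to the flush action and clear the buffer. */
    void flush() {
        if (!buffer.isEmpty()) {
            flushAction.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}

class BatchDemo {
    public static void main(String[] args) {
        long[] now = {0};
        int[] flushes = {0};
        ThrottledBatcher<Integer> b =
                new ThrottledBatcher<>(100, () -> now[0], batch -> flushes[0]++);
        for (int i = 0; i < 1000; i++) {
            now[0] += 10;            // pretend 10 ns pass per item
            b.add(i);
        }
        b.flush();                   // flush whatever is left over
        System.out.println("flushes=" + flushes[0]);   // prints flushes=100
    }
}
```

In the JavaFX case the flush action would be the single `Platform.runLater(...)` call, so 1000 appended lines cost ~100 event-queue entries instead of 1000.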

Related

Reading large file A, searching for records matching file B, and writing file C in Java

I have two files; assume both are already sorted.
This is just example data; in reality I'll have around 30-40 million records in each file, with each file 7-10 GB, as the row length is big and fixed.
It's a simple text file; once a searched record is found, I'll do some update and write it to a file.
File A may contain 0 or more records matching an ID from File B.
The motive is to complete this processing in the least amount of time possible.
I am able to do it, but it's a time-consuming process...
Suggestions are welcome.
File A
1000000001,A
1000000002,B
1000000002,C
1000000002,D
1000000002,D
1000000003,E
1000000004,E
1000000004,E
1000000004,E
1000000004,E
1000000005,E
1000000006,A
1000000007,A
1000000008,B
1000000009,B
1000000010,C
1000000011,C
1000000012,C
File B
1000000002
1000000004
1000000006
1000000008
1000000010
1000000012
1000000014
1000000016
1000000018
// Not working as of now, because the logic is wrong.
private static void readAndWriteFile() {
    System.out.println("Read Write File Started.");
    long time = System.currentTimeMillis();
    try (
        BufferedReader in = new BufferedReader(new FileReader(Commons.ROOT_PATH + "input.txt"));
        BufferedReader search = new BufferedReader(new FileReader(Commons.ROOT_PATH + "search.txt"));
        FileWriter myWriter = new FileWriter(Commons.ROOT_PATH + "output.txt");
    ) {
        String inLine = in.readLine();
        String searchLine = search.readLine();
        boolean isLoopEnd = true;
        while (isLoopEnd) {
            if (searchLine == null || inLine == null) {
                isLoopEnd = false;
                break;
            }
            if (searchLine.substring(0, 10).equalsIgnoreCase(inLine.substring(0, 10))) {
                System.out.println("Record Found - " + inLine.substring(0, 10) + " | " + searchLine.substring(0, 10));
                myWriter.write(inLine + System.lineSeparator());
                inLine = in.readLine();
            } else {
                inLine = in.readLine();
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    System.out.println("Read and Write to File done in - " + (System.currentTimeMillis() - time));
}
My suggestion would be to use a database, as said in this answer. Using txt files has a big disadvantage compared to DBs, mostly because of the lack of indexes and the other points mentioned in that answer.
So what I would do is create a database (there are lots of good ones out there, such as MySQL, PostgreSQL, etc.), create the tables that are needed, and then read the file, inserting each line into the DB and using the DB to search and update the records.
Maybe this is not an answer to your concrete question on
Motive is to complete this processing in the least amount of time possible.
But it is a worthy suggestion. Good luck.
With this approach I am able to process 50M records in 150 seconds on an i3 with 4 GB RAM and an SSD hard drive.
private static void readAndWriteFile() {
    System.out.println("Read Write File Started.");
    long time = System.currentTimeMillis();
    try (
        BufferedReader in = new BufferedReader(new FileReader(Commons.ROOT_PATH + "input.txt"));
        BufferedReader search = new BufferedReader(new FileReader(Commons.ROOT_PATH + "search.txt"));
        FileWriter myWriter = new FileWriter(Commons.ROOT_PATH + "output.txt");
    ) {
        String inLine = in.readLine();
        String searchLine = search.readLine();
        while (searchLine != null && inLine != null) {
            // Since both files are already sorted, compare the numeric keys
            // and advance whichever side is behind (a classic merge join).
            long searchInt = Long.parseLong(searchLine.substring(0, 10));
            long inInt = Long.parseLong(inLine.substring(0, 10));
            if (searchInt == inInt) {
                System.out.println("Record Found - " + inLine.substring(0, 10) + " | " + searchLine.substring(0, 10));
                myWriter.write(inLine + System.lineSeparator());
            }
            // Which pointer to move..
            if (searchInt < inInt) {
                searchLine = search.readLine();
            } else {
                inLine = in.readLine();
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    System.out.println("Read and Write to File done in - " + (System.currentTimeMillis() - time));
}
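The merge logic in that loop can be checked in isolation against the sample data from the question. Here is a self-contained, in-memory sketch (file I/O omitted; `MergeJoin` is an invented name for illustration) of the same two-pointer walk over sorted inputs:

```java
import java.util.ArrayList;
import java.util.List;

class MergeJoin {
    /**
     * Returns every record from 'records' whose leading 10-digit key appears
     * in 'keys'. Both lists must be sorted ascending by key.
     */
    static List<String> join(List<String> records, List<String> keys) {
        List<String> out = new ArrayList<>();
        int r = 0, k = 0;
        while (r < records.size() && k < keys.size()) {
            long recKey = Long.parseLong(records.get(r).substring(0, 10));
            long searchKey = Long.parseLong(keys.get(k));
            if (recKey == searchKey) {
                out.add(records.get(r));
                r++;                  // a key can match several records; keep the key
            } else if (searchKey < recKey) {
                k++;                  // search key is behind: advance it
            } else {
                r++;                  // record key is behind: advance it
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> records = List.of("1000000001,A", "1000000002,B", "1000000002,C", "1000000003,E");
        List<String> keys = List.of("1000000002", "1000000004");
        System.out.println(join(records, keys));   // [1000000002,B, 1000000002,C]
    }
}
```

Each input is scanned exactly once, which is why this runs in linear time over both files regardless of their size.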

RandomAccessFile.seek() not working on Linux

I am using some sort of tail -f implementation to tail a file for changes (pretty much like this ). For this I am using a RandomAccessFile, periodically checking if the file length has increased and, if so, seeking and reading the new lines (everything happening in a separate thread of the FileTailer).
Now, everything works as expected on Windows, but I tested my program on Linux and it does not work as expected. Here is the run() method of the FileTailer class. Specifically, where it fails on Linux is the part where file.seek(filePointer) gets called and then file.readLine(), which surprisingly returns null (although the filePointer gets incremented correctly if I append content to the tailed file at runtime).
public void run() {
    // The file pointer keeps track of where we are in the file
    long filePointer = 0;
    // Determine start point
    if (startAtBeginning) {
        filePointer = 0;
    } else {
        filePointer = logfile.length();
    }
    try {
        // Start tailing
        tailing = true;
        RandomAccessFile file = new RandomAccessFile(logfile, "r");
        while (tailing) {
            // Compare the length of the file to the file pointer
            long fileLength = logfile.length();
            System.out.println("filePointer = " + filePointer + " | fileLength = " + fileLength);
            if (fileLength < filePointer) {
                // Log file must have been rotated or deleted;
                // reopen the file and reset the file pointer
                file = new RandomAccessFile(logfile, "r");
                filePointer = 0;
            }
            if (fileLength > filePointer) {
                // There is data to read
                file.seek(filePointer);
                String line = file.readLine();
                System.out.println("new line = " + line);
                while (line != null) {
                    if (!line.isEmpty()) {
                        try {
                            fireNewFileLine(line);
                        } catch (ParseException e) {
                            e.printStackTrace();
                        }
                    }
                    line = file.readLine();
                }
                filePointer = file.getFilePointer();
            }
            // Sleep for the specified interval
            sleep(sampleInterval);
        }
        // Close the file that we are tailing
        file.close();
    } catch (InterruptedException | IOException e) {
        e.printStackTrace();
    }
}
Like I said, everything works as it should on Windows, but on Linux the String variable line is null after it should have been filled with the newly appended line, so fireNewFileLine gets called on null and everything goes to crap.
Does anyone have an idea why this happens on Linux systems?
You don't need all this, or RandomAccessFile. You are always at the end of the file. All you need is this:
public void run() {
    try {
        // Start tailing
        tailing = true;
        BufferedReader reader = new BufferedReader(new FileReader(logfile));
        String line;
        while (tailing) {
            while ((line = reader.readLine()) != null) {
                System.out.println("new line = " + line);
                if (!line.isEmpty()) {
                    try {
                        fireNewFileLine(line);
                    } catch (ParseException e) {
                        e.printStackTrace();
                    }
                }
            }
            // Sleep for the specified interval
            sleep(sampleInterval);
        }
        // Close the file that we are tailing
        reader.close();
    } catch (InterruptedException | IOException e) {
        e.printStackTrace();
    }
}
with maybe some provision for reopening the file.
E&OE
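The property every tailer relies on can be checked in isolation: RandomAccessFile, as used in the question, keeps delivering data appended after a read has already hit end-of-file. A throwaway sketch using a temp file (names are illustrative, not part of the original FileTailer):

```java
import java.io.RandomAccessFile;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

class TailDemo {
    public static void main(String[] args) throws Exception {
        Path log = Files.createTempFile("tail", ".log");
        Files.writeString(log, "first\n");
        try (RandomAccessFile file = new RandomAccessFile(log.toFile(), "r")) {
            System.out.println(file.readLine());   // "first"
            System.out.println(file.readLine());   // null: we are at end of file
            // Another process (here: ourselves) appends while the reader stays open.
            Files.writeString(log, "second\n", StandardOpenOption.APPEND);
            System.out.println(file.readLine());   // "second": the next read sees the new data
        } finally {
            Files.delete(log);
        }
    }
}
```

Hitting end-of-file is not sticky for RandomAccessFile, so a sleep-and-retry loop over it (or a reopened reader) is enough for tailing.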

What logic should be used to create a level lock screen like in Angry Birds?

I am developing a game like Angry Birds, in which I am using a lock screen where the first round is open to play by default but the other 9 rounds are locked.
Now I want to know how to create an activity that unlocks these rounds as rounds are completed.
I thought of writing the score to a file and then making the 2nd round read that file: if the text file contains a score of 100, the next round should be open. But I am not going to use this technique, because when I run the activity for the first time it gives me a "file not found" error, since the file won't be created before playing...
Is there any solution for this?
public final static String STORETEXT = "round2.txt";

if (mScore == 100) {
    int a = 1;
    try {
        OutputStreamWriter out = new OutputStreamWriter(
                openFileOutput(STORETEXT, MODE_WORLD_WRITEABLE));
        out.write(Integer.toString(a));
        out.close();
    } catch (Throwable t) {
    }
}
On the other side, in the lock screen:
ImageButton i1, i2;
try {
    fis = openFileInput("round2.txt");
    BufferedReader d = new BufferedReader(new InputStreamReader(fis));
    strLine = null;
    if ((strLine = d.readLine()) != null) {
        d.close();
        fis.close();
    }
} catch (Throwable t) {
    // Toast.makeText(this, "Exception: " + t.toString(),
    // Toast.LENGTH_LONG).show();
}
int B = Integer.parseInt(strLine);
if (B == 1) {
    i2.setImageDrawable(getResources().getDrawable(R.drawable.lockopen));
    i2.setClickable(true);
} else {
    i2.setClickable(false);
    i2.setImageDrawable(getResources().getDrawable(R.drawable.lockclose));
}
In my view, you don't have to use a text file for any purpose in game development; use either a database or shared preferences.
This is discussed in detail on the following sites:
http://www.matim-dev.com/data-storage.html
http://developer.android.com/guide/topics/data/data-storage.html
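That said, if the file approach is kept anyway, the first-run crash is just a missing default: treat an absent file as "round still locked" instead of as an error. Here is a plain-Java sketch of that idea (the `RoundLock` helper is hypothetical, and on Android SharedPreferences as suggested above is still the better fit):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

class RoundLock {
    /**
     * Returns true only if the round's unlock file exists and contains "1".
     * A missing file simply means the round was never unlocked - no error.
     */
    static boolean isUnlocked(Path saveDir, int round) {
        Path f = saveDir.resolve("round" + round + ".txt");
        if (!Files.exists(f)) {
            return false;               // never played: locked by default
        }
        try {
            return "1".equals(Files.readString(f).trim());
        } catch (IOException e) {
            return false;               // unreadable file: treat as locked
        }
    }

    /** Marks a round as unlocked by writing its flag file. */
    static void unlock(Path saveDir, int round) throws IOException {
        Files.writeString(saveDir.resolve("round" + round + ".txt"), "1");
    }
}
```

The key point is that the "file not found" case is an expected state (locked), not an exception to crash on.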

How to use a Java program to run command prompt commands?

This is my first time posting here, so I'm not really sure what to say/ask.
Anyway, I am trying to make a simple Java program that runs command prompt commands from the Java program, mainly used for ping flooding (ping flooding myself).
Here is my current code:
public class Core extends JFrame {
    JTextField ipTextField;
    int packets = 0;
    boolean running = false;

    public Core() {
        super("Fatique");
        Container container = getContentPane();
        JButton bAttack = new JButton("Start Attack");
        JButton bStop = new JButton("Stop Attack");
        JPanel jPanel = new JPanel();
        container.setLayout(new FlowLayout());
        ipTextField = new JTextField("IP Address", 30);
        container.add(ipTextField);
        bAttack.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent e) {
                String input = ipTextField.getText();
                String[] value = input.split(":");
                int amountOfPackets = Integer.parseInt(value[1]);
                exec("cmd /c" + input + " -t -n " + amountOfPackets);
                running = true;
            }
        });
        bStop.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent e) {
                stop();
            }
        });
        if (!running) {
            jPanel.add(bAttack);
        } else {
            jPanel.add(bStop);
        }
        add(jPanel);
    }

    public void exec(String cmd) {
        try {
            Process p = Runtime.getRuntime().exec(cmd);
            System.out.println(getOutput(p) + " - " + getPacketsSent());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public String getOutput(Process p) {
        String output = null;
        try {
            BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
            String line = null;
            while ((line = in.readLine()) != null) {
                output = line;
                packets++;
            }
            return output;
        } catch (IOException e) {
            System.err.println(e.getStackTrace());
        }
        return null;
    }

    public int getPacketsSent() {
        return packets;
    }

    public void stop() {
        exec("cmd /c break");
        running = false;
    }

    public static void main(String[] args) {
        Core c = new Core();
        c.setSize(500, 300);
        c.setVisible(true);
        c.setResizable(false);
        c.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        c.setLocationRelativeTo(null);
    }
}
I'm quite new at Java, so that might not do what I want it to do.
What I want it to do is: I enter an IP address in the text field and split it with ":", with the amount of packets after that, for instance
127.0.0.1:100
But now when I try to use that IP and packet amount, it returns "null - 0" (from the exec method), and I'm not even sure it did anything related to ping.
What I am trying to accomplish is, as I already said, to ping flood myself and then output whatever I get as a response, though I have no idea if this code does anything even related to that; I mostly use logic when coding Java.
Could someone explain why my code is not working how I want it to work? Please don't judge; as I already said, I'm quite new to Java programming.
EDIT: Here is a quick "informative" explanation of what I am trying to accomplish.
I type in an IP address and how many packets I want to send. In this explanation I am using the localhost IP and 5 packets.
I start the attack. At this point, I want the program to run the command prompt command
ping 127.0.0.1 -t -n 5
with 127.0.0.1 being the IP I put in the text field in my program, and 5 the amount of packets I put in the text field.
I started the attack, so this is what should happen in the command prompt (screenshot omitted; the language is Finnish, but it is still the same thing).
This is the basic explanation of what I am trying to accomplish; hopefully someone understood it and can help/tell why my code is not working, or is working but not printing the proper lines in the Eclipse console.
There is a problem with your getOutput method. It looks like you intend to collect every line of output, but in fact, since you are assigning line to output, you will only return the last line before the end of the stream.
To fix this, change
output = line;
to
output += line + "\n";
Or to be more correct:
output += line + LINE_SEPARATOR;
where you previously declared the latter as:
final String LINE_SEPARATOR = System.getProperty("line.separator");
That doesn't directly explain why you are getting null, but that might be because the command you are running is writing output to the 'error' stream rather than the 'output' stream.
Try something like this:
try {
    Runtime rt = Runtime.getRuntime();
    Process p = rt.exec("ping 192.168.16.67");
    InputStream in = p.getInputStream();
    OutputStream out = p.getOutputStream();
    InputStream err = p.getErrorStream();
    p.destroy();
} catch (Exception exc) {
}
Then you'll have to keep reading from the process's input and error streams to parse the ping command output continuously.
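One way to avoid juggling both streams is to merge stderr into stdout with ProcessBuilder before reading. A sketch (this assumes a POSIX `sh` for the demo command; on Windows you would launch `cmd /c ping ...` instead, and `MergedOutput` is an invented name):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

class MergedOutput {
    /** Runs a shell command and returns everything it printed on stdout OR stderr. */
    static String run(String shellCmd) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("sh", "-c", shellCmd);
        pb.redirectErrorStream(true);   // merge stderr into stdout
        Process p = pb.start();
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        p.waitFor();
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // One line goes to stdout, one to stderr; both show up in the result.
        System.out.print(run("echo out; echo err 1>&2"));
    }
}
```

With the streams merged, a single read loop sees the error messages too, which is exactly what you want when diagnosing why a command "printed nothing".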
bAttack.addActionListener(new ActionListener() {
    public void actionPerformed(ActionEvent e) {
        String input = ipTextField.getText();
        String[] value = input.split(":");
        int amountOfPackets = Integer.parseInt(value[1]);
        try {
            p = Runtime.getRuntime().exec("ping -n " + amountOfPackets + " " + value[0]);
        } catch (IOException e1) {
            e1.printStackTrace();
        }
        running = true;
    }
});
Just a small modification of your code; getOutput becomes:
public String getOutput(Process p) {
    String output = "";   // start with an empty string, not null
    try {
        BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = in.readLine()) != null) {
            output = output + line + "\n";
            packets++;
        }
        return output;
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}
I display the returned output in a JTextArea. I cannot show you the output here because I lack reputation.
The first line came out as "null" because output was originally initialized to null before lines were appended to it; starting from an empty string fixes that. Anyway, it works.
Hope this helps you. Have a good time coding.

Processing log files: distributing work among worker threads to find a simple sum

I want to distribute work among threads: load parts of a log file and then hand out the work of processing those parts.
In my simple example, I wrote 800,000 lines of data with a number in each line, and then I sum the numbers.
When I run this example, I get totals that are slightly off. Do you see anywhere in this threading code where threads might not complete properly and hence not total the numbers?
public void process() {
    final String d = FILE;
    FileInputStream stream = null;
    try {
        stream = new FileInputStream(d);
        final BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
        String data = "";
        do {
            final Stack<List<String>> allWork = new Stack<List<String>>();
            final Stack<ParserWorkerAtLineThread> threadPool = new Stack<ParserWorkerAtLineThread>();
            do {
                if (data != null) {
                    final List<String> currentWorkToDo = new ArrayList<String>();
                    do {
                        data = reader.readLine();
                        if (data != null) {
                            currentWorkToDo.add(data);
                        } // End of the if //
                    } while (data != null && (currentWorkToDo.size() < thresholdLinesToAdd));
                    // Hand out future work
                    allWork.push(currentWorkToDo);
                } // End of the if //
            } while (data != null && (allWork.size() < numberOfThreadsAllowedInPool));
            // Process the lines from the work to do //
            // Hand out the work
            for (final List<String> theCurrentTaskWork : allWork) {
                final ParserWorkerAtLineThread t = new ParserWorkerAtLineThread();
                t.data = theCurrentTaskWork;
                threadPool.push(t);
            }
            for (final Thread workerAboutToDoWork : threadPool) {
                workerAboutToDoWork.start();
                System.out.println(" -> Starting my work... My name is : " + workerAboutToDoWork.getName());
            } // End of the for //
            // Waiting on threads to finish //
            System.out.println("Waiting for all work to complete ... ");
            for (final Thread waiting : threadPool) {
                waiting.join();
            } // End of the for //
            System.out.println("Done waiting ... ");
        } while (data != null); // End of outer parse file loop //
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (stream != null) {
            try {
                stream.close();
            } catch (final IOException e) {
                e.printStackTrace();
            }
        } // End of the stream //
    } // End of the try - catch finally //
}
While you're at it, why not use a bounded BlockingQueue (ArrayBlockingQueue) of size thresholdLinesToAdd? The producer code would read the lines and use the queue's put method, which blocks until space is available.
As Chris mentioned before, use Executors.newFixedThreadPool() and submit your work items to it. Your consumers would call take(), which blocks until an element is available.
This is not map/reduce. If you wanted map/reduce, you would need another queue in the mix to which you would publish keys. As an example, if you were to count the number of INFO and DEBUG occurrences in your logs, your mapper would queue each extracted word as it encounters it. The reducer would dequeue the mapper's output and increment the counter for each word; the result of your reducer would be the word counts for DEBUG and INFO.
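The producer/consumer wiring described above can be reduced to a small, testable sketch. The `QueueSum` class and the poison-pill shutdown are illustrative choices (the real program would read lines from the log file instead of generating them):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

class QueueSum {
    // Sentinel telling a worker there is no more input (compared by reference).
    private static final String POISON = new String("STOP");

    /** Sums the integers 1..n by pushing them through a bounded queue to worker threads. */
    static long sum(int n, int workers) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(1000);
        AtomicLong total = new AtomicLong();
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        for (int i = 0; i < workers; i++) {
            pool.submit(() -> {
                try {
                    String line;
                    // take() blocks until a line (or the poison pill) arrives.
                    while ((line = queue.take()) != POISON) {
                        total.addAndGet(Long.parseLong(line));
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
        // Producer: put() blocks when the queue is full, so memory stays bounded.
        for (int i = 1; i <= n; i++) {
            queue.put(Integer.toString(i));
        }
        for (int i = 0; i < workers; i++) {
            queue.put(POISON);   // one pill per worker
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return total.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(sum(800_000, 4));   // prints 320000400000
    }
}
```

Because awaitTermination only returns after every worker has drained its pill, the total is read only when all additions are done, which avoids the "slightly off" totals from the question.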
