Read/write to file with java for personal program - java

So I'm assuming I'm not the only one who's asked this, but I have a specific matter I want resolved. Before I start, let me just say this is NOT a homework assignment; it is for my convenience and strictly personal use. It isn't so much the coding I'm having an issue with; it's mainly that I don't know how to get file access.
I'm developing a program that runs off the command prompt, and it currently stores all my passwords in the program itself. What I want to know is how I can have the bat file that runs the Java program access another file (like passwords.whatever extension it can read) and store the information there, so I don't have to store it in the program. I want to store the data in an array, but the primary issue I'm having is how to get Java to access a file, write to it, and then be able to read it whenever I want certain pieces of it.
If something didn't make sense, I'll gladly elaborate. I just want to make a program for myself, because I have too many passwords to remember in my head.

Learning the entire Java API just to reinvent the functionality available in many high quality, freely available programs seems like overkill.

for writing a file in Java:
import java.io.BufferedWriter;
import java.io.FileWriter;

try {
    // the second argument "true" appends to the file instead of overwriting it
    FileWriter fw = new FileWriter("C://myfile.txt", true);
    BufferedWriter bw = new BufferedWriter(fw);
    bw.write(mystringanddata); // mystringanddata is whatever String you want to store
    bw.close();
}
catch (Exception e) {
    System.out.print("ERROR " + e);
}
for reading a file in Java:
import java.io.BufferedReader;
import java.io.FileReader;

try {
    // FileReader does not take an append flag; just pass the path
    FileReader fr = new FileReader("C://myfile.txt");
    BufferedReader br = new BufferedReader(fr);
    String mystring;
    while ((mystring = br.readLine()) != null) {
        System.out.println("DATA IS " + mystring);
    }
    br.close();
}
catch (Exception e) {
    System.out.print("ERROR " + e);
}
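If you want the lines in an array-like structure rather than just printing them, here is a minimal sketch (same placeholder file name as above; the variable name entries is mine):
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

// read each line of the file into a list you can index into later
List<String> entries = new ArrayList<>();
try (BufferedReader br = new BufferedReader(new FileReader("C://myfile.txt"))) {
    String line;
    while ((line = br.readLine()) != null) {
        entries.add(line);
    }
} catch (Exception e) {
    System.out.print("ERROR " + e);
}
// entries.get(0) is the first stored line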

Related

Reading internal storage file is taking too much time in Android

I have large JSON data, around 20 MB (I create this data using JSON.stringify from JavaScript code). I'm writing this JSON data to an internal storage file on an Android device and reading it later. When I read the file it takes too much time, and I don't know whether it is actually reading or not. One more thing: I need to read it on the main thread only.
The code below works fine if I pass the value "Hello World" to the WriteFile method, but it fails with the large JSON.
public String ReadFile()
{
    StringBuffer text = new StringBuffer();
    String FILE_NAME = "file.txt";
    try {
        BufferedReader bReader = new BufferedReader(new InputStreamReader(openFileInput(FILE_NAME)));
        String line;
        int count = 0;
        while ((line = bReader.readLine()) != null) {
            text.append(line + "\n");
            alert("Reading File: " + ++count);
        }
    }
    catch (Exception e) {
        alert(e.toString());
    }
    return text.toString();
}
public String WriteFile(String data)
{
    String FILE_NAME = "file.txt";
    String result = "";
    try {
        FileOutputStream fos = openFileOutput(FILE_NAME, Context.MODE_PRIVATE);
        fos.write(data.toString().getBytes());
        result = "Success";
        fos.close();
    }
    catch (Exception e) {
        e.printStackTrace();
        result = "Error";
    }
    return result;
}
I have also added an alert in the while loop, but I cannot see any alert message. I have not seen the exception message either.
So there can be two problems:
Something is wrong in writing to the file (but I don't know how to verify this, because I don't think there is any way to view an internal storage file).
Something is wrong in my reading code.
Update1:
If, let's say, I cannot read such a large file in native Java code, then is there any way to read an internal storage Android file from WebView JavaScript code?
============================================================================
Application Requirement
I have an Android application in which I have a WebView. I have copied the full JavaScript code (js and HTML files) to the assets folder of the app. I'm writing to the file from native Java code and reading from native Java code. I am getting all the data from the server on app launch. My client has a very slow internet connection and it gets disconnected many times, so they want this app to run in offline mode. This means the app will get all the data at launch, we will store it somewhere, and then read it throughout the app. If a user launches the app again it will get the old existing data. This data is very big, so I'm storing it in an internal storage file.
First of all, the only way to be really sure why your code is taking a long time is to profile it. We can't do that for you.
But here are some performance tips relevant to your code:
Don't read the entire 20MB JSON file into the Java heap / RAM memory unless you really need to do it. (I am finding it difficult to understand why you are doing this. For example, a typical JSON parser will happily1 read input directly from a file. Or if you are reading this so that you can send this to a client on the other end of an HTTP connection, you should be able to stream the data.)
Reading a file a line at a time and then stitching the lines back together is unnecessary. It generates unnecessary garbage. Extra garbage means more work for the GC, which slows you down. If the lines are long, you have the added performance "hit" of using an internal StringBuilder to build each line.
Reading to a recycled char[], then appending the char[] content to the StringBuilder will be faster than appending lines.
Your StringBuilder will repeatedly "grow" its backing character array to accommodate the characters as you append them. This generates garbage and leads to unnecessary copying. (Implementations typically "grow" the array exponentially to avoid O(N^2) behavior. However, the expansions still affect performance, and peak memory usage can end up as much as 3 times what is actually required.)
One way to avoid this is to get an initial estimate of the number of characters you are going to add and set the StringBuilder "capacity" accordingly. You may be able to estimate the number of characters from the file size. (It depends on the encoding.) A short sketch after these tips shows the recycled char[] and pre-sized StringBuilder ideas together.
Look for a way to do it using existing standard Java libraries; e.g. Files.copy and ByteArrayOutputStream, or Files.readAllBytes
Look for an existing 3rd-party library method; e.g. Apache Commons IO has an IOUtils.toString(Reader) method. The chances are that they will have spent a lot of time figuring out how to do this efficiently. Reusing a well engineered, well maintained library is likely to save you time.
Don't put a trace print (I assume that is what alert is ...) in the middle of a loop that could be called millions of times. (Duh!)
1 - Parsers are cheerful once you get to know them :-)
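To illustrate the recycled char[] and pre-sized StringBuilder tips above, here is a minimal sketch (the single-byte-encoding assumption and the method name are mine, not from the question):
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;

// Read a whole text file into a StringBuilder using a recycled char[] buffer.
// Pre-sizing from the file length assumes a single-byte encoding.
static String readWholeFile(File file) throws IOException {
    StringBuilder sb = new StringBuilder((int) file.length()); // avoid repeated regrowth
    char[] buf = new char[8192];                               // recycled buffer
    try (Reader in = new FileReader(file)) {
        int n;
        while ((n = in.read(buf)) != -1) {
            sb.append(buf, 0, n);                              // append chars, not lines
        }
    }
    return sb.toString();
}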

How to create and scan a .txt file and to and from a Java ArrayList

I am a beginning Java programmer and new around this site. I am working on a Java project that involves saving and reloading .txt files. I realize that there are many similar questions out there, but none of them are exactly what I am looking for. If I am mistaken, I am very sorry.
I am trying to design a Java program for a classroom when the teacher wants to assign new seats. First, I am going to explain my vision for the program. The program will allow you to type in the students' names, which it will separate into two separate ArrayLists, one for the boys in the class and another for the girls. I want these ArrayLists to be saved somehow into a .txt file so they can be read later. I have done this before with simple strings:
PrintWriter out = new PrintWriter("list.txt");
out.println("Bobby Joe");
out.println("John Doe");
out.close();
Is there a different way to do this with ArrayLists? The PrintWriter worked perfectly when I used simple strings. Another part of the program will read those .txt files and import them into ArrayLists again. There, it can use the ArrayLists to assign seats for the children in the class. I am completely unaware how to load .txt files.
I am pretty sure these are the only two parts of the program I will need help with, I can use the randomizer and other simple methods to program the rest.
I would really appreciate if you could help me create my Java program. Thank you!
I assume you want the names to be written line-by-line.
Edit your writing method to write the names from a list:
PrintWriter out = new PrintWriter("list.txt");
list.forEach(out::println);
out.close();
And then you can read them with something like this:
List<String> list = Files.lines(Paths.get("list.txt")).collect(Collectors.toList());
I propose using serialization for this: just write/read the full ArrayList object to/from a file.
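A minimal sketch of that approach (the file name and list contents are placeholders; this belongs in a method that handles IOException and ClassNotFoundException):
import java.io.*;
import java.util.ArrayList;

ArrayList<String> boys = new ArrayList<>();
boys.add("Bobby Joe");
boys.add("John Doe");

// write the whole list as a single serialized object
try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("boys.ser"))) {
    out.writeObject(boys);
}

// read it back later
try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("boys.ser"))) {
    @SuppressWarnings("unchecked")
    ArrayList<String> loaded = (ArrayList<String>) in.readObject();
    System.out.println(loaded);
}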

Managing file bytes with DataOutput/InputStream

I have a program that will go through and create multiple different class instances. I want to write the details of each instance to a file using DataOutputStream (it's a necessary exercise, I'll look at other ways of doing this later), but the problem is I noticed that DataOutputStream overwrites the file each time a new instance is created and written. My first idea was each time a new instance is written, first using DataInputStream to get what's in the file, save it, and then rewrite it with the new instance. This seems like it could get confusing very fast. What would be best practice for something like this? Thanks in advance.
EDIT: I will try and be a bit more specific about what I'm trying to do here.
When I take the class that I want to write to the file, first I'll use a DataInputStream.readFully to get everything in the file. My understanding is that this takes all the bytes in the file and stores them in an array. I would like to compare this with the class instance, and if the instance matches something already in the file, not output this particular instance (because it's already there) to the file. Otherwise, append to the file.
Use the FileOutputStream(File file, boolean append) constructor when you open the file for writing. For example:
File f = new File("C:\\data.txt"); // the backslash must be escaped in a Java string literal
FileOutputStream fos = new FileOutputStream(f, true); // open file for appending
DataOutputStream dos = new DataOutputStream(fos);
// anything written to dos after this point will be appended
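For the duplicate check described in the edit, one hedged sketch (the single-UTF-string record format and the file/variable names are assumptions for the example; put this in a method that throws IOException) is to read the existing records first and only append ones that are not already present:
import java.io.*;
import java.util.HashSet;
import java.util.Set;

File file = new File("instances.dat");          // hypothetical file name

// 1. Read everything already in the file.
Set<String> existing = new HashSet<>();
if (file.exists()) {
    try (DataInputStream in = new DataInputStream(new FileInputStream(file))) {
        while (true) {
            existing.add(in.readUTF());         // EOFException ends the loop
        }
    } catch (EOFException endOfFile) {
        // reached the end of the file
    }
}

// 2. Append the new record only if it is not already there.
String record = "some instance details";        // hypothetical record
if (!existing.contains(record)) {
    try (DataOutputStream out = new DataOutputStream(new FileOutputStream(file, true))) {
        out.writeUTF(record);
    }
}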
If you just need to serialize your objects, I'd highly recommend using JAXB or another serialization/marshaling API instead of reinventing the wheel. You'll potentially save a ton of time.

Reading Random Access File with Buffered Reader

I am trying to read a huge file (> 1 GB), and I am thinking that reading it as a random access file with a buffered reader would be efficient.
I need to read the file line by line and parse it.
However, being new to the Java IO API, I'm not sure how I can do this.
I appreciate your help.
You can use Java's BufferedReader for this:
BufferedReader reader = new BufferedReader(new FileReader(fileName));
String line;
while ((line = reader.readLine()) != null) {
    // Do some stuff with the line
}
reader.close();
fileName is the path to the file you want to read.
Do you need to read all of it and from the beginning? You can use a RandomAccessFile to jump to different parts of the file if you know what byte you can start at. I think it is the seek function that does this.
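A minimal sketch of jumping to a known byte offset with RandomAccessFile (the path and offset are placeholders; readLine here is unbuffered, so it is fine for a peek but slow for scanning many lines):
import java.io.IOException;
import java.io.RandomAccessFile;

try (RandomAccessFile raf = new RandomAccessFile("huge.log", "r")) {
    raf.seek(1_000_000L);              // jump to a known byte offset
    String line = raf.readLine();      // read the rest of that line
    System.out.println(line);
} catch (IOException e) {
    e.printStackTrace();
}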
While it is perfectly doable in Java, I wanted to suggest, based on my experience:
If you're on a Unix platform, you may use an external shell script for searching through GBs of logs. sed is very well suited for this purpose. Specific usage here: http://www.grymoire.com/Unix/Sed.html
Call the shell script from your Java code whenever you need to read/grep through the log file.
How?
1) In your Java code, use the ProcessBuilder class. It can take the shell script as a constructor argument:
ProcessBuilder obj = new ProcessBuilder("FastLogRead.sh");
2) Create the Process object:
Process process = obj.start();
3) You can read the output of this shell script directly in a BufferedReader:
BufferedReader br=new BufferedReader(new InputStreamReader(process.getInputStream()));
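Putting the three steps together, a minimal sketch (it assumes FastLogRead.sh is executable and reachable from the working directory, and belongs in a method that handles IOException and InterruptedException):
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Launch the shell script and stream its output line by line.
ProcessBuilder pb = new ProcessBuilder("FastLogRead.sh");
pb.redirectErrorStream(true);                       // merge stderr into stdout
Process process = pb.start();
try (BufferedReader br = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
    String line;
    while ((line = br.readLine()) != null) {
        System.out.println(line);                   // parse/use each line here
    }
}
int exitCode = process.waitFor();                   // wait for the script to finish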
Pros:
Speeds up execution by an average of 10 times (I searched through a roughly 4 GB log file)
Cons:
Some developers don't like bringing a lightweight shell script into the realm of Java, and hence prefer to go with Java's RandomAccessFile. This is justified.
For your case, you may choose between standardization and performance.

How do I write/read to the beginning of a text file?

EDIT
This is my file reader; can I make it read from the bottom up, seeing how difficult it is to make it write from the bottom up?
BufferedReader mainChat = new BufferedReader(new FileReader("./messages/messages.txt"));
String str;
while ((str = mainChat.readLine()) != null)
{
    System.out.println(str);
}
mainChat.close();
OR (old question)
How can I make it put the next String at the beginning of the file and then insert a new line (to shift the other lines down)?
FileWriter chatBuffer = new FileWriter("./messages/messages.txt",true);
BufferedWriter mainChat = new BufferedWriter(chatBuffer);
mainChat.write(message);
mainChat.newLine();
mainChat.flush();
mainChat.close();
Someone could correct me, but I'm pretty sure in most operating systems, there is no option but to read the whole file in, then write it back again.
I suppose the main reason is that, in most modern OSs, all files on the disc start at the beginning of a boundary. The problem is, you cannot tell the file allocation table that your file starts earlier than that point.
Therefore, all the later bytes in the file have to be rewritten. I don't know of any OS routines that do this in one step.
So, I would use a BufferedReader to store the whole file into a Vector or StringBuffer, then write it all back with the prepended string first.
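A sketch of that read-everything-then-rewrite approach using the standard library (the path matches the question's file; the new first line is a placeholder, and an ArrayList stands in for the Vector or StringBuffer; put this in a method that throws IOException):
import java.nio.file.*;
import java.util.ArrayList;
import java.util.List;

// Read all existing lines, then rewrite the file with the new line first.
Path path = Paths.get("./messages/messages.txt");
List<String> lines = new ArrayList<>();
lines.add("the new first line");            // placeholder for the message to prepend
lines.addAll(Files.readAllLines(path));     // whole old file held in memory
Files.write(path, lines);                   // rewrite the file from scratch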
--
Edit
A way that would save memory for larger files, following #Saury's RandomAccessFile suggestion, would be:
file has N bytes to start with
we want to add on "hello world" (11 bytes)
open the file for append
append 11 spaces
i = N - 1
loop {
    go back to byte i
    read a byte
    move to byte i + 11
    write that byte back
    i--
} until i < 0
then move to byte 0
write "hello world"
voila
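In Java, a hedged sketch of that byte shuffle using RandomAccessFile (the method name, path, and prefix are placeholders; it moves one byte at a time, so it will be slow for very large files):
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

// Shift the existing bytes forward by prefix.length, then write the prefix at offset 0.
static void prepend(String path, String prefix) throws IOException {
    byte[] prefixBytes = prefix.getBytes(StandardCharsets.US_ASCII);
    try (RandomAccessFile raf = new RandomAccessFile(path, "rw")) {
        long n = raf.length();
        raf.setLength(n + prefixBytes.length);        // the "append 11 spaces" step
        byte[] b = new byte[1];
        for (long i = n - 1; i >= 0; i--) {           // copy from the end backwards
            raf.seek(i);
            raf.readFully(b);
            raf.seek(i + prefixBytes.length);
            raf.write(b);
        }
        raf.seek(0);
        raf.write(prefixBytes);                       // finally write "hello world"
    }
}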
Use FileUtils from Apache Common IO to simplify this if you can. However, it still needs to read the whole file in so it will be slow for large files.
List<String> newList = new ArrayList<>(Arrays.asList("3")); // wrap in ArrayList so addAll works (Arrays.asList is fixed-size)
File file = new File("./messages/messages.txt");
newList.addAll(FileUtils.readLines(file));
FileUtils.writeLines(file, newList);
FileUtils also has read/write methods that take care of encoding.
Use RandomAccessFile to read/write the file in reverse order. See following links for more details.
http://www.java2s.com/Code/Java/File-Input-Output/UseRandomAccessFiletoreverseafile.htm
http://download.oracle.com/javase/1.5.0/docs/api/java/io/RandomAccessFile.html
As was suggested here, prepending to a file is rather difficult and is indeed linked to how files are stored on the hard drive. The operation is not naturally available from the OS, so you will have to implement it yourself, and the most obvious answers involve reading the whole file and writing it again. This may be fine for you, but it incurs significant cost and could be a bottleneck for your application's performance.
Appending would be the natural choice but this would, as far as I understand, make reading the file unnatural.
There are many ways you could tackle this depending on the specificities of your situation.
If writing this file is not time critical in your application and the file does not grow too big, you could bite the bullet and read the whole file, prepend the information, and write it again. Apache commons-io's FileUtils will be of help here, simplifying the operation: you can read the file as a list of strings, prepend the new lines to the list, and write the list again.
If writing is time critical but you have control over how the file is read (that is, the file is to be read by another of your programs), you could simply append as discussed above, then load the file into a list of lines and reverse the list when reading. Again, FileUtils from the commons-io library and helper functions in the Collections class in the standard JDK should do the trick nicely.
If writing is time critical but the file is intended to be read through a normal text editor, you could create a small class or program that reads the file and writes it to another file in the preferred order.
