I have two Java classes; call them class A and class B.
Class A gets String input from the user and stores it as bytes in a file, then class B should read the file and display the bytes as a String.
CLASS A:
File file = new File("C:\\FILE.txt");
file.createNewFile();
FileOutputStream fos = new FileOutputStream(file);
String fwrite = user_input_1+"\n"+user_input_2;
fos.write(fwrite.getBytes());
fos.flush();
fos.close();
In CLASS B, I wrote the code to read the file, but I don't know how to read the file content as bytes.
CLASS B:
fr = new FileReader(file);
br = new BufferedReader(fr);
arr = new ArrayList<String>();
int i = 0;
while((getF = br.readLine()) != null){
arr.add(getF);
}
String[] sarr = (String[]) arr.toArray(new String[0]);
FILE.txt contains the following lines:
[B#3ce76a1
[B#36245605
I want both of these lines to be converted back into their respective String values and then displayed. How can I do it?
Are you forced to save the data as a String's byte[] representation? Take a look at object serialization (see the Object Serialization Tutorial); with it you don't have to worry about any low-level line-by-line read or write methods.
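For instance, a minimal sketch of the serialization approach, assuming the two inputs are kept in a String[] and using a placeholder file name (FILE.ser is not from the original code):
import java.io.*;

public class SerializationSketch {
    public static void main(String[] args) throws IOException, ClassNotFoundException {
        // Stand-ins for user_input_1 and user_input_2
        String[] inputs = { "first user input", "second user input" };

        // Class A side: write the array as a serialized object
        try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream("C:\\FILE.ser"))) {
            oos.writeObject(inputs);
        }

        // Class B side: read it back and display the strings
        try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream("C:\\FILE.ser"))) {
            String[] restored = (String[]) ois.readObject();
            for (String s : restored) {
                System.out.println(s);
            }
        }
    }
}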
Since you are writing a byte array through the FileOutputStream, the opposite operation would be to read the file using the FileInputStream, and construct the String from the byte array:
File file = new File("C:\\FILE.txt");
long fileLength = file.length();
byte[] bytes = new byte[(int) fileLength];
try (FileInputStream fis = new FileInputStream(file)) {
fis.read(bytes); // note: a single read() is not guaranteed to fill the whole array
}
String result = new String(bytes);
However, there are better ways to write a String to a file.
You could write it using a FileWriter and read it back with a FileReader (possibly wrapping them in the corresponding BufferedWriter/BufferedReader), which avoids creating an intermediate byte array. Or better yet, use Apache Commons' IOUtils or Google's Guava libraries.
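For example, a minimal sketch of the plain FileWriter/FileReader approach mentioned above (the path and the two input strings are placeholders):
import java.io.*;

public class WriterReaderSketch {
    public static void main(String[] args) throws IOException {
        File file = new File("C:\\FILE.txt");

        // Write the two lines as characters instead of raw bytes
        try (BufferedWriter writer = new BufferedWriter(new FileWriter(file))) {
            writer.write("user_input_1");
            writer.newLine();
            writer.write("user_input_2");
        }

        // Read them back line by line and display them
        try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}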
Currently I have a source file which contains Base64-encoded data (approximately 20 MB in size). I want to read from this file, decode the data, and write it to a .TIF output file. However, I don't want to decode all 20 MB at once; I want to read a specific number of characters/bytes from the source file, decode them, and write them to the destination file. I understand that the amount of data I read from the source file has to be a multiple of 4, or else it can't be decoded?
Below is my current code, where I decode it all at once:
public void writeOutput(File file) throws IOException {
BufferedReader br = new BufferedReader(new FileReader(file));
StringBuilder sb = new StringBuilder();
String line = br.readLine();
while (line != null) {
sb.append(line); //Read line by line and append to sb
line = br.readLine();
}
br.close();
byte[] decoded = Base64.getMimeDecoder().decode(sb.toString());
File outputFile = new File("output.tif");
OutputStream out = new BufferedOutputStream(new FileOutputStream(outputFile));
out.write(decoded);
out.flush();
out.close();
}
How can I read a specific number of characters from the source file, decode them, and then write to the output file so that I don't have to load everything into memory?
Here is a simple method to demonstrate doing this, by wrapping the Base64 Decoder around an input stream and reading into an appropriately sized byte array.
public static void readBase64File(File inputFile, File outputFile, int chunkSize) throws IOException {
FileInputStream fin = new FileInputStream(inputFile);
FileOutputStream fout = new FileOutputStream(outputFile);
InputStream base64Stream = Base64.getMimeDecoder().wrap(fin);
byte[] chunk = new byte[chunkSize];
int read;
while ((read = base64Stream.read(chunk)) != -1) {
fout.write(chunk, 0, read);
}
fin.close();
fout.close();
}
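For example, calling it might look like this (the file names and the 4096-byte chunk size are assumed values, not taken from the question):
File encoded = new File("encoded.txt"); // assumed Base64 source file
File tif = new File("output.tif");      // decoded output
readBase64File(encoded, tif, 4096);     // chunk size chosen arbitrarily for this sketch
Because the decoder is wrapped around the underlying stream, it buffers partial Base64 groups internally, so the number of bytes you read per call does not have to be a multiple of 4.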
I have a fairly big (4 MB) .txt file with data for a monolingual dictionary. Because the explanations of the words are split across multiple lines, I can't read it line by line. On the other hand, I have "###" separators which I can use.
My question is: what is the most efficient way to load this text into a map in Java/Android?
Load the file into a single String and call split("###") on it. That gives you an array of strings split by your separator. 4 MB is fine to load into memory at once.
byte[] encoded = Files.readAllBytes(Paths.get(filePath));
String fileContents = new String(encoded, encoding); // encoding = the file's charset, e.g. StandardCharsets.UTF_8
String[] lines = fileContents.split("###");
Update: not sure you can use that code to read a file on Android (it's for Java SE 7). On Android you can use code like this:
FileInputStream fis = openFileInput(filePath);
StringBuilder fileContent = new StringBuilder();
byte[] buffer = new byte[1024];
int n;
while ((n = fis.read(buffer)) != -1)
{
fileContent.append(new String(buffer, 0, n)); // note: fixed-size byte chunks can split a multi-byte character; fine for single-byte encodings
}
fis.close();
String[] lines = fileContent.toString().split("###");
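To go from the split pieces to a map, a minimal sketch could look like the following; it assumes each "###"-delimited entry starts with the headword on its own first line, which the question does not actually specify:
Map<String, String> dictionary = new HashMap<String, String>();
for (String entry : lines) {
    String trimmed = entry.trim();
    if (trimmed.isEmpty()) {
        continue; // skip empty fragments, e.g. before a leading separator
    }
    // Assumed entry format: first line = headword, remainder = explanation
    int newline = trimmed.indexOf('\n');
    if (newline > 0) {
        dictionary.put(trimmed.substring(0, newline).trim(),
                trimmed.substring(newline + 1).trim());
    }
}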
I have a few files in a local folder. I want to store the file names as the keys and the content of the corresponding files as the values.
HashMap<String,String> hm = new HashMap<String,String>();
hm.put(filename,filecontent);
Can someone tell me if this is the right way to do it?
When storing file contents as a String, you have to make sure the encoding is respected; I would recommend using a byte array instead:
Map<String, byte[]> hm = new HashMap<String, byte[]>();
Also: depending on how many files you are manipulating, you may want to consider using file streams to avoid keeping everything in memory.
There are a couple of steps to what you would like.
I am going to assume you have the filename already as a String.
HashMap<String, byte[]> hm = new HashMap<String, byte[]>(); //Initialize our hashmap with String as key and byte array as value (binary data)
FileInputStream fileStream = new FileInputStream(filename); //Get a stream of the file
byte[] buff = new byte[512]; //A buffer for our read loop
ByteArrayOutputStream byteStream = new ByteArrayOutputStream(); //Where to write buff content to; toByteArray() gives us the final byte array
int read;
while((read = fileStream.read(buff)) > 0) { //read up to 512 bytes of the file at a time, until end of file
byteStream.write(buff, 0, read); //write only the bytes actually read, not the whole buffer
}
fileStream.close(); //Close our file handle so the OS resource is released
hm.put(filename, byteStream.toByteArray()); //insert filename and file content into the hashmap
As others have suggested, however, this is less than ideal. You are holding multiple files in memory for an arbitrary length of time. You can eat a lot of RAM without realizing it and quickly run into an out-of-memory error.
You would be better off reading the file content only when needed, so there isn't a whole file sitting in your RAM for god knows how long. The only plausible reason I can see to store file contents would be if you were reading them a lot and could afford the RAM to cache the files in memory.
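A minimal sketch of that read-on-demand idea: keep only the File handles in the map and load the bytes at the moment they are needed (the folder path and "example.txt" below are placeholders):
Map<String, File> files = new HashMap<String, File>();
for (File f : new File("/home/you/Desktop").listFiles()) { // placeholder folder
    if (f.isFile()) {
        files.put(f.getName(), f);
    }
}

// Read a file's content only when it is actually needed
byte[] content = java.nio.file.Files.readAllBytes(files.get("example.txt").toPath());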
Update for Binary Data
HashMap<String, byte[]> hm = new HashMap<String, byte[]>();
final File folder = new File("/home/you/Desktop");
listFilesForFolder(folder);
public void listFilesForFolder(final File folder) {
for (final File fileEntry : folder.listFiles()) {
if (fileEntry.isDirectory()) {
listFilesForFolder(fileEntry);
} else {
String name = fileEntry.getName();
byte[] fileData = new byte[(int) fileEntry.length()];
DataInputStream dis = new DataInputStream(new FileInputStream(fileEntry));
dis.readFully(fileData);
dis.close();
hm.put(name,fileData);
}
}
}
Tested with a zip file for the OP:
public static void main(String[] args) throws FileNotFoundException, IOException {
File file = new File("D:\\try.zip");
System.out.println(file.length());
byte[] fileData = new byte[(int) file.length()];
DataInputStream dis = new DataInputStream(new FileInputStream(file));
dis.readFully(fileData);
dis.close();
}
I am trying to use a FileInputStream to essentially read in a text file, and then output it in a different text file. However, I always get very strange characters when I do this. I'm sure it's some simple mistake I'm making, thanks for any help or pointing me in the right direction. Here's what I've got so far.
File sendFile = new File(fileName);
FileInputStream fileIn = new FileInputStream(sendFile);
byte buf[] = new byte[1024];
while(fileIn.read(buf) > 0) {
System.out.println(buf);
}
The file it is reading from is just a big text file of regular ASCII characters. Whenever I call System.out.println, however, I get output like [B#a422ede. Any ideas on how to make this work? Thanks.
This happens because you are printing a byte array object itself, rather than printing its content. You should construct a String from the buffer and a length, and print that String instead. The constructor to use for this is
String s = new String(buf, 0, len, charsetName);
Above, len should be the value returned by the call of the read() method. The charsetName should represent the encoding used by the underlying file.
If you're reading from a file to another file, you shouldn't convert the bytes to a string at all, just write the bytes read into the other file.
If your intention is to convert a text file from an encoding to another, read from a new InputStreamReader(in, sourceEncoding), and write to a new OutputStreamWriter(out, targetEncoding).
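A minimal sketch of such a conversion, with assumed file names and encodings:
// Assumed names and encodings; substitute your own
try (Reader reader = new InputStreamReader(new FileInputStream("in.txt"), "windows-1250");
     Writer writer = new OutputStreamWriter(new FileOutputStream("out.txt"), "UTF-8")) {
    char[] buf = new char[4096];
    int len;
    while ((len = reader.read(buf)) != -1) {
        writer.write(buf, 0, len); // characters are re-encoded on the way out
    }
}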
That's because printing buf prints the reference to the byte array, not the bytes themselves as a String as you would expect. You need to use new String(buf) to construct a String from the byte array.
Also consider using BufferedReader rather than creating your own buffer. With it you can just do
String line = new BufferedReader(new FileReader("filename.txt")).readLine();
Your loop should look like this:
int len;
while((len = fileIn.read(buf)) > 0) {
System.out.write(buf, 0, len);
}
You are (a) using the wrong method and (b) ignoring the length returned by read(), other than checking it for < 0. So you are printing junk at the end of each buffer.
An object's default toString() method returns the class name followed by the object's identity hash code, not the object's contents.
byte buf[] is an object.
You can print its contents like this:
File sendFile = new File(fileName);
FileInputStream fileIn = new FileInputStream(sendFile);
byte buf[] = new byte[1024];
while(fileIn.read(buf) > 0) {
System.out.println(Arrays.toString(buf)); // prints the numeric byte values; note this prints the whole 1024-byte buffer, including stale bytes past what the last read() returned
}
or
File sendFile = new File(fileName);
FileInputStream fileIn = new FileInputStream(sendFile);
byte buf[] = new byte[1024];
int len=0;
while((len=fileIn.read(buf)) > 0) {
for(int i=0;i<len;i++){
System.out.print(buf[i]);
}
System.out.println();
}
I have a text file with a string whose code page is 1250. I want to save the text into a RandomAccessFile. When I read the bytes back from the RandomAccessFile, I get a string with different characters. Any solution?
If you're using writeUTF() then you should read its JavaDoc to learn that it always writes modified UTF-8.
If you want to use another encoding, then you'll have to "manually" do the encoding and somehow store the length of the byte[] as well.
For example:
RandomAccessFile raf = ...;
String writeThis = ...;
byte[] cp1250Data = writeThis.getBytes("cp1250");
raf.writeInt(cp1250Data.length);
raf.write(cp1250Data);
Reading would work similarly:
RandomAccessFile raf = ...;
int length = raf.readInt();
byte[] cp1250Data = new byte[length];
raf.readFully(cp1250Data);
String string = new String(cp1250Data, "cp1250");
This code will write and read a string using the 1250 code page. Of course, you will need to clean it up, check exceptions, and close the streams properly before putting it in production :)
public static void main(String[] args) throws Exception {
File file = new File("/toto.txt");
String myString = "This is a test";
OutputStreamWriter w = new OutputStreamWriter(new FileOutputStream(file), Charset.forName("windows-1250"));
w.write(myString);
w.flush();
CharBuffer b = CharBuffer.allocate((int) file.length());
new InputStreamReader(new FileInputStream(file), Charset.forName("windows-1250")).read(b);
b.flip(); // flip the buffer so toString() returns the characters that were read, not the unfilled tail
System.out.println(b.toString());
}