What I did so far:
I read a file1 with text, XORed the bytes with a key, and wrote the result to another file2.
My problem: I read, for example, 'H' from file1; the byte value is 72.
72 XOR -32 = -88
Now I wrote -88 into file2.
When I read file2 I should get -88 as the first byte, but I get -3.
public byte[] readInput(String File) throws IOException {
    Path path = Paths.get(File);
    byte[] data = Files.readAllBytes(path);
    byte[] x = new byte[data.length];
    FileInputStream fis = new FileInputStream(File);
    InputStreamReader isr = new InputStreamReader(fis); // utf8
    Reader in = new BufferedReader(isr);
    int ch;
    int s = 0;
    while ((ch = in.read()) > -1) { // read till EOF
        x[s] = (byte) ch;
        s++;
    }
    in.close();
    return x;
}
public void writeOutput(byte encrypted[], String file) {
    try {
        FileOutputStream fos = new FileOutputStream(file);
        Writer out = new OutputStreamWriter(fos, "UTF-8"); // utf8
        String s = new String(encrypted, "UTF-8");
        out.write(s);
        out.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
public byte[] DNcryption(byte[] key, byte[] mssg) {
    if (mssg.length == key.length) {
        byte[] encryptedBytes = new byte[key.length];
        for (int i = 0; i < key.length; i++) {
            encryptedBytes[i] = (byte) (mssg[i] ^ key[i]); // XOR
        }
        return encryptedBytes;
    } else {
        return null;
    }
}
You're not reading the file as bytes - you're reading it as characters. The encrypted data isn't valid UTF-8-encoded text, so you shouldn't try to read it as such.
Likewise, you shouldn't be writing arbitrary byte arrays as if they're UTF-8-encoded text.
Basically, your methods have signatures accepting or returning arbitrary binary data - don't use Writer or Reader classes at all. Just write the data straight to the stream. (And don't swallow the exception, either - do you really want to continue if you've failed to write important data?)
I would actually remove both your readInput and writeOutput methods entirely. Instead, use Files.readAllBytes and Files.write.
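To make that concrete, here is a minimal, self-contained sketch of the whole round trip using Files.readAllBytes and Files.write (the temp files and the single-byte -32 key are assumptions for the demo):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class XorFileDemo {
    // XOR each message byte with the matching key byte.
    static byte[] xor(byte[] key, byte[] msg) {
        byte[] out = new byte[msg.length];
        for (int i = 0; i < msg.length; i++) {
            out[i] = (byte) (msg[i] ^ key[i]);
        }
        return out;
    }

    public static void main(String[] args) throws IOException {
        // Temp files stand in for file1/file2 from the question.
        Path file1 = Files.createTempFile("file1", null);
        Path file2 = Files.createTempFile("file2", null);
        Files.write(file1, "Hello".getBytes(StandardCharsets.UTF_8));

        byte[] data = Files.readAllBytes(file1);
        byte[] key = new byte[data.length];
        Arrays.fill(key, (byte) -32);

        // Files.write stores the raw bytes, so 'H' (72) really becomes -88 on disk,
        // and XORing a second time restores the original.
        Files.write(file2, xor(key, data));
        byte[] decrypted = xor(key, Files.readAllBytes(file2));
        System.out.println(new String(decrypted, StandardCharsets.UTF_8)); // Hello
    }
}
```

Because XOR is its own inverse, the same DNcryption-style method serves for both encryption and decryption.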
In the writeOutput method you convert the encrypted byte array into a UTF-8 String, which changes the actual bytes you later write to the file. Try this code snippet to see what happens when you convert a byte array with negative values to a UTF-8 String:
final String s = new String(new byte[]{-1}, "UTF-8");
System.out.println(Arrays.toString(s.getBytes("UTF-8")));
It will print something like [-17, -65, -67]. Try using OutputStream to write bytes to the file.
new FileOutputStream(file).write(encrypted);
Currently I have a source file which has Base64-encoded data (approx. 20 MB in size). I want to read from this file, decode the data, and write it to a .TIF output file. However, I don't want to decode all 20 MB at once. I want to read a specific number of characters/bytes from the source file, decode them, and write them to the destination file. I understand that the size of the data I read from the source file has to be a multiple of 4, or else it can't be decoded?
Below is my current code where I decode it all at once
public void writeOutput(File file) throws IOException {
    BufferedReader br = new BufferedReader(new FileReader(file));
    StringBuilder sb = new StringBuilder();
    String line = br.readLine();
    while (line != null) {
        // Read line by line and append to sb
        sb.append(line);
        line = br.readLine();
    }
    byte[] decoded = Base64.getMimeDecoder().decode(sb.toString());
    File outputFile = new File("output.tif");
    OutputStream out = new BufferedOutputStream(new FileOutputStream(outputFile));
    out.write(decoded);
    out.flush();
}
How can I read a specific number of characters from the source file, decode them, and then write them to the output file, so I don't have to load everything into memory?
Here is a simple method to demonstrate doing this, by wrapping the Base64 Decoder around an input stream and reading into an appropriately sized byte array.
public static void readBase64File(File inputFile, File outputFile, int chunkSize) throws IOException {
    FileInputStream fin = new FileInputStream(inputFile);
    FileOutputStream fout = new FileOutputStream(outputFile);
    InputStream base64Stream = Base64.getMimeDecoder().wrap(fin);
    byte[] chunk = new byte[chunkSize];
    int read;
    while ((read = base64Stream.read(chunk)) != -1) {
        fout.write(chunk, 0, read);
    }
    fin.close();
    fout.close();
}
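A hypothetical way to exercise that idea end to end (the file names, data size, and chunk size are made up for the demo; the same wrap-and-copy loop is inlined here so the sketch runs standalone):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class ChunkedBase64Demo {
    public static void main(String[] args) throws Exception {
        byte[] original = new byte[10_000];
        new java.util.Random(42).nextBytes(original);

        // Write a MIME-encoded source file (a small stand-in for the 20 MB input).
        Path encoded = Files.createTempFile("src", ".b64");
        Files.write(encoded, Base64.getMimeEncoder().encode(original));

        // Decode in fixed-size chunks; only one chunk is ever held in memory.
        Path decoded = Files.createTempFile("out", ".tif");
        try (InputStream b64 = Base64.getMimeDecoder().wrap(Files.newInputStream(encoded));
             OutputStream out = Files.newOutputStream(decoded)) {
            byte[] chunk = new byte[4096];
            int read;
            while ((read = b64.read(chunk)) != -1) {
                out.write(chunk, 0, read);
            }
        }
        System.out.println(java.util.Arrays.equals(original, Files.readAllBytes(decoded))); // true
    }
}
```

Note that the wrapped stream handles the multiple-of-4 alignment internally, so the chunk size you read with does not have to be a multiple of 4.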
I have a byte-array file that I am trying to convert into something human-readable. I tried the ways below:
public static void main(String args[]) throws IOException {
    // System.out.println("Platform Encoding : " + System.getProperty("file.encoding"));
    FileInputStream fis = new FileInputStream("<Path>");
    // Using Apache Commons IOUtils to read file into byte array
    byte[] filedata = IOUtils.toByteArray(fis);
    String str = new String(filedata, "UTF-8");
    System.out.println(str);
}
Another approach:
public static void main(String[] args) {
    File file = new File("<Path>");
    readContentIntoByteArray(file);
}

private static byte[] readContentIntoByteArray(File file) {
    FileInputStream fileInputStream = null;
    byte[] bFile = new byte[(int) file.length()];
    try {
        fileInputStream = new FileInputStream(file);
        fileInputStream.read(bFile);
        fileInputStream.close();
        for (int i = 0; i < bFile.length; i++) {
            System.out.print((char) bFile[i]);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return bFile;
}
These snippets compile, but they do not yield output in a human-readable fashion. Excuse me if this is a repeated or basic question.
Could someone please correct me where I am going wrong?
Your code (from the first snippet) for decoding a byte file into a UTF-8 text file looks correct to me (assuming FileInputStream fis = new FileInputStream("<Path>") yields the correct input stream).
If you're expecting a text-file format but are not sure which encoding the file is in (perhaps it's not UTF-8), you can use a library like the one below to find out:
https://code.google.com/archive/p/juniversalchardet/
or just explore some of the different Charsets in the Charset library and see what they produce in your String initialization line:
new String(byteArray, Charset.defaultCharset()) // try other Charsets here
The second method you show has pitfalls in its byte-to-char conversion, depending on the characters, as discussed here (Byte and char conversion in Java).
Chances are, if you cannot find a valid encoding for this file, it was not human-readable to begin with, or the byte-array file being passed to you lost something that makes it decodable along the way.
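To illustrate that exploration, a small sketch (the sample bytes are "Héllo" encoded in ISO-8859-1, chosen only for the demo) that decodes the same bytes with a few charsets; the wrong charset shows replacement characters instead of readable text:

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class CharsetProbe {
    public static void main(String[] args) {
        byte[] data = {72, (byte) 0xE9, 108, 108, 111}; // "Héllo" in ISO-8859-1
        Charset[] candidates = {StandardCharsets.UTF_8,
                                StandardCharsets.ISO_8859_1,
                                StandardCharsets.US_ASCII};
        for (Charset cs : candidates) {
            // Only ISO-8859-1 reproduces the original text here;
            // UTF-8 and US-ASCII mangle the 0xE9 byte.
            System.out.println(cs + " -> " + new String(data, cs));
        }
    }
}
```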
I have a string of bits of length 128 and I want to convert it to a byte array, then write it to a binary file, and later read from the binary file and convert the byte array back to a bit string. This is my code (I use an input of length 16 for simplicity):
String stest = "0001010010000010";

// convert to byte array
short a = Short.parseShort(stest, 2);
ByteBuffer bytes = ByteBuffer.allocate(2).putShort(a);
byte[] wbytes = bytes.array();
System.out.println("Byte length: " + wbytes.length);

System.out.println("Writing to binary file");
try {
    FileOutputStream fos = new FileOutputStream("test.ai");
    fos.write(wbytes);
    fos.flush();
    fos.close();
} catch (Exception e) {
    e.printStackTrace();
}

System.out.println("Reading from binary file");
File inputFile = new File("test.ai");
byte[] rdata = new byte[(int) inputFile.length()];
FileInputStream fis;
String readstr = "";
try {
    fis = new FileInputStream(inputFile);
    fis.read(rdata, 0, rdata.length);
    fis.close();
    for (int i = 0; i < rdata.length; i++) {
        Byte cb = new Byte(rdata[i]);
        readstr += Integer.toBinaryString(cb.intValue());
    }
    System.out.println("Read data from file: " + readstr);
} catch (Exception e) {
    e.printStackTrace();
}
However the string that I read from the file is not equal to the original string. This is the output:
String: 0001010010000010
Byte length: 2
Writing to binary file
Reading from binary file
Read data from file: 1010011111111111111111111111110000010
I would start by choosing the data type used for both scenarios. Let's say you choose byte. Writing it is then pretty easy:
byte data[] = ... // your data
FileOutputStream fo = new FileOutputStream("test.ai");
fo.write(data);
fo.close();
Now, let's read the stringized data back from the file. As you know, 1 byte is 8 bits, so you just need to read 8 chars from the file and convert them to a byte:
FileInputStream fi = new FileInputStream("test2.ai"); // I assume this is a different file
StringBuffer buf = new StringBuffer();
int b = fi.read();
int counter = 0;
ByteArrayOutputStream dataBuf = new ByteArrayOutputStream();
while (b != -1) {
    buf.append((char) b);
    b = fi.read();
    counter++;
    if (counter % 8 == 0) {
        int i = Integer.parseInt(buf.toString(), 2);
        dataBuf.write((byte) (i & 0xff));
        buf.setLength(0); // StringBuffer has no clear(); reset it this way
    }
}
fi.close();
byte data[] = dataBuf.toByteArray();
I think the problem in your code is the string-to-byte conversion; your starting point is already wrong. You are converting your data to a short, which holds only 2 bytes, but you say your file can hold 128 bits, which is 16 bytes. So you should not try to convert the whole file to a single data type like short, int, or long; you have to convert every 8 bits to a byte.
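A minimal sketch of that every-8-bits-to-a-byte conversion (the helper names are mine); note the masking and zero-padding on the way back, which is exactly what the original Integer.toBinaryString(cb.intValue()) call was missing:

```java
public class BitStringCodec {
    // Convert a bit string (length must be a multiple of 8) to bytes, 8 bits at a time.
    static byte[] toBytes(String bits) {
        byte[] out = new byte[bits.length() / 8];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(bits.substring(8 * i, 8 * i + 8), 2);
        }
        return out;
    }

    // Convert bytes back, masking to unsigned and zero-padding to exactly 8 chars.
    static String toBits(byte[] data) {
        StringBuilder sb = new StringBuilder();
        for (byte b : data) {
            sb.append(String.format("%8s", Integer.toBinaryString(b & 0xFF)).replace(' ', '0'));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String stest = "0001010010000010";
        // Round trip reproduces the original string exactly.
        System.out.println(toBits(toBytes(stest)).equals(stest)); // true
    }
}
```

The byte arrays produced by toBytes can be written and read with plain FileOutputStream/FileInputStream as in the answer above.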
I'm looking for a way that I can read the binary data of a file into a string.
I've found one that reads the bytes directly and converts the bytes to binary, the only problem is that it takes up a significant amount of RAM.
Here's the code I'm currently using
try {
    byte[] fileData = new byte[(int) sellect.length()];
    FileInputStream in = new FileInputStream(sellect);
    in.read(fileData);
    in.close();
    getBinary(fileData[0]);
    getBinary(fileData[1]);
    getBinary(fileData[2]);
} catch (IOException e) {
    e.printStackTrace();
}
And the getBinary() method
public String getBinary(byte bite) {
    String output = String.format("%8s", Integer.toBinaryString(bite & 0xFF)).replace(' ', '0');
    System.out.println(output); // 10000001
    return output;
}
Can you do something like this:
int buffersize = 1000;
byte[] fileData = new byte[buffersize];
int numBytesRead;
String string;
while ((numBytesRead = in.read(fileData, 0, buffersize)) != -1) {
    // Only the first numBytesRead bytes are valid on the last pass.
    string = getBinary(fileData); // Adjust this so it can work with a whole array of bytes at once
    out.write(string);
}
This way, you never hold more in RAM than one buffer and its string representation. The file is read 1000 bytes at a time, translated to a string one byte at a time, and then written to a new file as a string. read() returns how many bytes it actually read.
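For reference, here is a self-contained sketch of that buffering strategy (the class and method names are mine), using an in-memory stream so it runs as-is:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.StringWriter;
import java.io.Writer;

public class BinaryDump {
    // Stream the input 1000 bytes at a time, writing each byte's 8-bit string.
    public static void dump(InputStream in, Writer out) throws IOException {
        byte[] buffer = new byte[1000];
        int numBytesRead;
        while ((numBytesRead = in.read(buffer)) != -1) {
            for (int i = 0; i < numBytesRead; i++) {
                out.write(String.format("%8s",
                        Integer.toBinaryString(buffer[i] & 0xFF)).replace(' ', '0'));
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // In-memory stand-ins for the file streams from the answer above.
        ByteArrayInputStream in = new ByteArrayInputStream(new byte[]{(byte) 0x81, 1});
        StringWriter out = new StringWriter();
        dump(in, out);
        System.out.println(out); // 1000000100000001
    }
}
```

With real files you would pass a FileInputStream and an OutputStreamWriter instead; the memory footprint stays bounded by the 1000-byte buffer.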
This link can help you:
File to byte[] in Java
public static byte[] toByteArray(InputStream input) throws IOException
Gets the contents of an InputStream as a byte[]. This method buffers the input internally, so there is no need to use a BufferedInputStream.
Parameters: input - the InputStream to read from
Returns: the requested byte array
Throws: NullPointerException - if the input is null; IOException - if an I/O error occurs
I need to be able to read the bytes from a file in android.
Everywhere I look, it seems that FileInputStream should be used to read bytes from a file but that is not what I want to do.
I want to be able to read a text file that contains (edit) a textual representation of byte-wide numeric values (/edit) that I want to save to an array.
An example of the text file I want to have converted to a byte array follows:
0x04 0xF2 0x33 0x21 0xAA
The final file will be much longer. Using FileInputStream gives me the value of each character, whereas I want an array of length five holding the values listed above.
I want the array to be processed like:
ExampleArray[0] = (byte) 0x04;
ExampleArray[1] = (byte) 0xF2;
ExampleArray[2] = (byte) 0x33;
ExampleArray[3] = (byte) 0x21;
ExampleArray[4] = (byte) 0xAA;
Using FileInputStream on a text file returns the ASCII values of the characters and not the values I need written to the array.
The simplest solution is to use FileInputStream.read(byte[] a) method which will transfer the bytes from file into byte array.
Edit: It seems I've misread the requirements. So the file contains the text representation of bytes.
Scanner scanner = new Scanner(new FileInputStream(FILENAME));
String input;
while (scanner.hasNext()) {
    input = scanner.next();
    long number = Long.decode(input);
    // do something with the value
}
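A self-contained variant of that Scanner approach (the sample string stands in for the file contents; with a real file you would pass the FileInputStream to the Scanner as above):

```java
import java.io.ByteArrayOutputStream;
import java.util.Scanner;

public class HexTextToBytes {
    public static void main(String[] args) {
        // Stand-in for the text file from the question.
        String text = "0x04 0xF2 0x33 0x21 0xAA";
        Scanner scanner = new Scanner(text);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        while (scanner.hasNext()) {
            long value = Long.decode(scanner.next()); // parses the "0x.." prefix
            out.write((int) value);                   // write() keeps only the low 8 bits
        }
        byte[] result = out.toByteArray();
        System.out.println(result.length); // 5
        System.out.println(result[4]);     // -86 (0xAA as a signed byte)
    }
}
```

Long.decode handles the "0x" prefix directly; Byte.decode would reject values above 0x7F, which is why the wider type is used before narrowing.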
Old answer (obviously wrong for this case, but I'll leave it for posterity):
Use a FileInputStream's read(byte[]) method.
FileInputStream in = new FileInputStream(filename);
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead = in.read(buffer, 0, buffer.length);
You just don't store bytes as text. Never!
0x00 can be written as one byte in a file, or as a string, which in this case (hex) takes up four times more space.
If you're required to do this, discuss how awful this decision would be!
I will edit my answer if you can provide a sensible reason, though.
You would only save stuff as actual text if:
It is easier (not the case)
It adds value (if an increase in file size by over 4x (spaces count) adds value, then yes)
Users should be able to edit the file (then you would omit the "0x"...)
You can write bytes like this:
public static void writeBytes(byte[] in, File file, boolean append) throws IOException {
    FileOutputStream fos = null;
    try {
        fos = new FileOutputStream(file, append);
        fos.write(in);
    } finally {
        if (fos != null)
            fos.close();
    }
}
and read like this:
public static byte[] readBytes(File file) throws IOException {
    return readBytes(file, (int) file.length());
}

public static byte[] readBytes(File file, int length) throws IOException {
    byte[] content = new byte[length];
    FileInputStream fis = null;
    try {
        fis = new FileInputStream(file);
        int offset = 0;
        // read() may return fewer bytes than requested, so loop with an
        // explicit offset instead of re-reading into the start of the array.
        while (offset < length) {
            int read = fis.read(content, offset, length - offset);
            if (read == -1)
                throw new EOFException("File ended before " + length + " bytes were read");
            offset += read;
        }
    } finally {
        if (fis != null)
            fis.close();
    }
    return content;
}
and therefore have:
public static void writeString(String in, File file, String charset, boolean append)
        throws IOException {
    writeBytes(in.getBytes(charset), file, append);
}

public static String readString(File file, String charset) throws IOException {
    return new String(readBytes(file), charset);
}
to write and read strings.
Note that I don't use the try-with-resource construct because Android's current Java source level is too low for that. :(
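As a hypothetical usage sketch of the same pattern (the names and payload are mine), writing a small byte payload and reading it fully back without any Reader or Writer in between:

```java
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class RoundTripDemo {
    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("demo", ".bin");
        byte[] payload = {0x04, (byte) 0xF2, 0x33, 0x21, (byte) 0xAA};

        // Write raw bytes, same pattern as writeBytes above.
        FileOutputStream fos = new FileOutputStream(tmp);
        try {
            fos.write(payload);
        } finally {
            fos.close();
        }

        // Read them fully back, looping until the array is filled.
        byte[] back = new byte[(int) tmp.length()];
        FileInputStream fis = new FileInputStream(tmp);
        try {
            int off = 0;
            while (off < back.length) {
                int n = fis.read(back, off, back.length - off);
                if (n == -1) throw new EOFException();
                off += n;
            }
        } finally {
            fis.close();
        }
        System.out.println(java.util.Arrays.equals(payload, back)); // true
    }
}
```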