I've got a function that encrypts a file. The encryption part seems to be working, but I can't get it to overwrite the original file.
FileInputStream inputStream = new FileInputStream(input); // Opens the file to encrypt
cipher.init(Cipher.ENCRYPT_MODE, secret, ivSpec); // Sets up the encryption
// Creates the output stream; the encryption is performed here
CipherOutputStream cos = new CipherOutputStream(new FileOutputStream(input + ".secure"), cipher);
byte[] block = new byte[8];
int i;
while ((i = inputStream.read(block)) != -1) // Reads the file
{
    cos.write(block, 0, i); // Writes the encrypted output
}
cos.close();
This works fine: I end up with an encrypted file named original_file_name.txt.secure, but I want it to overwrite the original file. If I remove the .secure part, the file isn't written properly.
How can I overwrite the original file with the encrypted data?
If you remove the .secure part, you'll be trying to read from the file at the same time that you're writing to it. This is not a very good idea...
The best approach would be to do as you've done, and then, if all has gone well, delete the original file and rename the new one to take its place, using Files.move().
In fact, if you pass the right options to Files.move(), you can get it to overwrite the existing file, meaning that you won't need to delete the original first.
This solves the simultaneous read/write problem you're having, but it's also a whole lot safer for an application like this. If your application crashes or there's a power cut in the middle of encrypting, and you're encrypting in place, then you're completely screwed. If you do it this way, then power failure in the middle still leaves you with your old file intact. You'll always have the complete old file around until the complete new file is ready.
By the way, you should make use of a BufferedInputStream too, rather than just using a raw FileInputStream. And I can't see an inputStream.close() anywhere.
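For instance, a minimal sketch of that rename step (the file names and contents here are purely illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class MoveExample {
    public static void main(String[] args) throws IOException {
        Path original = Paths.get("data.txt");           // illustrative names
        Path encrypted = Paths.get("data.txt.secure");

        // Stand-ins for the original file and the freshly encrypted output.
        Files.write(original, "old contents".getBytes());
        Files.write(encrypted, "encrypted contents".getBytes());

        // REPLACE_EXISTING overwrites the original in one call, so no
        // separate delete is needed. The old file survives until this point.
        Files.move(encrypted, original, StandardCopyOption.REPLACE_EXISTING);

        System.out.println(new String(Files.readAllBytes(original)));
    }
}
```

On file systems that support it, adding StandardCopyOption.ATOMIC_MOVE makes the swap all-or-nothing, which strengthens the crash-safety argument above.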
Related
I'm working on a project to encrypt/decrypt files. As this is my first time, I'm wondering whether I'm doing it right. So far, my idea of encryption is this:
Select a file -> read all its bytes into a byte array -> encrypt the byte array -> write the encrypted bytes back to the same file.
Note that in this project the output file is the same file as the input, so I decided to clear the file before writing the encrypted bytes to it.
This might be stupid (and that's why I'm asking for help), so here is my approach:
public class Encryptor {
    File file;
    SecretKeySpec secretKeySpec;

    public void setFile(String filePath) throws Exception {
        this.file = new File(filePath);
        if (!file.isFile()) {
            throw new Exception("The file you chose is not valid");
        }
    }

    public void setKey(String keyword) {
        try {
            MessageDigest sha = MessageDigest.getInstance("SHA-256");
            sha.update(keyword.getBytes("UTF-8"));
            byte[] key = sha.digest();
            secretKeySpec = new SecretKeySpec(key, "AES");
        } catch (UnsupportedEncodingException | NoSuchAlgorithmException e) {
            e.printStackTrace();
        }
    }

    public void encrypt() {
        byte[] bFile = new byte[(int) file.length()];
        try {
            //adding protocol bytes to the file bytes
            //String protocol = "encryptor protocol";
            //byte[] decProtocol = protocol.getBytes();

            //convert file into array of bytes
            BufferedInputStream bufferedInputStream = new BufferedInputStream(new FileInputStream(file));
            bufferedInputStream.read(bFile);
            bufferedInputStream.close();

            ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
            //outputStream.write(decProtocol);
            outputStream.write(bFile);
            byte[] cryptedFileBytes = outputStream.toByteArray();

            //Cipher and encrypting
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
            cipher.init(Cipher.ENCRYPT_MODE, secretKeySpec);
            byte[] encryptedBytes = cipher.doFinal(cryptedFileBytes);

            //Write encrypted file
            BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(new FileOutputStream(file, false));
            bufferedOutputStream.write(encryptedBytes);
            bufferedOutputStream.flush();
            bufferedOutputStream.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Main question
Are there other ways to read, encrypt, and write to the same file at the same time? For example, reading bytes part by part, encrypting each part, and overwriting it with the encrypted bytes as I go.
Can you help me further?
Also, any information about how to make my encrypted files more secure would be helpful.
And does my program eat up RAM?
(NOTE) I'm writing the encrypted data to the same file for a reason. I'm not very familiar with how hard drives work, but one of my goals is to prevent the original file from being recovered later. Is there anything I should know about that? Does what I'm doing prevent the unencrypted file from being recovered later?
EDIT
#erickson has pointed out something important in his answer. I now understand that this way of encrypting a file is not safe. What I was also trying to achieve was preventing the file from being recovered later. There is no point in encrypting a file and keeping it on your hard drive if you once had it there unencrypted! In my experience, every time I recovered a file I got its latest contents, and I could never get the history of changes. I assumed the same would apply here, if I wasn't wrong in the first place. How can I help prevent data recovery, then?
Writing to a file while reading can work, but it would be easy to introduce a bug that would corrupt the file. For safety's sake, it might be better to write to a temporary file, then delete the original file and replace it with the temporary file. That way, all of the file content is always safely in at least one file.
One caveat about this is that if you encrypt an existing file, there's no guarantee that the original file isn't still recorded on disk. Even if you write to the same file as you read, whether the same storage is overwritten with encrypted data will depend on the underlying file system.
It would be better if the original file was written in its encrypted form. Even if the writing application doesn't support encryption, most operating systems support the creation of an encrypted file system so that any application can keep files secret.
You need to close your reader after you have finished reading the file. You are currently doing it in this line:
bufferedInputStream.close();
So it's ok.
Then, instead of clearing the file, you can simply overwrite it using:
BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(new FileOutputStream(filename, false));
Hope that helps :)
I have a file whose beginning contains unencrypted header data, while the rest of it contains encrypted data.
I want to be able to read the header data using a BufferedInputStream/FileInputStream.
Next I want to start reading the rest of the encrypted data using a CipherInputStream that uses the same BufferedInputStream as input.
Is this allowable? Is it OK to change the way you are using a stream?
Yes, you can do this.
The only tricky part depends on the structure of the header. If you have to read past the end of the header in order to determine where it ends, you'll need to be able to back up a bit in the stream so that the CipherInputStream can read all of the encrypted data. On the other hand, if your header has a fixed length, or its length is encoded in it, or it has some sort of end marker, it should be quite simple:
try (InputStream is = new BufferedInputStream(Files.newInputStream(...))) {
    Header header = Header.read(is);
    CipherInputStream cis = new CipherInputStream(is, header.getCipherInstance());
    cis.read(...);
}
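If you do have to read past the header's end, BufferedInputStream's mark()/reset() lets you back up to a remembered position. A minimal, self-contained sketch (the "HDR:" header and in-memory data are made up for illustration):

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class MarkResetExample {
    public static void main(String[] args) throws IOException {
        byte[] data = "HDR:payload".getBytes();  // illustrative: 4-byte header, then body
        InputStream is = new BufferedInputStream(new ByteArrayInputStream(data));

        is.mark(16);                  // remember this position; 16-byte read-ahead limit
        byte[] probe = new byte[4];
        is.read(probe);               // peek at the header bytes
        is.reset();                   // back up so a later reader sees the full stream

        byte[] all = new byte[data.length];
        is.read(all);
        System.out.println(new String(all));  // the complete stream, header included
    }
}
```

In the real code, you would reset() and then skip() exactly the header length before handing the stream to the CipherInputStream.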
I have an encryption algorithm (AES) that accepts a file converted to a byte array and encrypts it.
Since I am going to process very large files, the JVM may run out of memory.
I am planning to read the files into multiple byte arrays, each containing some part of the file. Then I feed the algorithm iteratively. Finally, I merge the results to produce an encrypted file.
So my question is: is there any way to read a file part by part into multiple byte arrays?
I thought I could use the following to read the file to a byte array:
IOUtils.toByteArray(InputStream input).
And then split the array into multiple bytes using:
Arrays.copyOfRange()
But I am afraid that reading the whole file into a byte array like that will make the JVM run out of memory.
Look up cipher streams in Java. You can use them to encrypt/decrypt streams on the fly so you don't have to store the whole thing in memory. All you have to do is copy the regular FileInputStream for your source file to the CipherOutputStream that's wrapping your FileOutputStream for the encrypted sink file. IOUtils even conveniently contains a copy(InputStream, OutputStream) method to do this copy for you.
For example:
public static void main(String[] args) throws Exception {
    encryptFile("exampleInput.txt", "exampleOutput.txt");
}

public static void encryptFile(String source, String sink) throws Exception {
    FileInputStream fis = null;
    try {
        fis = new FileInputStream(source);
        CipherOutputStream cos = null;
        try {
            cos = new CipherOutputStream(new FileOutputStream(sink), getEncryptionCipher());
            IOUtils.copy(fis, cos);
        } finally {
            if (cos != null)
                cos.close();
        }
    } finally {
        if (fis != null)
            fis.close();
    }
}

private static Cipher getEncryptionCipher() throws Exception {
    // Create AES cipher with whatever padding and other properties you want
    Cipher cipher = ... ;
    // Create AES secret key
    Key key = ... ;
    cipher.init(Cipher.ENCRYPT_MODE, key);
    return cipher;
}
If you need to know the number of bytes that were copied, or if the file sizes might exceed Integer.MAX_VALUE bytes (2 GB), use IOUtils.copyLarge instead of IOUtils.copy.
To decrypt the file, do the same thing, but use CipherInputStream instead of CipherOutputStream and initialize your Cipher using Cipher.DECRYPT_MODE.
Take a look here for more info on cipher streams in Java.
This will save you space because you won't need to store byte arrays of your own anymore. The only stored byte[] in this system is the internal byte[] of the Cipher, which will get cleared each time enough input is entered and an encrypted block is returned by Cipher.update, or on Cipher.doFinal when the CipherOutputStream is closed. However, you don't have to worry about any of this since it's all internal and everything is managed for you.
Edit: note that this can result in certain encryption exceptions being ignored, particularly BadPaddingException and IllegalBlockSizeException. This behavior can be found in the CipherOutputStream source code. (Granted, this source is from the OpenJDK, but it probably does the same thing in the Sun JDK.) Also, from the CipherOutputStream javadocs:
This class adheres strictly to the semantics, especially the failure semantics, of its ancestor classes java.io.OutputStream and java.io.FilterOutputStream. This class has exactly those methods specified in its ancestor classes, and overrides them all. Moreover, this class catches all exceptions that are not thrown by its ancestor classes.
The bolded line here implies that the cryptographic exceptions are ignored, which they are. This may cause some unexpected behavior while trying to read an encrypted file, especially for block and/or padding encryption algorithms like AES. Make a mental note that you may get zero or partial output for the encrypted (or, with CipherInputStream, decrypted) file.
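To make this concrete, here is a self-contained round trip using in-memory streams and a throwaway key; the fixed all-zero IV is for demonstration only and must never be reused in real code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.CipherOutputStream;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

public class CipherStreamRoundTrip {
    public static void main(String[] args) throws Exception {
        byte[] plaintext = "stream me through a cipher".getBytes("UTF-8");

        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        IvParameterSpec iv = new IvParameterSpec(new byte[16]); // fixed IV: demo only!

        // Encrypt: copy plaintext into a CipherOutputStream.
        Cipher enc = Cipher.getInstance("AES/CBC/PKCS5Padding");
        enc.init(Cipher.ENCRYPT_MODE, key, iv);
        ByteArrayOutputStream encrypted = new ByteArrayOutputStream();
        CipherOutputStream cos = new CipherOutputStream(encrypted, enc);
        cos.write(plaintext);
        cos.close(); // close() triggers doFinal, writing the last padded block

        // Decrypt: wrap the ciphertext in a CipherInputStream and drain it.
        Cipher dec = Cipher.getInstance("AES/CBC/PKCS5Padding");
        dec.init(Cipher.DECRYPT_MODE, key, iv);
        InputStream cis = new CipherInputStream(
                new ByteArrayInputStream(encrypted.toByteArray()), dec);
        ByteArrayOutputStream decrypted = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int n;
        while ((n = cis.read(buf)) != -1) {
            decrypted.write(buf, 0, n);
        }
        cis.close();

        System.out.println(new String(decrypted.toByteArray(), "UTF-8"));
    }
}
```

Only the Cipher's internal block buffer is ever held in memory, regardless of how large the streams are.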
If you're using IOUtils, perhaps you should consider IOUtils.copyLarge()
public static long copyLarge(InputStream input,
OutputStream output,
long inputOffset,
long length)
and specify a ByteArrayOutputStream as the output. You can then iterate through and load sections of your file using offset/length.
From the doc:
Copy some or all bytes from a large (over 2GB) InputStream to an
OutputStream, optionally skipping input bytes.
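If you'd rather not depend on IOUtils, a plain-JDK sketch of the same chunk-at-a-time idea (readNBytes(int) requires Java 11+; the chunk size and data are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

public class ChunkedRead {
    // Reads a stream in fixed-size chunks; only one chunk needs to live in
    // memory at a time (they are collected here just to show the result).
    static List<byte[]> readInChunks(InputStream in, int chunkSize) throws IOException {
        List<byte[]> chunks = new ArrayList<>();
        byte[] chunk;
        while ((chunk = in.readNBytes(chunkSize)).length > 0) {
            chunks.add(chunk); // in real code: feed this chunk to Cipher.update()
        }
        return chunks;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "0123456789".getBytes();
        List<byte[]> chunks = readInChunks(new ByteArrayInputStream(data), 4);
        System.out.println(chunks.size());              // 4 + 4 + 2 bytes
        System.out.println(new String(chunks.get(2)));  // the final partial chunk
    }
}
```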
This is a newbie question, I know. Can you guys help?
I'm talking about big files, of course, above 100 MB. I'm imagining some kind of loop, but I don't know what to use. A chunked stream?
One thing is for certain: I don't want something like this (pseudocode):
File file = new File(existing_file_path);
byte[] theWholeFile = new byte[file.length()]; //this allocates the whole thing into memory
File out = new File(new_file_path);
out.write(theWholeFile);
To be more specific, I have to rewrite an applet that downloads a base64-encoded file and decodes it to the "normal" file. Because it's done with byte arrays, it holds twice the file size in memory: one copy base64-encoded and the other decoded. My question is not about base64; it's about saving memory.
Can you point me in the right direction?
Thanks!
From the question, it appears that you are reading the base64 encoded contents of a file into an array, decoding it into another array before finally saving it.
This carries a fair amount of memory overhead, especially given that Base64 encoding is in use. It can be made more efficient by:
Reading the contents of the file using a FileInputStream, preferably decorated with a BufferedInputStream.
Decoding on the fly. Base64 encoded characters can be read in groups of 4 characters, to be decoded on the fly.
Writing the output to the file, using a FileOutputStream, again preferably decorated with a BufferedOutputStream. This write operation can also be done after every single decode operation.
The buffering of read and write operations is done to prevent frequent IO access. You could use a buffer size that is appropriate to your application's load; usually the buffer size is chosen to be some power of two, because such a number does not have an "impedance mismatch" with the physical disk buffer.
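On JDK 8 and later, java.util.Base64 supports exactly this kind of on-the-fly decoding via Decoder.wrap(). A minimal sketch, with in-memory streams standing in for the files:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Base64;

public class StreamingBase64Decode {
    public static void main(String[] args) throws IOException {
        // Stand-in for the base64-encoded file on disk.
        byte[] encoded = Base64.getEncoder().encode("some large payload".getBytes());
        InputStream raw = new ByteArrayInputStream(encoded);

        // wrap() decodes lazily as bytes are read; no second full-size array.
        try (InputStream decoding = Base64.getDecoder().wrap(raw)) {
            ByteArrayOutputStream out = new ByteArrayOutputStream(); // would be a FileOutputStream
            byte[] buffer = new byte[8192];
            int n;
            while ((n = decoding.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
            System.out.println(out.toString("UTF-8"));
        }
    }
}
```

At no point is more than one buffer's worth of decoded data held alongside the encoded input.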
Perhaps a FileInputStream on the file, reading off fixed length chunks, doing your transformation and writing them to a FileOutputStream?
Perhaps a BufferedReader? Javadoc: http://download-llnw.oracle.com/javase/1.4.2/docs/api/java/io/BufferedReader.html
Use this base64 encoder/decoder, which will wrap your file input stream and handle the decoding on the fly:
InputStream input = new Base64.InputStream(new FileInputStream("in.txt"));
OutputStream output = new FileOutputStream("out.txt");
try {
    byte[] buffer = new byte[1024];
    int bytesRead;
    while ((bytesRead = input.read(buffer)) != -1) {
        output.write(buffer, 0, bytesRead);
    }
} finally {
    input.close();
    output.close();
}
You can use org.apache.commons.io.FileUtils. This utility class provides other options too, besides what you are looking for. For example:
FileUtils.copyFile(final File srcFile, final File destFile)
FileUtils.copyFile(final File input, final OutputStream output)
FileUtils.copyFileToDirectory(final File srcFile, final File destDir)
And so on. You can also follow this tutorial.
This problem seems to happen inconsistently. We are using a java applet to download a file from our site, which we store temporarily on the client's machine.
Here is the code that we are using to save the file:
URL targetUrl = new URL(urlForFile);
InputStream content = (InputStream) targetUrl.getContent();
BufferedInputStream buffered = new BufferedInputStream(content);
File savedFile = File.createTempFile("temp", ".dat");
FileOutputStream fos = new FileOutputStream(savedFile);
int letter;
while ((letter = buffered.read()) != -1)
    fos.write(letter);
fos.close();
Later, I try to access that file by using:
ObjectInputStream keyInStream = new ObjectInputStream(new FileInputStream(savedFile));
Most of the time it works without a problem, but every once in a while we get the error:
java.io.StreamCorruptedException: invalid stream header: 0D0A0D0A
which makes me believe that it isn't saving the file correctly.
I'm guessing that the operations you've done with getContent and BufferedInputStream have treated the file like an ASCII file, converting newlines or carriage returns into carriage return + newline (0x0D0A), which has confused ObjectInputStream (which expects serialized data objects).
If you are using an FTP URL, the transfer may be occurring in ASCII mode.
Try appending ";type=I" to the end of your URL.
Why are you using ObjectInputStream to read it?
As per the javadoc:
An ObjectInputStream deserializes primitive data and objects previously written using an ObjectOutputStream.
Probably the error comes from the fact you didn't write it with ObjectOutputStream.
Try reading it with FileInputStream only.
Here's a sample for binary ( although not the most efficient way )
Here's another used for text files.
There are 3 big problems in your sample code:
You're not treating the input as just raw bytes
You're needlessly pulling the entire object into memory at once
You're doing multiple method calls for every single byte read and written -- use the array based read/write!
Here's a redo:
URL targetUrl = new URL(urlForFile);
InputStream is = targetUrl.openStream();
File savedFile = File.createTempFile("temp", ".dat");
FileOutputStream fos = new FileOutputStream(savedFile);
int count;
byte[] buff = new byte[16 * 1024];
while ((count = is.read(buff)) != -1) {
    fos.write(buff, 0, count);
}
fos.close();
is.close();
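As a side note, on Java 7+ the manual copy loop can be replaced entirely by Files.copy; a sketch, with an in-memory stream standing in for the URL stream:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class DownloadToTemp {
    public static void main(String[] args) throws IOException {
        // Stand-in for targetUrl.openStream(); contents are illustrative.
        InputStream is = new ByteArrayInputStream("binary payload".getBytes());
        Path savedFile = Files.createTempFile("temp", ".dat");

        // Files.copy streams bytes straight to disk; no manual buffer loop,
        // and the bytes are never interpreted as text.
        Files.copy(is, savedFile, StandardCopyOption.REPLACE_EXISTING);
        is.close();

        System.out.println(new String(Files.readAllBytes(savedFile)));
    }
}
```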
You could also step back from the code and check whether the file on your client is the same as the file on the server. If you have both files on an XP machine, you should be able to use the FC utility to do a compare (check FC's help if you need to run it as a binary compare; there is a switch for that). If you're on Unix, I don't know the file compare program, but I'm sure there's something.
If the files are identical, then you're looking at a problem with the code that reads the file.
If the files are not identical, focus on the code that writes your file.
Good luck!