I am creating a program that encrypts data. Decryption and encryption work perfectly fine, but when decrypting an image I get an error. I am very grateful for any help!
When trying to decrypt my image, which I encrypted with the cat_encrypt method like this:
String encrypted = encrypt.cat_encrypt(getFileContent(jf.getSelectedFile()), pass);
Files.writeString(Path.of(jf.getSelectedFile().getPath()), encrypted);
I get the following error:
java.lang.IllegalArgumentException: Input byte array has incorrect ending byte at 54808
Here is my encryption method:
public static String cat_encrypt(String text, String pass) {
    try {
        MessageDigest messageDigest = MessageDigest.getInstance("MD5");
        Key key = new SecretKeySpec(messageDigest.digest(pass.getBytes(UTF_8)), "AES");
        Cipher cipher = Cipher.getInstance("AES");
        cipher.init(ENCRYPT_MODE, key);
        byte[] encrypted = cipher.doFinal(text.getBytes(UTF_8));
        byte[] encoded = Base64.getEncoder().encode(encrypted);
        return new String(encoded, UTF_8);
    } catch (NoSuchAlgorithmException | NoSuchPaddingException | InvalidKeyException | IllegalBlockSizeException |
             BadPaddingException e) {
        throw new RuntimeException("Cannot encrypt", e);
    }
}
And this is my decryption method:
public static String cat_decrypt(String text, String pass) {
    try {
        MessageDigest messageDigest = MessageDigest.getInstance("MD5");
        Key key = new SecretKeySpec(messageDigest.digest(pass.getBytes(UTF_8)), "AES");
        Cipher cipher = Cipher.getInstance("AES");
        cipher.init(DECRYPT_MODE, key);
        byte[] decoded = Base64.getDecoder().decode(text.getBytes(UTF_8));
        byte[] decrypted = cipher.doFinal(decoded);
        return new String(decrypted, UTF_8);
    } catch (NoSuchAlgorithmException | NoSuchPaddingException | InvalidKeyException | IllegalBlockSizeException |
             BadPaddingException e) {
        try {
            sendError(e.getClass().getSimpleName());
        } catch (Exception sE) {
            e.printStackTrace();
        }
        throw new RuntimeException("Cannot decrypt", e);
    }
}
I call the decrypt method like this:
String decrypted = encrypt.cat_decrypt(getFileContent(jf.getSelectedFile()), pass);
The full stack trace:
Exception in thread "AWT-EventQueue-0" java.lang.IllegalArgumentException: Input byte array has incorrect ending byte at 54808
at java.base/java.util.Base64$Decoder.decode0(Base64.java:876)
at java.base/java.util.Base64$Decoder.decode(Base64.java:566)
I have a helper class that AES-encrypts/decrypts a String, which is mostly a clone of the code provided in the Baeldung AES encryption example.
The code looks as follows:
import java.security.InvalidAlgorithmParameterException;
import java.security.InvalidKeyException;
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;
import java.security.Security;
import java.security.spec.InvalidKeySpecException;
import java.security.spec.KeySpec;
import java.util.Arrays;
import java.util.Base64;
import javax.crypto.BadPaddingException;
import javax.crypto.Cipher;
import javax.crypto.IllegalBlockSizeException;
import javax.crypto.NoSuchPaddingException;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;
public class CryptoHelper {
    private static final String CIPHER_ALGORITM_NAME = "AES/CBC/PKCS5Padding";
    private static final String HASHING_ALGO_NAME = "PBKDF2WithHmacSHA1";
    private static final int KEY_TARGET_LENGTH = 256;
    private static final int HASHING_ITERATIONS = 65536;

    public static SecretKey getSecretKey(String string, byte[] salt)
            throws NoSuchAlgorithmException, InvalidKeySpecException {
        KeySpec spec = new PBEKeySpec(string.toCharArray(), salt, HASHING_ITERATIONS, KEY_TARGET_LENGTH);
        try {
            SecretKeyFactory keyFactory = SecretKeyFactory.getInstance(HASHING_ALGO_NAME);
            SecretKey encryptedPassword = new SecretKeySpec(keyFactory.generateSecret(spec).getEncoded(), "AES");
            return encryptedPassword;
        } catch (NoSuchAlgorithmException | InvalidKeySpecException e) {
            throw e;
        }
    }

    public static IvParameterSpec getInitializationVector() {
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);
        return new IvParameterSpec(iv);
    }

    public static String encrypt(String input, String password, byte[] salt)
            throws NoSuchPaddingException, NoSuchAlgorithmException, InvalidAlgorithmParameterException,
            InvalidKeyException, BadPaddingException, IllegalBlockSizeException {
        Cipher cipher = Cipher.getInstance(CIPHER_ALGORITM_NAME);
        SecretKey secretKey = null;
        try {
            secretKey = getSecretKey(password, salt);
        } catch (NoSuchAlgorithmException | InvalidKeySpecException e) {
            // code
        }
        cipher.init(Cipher.ENCRYPT_MODE, secretKey, getInitializationVector());
        byte[] cipherText = cipher.doFinal(input.getBytes());
        return Base64.getEncoder().encodeToString(cipherText);
    }

    public static String decrypt(String encryptedText, String password, byte[] salt)
            throws NoSuchPaddingException, NoSuchAlgorithmException, InvalidAlgorithmParameterException,
            InvalidKeyException, BadPaddingException, IllegalBlockSizeException {
        Cipher cipher = Cipher.getInstance(CIPHER_ALGORITM_NAME);
        SecretKey secretKey = null;
        try {
            secretKey = getSecretKey(password, salt);
        } catch (NoSuchAlgorithmException | InvalidKeySpecException e) {
            // code
        }
        cipher.init(Cipher.DECRYPT_MODE, secretKey, getInitializationVector());
        byte[] plainText = cipher.doFinal(Base64.getDecoder().decode(encryptedText));
        return new String(plainText);
    }
}
Now I test this with a unit test, but the test fails with:
javax.crypto.BadPaddingException: Given final block not properly padded. Such issues can arise if a bad key is used during decryption.
@Test
public void testDecrypt() {
    String encryptedString = "";
    String password = "password";
    try {
        encryptedString = CryptoHelper.encrypt("some string", password, password.getBytes());
    } catch (InvalidKeyException | NoSuchPaddingException | NoSuchAlgorithmException
            | InvalidAlgorithmParameterException | BadPaddingException | IllegalBlockSizeException e) {
        e.printStackTrace();
        assertNull(e);
    }
    try {
        CryptoHelper.decrypt(encryptedString, password, password.getBytes());
    } catch (InvalidKeyException | NoSuchPaddingException | NoSuchAlgorithmException
            | InvalidAlgorithmParameterException | BadPaddingException | IllegalBlockSizeException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
What is wrong here?
Here is my attempt at solving this problem and also removing other security problems I've noticed in your code. It looks like you copied the code without fully understanding the underlying concepts behind it.
First things first, you need to differentiate between encryption and hashing.
In the cryptography domain, one use of hashing is to store user passwords in a form that isn't plaintext. This is possible because hash functions are one-way functions, meaning that you cannot derive a password from its hash. When hashing a password, a salt is usually used. A salt should be random bits of data that are unique for every hash you create. There are also security guidelines when it comes to salt sizes; I think the current recommendation, according to NIST, is that a salt should be at least 32 bits long.
hashing_function(input, salt) -> hashed_input
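As a minimal illustration of that formula in Java (a sketch only: the method name is made up, java.nio.charset.StandardCharsets is assumed to be imported, and real password storage should use a dedicated key-derivation function such as PBKDF2, discussed below):
// Sketch: hashing_function(input, salt) -> hashed_input
public static byte[] hashWithSalt(String input, byte[] salt) throws NoSuchAlgorithmException {
    MessageDigest digest = MessageDigest.getInstance("SHA-256");
    digest.update(salt);                                            // unique random salt mixed in first
    return digest.digest(input.getBytes(StandardCharsets.UTF_8));   // one-way: the input cannot be recovered
}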
When using symmetric encryption, you need to choose an algorithm, a mode and a padding scheme, alongside a unique IV value. In your code you have chosen the AES algorithm with CBC mode and PKCS5 padding. If you reuse an IV, you open yourself up to attacks that are easier to execute than if you use a unique IV. Every time you encrypt a new sequence, you should use a new IV value: for example, you will use one unique IV value when encrypting one file or message, but as soon as you want to encrypt another one you should use a new IV value. The main difference between hashing and encryption is that encryption is not a one-way function, meaning that with the proper key and IV you can decrypt your encrypted data.
k - encryption key
iv - initialization vector
p - plaintext
c - ciphertext
E - encryption function
D - decryption function
=> E(p, k, iv) = c
=> D(c, k, iv) = p
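A minimal sketch of that round trip with AES/CBC (illustration only; the all-zero key and IV are just placeholders and exception handling is omitted):
// E(p, k, iv) = c and D(c, k, iv) = p, with the same key and IV on both sides
SecretKey k = new SecretKeySpec(new byte[16], "AES");        // placeholder key
IvParameterSpec iv = new IvParameterSpec(new byte[16]);      // placeholder IV
Cipher enc = Cipher.getInstance("AES/CBC/PKCS5Padding");
enc.init(Cipher.ENCRYPT_MODE, k, iv);
byte[] c = enc.doFinal("p".getBytes());                      // ciphertext
Cipher dec = Cipher.getInstance("AES/CBC/PKCS5Padding");
dec.init(Cipher.DECRYPT_MODE, k, iv);
byte[] p = dec.doFinal(c);                                   // recovers the plaintext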
The reason why you are hashing your password is that you want to transform your text password into an encryption key. This is known as key stretching. You are using the PBKDF2 algorithm with HMAC-SHA1 and 65,536 iterations. I think the current NIST recommendation is to use over 70,000 iterations with HMAC-SHA256. Keep in mind this is only a recommendation; the combination of algorithm, iteration count and other parameters depends on your desired level of security.
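If you wanted to follow that recommendation, a variant of getSecretKey() could look like this (a sketch only; the method name and iteration count are my assumptions, and it reuses the imports already present in your class):
// Hypothetical variant of getSecretKey() using HMAC-SHA256 and a higher iteration count.
public static SecretKey getSecretKeySha256(String password, byte[] salt)
        throws NoSuchAlgorithmException, InvalidKeySpecException {
    KeySpec spec = new PBEKeySpec(password.toCharArray(), salt, 100_000, 256);
    SecretKeyFactory keyFactory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
    return new SecretKeySpec(keyFactory.generateSecret(spec).getEncoded(), "AES");
}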
In this example, the reason why your unit test fails is that you are using different IV values for encryption and decryption. Once you fix this, your test will pass. However, the salt value, which you use to hash the password that then serves as the key for your symmetric encryption, isn't correct either. The salt should be unique, and in the code snippet you provided you are hashing your password using the password itself as the salt. Essentially, you have this: hashing_function(input, input) -> hashed_input, where input in this case is your password.
I would solve this problem by adding another method called getSalt() and updating the encryption and decryption methods to take an IV as a parameter.
public static byte[] getSalt() {
    byte[] salt = new byte[32];
    new SecureRandom().nextBytes(salt);
    return salt;
}

public static String encrypt(String input, String password, byte[] salt, IvParameterSpec iv)
        throws NoSuchPaddingException, NoSuchAlgorithmException, InvalidAlgorithmParameterException,
        InvalidKeyException, BadPaddingException, IllegalBlockSizeException {
    Cipher cipher = Cipher.getInstance(CIPHER_ALGORITM_NAME);
    SecretKey secretKey = null;
    try {
        secretKey = getSecretKey(password, salt);
    } catch (NoSuchAlgorithmException | InvalidKeySpecException e) {
        // code
    }
    cipher.init(Cipher.ENCRYPT_MODE, secretKey, iv);
    byte[] cipherText = cipher.doFinal(input.getBytes());
    return Base64.getEncoder().encodeToString(cipherText);
}

public static String decrypt(String encryptedText, String password, byte[] salt, IvParameterSpec iv)
        throws NoSuchPaddingException, NoSuchAlgorithmException, InvalidAlgorithmParameterException,
        InvalidKeyException, BadPaddingException, IllegalBlockSizeException {
    Cipher cipher = Cipher.getInstance(CIPHER_ALGORITM_NAME);
    SecretKey secretKey = null;
    try {
        secretKey = getSecretKey(password, salt);
    } catch (NoSuchAlgorithmException | InvalidKeySpecException e) {
        // code
    }
    cipher.init(Cipher.DECRYPT_MODE, secretKey, iv);
    byte[] plainText = cipher.doFinal(Base64.getDecoder().decode(encryptedText));
    return new String(plainText);
}
Then I would use/test your code like this:
@Test
public void testDecrypt() throws Exception {
    String encryptedString;
    String decryptedString;
    String plaintext = "some string";
    String password = "password";
    byte[] salt = CryptoHelper.getSalt();
    IvParameterSpec iv = CryptoHelper.getInitializationVector();
    encryptedString = CryptoHelper.encrypt(plaintext, password, salt, iv);
    decryptedString = CryptoHelper.decrypt(encryptedString, password, salt, iv);
    assertEquals(plaintext, decryptedString);
}
Finally, I would remove the try-catch from your test code as it doesn't, in its current form, provide any value; the test method can simply declare throws Exception, as above. If an exception is thrown from the code being tested, the test will fail, and that is normally what you want to happen. The one exception is when your test deliberately wants to provoke an exception.
Solved the problem above with the help of Topaco's comment. Instead of picking a new random number each time I run through the method:
public static IvParameterSpec getInitializationVector() {
    byte[] iv = new byte[16];
    new SecureRandom().nextBytes(iv);
    return new IvParameterSpec(iv);
}
I am using a hard coded byte array now:
public static IvParameterSpec getInitializationVector() {
    byte[] iv = {1, 2, 3, ... 16};
    return new IvParameterSpec(iv);
}
Please note that this solution is not exactly safe, as the IV is hard coded and can be extracted from the source code. However, I chose not to pass it to the outside as my application is not very critical. In a proper solution, the random IV would need to be returned together with the IvParameterSpec so it can be reused for decryption.
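One common way to do that (a sketch only, not my original code; it reuses getSecretKey() and getInitializationVector() from above, and the method names are made up) is to generate a fresh IV per encryption and prepend it to the ciphertext, so decryption can read it back and nothing has to be hard coded:
public static String encryptWithIv(String input, String password, byte[] salt) throws Exception {
    IvParameterSpec iv = getInitializationVector();              // fresh random 16-byte IV
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.ENCRYPT_MODE, getSecretKey(password, salt), iv);
    byte[] cipherText = cipher.doFinal(input.getBytes());
    byte[] combined = new byte[16 + cipherText.length];
    System.arraycopy(iv.getIV(), 0, combined, 0, 16);            // IV goes first
    System.arraycopy(cipherText, 0, combined, 16, cipherText.length);
    return Base64.getEncoder().encodeToString(combined);
}

public static String decryptWithIv(String encrypted, String password, byte[] salt) throws Exception {
    byte[] combined = Base64.getDecoder().decode(encrypted);
    IvParameterSpec iv = new IvParameterSpec(combined, 0, 16);   // read the IV back
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.DECRYPT_MODE, getSecretKey(password, salt), iv);
    byte[] plainText = cipher.doFinal(combined, 16, combined.length - 16);
    return new String(plainText);
}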
I have the following AES encryption code in Java which I want to rewrite in C#, but it is not giving the same output.
Java Code
public String doEncryptString(String salt, String password, String token) throws CryptoException {
    try {
        Cipher cipher = Cipher.getInstance("AES");
        SecretKeySpec secretKeySpec = generateKeySpec(salt, password);
        cipher.init(Cipher.ENCRYPT_MODE, secretKeySpec);
        byte[] inputBytes = token.getBytes();
        byte[] outputBytes = cipher.doFinal(inputBytes);
        return Base64Utils.encodeToString(outputBytes);
    } catch (NoSuchPaddingException | NoSuchAlgorithmException | InvalidKeyException | BadPaddingException
            | IllegalBlockSizeException ex) {
        throw new CryptoException("Error encrypting password", ex);
    }
}

private SecretKeySpec generateKeySpec(String salt, String password) throws CryptoException {
    try {
        String generatedkey = salt + password;
        byte[] key = generatedkey.getBytes("UTF-8");
        MessageDigest sha = MessageDigest.getInstance("SHA-1");
        key = sha.digest(key);
        key = Arrays.copyOf(key, 16); // use only first 128 bit
        SecretKeySpec secretKeySpec = new SecretKeySpec(key, "AES");
        return secretKeySpec;
    } catch (NoSuchAlgorithmException | IOException ex) {
        throw new CryptoException("Error encrypting password", ex);
    }
}
This is what I've tried in C#
public static string DoEncrypt(string salt, string password, string token)
{
    var tdes = new AesManaged();
    tdes.Key = GenerateKey(salt, password);
    tdes.Mode = CipherMode.ECB;
    tdes.Padding = PaddingMode.PKCS7;
    ICryptoTransform crypt = tdes.CreateEncryptor();
    byte[] plain = Encoding.UTF8.GetBytes(token);
    byte[] cipher = crypt.TransformFinalBlock(plain, 0, plain.Length);
    return Convert.ToBase64String(cipher);
}

private static byte[] GenerateKey(string salt, string password)
{
    string generatedkey = $"{salt}{password}";
    var key = Encoding.UTF8.GetBytes(generatedkey);
    var sha1 = SHA1Managed.Create();
    key = sha1.ComputeHash(key);
    return key.Take(16).ToArray(); // use only first 128 bit
}
string/token to encrypt : ZHKRIWB310XVVWG315PI7UZZWU1V0YYL5WE9JL
Java output: eUjNH8kcgWtlEmuCFHMPwnCFWjy5Pye/gF+itrPs1g8AjtAEZQqlzW/v7kEt2haG
My C# code output: O8sKdJWH+XCOIbexZPEwN5NxWqpWRHC5b3ZsihT8cfBqpI1eVr3PEr9Eq39a5pMn
I don't know what I am doing wrong here. Any help would be appreciated. Thanks
Update
My apologies everyone. The code translated to C# is working fine. By mistake, I was passing a different salt value. Thanks everyone.
What's in TRANSFORMATION from the Java code?
You also need to use the same mode and padding to get the same results, meaning ECB and PKCS7 in your case.
Java seems to offer only PKCS5 padding, but it appears to be compatible with PKCS7. I'm not a Java dev and can't provide details, but there is a discussion here: https://crypto.stackexchange.com/questions/9043/what-is-the-difference-between-pkcs5-padding-and-pkcs7-padding where they say:
Some cryptographic libraries such as the SUN provider in Java indicate
PKCS#5 where PKCS#7 should be used - "PKCS5Padding" should have been
"PKCS7Padding". This is - with high probability - a legacy from the
time that only 8 byte block ciphers such as (triple) DES symmetric
cipher were available.
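So to get identical results, the Java side can spell the mode and padding out explicitly instead of relying on provider defaults (a sketch reusing secretKeySpec and token from the question; with the default SunJCE provider, plain "AES" already resolves to AES/ECB/PKCS5Padding):
// Explicit transformation matching the C# settings (ECB + PKCS7-style padding).
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
cipher.init(Cipher.ENCRYPT_MODE, secretKeySpec);
byte[] outputBytes = cipher.doFinal(token.getBytes());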
And by the way: for production never use ECB mode as it's unsafe.
I've followed a few examples and, when trying to implement AES/GCM/NoPadding as referenced here: https://www.strongauth.com/samplecode/GCM.java, I am unable to encrypt any text that contains special characters (e.g. ø).
Ultimately it fails inside doFinal with:
javax.crypto.ShortBufferException: Output buffer must be (at least) 30 bytes long
but it seems like I must be doing something wrong. What am I missing?
Simple POC:
public class Example {
    private static final String CIPHER_TRANSFORM = "AES/GCM/NoPadding";

    public static void main(String[] args) {
        String key = generateKey("AES", 256, "seed");
        encryptText("text containing a ø character", key, "TOKENTOKENTOKENTOKEN", "AES");
    }

    private static String generateKey(String alg, int size, String seed) {
        try {
            SecureRandom securerandom = SecureRandom.getInstance("SHA1PRNG");
            securerandom.setSeed(seed.getBytes("UTF-8"));
            KeyGenerator kg = KeyGenerator.getInstance(alg);
            kg.init(size, securerandom);
            SecretKey sk = kg.generateKey();
            return new String(Base64.getEncoder().encode(sk.getEncoded()), "UTF-8");
        }
        catch (UnsupportedEncodingException | NoSuchAlgorithmException ex) {
            System.err.println(ex);
        }
        return null;
    }

    private static String encryptText(String PLAINTEXT, String PLAINTEXTKEY, String TOKEN, String alg) {
        try {
            // Create SecretKey & Cipher
            SecretKeySpec sks = new SecretKeySpec(Base64.getDecoder().decode(PLAINTEXTKEY), alg);
            Cipher cipher = Cipher.getInstance(CIPHER_TRANSFORM);
            // Setup byte arrays
            byte[] input = PLAINTEXT.getBytes("UTF-8");
            byte[] tkb = TOKEN.getBytes("UTF-8");
            byte[] iv = new byte[12];
            System.arraycopy(tkb, 4, iv, 0, 12);
            cipher.init(Cipher.ENCRYPT_MODE, sks, new GCMParameterSpec(128, iv));
            cipher.updateAAD(tkb);
            byte[] opbytes = new byte[cipher.getOutputSize(PLAINTEXT.length())];
            // Perform crypto
            int ctlen = cipher.update(input, 0, input.length, opbytes);
            ctlen += cipher.doFinal(opbytes, ctlen);
            byte[] output = new byte[ctlen];
            System.arraycopy(opbytes, 0, output, 0, ctlen);
            return new String(Base64.getEncoder().encode(output), "UTF-8");
        }
        catch (InvalidAlgorithmParameterException | UnsupportedEncodingException |
               IllegalBlockSizeException | BadPaddingException | InvalidKeyException |
               NoSuchAlgorithmException | NoSuchPaddingException | ShortBufferException ex) {
            System.err.println(ex);
        }
        return null;
    }
}
Your issue is this line:
byte[] opbytes = new byte[cipher.getOutputSize(PLAINTEXT.length())];
The length of a string in characters is not always the same as the length of its UTF-8 byte encoding (ø, for example, encodes to two bytes). You should be using the length of input here, not PLAINTEXT.
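In other words, size the output buffer from the encoded bytes rather than from the character count, along these lines:
byte[] input = PLAINTEXT.getBytes("UTF-8");
byte[] opbytes = new byte[cipher.getOutputSize(input.length)]; // byte length, not character count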
I have to interface with a system written in Java that encrypts data using the following Java method:
public final void rsaEncrypt(String data, String filePath) {
    try {
        Base64.Encoder encoder = Base64.getEncoder();
        PublicKey pubKey = readKeyFromFile("/" + Constants.PUBLIC_KEY_FILE_NAME, filePath);
        Cipher cipher = Cipher.getInstance(Constants.RSA_INSTANCE);
        cipher.init(Cipher.ENCRYPT_MODE, pubKey);
        byte[] cipherData = cipher.doFinal(data.getBytes("UTF-8"));
        writeToFile(Constants.ENCRYPTED_STRING_FILE_NAME, filePath, encoder.encodeToString(cipherData));
    } catch (BadPaddingException | InvalidKeyException | NoSuchPaddingException | NoSuchAlgorithmException
            | IllegalBlockSizeException | UnsupportedEncodingException e) {
        if (LOG.isErrorEnabled())
            LOG.error("Error encrypting String.", e);
        throw new EncryptionException("Error encrypting data", e);
    }
}
My code is written in C++ using OpenSSL:
std::string prv =
    "-----BEGIN RSA PRIVATE KEY-----\n"
    // cut key data
    "-----END RSA PRIVATE KEY-----\n";

BIO *bio = BIO_new_mem_buf((void*)prv.c_str(), -1);
RSA* rsaPrivKey = PEM_read_bio_RSAPrivateKey(bio, NULL, NULL, NULL);
if (!rsaPrivKey)
    printf("ERROR: Could not load PRIVATE KEY! PEM_read_bio_RSAPrivateKey FAILED: %s\n", ERR_error_string(ERR_get_error(), NULL));
BIO_free(bio);

// where enc[] holds the Base64 decoded data, but len becomes -1 and no data is decoded.
int len = RSA_private_decrypt(64, (unsigned char *)&enc, (unsigned char *)&dec, private_key_, RSA_PKCS1_PADDING);
I can decode data I encrypt myself with the public key, but I don't seem to be able to match the Java options for this.
Any suggestions?
There is no answer here; as my comment said, it turns out the Java code was creating its keys incorrectly, and they did not match the .pem files I was given.