I'm not very familiar with Java KeyStore. What I want is an encrypted structure in which to store my keys.
I have multiple clusters, each with an associated key, and I want to store those keys securely so that they are all encrypted under a single main key (for instance, 'loginid').
I searched a lot for a solution, and somewhere on Stack Overflow itself someone suggested using a Java KeyStore to store a SecretKey (symmetric encryption). I read its documentation and found it fits my requirements, but I couldn't properly understand how to implement it.
Here is a code snippet I'm working on -
public class Prac {
    public static void main(String[] args) throws KeyStoreException, FileNotFoundException, IOException,
            NoSuchAlgorithmException, CertificateException, UnrecoverableKeyException, UnrecoverableEntryException {
        KeyStore ks = KeyStore.getInstance("JCEKS");
        char[] ksPwd = "yashkaranje98".toCharArray();
        ks.load(null, ksPwd);

        KeyStore.ProtectionParameter protParam = new KeyStore.PasswordProtection(ksPwd);
        javax.crypto.SecretKey mySecretKey = new SecretKeySpec("_anky!#ubn#$0e41".getBytes(), "AES");
        KeyStore.SecretKeyEntry skEntry = new KeyStore.SecretKeyEntry(mySecretKey);
        ks.setEntry("cluster1", skEntry, protParam);

        java.io.FileOutputStream fos = null;
        try {
            fos = new java.io.FileOutputStream("keystore.ks");
            ks.store(fos, ksPwd);
        } finally {
            if (fos != null) {
                fos.close();
            }
        }

        java.io.FileInputStream fis = null;
        try {
            fis = new java.io.FileInputStream("keystore.ks");
            ks.load(fis, ksPwd);
        } finally {
            if (fis != null) {
                fis.close();
            }
        }

        SecretKey key = (SecretKey) ks.getKey("cluster1", ksPwd);
        String encodedKey = Base64.getEncoder().encodeToString(key.getEncoded());
        System.out.println(encodedKey);
    }
}
Alias: "cluster1"
Key to store: _anky!#ubn#$0e41
Protection Parameter: yashkaranje98
It prints: X2Fua3khQHVibiMkMGU0MQ==
What I expect is the key itself: _anky!#ubn#$0e41
Kindly let me know what I'm missing... but first, please tell me: is what I'm expecting even legitimate? Does it make sense?
(I am still learning about this KeyStore concept so there might be some silly mistakes.)
A secret AES key consists of random bytes. Such a key should not be printed directly, because the bytes may not represent valid characters, or they may be control characters that don't print on screen. If you copied them you might miss data, and if you printed them in the wrong terminal you might even send terminal control codes.
Because of this you need to print out key values as hexadecimals or base 64. Normally for symmetric keys hex is preferred as it is easy to see the contents and size from the hex (the size in bytes is half that of the hex size, the size in bits is 4 times the hex size as each hex digit represents a 4 bit nibble). However, as Java still lacks a good Hex encoder, base 64 is also a good option.
Of course, in that case, to compare, you should also decode it from base 64 before you insert it into the key store.
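For example, here is a minimal sketch of that round trip, using the question's literal key string only as example bytes (a real key should be random):

import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Base64;

public class KeyEncodingDemo {
    public static void main(String[] args) {
        // Example bytes only; a real AES key should come from a secure random source.
        byte[] originalKeyBytes = "_anky!#ubn#$0e41".getBytes(StandardCharsets.UTF_8);

        // Encode for printing/copying, as the keystore example does.
        String printable = Base64.getEncoder().encodeToString(originalKeyBytes);
        System.out.println(printable);

        // To compare against the original, decode the base 64 text back to bytes first.
        byte[] decoded = Base64.getDecoder().decode(printable);
        System.out.println(Arrays.equals(originalKeyBytes, decoded)); // true
    }
}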
Also beware that you don't specify the character encoding when you call getBytes on the string. If you used higher-valued characters you could get different results on different systems, because getBytes without an argument assumes the platform's default encoding. Specifying StandardCharsets.UTF_8 usually makes more sense.
Of course, as keys should contain random bytes, the getBytes method needs to go entirely, but you should keep this in mind anyway.
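For instance, a minimal sketch of generating a random AES key and storing it in a keystore; the alias, file name and password below are placeholders, not anything you have to use:

import java.io.FileOutputStream;
import java.security.KeyStore;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class RandomKeyDemo {
    public static void main(String[] args) throws Exception {
        // Generate a proper random 128-bit AES key instead of deriving it from a string literal.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(128);
        SecretKey clusterKey = keyGen.generateKey();

        char[] ksPwd = "changeit".toCharArray(); // placeholder password
        KeyStore ks = KeyStore.getInstance("JCEKS");
        ks.load(null, ksPwd);
        ks.setEntry("cluster1",
                new KeyStore.SecretKeyEntry(clusterKey),
                new KeyStore.PasswordProtection(ksPwd));

        try (FileOutputStream fos = new FileOutputStream("keystore.ks")) {
            ks.store(fos, ksPwd);
        }
    }
}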
When I look at the code it seems you've missed the last 10 years of Java progress. No var, no null avoidance, missing imports, and no try-with-resources. That's a shame, because those would make your code a lot more readable. It's valid, mind you, but yeah...
Related
I'm trying to create an AES key with this code
public static SecretKey generateSecretKey() {
    KeyGenerator generator;
    try {
        generator = KeyGenerator.getInstance(StaticHandler.AES_KEY_MODE); // Is "AES"
        generator.init(StaticHandler.AES_KEY_SIZE); // The AES key size in number of bits // Is "128"
        return generator.generateKey();
    } catch (NoSuchAlgorithmException e) {
        e.printStackTrace();
    }
    return null;
}
however using this code for encrypting/decrypting
public static String encrypt(String data, SecretKey secret, Charset charset) {
    try {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, secret);
        return new String(cipher.doFinal(data.getBytes()), charset);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}

public static String decrypt(String data, @NonNull SecretKey secret, Charset charset) {
    try {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, secret);
        return new String(cipher.doFinal(data.getBytes()), charset);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}
gets the error
java.security.InvalidKeyException: Parameters missing
I'm guessing I need to add some salt, though I don't know how to do that with a generated key. I would like to stay away from generating a password, but if it's a securely generated password I wouldn't mind.
Edit: Just an afterthought: should I use GCM or CBC encryption if I'm sending packets through the network? Remember that I'm using randomly generated keys and I'm not going to keep them across sessions; a new key is randomly generated per client and server session.
No, you don't need salt, and your key is actually fine. CBC mode requires an IV (Initialization Vector), see Wikipedia. The IV should be different for each piece of data encrypted, but each decryption must use the same value as the corresponding encryption did. (added) For CBC, though not some other modes, it is also vital for security that IVs not be predictable by an adversary; the simplest and most common way to achieve both uniqueness and unpredictability is to use a secure Random Number (aka Bit) Generator such as Java's SecureRandom. If you want to know about other methods, that is not really a programming issue and is better suited to crypto.SX or security.SX, where there are already several Qs.
You can either generate the IV explicitly and specify it to both encrypt and decrypt, or allow the encrypt operation to generate the IV itself, fetch it from the encrypt Cipher, and specify it to the decrypt Cipher. In either case the encryptor must provide the value the decryptor will use; a common approach is to simply concatenate the IV with the ciphertext (making it very easy to keep them matched up properly) but again there are other approaches discussed on crypto and security. See https://docs.oracle.com/en/java/javase/11/security/java-cryptography-architecture-jca-reference-guide.html in the sections named "Initializing a Cipher Object" (the two paragraphs just after the boxed block of method declarations) and "Managing Algorithm Parameters".
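For example, here is a minimal sketch of the random-IV-plus-concatenation approach (the key and plaintext below are placeholders):

import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

public class CbcIvDemo {
    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] plaintext = "example packet".getBytes(StandardCharsets.UTF_8);

        // Encrypt: generate a fresh random IV and prepend it to the ciphertext.
        byte[] iv = new byte[16]; // AES block size
        new SecureRandom().nextBytes(iv);
        Cipher enc = Cipher.getInstance("AES/CBC/PKCS5Padding");
        enc.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ct = enc.doFinal(plaintext);
        byte[] message = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, message, 0, iv.length);
        System.arraycopy(ct, 0, message, iv.length, ct.length);

        // Decrypt: split the IV back off and give it to the decrypt Cipher.
        Cipher dec = Cipher.getInstance("AES/CBC/PKCS5Padding");
        dec.init(Cipher.DECRYPT_MODE, key,
                new IvParameterSpec(Arrays.copyOfRange(message, 0, 16)));
        byte[] recovered = dec.doFinal(message, 16, message.length - 16);
        System.out.println(new String(recovered, StandardCharsets.UTF_8));
    }
}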
Also, don't store ciphertext in a String. Java's String is designed to handle valid characters, not arbitrary bytes. 'Decoding' ciphertext to a String and 'encoding' it back to binary will almost always lose or alter some of the data, especially if you allow the Charset to differ at the two ends, and with modern cryptography any change at all to the ciphertext will destroy all or much of your data. Since ciphertext is bytes, it is best to handle it as byte[]; if that is not possible because you want to put it in something that is characters, like a URL, use one of the many schemes designed to encode arbitrary bytes to text so that they can be recovered correctly: base64 (3 or 4 major variants, plus many minor ones), base32, hexadecimal/base16, URL 'percent' encoding, MIME quoted-printable, yencode, Kermit, PPP, etc. Java 8+ provides java.util.Base64 for the newer base64 variants (i.e. not uuencode).
Conversely, although 'plaintext' in modern crypto can really be any form of data, if yours really is text and belongs in a String, you should encode it using a suitable Charset before encrypting and decode it using the same Charset after decrypting, i.e.
byte[] ctext = encCipher.doFinal (input.getBytes(charset));
...
String output = new String (decCipher.doFinal (ctext), charset);
While the 'best' Charset may vary depending on your data, if you don't know what the data will be or don't want to bother analyzing it, UTF-8 is reasonably good for most text data and very popular and standard.
I'm trying to create a custom GlassFish JDBCRealm, and during some tests on it I got a MalformedInputException when using the com.sun.enterprise.util.Utility.convertByteArrayToCharArray function.
So I decided to extract the part of my function that throws this error, in order to test it and understand where it comes from.
Simplified function:
public void justATestFunction()
        throws Exception
{
    final char[] password = "myP4ssW0rd42".toCharArray();
    MessageDigest md = MessageDigest.getInstance("SHA-256");

    // according to the Utility doc, if the Charset parameter is null or empty,
    // it will call the Charset.defaultCharset() function to define the charset to use
    byte[] hashedPassword = Utility.convertCharArrayToByteArray(password, null);
    hashedPassword = md.digest(hashedPassword);

    Utility.convertByteArrayToCharArray(hashedPassword, null); // throws a MalformedInputException
}
Thank you in advance for your answer.
Let's look at each step:
byte[] hashedPassword= Utility.convertCharArrayToByteArray(password, null);
The above converts the UTF-16 characters to a byte array using the default encoding, probably Windows-1252 or UTF-8. Since the password contains nothing outside standard 7-bit ASCII, the result is the same for either encoding.
hashedPassword = md.digest(hashedPassword);
hashedPassword now refers to a completely different byte array containing the BINARY digest of the original password. This is a BINARY string and no longer represents anything in any character encoding. It is pure binary data.
Utility.convertByteArrayToCharArray(hashedPassword, null);
Now you attempt to "decode" the binary string as if it were text encoded with the default character set; since arbitrary binary data is rarely a valid byte sequence in that charset, this will almost certainly throw an exception.
I suspect you really wanted to display either the hexadecimal representation of the digest, or maybe the base-64 version. In either case, what you have done will never work.
Since you haven't explained what you want to accomplish, this is the best anyone can do.
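If a printable form of the digest is what you're after, here is a minimal sketch of both options using only the JDK (the password literal is just the one from your test):

import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class DigestDisplayDemo {
    public static void main(String[] args) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest("myP4ssW0rd42".getBytes(StandardCharsets.UTF_8));

        // Hexadecimal: 64 hex digits for a 32-byte SHA-256 digest
        String hex = String.format("%064x", new BigInteger(1, digest));

        // Base 64: 44 characters including padding
        String b64 = Base64.getEncoder().encodeToString(digest);

        System.out.println(hex);
        System.out.println(b64);
    }
}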
Something in the back of my head is telling me I'm missing something obvious here.
I'm integrating an existing java project with a third-party api that uses an md5 hash of an api key for authentication. It's not working for me, and during debugging I realized that the hashes I'm generating don't match the examples that they've supplied. I've found some websites that create MD5 hashes from strings to check their examples, and as far as I can tell I'm wrong and they're right.
for example, according to this website, the string "hello" generates a hash of "5d41402abc4b2a76b9719d911017c592". (FWIW I don't know anything about this website except that it seems to correctly hash the examples that I have). When I run it through my code I get:
XUFAKrxLKna5cZ2REBfFkg==
Here is the simple method I'm using to generate the md5 hash/string:
private String md5(String md5Me) throws Exception {
    MessageDigest md = MessageDigest.getInstance("MD5");
    md.reset();
    md.update(md5Me.getBytes("UTF-8"));
    return Base64.encodeBase64String(md.digest());
}
I used a very similar method to successfully authenticate against a different API using the SHA1 algorithm last week. I'm wondering if the problem is related to org.apache.commons.net.util.Base64.encodeBase64String... Any help is greatly appreciated, even if it's only some tests to check whether the byte array is correct but the converted string is wrong.
for example, according to this website, the string "hello" generates a hash of "5d41402abc4b2a76b9719d911017c592". (FWIW I don't know anything about this website except that it seems to correctly hash the examples that I have). When I run it through my code I get:
XUFAKrxLKna5cZ2REBfFkg==
Both are correct ways of representing the same sixteen-byte hash. 5d41402abc4b2a76b9719d911017c592 represents each byte of the hash as two hexadecimal digits, whereas XUFAKrxLKna5cZ2REBfFkg== uses Base-64 to represent every three bytes of the hash as four characters.
To generate the hexadecimal-version that this third-party API is expecting, you can change this:
Base64.encodeBase64String(md.digest());
to this:
String.format("%032x", new BigInteger(1, md.digest()));
(mostly taken from this StackOverflow answer).
However, you might want to consider using an external library for this. Perception, in a comment above, mentions Apache Commons DigestUtils. If you use that, you'll want the md5hex method.
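For example (assuming Commons Codec is on the classpath), the whole md5 helper above collapses to something like:

import org.apache.commons.codec.digest.DigestUtils;

public class Md5HexDemo {
    public static void main(String[] args) {
        System.out.println(DigestUtils.md5Hex("hello")); // prints 5d41402abc4b2a76b9719d911017c592
    }
}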
The MD5 hash algorithm is part of the core Java API, so there is no need for any external libraries. Here is the method I use to hash a password with MD5.
import java.security.MessageDigest;

/**
 * Hashes a plain-text password using the MD5 algorithm.
 * @param password should be a plain text password.
 * @return a hex String that results from hashing the given password.
 */
public static String encryptPassword(String password) {
    try {
        MessageDigest md = MessageDigest.getInstance("MD5");
        md.update(password.getBytes());
        byte[] byteData = md.digest();

        StringBuffer hexString = new StringBuffer();
        for (int i = 0; i < byteData.length; i++) {
            String hex = Integer.toHexString(0xff & byteData[i]);
            if (hex.length() == 1) hexString.append('0');
            hexString.append(hex);
        }
        return hexString.toString();
    } catch (java.security.NoSuchAlgorithmException missing) {
        return "Error.";
    }
}
I have an encryption algorithm (AES) that accepts a file converted to a byte array and encrypts it.
Since I am going to process very large files, the JVM may run out of memory.
I am planning to read the file into multiple byte arrays, each containing some part of the file. Then I iteratively feed the algorithm. Finally, I merge the outputs to produce an encrypted file.
So my question is: Is there any way to read a file part by part to multiple byte arrays?
I thought I could use the following to read the file to a byte array:
IOUtils.toByteArray(InputStream input).
And then split the array into multiple byte arrays using:
Arrays.copyOfRange()
But I am afraid that the code that reads the file into a byte array will make the JVM go out of memory.
Look up cipher streams in Java. You can use them to encrypt/decrypt streams on the fly so you don't have to store the whole thing in memory. All you have to do is copy the regular FileInputStream for your source file to the CipherOutputStream that's wrapping your FileOutputStream for the encrypted sink file. IOUtils even conveniently contains a copy(InputStream, OutputStream) method to do this copy for you.
For example:
public static void main(String[] args) throws Exception {
    encryptFile("exampleInput.txt", "exampleOutput.txt");
}

public static void encryptFile(String source, String sink) throws Exception {
    FileInputStream fis = null;
    try {
        fis = new FileInputStream(source);
        CipherOutputStream cos = null;
        try {
            cos = new CipherOutputStream(new FileOutputStream(sink), getEncryptionCipher());
            IOUtils.copy(fis, cos);
        } finally {
            if (cos != null)
                cos.close();
        }
    } finally {
        if (fis != null)
            fis.close();
    }
}

private static Cipher getEncryptionCipher() throws Exception {
    // Create AES cipher with whatever padding and other properties you want
    Cipher cipher = ... ;
    // Create AES secret key
    Key key = ... ;
    cipher.init(Cipher.ENCRYPT_MODE, key);
    return cipher;
}
If you need to know the number of bytes that were copied, or if the file sizes might exceed Integer.MAX_VALUE bytes (2 GB), use IOUtils.copyLarge instead of IOUtils.copy.
To decrypt the file, do the same thing, but use CipherInputStream instead of CipherOutputStream and initialize your Cipher using Cipher.DECRYPT_MODE.
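As a minimal sketch of the decryption direction, written in the same style as the example above (here the caller is assumed to supply a Cipher that has already been initialized with Cipher.DECRYPT_MODE and the same key):

public static void decryptFile(String source, String sink, Cipher decryptionCipher) throws IOException {
    CipherInputStream cis = null;
    try {
        cis = new CipherInputStream(new FileInputStream(source), decryptionCipher);
        FileOutputStream fos = null;
        try {
            fos = new FileOutputStream(sink);
            IOUtils.copy(cis, fos);
        } finally {
            if (fos != null)
                fos.close();
        }
    } finally {
        if (cis != null)
            cis.close();
    }
}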
Take a look here for more info on cipher streams in Java.
This will save you space because you won't need to store byte arrays of your own anymore. The only stored byte[] in this system is the internal byte[] of the Cipher, which will get cleared each time enough input is entered and an encrypted block is returned by Cipher.update, or on Cipher.doFinal when the CipherOutputStream is closed. However, you don't have to worry about any of this since it's all internal and everything is managed for you.
Edit: note that this can result in certain encryption exceptions being ignored, particularly BadPaddingException and IllegalBlockSizeException. This behavior can be found in the CipherOutputStream source code. (Granted, this source is from the OpenJDK, but it probably does the same thing in the Sun JDK.) Also, from the CipherOutputStream javadocs:
This class adheres strictly to the semantics, especially the failure semantics, of its ancestor classes java.io.OutputStream and java.io.FilterOutputStream. This class has exactly those methods specified in its ancestor classes, and overrides them all. Moreover, this class catches all exceptions that are not thrown by its ancestor classes.
The bolded line here implies that the cryptographic exceptions are ignored, which they are. This may cause some unexpected behavior while trying to read an encrypted file, especially for block and/or padding encryption algorithms like AES. Keep in mind that you may get zero or partial output for the encrypted (or, with CipherInputStream, decrypted) file.
If you're using IOUtils, perhaps you should consider IOUtils.copyLarge()
public static long copyLarge(InputStream input,
OutputStream output,
long inputOffset,
long length)
and specify a ByteArrayOutputStream as the output. You can then iterate through and load sections of your file using offset/length.
From the doc:
Copy some or all bytes from a large (over 2GB) InputStream to an
OutputStream, optionally skipping input bytes.
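A minimal sketch of that approach (assuming Commons IO 2.2+; the chunk size and file name are placeholders, and note that each chunk re-opens the stream and skips to the offset, which is simple but not the most efficient option):

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import org.apache.commons.io.IOUtils;

public class ChunkedReadDemo {
    // Reads one section of the file into its own byte array.
    static byte[] readChunk(String path, long offset, long length) throws IOException {
        try (InputStream in = new FileInputStream(path)) {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            IOUtils.copyLarge(in, buffer, offset, length);
            return buffer.toByteArray();
        }
    }

    public static void main(String[] args) throws IOException {
        long chunkSize = 8 * 1024 * 1024; // 8 MB per chunk, for example
        long fileSize = new File("bigfile.bin").length();
        for (long offset = 0; offset < fileSize; offset += chunkSize) {
            byte[] chunk = readChunk("bigfile.bin", offset, Math.min(chunkSize, fileSize - offset));
            // feed 'chunk' to the encryption routine here
        }
    }
}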
I'm having a problem with MessageDigest returning different hash values on different computers.
One computer is running 32-bit Java on Windows Vista and the other is running 64-bit Java on Mac OS. I'm not sure if it is because MessageDigest is machine dependent, or I need to explicitly specify a character encoding somewhere, or perhaps something else. Here's the code:
public static boolean authenticate(String salt, String encryptedPassword,
        char[] plainTextPassword) throws NoSuchAlgorithmException {

    // do I need to explicitly specify character encoding here? -->
    String saltPlusPlainTextPassword = salt + new String(plainTextPassword);

    MessageDigest sha = MessageDigest.getInstance("SHA-512");

    // is this machine dependent? -->
    sha.update(saltPlusPlainTextPassword.getBytes());
    byte[] hashedByteArray = sha.digest();

    // or... perhaps there's a translation problem here? -->
    String hashed = new String(hashedByteArray);

    return hashed.equals(encryptedPassword);
}
Should this code execute differently on these two different machines?
If it is machine dependent the way I've written it, is there another way to hash these passwords that is more portable? Thanks!
Edit:
This is the code I'm using to generate the salts:
public static String getSalt() {
    int size = 16;
    byte[] bytes = new byte[size];
    new Random().nextBytes(bytes);
    return org.apache.commons.codec.binary.Base64.encodeBase64URLSafeString(bytes);
}
Solution:
Thanks to the accepted solution, I was able to fix my code:
public static boolean authenticate_(String salt, String encryptedPassword,
        char[] plainTextPassword) throws NoSuchAlgorithmException, UnsupportedEncodingException {

    // This was ok
    String saltPlusPlainTextPassword = salt + new String(plainTextPassword);

    MessageDigest sha = MessageDigest.getInstance("SHA-512");

    // must specify "UTF-8" encoding
    sha.update(saltPlusPlainTextPassword.getBytes("UTF-8"));
    byte[] hashedByteArray = sha.digest();

    // Use Base64 encoding here -->
    String hashed = org.apache.commons.codec.binary.Base64.encodeBase64URLSafeString(hashedByteArray);

    return hashed.equals(encryptedPassword);
}
Encodings are causing you problems. First here:
saltPlusPlainTextPassword.getBytes()
That will use the default encoding for the machine. Bad idea. Specify "UTF-8" as a simple solution. (It's guaranteed to be present.)
Next this causes issues:
String hashed = new String(hashedByteArray);
hashedByteArray is arbitrary binary data. To safely convert it to text, either use a base-64 encoding or just hex. Again, you're currently using the default encoding, which will vary from machine to machine. There are loads of 3rd party libraries for base64 encoding in Java.
Jon Skeet's answer above most likely identifies the cause, and his recommendations should definitely be taken into account, but another possible cause is a misunderstanding of salt.
Salt is a semi-secret random value that is applied to a String prior to hashing. This makes it harder to perform a brute force attack when trying to guess what an originating String was because the salt is presumably unknown to the attacker.
Salt values generally differ from installation to installation. It's possible that the actual cause is simply that you have the salt values set differently on the different machines.