HMAC value calculated in Java does not match Ruby result

I need to port client-provided Ruby code to Java. The code uses a secret key and Base64 encoding to form an HMAC value. I wrote equivalent code in Java, but the resulting HMAC value does not match the Ruby script's result. Below are the Java and Ruby code blocks along with their output.
Java Code:
public static void main(String[] args)
        throws NoSuchAlgorithmException, InvalidKeyException {
    // get an hmac_sha1 key from the raw key bytes
    String secretKey =
            "Ye2oSnu1NjzJar1z2aaL68Zj+64FsRM1kj7I0mK3WJc2HsRRcGviXZ6B4W+/V2wFcu78r8ZkT8=";
    byte[] secretkeyByte = Base64.decodeBase64(secretKey.getBytes());
    SecretKeySpec signingKey = new SecretKeySpec(secretkeyByte, "HmacSHA1");
    String movingFact = "0";
    byte[] text = movingFact.getBytes();
    // get an hmac_sha1 Mac instance and initialize with the signing key
    Mac mac = Mac.getInstance("HmacSHA1");
    mac.init(signingKey);
    // compute the hmac on input data bytes
    byte[] rawHmac = mac.doFinal(text);
    byte[] hash = Base64.encodeBase64(rawHmac);
    System.out.println("hash :" + hash);
}
Java Output: hash :[B@72a32604
Ruby Code:
require 'base64'
require 'openssl'

def get_signature()
  key = Base64.decode64("Ye2oSnu1NjzJar1z2aaL68Zj+64FsRM1kj7I0mK3WJc2HsRRcGviXZ6B4W+/V2wFcu78r8ZkT8=")
  digest = OpenSSL::Digest::Digest.new('sha1')
  string_to_sign = "0"
  hash = Base64.encode64(OpenSSL::HMAC.digest(digest, key, string_to_sign))
  puts "hash: " + hash
end
Ruby Output: hash: Nxe7tOBsbxLpsrqUJjncrPFI50E=

As mentioned in the comments, you're printing the default toString() of your byte array (its type and identity hash code), not its contents:
Replace:
System.out.println("hash :" + hash);
With:
System.out.println("hash: " + new String(hash, StandardCharsets.UTF_8));

Related

Convert HMAC function from Java to JavaScript

I am trying to implement an HMAC function in Node.js, using this Java function as a reference:
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import javax.xml.bind.DatatypeConverter;

private static String printMacAsBase64(byte[] macKey, String counter) throws Exception {
    // import the AES-128 MAC_KEY
    SecretKeySpec signingKey = new SecretKeySpec(macKey, "AES");
    // create a new HMAC object with SHA-256 as the hashing algorithm
    Mac mac = Mac.getInstance("HmacSHA256");
    mac.init(signingKey);
    // integer -> string -> bytes -> MAC bytes
    byte[] counterMac = mac.doFinal(counter.getBytes("UTF-8"));
    // Base64-encoded string
    return DatatypeConverter.printBase64Binary(counterMac);
}
From this I get an HMAC of Qze5cHfTOjNqwmSSEOd9nEISOobheV833AncGJLin9Y=
I am getting a different value when passing the same counter and key through the HMAC algorithm in Node. Here is my code to generate the HMAC:
var decryptedMacKey = 'VJ/V173QE+4CrVvMQ2JqFg==';
var counter = 1;
var hash = crypto
    .createHmac('SHA256', decryptedMacKey)
    .update(new Buffer(counter.toString(), 'utf8'), 'utf8')
    .digest('base64');
When I run this I get a MAC of nW5MKXhnGmgpYwV0qmQtkNBDrCbqQWQSkk02fiQBsGU=
I was unable to find any equivalent of the SecretKeySpec class in JavaScript, so that may be the missing link.
I was also able to generate the same value as my program using https://quickhash.com/ by selecting the SHA-256 algorithm and entering the decrypted MAC key and counter.
You forgot to decode decryptedMacKey from its Base64 representation:
var hash = crypto.createHmac('SHA256', new Buffer(decryptedMacKey, 'base64'))
    .update(new Buffer(counter.toString(), 'utf8'), 'utf8')
    .digest('base64');
gives:
'Qze5cHfTOjNqwmSSEOd9nEISOobheV833AncGJLin9Y='
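
For comparison, a minimal sketch of the same computation from the Java side (class name is illustrative; it just feeds the decoded key from this question through the reference function's logic):

import java.util.Base64;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class CounterMacDemo {
    public static void main(String[] args) throws Exception {
        // decode the Base64 key to raw bytes first,
        // exactly as in the corrected Node code above
        byte[] macKey = Base64.getDecoder().decode("VJ/V173QE+4CrVvMQ2JqFg==");
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(macKey, "HmacSHA256"));
        byte[] counterMac = mac.doFinal("1".getBytes("UTF-8"));
        // expected: Qze5cHfTOjNqwmSSEOd9nEISOobheV833AncGJLin9Y=
        System.out.println(Base64.getEncoder().encodeToString(counterMac));
    }
}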

Difference in HMAC signature between python and java

I am trying to take some working Python code and convert it to Java. The Python code below produces the correct signature. The Java code, using the same key and salt, produces something different, and I am at a loss as to why. In the Java code I am using the key generated in Python (_key) to create the signature.
What I don't understand is: if I print the value of _key in Python I get "34ee7983-5ee6-4147-aa86-443ea062abf774493d6a-2a15-43fe-aace-e78566927585". Now if I take that and place it directly into the hmac.new call, I get a different result than if I just leave the _key variable. I assume this has something to do with encoding of some kind, but I am at a loss.
_s1 = base64.b64decode('VzeC4H4h+T2f0VI180nVX8x+Mb5HiTtGnKgH52Otj8ZCGDz9jRW'
                       'yHb6QXK0JskSiOgzQfwTY5xgLLSdUSreaLVMsVVWfxfa8Rw==')
_s2 = base64.b64decode('ZAPnhUkYwQ6y5DdQxWThbvhJHN8msQ1rqJw0ggKdufQjelrKuiG'
                       'GJI30aswkgCWTDyHkTGK9ynlqTkJ5L4CiGGUabGeo8M6JTQ==')
# bitwise XOR of _s1 and _s2 bytes, converted to a string
_key = ''.join([chr(ord(c1) ^ ord(c2)) for (c1, c2) in zip(_s1, _s2)])

@classmethod
def get_signature(cls, song_id, salt=None):
    """Return a (sig, salt) pair for url signing."""
    if salt is None:
        salt = str(int(time.time() * 1000))
    mac = hmac.new(cls._key, song_id, sha1)
    mac.update(salt)
    sig = base64.urlsafe_b64encode(mac.digest())[:-1]
    return sig, salt
This is my Java code. I think ultimately my issue is how I am handling or encoding the AA_KEY but I cannot figure it out.
private static final String AA_KEY = "34ee7983-5ee6-4147-aa86-443ea062abf774493d6a-2a15-43fe-aace-e78566927585";

public void someFunc(String songId) {
    String salt = "1431875768596";
    String sig = hmacSha1(songId + salt, AA_KEY);
    sig = StringUtils.replaceChars(sig, "+/=", "-_.");
}
static String hmacSha1(String value, String key) {
    try {
        // Get an hmac_sha1 key from the raw key bytes
        byte[] keyBytes = key.getBytes();
        SecretKeySpec signingKey = new SecretKeySpec(keyBytes, "HmacSHA1");
        // Get an hmac_sha1 Mac instance and initialize with the signing key
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(signingKey);
        // Compute the hmac on input data bytes
        byte[] rawHmac = mac.doFinal(value.getBytes());
        return Base64.encodeBytes(rawHmac);
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
I found a couple of similar questions but they didn't help me figure it out sadly. Thanks!
Python HMAC-SHA256 signature differs from PHP signature
Python HMAC-SHA1 vs Java HMAC-SHA1 different results
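
The Python snippet itself points at the likely culprit: _key is the byte-wise XOR of two Base64-decoded byte strings, so it contains non-printable bytes that will not survive being printed and then re-encoded via String.getBytes() with a platform charset. A sketch of mirroring the key derivation directly in Java (the method name deriveKey is illustrative):

import java.util.Base64;

// Hypothetical helper: derive the raw HMAC key the way the Python code does,
// by XOR-ing the two Base64-decoded constants byte for byte.
static byte[] deriveKey(String s1Base64, String s2Base64) {
    byte[] s1 = Base64.getDecoder().decode(s1Base64);
    byte[] s2 = Base64.getDecoder().decode(s2Base64);
    byte[] key = new byte[Math.min(s1.length, s2.length)]; // zip() stops at the shorter input
    for (int i = 0; i < key.length; i++) {
        key[i] = (byte) (s1[i] ^ s2[i]);
    }
    return key; // pass these raw bytes to SecretKeySpec, not a printed String
}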

Java AES-256 Decryption - translating code from ActionScript 3

I have a working decryption in ActionScript 3; now I want to get the same result when decrypting in Java. (I know that the OFB mode and NullPadding are probably not preferred, but that's what I used back then, and that is what I need to decrypt now...)
(very old) Adobe ActionScript 3 code:
static public function decryptTest(): Boolean {
    var iv: String = "0df1eff724d50157ab048d9ff214b73c";
    var cryptext: String = "2743be20314cdc768065b794904a0724e64e339ea6b4f13c510e2d2e8c95dd7409aa0aefd20daae80956dd2978c98d6e914d1d7b5b5be47b491d91e7e4f16f7f30d991ba80a81bafd8f0d7d83755ba0ca66d6b208424529c7111bc9cd6d11786f3f604a0715f";
    var kkey: String = "375f22c03371803ca6d36ec42ae1f97541961f7359cf5611bbed399b42c7c0be";
    var kdata: ByteArray = Hex.toArray(kkey);
    var data: ByteArray = Hex.toArray(cryptext);
    var name: String = 'aes-256-ofb';
    var pad: IPad = new NullPad();
    var mode: ICipher = Crypto.getCipher(name, kdata, pad);
    pad.setBlockSize(mode.getBlockSize());
    trace("mode block size: " + mode.getBlockSize());
    if (mode is IVMode) {
        var ivmode: IVMode = mode as IVMode;
        ivmode.IV = Hex.toArray(iv);
    }
    mode.decrypt(data);
    var res: String = data.toString();
    trace("result: " + res);
    return res == "01020506080b10131c22292d313536393b464c535466696d6e7d7f808a8e9899a2adb1b8babcbebfc1c6c7c8cecfd8e0e4e8ef";
}
trace("decryption test: " + netplay.decryptTest());
Flash output is:
mode block size: 16
result: 01020506080b10131c22292d313536393b464c535466696d6e7d7f808a8e9899a2adb1b8babcbebfc1c6c7c8cecfd8e0e4e8ef
decryption test: true
What have I tried?
I have tried two different approaches in Java: one using the built-in Cipher class, and one using this code/class. The first approach gives me an InvalidKeyException ("Illegal key size") and the other gives me garbage. Also, the second approach doesn't clearly specify how to supply the IV data for decryption, nor does it let me specify OFB mode or the padding.
java.security.InvalidKeyException: Illegal key size
    at javax.crypto.Cipher.checkCryptoPerm(Cipher.java:1023)
    at javax.crypto.Cipher.implInit(Cipher.java:789)
    at javax.crypto.Cipher.chooseProvider(Cipher.java:848)
    at javax.crypto.Cipher.init(Cipher.java:1347)
    at javax.crypto.Cipher.init(Cipher.java:1281)
    at test.net.zomis.ZomisTest.decryptCipher(ZomisTest.java:112)
@Test
public void decryptCipher() throws UnsupportedEncodingException, NoSuchAlgorithmException, NoSuchPaddingException, InvalidKeyException, InvalidAlgorithmParameterException, IllegalBlockSizeException, BadPaddingException {
    String iv = "0df1eff724d50157ab048d9ff214b73c";
    String cryptext = "2743be20314cdc768065b794904a0724e64e339ea6b4f13c510e2d2e8c95dd7409aa0aefd20daae80956dd2978c98d6e914d1d7b5b5be47b491d91e7e4f16f7f30d991ba80a81bafd8f0d7d83755ba0ca66d6b208424529c7111bc9cd6d11786f3f604a0715f";
    String key = "375f22c03371803ca6d36ec42ae1f97541961f7359cf5611bbed399b42c7c0be"; // hex string, converted to raw key bytes below
    String expectedResult = "01020506080b10131c22292d313536393b464c535466696d6e7d7f808a8e9899a2adb1b8babcbebfc1c6c7c8cecfd8e0e4e8ef";
    byte[] kdata = Util.hex2byte(key);
    Assert.assertEquals(32, kdata.length); // 32 bytes = 256-bit key
    String result;
    Cipher cipher;
    cipher = Cipher.getInstance("AES/OFB/NoPadding");
    // Below line is 112, which is causing the exception
    cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(kdata, "AES"), new IvParameterSpec(iv.getBytes("UTF-8")));
    byte[] cryptData = Util.hex2byte(cryptext);
    byte[] ciphertext = cipher.doFinal(cryptData);
    result = new String(ciphertext);
    Assert.assertEquals(expectedResult, result);
}
@Test
public void decryptAES() {
    String iv = "0df1eff724d50157ab048d9ff214b73c";
    // Problem: Where should I specify the IV? Currently it is an unused variable...
    String cryptext = "2743be20314cdc768065b794904a0724e64e339ea6b4f13c510e2d2e8c95dd7409aa0aefd20daae80956dd2978c98d6e914d1d7b5b5be47b491d91e7e4f16f7f30d991ba80a81bafd8f0d7d83755ba0ca66d6b208424529c7111bc9cd6d11786f3f604a0715f";
    String key = "375f22c03371803ca6d36ec42ae1f97541961f7359cf5611bbed399b42c7c0be"; // hex string, converted to raw key bytes below
    String expectedResult = "01020506080b10131c22292d313536393b464c535466696d6e7d7f808a8e9899a2adb1b8babcbebfc1c6c7c8cecfd8e0e4e8ef";
    Assert.assertEquals(64, key.length());
    AES aes = new AES();
    aes.setKey(Util.hex2byte(key));
    byte[] byteCryptedData = Util.hex2byte(cryptext);
    String byteCryptedString = new String(byteCryptedData);
    while (byteCryptedString.length() % 16 != 0) byteCryptedString += " ";
    String result = aes.Decrypt(byteCryptedString);
    Assert.assertEquals(expectedResult, result); // Assertion failed
}
The question:
How can I make Java decrypt in the same way that ActionScript 3 does? Of course, I'd like to get the same result on both.
The first approach is giving you an Illegal key size error message because you don't have the unrestricted policy files installed. Java will refuse to work with "strong" key lengths (e.g. 256-bit AES) without these in place.
If it is legal to do so in your jurisdiction, Google for "Unlimited Strength Jurisdiction Policy Files" and download the version applicable to your Java installation. You will end up with two files to dump into lib/security in your JRE.
Yes, there are such libraries; have a look at http://www.bouncycastle.org/. A bit more specific: Java Bouncy Castle Cryptography - Encrypt with AES.
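
Separately, once the policy files are installed, note that in decryptCipher the IV is a 32-character hex string, so iv.getBytes("UTF-8") yields 32 bytes instead of the 16-byte IV that AES requires. A sketch of the init with the IV hex-decoded the same way as the key and ciphertext (reusing the question's Util.hex2byte helper; this is an observation, not a guaranteed fix):

// decode the hex IV to its 16 raw bytes, matching the ActionScript Hex.toArray(iv)
byte[] ivBytes = Util.hex2byte(iv); // 32 hex chars -> 16 bytes
Cipher cipher = Cipher.getInstance("AES/OFB/NoPadding");
cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(kdata, "AES"),
        new IvParameterSpec(ivBytes));
byte[] plain = cipher.doFinal(Util.hex2byte(cryptext));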

HMAC SHA1 hash - Java producing different hash output than C#

This is a follow-up to this question, but I'm trying to port C# code to Java instead of Ruby code to C#, as was the case in the related question. I am trying to verify that the encrypted signature returned from the Recurly.js API is valid. Unfortunately, Recurly does not have a Java library to assist with the validation, so I must implement the signature validation myself.
Per the related question above (this), the following C# code can produce the hash needed to validate the signature returned from Recurly:
var privateKey = Configuration.RecurlySection.Current.PrivateKey;
var hashedKey = SHA1.Create().ComputeHash(Encoding.UTF8.GetBytes(privateKey));
var hmac = new HMACSHA1(hashedKey);
var hash = hmac.ComputeHash(Encoding.ASCII.GetBytes(dataToProtect));
return BitConverter.ToString(hash).Replace("-", "").ToLower();
Recurly provides the following example data on their signature documentation page:
unencrypted verification message:
[1312701386,transactioncreate,[account_code:ABC,amount_in_cents:5000,currency:USD]]
private key:
0123456789ABCDEF0123456789ABCDEF
resulting signature:
0f5630424b32402ec03800e977cd7a8b13dbd153-1312701386
Here is my Java implementation:
String unencryptedMessage = "[1312701386,transactioncreate,[account_code:ABC,amount_in_cents:5000,currency:USD]]";
String privateKey = "0123456789ABCDEF0123456789ABCDEF";
String encryptedMessage = getHMACSHA1(unencryptedMessage, getSHA1(privateKey));
private static byte[] getSHA1(String source) throws NoSuchAlgorithmException, UnsupportedEncodingException {
    MessageDigest md = MessageDigest.getInstance("SHA-1");
    byte[] bytes = md.digest(source.getBytes("UTF-8"));
    return bytes;
}

private static String getHMACSHA1(String baseString, byte[] keyBytes) throws GeneralSecurityException, UnsupportedEncodingException {
    SecretKey secretKey = new SecretKeySpec(keyBytes, "HmacSHA1");
    Mac mac = Mac.getInstance("HmacSHA1");
    mac.init(secretKey);
    byte[] bytes = baseString.getBytes("ASCII");
    return Hex.encodeHexString(mac.doFinal(bytes));
}
However, when I print out the encryptedMessage variable, it does not match the message portion of the example signature. Specifically, I get a value of "c8a9188dcf85d1378976729e50f1de5093fabb78" instead of "0f5630424b32402ec03800e977cd7a8b13dbd153".
Update
Per @M.Babcock, I reran the C# code with the example data, and it returned the same output as the Java code. So it appears my hashing approach is correct, but I am passing in the wrong data (unencryptedMessage). Sigh. I will update this post if/when I can determine what the correct data to encrypt is, as the "unencrypted verification message" provided in the Recurly documentation appears to be missing something.
Update 2
The error turned out to be the "unencrypted verification message" data/format. The message in the example data does not actually encrypt to the example signature provided, so perhaps the documentation is outdated? At any rate, I have confirmed the Java implementation works for real-world data. Thanks to all.
I think the problem is in your .NET code. Does Configuration.RecurlySection.Current.PrivateKey return a string? Is that value the key you expect?
Using the following code, .NET and Java return identical results.
.NET Code
string message = "[1312701386,transactioncreate,[account_code:ABC,amount_in_cents:5000,currency:USD]]";
string privateKey = "0123456789ABCDEF0123456789ABCDEF";
var hashedKey = SHA1.Create().ComputeHash(Encoding.UTF8.GetBytes(privateKey));
var hmac = new HMACSHA1(hashedKey);
var hash = hmac.ComputeHash(Encoding.ASCII.GetBytes(message));
Console.WriteLine(" Message: {0}", message);
Console.WriteLine(" Key: {0}\n", privateKey);
Console.WriteLine("Key bytes: {0}", BitConverter.ToString(hashedKey).Replace("-", "").ToLower());
Console.WriteLine(" Result: {0}", BitConverter.ToString(hash).Replace("-", "").ToLower());
Result:
Message: [1312701386,transactioncreate,[account_code:ABC,amount_in_cents:5000,currency:USD]]
Key: 0123456789ABCDEF0123456789ABCDEF
Key bytes: 4d857d2408b00c3dd17f0c4ffcf15b97f1049867
Result: c8a9188dcf85d1378976729e50f1de5093fabb78
Java
String message = "[1312701386,transactioncreate,[account_code:ABC,amount_in_cents:5000,currency:USD]]";
String privateKey = "0123456789ABCDEF0123456789ABCDEF";
MessageDigest md = MessageDigest.getInstance("SHA-1");
byte[] keyBytes = md.digest(privateKey.getBytes("UTF-8"));
SecretKey sk = new SecretKeySpec(keyBytes, "HmacSHA1");
Mac mac = Mac.getInstance("HmacSHA1");
mac.init(sk);
byte[] result = mac.doFinal(message.getBytes("ASCII"));
System.out.println(" Message: " + message);
System.out.println(" Key: " + privateKey + "\n");
System.out.println("Key Bytes: " + toHex(keyBytes));
System.out.println(" Results: " + toHex(result));

// a minimal toHex helper:
static String toHex(byte[] bytes) {
    StringBuilder sb = new StringBuilder();
    for (byte b : bytes) sb.append(String.format("%02x", b));
    return sb.toString();
}
Result:
Message: [1312701386,transactioncreate,[account_code:ABC,amount_in_cents:5000,currency:USD]]
Key: 0123456789ABCDEF0123456789ABCDEF
Key Bytes: 4d857d2408b00c3dd17f0c4ffcf15b97f1049867
Results: c8a9188dcf85d1378976729e50f1de5093fabb78
I suspect the default encodings of the values you're working with may differ. Since no encoding is specified, the strings will use the platform's default encoding.
I did a quick search to verify whether this is true and it was still inconclusive, but it made me think that strings in .NET default to UTF-16 encoding, while Java defaults to UTF-8. (Can someone confirm this?)
If that's the case, then your GetBytes call with UTF-8 encoding is already producing a different output in each case.
Based on this sample code, it looks like Java expects the key not to have been SHA1'd before creating a SecretKeySpec. Have you tried that?

Java equivalent of Fantom HMAC using SHA1

I'm having trouble doing the following in Java. Below is the Fantom code from the documentation for the tool I am using.
// compute salted hmac
hmac := Buf().print("$username:$userSalt").hmac("SHA-1", password.toBuf).toBase64
// now compute login digest using nonce
digest := "${hmac}:${nonce}".toBuf.toDigest("SHA-1").toBase64
// our example variables
username: "jack"
password: "pass"
userSalt: "6s6Q5Rn0xZP0LPf89bNdv+65EmMUrTsey2fIhim/wKU="
nonce: "3da210bdb1163d0d41d3c516314cbd6e"
hmac: "IjJOApgvDoVDk9J6NiyWdktItl0="
digest: "t/nzXF3n0zzH4JhXtihT8FC1N3s="
I've been searching various examples through Google but none of them produce the results the documentation claims should be returned.
Can someone with Fantom knowledge verify if the example in the documentation is correct?
As for the Java side, here is my most recent attempt
public static String hmacSha1(String value, String key) {
    try {
        // Get an hmac_sha1 key from the raw key bytes
        byte[] keyBytes = key.getBytes("UTF-8");
        SecretKeySpec signingKey = new SecretKeySpec(keyBytes, "HmacSHA1");
        // Get an hmac_sha1 Mac instance and initialize with the signing key
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(signingKey);
        // Compute the hmac on input data bytes
        byte[] rawHmac = mac.doFinal(value.getBytes("UTF-8"));
        // Convert raw bytes to Hex
        byte[] hexBytes = new Hex().encode(rawHmac);
        // Convert array of Hex bytes to a String
        return new String(hexBytes, "UTF-8");
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
However, when I call the method with the following parameters
jack:6s6Q5Rn0xZP0LPf89bNdv+65EmMUrTsey2fIhim/wKU=
pass
I get
22324e02982f0e854393d27a362c96764b48b65d
Not sure where the docs came from - but they could be out-of-date - or wrong. I would actually run the Fantom code to use as your reference to make sure you're testing the right stuff ;)
You can take a look at the Java source for sys::Buf.hmac: MemBuf.java
I would also recommend separating out the 3 transformations: make sure your raw byte array matches in both Fantom and Java, then verify the digest matches, and finally the Base64 encoding. It'll be a lot easier to verify each stage in your code.
Turns out it was just my own lack of knowledge, and with enough trial and error I was able to figure it out by doing the following:
//username: "jack"
//password: "pass"
//userSalt: "6s6Q5Rn0xZP0LPf89bNdv+65EmMUrTsey2fIhim/wKU="
//nonce: "3da210bdb1163d0d41d3c516314cbd6e"
//hmac: "IjJOApgvDoVDk9J6NiyWdktItl0="
//digest: "t/nzXF3n0zzH4JhXtihT8FC1N3s="
...
// initialize a Mac instance using a signing key from the password
SecretKeySpec signingKey = new SecretKeySpec(password.getBytes(), "HmacSHA1");
Mac mac = Mac.getInstance("HmacSHA1");
mac.init(signingKey);
// compute salted hmac
byte[] hmacByteArray = mac.doFinal((username + ':' + userSalt).getBytes());
String hmacString = new String(Base64.encodeBase64(hmacByteArray));
// hmacString == hmac
// now compute login digest using nonce
MessageDigest md = MessageDigest.getInstance("SHA-1");
md.update((hmacString + ':' + nonce).getBytes());
byte[] digestByteArray = md.digest();
String digestString = new String(Base64.encodeBase64(digestByteArray));
// digestString == digest
Used org.apache.commons.codec.binary.Base64 to encode the byte arrays.
