I'm trying to hash a message for a server-side PHP application (whose code I can't change) that signs messages with the HMAC-SHA1 algorithm. I'm writing my client code in Java.
The PHP code is as follows:
$utf8Str = mb_convert_encoding($strToSign, "UTF-8");
$hmac_sha1_str = base64_encode(hash_hmac("sha1", $utf8Str, KEY));
$signature = urlencode($hmac_sha1_str);
My Java code is:
private static String HashStringSign(String toHash) {
    try {
        String afterUTF = new String(toHash.getBytes(), "UTF-8");
        String res = hmac_sha1(afterUTF, SecretAccessKey);
        String signature = new String(Base64.encode(res.getBytes()));
        String result = URLEncoder.encode(signature);
        return result;
    } catch (UnsupportedEncodingException e) {
        throw new RuntimeException(e);
    }
}

private static String hmac_sha1(String value, String key) {
    try {
        // Get an hmac_sha1 key from the raw key bytes
        byte[] keyBytes = key.getBytes();
        SecretKeySpec signingKey = new SecretKeySpec(keyBytes, "HmacSHA1");
        // Get an hmac_sha1 Mac instance and initialize with the signing key
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(signingKey);
        // Compute the hmac on input data bytes
        byte[] rawHmac = mac.doFinal(value.getBytes());
        // Convert raw bytes to Hex
        byte[] hexBytes = new Hex().encode(rawHmac);
        return new String(hexBytes, "UTF-8");
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
I followed each of the hashing steps used in the PHP code, in the same order, as you can see. Maybe one of the Java functions works differently from its PHP counterpart?
I'm using com.sun.org.apache.xml.internal.security.utils.Base64, java.net.URLEncoder, javax.crypto, and org.apache.commons.codec.binary.
Thanks!
In your hash_hmac call you need to set the 4th parameter to true (so that it returns raw binary output instead of hex).
If the PHP code can't change, Java only:
Since you say you can't touch the PHP side, you can do the following in your Java code.
In the last step of your Java code, you convert the raw HMAC bytes to hexadecimal. However, PHP produces a base64-encoded hexadecimal string rather than plain hexadecimal, because hash_hmac returns hex by default and that hex string is then passed through base64_encode.
So at the end of your Java steps, simply base64-encode your hexadecimal string and you will get the same value. https://stackoverflow.com/questions/9845767/base64-encoder-java#
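If it helps, here is a minimal sketch of that adjusted pipeline in one place, assuming commons-codec's Hex and Base64 (the question already mentions org.apache.commons.codec.binary) and UTF-8 throughout; the method and variable names are only illustrative:

import java.net.URLEncoder;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.binary.Hex;

private static String signLikePhp(String toSign, String secretKey) throws Exception {
    Mac mac = Mac.getInstance("HmacSHA1");
    mac.init(new SecretKeySpec(secretKey.getBytes("UTF-8"), "HmacSHA1"));
    byte[] rawHmac = mac.doFinal(toSign.getBytes("UTF-8"));
    // hash_hmac(..., false) returns lowercase hex, so hex-encode first...
    String hex = Hex.encodeHexString(rawHmac);
    // ...then base64 the hex string, which is what PHP's base64_encode() sees
    String base64OfHex = Base64.encodeBase64String(hex.getBytes("UTF-8"));
    // finally mirror PHP's urlencode()
    return URLEncoder.encode(base64OfHex, "UTF-8");
}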
Related
I am trying to take some working Python code and convert it to Java for my own use. The Python code below produces the correct signature. The Java code, using the same key and salt, produces something different, and I am at a loss as to why. In the Java code I am using the key generated in Python (_key) to create the signature.
What I don't understand is this: if I print the value of _key in Python I get "34ee7983-5ee6-4147-aa86-443ea062abf774493d6a-2a15-43fe-aace-e78566927585". Now if I take that value and place it directly into the hmac.new() call, I get a different result than if I just leave the _key variable. I assume this has something to do with encoding of some kind, but I am at a loss.
_s1 = base64.b64decode('VzeC4H4h+T2f0VI180nVX8x+Mb5HiTtGnKgH52Otj8ZCGDz9jRW'
                       'yHb6QXK0JskSiOgzQfwTY5xgLLSdUSreaLVMsVVWfxfa8Rw==')
_s2 = base64.b64decode('ZAPnhUkYwQ6y5DdQxWThbvhJHN8msQ1rqJw0ggKdufQjelrKuiG'
                       'GJI30aswkgCWTDyHkTGK9ynlqTkJ5L4CiGGUabGeo8M6JTQ==')
# bitwise xor of _s1 and _s2 bytes, converted to string
_key = ''.join([chr(ord(c1) ^ ord(c2)) for (c1, c2) in zip(_s1, _s2)])

@classmethod
def get_signature(cls, song_id, salt=None):
    """Return a (sig, salt) pair for url signing."""
    if salt is None:
        salt = str(int(time.time() * 1000))
    mac = hmac.new(cls._key, song_id, sha1)
    mac.update(salt)
    sig = base64.urlsafe_b64encode(mac.digest())[:-1]
    return sig, salt
This is my Java code. I think ultimately my issue is how I am handling or encoding the AA_KEY but I cannot figure it out.
private static final String AA_KEY = "34ee7983-5ee6-4147-aa86-443ea062abf774493d6a-2a15-43fe-aace-e78566927585";
public void someFunc(String songId) {
    salt = "1431875768596";
    String sig = hmacSha1(songId + salt, AA_KEY);
    sig = StringUtils.replaceChars(sig, "+/=", "-_.");
}
static String hmacSha1(String value, String key) {
    try {
        // Get an hmac_sha1 key from the raw key bytes
        byte[] keyBytes = key.getBytes();
        SecretKeySpec signingKey = new SecretKeySpec(keyBytes, "HmacSHA1");
        // Get an hmac_sha1 Mac instance and initialize with the signing key
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(signingKey);
        // Compute the hmac on input data bytes
        byte[] rawHmac = mac.doFinal(value.getBytes());
        return Base64.encodeBytes(rawHmac);
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
I found a couple of similar questions, but sadly they didn't help me figure it out. Thanks!
Python HMAC-SHA256 signature differs from PHP signature
Python HMAC-SHA1 vs Java HMAC-SHA1 different results
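For reference, one way to sidestep the pasted-key question entirely is to reproduce the key derivation itself in Java, mirroring the Python above. This is an untested sketch that assumes commons-codec's Base64; the method names are made up for illustration:

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import org.apache.commons.codec.binary.Base64;

static byte[] deriveKey() {
    byte[] s1 = Base64.decodeBase64("VzeC4H4h+T2f0VI180nVX8x+Mb5HiTtGnKgH52Otj8ZCGDz9jRW"
            + "yHb6QXK0JskSiOgzQfwTY5xgLLSdUSreaLVMsVVWfxfa8Rw==");
    byte[] s2 = Base64.decodeBase64("ZAPnhUkYwQ6y5DdQxWThbvhJHN8msQ1rqJw0ggKdufQjelrKuiG"
            + "GJI30aswkgCWTDyHkTGK9ynlqTkJ5L4CiGGUabGeo8M6JTQ==");
    byte[] key = new byte[Math.min(s1.length, s2.length)];
    for (int i = 0; i < key.length; i++) {
        key[i] = (byte) (s1[i] ^ s2[i]); // same XOR as the Python zip/ord loop
    }
    return key;
}

static String sign(String songId, String salt) throws Exception {
    Mac mac = Mac.getInstance("HmacSHA1");
    mac.init(new SecretKeySpec(deriveKey(), "HmacSHA1"));
    mac.update(songId.getBytes("UTF-8"));
    mac.update(salt.getBytes("UTF-8"));
    // urlsafe_b64encode maps +/ to -_ and the Python code strips the trailing '='
    return Base64.encodeBase64URLSafeString(mac.doFinal());
}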
I am porting part of an iOS app to Android, and I'm having trouble porting the following signature generating code in iOS to Android. The iOS code is:
+ (NSString *)hashedBase64ValueOfData:(NSString *)data WithSecretKey:(NSString *)secret {
    // ASCII conversion
    const char *cKey = [secret cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
    // HMAC data structure initialization
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    // Generating hashed value
    NSData *da = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    return [da base64EncodedString]; // conversion to base64 string & return
}
The Android Java code I have written and tried is:
private static String hashedBase64ValueOfDataWithSecretKey(String data, String secret) {
    try {
        SecretKeySpec signingKey = new SecretKeySpec(secret.getBytes(), HMAC_SHA1_ALGORITHM);
        Mac mac = Mac.getInstance(HMAC_SHA1_ALGORITHM);
        mac.init(signingKey);
        byte[] rawHmac = mac.doFinal(data.getBytes());
        return Base64.encodeToString(rawHmac, 0);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}
Upon testing, the Android function is not outputting the same thing as the iOS function (given the same input), and I'm not sure why.
Not an expert at this, but NSASCIIStringEncoding implies that you want data and secret interpreted as ASCII, whereas String.getBytes() with no argument uses the platform's default character set (UTF-8 on Android).
You probably need to use a different charset:
data.getBytes(StandardCharsets.US_ASCII);
secret.getBytes(StandardCharsets.US_ASCII);
For Java pre-1.7, you'll need to use this and catch the UnsupportedEncodingException:
data.getBytes("US-ASCII");
secret.getBytes("US-ASCII");
You might use the external org.apache.commons.codec.binary.Base64 class; Google it to find it, then you can follow the code below. I assume the hashed value is generated with a "private key" and appended after a "public key", and both are sent to the server together in an HTTP header. If not, you can just remove that part. Anyway, the code might give you some ideas. :)
private String getAppendedHeader(String str) {
    try {
        String hash = getHash(str);
        String signature = new String(Base64.encodeBase64(hash.getBytes()));
        StringBuilder sb = new StringBuilder();
        sb.append(PUBLIC_KEY).append(' ').append(signature);
        return sb.toString();
    } catch (NoSuchAlgorithmException _e) {
        LL.e("Get mac error: " + _e.getMessage());
        return null;
    } catch (InvalidKeyException _e) {
        LL.e("Init mac error: " + _e.getMessage());
        return null;
    }
}
private String getHash(String str) throws NoSuchAlgorithmException, InvalidKeyException {
    Mac mac = Mac.getInstance("HmacSHA256");
    SecretKeySpec secret = new SecretKeySpec(PRIVATE_KEY.getBytes(), "HmacSHA256");
    mac.init(secret);
    byte[] digest = mac.doFinal(str.getBytes());
    BigInteger hash = new BigInteger(1, digest);
    String hmac = hash.toString(16);
    if (hmac.length() % 2 != 0) {
        hmac = "0" + hmac;
    }
    return hmac;
}
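One caveat with the getHash() above, for what it's worth: BigInteger.toString(16) drops leading zero bytes from the digest (the odd-length check only restores a single nibble), so roughly 1 in 256 hashes would come out short. A small sketch of the same step using commons-codec's Hex (the same library as the Base64 class above) avoids that:

import org.apache.commons.codec.binary.Hex;

private String getHash(String str) throws NoSuchAlgorithmException, InvalidKeyException {
    Mac mac = Mac.getInstance("HmacSHA256");
    SecretKeySpec secret = new SecretKeySpec(PRIVATE_KEY.getBytes(), "HmacSHA256");
    mac.init(secret);
    byte[] digest = mac.doFinal(str.getBytes());
    // Fixed-width hex: always 64 characters for an HMAC-SHA256 digest
    return Hex.encodeHexString(digest);
}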
I have an application developed on BlackBerry JDE 5.0.0 that encrypts a string using the DES algorithm in ECB mode. After encryption, the result is encoded with Base64. But whenever I compare the result from my encryption method with the result from an online encryptor engine, the last few characters always differ. When I try to decrypt my method's result with the online engine, it looks like the result is not valid. So how can I fix this difference in the last few characters?
Here is my encryption method code:
public String encryptDESECB(String text) throws MessageTooLongException {
    byte[] input = text.getBytes();
    byte[] output = new byte[8];
    byte[] uid = null;
    uid = "431654625bd37673e3b00359676154074a04666a".getBytes();
    DESKey key = new DESKey(uid);
    try {
        DESEncryptorEngine engine = new DESEncryptorEngine(key);
        engine.encrypt(input, 0, output, 0);
        String x = BasicAuth.encode(new String(output));
        System.out.println("AFTER ENCODE" + x);
        return new String(x);
    } catch (CryptoTokenException e) {
        return "NULL";
    } catch (CryptoUnsupportedOperationException e) {
        return "NULL";
    }
}
The string that I want to encrypt is "00123456".
The result from my encryption method is: YnF2BWFV/8w=
The result from the online encryptor engine (http://www.tools4noobs.com/online_tools/encrypt/) is: YnF2BWFV9sw=
The result from Android (with the same encryption algorithm and method) is: YnF2BWFV9sw=
Here's the code on Android:
public static String encryptDesECB(String data) {
    try {
        DESKeySpec keySpec = new DESKeySpec("431654625bd37673e3b00359676154074a04666a".getBytes("UTF8"));
        SecretKeyFactory keyFactory = SecretKeyFactory.getInstance("DES");
        SecretKey key = keyFactory.generateSecret(keySpec);
        // ENCODE plainTextPassword String
        byte[] cleartext = data.getBytes("UTF8");
        Cipher cipher = Cipher.getInstance("DES/ECB/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key);
        Logger.log(Log.INFO, new String(cipher.doFinal(cleartext)));
        String encrypedPwd = Base64.encodeToString(cipher.doFinal(cleartext), Base64.DEFAULT);
        Logger.log(Log.INFO, encrypedPwd);
        return encrypedPwd;
    } catch (Exception e) {
        Logger.log(e);
        return null;
    }
}
Can anyone help me with this?
This is most likely caused by padding, as DES works with 8-byte blocks.
For more information check out this link:
http://www.tero.co.uk/des/explain.php#Padding
As long as you can properly decrypt the content you'll be fine.
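To illustrate the 8-byte-block point, here is a small standalone example using the standard JCE classes (not the BlackBerry API, so it is only an illustration of the block/padding behaviour): padding changes the ciphertext length even though "00123456" is exactly one block, and DESKeySpec uses only the first 8 bytes of the long key string.

import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.DESKeySpec;

public class DesPaddingDemo {
    public static void main(String[] args) throws Exception {
        // DESKeySpec only uses the first 8 bytes of this key material
        byte[] keyMaterial = "431654625bd37673e3b00359676154074a04666a".getBytes("UTF-8");
        SecretKey key = SecretKeyFactory.getInstance("DES").generateSecret(new DESKeySpec(keyMaterial));
        byte[] plaintext = "00123456".getBytes("UTF-8"); // exactly one 8-byte DES block

        Cipher noPad = Cipher.getInstance("DES/ECB/NoPadding");
        noPad.init(Cipher.ENCRYPT_MODE, key);
        System.out.println("NoPadding:    " + noPad.doFinal(plaintext).length + " bytes"); // 8

        Cipher pkcs5 = Cipher.getInstance("DES/ECB/PKCS5Padding");
        pkcs5.init(Cipher.ENCRYPT_MODE, key);
        System.out.println("PKCS5Padding: " + pkcs5.doFinal(plaintext).length + " bytes"); // 16
    }
}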
I found my mistake. It turns out my BasicAuth class isn't the correct one for encoding the encrypted string. Now I'm using the correct Base64 class for the encoding, and it turns out fine.
I am trying to interface with a TransUnion web service and I need to provide a HMAC-SHA1 signature to access it.
This example is in the TransUnion documentation:
Input of SampleIntegrationOwner2008‐11‐18T19:14:40.293Z with security
key xBy/2CLudnBJOxOtDhDRnsDYq9HTuDVr2uCs3FMzoxXEA/Od9tOuwSC70+mIfpjeG68ZGm/PrxFf/s/CzwxF4Q==
creates output of /UhwvT/kY9HxiXaOjpIc/BarBkc=.
Given that data and key, I cannot get this same result in Java. I have tried several online calculators, and none of them return this result either. Is the example in their documentation incorrect, or am I just not handling these strings correctly?
Here is the code I am currently working with:
public static String calcShaHash(String data, String key) {
    String HMAC_SHA1_ALGORITHM = "HmacSHA1";
    String result = null;
    try {
        Key signingKey = new SecretKeySpec(key.getBytes(), HMAC_SHA1_ALGORITHM);
        Mac mac = Mac.getInstance(HMAC_SHA1_ALGORITHM);
        mac.init(signingKey);
        byte[] rawHmac = mac.doFinal(data.getBytes());
        result = Base64.encodeBase64String(rawHmac);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return result;
}
Here is my unit test code:
@Test
public void testCalcShaHash() {
    String data = "SampleIntegrationOwner2008-11-18T19:14:40.293Z";
    String key = "xBy/2CLudnBJOxOtDhDRnsDYq9HTuDVr2uCs3FMzoxXEA/Od9tOuwSC70+mIfpjeG68ZGm/PrxFf/s/CzwxF4Q==";
    String result = Utils.calcShaHash(data, key);
    assertEquals(result, "/UhwvT/kY9HxiXaOjpIc/BarBkc=");
}
That looks like a Base64-encoded key, so I think you're going to need to Base64-decode it first, then pass the raw key bytes to the HMAC. Something like this (just for illustration, I haven't tested it; any errors are an exercise for the reader):
public String getHmacSha1(String privateKey, String input) throws Exception {
    String algorithm = "HmacSHA1";
    // Treat the key as Base64 text and recover the raw key bytes first
    byte[] keyBytes = Base64.decodeBase64(privateKey);
    Key key = new SecretKeySpec(keyBytes, 0, keyBytes.length, algorithm);
    Mac mac = Mac.getInstance(algorithm);
    mac.init(key);
    return Base64.encodeBase64String(mac.doFinal(input.getBytes()));
}
One thing I noticed is that the hyphens are not normal hyphens. If you copy and paste them, they are not in the ASCII character set. All I can say for sure is that the hash length appears correct. The funny thing is, I couldn't get your code to produce the correct answer, even after putting the correct hyphens in. But no matter. It solved the problem. Huzzah!
I have a sample application which generates a SHA1 hash in PHP as follows.
base64_encode(pack('H*', sha1($pass)));
I tried to achieve the same in Java, but so far, the output is different. The approach I used is as follows (Base64 and Hex classes come from commons-codec library).
byte[] rawSHA = null;
byte[] base64HexSHA = null;
String hex = null;
MessageDigest md = null;
// Get Message Digest instance.
try {
    md = MessageDigest.getInstance(SHA1_ALGORITHM);
} catch (NoSuchAlgorithmException e) {
    LOG.error("Unable to load SHA-1 Message Digest : " + e.getMessage(), e);
    throw new IllegalStateException("SHA-1 Message Digest Instance Not Found");
}
// Build SHA1 Hash
rawSHA = md.digest(rawText.getBytes("UTF-8"));
// Convert to HEX
hex = new String(Hex.encodeHex(rawSHA));
// Encode to Base 64
base64HexSHA = Base64.encodeBase64(hex.getBytes("UTF-8"));
// Return String
return new String(base64HexSHA);
My question is: would the approach I have taken yield the same output as PHP's pack() function? My guess is that the PHP pack() function returns the raw bytes, whereas Hex.encodeHex returns the hex string form (ref: http://www.w3schools.com/php/func_misc_pack.asp).
How can I achieve the same output as PHP's pack() function in Java (or the full output of the above PHP code)?
Conversion to hex is not required; just use this:
base64HexSHA = Base64.encodeBase64(rawSHA);
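In full, a minimal equivalent of the PHP one-liner would be something like the following sketch (commons-codec Base64 assumed): since PHP's sha1() returns hex and pack('H*', ...) turns that hex back into raw bytes, the Java side can simply Base64-encode the raw digest.

import java.security.MessageDigest;
import org.apache.commons.codec.binary.Base64;

// Equivalent of base64_encode(pack('H*', sha1($pass)))
static String sha1Base64(String rawText) throws Exception {
    MessageDigest md = MessageDigest.getInstance("SHA-1");
    byte[] rawSHA = md.digest(rawText.getBytes("UTF-8"));
    return Base64.encodeBase64String(rawSHA);
}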