I am working on translating an API from Java into JavaScript (Node.js). The problem is that the signatures generated by the Java code are much shorter than the ones generated in JavaScript. The results of the getSignature function have different lengths, so whenever I generate a signature in JavaScript the server won't recognize it, but it does when the signature is generated in Java.
I have verified that the values passed to getSignatureKey are the same in both implementations, and the getSignature function uses the output of getSignatureKey to sign "SOME MESSAGE TO ENCRYPT", which is the request body in plain text (I verified that both have the same content and format).
Is there any reason why the outputs differ in length? Perhaps an encoding problem or something else I'm not seeing.
I am using the native crypto library in Node.js as follows:
const crypto = require('crypto');      // Node's built-in crypto module
const encoder = new TextEncoder();     // assumed; the original snippet references `encoder` without showing its definition
// SERVICE_NAME is a constant defined elsewhere in the module.

var getSignatureKey = function(key, api_key, dateStamp) {
    let kUser = HMACSHA256("CAS" + api_key, key);
    let kDate = HMACSHA256(dateStamp, kUser);
    let kService = HMACSHA256(SERVICE_NAME, kDate);
    let kSigning = HMACSHA256("cas_request", kService);
    return kSigning;
};

var getSignature = function(signature_key) {
    let signature_bytes = HMACSHA256("SOME MESSAGE TO ENCRYPT", signature_key);
    let signature = Buffer.from(signature_bytes).toString('base64');
    return signature;
};

var HMACSHA256 = function(message, secret) {
    let key_bytes = encoder.encode(secret);
    let message_bytes = encoder.encode(message);
    let hash = crypto.createHmac('sha256', key_bytes).update(message_bytes).digest();
    return Uint8Array.from(hash);
};
While in Java I have the following code:
public static byte[] getSignatureKey(String key, String apiKey, String dateStamp, String serviceName)
        throws Exception {
    byte[] kSecret = key.getBytes("UTF8");
    byte[] kUser = HmacSHA256("CAS" + apiKey, kSecret);
    byte[] kDate = HmacSHA256(dateStamp, kUser);
    byte[] kService = HmacSHA256(serviceName, kDate);
    byte[] kSigning = HmacSHA256("cas_request", kService);
    return kSigning;
}

public static String getSignature(byte[] signature_key) throws Exception {
    return Base64.encodeBase64String(HmacSHA256("SOME MESSAGE TO ENCRYPT", signature_key));
}

public static byte[] HmacSHA256(String data, byte[] key) throws Exception {
    String algorithm = "HmacSHA256";
    Mac mac = Mac.getInstance(algorithm);
    mac.init(new SecretKeySpec(key, algorithm));
    return mac.doFinal(data.getBytes("UTF8"));
}
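An HMAC-SHA256 digest is always 32 bytes, so its Base64 form should be 44 characters on both sides; if one side produces something longer, the divergence usually happens in how bytes are converted before or after the HMAC rather than in the HMAC itself. To localize the step where the two implementations drift apart, a debug sketch like the following can hex-dump each intermediate key in Java so it can be compared with the corresponding Node.js value (this is not part of the original API; it just reuses the HmacSHA256 helper shown above):

// Debug-only sketch: print each intermediate key as hex so the Java and Node.js
// values can be compared step by step. Assumes the HmacSHA256 helper above.
public static String toHex(byte[] bytes) {
    StringBuilder sb = new StringBuilder();
    for (byte b : bytes) sb.append(String.format("%02x", b));
    return sb.toString();
}

public static void dumpSignatureKeySteps(String key, String apiKey, String dateStamp, String serviceName)
        throws Exception {
    byte[] kSecret = key.getBytes("UTF8");
    byte[] kUser = HmacSHA256("CAS" + apiKey, kSecret);
    byte[] kDate = HmacSHA256(dateStamp, kUser);
    byte[] kService = HmacSHA256(serviceName, kDate);
    byte[] kSigning = HmacSHA256("cas_request", kService);
    System.out.println("kUser    = " + toHex(kUser));     // each digest should be 32 bytes / 64 hex chars
    System.out.println("kDate    = " + toHex(kDate));
    System.out.println("kService = " + toHex(kService));
    System.out.println("kSigning = " + toHex(kSigning));
}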
Related
I was given the following Java code:
private static String key = "0123456789ABCDEF0123456789ABCDEF"; // Not real key

public static String Encrypt(String text)
{
    byte[] encrypted, bytekey = hexStringToByteArray(key);
    SecretKeySpec sks = new SecretKeySpec(bytekey, "AES");
    try
    {
        Cipher cipher = Cipher.getInstance("AES");
        cipher.init(1, sks, cipher.getParameters()); // 1 == Cipher.ENCRYPT_MODE
        encrypted = cipher.doFinal(text.getBytes());
    }
    catch (Exception e)
    {
        System.out.println("Error using AES encryption with this Java instance");
        e.printStackTrace();
        System.exit(1);
        return null;
    }
    String encryptedText = byteArrayToHexString(encrypted);
    return encryptedText;
}
Passing Password123 into this returns 6836A38816248A0C7DD89400A997251A. I'm not looking for comments on the security of this. I'm aware. I didn't write it, I just need to duplicate it.
I have to create C# code that has the same functionality. I have tried many code snippets from all over SO and other web sites. None of them produce the same output when given a specific input.
I added some debug statements to the java code to get the following information about the algorithm:
sks.getAlgorithm(): AES (duh)
sks.getFormat(): RAW
cipher.getAlgorithm(): AES (again, duh)
cipher.getBlockSize(): 16
cipher.getParameters(): null
cipher.getIV(): null (I think this might be my primary issue)
Here is one of the C# methods I found that looked promising:
private const string key = "0123456789ABCDEF0123456789ABCDEF"; // Not real key
private static byte[] encryptionKey = new byte[16];

static void SetupKey()
{
    var secretKeyBytes = Encoding.UTF8.GetBytes(key);
    Array.Copy(secretKeyBytes, encryptionKey, Math.Min(encryptionKey.Length, secretKeyBytes.Length));
}

public static String Encrypt3(String secret)
{
    SetupKey();
    byte[] inputBytes = UTF8Encoding.UTF8.GetBytes(secret);
    using (MemoryStream ms = new MemoryStream())
    {
        using (AesManaged cryptor = new AesManaged())
        {
            cryptor.Mode = CipherMode.CBC;
            cryptor.Padding = PaddingMode.PKCS7;
            cryptor.KeySize = 128;
            cryptor.BlockSize = 128;
            using (CryptoStream cs = new CryptoStream(ms, cryptor.CreateEncryptor(encryptionKey, null), CryptoStreamMode.Write))
            {
                cs.Write(inputBytes, 0, inputBytes.Length);
            }
            byte[] encryptedContent = ms.ToArray();
            byte[] result = new byte[encryptedContent.Length];
            System.Buffer.BlockCopy(encryptedContent, 0, result, 0, encryptedContent.Length);
            return ByteArrayToHexString(result);
        }
    }
}
Every time I run this code I get a different result, even though I'm passing null into the initialization vector (IV) parameter of cryptor.CreateEncryptor(). Is the AesManaged object using an internal IV even though I told it to use null? If I try to set the IV to null explicitly, I get an error.
What do I need to do to get the C# code to consistently return the same result as the Java code?
NOTE: Both methods use HexStringToByteArray and ByteArrayToHexString. The original author of the Java code, for some reason, wrote his own byte/hex converters. I recreated them in C#, but they work just like the built-in functions.
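For what it's worth, with the default SunJCE provider Cipher.getInstance("AES") resolves to "AES/ECB/PKCS5Padding", and ECB with a fixed key is deterministic, which is why the Java method always returns the same hex string while a CBC cipher with a random IV cannot. A minimal, self-contained Java sketch (a hypothetical check with a placeholder key, not the original author's code) that illustrates the equivalence:

import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.util.Arrays;

public class EcbDefaultCheck {
    public static void main(String[] args) throws Exception {
        byte[] key = new byte[16]; // placeholder key, not the real one
        SecretKeySpec sks = new SecretKeySpec(key, "AES");
        byte[] plain = "Password123".getBytes("UTF-8");

        Cipher defaultAes = Cipher.getInstance("AES");                   // provider default mode/padding
        defaultAes.init(Cipher.ENCRYPT_MODE, sks);

        Cipher explicitEcb = Cipher.getInstance("AES/ECB/PKCS5Padding"); // spelled out explicitly
        explicitEcb.init(Cipher.ENCRYPT_MODE, sks);

        // Both should produce identical ciphertext for the same key and input,
        // and running this twice gives the same bytes (no IV is involved in ECB).
        System.out.println(Arrays.equals(defaultAes.doFinal(plain), explicitEcb.doFinal(plain)));
    }
}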
Good afternoon. I'm interested in whether anyone has tried to decrypt data in Java that was encrypted in Ruby.
I'm trying to encrypt a string in Ruby (where the Marshal module is used for serialization) and decrypt it in Java. If the Marshal module is used, can the data be read from other languages or not?
This is my test in Ruby:
let(:key) { "12345678901234567890123456789012" }
let(:str) { "Serhii" }

it "encrypt_content" do
  crypt = ActiveSupport::MessageEncryptor.new(key, cipher: 'aes-256-cbc')
  encrypted_content = crypt.encrypt_and_sign(str)
  encrypted_content
end
The relevant library methods are:
def encrypt_and_sign(value, expires_at: nil, expires_in: nil, purpose: nil)
  verifier.generate(_encrypt(value, expires_at: expires_at, expires_in: expires_in, purpose: purpose))
end

def _encrypt(value, **metadata_options)
  cipher = new_cipher
  cipher.encrypt
  cipher.key = @secret
  iv = cipher.random_iv
  cipher.auth_data = "" if aead_mode?

  encrypted_data = cipher.update(Messages::Metadata.wrap(@serializer.dump(value), metadata_options))
  encrypted_data << cipher.final

  blob = "#{::Base64.strict_encode64 encrypted_data}--#{::Base64.strict_encode64 iv}"
  blob = "#{blob}--#{::Base64.strict_encode64 cipher.auth_tag}" if aead_mode?
  blob
end
The Java decryption code is:
private static final String key = "12345678901234567890123456789012";

@SneakyThrows
public static String decrypt(String encrypted) {
    byte[] firstByte = Base64.getDecoder().decode(encrypted.replaceAll("\n", "").getBytes(StandardCharsets.UTF_8));
    String first = new String(firstByte);
    String[] parts = first.split("--");
    byte[] secondByte = Base64.getDecoder().decode(parts[0].getBytes(StandardCharsets.UTF_8));
    String second = new String(secondByte);
    String[] parts2 = second.split("--");
    byte[] encryptedData = Base64.getDecoder().decode(parts2[0].getBytes(StandardCharsets.UTF_8));

    SecretKeySpec aesKey = new SecretKeySpec(key.getBytes(StandardCharsets.UTF_8), "AES");
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.DECRYPT_MODE, aesKey, new IvParameterSpec(new byte[16]));
    byte[] result = cipher.doFinal(encryptedData);
    return new String(result);
}

public static void main(String[] args) throws Exception {
    String encrypted = "S3l0cVEybDRUM2sxU1hFMk5YVlhOa3A2VXpRNFEyZFFibTVwZVdRMVdEUlpN\n" +
            "bkkxUzBaUGNsbFJaejB0TFRWWlVtVkNVWEJXZWxselJuWkVhbFJyWlU5VmNr\n" +
            "RTlQUT09LS0yZDA5M2FhZTg0OTJjZmIyZjdiNDA0ZWVkNGU2ZmQ4NDQ1ZTM4\n" +
            "ZjIx";
    System.out.println("Decrypted: " + decrypt(encrypted));
}
}
Result �'��m�Qի���
What could be the reason?
The exact format of the output produced by Ruby is not specified (which I would consider a bug); you can find the format by reading the source code, especially this part:
blob = "#{::Base64.strict_encode64 encrypted_data}--#{::Base64.strict_encode64 iv}"
blob = "#{blob}--#{::Base64.strict_encode64 cipher.auth_tag}" if aead_mode?
Where the IV is a random IV, generated using Cipher::new of the openssl module.
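Building on that format, the inner decryption in Java might look like the sketch below (assumptions: the outer layer produced by verifier.generate, that is Base64(blob)--signature, has already been split off, verified and decoded, and the cipher is aes-256-cbc as in the question). The key point is to use the transmitted IV rather than new byte[16]. It uses the same imports as the decrypt method above and is a sketch, not a drop-in replacement:

// Decrypts a blob of the form base64(ciphertext)--base64(iv).
static byte[] decryptBlob(String blob, String key) throws Exception {
    String[] parts = blob.split("--");
    byte[] ciphertext = Base64.getDecoder().decode(parts[0]);
    byte[] iv = Base64.getDecoder().decode(parts[1]);

    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.DECRYPT_MODE,
            new SecretKeySpec(key.getBytes(StandardCharsets.UTF_8), "AES"),
            new IvParameterSpec(iv)); // the real IV, not new byte[16]
    // Note: with MessageEncryptor's default serializer the plaintext is a Ruby
    // Marshal dump, not a bare UTF-8 string, so it still needs Marshal-aware handling.
    return cipher.doFinal(ciphertext);
}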
I am using Java to do some POST requests with an AWS signature header. The signing process is below:
// secretAccessKey, currentDate, regionName, serviceName and aws4Request are fields defined elsewhere in the class.
private String calculateSignature(String stringToSign) {
    try {
        byte[] signatureKey = getSignatureKey(secretAccessKey, currentDate, regionName, serviceName);
        byte[] signature = HmacSHA256(stringToSign, signatureKey);
        String strHexSignature = bytesToHex(signature);
        return strHexSignature;
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return null;
}

private byte[] HmacSHA256(String data, byte[] key) throws Exception {
    String algorithm = "HmacSHA256";
    Mac mac = Mac.getInstance(algorithm);
    mac.init(new SecretKeySpec(key, algorithm));
    return mac.doFinal(data.getBytes("UTF8"));
}

private byte[] getSignatureKey(String key, String date, String regionName, String serviceName) throws Exception {
    byte[] kSecret = ("AWS4" + key).getBytes("UTF8");
    byte[] kDate = HmacSHA256(date, kSecret);
    byte[] kRegion = HmacSHA256(regionName, kDate);
    byte[] kService = HmacSHA256(serviceName, kRegion);
    byte[] kSigning = HmacSHA256(aws4Request, kService);
    return kSigning;
}

private String bytesToHex(byte[] bytes) {
    final StringBuilder hexString = new StringBuilder();
    for (byte b : bytes)
        hexString.append(String.format("%02x", b));
    return hexString.toString().toLowerCase().trim();
}
However, when I send this signature, it does not match the signature calculated by the server. The server is written in PHP and uses CryptoJS to calculate the signature. I have compared the strings to sign and the canonical requests between Java and PHP, and they match.
I have checked spaces, commas and other characters, but cannot figure out why. Are Java's HmacSHA256 and crypto-js's HmacSHA256 different? Or is something wrong with the bytesToHex method?
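One way to rule out differences between the two HMAC implementations is to run both over a fixed, hard-coded input and compare the hex output character by character. A sketch like the following on the Java side (purely a hypothetical sanity check reusing the HmacSHA256 and bytesToHex methods above), with the same two strings fed to CryptoJS.HmacSHA256 on the other side:

// Hypothetical sanity check: the constants below are arbitrary test values, not real credentials.
private void compareHmacWithCryptoJs() throws Exception {
    String testKey = "test-key";
    String testData = "test-data";
    byte[] digest = HmacSHA256(testData, testKey.getBytes("UTF8"));
    // Compare this value with CryptoJS.HmacSHA256("test-data", "test-key").toString()
    // on the other side; if they differ, the HMAC or hex step is the culprit,
    // otherwise the mismatch is in the string being signed.
    System.out.println("HMAC-SHA256 = " + bytesToHex(digest));
}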
The Java output of Canonical Request
POST
/
content-length:667
content-type:application/json
host:host.name.com
x-amz-date:20171205T012629Z
x-amz-target:_20141201.XXXXXXX
content-length;content-type;host;x-amz-date;x-amz-target
c5b31b699700e6debe4548836a723f89b73ffcef6570e1bed4c534c0f247dc26
The PHP returns:
POST
/
content-length:667
content-type:application/json
host:host.name.com
x-amz-date:20171205T012629Z
x-amz-target:_20141201.XXXXXXX
content-length;content-type;host;x-amz-date;x-amz-target
c5b31b699700e6debe4548836a723f89b73ffcef6570e1bed4c534c0f247dc26
Any help or any idea will be appreciated.
Found the issue. The reason was that the algorithm for the secret key had been changed, but I was not aware of that change.
I'm attempting with little success to port over Google's code to generate a secure token for their captcha (https://github.com/google/recaptcha-java/blob/master/appengine/src/main/java/com/google/recaptcha/STokenUtils.java):
The original utility has the following:
private static final String CIPHER_INSTANCE_NAME = "AES/ECB/PKCS5Padding";

private static String encryptAes(String input, String siteSecret) {
    try {
        SecretKeySpec secretKey = getKey(siteSecret);
        Cipher cipher = Cipher.getInstance(CIPHER_INSTANCE_NAME);
        cipher.init(Cipher.ENCRYPT_MODE, secretKey);
        return BaseEncoding.base64Url().omitPadding().encode(cipher.doFinal(input.getBytes("UTF-8")));
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}

private static SecretKeySpec getKey(String siteSecret) {
    try {
        byte[] key = siteSecret.getBytes("UTF-8");
        key = Arrays.copyOf(MessageDigest.getInstance("SHA").digest(key), 16);
        return new SecretKeySpec(key, "AES");
    } catch (NoSuchAlgorithmException | UnsupportedEncodingException e) {
        e.printStackTrace();
    }
    return null;
}

public static void main(String[] args) throws Exception {
    // Hard coded the following to get a repeatable result
    String siteSecret = "12345678";
    String jsonToken = "{'session_id':'abf52ca5-9d87-4061-b109-334abb7e637a','ts_ms':1445705791480}";
    System.out.println(" json token: " + jsonToken);
    System.out.println(" siteSecret: " + siteSecret);
    System.out.println(" Encrypted stoken: " + encryptAes(jsonToken, siteSecret));
}
Given the values I hardcoded, I get "Irez-rWkCEqnsiRLWfol0IXQu1JPs3qL_G_9HfUViMG9u4XhffHqAyju6SRvMhFS86czHX9s1tbzd6B15r1vmY6s5S8odXT-ZE9A-y1lHns" back as my encrypted token.
My Java and crypto skills are more than a little rusty, and there aren't always direct analogs in C#. I attempted to merge encryptAes() and getKey() with the following, which isn't correct:
public static string EncryptText(string PlainText, string siteSecret)
{
    using (RijndaelManaged aes = new RijndaelManaged())
    {
        aes.Mode = CipherMode.ECB;
        aes.Padding = PaddingMode.PKCS7;

        var bytes = Encoding.UTF8.GetBytes(siteSecret);
        SHA1 sha1 = SHA1.Create();
        var shaKey = sha1.ComputeHash(bytes);
        byte[] targetArray = new byte[16];
        Array.Copy(shaKey, targetArray, 16);
        aes.Key = targetArray;

        ICryptoTransform encrypto = aes.CreateEncryptor();
        byte[] plainTextByte = ASCIIEncoding.UTF8.GetBytes(PlainText);
        byte[] CipherText = encrypto.TransformFinalBlock(plainTextByte, 0, plainTextByte.Length);
        return HttpServerUtility.UrlTokenEncode(CipherText); // Equivalent to Java's BaseEncoding.base64Url()?
    }
}
The C# version produces the incorrect value of: Ye+fySvneVUZJXth67+Si/e8fBUV4Sxs7wEXVDEOJjBMHl1encvt65gGIj8CiFzBGp5uUgKYJZCuQ4rc964vZigjlrJ/430LgYcathLLd9U=
Your code almost works as expected. It's just that you somehow mixed up the outputs of the Java version (and possibly the C# version).
If I execute your Java code (JDK 7 & 8 with Guava 18.0), I get
Ye-fySvneVUZJXth67-Si_e8fBUV4Sxs7wEXVDEOJjBMHl1encvt65gGIj8CiFzBGp5uUgKYJZCuQ4rc964vZigjlrJ_430LgYcathLLd9U
and if I execute your C# code (DEMO), I get
Ye-fySvneVUZJXth67-Si_e8fBUV4Sxs7wEXVDEOJjBMHl1encvt65gGIj8CiFzBGp5uUgKYJZCuQ4rc964vZigjlrJ_430LgYcathLLd9U1
So, the C# version has an additional "1" at the end. It should be a padding character, but isn't. This means that HttpServerUtility.UrlTokenEncode() doesn't provide a standards-conformant URL-safe Base64 encoding and you shouldn't use it. See also this Q&A.
The URL-safe Base64 encoding can be easily derived from the normal Base64 encoding (compare tables 1 and 2 in RFC4648) as seen in this answer by Marc Gravell:
string returnValue = System.Convert.ToBase64String(toEncodeAsBytes)
.TrimEnd(padding).Replace('+', '-').Replace('/', '_');
with:
static readonly char[] padding = { '=' };
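For comparison, the Java side's Guava call BaseEncoding.base64Url().omitPadding() can also be reproduced with the plain JDK (Java 8+). A one-line sketch, where cipherBytes is a hypothetical name standing in for the raw ciphertext bytes:

// cipherBytes is a hypothetical byte[] holding the ciphertext
String urlSafe = java.util.Base64.getUrlEncoder().withoutPadding().encodeToString(cipherBytes);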
That's not all. If we take your Java output of
Ye+fySvneVUZJXth67+Si/e8fBUV4Sxs7wEXVDEOJjBMHl1encvt65gGIj8CiFzBGp5uUgKYJZCuQ4rc964vZigjlrJ/430LgYcathLLd9U=
and decrypt it, then we get the following token:
{"session_id":"4182e173-3a24-4c10-b76c-b85a36be1173","ts_ms":1445786965574}
which is different from the token that you have in your code:
{'session_id':'abf52ca5-9d87-4061-b109-334abb7e637a','ts_ms':1445705791480}
The main remaining problem is that you're using invalid JSON. Strings and keys in JSON need to be wrapped in double quotes (") and not single quotes (').
Which means that the encrypted token actually should have been (using a valid version of the token from your code):
D9rOP07fYgBfza5vbGsvdPe8fBUV4Sxs7wEXVDEOJjBMHl1encvt65gGIj8CiFzBsAWBDgtdSozv4jS_auBU-CgjlrJ_430LgYcathLLd9U
Here's a C# implementation that reproduces the same result as your Java code:
class Program
{
    public static byte[] GetKey(string siteSecret)
    {
        byte[] key = Encoding.UTF8.GetBytes(siteSecret);
        return SHA1.Create().ComputeHash(key).Take(16).ToArray();
    }

    public static string EncryptAes(string input, string siteSecret)
    {
        var key = GetKey(siteSecret);
        using (var aes = AesManaged.Create())
        {
            if (aes == null) return null;
            aes.Mode = CipherMode.ECB;
            aes.Padding = PaddingMode.PKCS7;
            aes.Key = key;

            byte[] inputBytes = Encoding.UTF8.GetBytes(input);
            var enc = aes.CreateEncryptor(key, new byte[16]);
            // use the length of the byte array, not of the input string
            return UrlSafeBase64(enc.TransformFinalBlock(inputBytes, 0, inputBytes.Length));
        }
    }

    // http://stackoverflow.com/a/26354677/162671
    public static string UrlSafeBase64(byte[] bytes)
    {
        return Convert.ToBase64String(bytes).TrimEnd('=')
            .Replace('+', '-')
            .Replace('/', '_');
    }

    static void Main(string[] args)
    {
        string siteSecret = "12345678";
        string jsonToken = "{'session_id':'abf52ca5-9d87-4061-b109-334abb7e637a','ts_ms':1445705791480}";
        Console.WriteLine(" json token: " + jsonToken);
        Console.WriteLine(" siteSecret: " + siteSecret);
        Console.WriteLine(EncryptAes(jsonToken, siteSecret));
        Console.ReadLine();
    }
}
I don't know why you said you're getting Irez-rWkCEqnsiRLWfol0IXQu1JPs3qL_G_9HfUViMG9u4XhffHqAyju6SRvMhFS86czHX9s1tbzd6B15r1vmY6s5S8odXT-ZE9A-y1lHns from the Java program because I'm not getting that output. The output I'm getting from both the C# version and the Java version is this:
Ye-fySvneVUZJXth67-Si_e8fBUV4Sxs7wEXVDEOJjBMHl1encvt65gGIj8CiFzBGp5uUgKYJZCuQ4rc964vZigjlrJ_430LgYcathLLd9U
As you can see here:
The code for both versions is available here
Live demo of the C# version.
The Java version was copy/pasted from your code and is using guava-18.0 and compiled with JDK8 x64 (I'm not a java expert so I'm just adding these in case it makes any difference).
I have encrypted a string using the algorithm RSA/ECB/PKCS1Padding in Java code; now the same needs to be encrypted using Node.js. I don't know how to encrypt with the RSA/ECB/PKCS1Padding algorithm in Node.js.
Any suggestions?
The Java code is:
public static String encrypt(String source, String publicKey)
        throws Exception {
    Key key = getPublicKey(publicKey);
    Cipher cipher = Cipher.getInstance("RSA/ECB/PKCS1Padding");
    cipher.init(Cipher.ENCRYPT_MODE, key);
    byte[] b = source.getBytes();
    byte[] b1 = cipher.doFinal(b);
    return new String(Base64.encodeBase64(b1), "UTF-8");
}
Node.js code using the crypto library:
const crypto = require('crypto')

const encryptWithPublicKey = function(toEncrypt) {
    var publicKey = '-----BEGIN PUBLIC KEY-----****' // your public key
    var buffer = Buffer.from(toEncrypt, 'utf8');
    var encrypted = crypto.publicEncrypt({key: publicKey, padding: crypto.constants.RSA_PKCS1_PADDING}, buffer)
    return encrypted.toString("base64");
}
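For the two sides to interoperate they also need to agree on the public key format. The getPublicKey helper used by the Java code is not shown above; a hypothetical Java sketch of what such a helper might look like, assuming the key is a Base64-encoded X.509/SubjectPublicKeyInfo blob (the body of a "-----BEGIN PUBLIC KEY-----" PEM block, i.e. the same key material the Node.js side uses):

// Hypothetical helper, not the original getPublicKey: builds an RSA public key from a
// Base64-encoded X.509/SubjectPublicKeyInfo string (a PEM "PUBLIC KEY" body without the
// header/footer lines). Requires java.security.Key, java.security.KeyFactory and
// java.security.spec.X509EncodedKeySpec.
public static Key getPublicKey(String base64PublicKey) throws Exception {
    byte[] der = java.util.Base64.getDecoder().decode(base64PublicKey);
    X509EncodedKeySpec spec = new X509EncodedKeySpec(der);
    return KeyFactory.getInstance("RSA").generatePublic(spec);
}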