Encrypt in Java, decrypt in Node.js

I need to encrypt in Java and decrypt with Node.js. The decryption result is corrupted.
Here is the Java code:
public String encrypt(SecretKey key, String message) throws Exception {
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.ENCRYPT_MODE, key);
    byte[] stringBytes = message.getBytes("UTF8");
    byte[] raw = cipher.doFinal(stringBytes);
    // convert to Base64 for easier display
    BASE64Encoder encoder = new BASE64Encoder();
    String base64 = encoder.encode(raw);
    return base64;
}
Here is the node.js code:
AEse3SCrypt.decrypt = function(cryptkey, encryptdata) {
    encryptdata = new Buffer(encryptdata, 'base64').toString('binary');
    var decipher = crypto.createDecipher('aes-128-cbc', cryptkey);
    decipher.setAutoPadding(false);
    var decoded = decipher.update(encryptdata);
    decoded += decipher.final();
    return decoded;
}
As a key I use: "[B@4ec6948c"
The Java encrypted result is: "dfGiiHZi8wYBnDetNhneBw=="
The Node.js result is garbage.
In Java I use "PKCS5Padding". What should be done in Node.js regarding padding? I call setAutoPadding(false); if I don't, I get a "decipher fail" error (only since Node.js version 0.8).
I tried removing the UTF-8 encoding on the Java side to match the Node.js side, but it didn't work.
Any idea what is wrong?

As a key I use: "[B@4ec6948c"
That sounds very much like you're just calling toString() on a byte array. That's not giving you the data within the byte array - it's just the default implementation of Object.toString(), called on a byte array.
Try using Arrays.toString(key) to print out the key.
If you were using that broken key value in the node.js code, it's no wonder it's giving you garbage back.
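To make the key usable from Node.js, print or transmit the actual key bytes rather than the array's toString(). A minimal sketch of the difference (assuming Java 8+ for java.util.Base64; the variable names are just for illustration):

import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.util.Arrays;
import java.util.Base64;

public static void main(String[] args) throws Exception {
    SecretKey key = KeyGenerator.getInstance("AES").generateKey();
    byte[] raw = key.getEncoded();
    System.out.println(raw);                                      // prints something like "[B@4ec6948c": Object.toString(), not the key
    System.out.println(Arrays.toString(raw));                     // the actual key bytes
    System.out.println(Base64.getEncoder().encodeToString(raw));  // a portable form you can hand to the Node.js side
}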
I tried removing the UTF-8 encoding on the Java side to match the Node.js side
That's absolutely the wrong approach. Instead, you should work out how to make the node.js code interpret the plaintext data as UTF-8 encoded text. Fundamentally, strings are character data and encryption acts on binary data - you need a way to bridge the gap, and UTF-8 is an entirely reasonable way of doing so. The initial result of the decryption in node.js should be binary data, which you then "decode" to text via UTF-8.
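To illustrate that bridge on the Java side (message, plaintextBytes, decryptedBytes and recovered are placeholder names; the Node.js side has to apply the same UTF-8 decode to the bytes it gets back from the decipher):

import java.nio.charset.StandardCharsets;

// text -> bytes before encrypting (what the question's code already does with "UTF8")
byte[] plaintextBytes = message.getBytes(StandardCharsets.UTF_8);
// bytes -> text after decrypting (the symmetric step the decrypting side must perform)
String recovered = new String(decryptedBytes, StandardCharsets.UTF_8);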
I'm afraid I don't know enough about the padding side to comment on that.

Related

How to decode the encodeBase64URLSafeString hash key

How to decode the encoded string using encodeBase64URLSafeString.
code:
KeyGenerator generator = KeyGenerator.getInstance("HmacSHA1");
SecretKey key = generator.generateKey();
String encodedKey = Base64.encodeBase64URLSafeString(key.getEncoded());
output:
E6S9fTfMfAgkNVodHLDmxz-2M_g90bpFztW6GB7-7VcGfyOkugESxjNx1CGf5taJDXz895uWAF5ubPnaqhe4nw
Can anyone please help me with how to decode the output string, which is in encrypted format?
I tried decrypting it with the jasypt library, but it throws a bad input error.
You have to decode it with the same library you used for encoding: org.apache.commons.codec.binary.Base64.
This is working for me
System.out.println(Arrays.toString(Base64.decodeBase64("E6S9fTfMfAgkNVodHLDmxz-2M_g90bpFztW6GB7-7VcGfyOkugESxjNx1CGf5taJDXz895uWAF5ubPnaqhe4nw")));
Keep in mind that decoding and decrypting are two different things. In your last step you encoded the key, so you have to decode it.
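For reference, a minimal encode/decode round trip with the same commons-codec class (a sketch assuming commons-codec is on the classpath; the variable names are illustrative):

import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.util.Arrays;
import org.apache.commons.codec.binary.Base64;

public static void main(String[] args) throws Exception {
    SecretKey key = KeyGenerator.getInstance("HmacSHA1").generateKey();
    byte[] original = key.getEncoded();
    String encoded = Base64.encodeBase64URLSafeString(original);  // encoding, not encryption
    byte[] decoded = Base64.decodeBase64(encoded);                // decoding recovers the raw key bytes
    System.out.println(Arrays.equals(original, decoded));         // prints true
}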

Trying to use PyCryptodome AES ECB decryption with offsets

As a bit of context: I am converting a Java file to Python and am on the last operation. I'm at about 200 LOC, so it makes it that much more edge-of-the-seat...
Anyways, in java the operation is:
Cipher cipher = Cipher.getInstance("AES/ecb/nopadding");
cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(keys[i], "AES"));
//doFinal(byte[] input, int inputOffset, int inputLen, byte[] output, int outputOffset)
cipher.doFinal(save, saveOffset, 16, save, saveOffset);
In python I have this:
from Crypto.Cipher import AES
cipher = AES.new(bytes(keys[i]), AES.MODE_ECB)
cipher.decrypt(?????)
This is taken from the package under decrypt():
:Parameters:
ciphertext : bytes/bytearray/memoryview
The piece of data to decrypt.
The length must be multiple of the cipher block length.
:Keywords:
output : bytearray/memoryview
The location where the plaintext must be written to.
If ``None``, the plaintext is returned.
:Return:
If ``output`` is ``None``, the plaintext is returned as ``bytes``.
Otherwise, ``None``.
As you can see .decrypt() doesn't really have an input for offsets, but I was wondering if there was some way around this?
This is why I decided to post on SO: would I be able to do the following?
temp_bytes = save[offset]
temp_decrypt = cipher.decrypt(temp_bytes)
save[offset] = temp_decrypt
Or, when decrypting, does it use the whole file as context, so I would get the wrong output? I would love to just try it, but the output is gibberish that I would have to write another program to parse (another Java-to-Python project).
What ended up working was:
save[saveOffset:saveOffset+16] = cipher.decrypt(save[saveOffset:saveOffset+16])
Did not expect it to work so easily
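This works because ECB has no chaining between blocks: each 16-byte block is decrypted independently, so decrypting a single slice gives the same bytes as the Java doFinal call with offsets. A small Java sketch of that block independence (the key and data are generated just for the demo):

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.security.SecureRandom;
import java.util.Arrays;

public static void main(String[] args) throws Exception {
    SecretKey key = KeyGenerator.getInstance("AES").generateKey();
    byte[] plain = new byte[32];                  // two 16-byte blocks
    new SecureRandom().nextBytes(plain);

    Cipher enc = Cipher.getInstance("AES/ECB/NoPadding");
    enc.init(Cipher.ENCRYPT_MODE, key);
    byte[] ct = enc.doFinal(plain);

    Cipher dec = Cipher.getInstance("AES/ECB/NoPadding");
    dec.init(Cipher.DECRYPT_MODE, key);
    byte[] copy = ct.clone();
    dec.doFinal(copy, 16, 16, copy, 16);          // decrypt only the second block, in place

    System.out.println(Arrays.equals(
        Arrays.copyOfRange(copy, 16, 32),
        Arrays.copyOfRange(plain, 16, 32)));      // prints true
}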

Charset bug in AES Decryption on Android system

Good evening!
In my Android app the phone loads an AES-encrypted string from my server and stores it in a variable. The variable and a key are then passed to a method which decrypts the string. My problem is that German umlauts (ä, ü, ö) are not decoded correctly. All umlauts are displayed as question marks with a black background...
My Code:
public static String decrypt(String input, String key) {
    byte[] output = null;
    String newString = "";
    try {
        SecretKeySpec skey = new SecretKeySpec(key.getBytes(), "AES");
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, skey);
        output = cipher.doFinal(Base64.decode(input, Base64.DEFAULT));
        newString = new String(output);
    } catch(Exception e) {}
    return newString;
}
The code works, except that the umlauts are not displayed correctly; for example, what should be "ö-ä-ü" appears as question marks.
How can I set the encoding of the decrypted string? In my iOS app I use ASCII encoding for the decrypted downloaded string, and that works perfectly. Android and iOS get the string from the same server in the same way, so I think the problem is in the local code above.
I hope you can help me with my problem... Thanks!
There is no text but encoded text.
It seems like you are guessing at the character set and encoding; that's no way to communicate.
To recover the text, you need to reverse the original process applied to it with the parameters associated with each step.
For explanation, assume that the server is taking text from a Java String and sending it to you securely.
1. String uses the Unicode character set (specifically, Unicode's UTF-16 encoding).
2. Get the bytes for the String, using some specific encoding, say ISO8859-1. (UTF-8 could be better because it is also an encoding for the Unicode character set, whereas ISO8859-1 has a lot fewer characters.) As @Andy points out, exceptions are your friends here.
3. Encrypt the bytes with a specific key. The key is a sequence of bytes, so, if you are generating this from a string, you have to use a specific encoding.
4. Encode the encrypted bytes with Base64, producing a Java String (again, UTF-16) with a subset of characters so reduced that it can be re-encoded in just about any character encoding and placed in just about any context such as SMTP, XML, or HTML without being misinterpreted or making it invalid.
5. Transmit the string using a specific encoding. An HTTP header and/or HTML charset value is usually used to communicate which encoding.
To receive the text, you have to get:
- the bytes,
- the encoding from step 5,
- the key from step 3,
- the encoding from step 3, and
- the encoding from step 2.
Then you can reverse all of the steps. Per your comments, you discovered you weren't using the encoding from step 2. You also need to use the encoding from step 3.
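As a concrete example, here is the question's decrypt method reworked with explicit encodings (a sketch that assumes the server built the key from a UTF-8 string and encrypted UTF-8 text; substitute whatever encodings the server actually used):

import android.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public static String decrypt(String input, String key) throws Exception {
    // encoding from step 3: how the key string becomes key bytes
    SecretKeySpec skey = new SecretKeySpec(key.getBytes("UTF-8"), "AES");
    Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
    cipher.init(Cipher.DECRYPT_MODE, skey);
    byte[] output = cipher.doFinal(Base64.decode(input, Base64.DEFAULT));
    // encoding from step 2: how the decrypted bytes become text, instead of the platform default
    return new String(output, "UTF-8");
}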

Why can I still decrypt parts of my data after flipping a bit in the ciphertext?

I am using the AES algorithm in my application to encrypt my data. My key is 256 bits. The encrypted token looks like this:
pRplOI4vTs41FICGeQ5mlWUoq5F3bcviHcTZ2hN
Now if I change one character of the token from upper case to lower case, say like this:
prplOI4vTs41FICGeQ5mlWUoq5F3bcviHcTZ2hN
Part of the token still decrypts correctly, along with some junk values. My concern is why any of the data is still visible when a bit has been changed. My code for encryption is as follows:
cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
Key secretKeySpecification = secretKeyData.getKey();
cipher.init(
    Cipher.ENCRYPT_MODE,
    secretKeySpecification,
    new IvParameterSpec(secretKeyData.getIV().getBytes("UTF-8")));
byte[] bytesdata = cipher.doFinal(data.getBytes());
String encodedData = new BASE64Encoder().encode(bytesdata);
My code for decryption is:
Key secretKeySpecification = decryptionKeyDetails.getKey();
cipher.init(Cipher.DECRYPT_MODE, secretKeySpecification,
    new IvParameterSpec(decryptionKeyDetails.getIV().getBytes("UTF-8")));
byte[] bytesdata;
byte[] tempStr = new BASE64Decoder().decodeBuffer(splitedData[0]);
bytesdata = cipher.doFinal(tempStr);
return new String(bytesdata);
In CBC mode (which you are using), flipping a bit in the ciphertext completely garbles only the block it occurs in and flips the corresponding bit in the next block; all other blocks still decrypt normally, which is why most of your plaintext is still visible. Ciphertext modes of operation have specific forms of error propagation. There is such a thing as Bi-IGE (Bi-directional Infinite Garble Extension), which does change the whole plaintext if any error is introduced. However, it requires more than one pass, and it still won't protect you from getting random data if a bit was changed.
In the end, listen to Oleg and Codes (and Wikipedia and even me) and add an authentication tag to your ciphertext. Validate the authentication tag (e.g. HMAC) before decryption. Don't forget to include other data in your protocol such as the IV, or you may have a plaintext for which the first block has been changed.
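As a sketch of the encrypt-then-MAC idea (the method names and the separate macKey are illustrative, not a drop-in for the code above): compute an HMAC over the IV and ciphertext on the sending side, and verify it before decrypting.

import javax.crypto.Mac;
import javax.crypto.SecretKey;
import java.security.MessageDigest;

// tag = HMAC(macKey, IV || ciphertext), computed by the sender after encrypting
static byte[] computeTag(SecretKey macKey, byte[] iv, byte[] ciphertext) throws Exception {
    Mac mac = Mac.getInstance("HmacSHA256");
    mac.init(macKey);
    mac.update(iv);                 // cover the IV too, so it cannot be swapped out
    return mac.doFinal(ciphertext);
}

// receiver side: verify before calling cipher.doFinal()
static void checkTag(SecretKey macKey, byte[] iv, byte[] ciphertext, byte[] receivedTag) throws Exception {
    if (!MessageDigest.isEqual(computeTag(macKey, iv, ciphertext), receivedTag)) {
        throw new SecurityException("Ciphertext or IV was modified - refusing to decrypt");
    }
}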

AES difference between iPhone (Objective-c) and Java

I have been tearing my hair out all day trying to solve this...
I have an Objective-C client running on the iPhone, connecting to a Java server. The iPhone is encrypting data using AES but I cannot decrypt it on the server. I am using a known passphrase and message (a single string), generating the byte array on the iPhone and a comparison byte array on the Java server using the same key and message, but the byte arrays are completely different (and hence the data can't be decrypted on the Java side).
The client is using the CommonCrypto library with the following settings...
Data is an NSData holding the word "message" using dataUsingEncoding:NSASCIIStringEncoding
Key is an NSData holding the phrase "1234567891123456" again using the encoding as above.
Algorithm is kCCAlgorithmAES128
Options is kCCOptionsPKCS7Padding (which I believe equates to ECB on the server?!)
The server is using the following code...
byte[] key = "1234567891123456".getBytes();
Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
SecretKeySpec k = new SecretKeySpec(key, "AES");
c.init(Cipher.ENCRYPT_MODE, k);
byte[] encryptedData = c.doFinal("message".getBytes());
BUT the data in encryptedData does not match what is being generated in the Objective-C code; the byte arrays are completely different.
Can anyone see anything obvious I am doing wrong? I think the settings are all the same... :(
UPDATE - As requested....
Ok so here goes....
iPhone client is encrypting the following string "message"
It uses the key "1234567891123456"
It uses an initialisation vector of "1010101010101010"
It is using AES128, with CBC mode (as far as I can tell) and options of kCCOptionsPKCS7Padding.
The result of the encryption (with base64 encoding) is UHIYllDFAXl81ZM7OZPAuA==
The server is encrypting the same string, with the same key and initialisation vector.
It is using the following Cipher.getInstance("AES/CBC/PKCS5Padding");
The result of the encryption (with base64 encoding) is ALBnFIHysLbvAxjvtNo9vQ==
Thanks.
UPDATE 2 - As requested...
Here is the iPhone code....
NSData *toencrypt = [@"message" dataUsingEncoding:NSASCIIStringEncoding];
NSData *pass = [@"1234567891123456" dataUsingEncoding:NSASCIIStringEncoding];
NSData *iv = [@"1010101010101010" dataUsingEncoding:NSASCIIStringEncoding];
CCCryptorStatus status = kCCSuccess;
NSData *encrypted = [toencrypt dataEncryptedUsingAlgorithm:kCCAlgorithmAES128 key:pass initializationVector:iv options:kCCOptionPKCS7Padding error:&status];
NSString *text = [NSString base64StringFromData:encrypted length:[encrypted length]];
The NSData category for encrypting comes from here...
http://github.com/AlanQuatermain/aqtoolkit/tree/master/CommonCrypto/
Incidentally, I have checked the byte arrays that are in toencrypt, pass and iv and they match those that are on the server.
This is not my area but it looks like on the client you have PKCS7 but on the server you have PKCS5.
I just ran across the exact same problem. I am using CommonCrypto on the iOS client, using settings:
NSData * encrypted = [data dataEncryptedUsingAlgorithm:kCCAlgorithmAES128 key:pass initializationVector:iv options:kCCOptionPKCS7Padding error:&status];
The server uses Cipher.getInstance("AES/CBC/PKCS5Padding"); with the same key and initialization vector as the client.
After banging my head against the wall for the last few hours, I finally followed Jason's advice and checked the dataEncryptedUsingAlgorithm routine and printed out keyData right after FixKeyLengths. It turns out my 128-bit key was extended to 192 bits with 0s added to the end. After fixing this, everything works properly. :)
Update: My answer was posted almost 2 years ago, and this issue seems to be fixed in the latest NSData+CommonCrypto code. Specifically, this was the part that caused the issue:
static void FixKeyLengths( CCAlgorithm algorithm, NSMutableData * keyData, NSMutableData * ivData )
{
    NSUInteger keyLength = [keyData length];
    switch ( algorithm )
    {
        case kCCAlgorithmAES128:
        {
            if ( keyLength <= 16 )
            {
                [keyData setLength: 16];
            }
            else if ( keyLength <= 24 )
            {
                [keyData setLength: 24];
            }
            else
            {
                [keyData setLength: 32];
            }
            break;
        }
The first check keyLength <= 16 wasn't there before.
If you are still experiencing problems now, it's probably something else.
What mode is the iPhone using with AES? You don't list anything, so perhaps that means it's using no chaining (ECB).
However, on the Java side, you are using CBC, but not specifying an initialization vector. That is definitely wrong. If you really are using CBC, you have to have the IV that was used during encryption. The IV is not secret; it can be sent along with the ciphertext.
If you are really using ECB, there is no IV, but your Java specifies the wrong mode.
Based on your samples, the server is doing it right, and the client is not.
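For reference, supplying the IV explicitly on the Java side might look like this (a sketch; the method name is made up, the key and IV strings are the ones from the question interpreted as ASCII bytes the way the client does, and it assumes Java 7+ for StandardCharsets):

import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

static byte[] encryptLikeTheClient() throws Exception {
    byte[] key = "1234567891123456".getBytes(StandardCharsets.US_ASCII);
    byte[] iv = "1010101010101010".getBytes(StandardCharsets.US_ASCII);
    Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
    // pass the IV explicitly instead of letting the provider pick one
    c.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
    return c.doFinal("message".getBytes(StandardCharsets.US_ASCII));
}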
Looking at the data, I would guess that the key is wrong. Please show us the iPhone code, especially the code to go from "1234567891123456" to your key.
I recently ran across this in another project. The problem was that the key was one byte too long to fit into the char buffer inside of the dataEncryptedUsingAlgorithm method.
The problem was that the getBytes method on the NSString was soft-failing. It would copy most of the string into the buffer, but then since the key was one byte too long, it would "mark" the operation as failed by setting the first char to NUL (char 0).
Step into that method in Xcode and see what your key char[16] buffer looks like. It may have this same problem, and have the contents { 0, '2', '3', '4', ... }.
