RSA - BigInteger Issue - Java

I want to use the RSA algorithm to encrypt and decrypt messages. Since RSA encrypts and decrypts BigInteger (or integer) values, I need to represent the message as a BigInteger. But a message can contain strings like "ABC 123". What can I do? Any help or suggestions?

If your message is initially ASCII, you can use something like:
BigInteger i = BigInteger.ZERO;
for (int j = 0; j < msg.length(); j++) {
    i = i.or(BigInteger.valueOf(msg.charAt(j) & 0x7F).shiftLeft(j * 7));  // pack 7-bit ASCII chars
}
For working code consult the actual Javadocs. But basically you just want to turn your bytes or chars into a number, so the idea is that you just concatenate the bits together.

It can be done by using:
byte[] b = message.getBytes();
BigInteger m = new BigInteger(b);
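Going the other way is just the reverse; a minimal sketch (assuming the first byte of the message is non-zero and below 0x80, so the signed two's-complement round trip is lossless; new BigInteger(1, b) is the sign-safe variant, as a later answer below explains):
byte[] b = message.getBytes();
BigInteger m = new BigInteger(b);
String recovered = new String(m.toByteArray());  // equals message under the above assumption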

Related

Ed25519 in JDK 15, Parse public key from byte array and verify

Since Ed25519 has not been around for long (in the JDK), there are very few resources on how to use it.
While their example is very neat and useful, I have some trouble understanding what I am doing wrong regarding key parsing.
The public key is being read from a packet sent by an iDevice.
(Let's just say it's an array of bytes.)
From searching and trying my best to understand how the keys are encoded, I stumbled upon this message:
4. The public key A is the encoding of the point [s]B. First, encode the y-coordinate (in the range 0 <= y < p) as a little-endian string of 32 octets. The most significant bit of the final octet is always zero. To form the encoding of the point [s]B, copy the least significant bit of the x coordinate to the most significant bit of the final octet. The result is the public key.
That means that if I want to get y and isXOdd, I have to do some work (if I understood correctly).
Below is the code for it, yet the verification still fails.
I think I did it correctly by reversing the array to get it back into big-endian for BigInteger to use.
My questions are:
Is this the correct way to parse the public key from a byte array?
If it is, what could possibly be the reason for it to fail the verification process?
// devicePublicKey: ByteArray
val lastIndex = devicePublicKey.lastIndex
val lastByte = devicePublicKey[lastIndex]
val lastByteAsInt = lastByte.toInt()
val isXOdd = lastByteAsInt.and(255).shr(7) == 1
devicePublicKey[lastIndex] = (lastByteAsInt and 127).toByte()
// asBigInteger is a custom extension, presumably BigInteger(1, this)
val y = devicePublicKey.reversedArray().asBigInteger
val keyFactory = KeyFactory.getInstance("Ed25519")
val nameSpec = NamedParameterSpec.ED25519
val point = EdECPoint(isXOdd, y)
val keySpec = EdECPublicKeySpec(nameSpec, point)
val key = keyFactory.generatePublic(keySpec)
Signature.getInstance("Ed25519").apply {
    initVerify(key)
    update(deviceInfo)
    println(verify(deviceSignature))
}
And the data (before manipulation) (all in HEX):
Device identifier: 34444432393531392d463432322d343237442d414436302d444644393737354244443533
Device public key: e0a611c84db0ae91abfe2e6db91b6a457a4b41f9d8e09afdc7207ce3e4942e94
Device signature: a0383afb3bcbd43d08b04274a9214036f16195dc890c07a81aa06e964668955b29c5026d73d8ddefb12160529eeb66f843be4a925b804b575e6a259871259907
Device info: a86a71d42874b36e81a0acc65df0f2a84551b263b80b61d2f70929cd737176a434444432393531392d463432322d343237442d414436302d444644393737354244443533e0a611c84db0ae91abfe2e6db91b6a457a4b41f9d8e09afdc7207ce3e4942e94
// Device info is simply concatenated [hkdf, identifier, public key]
And the public key after the manipulation:
e0a611c84db0ae91abfe2e6db91b6a457a4b41f9d8e09afdc7207ce3e4942e14
Thank you very much, and every bit of help is greatly appreciated.
This will help many more who stumble upon this problem at a later point, when the Ed25519 implementation is no longer so fresh.
Helped me a lot. Would never have figured it out without your example.
I did it in Java.
public static PublicKey getPublicKey(byte[] pk)
        throws NoSuchAlgorithmException, InvalidKeySpecException, InvalidParameterSpecException {
    // pk is already converted from a hex string to a byte array.
    KeyFactory kf = KeyFactory.getInstance("Ed25519");
    // Determine whether x is odd: check the most significant bit of the last byte.
    boolean xisodd = false;
    int lastbyteInt = pk[pk.length - 1];
    if ((lastbyteInt & 255) >> 7 == 1) {
        xisodd = true;
    }
    // Make sure the most significant bit will be 0 - after reversing.
    pk[pk.length - 1] &= 127;
    // The key is little-endian, but BigInteger expects big-endian, so reverse the bytes.
    pk = ReverseBytes(pk);
    BigInteger y = new BigInteger(1, pk);
    NamedParameterSpec paramSpec = new NamedParameterSpec("Ed25519");
    EdECPoint ep = new EdECPoint(xisodd, y);
    EdECPublicKeySpec pubSpec = new EdECPublicKeySpec(paramSpec, ep);
    PublicKey pub = kf.generatePublic(pubSpec);
    return pub;
}
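ReverseBytes isn't part of the JDK; a minimal version (my sketch, not from the original post) could be:
private static byte[] ReverseBytes(byte[] in) {
    byte[] out = new byte[in.length];
    for (int i = 0; i < in.length; i++) {
        out[i] = in[in.length - 1 - i];  // mirror the byte order
    }
    return out;
}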
Actually, the whole encoding and decoding is correct.
In the end, the one problem was that I (by mistake) reversed the array one time too many.
The reversal is needed because these keys are encoded in little-endian, while to represent the value as a BigInteger on the JVM you have to reverse the bytes so they become big-endian.
Hopefully this helps everyone in the future who gets stuck on a similar problem.
If there are any questions, simply comment here or send me a message.
I'll do my best to help you out.
The code here is way more than you probably need. I investigated this and came up with what I think is equivalent, only much simpler. Anyhow, here is the blog piece: https://www.tbray.org/ongoing/When/202x/2021/04/19/PKI-Detective and here is the Java code: https://github.com/timbray/blueskidjava
You can check how it is done in the OpenJDK implementation:
https://github.com/openjdk/jdk15/blob/master/src/jdk.crypto.ec/share/classes/sun/security/ec/ed/EdDSAPublicKeyImpl.java#L65
Basically encodedPoint is your byte array (just the plain bytes, without ASN.1 encoding).

Java Mac HMAC vs C++ OpenSSL HMAC

This is going to be a long question, but I have a really weird bug. I use OpenSSL in C++ to compute an HMAC and compare it to a similar implementation using javax.crypto.Mac. For some keys the HMAC calculation is correct, and for others there is a difference in the HMAC. I believe the problem occurs when the keys get too big. Here are the details.
Here is the most important code for C++:
void computeHMAC(std::string message, std::string key){
    unsigned int digestLength = 20;
    HMAC_CTX hmac_ctx_;
    BIGNUM* key_ = BN_new();
    BN_hex2bn(&key_, key.c_str());
    unsigned char convertedKey[BN_num_bytes(key_)];
    HMAC_CTX_init(&hmac_ctx_);
    int length = BN_bn2bin(key_, convertedKey);
    HMAC_Init_ex(&hmac_ctx_, convertedKey, length, EVP_sha1(), NULL);
    /* Calc HMAC */
    std::transform(message.begin(), message.end(), message.begin(), ::tolower);
    unsigned char digest[digestLength];
    HMAC_Update(&hmac_ctx_, reinterpret_cast<const unsigned char*>(message.c_str()),
                message.length());
    HMAC_Final(&hmac_ctx_, digest, &digestLength);
    char mdString[41];  // 40 hex chars plus terminating NUL
    for(unsigned int i = 0; i < 20; ++i){
        sprintf(&mdString[i*2], "%02x", (unsigned int)digest[i]);
    }
    std::cout << "\n\nMSG:\n" << message << "\nKEY:\n" + std::string(BN_bn2hex(key_)) + "\nHMAC\n" + std::string(mdString) + "\n\n";
}
The Java test looks like this:
public String calculateKey(String msg, String key) throws Exception {
    Mac HMAC = Mac.getInstance("HmacSHA1");
    BigInteger k = new BigInteger(key, 16);
    HMAC.init(new SecretKeySpec(k.toByteArray(), "HmacSHA1"));
    msg = msg.toLowerCase();
    HMAC.update(msg.getBytes());
    byte[] digest = HMAC.doFinal();
    System.out.println("Key:\n" + k.toString(16) + "\n");
    System.out.println("HMAC:\n" + DatatypeConverter.printHexBinary(digest).toLowerCase() + "\n");
    return DatatypeConverter.printHexBinary(digest).toLowerCase();
}
Some test runs with different keys (all strings are interpreted as hex):
Key 1:
736A66B29072C49AB6DC93BB2BA41A53E169D14621872B0345F01EBBF117FCE48EEEA2409CFC1BD92B0428BA0A34092E3117BEB4A8A14F03391C661994863DAC1A75ED437C1394DA0741B16740D018CA243A800DA25311FDFB9CA4361743E8511E220B79C2A3483FCC29C7A54F1EB804481B2DC87E54A3A7D8A94253A60AC77FA4584A525EDC42BF82AE2A1FD6E3746F626E0AFB211F6984367B34C954B0E08E3F612590EFB8396ECD9AE77F15D5222A6DB106E8325C3ABEA54BB59E060F9EA0
Msg:
test
Hmac OpenSSL:
b37f79df52afdbbc4282d3146f9fe7a254dd23b3
Hmac Java Mac:
b37f79df52afdbbc4282d3146f9fe7a254dd23b3
Key 2:
636A66B29072C49AB6DC93BB2BA41A53E169D14621872B0345F01EBBF117FCE48EEEA2409CFC1BD92B0428BA0A34092E3117BEB4A8A14F03391C661994863DAC1A75ED437C1394DA0741B16740D018CA243A800DA25311FDFB9CA4361743E8511E220B79C2A3483FCC29C7A54F1EB804481B2DC87E54A3A7D8A94253A60AC77FA4584A525EDC42BF82AE2A1FD6E3746F626E0AFB211F6984367B34C954B0E08E3F612590EFB8396ECD9AE77F15D5222A6DB106E8325C3ABEA54BB59E060F9EA0
Msg:
test
Hmac OpenSSL:
bac64a905fa6ae3f7bf5131be06ca037b3b498d7
Hmac Java Mac:
bac64a905fa6ae3f7bf5131be06ca037b3b498d7
Key 3:
836A66B29072C49AB6DC93BB2BA41A53E169D14621872B0345F01EBBF117FCE48EEEA2409CFC1BD92B0428BA0A34092E3117BEB4A8A14F03391C661994863DAC1A75ED437C1394DA0741B16740D018CA243A800DA25311FDFB9CA4361743E8511E220B79C2A3483FCC29C7A54F1EB804481B2DC87E54A3A7D8A94253A60AC77FA4584A525EDC42BF82AE2A1FD6E3746F626E0AFB211F6984367B34C954B0E08E3F612590EFB8396ECD9AE77F15D5222A6DB106E8325C3ABEA54BB59E060F9EA0
Msg:
test
Hmac OpenSSL:
c189c637317b67cee04361e78c3ef576c3530aa7
Hmac Java Mac:
472d734762c264bea19b043094ad0416d1b2cd9c
As the data shows, when the key gets too big, an error occurs. I have no idea which implementation is faulty. I have also tried bigger keys and smaller keys, but I haven't determined the exact threshold. Can anyone spot the problem? Can anyone tell me which HMAC is incorrect in the last case, by doing a simulation using different software, or suggest a third implementation I could use to check mine?
Kind regards,
Roel Storms
When you convert a hexadecimal string to a BigInteger in Java, it assumes the number is positive (unless the string includes a - sign).
But the internal representation is two's complement, meaning that one bit is used for the sign.
If you are converting a value that starts with a hex octet between 00 and 7F inclusive, that's not a problem. The byte can be converted directly, because the leftmost bit is zero, which means the number is considered positive.
But if you are converting a value that starts with 80 through FF, the leftmost bit is 1, which would be considered negative. To avoid this, and keep the BigInteger value exactly as supplied, it adds another zero byte at the beginning.
So, internally, the conversion of a number such as 7ABCDE is the byte array
0x7a 0xbc 0xde
But the conversion of a number such as FABCDE (only the first byte is different!), is:
0x00 0xfa 0xbc 0xde
This means that for keys that begin with a byte in the range 80-FF, the BigInteger.toByteArray() is not producing the same array that your C++ program produced, but an array one byte longer.
There are several ways to work around this - like using your own hex-to-byte-array parser or finding an existing one in some library. If you want to use the one produced by BigInteger, you could do something like this:
BigInteger k = new BigInteger(key, 16);
byte[] kByteArr = k.toByteArray();
if (kByteArr.length > (key.length() + 1) / 2) {
    kByteArr = Arrays.copyOfRange(kByteArr, 1, kByteArr.length);
}
Now you can use the kByteArr to perform the operation properly.
Another issue you should watch out for is keys whose length is odd. In general, you shouldn't have a hex octet string with an odd length. A string like F8ACB is actually 0F8ACB (which is not going to cause an extra byte in BigInteger) and should be interpreted as such. This is why I wrote (key.length() + 1) in my formula: if key is odd-length, it should be interpreted as one octet longer. This is also important to watch out for if you write your own hex-to-byte-array converter: if the length is odd, you should add a zero at the beginning before you start converting, as in the sketch below.
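For completeness, a minimal hex-to-byte-array parser along those lines (a sketch; it left-pads odd-length input with a zero nibble and never produces the extra BigInteger sign byte):
public static byte[] hexToBytes(String hex) {
    if (hex.length() % 2 != 0) {
        hex = "0" + hex;  // treat F8ACB as 0F8ACB
    }
    byte[] out = new byte[hex.length() / 2];
    for (int i = 0; i < out.length; i++) {
        int hi = Character.digit(hex.charAt(2 * i), 16);
        int lo = Character.digit(hex.charAt(2 * i + 1), 16);
        out[i] = (byte) ((hi << 4) | lo);
    }
    return out;
}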

How does BigInteger interpret the bytes from a string?

I'm working on a program that is an implementation of the RSA encryption algorithm, just as a personal exercise; it's not guarding anyone's information or anything. I am trying to understand how a plaintext passage is interpreted numerically, allowing it to be encrypted. I understand that most UTF-8 characters end up using only 1 byte of space, and not the 2 bytes one might think, but that's about it. Here's my code:
BigInteger ONE = new BigInteger("1");
SecureRandom rand = new SecureRandom();
BigInteger d, e, n;
BigInteger p = BigInteger.probablePrime(128, rand);
BigInteger q = BigInteger.probablePrime(128, rand);
BigInteger phi = (p.subtract(ONE)).multiply(q.subtract(ONE));
n = p.multiply(q);
e = new BigInteger("65537");
d = e.modInverse(phi);
String string = "test";
BigInteger plainText = new BigInteger(string.getBytes("UTF-8"));
BigInteger cipherText = plainText.modPow(e, n);
BigInteger originalMessage = cipherText.modPow(d, n);
String decrypted = new String(originalMessage.toByteArray(),"UTF-8");
System.out.println("original: " + string);
System.out.println("decrypted: " + decrypted);
System.out.println(plainText);
System.out.println(cipherText);
System.out.println(originalMessage);
System.out.println(string.getBytes("UTF-8"));
byte byteArray[] = string.getBytes("UTF-8");
for (byte littleByte : byteArray) {
    System.out.println(littleByte);
}
It outputs:
original: test
decrypted: test
1952805748
16521882695662254558772281277528769227027759103787217998376216650996467552436
1952805748
[B#60d70b42
116
101
115
116
Maybe more specifically, I am wondering about this line:
BigInteger plainText = new BigInteger(string.getBytes("UTF-8"));
Does each letter of "test" have a value, and are they literally added together here? Say t=1, e=2, s=3, t=1 for example; if you get the bytes from that string, do you end up with 7, or are the values just put together like 1231? And why does
BigInteger plainText = new BigInteger(string.getBytes("UTF-8")); output 1952805748?
I am trying to understand how a plaintext passage is being interpreted numerically, allowing it to be encrypted.
It really boils down to understanding what this line does:
BigInteger plainText = new BigInteger(string.getBytes("UTF-8"));
Let's break it down.
We start with a String (string). A Java string is a sequence of characters represented as Unicode code points (encoded in UTF-16 ...).
The getBytes("UTF-8") then encodes the characters as a sequence of bytes, and returns them in a newly allocated byte array.
The BigInteger(byte[]) constructor interprets that byte array as a number. As the javadoc says:
Translates a byte array containing the two's-complement binary representation of a BigInteger into a BigInteger. The input array is assumed to be in big-endian byte-order: the most significant byte is in the zeroth element.
The method being used here does not give an intrinsically meaningful number, just one that corresponds to the byte-encoded string. Going from the byte array to the number simply treats the bytes as a bit sequence representing an integer in two's-complement form ... which is the most common representation for integers on modern hardware.
The key thing is that the transformation from the text to the (unencrypted) BigInteger is lossless and reversible. Any other transformation with those properties could be used.
References:
The Wikipedia page on 2's Complement representation
The Wikipedia page on the UTF-8 text encoding scheme
javadoc BigInteger(byte[])
javadoc String.getBytes(String)
I'm still not quite understanding how the UTF-8 values for each character in "test" (116, 101, 115, 116 respectively) come together to form 1952805748.
Convert the numbers 116,101,115,116 to hex.
Convert the number 1952805748 to hex
Compare them
See the pattern?
The answer is in the output: "test" is encoded into an array of 4 bytes, [116, 101, 115, 116]. This is then interpreted by BigInteger as a binary integer representation. The value can be calculated this way:
value = (116 << 24) + (101 << 16) + (115 << 8) + 116;
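You can check this directly; a short sketch (assuming java.math.BigInteger and java.nio.charset.StandardCharsets are imported):
byte[] bytes = "test".getBytes(StandardCharsets.UTF_8);  // [116, 101, 115, 116] = 0x74 0x65 0x73 0x74
System.out.println(new BigInteger(bytes));               // 1952805748
System.out.println(Integer.toHexString(1952805748));     // 74657374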

Get hash of a String as String

I'm here:
String password = "123";
byte passwordByte[] = password.getBytes();
MessageDigest md = MessageDigest.getInstance("SHA-512");
byte passwortHashByte[] = md.digest(passwordByte);
The passwortHashByte array contains only a lot of numbers. I want to convert these numbers to one String which contains the hash code as plain text.
How do I do this?
I want to convert these numbers to one String which contains the hash code as plain text.
The hash isn't plain-text. It's binary data - arbitrary bytes. That isn't plaintext any more than an MP3 file is.
You need to work out what textual representation you want to use for the binary data. That in turn depends on what you want to use the data for. For the sake of easy diagnostics I'd suggest a pure-ASCII representation - probably either base64 or hex. If you need to easily look at individual bytes, hex is simpler to read, but base64 is a bit more compact.
It's also important to note that a plain, fast hash like SHA-512 isn't a particularly good way of hashing passwords... and it looks like you're not even salting them. It may be good enough for a demo app which will never be released into the outside world, but you should really look into more secure approaches. See Jeff Atwood's blog post on the topic for an introduction, and ideally get hold of a book about writing secure code.
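For example, hex output needs nothing beyond the JDK (HexFormat exists since Java 17; on older versions, a small loop like the one in the next answer does the same job); a sketch:
byte[] hash = MessageDigest.getInstance("SHA-512")
        .digest(password.getBytes(StandardCharsets.UTF_8));
String hex = HexFormat.of().formatHex(hash);  // lowercase hex, two characters per byte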
Here is how I did it for my website.
private static byte[] fromHex(String hex) {
    byte[] bytes = new byte[hex.length() / 2];
    for (int i = 0; i < hex.length() / 2; i++) {
        bytes[i] = (byte) (Character.digit(hex.charAt(i * 2), 16) * 16
                + Character.digit(hex.charAt(i * 2 + 1), 16));
    }
    return bytes;
}
private static String toHex(byte[] bytes) {
    StringBuilder hex = new StringBuilder();
    for (int i = 0; i < bytes.length; i++) {
        String c = Integer.toHexString(bytes[i] & 0xFF);  // mask to an unsigned value
        if (c.length() == 1) hex.append('0');
        hex.append(c);
    }
    return hex.toString();
}
That'll allow you to convert your byte array to and from a hex string.
Well, byte passwortHashByte[] = md.digest(passwordByte); can contain some control characters, which will break your String. Consider encoding passwortHashByte[] to Base64 form if you really need a String from it. You can use Apache Commons Codec to produce the Base64 form.
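Since Java 8 the JDK ships its own encoder, so the external dependency is optional; a sketch:
String base64Hash = Base64.getEncoder().encodeToString(passwortHashByte);
byte[] original = Base64.getDecoder().decode(base64Hash);  // recovers the digest bytes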

Converting a BigInteger into a Key

How do I cast a BigInteger into a Key for the Java cryptography library?
I am trying to use a shared Diffie-Hellman key that I generated myself as the key value for AES encryption.
Below is the code that I used:
BigInteger bi;
long value = 1000000000;
bi = BigInteger.valueOf(value);
Key key = new Key(bi);
However, it did not work.
May I know how to convert a BigInteger value into a Key value?
Thanks in advance!
First, you cannot cast it. There is no relationship between the BigInteger class and the Key interface.
Second, Key is an interface not a class, so you can't create instances of it. What you need to create is an instance of some class that implements Key. And it most likely needs to be a specific implementation class, not (say) an anonymous class.
The final thing is that the Java crypto APIs are designed to hide the representation of the key. To create a key from bytes, you need to create a KeySpec object, e.g. SecretKeySpec(byte[] key, String algorithm), and then use a KeyFactory to "generate" a key from it. Typical KeySpec constructors take a byte[] as a parameter, so you first need to get the byte array from your BigInteger instance. (For symmetric algorithms such as AES, a SecretKeySpec itself already implements Key, so it can be used directly.)
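As a concrete sketch of those steps (hashing the shared secret down to a fixed-size AES key is my simplification here; for interoperable DH you should follow the derivation described in the next answer):
BigInteger shared = BigInteger.valueOf(1000000000L);
byte[] raw = shared.toByteArray();                   // may include a leading 0x00 sign byte
byte[] keyBytes = MessageDigest.getInstance("SHA-256")
        .digest(raw);                                // 32 bytes -> AES-256
SecretKey key = new SecretKeySpec(keyBytes, "AES");  // SecretKeySpec implements Key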
You need to convert your BigInteger to a byte array of a specific size, then use the first (leftmost) bytes to create a key. For this you need to know the size of the prime p used in DH, as the value needs to be left-padded to represent a key. I would suggest using standardized DH parameters (or at least making sure that the size of the prime is divisible by 8).
Note that there may be a zero-valued byte in front of the byte array retrieved using BigInteger.toByteArray(), because the value returned is encoded as a signed (two's-complement) big-endian byte array. You need to remove this byte if it makes the result one byte bigger than the prime (in bytes).
public static byte[] encodeSharedSecret(final BigInteger sharedSecret, final int primeSizeBits) {
    // TODO: add additional tests on the input
    final int sharedSecretSize = (primeSizeBits + Byte.SIZE - 1) / Byte.SIZE;
    final byte[] signedSharedSecretEncoding = sharedSecret.toByteArray();
    final int signedSharedSecretEncodingLength = signedSharedSecretEncoding.length;
    if (signedSharedSecretEncodingLength == sharedSecretSize) {
        return signedSharedSecretEncoding;
    }
    if (signedSharedSecretEncodingLength == sharedSecretSize + 1) {
        // strip the leading zero sign byte
        final byte[] sharedSecretEncoding = new byte[sharedSecretSize];
        System.arraycopy(signedSharedSecretEncoding, 1, sharedSecretEncoding, 0, sharedSecretSize);
        return sharedSecretEncoding;
    }
    if (signedSharedSecretEncodingLength < sharedSecretSize) {
        // left-pad with zero bytes up to the size of the prime
        final byte[] sharedSecretEncoding = new byte[sharedSecretSize];
        System.arraycopy(signedSharedSecretEncoding, 0,
                sharedSecretEncoding, sharedSecretSize - signedSharedSecretEncodingLength,
                signedSharedSecretEncodingLength);
        return sharedSecretEncoding;
    }
    throw new IllegalArgumentException("Shared secret is too big");
}
After that you need to derive the key bytes using some kind of key derivation scheme. The one you should use depends on the standard you are implementing:
As stated in RFC 2631:
X9.42 provides an algorithm for generating an essentially arbitrary amount of keying material from ZZ. Our algorithm is derived from that algorithm by mandating some optional fields and omitting others.
KM = H(ZZ || OtherInfo)
H is the message digest function SHA-1 [FIPS-180]. ZZ is the shared secret value computed in Section 2.1.1. Leading zeros MUST be preserved, so that ZZ occupies as many octets as p.
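A minimal sketch of that derivation (assuming a caller-supplied OtherInfo encoding; zz is the padded output of encodeSharedSecret above):
public static byte[] deriveKeyMaterial(byte[] zz, byte[] otherInfo)
        throws NoSuchAlgorithmException {
    MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
    sha1.update(zz);         // leading zeros preserved: zz occupies as many octets as p
    sha1.update(otherInfo);
    return sha1.digest();    // KM = H(ZZ || OtherInfo), one block of keying material
}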
Note that I have discovered a bug in the Bouncy Castle libraries up to 1.49 (the current version at the time of writing) in the DH implementation regarding the secret extraction: it does strip the spurious leading 00h-valued bytes, but it forgets to left-pad the result up to the prime size p. This will lead to an incorrect derived key once in 192 times (!)
