How to convert a value greater than 1000 to char in Java

I'm developing my own RSA implementation in Java. While encrypting the data I encrypt each letter one by one, and after calculating m^e mod n I get values greater than 1000. If I try to convert such a value with (char) 1034, I get the ? symbol for each character, and then I'm not able to decrypt the ciphertext back to plaintext. What can I do? Please suggest some ideas.

Don't cast to char. You need to encode your value. One possible simple encoding is hexadecimal. Radix-64 (or Base64) is frequently used (as in OpenPGP). Another possible choice is Base85.
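For example, here is a minimal sketch of the Base64 route (the value 1034 stands in for a hypothetical m^e mod n result from the question):

import java.math.BigInteger;
import java.util.Base64;

public class CipherEncodingDemo {
    public static void main(String[] args) {
        BigInteger cipherValue = BigInteger.valueOf(1034); // hypothetical m^e mod n result

        // Encode the raw bytes instead of casting to char
        String encoded = Base64.getEncoder().encodeToString(cipherValue.toByteArray());
        System.out.println(encoded); // printable, loss-free representation

        // Decoding restores the exact value, so decryption can proceed
        BigInteger restored = new BigInteger(Base64.getDecoder().decode(encoded));
        System.out.println(restored); // 1034
    }
}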

Related

TOTP / HOTP / HmacSHA256 with unsigned bytes key in Java

As we can see from the following questions:
Java HmacSHA256 with key
Java vs. Golang for HOTP (rfc-4226)
Java doesn't really play nicely when using a key in a TOTP / HOTP / HmacSHA256 use case. My analysis is that the following things cause trouble:
String.getBytes will (of course) give negative byte values for characters with a character value > 127;
javax.crypto.Mac and javax.crypto.spec.SecretKeySpec both externally and internally use byte[] for accepting and transforming the key.
We have acquired a number of Feitian C-200 Single Button OTP devices, and they come with a hexadecimal string secret whose decoded bytes include values > 127.
We have successfully created a PoC in Ruby for these tokens, which works flawlessly. Since we want to integrate these in Keycloak, we need to find a Java solution.
Since every implementation of TOTP / HOTP / HmacSHA256 we have seen makes use of the javax.crypto library and byte[], we fear we will have to rewrite all the classes involved to use int in order to support this scenario.
Q: Is there another way? How can we use secrets in a HmacSHA256 calculation in Java of which the bytes have values > 127 without having to rewrite everything?
Update
I was looking in the wrong direction. My problem was that the key was represented as a String (UTF-16 in Java), which contained Unicode characters that were exploded into two bytes by getBytes() before being passed into the SecretKeySpec.
Forcing StandardCharsets.ISO_8859_1 on this conversion fixes the problem.
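A minimal illustration of why the charset matters (the byte value 0xFF stands in for any key byte > 127):

import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    public static void main(String[] args) {
        // A key byte > 127, carried through a String
        String s = new String(new byte[] { (byte) 0xFF }, StandardCharsets.ISO_8859_1);

        // UTF-8 explodes the character into two bytes, corrupting the key...
        System.out.println(s.getBytes(StandardCharsets.UTF_8).length);      // 2
        // ...while ISO-8859-1 maps it back to the single original byte
        System.out.println(s.getBytes(StandardCharsets.ISO_8859_1).length); // 1
    }
}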
Signed vs. unsigned is a presentation issue that's mainly relevant to humans. The computer doesn't know or care whether 0xFF means -1 or 255 to you. So no, you don't need to use ints; using byte[] works just fine.
This doesn't mean that you can't break things, since some operations work based on default signed variable types. For example:
byte b = (byte)255; // b is -1 or 255, depending on how you interpret it
int i = b; // i is -1 due to sign extension (0xFFFFFFFF), not 255
int u = b & 0xFF; // u is 255
It seems to throw many people off that Java has only signed primitives (boolean and char notwithstanding). However, Java is perfectly capable of performing cryptographic operations, so all these questions where something is "impossible" are just user errors, which is not something you want when writing security-sensitive code.
Don't be afraid of Java :) I've tested dozens of tokens from different vendors, and everything is fine with Java; you just need to pick the correct converter.
It's a common issue to get bytes from a String via getBytes() instead of using a proper converter. The file you have from your vendor represents secret keys in hex format, so just google 'java hex string to byte array' and choose a solution that works for you.
Hex, Base32, and Base64 are just representations, and you can easily convert from one to another.
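For instance, on Java 17+ the built-in java.util.HexFormat makes the round trip between representations a one-liner (the seed value below is made up):

import java.util.Base64;
import java.util.HexFormat;

public class RepresentationDemo {
    public static void main(String[] args) {
        String hexSeed = "1234567890ABCDEF"; // made-up vendor seed

        // Hex string -> raw bytes (the form the HMAC key actually needs)
        byte[] raw = HexFormat.of().parseHex(hexSeed);

        // The same key bytes, re-encoded as Base64
        System.out.println(Base64.getEncoder().encodeToString(raw));

        // ...and back to hex again, unchanged
        System.out.println(HexFormat.of().formatHex(raw)); // 1234567890abcdef
    }
}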
I ran into absolutely the same issue (some years later): we got Feitian devices and had to set up their server-side code.
None of the available implementations worked with them (neither PHP nor Java).
Solution: Feitian devices come with seeds in hexadecimal. First you have to decode the seed into raw binary (e.g. in PHP using the hex2bin()). That data is the correct input of the TOTP/HOTP functions.
The hex2bin() equivalent in Java is a bit tricky, and its solution is clearly described in the OP's question.
(Long story short: you have to interpret the result of hex2bin with StandardCharsets.ISO_8859_1, otherwise some chars will be interpreted as a two-byte UTF-16 char, which causes a different passcode at the end.)
String hex = "1234567890ABCDEF"; // original seed from Feitian
String secretKey = new String(hex2bin(hex), StandardCharsets.ISO_8859_1);
Key key = new SecretKeySpec(secretKey.getBytes(StandardCharsets.ISO_8859_1), "RAW");
// or without the String representation:
Key key = new SecretKeySpec(hex2bin(hex), "RAW");
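The snippet above leaves hex2bin undefined; a minimal sketch of it (the name is borrowed from the PHP function) could look like this:

// Minimal hex2bin sketch: decodes "1A2B" into { 0x1A, 0x2B }
static byte[] hex2bin(String hex) {
    byte[] out = new byte[hex.length() / 2];
    for (int i = 0; i < out.length; i++) {
        out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
    }
    return out;
}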

How to write java code for FPE (Format Preserving Encryption) functions?

For FPE, I have passed the plaintext 38D8DDD0D2 (10 digits) and the tweak value 18AD3A1387A9BCEB9BD223C44391CAB7 (32 digits) for encryption and decryption, which work, but I am not able to achieve the FPE format.
For FPE (Format Preserving Encryption), the encrypted output should have the same format and length as the plaintext (10 digits).
Overall, if I give a 10-digit string value as input, then the encrypted value should have the same format with a 10-digit length, and after decryption the same input string should be returned.
Please help me do that. Thanks.
I think you are confusing FPE mode with other, more common AES modes; e.g., the example code you shared is for ECB mode. Unfortunately, using different AES modes in Java is not plug-and-play: each mode has to be used and handled slightly differently.
As for FPE, I don't think it's even supported by the default Java JCE. See if you can use this implementation of it instead.
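As a rough sketch of what FF1-based FPE usage can look like, assuming Bouncy Castle 1.66+ and its org.bouncycastle.crypto.fpe package (the key, tweak and digit values below are made up):

import org.bouncycastle.crypto.fpe.FPEFF1Engine;
import org.bouncycastle.crypto.params.FPEParameters;
import org.bouncycastle.crypto.params.KeyParameter;
import org.bouncycastle.util.encoders.Hex;

public class Ff1Demo {
    public static void main(String[] args) {
        byte[] key = Hex.decode("2B7E151628AED2A6ABF7158809CF4F3C"); // made-up AES-128 key
        byte[] tweak = Hex.decode("39383736353433323130");           // made-up tweak bytes

        // Each plaintext symbol is a decimal digit 0-9, so the radix is 10
        byte[] plain = { 3, 8, 1, 8, 1, 1, 1, 0, 1, 2 };             // ten digits in, ...
        byte[] cipher = new byte[plain.length];                      // ...ten digits out

        FPEFF1Engine engine = new FPEFF1Engine();
        engine.init(true, new FPEParameters(new KeyParameter(key), 10, tweak));
        engine.processBlock(plain, 0, plain.length, cipher, 0);
        // cipher now holds ten digit values again: same length, same format
    }
}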

AES Encryption from integers to integers

I am using the AES encryption algorithm in Java to encrypt my database values. My encryption function returns the encrypted value as a String, but columns of type "Int" fail to store such string values, which is quite logical. Is there a way to encrypt the integers as integers (numerical values)? Thank you.
Plain AES returns an array of bytes. You can store this as an array of bytes, a Base64 text string or as a BigInteger:
BigInteger myBigInt = new BigInteger(1, AESByteArray); // signum 1 keeps the value non-negative
It is very unlikely that the 128 bit, or larger, AES result will fit into a 32 bit Java int.
If you want 32-bit input and 32-bit output, so everything fits into a Java int, then either write your own 32-bit Feistel cipher, or use the Hasty Pudding Cipher, which can be set to any bit size you require.
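To illustrate the Feistel idea, here is a toy sketch only: the round function and keys are arbitrary, and this is not a vetted cipher, just a demonstration that a 32-bit value can map reversibly to a 32-bit value:

// Toy 32-bit balanced Feistel cipher: two 16-bit halves, four rounds
public class ToyFeistel32 {
    private static final int[] KEYS = { 0xA511, 0x93C2, 0x5F37, 0x44D1 }; // made-up round keys

    // Arbitrary mixing function; returns a 16-bit value
    private static int f(int half, int key) {
        int x = (half ^ key) * 0x45D9F3B;
        x ^= x >>> 13;
        return x & 0xFFFF;
    }

    static int encrypt(int v) {
        int l = v >>> 16, r = v & 0xFFFF;
        for (int k : KEYS) {       // each round: swap halves, mix one side
            int t = l ^ f(r, k);
            l = r;
            r = t;
        }
        return (l << 16) | r;
    }

    static int decrypt(int v) {
        int l = v >>> 16, r = v & 0xFFFF;
        for (int i = KEYS.length - 1; i >= 0; i--) { // undo rounds in reverse
            int t = r ^ f(l, KEYS[i]);
            r = l;
            l = t;
        }
        return (l << 16) | r;
    }

    public static void main(String[] args) {
        int c = encrypt(123456789);
        System.out.println(c);          // a 32-bit int, same type as the input
        System.out.println(decrypt(c)); // 123456789
    }
}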
Encrypting an integer into an integer is FPE (format-preserving encryption). FPE does not change the data type or data length.
Here is the reason why databases implement FPE only for character data, never for int.
AES 128 will encrypt 128-bit block. Which is 16 bytes.
If you want to encrypt a 64- or 32-bit integer (a 4- or 8-byte value), you still have to encrypt a 16-byte block. This problem can be solved by adding 8 (or 12) bytes to the int32 or int64 value. That creates an issue: if the added bytes are always 0, you create a huge weakness in the encryption, as your data set is severely limited; it can be used for a brute-force attack on AES, etc. In turn, this can be solved by filling the added 8 or 12 bytes with cryptographically strong random bytes (which also creates a weakness if your random generator is not strong enough). When decrypting, you can purge the extra added bytes and extract only the 4 or 8 meaningful bytes out of the 16.
Still, life is not perfect. AES encryption does not change the size of the block; it always produces 16 bytes. You can encrypt your int into 16 bytes, but the database can store only 8 bytes for an int.
That is, unless you store the data in a binary(16) column. But that is not an integer, and you are asking for an integer.
In theory, numeric(38) takes 16 bytes. In some databases it is possible to set those 16 bytes to an arbitrary value and then extract them, but I have not seen this implemented.
You can always encode your string in an integer; however, it could be a large integer.
If you can't afford a large integer, you can encode it in multiple small integers.
If you can afford neither a large integer nor multiple integers, maybe you can't do it well anyway; using a block cipher in ECB mode is almost always a bad idea.
Try converting the output of the encryption from string to binary, and then from binary to a decimal integer.

Cassandra = Memory/Encoding-Footprint of Keys (Hash/Bytes[]=>Hex=>UTF16=>Bytes[])

I am trying to understand the implications of using an MD5 hash as a Cassandra key, in terms of memory/storage consumption:
MD5 hash of my content (in Java) = byte[] is 16 bytes long. (The 16-byte figure is from Wikipedia for generic MD5; I am not sure whether the Java implementation also returns 16 bytes.)
Hex-encode this value to be able to print it in a human-readable format => 1 byte becomes 2 hex values.
I have to represent every hex value as a character in Java => result = two string character values (for example, "FF" is a string of length/size 2).
Java uses UTF-16 => so every string character is encoded with two bytes. "FF" would require 2x2 bytes?
Conclusion => the MD5 hash in byte format is 16 bytes, but represented as a Java hex UTF-16 string it consumes 16x2x2 = 64 bytes (in memory)!? Is this correct?
What is the storage consumption in Cassandra when using this as a row key?
If I had directly used the byte array from the hash function, I would assume it consumes 16 bytes in Cassandra?
But if I use the hex-string representation (as noted above), can Cassandra "compress" it to 16 bytes, or will it also take 64 bytes? I assume 64 bytes in Cassandra; is this correct?
What kind of keys do you use? Do you use the output of a hash function directly, or do you first encode it into a hex string and then use the string?
(In MySQL, whenever I used a hash key, I always used its hex-string representation, so it is directly readable in the MySQL tools and in the whole application. But I now realize it wastes storage???)
Maybe my thinking is completely incorrect; then it would be kind of you to explain where I am wrong.
Thanks very much!
jens
Correct on both counts: the byte[] would be 16 bytes, UTF-16-as-hex would be 64.
In 0.8, Cassandra has key metadata, so you can tell it "this key is a byte[]" and it will display in hex in the cli.
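A quick way to confirm the byte counts in Java (the sample content is arbitrary):

import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Md5SizeDemo {
    public static void main(String[] args) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("MD5").digest("some content".getBytes());
        System.out.println(digest.length); // 16 -- Java's MD5 digest is 16 bytes too

        // Hex encoding doubles the length: 32 chars...
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        System.out.println(hex.length()); // 32

        // ...and each char is a 2-byte UTF-16 code unit, giving the 64-byte
        // in-memory figure from the question (modern JVMs with compact strings
        // may store Latin-1-only strings in half that space)
    }
}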

JAVA RSAES-OAEP attack

I need to implement an attack on RSAES-OAEP (PKCS#1 v2.1), using a Unix executable oracle and an ASCII-format challenge file. The format of the challenge ASCII file is
{n}
{e}
{c}
where N (an integer) is a 1024-bit modulus, e (an integer) is the public exponent and c (an
octet string) is the ciphertext corresponding to the RSAES-OAEP encryption of some unknown
plaintext m (an octet string) under the public key (N, e). Note that the plaintext is ASCII text
(i.e., each octet is an ASCII encoded character), and that the RSAES-OAEP encryption will
have used SHA-1 as the hash function and a null label (i.e., in all cases the label is an octet
string of length zero).
The executable represents an RSAES-OAEP decryption oracle: when executed from a BASH
shell using the command
bash$ ./USER < USER.challenge
it tries to decrypt the ciphertext read from stdin using the private key (N, d). Note that N is
read from stdin (i.e., from the challenge) but d (an integer) is a private exponent embedded
into the oracle (i.e., you do not have access to it).
The challenge file is as follows:
99046A2DB3D185D6D2728E799D66AC44F10DDAEE1C0A1AC5D7F34F04EDE17B96A5B486D95D927AA9B58FC91865DBF3A1685141345CC31B92E13F06E8212BAB22529F7D06B503AAFEEB89800E12EABA50C3F3BBE86F5966A88CCCF5C843281F8B98DF97A3111458FCA89B8085A96AE68EAEBAE270831D41C956159B81D29503
80A3C4043F940BE6AC16B11A0A77016DBA96B0239311AF182DD70E214E07E7DF3523CE1E269B176A3AAA0BA8F02C59262F693D6A248F22F2D561ED7ECC3CB9ABD0FE7B7393FA0A16C4D07181EEF6E27D97F48B83B90C58F51FD40DCDA71EF5E3C3E97D1697DC8E26B694B5CAFE59E427B12EE82A93064C81AAB74431F3A735
57D808889DE1417235C790CB7742EB76E537F55FD49941EBC862681735733F8BB095EDBB3C0DA44AB8F1176E69A61BBD3F0D31EB997071758A5DD850730A1D171E9EC92788EBA358974CE521537EE4A809BF1607D04EFD4A407866970981B88F44D5260D25C9E8864D5FC2AFB2CB90994DD1934BCEA728B38A00D4712AE0EE
Any ideas on how to proceed with this attack?
thanks
Can anyone guide me on this?
The first thing you could try is to find out whether you can apply the attack by
J. Manger from the paper "A Chosen Ciphertext Attack on RSA Optimal Asymmetric Encryption
Padding (OAEP) as Standardized in PKCS #1 v2.0." Crypto 2001.
That means you have to find out what kind of information you can get from the oracle.
E.g., choose two arbitrary integers m0, m1 such that m1 is a 1024-bit integer smaller than n and m0 is at most 1023 bits long. If you pass m0^e mod n and m1^e mod n to the oracle, do you get a different response? If so, then you might be able to apply the attack from the paper above. Otherwise you will have to search for another flaw in the decryption oracle.
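Constructing those two probe ciphertexts in Java is straightforward with BigInteger; a sketch, assuming n and e are passed in as the hex strings from the challenge file:

import java.math.BigInteger;

public class OracleProbe {
    public static void main(String[] args) {
        BigInteger n = new BigInteger(args[0], 16); // modulus from the challenge file
        BigInteger e = new BigInteger(args[1], 16); // public exponent from the challenge file

        BigInteger m1 = n.subtract(BigInteger.ONE);     // a 1024-bit value smaller than n
        BigInteger m0 = BigInteger.ONE.shiftLeft(1022); // at most 1023 bits long

        // Encrypt both and feed them to the oracle; compare its responses
        System.out.println(m0.modPow(e, n).toString(16));
        System.out.println(m1.modPow(e, n).toString(16));
    }
}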
Another approach that might work is to try to modify the modulus n. If the oracle really reads the modulus from user-supplied input, then it looks like modifying the modulus should work and the attack becomes quite easy. I don't have access to the implementation of the oracle, so I can only guess what might be possible. If you can check for any chosen n', c' whether c'^d mod n' is a valid OAEP-encoded plaintext, then decrypting the original message is not all you can do; in fact, you can also recover d and hence factor the original RSA modulus.
(Furthermore, this would indeed be a very nice puzzle, so I don't want to spoil the fun by giving a step-by-step recipe on how to solve it.)
