I have written code that can generate 2048-bit primes p and q and use them to encrypt messages with RSA. I generate these numbers using the probablePrime() function from java.math.BigInteger. My question is: how strong, in terms of encryption, are the primes generated by this function?
Here is my code for generating these numbers; isPrime is just a boolean function I wrote to check whether the number is prime.
BigInteger definitePrime(int bits, Random rnd) {
    BigInteger prime = new BigInteger("4"); // 4 is composite, so the loop always runs at least once
    while (!isPrime(prime)) {
        prime = BigInteger.probablePrime(bits, rnd);
    }
    return prime;
}
As Stephen C points out in his answer, the primes are probably ok for RSA encryption.
Cryptographic randomness
I would add that you shouldn't use just any Random instance, but only your system's best SecureRandom implementation.
new Random() is not a cryptographic randomness source, whereas new SecureRandom() should be. If the random numbers that are used for prime generation are not cryptographically secure, then an attacker may have a chance to simply recreate those based on other information (such as time or previous outputs of the weak randomness source).
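For illustration, a minimal sketch with SecureRandom substituted in (the class name PrimeGen is mine):

import java.math.BigInteger;
import java.security.SecureRandom;

public class PrimeGen {
    public static void main(String[] args) {
        // SecureRandom draws from the platform's entropy source; java.util.Random
        // is a linear congruential generator whose output is predictable from its seed.
        SecureRandom rnd = new SecureRandom();
        BigInteger p = BigInteger.probablePrime(2048, rnd);
        BigInteger q = BigInteger.probablePrime(2048, rnd);
        System.out.println(p.bitLength() + " / " + q.bitLength()); // 2048 / 2048
    }
}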
No Textbook RSA
You're doing "everything" yourself and it seems that you actually want to use this for serious encryption. If you are, you're missing something crucial and that is the padding scheme.
It is easy to use the BigInteger methods to implement RSA so that it works, but it's not enough to make it secure. You need to use a padding scheme like PKCS#1 v1.5 (not recommended anymore) or PKCS#1 v2 OAEP (recommended).
Use existing implementations
Instead of implementing those padding schemes for your "handmade" RSA, use Java's Cipher instance which provides RSA with those padding schemes:
RSA/ECB/PKCS1Padding
RSA/ECB/OAEPWithSHA-256AndMGF1Padding
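As a sketch, encrypting and decrypting with the OAEP transformation might look like this (assuming a JRE/provider that supports it, which modern ones do):

import java.security.KeyPair;
import java.security.KeyPairGenerator;
import javax.crypto.Cipher;

public class RsaOaepDemo {
    public static void main(String[] args) throws Exception {
        // Let the JCE generate the whole key pair; no hand-rolled primes needed.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair kp = kpg.generateKeyPair();

        Cipher cipher = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        cipher.init(Cipher.ENCRYPT_MODE, kp.getPublic());
        byte[] ct = cipher.doFinal("hello".getBytes("UTF-8"));

        cipher.init(Cipher.DECRYPT_MODE, kp.getPrivate());
        System.out.println(new String(cipher.doFinal(ct), "UTF-8")); // hello
    }
}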
The javadoc for BigInteger.probablePrime() says:
Returns a positive BigInteger that is probably prime, with the specified bitLength. The probability that a BigInteger returned by this method is composite does not exceed 2^-100.
2^-100 means one chance in 1,267,650,600,228,229,401,496,703,205,376; i.e. 1 chance in ~1.267 * 10^30
Since you are generating 2 primes, that means you have 2 chances in roughly 10^30 of generating a weak RSA key pair.
I'd have thought that that was good enough, but if you don't think so then you could use BigInteger.isProbablePrime(certainty) to test your prime candidates to an even higher level of certainty.
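For illustration, a sketch of that stricter check (the helper name is mine; a certainty of 128 bounds the composite probability by 2^-128):

import java.math.BigInteger;
import java.security.SecureRandom;

class StrictPrime {
    static BigInteger strongProbablePrime(int bits, SecureRandom rnd) {
        // Draw candidates until one also passes the stricter primality test.
        BigInteger candidate = BigInteger.probablePrime(bits, rnd);
        while (!candidate.isProbablePrime(128)) {
            candidate = BigInteger.probablePrime(bits, rnd);
        }
        return candidate;
    }
}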
My question is: how strong, in terms of encryption, are the primes generated by this function?
I don't know if there is a mathematically rigorous way to quantify the strength of an encryption algorithm. But the above analysis tells you the probability that a given generated key-pair will be weak / easy to crack.
The generated primes are not secure if you use something like java.util.Random instead of an instance of SecureRandom. Your code snippet does not tell us what you are using, hence it is not possible to validate your code. Of course, you probably should just use JCE to generate new RSA keys.
Related
I don't code much, but when I try to implement RSA encryption in Java, my encryption and decryption work for smaller primes; when I try it with 1536-bit primes, the decryption stops working. I've gone through the code, but I don't see what the problem is.
I've already tried to see at what point the bit length starts to be an issue, and it seems to stop working when I set it to 50. I also tend to get an error where my mod inverse method complains that e.modInverse(lambda) is not possible. I tried fixing this by adding the do loop, but it seems like that didn't fix the issue.
public BigInteger Random_Prime()
{
    SecureRandom random = new SecureRandom();
    byte[] randomize = new byte[192];
    random.nextBytes(randomize);
    BigInteger big = new BigInteger(randomize);
    return big.probablePrime(1536, random);
}
public BigInteger lcm(BigInteger p, BigInteger q)
{
    long p1 = p.longValue() - 1;
    long q1 = q.longValue() - 1;
    BigInteger test1 = p.valueOf(p1);
    BigInteger test2 = p.valueOf(q1);
    return test1.multiply(test2).divide(test1.gcd(test2));
}

do {
    p = obj1.Random_Prime();
    q = obj1.Random_Prime();
    lambda = obj1.lcm(p, q);
} while (lambda.gcd(e).compareTo(ONE) != 0);
BigInteger n = p.multiply(q);
BigInteger m = new BigInteger("75");
BigInteger d = e.modInverse(lambda);
BigInteger c = obj1.Encrypt(n, e, m);
I expect 75 to come back as 75 after going through encryption and decryption.
Your lcm() is wrong.
Java long is only 64 bits including sign, and cannot represent numbers greater than 2^63. (Or equal, but a large prime is never equal to a power of 2.) Thus your lcm computation should work for p,q up to 63 bits, and does for me, but produce totally wrong and useless results for anything larger. Instead use {p,q}.subtract(BigInteger.ONE) for the numbers you multiply-and-divide-by-gcd.
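A corrected version along those lines (sketch):

import java.math.BigInteger;

class Lcm {
    // lambda(n) = lcm(p-1, q-1), computed entirely in BigInteger so nothing
    // is truncated to a 64-bit long.
    static BigInteger lcm(BigInteger p, BigInteger q) {
        BigInteger p1 = p.subtract(BigInteger.ONE);
        BigInteger q1 = q.subtract(BigInteger.ONE);
        return p1.multiply(q1).divide(p1.gcd(q1));
    }
}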
Also, BigInteger.probablePrime(int, Random) is a static (that is, class-wide) method; you do not need to call it on an instance, much less on one whose random value you wasted time computing, because that instance is ignored. For that matter, BigInteger.valueOf(long) is also static and ignores any instance used to call it. If you are using any kind of Java development environment more advanced than simply typing javac at a shell or COMMAND prompt, it should (at least optionally) warn you about using an instance to call a static method.
Finally, if you aren't aware, using the RSA primitives m^e mod n and c^d mod n directly to encrypt/decrypt data, especially small data, is not secure. You must use a sufficiently large and random padding scheme for this to be secure at all; see Wikipedia for a short explanation, and if you want more, search on https://crypto.stackexchange.com and maybe https://security.stackexchange.com where this has been asked about and answered many times.

And if you decrypt (which you didn't show) simply by doing c.modPow(d, n), that is both inefficient (see Wikipedia about CRT) and insecure (see Wikipedia about timing attacks, and again crypto.SX and security.SX).

And using RSA directly for data is very limiting and inefficient, so in practice people use hybrid encryption -- use a symmetric algorithm (nowadays usually AES) to encrypt the data under a nonce key, and encrypt that nonce key using RSA -- or perhaps better, derive it using RSA-KEM (again see Wikipedia, crypto.SX, security.SX).
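To make the hybrid pattern concrete, here is a rough sketch using only standard JCE classes (class name and message are illustrative):

import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

class HybridSketch {
    public static void main(String[] args) throws Exception {
        // Recipient's RSA key pair; a real sender would only hold the public key.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair kp = kpg.generateKeyPair();

        // 1. Fresh AES key for this one message.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey aesKey = kg.generateKey();

        // 2. Bulk data goes through AES-GCM, not RSA.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher aes = Cipher.getInstance("AES/GCM/NoPadding");
        aes.init(Cipher.ENCRYPT_MODE, aesKey, new GCMParameterSpec(128, iv));
        byte[] ct = aes.doFinal("the actual message".getBytes("UTF-8"));

        // 3. Only the small AES key is wrapped with RSA-OAEP.
        Cipher rsa = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        rsa.init(Cipher.WRAP_MODE, kp.getPublic());
        byte[] wrappedKey = rsa.wrap(aesKey);

        // Transmit wrappedKey, iv and ct together.
    }
}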
If you actually want security and not just to play about, use the crypto from the Java library which has been implemented correctly (and reviewed) by competent people, unlike your code.
I generate a random IV every time I encrypt when doing AES/CBC.
private static IvParameterSpec getRandomIvParameterSpec() {
    byte[] iv = new byte[16];
    new SecureRandom().nextBytes(iv);
    return new IvParameterSpec(iv);
}
And I concatenate the IV to the ciphertext bytes every time I encrypt.
Is there any security improvement if I hash (SHA-256) the IV before concatenating it to the ciphertext?
SHA-256 is deterministic: give it the same input and it will give you the same output. It is not injective, however: if m1 and m2 both hash to h, you cannot conclude that m1 = m2, even if you know that |m1| = |m2| (both messages are of the same length).
Therefore, applying SHA-256 (or any deterministic function) cannot increase the entropy of your data. At best, it won't decrease it. In other words: if your data is 16 purely random bytes, it won't be “more than purely random” after you hash it. And if your data was not purely random to begin with, then hashing it won't help make it random. You have to use a better entropy source in the first place.
Another problem that you didn't mention is that you currently have 16 random bytes, but if you put them into your SHA-256 hash function, you'll get 32 bytes out. Which ones are you going to use? If you only use every second byte, you won't get all possible bit patterns even if your input was perfectly random and the hash function was flawless. (If you did, then this would – by the pigeonhole principle – mean that the other half of the bytes would always be a function of the bytes you did choose. Only a really crappy hash function, which SHA-256 of course is not, would have such a property.) If you try to be clever and combine the bytes in some “smart” way, chances are that you'll make things even worse.
So the short answer is: just don't do it. Generate as many random bytes as you need using the strongest non-deterministic entropy source you have available and use them directly.
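Applied to the question's setup, that advice looks roughly like this (hypothetical helper; the raw 16 random bytes are used as the IV, unhashed, and prepended to the ciphertext):

import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

class CbcWithIv {
    static byte[] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv); // use the random bytes directly
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ct = cipher.doFinal(plaintext);
        byte[] out = new byte[iv.length + ct.length]; // output = IV || ciphertext
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return out;
    }
}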
I'm trying to solve a problem with decryption in the RSA algorithm.
ArrayList<Integer> m = convertTextToInt(str, e, n, listc);
for (int elem1 : m) {
    System.out.print(elem1 + " ");
}
Here I'm passing str, a string from the user that gets converted to integer numbers in the convertTextToInt method, along with the numbers e and n used to calculate the ciphertext, and an array list to store the values. The ciphertext is calculated with the formula

int c = (int) (Math.pow(Coded, e)) % n;
listc.add(c);

Then I write this in the main. The problem is that whenever I execute this loop it gives the same result, which is 144!:
System.out.print("plain text is: ");
for (int element : m){
int plain = (int)(Math.pow(element,d))%n;
listp.add(plain);
System.out.print("listp:"+listp);
}
I tried changing the datatype of plain to double and it gives me another number, but not the right one.
My question is: why does the plaintext (decryption) formula give me the same wrong result, 144, every time? :(
Near dupe Storing large numbers for RSA encryption in java
You don't give any clue what numbers you are using, but RSA is only secure if you use numbers of at least about 1000 bits (about 330 decimal digits) which absolutely requires BigInteger in Java. Even this is borderline unsafe; current security standards like those for the WWW from CA/browser forum and for the US government from NIST require 2048 bits.
Even for 'toy' numbers (often used in coursework where no actual security is desired) that fit in Java int (9+ decimal digits) or long (18+ digits), the notional 'raw' decryption computation c^d does NOT fit. Math.pow is floating-point, so it first returns an imprecise (rounded-off) result, and truncating that to int or long turns it into a completely wrong and useless result. You should instead do modular exponentiation in stages, reducing modulo n at each stage, as explained in the Wikipedia article linked from the article on RSA.
If you use BigInteger (as you must for non-toy cases) its modPow method already does this.
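For instance, with the well-known toy parameters n = 3233 (= 61 * 53), e = 17, d = 2753 (illustrative only; real keys need ~2048-bit values):

import java.math.BigInteger;

public class ModPowDemo {
    public static void main(String[] args) {
        BigInteger n = BigInteger.valueOf(3233);
        BigInteger e = BigInteger.valueOf(17);
        BigInteger d = BigInteger.valueOf(2753);

        BigInteger m = BigInteger.valueOf(75);
        BigInteger c = m.modPow(e, n);    // encrypt: m^e mod n, reduced at every step
        BigInteger p = c.modPow(d, n);    // decrypt: c^d mod n
        System.out.println(p);            // prints 75
    }
}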
Also note 'textbook' RSA -- only exponentiation modulo a large semiprime -- is not actually secure in practice. This is also explained briefly in that Wikipedia article, and https://crypto.stackexchange.com/ has dozens of Q&As on the dangers and limitations of textbook RSA and what to do instead. If you actually want security in Java, at least 'desktop' or 'phone' Java (like Android, but not smartcards and embedded devices), the Java Cryptography Extensions already implement RSA correctly, including the additional things you need such as PKCS1 padding.
While doing a beginner's crypto course I'm trying to get to grips with Java's SecureRandom object. What I think I understand is that:
a) No matter how long a sequence of random numbers you know, there is no way of predicting the next random number in the sequence.
b) No matter how long a sequence of random numbers you know, there is no way of knowing which seed was used to start them off, other than brute force guesswork.
c) You can request secure random numbers of various sizes.
d) You can seed a newly-created SRNG with various different-sized values. Every newly-created SRNG you create and seed with the same value will produce the same sequence of random numbers.
I should add that I'm assuming that this code is used on Windows:
Random sr = SecureRandom.getInstance("SHA1PRNG", "SUN");
Is my basic understanding correct? Thanks in advance.
I have some further questions for anyone who's reasonably expert in crypto. They relate to seeding a SRNG as opposed to letting it seed itself on first use.
e) What difference, if any, does it make to the random numbers generated, if you seed a SRNG with a long integer as opposed to an array of 8 bytes?
f) If I seed a SRNG with, say, 256 bytes is there any other seed that can produce the same sequence of random numbers?
g) Is there some kind of optimum seed size? It seems to me that this might be a meaningless question.
h) If I encrypt a plaintext by seeding a SRNG with, say, 256 bytes then getting it to generate random bytes to XOR with the bytes in the plaintext, how easy would it be for an eavesdropper to decrypt the resulting ciphertext? How long might it take? Am I right in thinking that the eavesdropper would have to know, guess, or calculate the 256-byte seed?
I've looked at previous questions about SecureRandom and none seem to answer my particular concerns.
If any of these questions seem overly stupid, I'd like to reiterate that I'm very much a beginner in studying this field. I'd be very grateful for any input as I want to understand how Java SecureRandom objects might be used in cryptography.
d) This is true for a PRNG. It is not always true for a CSRNG. Read the Javadoc for SecureRandom.setSeed(): "The given seed supplements, rather than replaces, the existing seed. Thus, repeated calls are guaranteed never to reduce randomness."
Any reasonable CSRNG will have "invisible" sources of entropy that you cannot explicitly control, often various internal parameters taken from the Operating System level. Hence there is more seeding than any number you explicitly pass to the RNG.
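A quick way to observe this (the exact behavior is provider-dependent; with a default NativePRNG-backed instance the two outputs should differ):

import java.security.SecureRandom;
import java.util.Arrays;

public class SeedDemo {
    public static void main(String[] args) {
        SecureRandom a = new SecureRandom();
        SecureRandom b = new SecureRandom();
        a.setSeed(42L); // supplements the OS-provided seed,
        b.setSeed(42L); // it does not replace it
        byte[] x = new byte[8], y = new byte[8];
        a.nextBytes(x);
        b.nextBytes(y);
        System.out.println(Arrays.equals(x, y)); // almost certainly false
    }
}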
OK, in order:
a) correct
b) correct
c) correct, you can even request a number in a range [0, n) using nextInt(n)
d) as good as correct: there is no public specification of the SHA1PRNG algorithm, and there are indications that the implementation has changed over time, so this is only true for the Sun provider, and then only for a specific runtime configuration
e) as the API clearly indicates that all the bytes within the long are used ("using the eight bytes contained in the given long seed") there should not be any difference regarding the amount of entropy added to the state
Note that a quick check shows that setSeed(long) behaves entirely different from setSeed(byte[]) with the main difference that the seed of the long value is always mixed in with randomness retrieved from the system, even if it is the first call after the SecureRandom instance is constructed.
f) yes - an infinite number of seeds generate the same stream; since a hash function is used, it will be computationally infeasible to find one, though
g) if you mix in additional entropy, then the more entropy the better, but there is no minimum; if you use it as the only seed then you should not start off with less than 20 bytes of seed, that is: if you want to keep the seed to the same security constraints as the inner state of the PRNG
And I would add that if you use less than 64 bytes of entropy you are in the danger zone for sure. Note that 1 bit of entropy does not always mean 1 bit in a byte. A byte array of size 8 may have 64 bits of entropy or less.
h) that's basically a hash based stream cipher; it's secure, so an attacker has little to no chance (given you don't reuse the seed) but it is a horribly unreliable (see answer d) and slow stream cipher, so please never ever do this - use a Cipher with "AES/CTR/NoPadding" or "AES/GCM/NoPadding" instead
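A sketch of that replacement (key and IV handling shown inline for brevity; in practice the IV must be unique per key and transmitted with the ciphertext):

import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

public class CtrStream {
    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey key = kg.generateKey();

        // CTR mode XORs the plaintext with an AES-generated keystream --
        // exactly what the seeded-PRNG idea tried to do by hand.
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);
        Cipher ctr = Cipher.getInstance("AES/CTR/NoPadding");
        ctr.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ct = ctr.doFinal("some plaintext".getBytes("UTF-8"));
    }
}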
e) I don't think that it makes a difference. Assuming that the long and the 8-byte array contain the same data.
f) In principle, yes. If your seed is larger than the internal state of the RNG, then there may exist some other seed that will result in the same internal state. If the seed is smaller than the state, then there shouldn't be. I don't know what SecureRandom's internal state looks like.
g) It's not the size of the seed that matters; it's the amount of entropy in it. You need there to be at least as much entropy in your seed as the security you expect out of the RNG; I'm not quite sure what best practices are here.
h) I'm not sure how easy it would be to break the RNG-based stream cipher that you propose. But I would recommend against using it in practice, because it's not a standard cryptographic construction that has been reviewed by experts and has reasonable security proofs. Remember the Rules of Crypto:
Never design your own crypto.
Never implement your own crypto.
Anyone can design crypto that they can't break themselves.
I know that if we declare:
SecureRandom random=new SecureRandom();
It initializes the default algorithm for generating random numbers, which is NativePRNG; this reads /dev/random to generate a truly random seed. Now we have a truly random seed, which is 160 bits in size, but I am confused about what happens when we call random.nextBytes(bytes). How does it generate bytes from the seed? Does it read /dev/random again, or something else?
Thanks.
N.B.: I am looking for the default behavior in Java 7 on a Linux/Mac box.
From the Java API docs:
Many SecureRandom implementations are in the form of a pseudo-random number generator (PRNG), which means they use a deterministic algorithm to produce a pseudo-random sequence from a true random seed. Other implementations may produce true random numbers, and yet others may use a combination of both techniques.
So whether nextBytes(bytes) returns true random bytes from /dev/random, or pseudo-random numbers generated from the true random seed, depends on the implementation. In the second case, starting from the initial random seed, a deterministic but seemingly random (and hence pseudo-random) number sequence is generated by any calls to the SecureRandom.
Java 7 allows a configurable PRNG source to be specified, but on Linux the default one is NativePRNG and on Windows it is SHA1PRNG. You can also specify SHA1PRNG on Linux, but the default option of NativePRNG is better. SHA1PRNG generates its pseudo-random bits and bytes through the use of SHA-1. On Linux (and possibly other Unixes, where the mechanism is "NativePRNG"), the algorithm reads from /dev/random and /dev/urandom, so it stays well seeded as long as there is enough entropy available through either of those. For the sake of completeness, from the Linux man page on random:
A read from the /dev/urandom device will not block waiting for more entropy. As a result, if there is not sufficient entropy in the entropy pool, the returned values are theoretically vulnerable to a cryptographic attack on the algorithms used by the driver.
Therefore, on Linux at least, your SecureRandom will have a certain amount of true random output until /dev/random blocks due to a shortage of entropy; however, if you request too many random bits, they will eventually start being generated by the underlying /dev/urandom machinery, which may use SHA-1 or some other cryptographic hashing algorithm in a PRNG.
It's best to create a SecureRandom without specifying any explicit seed yourself, as it will seed itself (by default via /dev/random and /dev/urandom for the NativePRNG on Linux) with a good seed. Calling nextBytes(bytes) every few minutes, even for a large number of bytes, should not be an issue in almost any circumstance. Even if you are using the NativePRNG and it resorts to getting pseudo-random bytes from /dev/urandom via something like SHA-1, the output of this will still be extremely difficult to predict.
If you are asking for gigabytes of randomness, it might be good to re-seed, either using some output from the SecureRandom itself or by providing your own seed. Note that it should be safe providing any kind of seed to setSeed(), as SecureRandom internally augments the current seed by feeding the seed you provide and the previous seed to something like SHA-1 or another cryptographic hashing algorithm. However, it is still best to create the initial SecureRandom without giving your own seed.
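A sketch of that kind of re-seeding (purely illustrative; for most workloads the self-seeded instance alone is fine):

import java.security.SecureRandom;

public class Reseed {
    public static void main(String[] args) {
        SecureRandom rng = new SecureRandom(); // self-seeded; no explicit seed passed

        // ... after a very large amount of output, mix in fresh material.
        byte[] fresh = rng.generateSeed(32); // pulls new entropy from the seed source
        rng.setSeed(fresh);                  // supplements, never replaces, the state
    }
}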