Is the precision of nextProbablePrime() related to the size of the input value?

I am using BigInteger.nextProbablePrime() in a homework assignment to compute the next prime number to use when resizing a hash table that uses quadratic probing.

The table stores data items read from a file. The sample file I received only contains 100 items, but I cannot assume that this is the maximum dataset my program will be tested against.

I am wondering whether there is any relationship between the size of the value I pass to nextProbablePrime() and the likelihood that it will correctly return a prime number. In other words, is there a bound below which nextProbablePrime() is guaranteed to return a true prime? Is it reasonable for me to rely on this?

+3

3 answers


Since the Javadoc says "the probability that a BigInteger returned by this method is composite does not exceed 2^-100", I think it is reasonable to rely on it returning a prime number.
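For illustration, that guarantee can be leaned on directly when growing a table. This is a minimal sketch; `nextPrimeSize` is a hypothetical helper name, not anything from the standard library:

```java
import java.math.BigInteger;

public class NextPrimeDemo {
    // Hypothetical helper: smallest (probable) prime >= n, suitable as a
    // new table capacity. nextProbablePrime() returns the first probable
    // prime strictly greater than its receiver, so we start from n - 1
    // to allow n itself when n is already prime.
    static int nextPrimeSize(int n) {
        return BigInteger.valueOf(n - 1).nextProbablePrime().intValueExact();
    }

    public static void main(String[] args) {
        System.out.println(nextPrimeSize(200)); // 211 (199 < 200; 211 is next)
        System.out.println(nextPrimeSize(101)); // 101 is already prime
    }
}
```

For table sizes this small, the 2^-100 error bound is negligible in practice.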



+2


nextProbablePrime() is really intended for cryptographic operations that need to find (really) huge numbers that are "probably prime". The number returned by nextProbablePrime() is composite with a probability of at most 2^-100.



I don't think such a method is necessary for your needs. For reasonably small numbers (like the size of a hash table), you can simply check whether a number is prime with a straightforward home-grown implementation, especially since very high performance is not required here.
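A home-grown check along these lines is enough for table sizes, which fit comfortably in an `int`. This is a sketch using plain trial division; the names are illustrative:

```java
public class PrimeCheck {
    // Trial division up to sqrt(n): exact (not probabilistic) and plenty
    // fast for numbers in the int range used as table capacities.
    static boolean isPrime(int n) {
        if (n < 2) return false;
        if (n % 2 == 0) return n == 2;
        for (int d = 3; (long) d * d <= n; d += 2) {
            if (n % d == 0) return false;
        }
        return true;
    }

    // Smallest prime >= n.
    static int nextPrime(int n) {
        while (!isPrime(n)) n++;
        return n;
    }

    public static void main(String[] args) {
        System.out.println(nextPrime(200)); // 211
    }
}
```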

0


This is fine for your purpose.

Better would be to use a precomputed list of hash table sizes. You probably only need half a dozen or so; choose one based on the expected size of your hash table and move up to the next one if the load factor gets too large.
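A precomputed list might look like the sketch below. The sequence shown is one commonly used set of primes near powers of two; the class and method names are illustrative, not part of any library:

```java
public class TableSizes {
    // Precomputed prime capacities, roughly doubling at each step.
    // Primes near powers of two are a common choice for hash tables.
    static final int[] SIZES = {
        53, 97, 193, 389, 769, 1543, 3079, 6151,
        12289, 24593, 49157, 98317, 196613, 393241, 786433
    };

    // Smallest precomputed prime capacity that can hold `needed` entries.
    static int capacityFor(int needed) {
        for (int s : SIZES) {
            if (s >= needed) return s;
        }
        throw new IllegalArgumentException("table too large: " + needed);
    }

    public static void main(String[] args) {
        System.out.println(capacityFor(100)); // 193
    }
}
```

When the load factor exceeds your threshold, rehash into the next size in the array; no primality test is needed at runtime.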

The general lesson is to take every opportunity to reduce the complexity of your code. You are writing a hash table; you should not also have to worry about generating prime numbers.

0

