True hardware random number generators
True hardware random number generators (TRNGs) produce true random numbers based on genuinely random physical processes such as the roll of a die or the noise of a resistor or oscillator. The true randomness of these numbers is the basis of many cryptographic applications, e.g. the generation of PINs, TANs and cryptographic keys. They can also be used for many other applications such as stochastic resonance, genetic algorithms, Monte Carlo simulations, the surrogate data method, the creation of lottery numbers, video roulette, and many others.
A frequently emerging problem of hardware random number generators, as well as of pseudorandom number generators, are statistical anomalies, which can be partially characterized by the entropy (the average randomness of a single random number). The information-theoretic entropy alone is not meaningful, because it only incorporates the individual probabilities of the random numbers, but not conditional probabilities and the like. Furthermore, the entropy is not invariant under a change of basis, so the bit-wise entropy and the byte-wise entropy can have totally different values. That is the reason why there are so many different entropy tests and why, strictly speaking, not just one but many information-theoretic entropies exist.
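The basis dependence can be illustrated with a short sketch (the stream below is a contrived illustration, not the output of any real generator): a stream in which every byte is either 0x00 or 0xFF has a byte-wise entropy of only 1 bit per byte, while the bit-wise entropy is 1 bit per bit, the maximum.

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Empirical Shannon entropy in bits per symbol of a sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Contrived stream: each byte is 0x00 or 0xFF, equally often.
data = bytes([0x00, 0xFF] * 500)

byte_entropy = shannon_entropy(data)                 # per byte: 1 bit
bits = [(b >> i) & 1 for b in data for i in range(8)]
bit_entropy = shannon_entropy(bits)                  # per bit: 1 bit

# Byte-wise the stream shows only 1 of a possible 8 bits of entropy per
# byte; bit-wise each bit looks perfectly fair. The two views disagree.
```

The marginal bit distribution is uniform even though the bytes are grossly non-uniform, which is exactly why a single entropy figure is not meaningful without naming its basis.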
These statistical anomalies can have drastic consequences. Quote from a textbook:
"If all scientific paper whose results are in doubt because of bad rand()s were to disappear from library shelves, there would be a gap on each shelf about as big as your fist."
Numerical Recipes (in C), Chapter 7.1
Another problem is estimating the degree of non-computability, i.e. the true randomness, also called unpredictability or degree of nondeterminism.
Because a hardware random number generator is a stochastic automaton with a finite memory, its true randomness can be calculated with the theory of Markov chains.
The true randomness, more precisely the average randomness of a random number of a stochastic automaton, is known in this theory as the entropy cover. In contrast to the entropy, the entropy cover incorporates conditional probabilities, and its value is invariant under basis transformations.
In this theory a memory of length m means that a random number depends on the m previously produced random numbers.
The number of interior states is n^m (n to the power of m), with n = number of possible random numbers, e.g. (2^(16))^(1000) = 3.02E4816 for a 16-bit random number generator and a memory length of m=1000. Because of this huge number of states and the even greater number of transitions, the entropy cover can generally not be calculated. Another problem is that before this calculation the parameters, namely the ergodic probabilities of the states and the transition probabilities, have to be estimated. Additionally, the independence from outside influences has to be validated for all allowed parameter values. Because of Bell's theorem it is not necessary to worry about hidden variables.
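For a tractably small automaton the calculation itself is simple; the sketch below shows it for a hypothetical 2-state chain (the transition matrix and stationary distribution are toy values, not measurements of any real generator).

```python
from math import log2

def entropy_rate(P, pi):
    """Entropy rate of a stationary Markov chain in bits per symbol:
    H = -sum_i pi[i] * sum_j P[i][j] * ld(P[i][j]).
    This is the quantity the text calls the entropy cover."""
    return -sum(pi[i] * sum(p * log2(p) for p in row if p > 0)
                for i, row in enumerate(P))

# Toy 2-state chain: from state 0 the next symbol is fair (0.5/0.5),
# from state 1 it is biased (0.9/0.1).
P = [[0.5, 0.5],
     [0.9, 0.1]]
# Stationary distribution pi solves pi = pi * P; for this chain
# pi = (9/14, 5/14).
pi = [9 / 14, 5 / 14]

H = entropy_rate(P, pi)  # ~0.81 bits/symbol, below the 1-bit maximum
```

The real obstacle is not the formula but the size of the state space: with n^m states the matrix P cannot even be stored, let alone its parameters estimated.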
This effort, practically impossible due to the huge number of interior states and conditional probabilities, is the only reason why hardware random number generators have to be tested; in theory they could be calculated completely.
It is also the reason why true random numbers are often post-processed deterministically. A completely deterministic post-processing such as exclusive-oring with pseudorandom numbers (the so-called XOR mode) increases the entropy in most cases, but never the entropy cover. For cryptographic applications such post-processing is senseless and should be avoided, because it can only mask the weaknesses of the hardware but cannot remove them. In this way, such post-processing fools the user and feigns a randomness which is not really there.
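This can be demonstrated with a short sketch (the biased source and the seed values are hypothetical): XOR mode raises the measured entropy of a heavily biased stream to nearly 1 bit per bit, yet anyone who knows the PRNG seed can invert the mixing and recover the weak raw stream exactly, so no true randomness was added.

```python
import random
from collections import Counter
from math import log2

def entropy_per_bit(bits):
    """Empirical single-symbol Shannon entropy in bits per bit."""
    counts = Counter(bits)
    n = len(bits)
    return -sum(c / n * log2(c / n) for c in counts.values())

rng = random.Random(1)
# Hypothetical weak hardware source: bits heavily biased toward 1.
raw = [1 if rng.random() < 0.9 else 0 for _ in range(10000)]

# Deterministic after-treatment ("XOR mode"): XOR with a seeded PRNG.
prng = random.Random(42)
mixed = [b ^ prng.getrandbits(1) for b in raw]

low = entropy_per_bit(raw)     # ~0.47 bits/bit: clearly weak
high = entropy_per_bit(mixed)  # ~1.0 bits/bit: looks perfect

# But knowing the seed, the mixing is trivially undone -- the entropy
# cover of the mixed stream is no higher than that of the raw stream.
prng2 = random.Random(42)
recovered = [m ^ prng2.getrandbits(1) for m in mixed]
assert recovered == raw
```

The measured entropy improved; the unpredictability did not.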
Because the entropy cover can generally not be estimated, in practice only tests like an entropy test or the diehard tests can be performed, and only simplifying quantities like an entropy can be estimated.
In special cases an entropy and the entropy cover can be calculated. One example are pseudorandom number generators, because they have N interior states and all transition probabilities are 0 or 1. Therefore every term p*ld(p) in the defining formula of the entropy cover is zero, and so is the whole entropy cover.
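A minimal sketch of this special case: with all transition probabilities 0 or 1, every p*ld(p) term vanishes (using the convention 0*ld(0) = 0, and ld(1) = 0). The 3-state cycle below is a hypothetical stand-in for a PRNG's state machine.

```python
from math import log2

def entropy_rate(P, pi):
    """Entropy rate ('entropy cover') of a stationary Markov chain in
    bits per symbol; terms with p == 0 are skipped (0*ld(0) = 0)."""
    return -sum(pi[i] * sum(p * log2(p) for p in row if p > 0)
                for i, row in enumerate(P))

# Deterministic 3-state cycle: every transition probability is 0 or 1,
# as for any PRNG viewed as an automaton.
P = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]
pi = [1 / 3, 1 / 3, 1 / 3]  # uniform stationary distribution

H_cover = entropy_rate(P, pi)  # exactly 0: nothing is unpredictable
```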
As can be seen from the simple defining formula, the entropy is also zero when, and only when, the random numbers are designed (the number width chosen) so that each number spans an integer multiple M>0 of the period length P of the generator. For this the period P has to be finite, because otherwise the entropy would always be zero, even for nondeterministic random number generators. For pseudorandom number generators (PRNGs) with an infinite period, for example an algorithmically random sequence like the decimal places of a normal number, e.g. the Champernowne constant, the entropies are (close to) one, because every newly calculated decimal place is a surprise (it was unknown before), while the entropy cover is zero, because the decimal places are constant and exist independently of any calculation.
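The Champernowne example can be made concrete with a short sketch: the digit stream is produced by a trivial deterministic program (so its entropy cover, in the text's sense, is zero), yet its measured single-digit entropy is close to the maximum of ld(10) ≈ 3.32 bits per digit.

```python
from collections import Counter
from itertools import count, islice
from math import log2

def champernowne_digits():
    """Decimal places of the Champernowne constant 0.123456789101112..."""
    for n in count(1):
        yield from (int(d) for d in str(n))

digits = list(islice(champernowne_digits(), 100_000))

counts = Counter(digits)
n = len(digits)
H = -sum(c / n * log2(c / n) for c in counts.values())

# H is well above 3 bits/digit, approaching ld(10) ~ 3.32 -- near-maximal
# measured entropy from a completely deterministic, predictable source.
```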
In practice there is often no distinction made between pseudorandomness and true randomness (nondeterminism), and what should accurately be named entropy cover is simply named entropy. With this, no distinction is made between the appearance and the reality of randomness.
Generally the physical states do not coincide with the Markov states, but the physical states can be approximated by Markov states. With m growing to infinity, this approximation becomes arbitrarily accurate. That is why every random number generator is a Markov process and can be described with the theory of Markov chains.
The nondeterminism is the criterion for the information-theoretic security of the random numbers and is important, e.g., to prevent someone from calculating, or guessing with high probability, transaction numbers (TANs), even if that someone knows all previous TANs (e.g. from a recovered paper container). On this, another quote:
"There is a DEFINITE NEED FOR DATA WITH ENTROPY NOT BASED ON ALGORITHMS. As to whether we need *true* random numbers, that's a subject for debate. Typically we get data that's as good as we need it to be from a hardware generator by running the initial data (which may have some kind of pattern or bias) through an algorithm which sort of mixes it. But it is very useful to have this entropy come from a hardware device, because it affords an EXTRA LEVEL OF SECURITY; pseudorandom numbers can be compromised if the algorithm is known and the seed is discovered, but THERE IS NO SUCH EQUIVALENT FOR RANDOM NUMBERS GENERATED BY A HARDWARE DEVICE."
Moses Liskov (firstname.lastname@example.org), http://www.rsa.com/ , year 2000
To solve these two problems, producing random numbers without statistical anomalies and with high entropy and entropy cover, the below-mentioned hardware random number generators have been developed.
There are virtually free hardware random number generators like /dev/random (the standard hardware random number generator on Linux and other Unix derivatives) which also produce high-quality random numbers, but only at a speed of approximately 2 bytes/s, which is millions of times slower than the below-mentioned generators.
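Reading from /dev/random is straightforward; the sketch below (Linux/Unix only) pulls 16 bytes, e.g. for a 128-bit key. Note that on some systems /dev/random may block until the kernel has gathered enough entropy.

```python
# Read 16 bytes (128 bits) from the kernel's entropy source.
# Linux/Unix only: /dev/random may block when the entropy pool is low,
# while /dev/urandom never blocks but stretches the pool deterministically.
with open("/dev/random", "rb") as dev:
    key = dev.read(16)

print(key.hex())  # 32 hex characters, usable e.g. as a 128-bit key
```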
As tests, the FIPS, AIS, NIST, diehard and special tests, e.g. with a hardware spectrum analyser, were used. Because the diehard tests are usually stronger and more versatile than the other tests, they were used most often.