August 05, 2016

The Kolmogorov-Smirnov Test and RNGs

During a recent Web crawl, I came across the topic of pseudorandom number generators (PRGs). I’ve talked about ways in which they’re implemented in the past. There are plenty of deterministic PRGs that stretch a small seed with sufficiently high entropy into a much longer stream of bytes that is computationally indistinguishable from random. Nowadays, Bernstein’s ChaCha20 seems to be the most widely used algorithm. You might ask yourself why ChaCha20 is so prevalent. One reason is that it’s been thoroughly vetted as a secure stream cipher, and from a stream cipher one can build a PRG. The construction is simple: the key stream that would ordinarily be XORed with a plaintext message is instead used directly as the PRG’s byte stream output. Clearly, the cipher is only secure if this byte stream is indistinguishable from random. So if the cipher is secure, then we have a cryptographically secure PRG (CSPRG). The only caveat is that the initial state must be secret; otherwise someone could easily predict future output of the PRG. When used in the Linux kernel, this state is derived from entropy collected from the environment, so it is effectively secret.

A stream cipher as a CSPRG.
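To make the construction concrete, here is a minimal sketch using the third-party Python cryptography package (my own illustration, not from the post; the function name chacha20_prg and the fixed all-zero nonce are assumptions). Encrypting a run of zero bytes returns the raw key stream, which becomes the PRG output.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

def chacha20_prg(seed: bytes, nbytes: int) -> bytes:
    """Expand a 32-byte secret seed into nbytes of pseudorandom output."""
    assert len(seed) == 32
    # A fixed nonce is acceptable only because each seed is used exactly once.
    nonce = bytes(16)
    encryptor = Cipher(algorithms.ChaCha20(seed, nonce), mode=None).encryptor()
    # XORing the key stream with zeros yields the key stream itself.
    return encryptor.update(bytes(nbytes))

if __name__ == "__main__":
    seed = os.urandom(32)  # the secret initial state
    print(chacha20_prg(seed, 64).hex())
```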

For the sake of argument, assume we have a (CS)PRG $G$ that we want to use. Moreover, even if we have a security proof, we do not trust it. (Bear with me…) Instead, we only have faith in randomness that we can test. But how do we assess the randomness of $G$? Long ago (over two decades), George Marsaglia proposed a suite of statistical tests – called the Diehard tests [1] – aimed at measuring the effectiveness of random number generators. It included tests such as the “birthday spacings,” “overlapping permutations,” “minimum distance,” and so on. While the Diehard tests are no longer sufficient for assessing the randomness properties of PRGs like $G$, they’re a good place to start. (Modern test suites include DieHarder [2], TestU01 [3], and the NIST suite [4].)

Suppose the first test we want to run is the “minimum distance” test [5]. This works as follows. Let $t$ be the number of trials we conduct for this test. For each trial, pick $n$ random points in a square with sides of length $\ell$. Then, find the minimum distance $d$ between any pair of points. The Python code to do this is below.
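(The original listing is a sketch of mine: the defaults mirror Marsaglia’s original parameters, $n = 8000$ points in a square of side $10{,}000$, and the function names and brute-force closest-pair scan are illustrative choices.)

```python
import random

def min_distance_squared(n: int, side: float) -> float:
    """Drop n uniform points in a side-by-side square and return the squared
    distance between the closest pair (a simple O(n^2) scan)."""
    pts = [(random.uniform(0, side), random.uniform(0, side)) for _ in range(n)]
    best = float("inf")
    for i in range(n):
        x1, y1 = pts[i]
        for j in range(i + 1, n):
            x2, y2 = pts[j]
            d2 = (x1 - x2) ** 2 + (y1 - y2) ** 2
            if d2 < best:
                best = d2
    return best

def minimum_distance_test(trials: int, n: int = 8000, side: float = 10000.0) -> list:
    """Collect one squared minimum distance per trial. The quadratic scan is
    slow in pure Python at n = 8000; shrink n for quick experiments."""
    return [min_distance_squared(n, side) for _ in range(trials)]
```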

If the points are truly i.i.d. variables drawn from a uniform distribution, as we would expect for a random number generator, then $d^2$ should be exponentially distributed with a mean of $2\ell^2 / (\pi n(n-1))$.
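As a quick sanity check on that mean (my own sketch, not from the original post): there are $\binom{n}{2}$ pairs of points, each pair lands within distance $r$ of one another with probability roughly $\pi r^2 / \ell^2$, and the number of close pairs is approximately Poisson, so

$$\Pr[d > r] \approx \exp\!\left(-\binom{n}{2}\frac{\pi r^2}{\ell^2}\right) \quad\Longrightarrow\quad \Pr[d^2 > s] \approx e^{-s/\mu}, \qquad \mu = \frac{2\ell^2}{\pi n(n-1)}.$$

With Marsaglia’s original parameters ($n = 8000$, $\ell = 10{,}000$), this gives $\mu \approx 0.995$.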

So how do we actually test that the distribution of $d^2$ is exponential with the given mean? This is where the Kolmogorov-Smirnov (KS) test comes in handy. The KS test checks whether the distribution of some set of data matches a specific distribution. (In this case, the exponential distribution.) The test itself is rather elegant. It basically works as follows. Let $F_n$ be the empirical distribution function of the input data set, and let $F$ be the CDF of the known distribution. $F_n(x)$ is computed by iterating over the data set and counting the number of elements that are less than or equal to $x$. This count is then divided by the total number of elements in the set. (Its relation to the CDF should be clear, then.) Once this is obtained, the KS statistic $D_n$ is computed as [6]

$$D_n = \sup_x \left| F_n(x) - F(x) \right|.$$

That is, $D_n$ equals the maximal difference between the outputs of the $F_n$ and $F$ functions. If the distributions are identical, then $D_n$ converges to 0 as $n$ approaches infinity. So, given a finite data set, the distributions are compared by checking $D_n$ against a table of critical values. A confidence level $\alpha$ is also used to determine the error margin for this test. The larger the value of $\alpha$, the smaller the value of $D_n$ that is accepted (in the limit of large $n$). Or, put another way, as our confidence level increases, the acceptable difference between the empirical distribution function and the known CDF decreases.

I recognize that the KS test is implemented in most major statistical software tools, but let’s look at how we might implement it if we had to do so from scratch. The code is actually somewhat simple. We begin by computing the empirical distribution function $F_n$. This works by iterating over every unique sample $x$ and counting the number of elements that are less than or equal to $x$.
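A minimal sketch (the helper name ecdf is mine):

```python
def ecdf(samples):
    """Return the empirical distribution function F_n, where F_n(x) is the
    fraction of samples less than or equal to x."""
    n = len(samples)
    def F_n(x):
        # Count every sample <= x, then normalize by the sample size.
        return sum(1 for s in samples if s <= x) / n
    return F_n
```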

Now we need to compute the KS statistic given the two distributions $F_n$ and $F$. This is done by finding the maximum difference between the two distributions. Simple enough.
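A sketch of the statistic; since $F_n$ is a step function, the supremum is attained at one of its jumps, so it suffices to check both sides of each jump:

```python
def ks_statistic(samples, cdf):
    """Compute D_n = sup_x |F_n(x) - F(x)|."""
    data = sorted(samples)
    n = len(data)
    d = 0.0
    for i, x in enumerate(data, start=1):
        fx = cdf(x)
        # F_n jumps from (i - 1)/n to i/n at x.
        d = max(d, abs(i / n - fx), abs(fx - (i - 1) / n))
    return d
```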

The last step is to actually perform the test given some confidence level $\alpha$. I just hard-coded the KS test table into the code and compare the KS statistic against the critical value for $\alpha$. If the distributions are close, i.e., if the statistic is less than the corresponding value in the table, the test returns true. Otherwise, it returns false.
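Here’s what that might look like, with the standard large-sample critical values $c(\alpha)/\sqrt{n}$ hard-coded (most tables would call this $\alpha$ a significance level; I’m following the post’s usage):

```python
import math

# Standard asymptotic critical-value constants: reject the hypothesized
# distribution if D_n > c(alpha) / sqrt(n).
KS_TABLE = {0.10: 1.22, 0.05: 1.36, 0.025: 1.48, 0.01: 1.63}

def ks_test(samples, cdf, alpha: float = 0.05) -> bool:
    """Return True if the samples look consistent with cdf at level alpha."""
    critical = KS_TABLE[alpha] / math.sqrt(len(samples))
    return ks_statistic(samples, cdf) < critical
```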

So now we can finally get back to the question at hand: does the squared minimum distance $d^2$ from the minimum distance test follow the expected exponential distribution? To check this, I created the exponential CDF and ran the minimum distance test output through the KS test. As we would expect, the result was positive.
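Gluing the sketches above together might look like this (parameters scaled down from Marsaglia’s so the quadratic scan finishes quickly; the scaled square happens to give nearly the same mean, about 0.996):

```python
import math

def exponential_cdf(mean: float):
    """CDF of the exponential distribution with the given mean."""
    return lambda x: 1.0 - math.exp(-x / mean) if x > 0 else 0.0

if __name__ == "__main__":
    trials, n, side = 50, 800, 1000.0
    samples = minimum_distance_test(trials, n, side)
    mean = 2 * side ** 2 / (math.pi * n * (n - 1))  # about 0.996 here
    print(ks_test(samples, exponential_cdf(mean), alpha=0.05))  # expect True
```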

I’d like to explore other randomness tests in the future, but for now, this was a nice way to get started. Recently a paper was published entitled “PCG: A Family of Simple Fast Space-Efficient Statistically Good Algorithms for Random Number Generation” [7]. The accompanying website [8] has a lot of great information about related random number generators. I hope to read through this paper soon to catch up with the state of the art.

References