The middle-square method is one of the oldest PRNG algorithms, proposed by John von Neumann in 1949: square the current seed and take the middle digits of the result as the next value.

# The Algorithm

```python
def ms_next(seed, seed_len):
    """Square the seed and keep the middle seed_len digits."""
    squared = str(seed * seed).zfill(seed_len * 2)  # pad to twice the seed width
    mid = seed_len // 2
    return int(squared[mid:-mid])
```
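One step worked through by hand makes the digit extraction concrete (`ms_next` repeated here so the snippet runs standalone):

```python
def ms_next(seed, seed_len):
    # repeated from above so this snippet is self-contained
    squared = str(seed * seed).zfill(seed_len * 2)
    mid = seed_len // 2
    return int(squared[mid:-mid])

# 123456**2 == 15241383936; zero-padded to 12 digits it becomes
# "015241383936", and the middle six digits are "241383".
print(ms_next(123456, 6))  # -> 241383
```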

```python
def ms_numbers(seed, seed_len):
    """Yield an endless stream of middle-square numbers."""
    while True:
        seed = ms_next(seed, seed_len)
        yield seed
```

```python
for i, num in zip(range(5), ms_numbers(123456, 6)):
    print(i, num)
```

```
0 241383
1 265752
2 624125
3 532015
4 39960
```
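Before widening the seed, it is worth noting the well-known weakness of the middle-square method: the state space is tiny, so every sequence must eventually enter a cycle, and it frequently degenerates to zero. A quick sketch (generator redefined so it runs standalone; the 4-digit seed 1234 is an arbitrary choice):

```python
def ms_next(seed, seed_len):
    squared = str(seed * seed).zfill(seed_len * 2)
    mid = seed_len // 2
    return int(squared[mid:-mid])

def steps_until_repeat(seed, seed_len):
    """Iterate until a previously seen value comes back."""
    seen = set()
    steps = 0
    while seed not in seen:
        seen.add(seed)
        seed = ms_next(seed, seed_len)
        steps += 1
    return steps, seed

# With 4-digit state there are at most 10000 distinct values,
# so a repeat is guaranteed within 10001 steps.
steps, repeated = steps_until_repeat(1234, 4)
print(steps, repeated)
```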

# Increasing the seed space

A six-digit seed allows at most a million starting states, so let’s draw a much longer seed from /dev/urandom.

```python
import os

def rand_digit():
    """Return a uniform digit 0-9, using rejection sampling
    to avoid modulo bias (accept only bytes 0-249)."""
    while True:
        d = os.urandom(1)[0]
        if d >= 250:  # accepting 250 would make 0 slightly more likely
            continue
        return d % 10

seed_len = 50
seed = int("".join(str(rand_digit()) for _ in range(seed_len)))
seed
```

```
7378710975714809271419972422814068416462491488115
```
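Equivalently, the standard-library `secrets` module (which also reads from the OS entropy source) can build the seed without a hand-rolled rejection loop; a minimal sketch:

```python
import secrets

seed_len = 50
# secrets.randbelow(10) returns a uniform digit with no modulo bias
seed = int("".join(str(secrets.randbelow(10)) for _ in range(seed_len)))
print(seed)
```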

Let’s see if we can turn these numbers into random bits.

```python
def ms_bits(seed, seed_len):
    """Yield the lowest bit of each generated number."""
    for num in ms_numbers(seed, seed_len):
        yield num & 1

def ms_bytes(seed, seed_len):
    """Pack the bit stream into bytes, most significant bit first."""
    bits = ms_bits(seed, seed_len)
    while True:
        b = 0
        for _ in range(8):
            b <<= 1
            b |= next(bits)
        yield b

bits = ""
for i, bit in enumerate(ms_bits(seed, seed_len)):
    if i == 40:
        break
    bits += str(bit)
print(bits)
```

```
1100101111000110001001111100001001101011
```
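The byte packing can be sanity-checked against the deterministic demo seed 123456 from the first section (all helpers repeated so the snippet runs standalone):

```python
def ms_next(seed, seed_len):
    squared = str(seed * seed).zfill(seed_len * 2)
    mid = seed_len // 2
    return int(squared[mid:-mid])

def ms_bits(seed, seed_len):
    while True:
        seed = ms_next(seed, seed_len)
        yield seed & 1

def ms_bytes(seed, seed_len):
    bits = ms_bits(seed, seed_len)
    while True:
        b = 0
        for _ in range(8):
            b = (b << 1) | next(bits)
        yield b

# The first eight numbers for seed 123456 are 241383, 265752, 624125,
# 532015, 39960, 596801, 171433, 389273; their lowest bits, MSB first,
# form the first byte of the stream.
first = next(ms_bytes(123456, 6))
print(first, format(first, "08b"))
```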

# Estimating randomness quality

Let’s feed half a megabyte of output to the `ent` utility:

```python
import subprocess

# Requires the `ent` randomness test tool to be installed and on PATH.
ent = subprocess.Popen(["ent"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
buf = b""
for i, byte in enumerate(ms_bytes(seed, seed_len)):
    buf += bytes([byte])
    if i == 500_000:
        break
ent.stdin.write(buf)
ent.stdin.close()
print(ent.stdout.read().decode("ascii"))
```

```
Entropy = 7.999652 bits per byte.

Optimum compression would reduce the size
of this 500001 byte file by 0 percent.

Chi square distribution for 500001 samples is 241.47, and randomly
would exceed this value 71.91 percent of the times.

Arithmetic mean value of data bytes is 127.6098 (127.5 = random).
Monte Carlo value for Pi is 3.138780555 (error 0.09 percent).
Serial correlation coefficient is -0.002272 (totally uncorrelated = 0.0).
```

Not too bad: the entropy is within a hair of the ideal 8 bits per byte, and the chi-square result (exceeded by chance 71.91 percent of the time) shows no significant deviation from uniform.
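As a rough cross-check that needs no external tool, the entropy figure can be recomputed in pure Python over a smaller sample (helpers repeated for self-containment; the seed is the one printed above):

```python
import math
from collections import Counter

def ms_next(seed, seed_len):
    squared = str(seed * seed).zfill(seed_len * 2)
    mid = seed_len // 2
    return int(squared[mid:-mid])

def ms_bytes_n(seed, seed_len, n):
    """Collect n bytes from the low-bit stream, MSB first."""
    out = bytearray()
    b = nbits = 0
    while len(out) < n:
        seed = ms_next(seed, seed_len)
        b = (b << 1) | (seed & 1)
        nbits += 1
        if nbits == 8:
            out.append(b)
            b = nbits = 0
    return bytes(out)

seed = 7378710975714809271419972422814068416462491488115
data = ms_bytes_n(seed, 50, 50_000)
counts = Counter(data)
entropy = -sum(c / len(data) * math.log2(c / len(data)) for c in counts.values())
print(f"{entropy:.4f} bits per byte")
```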

# Useful links and references

- https://en.wikipedia.org/wiki/Middle-square_method