I implemented a random number generator using the linear congruential generator (LCG) algorithm. The formula I used is:
X_{n+1} = (a * X_n + c) mod m
For my parameters, I chose:
* a = 1664525
* c = 1013904223
* m = 2^32
* Seed = current system time in nanoseconds (to add non-determinism)
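For example, starting from X_0 = 1, a single update step with these parameters gives:

X_1 = (1664525 * 1 + 1013904223) mod 2^32 = 1015568748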
Code (Python)
```python
import time
import math
import zlib

# Linear Congruential Generator
class LCG:
    def __init__(self, seed=None):
        # Default seed: current system time in nanoseconds (non-deterministic runs)
        if seed is None:
            seed = time.time_ns()
        self.state = seed
        self.a = 1664525
        self.c = 1013904223
        self.m = 2**32

    def next(self):
        # Advance the state: X_{n+1} = (a * X_n + c) mod m
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

    def randint(self, low, high):
        # Map the raw state into [low, high] via modulo
        return low + self.next() % (high - low + 1)
```
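A note on the design of `randint`: the simple modulo mapping is slightly biased whenever `high - low + 1` does not divide m (here the bias is negligible, on the order of 101/2^32, but it grows with larger ranges). Below is a minimal sketch of an unbiased alternative using rejection sampling; `randint_unbiased` is a hypothetical helper, not part of the original class:

```python
def randint_unbiased(rng, low, high):
    # Reject raw values from the partial "tail" bucket so that every
    # residue class modulo span is equally likely.
    span = high - low + 1
    limit = rng.m - (rng.m % span)  # largest multiple of span not exceeding m
    x = rng.next()
    while x >= limit:
        x = rng.next()
    return low + x % span
```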
```python
# Generate 100,000 random numbers between 0 and 100
rng = LCG()
numbers = [rng.randint(0, 100) for _ in range(100000)]
```
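The entropy test below uses the standard Shannon formula H = -Σ p_i * log2(p_i), where p_i is the observed frequency of value i. For a perfectly uniform distribution over the 101 possible values, H reaches its maximum of log2(101) ≈ 6.66 bits.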
```python
# Shannon Entropy
def shannon_entropy(data):
    freq = {}
    for n in data:
        freq[n] = freq.get(n, 0) + 1
    entropy = 0
    for count in freq.values():
        p = count / len(data)
        entropy -= p * math.log2(p)
    return entropy
```
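As a quick sanity check (not part of the original post), the function behaves as expected on two degenerate inputs:

```python
print(shannon_entropy([7] * 1000))    # 0 bits: a constant sequence has no uncertainty
print(shannon_entropy([0, 1] * 500))  # 1.0 bit: one fair binary choice per element
```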
```python
entropy = shannon_entropy(numbers)

# Compression Test: values 0-100 fit in one byte each, so bytes() packs
# the list directly; incompressible data yields a ratio near 1.0
data_bytes = bytes(numbers)
compressed = zlib.compress(data_bytes)
compression_ratio = len(compressed) / len(data_bytes)

print("Shannon Entropy:", entropy)
print("Compression Ratio:", compression_ratio)
```
Results
* Shannon Entropy: ~6.63 bits (close to the theoretical maximum of log2(101) ≈ 6.66 for a uniform distribution over 101 values)
* Compression Ratio: ~0.99 (very close to 1.0; the byte stream is hard to compress, which is consistent with random-looking output)
Notes
* Using system time in nanoseconds as the seed ensures that each run produces a different sequence.
* The entropy and compression results suggest the generator's output is reasonably uniform and not easily compressible.
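Conversely, passing an explicit seed makes runs reproducible, which is handy for debugging (this just uses the `seed` parameter already in the class above):

```python
a = LCG(seed=42)
b = LCG(seed=42)
# Identical seeds produce identical sequences
assert [a.next() for _ in range(5)] == [b.next() for _ in range(5)]
```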