Modern cryptography relies heavily on mathematical principles. Here we explore some of the key formulae that underpin network security.
## RSA Key Generation
The security of RSA depends on the difficulty of factoring large semiprimes. Given two large primes $p$ and $q$, we compute:
$$n = p \cdot q$$
The totient is calculated as:
$$\phi(n) = (p - 1)(q - 1)$$
We then choose $e$ such that $1 < e < \phi(n)$ and $\gcd(e, \phi(n)) = 1$, and compute the private key $d$ where:
$$d \equiv e^{-1} \pmod{\phi(n)}$$
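As a sketch, the key-generation steps above can be traced with deliberately tiny primes. The values here (p = 61, q = 53, e = 17) are the classic textbook example and are purely illustrative; real keys use primes of 1024 bits or more.

```python
from math import gcd

# Toy RSA key generation -- small primes for illustration only.
p, q = 61, 53
n = p * q                  # modulus: n = p * q = 3233
phi = (p - 1) * (q - 1)    # totient: phi(n) = (p - 1)(q - 1) = 3120
e = 17                     # public exponent: 1 < e < phi(n), gcd(e, phi(n)) == 1
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private key: modular inverse of e (Python 3.8+)
print(n, phi, e, d)        # 3233 3120 17 2753
```

The three-argument form of `pow` with exponent `-1` computes the modular inverse directly, which is exactly the $d \equiv e^{-1} \pmod{\phi(n)}$ step above.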
## Encryption and Decryption
For a message $m$, encryption produces ciphertext $c$:
$$c = m^e \bmod n$$
Decryption recovers the original message:
$$m = c^d \bmod n$$
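Each operation is a single modular exponentiation. A minimal sketch with a textbook-sized key pair (n = 3233, e = 17, d = 2753, chosen purely for illustration):

```python
# Toy RSA encryption/decryption; pow(base, exp, mod) is modular exponentiation.
n, e, d = 3233, 17, 2753   # illustrative textbook key pair
m = 65                     # message encoded as an integer, 0 <= m < n
c = pow(m, e, n)           # c = m^e mod n
m2 = pow(c, d, n)          # m = c^d mod n
assert m2 == m
print(c, m2)               # 2790 65
```

Note that raw ("textbook") RSA like this is not used directly in practice; real protocols add padding schemes such as OAEP.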
## Information Entropy
Claude Shannon’s entropy formula measures the uncertainty in a random variable $X$:
$$H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)$$
This is fundamental to understanding the strength of passwords and keys. A truly random 128-bit key has an entropy of exactly 128 bits.
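As a concrete illustration, a heavily biased coin carries far less than one bit of uncertainty per flip:

$$H(X) = -(0.9 \log_2 0.9 + 0.1 \log_2 0.1) \approx 0.47 \text{ bits}$$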
## Diagram

Below is an example visualisation from a network analysis:

*[Figure omitted: sample network diagram from a traffic analysis]*
## Continuous Entropy
For continuous distributions, Shannon entropy generalises to differential entropy. Given a probability density function $f(x)$, the differential entropy is:
$$H(X) = -\int_{-\infty}^{\infty} f(x) \ln f(x) \, dx$$
For example, the differential entropy of a Gaussian distribution with variance $\sigma^2$ is:
$$H(X) = \frac{1}{2} \ln(2\pi e \sigma^2)$$
This result shows that among all distributions with a given variance, the Gaussian has the maximum entropy — a property exploited in cryptographic noise generation.
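As a quick sanity check, the closed form can be compared against a direct numerical evaluation of the differential-entropy integral. This is a sketch: the step size, integration range, and choice of $\sigma = 1.5$ are arbitrary illustrative values.

```python
import math

sigma = 1.5

def f(x):
    # Gaussian pdf with mean 0 and standard deviation sigma
    return math.exp(-x * x / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# Riemann sum of -f(x) ln f(x) over [-10, 10], well beyond +/- 6 sigma
dx = 1e-3
numeric = -sum(f(i * dx) * math.log(f(i * dx)) * dx
               for i in range(-10000, 10001))

closed = 0.5 * math.log(2 * math.pi * math.e * sigma**2)
print(numeric, closed)  # the two values agree to several decimal places
```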
## The Birthday Problem
The probability of a hash collision relates to the birthday paradox. For a hash function with $n$ possible outputs, the probability of at least one collision after $k$ hashes is approximately:
$$P(k, n) \approx 1 - e^{-k^2 / (2n)}$$
This means a 256-bit hash requires roughly $2^{128}$ attempts to find a collision, a result that directly informs minimum hash output lengths in modern protocols.
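The approximation is easy to evaluate directly. The sketch below uses a 32-bit hash as a tractable illustrative case and also solves for the roughly-50% collision point, $k \approx \sqrt{2n \ln 2}$:

```python
import math

def collision_prob(k: int, n: int) -> float:
    """Approximate probability of at least one collision after k hashes
    drawn from n possible outputs: P(k, n) ~ 1 - exp(-k^2 / (2n))."""
    return 1 - math.exp(-k * k / (2 * n))

n = 2**32                             # a 32-bit hash, for a tractable example
k50 = math.sqrt(2 * n * math.log(2))  # ~77,000 hashes for a ~50% collision
print(k50, collision_prob(round(k50), n))
```

Only about $2^{16.2}$ hashes, not $2^{31}$, are needed before a 32-bit collision becomes likely, which is why collision resistance requires hash outputs twice as long as the desired security level.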
## Computing Entropy in Practice
Here’s a simple Python implementation for calculating the Shannon entropy of a byte sequence:
```python
import math
import os
from collections import Counter


def shannon_entropy(data: bytes) -> float:
    """Calculate the Shannon entropy of a byte sequence, in bits per byte."""
    if not data:
        return 0.0
    counts = Counter(data)
    length = len(data)
    entropy = 0.0
    for count in counts.values():
        p = count / length
        entropy -= p * math.log2(p)
    return entropy


# Example: measure entropy of a string
message = b"correct horse battery staple"
h = shannon_entropy(message)
print(f"Entropy: {h:.4f} bits per byte")

# Compare with random data
random_data = os.urandom(256)
h_rand = shannon_entropy(random_data)
print(f"Random entropy: {h_rand:.4f} bits per byte")
```

A truly random byte stream will have an entropy close to 8.0 bits per byte, while natural language text typically falls between 3.0 and 5.0.