In 1937, Claude Shannon wrote what may be the most important master’s thesis of all time—and then, as if that weren’t enough, he went on to invent Information Theory. So why hasn’t Hollywood made a movie about his life? Maybe he wasn’t “flawed” enough to fit the mould of a tragic genius like John Nash or Alan Turing[1]. Whatever the reason, it’s a shame: because his story hasn’t been told on the big screen, few people today grasp the full weight of his ideas—especially entropy. And without that understanding, they miss what’s fundamentally broken about modern programming languages, particularly the "Object-Oriented" ones[2].
The Case for Entropy
In information theory, entropy quantifies uncertainty. A fair coin flip has 1 bit of entropy (complete uncertainty), while a rigged coin that always lands heads has 0 bits (total predictability). This scales to any set:
8 elements = 3 bits of entropy (coffee choices at your local cafe)
1024 elements = 10 bits (coffee choices at a hipster cafe)
This isn't just theory—it's why your computer uses 8-bit bytes and why crypto needs 256-bit keys.
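For a uniform distribution, those figures are simply log2 of the number of outcomes. A quick sanity check, using nothing but Python's standard library:
import math

def entropy_bits(n_outcomes: int) -> float:
    """Entropy in bits of a uniform choice among n equally likely outcomes."""
    return math.log2(n_outcomes)

print(entropy_bits(2))     # fair coin flip -> 1.0 bit
print(entropy_bits(8))     # coffee choices at your local cafe -> 3.0 bits
print(entropy_bits(1024))  # coffee choices at a hipster cafe -> 10.0 bits
print(entropy_bits(1))     # rigged coin, only one possible outcome -> 0.0 bits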
The Inheritance Paradox
Base classes represent broader categories (high entropy), while child classes should narrow possibilities (low entropy). Like describing a hit-and-run:
"A yellow car" = high entropy (many possibilities) - sedan, NY city taxi?
"A 2023 Maserati Gran Turismo" = lower entropy
With the license plate = 0 entropy (case closed!)
But here's where OOP fails us...
The Bug in Object-Oriented Programming
Consider:
class Car:
    def __init__(self, color: str):  # 24-bit color = high entropy
        self.color = color

class ModernCar(Car):
    def __init__(self, color: str, has_abs: bool):
        super().__init__(color)
        self.has_abs = has_abs  # +1 bit entropy
Wait—adding has_abs increases entropy! This violates the core principle that specialisation should reduce uncertainty.
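The arithmetic is easy to check: the colour field alone spans 2^24 values, and adding a boolean doubles the state space, so the "more specialised" class actually describes more possible objects (the figures come straight from the comments above):
import math

color_states = 2 ** 24                # 24-bit colour, as per the comment
modern_car_states = color_states * 2  # colour combined with has_abs

print(math.log2(color_states))        # 24.0 bits for Car
print(math.log2(modern_car_states))   # 25.0 bits for ModernCar - entropy went up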
This is a bug in all object-oriented programming, one that leads to a lot of programming errors and, consequently, to a lot more programming jobs than there need to be.
An Entropy-Aware Language
What would a programming language look like if it truly understood entropy? In the previous posts, we discussed this in terms of basic types without bringing up the hidden topic of entropy. We saw how:
Numeric types aren't just int or float - they're precise ranges:
Temperature = Numeric(0.0..100.0, precision=0.1) # 10 bits of entropy
Percentage = Numeric(0..100) # ~6.7 bits
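Numeric(...) is hypothetical syntax, but the idea can be roughed out in today's Python. In the sketch below, BoundedNumeric is a made-up name, and the entropy figure assumes every representable value is equally likely:
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class BoundedNumeric:
    """Stand-in for the hypothetical Numeric(lo..hi, precision=step)."""
    lo: float
    hi: float
    step: float = 1.0

    def cardinality(self) -> int:
        # Number of representable values in the range, inclusive of both ends.
        return int(round((self.hi - self.lo) / self.step)) + 1

    def entropy_bits(self) -> float:
        # Assumes a uniform distribution over the representable values.
        return math.log2(self.cardinality())

    def validate(self, value: float) -> float:
        if not (self.lo <= value <= self.hi):
            raise ValueError(f"{value} is outside {self.lo}..{self.hi}")
        return value

Temperature = BoundedNumeric(0.0, 100.0, 0.1)
Percentage = BoundedNumeric(0, 100)
print(Temperature.entropy_bits())  # ~9.97 bits
print(Percentage.entropy_bits())   # ~6.66 bits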
Strings become constrained sets:
Username = String(min=3, max=20, charset=alphanumeric)
Email = String(regex=r".+@.+\..+") # Still too permissive, but better
Take that, Little Bobby Tables!
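The same reasoning can be sketched for strings with a couple of hypothetical helpers: the length and charset constraints put a hard ceiling on the entropy of Username, and the Email regex is at least enforced at the boundary rather than taken on trust:
import math
import re
import string

# Upper bound on Username entropy: alphanumeric, length 3..20, assumed uniform.
charset = string.ascii_letters + string.digits             # 62 characters
username_count = sum(len(charset) ** k for k in range(3, 21))
print(math.log2(username_count))                           # ~119 bits at most

EMAIL = re.compile(r".+@.+\..+")

def validate_email(s: str) -> str:
    # Rejects anything the (still too permissive) pattern doesn't fully match.
    if not EMAIL.fullmatch(s):
        raise ValueError(f"not an email address: {s!r}")
    return s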
Identifiers maintain their valid value sets:
UserIDs = Set[int](existing_users) # Only assigned IDs allowed
The type system tracks both the possible values (set members) and their probability distribution.
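A rough Python stand-in for that (IdSet is a made-up name): the member set gates assignment, and the entropy can be read off either from the set itself or from an observed usage distribution:
import math
from collections import Counter

class IdSet:
    """Stand-in for the hypothetical Set[int](existing_users)."""
    def __init__(self, members):
        self.members = frozenset(members)

    def validate(self, value: int) -> int:
        if value not in self.members:
            raise ValueError(f"unknown id: {value}")
        return value

    def entropy_bits(self, observed=None) -> float:
        # Uniform over the member set unless an observed distribution is supplied.
        if not observed:
            return math.log2(len(self.members))
        counts = Counter(observed)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

UserIDs = IdSet([101, 102, 103, 104])
print(UserIDs.entropy_bits())                      # 2.0 bits, uniform
print(UserIDs.entropy_bits([101, 101, 101, 102]))  # ~0.81 bits: skewed usage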
Relationships get first-class treatment:
owner: Mandatory[User] # 1:1 required relationship
tags: Optional[Set[Tag]] # 0..* optional relationship
The compiler enforces null checks and existence guarantees.
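Mandatory and Optional are hypothetical markers; the closest current Python gets is a dataclass checked by a tool such as mypy. A hedged sketch with made-up User, Tag and Post types:
from dataclasses import dataclass
from typing import Optional, Set

@dataclass(frozen=True)
class User:
    id: int

@dataclass(frozen=True)
class Tag:
    name: str

@dataclass
class Post:
    owner: User                      # mandatory 1:1 - no default, must be supplied
    tags: Optional[Set[Tag]] = None  # optional 0..* - may be absent

post = Post(owner=User(id=1))                        # fine
tagged = Post(owner=User(id=1), tags={Tag("news")})  # fine
# Post(tags={Tag("news")}) fails at construction time with a missing argument,
# and a static checker reports it before the code even runs.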
Here’s what we have been building up towards.
Complex Types
Instead of top-down inheritance:
# Traditional OOP
class Vehicle: ...
class Car(Vehicle): ...
We have bottom-up class derivation:
# Entropy-aware modelling
cars = [Sedan(...), Coupe(...), Truck(...)]
Vehicle = generalise(cars) # Automatically derives minimal interface
The compiler:
Analyses actual usage patterns
Calculates the optimal abstraction level
Generates type constraints that preserve semantic meaning
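The generalise() call above is hypothetical, but its flavour can be sketched in plain Python by intersecting the fields of concrete instances (Sedan, Coupe and Truck below are invented for the example; a real compiler pass would also weigh usage patterns and entropy):
from dataclasses import dataclass, fields, make_dataclass

@dataclass
class Sedan:
    color: str
    doors: int
    boot_litres: int

@dataclass
class Coupe:
    color: str
    doors: int

@dataclass
class Truck:
    color: str
    doors: int
    payload_kg: int

def generalise(instances, name="Generalised"):
    # Keep only the (name, type) pairs shared by every instance.
    common = None
    for obj in instances:
        pairs = {(f.name, f.type) for f in fields(obj)}
        common = pairs if common is None else common & pairs
    return make_dataclass(name, sorted(common))

cars = [Sedan("red", 4, 510), Coupe("blue", 2), Truck("white", 2, 1200)]
Vehicle = generalise(cars, "Vehicle")
print([f.name for f in fields(Vehicle)])  # ['color', 'doors']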
If this seems radical, it’s only because OOP trained us to think backward. We don’t start with abstract Vehicle classes—we discover them by generalising concrete Tesla and GranTurismo objects, just like children learn "dogs" after meeting Fido and Spot.
Here’s the kicker: This is exactly how machine learning works.
So why do we force programmers to hand-carve rigid class hierarchies when we could:
Let the compiler infer types from real usage (like GPT deduces grammar)
Treat inheritance as a compression problem (optimal entropy reduction)
Make the type system learn constraints from unit tests (hello, differentiable programming)
The future isn’t vibe coding—it’s programming languages that evolve alongside their databases.
[1] Or maybe he was a part of the Deep State rather than a victim of it?
[2] SQL is pretty screwed up too, but that’s a story for another time.