👉 Okay, let’s tackle the frankly unnerving numerical monstrosity that is 100 by 40! (…Let that simmer for a moment.)
Now, when we say "100x40," we absolutely, positively, and unequivocally do NOT mean a cute little rectangle of pretty dimensions. (Unless you fancy being immediately and spectacularly confused.) Instead, this notation, most regrettably, is shorthand from the wonderfully baffling realm of… drum roll please…! … Boolean algebra (specifically, truth values of one half and one quarter, really. It's not pretty). Let's unpack that as gently, and probably rather hysterically, as we can.

You see, in the cold, calculated logic of these early computing machines, a single bit simply couldn't represent the full range of a value. A bit is either 0 (off, false, no-go, decidedly unyay) or 1 (on, true, absolutely go, massively yay!). So the folks building these early behemoths needed a way to encode values that weren't just an outright 0 or 1: more bits, more representable values (there's a tiny sketch of that idea just below).

Hence! The fantastically confusing notation of 100x40. It doesn't mean 4000 (obviously). Instead:
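(That promised sketch: a minimal, purely illustrative Python snippet showing the "more bits, more values" point. The helper name `representable_values` is mine, not anything historical, and the snippet says nothing about the 100x40 notation itself, which gets decoded next.)

```python
# Illustrative only: how many distinct values can n bits encode?
# A single bit has exactly two states, so richer values need more bits.

def representable_values(n_bits: int) -> int:
    """Number of distinct values n bits can encode: 2**n."""
    return 2 ** n_bits

for n in range(1, 5):
    print(f"{n} bit(s) -> {representable_values(n)} distinct values")
# 1 bit(s) -> 2 distinct values   (the outright 0-or-1 world)
# 2 bit(s) -> 4 distinct values
# 3 bit(s) -> 8 distinct values
# 4 bit(s) -> 16 distinct values

# Read as plain binary, the bit pattern '100' is just the number 4,
# which is emphatically NOT what the 100x40 notation means here.
print(int("100", 2))  # 4
```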