In 1995, Turing laureate Niklaus Wirth wrote an essay called A Plea for Lean Software in which he mostly gripes about the state of software at the time. Among these gripes is this claim, which Wirth attributes to his colleague Martin Reiser, though it has come to be known as Wirth's Law:
Software is getting slower more rapidly than hardware becomes faster.
Doing his best Grandpa Simpson impersonation, Wirth complains:
About 25 years ago, an interactive text editor could be designed with as little as 8,000 bytes of storage. (Modern program editors request 100 times that much!) An operating system had to manage with 8,000 bytes, and a compiler had to fit into 32 Kbytes, whereas their modern descendants require megabytes. Has all this inflated software become any faster? On the contrary. Were it not for a thousand times faster hardware, modern software would be utterly unusable.
Aside from the numbers involved here, which must sound utterly preposterous to the average modern reader, there's a lot to relate to. My 25-year career in software, all of which happened after 1995, was in many ways a two-part story about Wirth's Law: an action and a reaction.
Personally, I disagree with Wirth's conclusion that nothing of value had been gained for the loss in efficiency. When he laments "the advent of windows, cut-and-paste strategies, and pop-up menus, [..] the replacement of meaningful command words by pretty icons", he is not properly appreciating the value these features had in making computing accessible to more people, and is focusing too much on their runtime cost. Programmers are often too quick to judge software on its technical merits rather than its social ones.
Wirth passed away 2 years ago, but he was a giant in the field of Computer Science and a huge inspiration to me and to many of my other inspirations. In many ways, my focus on simplicity and my own system design sensibilities find their genesis with him.
Wirth's Law is so self-evidently true that it has been a topic of continual investigation and rediscovery.
A notable example of this was Dan Luu's great post on input lag back in 2017. He felt that input latency was getting worse over time, so he got a high-speed camera and measured the delay between pressing a key and the letter appearing on screen across a lot of different hardware. The lowest-latency computer he measured was the Apple 2e, from 1983.
Input latency has gone up since 1983 because there is a lot more software involved in the pipeline for handling input. The kind of hardware-interrupt-based input handling the Apple 2e used is not flexible enough to meet modern requirements, so we've layered on more software. That additional complexity buys us a lot of value... but it's certainly not free, and if you're not careful, one of the costs is latency.
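To make that concrete, here's a minimal sketch of how small per-stage delays compound into end-to-end keypress-to-pixel latency. The stage names and millisecond figures are hypothetical placeholders chosen for illustration; they are not measurements from Dan Luu's post (he measured the whole pipeline at once with a camera) or from any real system.

# Illustrative only: hypothetical stages and latencies, not measured values.
# The point is structural: many modest per-stage delays in a modern input
# pipeline add up, whereas direct interrupt handling had far fewer hops.

pipeline_ms = {
    "input device polling":        4.0,   # hypothetical
    "kernel input driver":         1.0,   # hypothetical
    "windowing system event loop": 5.0,   # hypothetical
    "application / editor logic":  8.0,   # hypothetical
    "compositor frame":            8.0,   # hypothetical
    "display refresh + scanout":  12.0,   # hypothetical
}

for stage, ms in pipeline_ms.items():
    print(f"{stage:<30} {ms:5.1f} ms")
print(f"{'end-to-end (sum of stages)':<30} {sum(pipeline_ms.values()):5.1f} ms")

No single stage looks expensive on its own; the latency comes from stacking them.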