
I Started Programming When I Was 7. I'm 50 Now, and the Thing I Loved Has Changed

I wrote my first line of code in 1983. I was seven years old, typing BASIC into a machine that had less processing power than the chip in your washing machine. I understood that machine completely. Every byte of RAM had a purpose I could trace. Every pixel on screen was there because I’d put it there. The path from intention to result was direct, visible, and mine.

Forty-two years later, I’m sitting in front of hardware that would have seemed like science fiction to that kid, and I’m trying to figure out what “building things” even means anymore.

This isn’t a rant about AI. It’s not a “back in my day” piece. It’s something I’ve been circling for months, and I think a lot of experienced developers are circling it too, even if they haven’t said it out loud yet.

The era that made me

My favourite period of computing runs from the 8-bits through to about the 486DX2-66. Every machine in that era had character. The Sinclair Spectrum with its attribute clash. The Commodore 64 with its SID chip doing things the designers never intended. The NES with its 8-sprite-per-scanline limit that made developers invent flickering tricks to cheat the hardware. And the PC — starting life as a boring beige box for spreadsheets, then evolving at breakneck pace through the 286, 386, and 486 until it became a gaming powerhouse that could run Doom. You could feel each generation leap. Upgrading your CPU wasn’t a spec sheet exercise — it was transformative.

These weren’t just products. They were engineering adventures with visible tradeoffs. You had to understand the machine to use it. IRQ conflicts, DMA channels, CONFIG.SYS and AUTOEXEC.BAT optimisation, memory managers — getting a game to run was the game. You weren’t just a user. You were a systems engineer by necessity.

And the software side matched. Small teams like id Software were going their own way, making bold technical decisions because nobody had written the rules yet. Carmack’s raycasting in Wolfenstein 3D, the VGA Mode X tricks in Doom — these were people pushing against real constraints and producing something genuinely new. Creative constraints bred creativity.

Then it professionalised. Plug and Play arrived. Windows abstracted everything. The Wild West closed. Computers stopped being fascinating, cantankerous machines that demanded respect and understanding, and became appliances. The craft became invisible.

But it wasn’t just the craft that changed. The promise changed.
