Note: I don’t usually write speculative posts like this. But AI is rapidly changing the way we build software, and I’ve been thinking more about where all this might be headed. So this isn’t a prediction or a manifesto, just some thoughts I’ve been kicking around.
Throughout my career, I have had the pleasure (and sometimes the pain) of working with various programming languages and paradigms. Recently, like everyone else, I have been leveraging large language models (LLMs), and user interfaces to them like GitHub Copilot, Cursor, Claude Code, …, to increase my output. Yet I get skeptical when I read claims like “English will be the only programming language you’ll ever need”. What this really means, especially the “ever” part, is: if AI can translate our English descriptions into working code, do we still need programming languages at all?
Sure, AI does an impressive job of translating natural language into code. And sure, if you can express your intent clearly in English (or any other natural language), you can get a lot done; I know I do!
Even with all that, I think that one aspect of programming will remain essential: debugging. No matter how good AI gets at generating code and even at debugging it, we’ll still need to understand what that code actually does when it doesn’t work as expected. And for that, we need programming languages. Not necessarily for writing the initial code, but for reading, tracing, and reasoning about it when things go wrong.
When Natural Language Falls Short
There’s a classic joke that my brother loves: a software engineer’s partner asks him to go to the store and get milk, and if there are eggs, bring twelve! The engineer comes back with twelve bottles of milk. When asked why, he says, “They had eggs.”
For (most) humans, this kind of ambiguity is resolved through context and common sense, but computers require unambiguous instructions. In programming, such ambiguity can lead to catastrophic failures rather than just humorous misunderstandings.
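To make that concrete, here is a minimal sketch in C, with made-up names (have_eggs, buy_milk, buy_eggs) standing in for the shopping trip, of the two possible readings of that request. Once it is code, you have to commit to exactly one of them:

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical stand-ins for the shopping trip; all names here are made up. */
static bool have_eggs(void) { return true; }                          /* the store had eggs */
static void buy_milk(int bottles) { printf("milk x%d\n", bottles); }
static void buy_eggs(int count)   { printf("eggs x%d\n", count); }

/* Reading 1 -- what the partner meant: one milk, plus a dozen eggs if available. */
static void intended(void) {
    buy_milk(1);
    if (have_eggs())
        buy_eggs(12);
}

/* Reading 2 -- what the engineer heard: egg availability decides how much milk. */
static void literal(void) {
    buy_milk(have_eggs() ? 12 : 1);
}

int main(void) {
    intended(); /* prints: milk x1, eggs x12 */
    literal();  /* prints: milk x12 */
    return 0;
}

Both functions are plausible interpretations of the same English sentence, but each one does exactly one thing, every time.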
Programming languages eliminate this ambiguity by design. When we write:
for (int i = 0; i < 10; ++i) {
    printf("%d\n", i);
}