Learning to Think Again, and the Cost of AI Dependency.
There are so many (hype or boring) posts about AI coming out every day. It’s OK to use it, and everyone does, but still learn your craft, and try to think.
Similar to what DHH said:
It’s also more fun to be competent in something than constantly waiting for an AI to complete it.
The probability that AI will make us unhappy is very high, IMO. Use it, yes, but not for every task: for discovering, for creating a historical overview, or for creating diagrams (Canva, Figma), but a big no to the writing (or coding) itself. Someone needs to add knowledge and new insights; AI cannot train itself. So articles, books, and words will still be written, and writers will be even more in demand as everyone relies on AI, which at some point just plateaus.
It will be a long-term loss: people stop thinking and learning. Time will tell. My two cents: if you are a senior in something, you know better. Bsky
# Guidance on When to Use It
I heard from ThePrimeagen: it depends on how far into the future you are fixing things. Short-term autocomplete is fine, but architectural decisions are a big no-no.
Think of a chart with time on the horizontal axis and the number of errors on the vertical axis: the further into the future we let AI fix something, like an architecture, the more errors it will produce.
If we use it for quick autocomplete or for creating a well-defined algorithm function, it’s less prone to errors. In that first phase you gain 20% productivity; in the later phases you lose more.
This is like real life: the longer I wait before making a decision, the more information I have and the better the decision will be. It is exactly what Shape Up preaches with deciding at most six weeks (one cycle) ahead: don’t keep roadmaps and backlogs that reach further into the future than that. It’s similar with AI, since everything it produces is predicted probability.
Another great illustration by Forrest Brazeal:
Also keep in mind what’s most important for your use case, as illustrated by Thomas Ptacek in My AI Skeptic Friends Are All Nuts · The Fly Blog:
# Soulless
Nobody wants to read soulless text. And even if it’s good, where do you get more from? I think this is a big trap that people will only realize over time. Sure, these tools help, and everyone will use them for “certain” tasks, but not for the writing itself.
In the end, LLMs and AI require guidance; they’re just probabilities. See also Writing Manually.
# Distraction
I think we will be more distracted than ever. We can’t even have 2 seconds to think before Grammarly, Copilot, or Cursor suggests something. So instead of doing the thinking, we just cruise on. We are losing the driver’s seat.
It brings me back to the article I recently wrote about «Finding Flow». More on Don’t use AI for everything, you stop thinking-learning, AI Use, and Writing is Hard.
# Don’t Get Me Wrong
Don’t get me wrong, I use it every day, too, but more deliberately. I turned off my Grammarly and my Copilot (a long time ago), so I have the space to think and to learn. If you use it once or twice, that’s OK, but if you use it everywhere, you also lose the ability to learn new skills, and the fun of it.
The idea of LCI (LLM Collaborative Intelligence) is interesting, and sure, there will be a lot of benefits, but I’m not sure these insights come anywhere close to a human insight from someone who has felt, sensed, or experienced something through hardship. So yes, but I don’t have high expectations, nor do I want it to create new insights. That’s the fun part of my job :)
# Exercising a Skill
It’s never all or nothing; it’s in between. The problem with learning is that if you use it often, I’d argue you, in fact, don’t learn much. You just copy and paste in writing, or just tab-tab-tab in coding. The learning is gone. Do that often enough, and our brain isn’t used to learning or, more critically, thinking anymore. Same with remembering: how well can we remember mobile phone numbers? Not really, but I could very well during the early phone days because I trained it every day.
It’s all a matter of exercise, and I learned for myself—it doesn’t have to be true for everyone—that I wasn’t learning or thinking anymore. And frankly, it also wasn’t fun anymore. That applies to the stuff I know well.
In other areas, like creating an image (like the one I created for this article 😆) or updating my website’s front page with HTML/CSS, which would have taken me much longer since I’m out of practice, it helped a lot. But I’d argue I didn’t learn anything new except prompting Claude Code :). It’s all tradeoffs, as always, right? :)
# Other Opinions
# Paul Graham on Writing
Paul Graham says on Writes and Write-Nots (internal):
The result will be a world divided into writes and write-nots. There will still be some people who can write.
Yes, it’s bad. The reason is something I mentioned earlier: writing is thinking.
In fact there’s a kind of thinking that can only be done by writing.
If you’re thinking without writing, you only think you’re thinking.
So a world divided into writes and write-nots is more dangerous than it sounds. It will be a world of thinks and think-nots.
# Nathan Baugh
Nathan Baugh shares on About AI and ghostwriting:
1st Order Effect: The world will be overrun with slop content and stories. We already see this. Just look at AI written comments on this platform.
2nd Order Effect: People will stop learning the foundational skills – storytelling, writing, rhetoric – required to communicate their experiences and ideas effectively. They will over rely on AI. It starts as a tool, becomes a crutch, and ends as a hindrance.
3rd Order Effect: People who invest in those same skills see massive returns.
Writing sharpens your ideas. Story gives leverage to those ideas.
# Ted Gioia
The good news, and why AI won’t replace writers, by Ted Gioia on 2024-08-31. Some of the reasons why he thinks AI Writing won’t be as good:
Source on Twitter/X. Full article Google Thinks Beethoven Looks Like Mr. Bean - by Ted Gioia.
# Mitchell Hashimoto
2.5 years into the AI craze, and I continue to firmly believe that if your company wasn’t already interesting/succeeding without AI, then doing “whatever plus AI” isn’t going to save you. For the few that seem this way (eg Cursor), I think their moat is a lot weaker than it seems. You have to play the game and the game is AI, but I don’t think it’s a defensible foundational capability. Might play out as an excellent land and grab strategy to buy you time to fill out the meat though. Mitchell Hashimoto on Twitter
# Andrew Ng
Andrew Ng on Twitter/X:
Some people today are discouraging others from learning programming on the grounds AI will automate it. This advice will be seen as some of the worst career advice ever given. I disagree with the Turing Award and Nobel prize winner who wrote, “It is far more likely that the programming occupation will become extinct […] than that it will become all-powerful. More and more, computers will program themselves.” Statements discouraging people from learning to code are harmful!

In the 1960s, when programming moved from punchcards (where a programmer had to laboriously make holes in physical cards to write code character by character) to keyboards with terminals, programming became easier. And that made it a better time than before to begin programming. Yet it was in this era that Nobel laureate Herb Simon wrote the words quoted in the first paragraph. Today’s arguments not to learn to code continue to echo his comment.

As coding becomes easier, more people should code, not fewer! Over the past few decades, as programming has moved from assembly language to higher-level languages like C, from desktop to cloud, from raw text editors to IDEs to AI assisted coding where sometimes one barely even looks at the generated code (which some coders recently started to call vibe coding), it is getting easier with each step.

I wrote previously that I see tech-savvy people coordinating AI tools to move toward being 10x professionals — individuals who have 10 times the impact of the average person in their field. I am increasingly convinced that the best way for many people to accomplish this is not to be just consumers of AI applications, but to learn enough coding to use AI-assisted coding tools effectively.
One question I’m asked most often is what someone should do who is worried about job displacement by AI. My answer is: Learn about AI and take control of it, because one of the most important skills in the future will be the ability to tell a computer exactly what you want, so it can do that for you. Coding (or getting AI to code for you) is a great way to do that.

When I was working on the course Generative AI for Everyone and needed to generate AI artwork for the background images, I worked with a collaborator who had studied art history and knew the language of art. He prompted Midjourney with terminology based on the historical style, palette, artist inspiration and so on — using the language of art — to get the result he wanted. I didn’t know this language, and my paltry attempts at prompting could not deliver as effective a result.

Similarly, scientists, analysts, marketers, recruiters, and people of a wide range of professions who understand the language of software through their knowledge of coding can tell an LLM or an AI-enabled IDE what they want much more precisely, and get much better results. As these tools are continuing to make coding easier, this is the best time yet to learn to code, to learn the language of software, and learn to make computers do exactly what you want them to do.
# Harry Dry
Big ideas are less about creativity and more about conviction. [..] So, what happened? ‘Sauce and other shit’ got incredibly cheap! [..] There is no AI prompt for conviction. Harry Dry
^64403f
More on Is AI solving this?.
# Jason Fried
As Jason Fried said, initially it’s magical, but after a while you see it clearly and it’s just average:
Cover letters? Yes!
The hardest thing is not making something.
The hardest thing is maintaining something.
It’s become so easy to just make stuff and vomit out ideas, and I mean this in the best possible way… Jason Fried on LinkedIn
This is another valid insight: it’s hard to maintain code that wasn’t made by you, and it loses its fun. Therefore, a big part of a winning business will be having the sustainability and the energy to want to maintain a product, and not “just making it”.
Also, who takes responsibility for the generated (vibed) code?
# David Perell
David Perell has views similar to mine on soullessness:
When you outsource your writing to AI, you end up with words that lack soul or personality. Gone go your quirks and your idiosyncrasies, which are the very things that make your writing irreplaceable. LinkedIn
# Ezra Klein
Ezra Klein has great insights on writing that I very much align with. He says there are no shortcuts for research: when you grapple with a text or a book for seven hours, it changes you. That will influence your writing, too. No summary gives you this kind of in-depth connection.
Also, you can’t prompt your way into it, because there’s no prompt that knows what you don’t know yet, and the AI doesn’t know what you would have wanted to read or which connections you would have made. Sure, reading the full thing costs time, but over time we start to think we’ve read lots of stuff when we’ve actually only read summaries. Full episode on The Case Against Writing with AI.
# Will It Replace X
# Writers
Are cover letters still a thing? Yes. This reminded me that good writing is key for every job these days. Writing has always been an asset, but even more so now; people think they don’t need it anymore because AI does it for them. But that’s a very dangerous bet, one I wouldn’t take.
I wrote more on Writing Manually.
# Data Engineers?
Probably not.
Nice comparison by Mehdi Ouazza:
Did the music record replace musicians 100 years ago? Nope, it changed them and the industry.
Did cloud computing take all IT jobs? Nope, it also changed the industry and our jobs.
Same here; it will change our industry and our jobs, but we won’t disappear.
More on Will AI replace Data Engineers.
# Image Generation
Initial generation, yes. But final touch, no. Whenever I try to create images with AI, I am always initially impressed, but that quickly fades over time.
Yesterday, I updated my second brain image, but I changed it again today. I created some more with AI: prompted, prompted, prompted. In the end, I made one manually based on my copy. I think it’s more powerful. What do you think?
# ChatGPT
# My Own
There are some AI-generated images I like too, but they were always missing something, and yeah, they looked so AI-generated. I started to feel the same way I did about AI writing and AI data engineering (Will AI replace Data Engineers): with AI image generation too, doing it yourself is more fulfilling, and you end up happier.
More on AI Generated Images.
# How to detect AI Writing
How to detect AI Writing
If we know how AI writes, should we stop using em dashes or other things AI does?
I don’t think so. I love the em dash. I even have a keyboard shortcut for the em dash. And sometimes when I write a negation, I’m thinking «could that look like it’s written by AI».
But in the end, conviction is a good word. I can’t focus on what an AI would think while I write; I must write. So having something to say, and trying my best to communicate that, is the best I can do. ^ebca60
# History Logs
What AI Writing can’t do, because it can only think one word at a time:
E.g., in the example below, as a writer you know you want to start all the sentences the same way, but the AI model can’t plan for that.
Writing from Abundance is the art of collecting ideas so you can think better and avoid writer’s block.
Writing from Conversation is the art of using dialogue to identify your best ideas and double down on them.
Writing in Public is the art of broadcasting your ideas to the Internet so you become a beacon for people, opportunities, and serendipity.
More on Copywriting.
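To make the «one word at a time» point concrete, here is a minimal, hypothetical Python sketch of greedy next-word generation. The tiny lookup table stands in for a real model (real LLMs predict tokens, not words, and sample from much richer distributions); it only illustrates that each word is chosen from what came before, with no plan for the whole paragraph.

```python
# A toy, hypothetical sketch of greedy next-word generation.
# The "model" here is just a lookup table of next-word probabilities;
# the point is that text is produced one step at a time.

toy_model = {
    ("writing", "from"): {"abundance": 0.6, "conversation": 0.4},
    ("from", "abundance"): {"is": 0.9, "means": 0.1},
    ("abundance", "is"): {"the": 0.8, "an": 0.2},
}

def next_word(context):
    """Pick the most likely next word given only the last two words."""
    candidates = toy_model.get(tuple(context[-2:]), {})
    if not candidates:
        return None  # the toy model has nothing more to say
    return max(candidates, key=candidates.get)

def generate(prompt, max_words=10):
    words = prompt.lower().split()
    for _ in range(max_words):
        word = next_word(words)
        if word is None:
            break
        words.append(word)  # each choice only sees what came before
    return " ".join(words)

print(generate("Writing from"))  # -> "writing from abundance is the"
```

Nothing in that loop lets the model commit up front to starting every sentence with «Writing …»; any parallel structure has to emerge word by word.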
# AI Slop - Companies not doing great
AI Slop means generating more content, no matter the quality. It’s the never-ending quality-vs-quantity discussion, but now more important than ever.
Here are some companies backpedaling after going full AI-first:
Klarna backpedaling on AI customer service: “After years of depicting Klarna as an AI-first company, the fintech’s CEO reversed himself, telling Bloomberg the company was once again recruiting humans after the AI approach led to “lower quality.” An IBM survey reveals this is a common occurrence for AI use in business, where just 1 in 4 projects delivers the return it promised and even fewer are scaled up.”
Duolingo getting worse with AI
Next up: Shopify, after its announcement to go full AI?
# Learning With AI
Learning with AI
# Future
Nice insights on why LLMs as token predictors are not so good at understanding the world. It kind of works (though not that well) for writing, but understanding physics and world models is much harder, he says: Metas AI Boss Says He DONE With LLMS…
# Further Reads
Origin: Artificial General Intelligence
References: ChatGPT, My AI Logs of Will AI replace humans, My AI Prompts
Created 2024-08-31