AI 2027 forecasts that AGI could plausibly arrive as early as 2027. I recently spent some time looking at both the timelines forecast and some critiques [1, 2, 3].
Initially, I was interested in technical issues. What’s the best super-exponential curve? How much probability should it have? But I found myself drawn to a more basic question: how much value is the math really contributing?
This provides an excuse for a general rant. Say you want to forecast something. It could be when your hair will go gray or if Taiwan will be self-governing in 2050. Whatever. Here’s one way to do it:
Think hard. Make up some numbers.
Don’t laugh—that’s the classic method. Alternatively, you could use math:
Think hard. Make up a formal model / math / simulation. Make up some numbers. Plug those numbers into the formal model.
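To make the contrast concrete, here’s a minimal sketch of the math-based recipe, in Python. Everything in it is an assumption invented for illustration: the model (capability grows exponentially at rate r, and “AGI” arrives once capability is K times today’s level, so years-to-AGI = ln(K)/r) and the parameter distributions are all made up. The point is only the shape of the workflow: invent a model, invent numbers, plug the numbers into the model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Step 1: make up a formal model. Here: capability grows exponentially at
# rate r per year, and "AGI" arrives when capability reaches K times
# today's level, so years-to-AGI = ln(K) / r.

# Step 2: make up some numbers. Both distributions below are invented
# purely for illustration.
r = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=n)  # growth rate per year
K = rng.lognormal(mean=np.log(30), sigma=1.0, size=n)   # capability gap

# Step 3: plug the made-up numbers into the formal model.
years = np.log(K) / r

print(f"median: {np.median(years):.1f} years")
print(f"10%-90%: {np.quantile(years, 0.1):.1f} to {np.quantile(years, 0.9):.1f} years")
```

The output looks authoritative, but r and K are exactly as made up as the numbers in an intuition-based forecast; the math just mediates them.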
People are often skeptical of intuition-based forecasts because “those are just some numbers you made up.” Math-based forecasts are harder to argue with. But that’s not because they lack made-up numbers. It’s because the meaning of those numbers is mediated by a bunch of math.
So which is better, intuition or math? In what situations?
Here, I’ll look at that question and how it applies to AI 2027. Then I’ll build a new AI forecast using my personal favorite method of “plot the data and scribble a bunch of curves on top of it”. Then I’ll show you a little tool to make your own artisanal scribble-based AI forecast.
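For a preview of the scribble method, here’s a hedged sketch. The data points are toy numbers I invented for this example (not the real benchmark data used later), and the logistic “scribbles” are hand-picked guesses rather than fits: plot the data, draw a bunch of plausible curves over it, and read a forecast off where they cross a target level.

```python
import numpy as np
import matplotlib.pyplot as plt

# Purely illustrative toy data: (year, some capability score in [0, 1]).
years = np.array([2019, 2020, 2021, 2022, 2023, 2024])
score = np.array([0.02, 0.04, 0.07, 0.12, 0.20, 0.30])

t = np.linspace(2019, 2032, 200)
plt.scatter(years, score, color="black", zorder=3, label="observed (toy)")

# "Scribble a bunch of curves": each one is a made-up guess about the
# trend, here logistic curves with hand-picked midpoints and steepnesses.
rng = np.random.default_rng(0)
crossings = []
for _ in range(200):
    m = rng.uniform(2025.5, 2030)   # made-up midpoint (year the curve hits 0.5)
    k = rng.uniform(0.3, 0.9)       # made-up steepness
    curve = 1 / (1 + np.exp(-k * (t - m)))
    plt.plot(t, curve, alpha=0.05, color="tab:blue")
    hit = t[curve >= 0.9]           # first year this scribble crosses 0.9
    if hit.size:
        crossings.append(hit[0])

plt.axhline(0.9, linestyle="--", color="gray")
plt.xlabel("year")
plt.ylabel("capability score (toy)")
plt.show()

print(f"scribbles crossing 0.9 by 2032: {len(crossings)} of 200")
```

The spread of crossing years across the scribbles is the forecast. No single curve is defended; the uncertainty lives in how differently you’re willing to scribble.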
Two kinds of forecasts