In 2001, a quantum computer was able to factor 15.
In 2012, a quantum computer was able to factor 21, sorta. The researchers cheated by not implementing gates they knew a priori would not be used. To this day, there still hasn’t been a quantum computer able to factor 21 without taking some shortcuts. Some reasons why are given here.
Extrapolating from two data points is kinda ridiculous, but we only have two data points, and we’re going to do it anyway.
Some may object that extrapolating the size of the numbers that can be factored isn’t the right approach. Instead we should be extrapolating the number of qubits or something else. Fine, but that’s not what we’re doing in this post.
Linear extrapolation

Using linear extrapolation, a quantum computer wouldn’t be able to factor a cryptographically relevant number before the heat death of the universe.
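For concreteness, here’s a back-of-the-envelope sketch of the linear case. The two data points are the ones above; the 2048-bit RSA modulus is an assumed target, chosen only to make “cryptographically relevant” concrete.

    from math import log10

    # The two data points above: year -> largest number factored
    data = {2001: 15, 2012: 21}

    # Linear rate: 6 more per 11 years, a bit over half a number per year
    per_year = (data[2012] - data[2001]) / (2012 - 2001)

    # Assumed target: a 2048-bit RSA modulus, roughly 10^616
    target_digits = 2048 * log10(2)

    # log10 of the years required at that linear rate
    years_log10 = target_digits - log10(per_year)
    print(f"about 10^{years_log10:.0f} years")   # about 10^617 years

That works out to something like 10^617 years, which dwarfs any estimate of when the heat death of the universe will occur.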
Exponential extrapolation
Let’s switch to exponential extrapolation. Instead of assuming the size of the numbers grows linearly, we’ll assume the number of bits in the numbers grows linearly, meaning the numbers grow exponentially.
In about 10 years, the size of the number that could be factored increased by a factor of

21/15 = 1.4 ≈ √2.
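Here’s the same kind of sketch for the exponential case, assuming the factorable number keeps gaining bits at the rate implied by the two data points, again with a 2048-bit RSA modulus as the assumed target.

    from math import log2

    # The two data points above: year -> largest number factored
    data = {2001: 15, 2012: 21}

    # Bits gained per year: log2(21/15) ≈ 0.49 bits over 11 years
    bits_per_year = (log2(data[2012]) - log2(data[2001])) / (2012 - 2001)

    # Assumed target: a 2048-bit RSA modulus
    years_needed = (2048 - log2(data[2012])) / bits_per_year
    print(f"about {years_needed:,.0f} years")   # roughly 46,000 years

Even under this much more generous assumption, roughly half a bit per decade gets you to 2048 bits only after tens of thousands of years.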