You’ve no doubt heard of Moore’s Law. But do you know its history and exactly how it works?
Almost all of the developments in the exponential tech world are built on ever-increasing computational power, which in turn drives progress in AI, robotics, biotech, and VR.
For the past 60+ years, Moore’s Law, the exponential growth of computing power, has continued nonstop. But to truly appreciate the impact of Moore’s Law, it helps to understand its origins.
That’s what today’s blog is all about.
Let’s dive in…
Moore’s Law Continues…
In 1965, Gordon Moore (a co-founder of Intel) published a paper observing that between 1959 and 1965, the number of transistors on an integrated circuit had been doubling roughly every year, and he projected this would continue for some time. In 1975, he revised the cadence to a doubling roughly every two years.
This concept has held true for several decades and has become known as "Moore's Law."
The chart below, plotted on a log scale, demonstrates this trend:
Here’s a historical example that illustrates the power (and accuracy) of Moore’s Law. In 1971, Intel released its first commercial microprocessor, a 4-bit CPU called the Intel 4004. The 4004 had 2,300 transistors, a gate length of 10,000 nanometers, and a clock speed of about 740 kHz.
At the time, each transistor cost about $1 on average.
Today, the cost of a single transistor is about 1 billion times less than it was in the 1970s, and you can fit a staggering number of them on a single chip. For example, in 2022 NVIDIA released its RTX 4090 graphics processing unit (GPU), which packs 76 billion transistors. And that’s not even the most transistor-dense chip on the market. Apple’s M1 Ultra boasts an incredible 114 billion transistors.
In just 50 years, computing has seen roughly a 100-billion-fold improvement in price-performance, right on schedule for Moore's Law.
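The doubling math behind these numbers is easy to check for yourself. Here’s a minimal sketch (the helper function and the 2-year doubling period are my assumptions, using the figures cited above) that projects the 4004’s 1971 transistor count forward:

```python
# Sketch of Moore's Law doubling, using figures from the article:
# the Intel 4004 (1971) had 2,300 transistors, and transistor
# counts double roughly every 2 years (assumed doubling period).

def projected_transistors(start_count, start_year, target_year, doubling_years=2.0):
    """Project a transistor count forward assuming a fixed doubling period."""
    doublings = (target_year - start_year) / doubling_years
    return start_count * 2 ** doublings

# Project from the 4004 in 1971 out to 2022, the year the RTX 4090
# (76 billion transistors) shipped and the M1 Ultra (114 billion)
# was on the market.
projection = projected_transistors(2_300, 1971, 2022)
print(f"{projection:,.0f}")  # ~109 billion
```

Fifty-one years of doubling every two years multiplies the starting count by about 47 million, landing within the same order of magnitude as today’s flagship chips.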
Here’s a quote from Moore himself I absolutely love that captures the significance of this trend:
“If the auto industry advanced as rapidly as the semiconductor industry, a Rolls Royce would get half a million miles per gallon, and it would be cheaper to throw it away than to park it.”
But Moore's Law describes only the computational growth achieved with integrated circuits.
The Law of Accelerating Returns
In his most excellent book The Singularity Is Near, Ray Kurzweil traced exponential growth in computation back more than 120 years, to eras long predating the integrated circuit.
Kurzweil points out that computation has progressed through five paradigms: electromechanical, relay, vacuum tube, discrete transistor, and finally the integrated circuit, whose trajectory we know as “Moore’s Law.”
Moore's Law (the 5th paradigm of computation) is therefore a subset of a much broader exponential principle described by Kurzweil's “Law of Accelerating Returns.”
Why This Matters
Earlier in this series, we looked at how much the world has changed over the past 500 or so years.
The scope and speed of these sweeping changes are nothing less than breathtaking.
Because of accelerating tech advancements and the continued influence of forces like Moore’s Law and the 6Ds, the world is getting more abundant all the time on many critical fronts. Changes across dozens of areas have driven us towards increasing abundance—“up and to the right”—at a profound rate.
I call this “data-driven optimism,” and it’s one of the key reasons to be optimistic about the future.
Over the next few blogs, we’ll look at evidence of abundance—from declining poverty and child mortality rates, to increasing voting rights and literacy.