What is Moore’s Law?

Moore’s Law: The exponential increase in the number of transistors on integrated circuits

Moore’s Law is the observation that the number of transistors on integrated circuits doubles approximately every two years.
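As a rough illustration of what a two-year doubling time implies, the short Python sketch below projects a transistor count forward in time. The starting value and time span are illustrative assumptions, not data taken from the charts on this page.

```python
# Moore's Law as a simple growth formula: N(t) = N0 * 2 ** (t / doubling_time)

def projected_transistors(n0: float, years: float, doubling_time: float = 2.0) -> float:
    """Project a transistor count forward, assuming a fixed doubling time in years."""
    return n0 * 2 ** (years / doubling_time)

# Illustrative example: starting from roughly 2,300 transistors (about the level
# of an early-1970s microprocessor), a two-year doubling over 40 years gives
# about 2.4 billion transistors.
print(projected_transistors(2_300, 40))  # ≈ 2.4e9
```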

This regularity of technological change is important because the capabilities of many digital electronic devices are strongly linked to the number of transistors. On this page, you will find evidence that technological measures as diverse as processing speed, product price, memory capacity, and even the number and size of pixels in digital cameras have also been progressing exponentially.

The law was first described in 1965 by Intel co-founder Gordon E. Moore, after whom it is named.1 Below is the famous small graph that Moore published in 1965. As you can see, Moore had only seven observations from 1959 to 1965, yet he predicted continued growth, saying, “There is no reason to believe it will not remain nearly constant for at least 10 years”.2

Moore’s original graph from 1965: ‘The Number of Components per Integrated Function’3

As our large updated graph here shows, he was right about far more than the next ten years: astonishingly, the regularity he found has now held true for more than half a century.

Note the logarithmic vertical axis, chosen to show the linearity of the growth rate. The line corresponds to exponential growth, with the transistor count doubling every two years.
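To see why exponential growth appears as a straight line on a logarithmic axis, note that taking the base-2 logarithm of N(t) = N0 × 2^(t/2) gives log2 N(t) = log2 N0 + t/2, so each additional year adds a constant half-unit to the logarithm. The small Python sketch below, using an arbitrary starting count, simply demonstrates this constant increment.

```python
import math

# For N(t) = n0 * 2 ** (t / 2), log2(N(t)) = log2(n0) + t / 2:
# equal time steps add a constant amount to the logarithm,
# which is why the growth path is a straight line on a log axis.
n0 = 1_000  # arbitrary starting count, for illustration only
for t in range(0, 11, 2):
    n = n0 * 2 ** (t / 2)
    print(t, round(math.log2(n), 2))  # log2 rises by exactly 1 every 2 years
```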

Transistor count over time

Computational power: Exponential growth of FLOPS and operations per second

In itself, the doubling of transistors every two years does not directly matter in our lives. What matters for our lives is not the structure of these computers but their capacity.

This chart shows that the computational capacity of computers increased exponentially. The doubling time of computational capacity for personal computers was 1.5 years between 1975 and 2009.
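For a sense of scale, the hedged arithmetic below converts a 1.5-year doubling time into an annual growth factor and a cumulative factor over the 1975 to 2009 period; these are back-of-the-envelope calculations, not values read from the chart.

```python
# Back-of-the-envelope arithmetic for a doubling time of 1.5 years (1975-2009).
doubling_time = 1.5                      # years, as stated above
annual_factor = 2 ** (1 / doubling_time)
print(f"{annual_factor:.2f}x per year")  # ≈ 1.59x, i.e. roughly 59% growth per year

years = 2009 - 1975                      # 34 years
cumulative = 2 ** (years / doubling_time)
print(f"{cumulative:,.0f}x over {years} years")  # ≈ 6.6 million-fold increase
```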

The interactive chart shows more recent data. Here, the growth of supercomputer power is measured in terms of the number of floating-point operations carried out per second (FLOPS) by the largest supercomputer in any given year.
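As a reminder of what the FLOPS metric means, the minimal sketch below converts a hypothetical operation count and run time into FLOPS and expresses the result with a common prefix; the numbers are invented for illustration.

```python
# FLOPS = floating-point operations per second.
operations = 3.0e17   # hypothetical number of floating-point operations
seconds = 10.0        # hypothetical elapsed time
flops = operations / seconds
print(f"{flops:.1e} FLOPS = {flops / 1e15:.0f} petaFLOPS")  # 3.0e+16 FLOPS = 30 petaFLOPS
```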

Exponentially increasing computational capacity over time (computations per second) – Koomey, Berard, Sanchez, and Wong (2011)4

Exponential progress in computing efficiency

The cost of keeping these machines running also matters. Computing efficiency measures computational capacity per unit of energy.

The progress in this respect has been very substantial: researchers found that over the last six decades the energy demand for a fixed computational load halved every 18 months.5
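To make that halving rate concrete, the rough calculation below compounds a halving every 18 months over six decades; it is a back-of-the-envelope figure, not a reproduction of the study’s exact results.

```python
# What halving every 18 months adds up to over six decades (rough arithmetic).
halving_time = 1.5   # years, i.e. 18 months
years = 60           # "the last six decades"
halvings = years / halving_time          # 40 halvings
improvement = 2 ** halvings
print(f"{halvings:.0f} halvings -> about {improvement:.1e}x less energy per computation")
# ≈ 1.1e12, i.e. roughly a trillion-fold improvement
```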

On this chart, we see the computing efficiency of various processors over time. Here, computing efficiency is measured as the number of watts (a measure of electrical power) needed to carry out a million instructions per second (Watts per MIPS).
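The sketch below computes Watts per MIPS, and its inverse MIPS per Watt, for two hypothetical processors; the numbers are made up purely to show how the metric is read (lower Watts per MIPS means higher efficiency).

```python
# Watts per MIPS (lower is better) and its inverse, MIPS per watt, for two
# hypothetical processors with invented numbers.
processors = {
    "older chip": {"watts": 100.0, "mips": 1_000.0},
    "newer chip": {"watts": 100.0, "mips": 100_000.0},
}
for name, p in processors.items():
    watts_per_mips = p["watts"] / p["mips"]
    mips_per_watt = p["mips"] / p["watts"]
    print(f"{name}: {watts_per_mips:.4f} W/MIPS, {mips_per_watt:,.0f} MIPS/W")
```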

This improvement in efficiency is also important with respect to the environmental impact of computers.

Exponential progress in computer memory and storage