Tuesday, April 26, 2016

The End Of Moore's Law? Maybe Sooner Than You Think

Let's first clarify that "Moore's law" is not a physical law in the sense of Newton's 2nd law of motion, e.g.

F = m (dv/dt)

But rather an empirical regularity that appears to govern the miniaturization and speed of microprocessors, and one that has held for at least the past 44 years. It traces to Gordon Moore, who later co-founded Intel and who in 1965 wrote a paper observing that the number of electronic components that could be crammed onto an integrated circuit was doubling every year. This exponential increase became known as Moore's law.

By 1975 Moore had revised the rate of doubling to once every two years, and this has held pretty well ever since. Effectively we're looking at some 22 doublings (starting with Intel's 4004 chip in 1971) and hence roughly a 4-million-fold improvement. This is just about what's happened. Whereas the 4004 had roughly 2,300 transistors (tiny electrical switches representing the 1s and 0s of the binary language of computers), Intel's Xeon Haswell E5 (launched in 2014) has over 5 billion, with features just 22 nm apart.
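As a quick sanity check, a back-of-envelope sketch using only the transistor counts quoted above gives the implied doubling rate directly:

```python
import math

# Back-of-envelope check of the doubling arithmetic, using the transistor
# counts cited in the text (2,300 in 1971; ~5 billion in 2014).
start_transistors = 2_300          # Intel 4004, 1971
end_transistors = 5_000_000_000    # Xeon (Haswell) E5, 2014
years = 2014 - 1971                # 43 years

growth = end_transistors / start_transistors
doublings = math.log2(growth)

print(f"growth factor:      {growth:,.0f}x")           # ~2.2 million
print(f"doublings:          {doublings:.1f}")          # ~21
print(f"years per doubling: {years / doublings:.1f}")  # ~2.0
```

The raw figures give about 21 doublings at almost exactly two years each, a bit under the clean 2^22 ≈ 4.2 million, but the same order of magnitude.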

But that may be about to end in what some might term "the limit of the small". That is, the components are now approaching the fundamental limit of smallness: the atom. The Intel Skylake transistor, for example, is only about 100 atoms across. The fewer atoms a device spans, the harder it becomes to store and manipulate the electrons that represent those 1s and 0s.

Of course, according to one wag, "There's a law about Moore's law: The number of people predicting the death of Moore's law doubles every two years."

But I am predicting it, based not only on the limits of size but on the limits of the energy needed to continue manufacturing chips at any scale. I have already written at length about this energy limit and its impact on economics in general, e.g.

http://brane-space.blogspot.com/2013/09/44-trillion-in-deficits-by-2024-minus.html

Energy limits and especially efficiency are important because once the EROEI (energy returned on energy invested) goes below a given threshold, all bets are off: production becomes limited - and more costly - across a wide spectrum of products, not just computer chips. Wonder why NASA's space exploration budget has been pared by nearly one-third (according to the most recent Physics Today)? It's because of the much higher cost of the energy needed to drive rockets and spacecraft across vast distances. Wonder why thousands of oil shale wells across the U.S. have shut down and workers been sent home? It's because it is no longer cost effective to take the stuff out of the ground: the price per barrel is too low to support the expenditure of energy needed for extraction, storage and transport.
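The EROEI point can be made concrete with a minimal sketch. The ratios below are illustrative assumptions, not measured figures, but they show why the squeeze is non-linear: as EROEI falls toward 1, the net energy left over for society collapses.

```python
# Illustrative EROEI arithmetic: net energy delivered = gross output minus
# the energy spent obtaining it. The EROEI values are assumptions chosen
# only to show the shape of the curve.
def net_energy_fraction(eroei: float) -> float:
    """Fraction of gross energy output left after extraction costs."""
    return 1.0 - 1.0 / eroei

for eroei in (50, 10, 5, 2, 1.2):
    print(f"EROEI {eroei:>4}: {net_energy_fraction(eroei):.0%} of gross output is net")
```

At an EROEI of 50 virtually all the gross output is usable; at 2, half of it is consumed just getting the other half, which is why production across the board becomes more costly as fuels degrade.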

At the root of many of these issues is the standard problem of entropy: with each energy conversion in a closed system, less useful energy remains. That is, more energy is generally given off as waste, e.g. as heat in the process of moving data, than is used in the actual chips for computing. The more energy wasted or lost this way, the less is available for computing. Modern chips are so "power hungry," as the Economist piece notes, that up to 80 percent of the total input energy is expended just moving data in and out, leaving only a fraction for the actual computation.

One possible solution is the "spintronic" transistor, since the voltage needed to drive it is only 10-20 millivolts (1 mV = 1/1000 of a volt). This is hundreds of times lower than for a conventional transistor, which means the latter's energy needs are correspondingly greater. The spintronic device would thus solve the heat problem at a stroke, but research has been ongoing for 15 years with little to show for it. This implies design problems of its own: with such minute voltages, distinguishing a 1 or a 0 from the electrical background noise becomes tricky.
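A rough way to see why such low drive voltages matter: in a simple capacitive switching model, the energy per switch scales as E ≈ ½CV², so cutting the voltage cuts the energy quadratically. The capacitance and voltages below are illustrative assumptions, not measurements of any actual device:

```python
# Hedged sketch of switching-energy scaling, E = (1/2) * C * V^2.
# C and the voltages are assumed, representative values only.
def switching_energy(capacitance_f: float, voltage_v: float) -> float:
    """Energy (joules) to charge a gate of the given capacitance to V."""
    return 0.5 * capacitance_f * voltage_v ** 2

C = 1e-15  # assumed ~1 femtofarad of switched capacitance
conventional = switching_energy(C, 0.7)    # ~0.7 V conventional logic
spintronic = switching_energy(C, 0.015)    # ~15 mV spintronic device
print(f"energy ratio: {conventional / spintronic:,.0f}x")
```

Under this quadratic model the energy saving is even larger than the voltage ratio alone suggests, which is why such devices could, in principle, solve the heat problem "at a stroke."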

To push Moore's law further along, a number of key changes would have to occur beyond merely resorting to a few different designs and materials that might make transistors amenable to a bit more shrinkage. (One of these 'tweaks' is to diffuse computing power rather than concentrate it, i.e. to spread the ability to calculate and communicate across an ever larger range of everyday objects.)

Higher energy efficiency will be essential to keep Moore's law going for even another decade. That includes dispensing with energy-intense lithium batteries (unsustainable in a lower-energy environment) and instead harvesting energy from the surroundings, including from the vibrations of E-M waves - sipping tiny amounts of power amidst an intensely crowded radio spectrum.

Even Linley Gwennap, who runs The Linley Group of Silicon Valley analysts, has admitted in the March 12th Economist: "From an economic standpoint, Moore's law is over." In other words, the cost to continue the Moore's law doubling (of miniaturization and chip power) is no longer sustainable in the current degraded energy environment operating on lower-EROEI fuel. (Intel's boss, Brian Krzanich, has also publicly admitted the firm's rate of progress has slowed.)

As The Economist observes:

"The twilight of Moore's law, then, will bring change, disorder and plenty of creative destruction. An industry that used to rely on a handful of devices will splinter."

According to Bob Colwell who helped design Intel's Pentium chip:

"Most of the people who buy computers don't even know what a transistor does. They simply want the products they buy to keep getting better and more useful than in the past."

That route is getting more difficult, but hopefully there will be other ways to make better computers and computing devices even without the full benefit of Gordon Moore's "law".

We just have to be patient and see what they are.
