It's the End of Moore's Law as We Know It (But Not Really)


That's a 2005 silicon wafer signed by Gordon Moore. Hard to believe more than 50 years have now passed since Moore first penned those prophetic words. Science & Society Picture Library/SSPL/Getty Images

In 1965, Fairchild Semiconductor's director of research and development wrote an article for Electronics magazine. In that article, he pointed out that economics made it possible for companies to cram more components, such as transistors, onto integrated circuits. He also noted that this progression followed a fairly predictable course, allowing him to project that integrated circuits would double their number of components every year. That director was Gordon E. Moore, and his observation became known as Moore's law.

Over the years, Moore's law has evolved a bit. Today, we tend to say computers will double in processing power every 18 months or so. But that original definition Moore supplied — the idea of adding more components to a square inch of silicon semiconductor chip in a traditional integrated circuit — finally may be reaching its limit. According to the International Technology Roadmap for Semiconductors, after 2021 we won't be able to shrink transistors any more. They'll be as small as they're going to get.
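That "doubling every 18 months" formulation is simple exponential growth, which a short sketch can make concrete. The function and starting figures below are illustrative assumptions, not from the article; the 2,300-transistor starting point is roughly the component count of an early-1970s microprocessor.

```python
# Illustrative sketch of the modern reading of Moore's law:
# component counts double every 18 months (1.5 years).

def projected_count(initial_count: float, years: float,
                    doubling_period_years: float = 1.5) -> float:
    """Project a component count forward, doubling every `doubling_period_years`."""
    return initial_count * 2 ** (years / doubling_period_years)

# Starting from ~2,300 components, 30 years of 18-month doublings
# is 20 doublings, or about 2.4 billion components.
print(round(projected_count(2300, 30)))  # → 2411724800
```

Real chips track this curve only loosely, of course; the point is how quickly a fixed doubling period compounds.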

We're hitting fundamental limits on what is physically possible with transistors. When you shrink beyond a certain size, quantum effects such as electron tunneling come into play, letting current leak where it shouldn't and introducing errors in calculations. Engineering around these limitations is complicated, which also means it's more expensive. And Moore's point way back in 1965 was that the real reason integrated circuits were getting more complex was that it was economically viable to go that route: There was a demand for powerful electronics, and that demand provided the economic incentive to improve manufacturing processes. But if it costs more money to work around quantum physics hurdles than you'll ever recapture in sales, the law falls apart.

Does this mean our electronics and computers will plateau in power by 2021? Not necessarily. While we'll likely reach the fundamental limits of what we can do with nanotechnology and classical integrated circuits, we're also looking at new approaches to microprocessor design. Your traditional integrated circuit is, essentially, two-dimensional. But future processors may build "up," adding vertical channels to increase transistor density. To do that, we'll need to create some innovative approaches to transistor gates and heat distribution.

So if you interpret Moore's law to include the option of stacking those components on top of one another rather than shrinking them down to fit more on a square inch of silicon, the law is still in good shape.

Bottom line: We don't have to worry about our computers hitting peak performance. Yet.