Moore’s Law is Dead. What Happens Next?

2 weeks ago by David Rutland

Since 1965, computer engineers have lived by the words of Gordon Moore, co-founder of Intel: the number of transistors that can be crammed onto a chip will double every 12 months. But Moore’s Law is an observation and an extrapolation of trends rather than a genuine law of nature. Technology struggled to keep pace and, within a decade, the optimistic 12-month prediction had stretched to 18 months and, later, to 24 months.
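The prediction itself is plain exponential arithmetic, and the effect of stretching the doubling period is easy to see in a few lines of Python (the 1,000-transistor starting count below is purely illustrative, not any real chip's figure):

```python
def projected_transistors(start_count, months_elapsed, doubling_months):
    """Project a transistor count under Moore's Law-style doubling."""
    return start_count * 2 ** (months_elapsed / doubling_months)

# Ten years of growth under the three doubling periods mentioned above:
for period in (12, 18, 24):
    count = projected_transistors(1_000, 120, period)
    print(f"doubling every {period} months: {count:,.0f} transistors")
```

The gap compounds quickly: over a decade, a 12-month doubling period yields a chip thirty-two times denser than a 24-month one.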

Moore’s so-called law is fraying at the edges as manufacturers approach the limits of miniaturisation. Moore himself never said outright that his eponymous prediction would fail; he did, however, state on several occasions that it was becoming more and more difficult to keep the process going.

In 2015, he echoed Stephen Hawking, stating: “We’re very close to the atomic limitation now. We take advantage of all the speed we can get, but the velocity of light limits performance ... I guess I see Moore’s Law dying here in the next decade or so, but that’s not surprising”.

 

Gordon Moore. Credit: Science History Institute.

 

End users, however, have barely noticed the unravelling of one of computing's most fundamental pillars.

Algorithms are becoming more efficient and, with the advent of cloud computing, the most processor-intensive tasks can be outsourced to offsite machines with far greater capacity. Home users and businesses need only a thin client on site; the bulk of the heavy lifting is done in the cloud.
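The thin-client pattern can be sketched in a few lines: the client ships raw data off-site and receives only the finished answer. This is a toy illustration (a local HTTP server standing in for the "cloud", and summing squares standing in for real work such as rendering), not any particular provider's API:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputeHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The "cloud" side: receive the numbers and do the heavy lifting.
        body = self.rfile.read(int(self.headers["Content-Length"]))
        numbers = json.loads(body)
        result = sum(n * n for n in numbers)
        payload = json.dumps({"result": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        # Keep the demo quiet.
        pass

# Run the "server" in a background thread on an OS-assigned local port.
server = HTTPServer(("127.0.0.1", 0), ComputeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The thin client: send the data away, get back only the answer.
url = f"http://127.0.0.1:{server.server_address[1]}/compute"
req = urllib.request.Request(url, data=json.dumps([1, 2, 3, 4]).encode(),
                             headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    answer = json.loads(resp.read())["result"]

print(answer)  # 30, i.e. 1 + 4 + 9 + 16
server.shutdown()
```

The client never needs the processing power itself; its real constraint, as noted below, is the bandwidth of the link.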

Last week, Google launched the Stadia gaming platform. Using it, gamers can play the latest titles on almost any device, as graphics processing is performed remotely, and players receive a video stream of what is actually happening elsewhere.

Cloud-based processing is a stopgap measure, though. The cloud doesn’t really exist; it is someone else’s computer. And, however fast that remote processing, data access is restricted by the available bandwidth.

For computer users, it is, in the end, a way of skirting both the limitations Moore observed and his predictions about when they would bite. It is a 'bodge', putting off the inevitable.

Moving Away from Silicon?

Virtually all chips in all electronic devices rely on silicon. Silicon is a semiconductor—under certain circumstances, it will conduct electricity and, under others, it will act as an insulator.

It is almost the ideal material; it is cheap and abundant and, more importantly, with the right doping elements its properties can be customised to suit a specific purpose. But despite these admittedly great advantages, silicon may not be the best material from which to build the chips of the future.

As circuitry shrinks, silicon shows its inefficiencies. At the nanoscale, current leaks from ever-smaller components and is dissipated as heat. That leakage is the limiting factor causing Moore’s Law to bend and break.

So, what could replace it?

Graphene is one of the strongest and most versatile materials on earth. Making its debut in 2004, it was originally manufactured one tiny sheet at a time, using sticky tape to transfer it from one surface to another. Fifteen years later, it is now being manufactured by the tonne.

In the initial enthusiasm for the material (a hexagonal lattice of carbon just one atom thick), it was touted as the saviour of Moore’s Law, enabling miniaturisation on a scale previously unheard of.

Graphene itself is not a semiconductor (it has no bandgap), but that hardly matters, because graphene ribbons less than 10 nanometres wide behave like one.

Whether such ribbons truly are semiconductors or not is irrelevant when it comes to their applications.

 

Graphene, once thought to be the 'saviour' of computing. Credit: AlexanderAIUS.

 

A Significant Challenge

Moving away from silicon will mean redesigning chips and computer systems from the ground up. It will be a vast and slow undertaking, costing a huge amount in infrastructure as well as research and development.

If the computer industry is going to make an investment of such magnitude, it needs to be sure it gets it right. If not, we may loop around and end up in exactly the situation we find ourselves in today.

Graphene provided the first clue that two-dimensional materials were the way forward, and now scientists and engineers are searching for the most efficient materials to facilitate the fast movement of electrons.

Among the leading candidates are the transition metal dichalcogenides, such as molybdenum disulfide, a material that is orders of magnitude more efficient than Si at allowing and halting an electrical current on demand. Its drawback, however, is that it is exceptionally unwieldy to work with, and the transistors built from it so far cannot yet be transferred onto a processor.

In short, nobody knows what computer chips of the future will look like. Although advances are being made in the manufacture of alternatives to Si, it is far too early for us to make any concrete predictions or hail graphene—or other potential alternatives, for that matter—as the solution.

But one thing is certain: without advances in materials, the inexorable growth in processing speed will become a distant memory. Moore’s Law, and the observations that led to it, will be not only dead, but buried too.
