Unveiled by the company last week at the Hot Chips event in Palo Alto, California, the WSE is a near-square chip measuring roughly 8 inches by 9 inches, making it the biggest chip ever built.
Created with AI applications in mind, the device will, according to the company, be used in large data centres for deep learning and to improve self-driving cars and digital assistants (such as Amazon’s Alexa and Apple’s Siri).
Sean Lie, co-founder and chief hardware architect of Cerebras Systems. Image courtesy of Cerebras Systems.
In terms of technical specs, the Cerebras WSE contains 400,000 sparse linear algebra cores, 18 GB of on-chip memory, 9 PB/s of memory bandwidth across the chip, and fabric bandwidth of up to 100 Pbit/s. The whole chip is built on the 16nm FinFET process from TSMC (Taiwan Semiconductor Manufacturing Company).
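To put those aggregate figures in perspective, a quick back-of-the-envelope calculation (an illustrative average per core, not an official Cerebras figure) looks like this:

```python
# Per-core averages derived from Cerebras's published WSE specs.
# These are illustrative figures only, not official per-core numbers.

CORES = 400_000
ON_CHIP_MEMORY_BYTES = 18e9   # 18 GB of on-chip memory
MEMORY_BANDWIDTH_BPS = 9e15   # 9 PB/s aggregate memory bandwidth

mem_per_core_kb = ON_CHIP_MEMORY_BYTES / CORES / 1e3
bw_per_core_gbps = MEMORY_BANDWIDTH_BPS / CORES / 1e9

print(f"{mem_per_core_kb:.0f} KB of memory per core")       # 45 KB
print(f"{bw_per_core_gbps:.1f} GB/s of bandwidth per core") # 22.5 GB/s
```

In other words, each core has fast local memory sitting right next to it, which is exactly the layout deep-learning workloads benefit from.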
Since the chip consists of a single wafer, Cerebras said it has implemented methods of routing around bad cores, allowing the rest of the chip to stay connected even when a section of the wafer contains defective cores.
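The general idea can be illustrated with a toy model: treat the cores as a 2D mesh and check how many remain connected when traffic is routed around a set of defective ones. (This is a minimal sketch of the concept, not Cerebras’s actual routing scheme, which the company has not detailed.)

```python
from collections import deque

def reachable_cores(width, height, dead):
    """Count cores still connected to core (0, 0) on a width x height
    mesh when the cores in `dead` are routed around.
    A toy model of wafer-scale fault tolerance, not Cerebras's scheme."""
    if (0, 0) in dead:
        return 0
    seen = {(0, 0)}
    queue = deque([(0, 0)])
    while queue:
        x, y = queue.popleft()
        # Explore the four mesh neighbours, skipping dead cores.
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < width and 0 <= ny < height
                    and (nx, ny) not in dead and (nx, ny) not in seen):
                seen.add((nx, ny))
                queue.append((nx, ny))
    return len(seen)

# A 5x5 mesh with two defective cores: the other 23 stay connected.
print(reachable_cores(5, 5, dead={(2, 2), (3, 1)}))  # 23
```

As long as the defects don’t form a wall across the mesh, the surviving cores remain one connected fabric, which is what makes a single-wafer design viable despite inevitable manufacturing flaws.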
Furthermore, since a traditional CPU cooler wouldn’t be sufficient for such a massive component, the chip is cooled by a very large cold plate that sits above the silicon, fed by vertically mounted water pipes for direct cooling.
Of course, no traditional package would be large enough to fit the chip, so Cerebras has designed its own. The component has not been publicly released yet, but according to PCWorld, it is like “combining a PCB, the wafer, a custom connector linking the two, and the cold plate”.
Cerebras Systems’ Wafer Scale Engine (WSE) Chip, positioned next to a MacBook keyboard for scale. Image courtesy of Cerebras Systems.
Cerebras’s WSE is indeed the biggest chip in existence, but this is not the first time that a company has tried to build a gigantic chip. In the 1980s, a startup called Trilogy (founded by the well-known IBM chip engineer Gene Amdahl) tried to do so with over $230 million in funding. Unfortunately, Trilogy eventually deemed the task too difficult and abandoned it after 5 years.
More recently, at the same Hot Chips event, tech giant IBM announced its latest Power9 chip iteration, which, despite the company’s resources and manpower, has only 8 billion transistors.
The difficulties behind building large chips are numerous; the cooling and packaging demands mentioned above are just two of them.
Moreover, at the time of writing, Cerebras’s claims regarding the chip’s performance have not been independently verified.
The company also did not disclose a price for the technology, but, understandably, the price will depend on how efficiently Cerebras and its manufacturing partner, TSMC, can build the chip.
Should Cerebras manage to assemble and sell a fully functional wafer-scale processor, it would be an interesting demonstration of whether this new technological application can be commercialised at scale.
Components like the WSE are unlikely ever to be sold directly to consumers, but businesses have expressed interest in using wafer-scale processing to improve performance and power consumption in a range of markets—and that is the share of the market pie Cerebras should be targeting.
Several chips were showcased at last week’s Hot Chips conference, but (as is the case every year) many of them will never be sold commercially, or even developed further.
Will Cerebras’s WSE become one such commercial no-show, or will it find its place in the ever-changing technology landscape?
Let us know your opinion in the comments section!