Wafer-scale integration
Wafer-scale integration (WSI) is an as-yet-unused approach to building very large integrated circuit networks that use an entire silicon wafer to produce a single "super-chip". Through a combination of large size and reduced packaging, WSI could lead to dramatically reduced costs for some systems, notably massively parallel supercomputers. The name is taken from the term very-large-scale integration, the state of the art when WSI was being developed.
The concept
To understand WSI, one has to consider the normal chip-making process. A single large cylindrical crystal of silicon is produced and then cut into disks known as wafers. The wafers are cleaned and polished in preparation for the fabrication process. A photolithographic process is then used to pattern the surface, defining where material should be deposited on the wafer and where it should not. The desired material is deposited and the photographic mask is removed for the next layer. The wafer is repeatedly processed in this fashion, building up the circuitry layer by layer on the surface.
Multiple copies of the circuit pattern are deposited in a grid across the surface of the wafer. After all the possible locations are patterned, the wafer surface appears like a sheet of graph paper, with grid lines delineating the individual chips. Each of these grid locations is tested for manufacturing defects by automated equipment, and those found to be defective are recorded and marked with a dot of paint. The wafer is then sawed apart to cut out the individual chips. The defective chips are discarded or recycled, while the working chips are placed into packages and re-tested for any damage that might have occurred during the packaging process.
Flaws on the surface of the wafers and problems during the layering/deposition process are impossible to avoid, and cause some of the individual chips to be defective. The revenue from the remaining working chips has to pay for the entire cost of the wafer and its processing, including the discarded defective chips. Thus, the higher the number of working chips, or "yield", the lower the cost of each individual chip. To maximize yield, designers want to make each chip as small as possible, so that more working chips can be obtained per wafer.
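To make the yield argument concrete, the Python sketch below scatters random point defects over a hypothetical wafer and counts how many defect-free chips survive at several die sizes. The wafer size, defect count and die sizes are made-up illustrative numbers, not figures from any real process; the point is only that the same defects ruin a far larger fraction of the wafer as the chips grow, down to zero good chips once a single "chip" covers the whole wafer.

```python
import random

# Illustrative toy model of the yield argument above. The wafer dimensions,
# defect count and die sizes are assumptions chosen for the example, not
# figures from the article or any real process.

WAFER_MM = 150          # treat the wafer as a 150 mm x 150 mm square for simplicity
N_DEFECTS = 60          # random point defects scattered over the wafer

def yield_for_die(die_mm, defects):
    """Tile the wafer with square dies of side die_mm and count defect-free ones."""
    per_side = WAFER_MM // die_mm
    bad = set()
    for x, y in defects:
        # record which die each defect lands in
        bad.add((int(x // die_mm), int(y // die_mm)))
    total = per_side * per_side
    good = sum(1 for i in range(per_side) for j in range(per_side) if (i, j) not in bad)
    return good, total

random.seed(0)
defects = [(random.uniform(0, WAFER_MM), random.uniform(0, WAFER_MM))
           for _ in range(N_DEFECTS)]

for die_mm in (5, 10, 25, 75, 150):
    good, total = yield_for_die(die_mm, defects)
    print(f"{die_mm:3d} mm die: {good:4d}/{total:4d} good ({100 * good / total:.0f}% yield)")
```

With these numbers, small dies keep most of the wafer usable, while the 150 mm "die", which is the wafer-scale case, is essentially always ruined by at least one defect.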
A large fraction of the cost of fabrication (typically 30–50%) is related to testing and packaging the individual chips. Further cost is associated with connecting the chips into an integrated system, usually via a printed circuit board. Wafer-scale integration seeks to reduce this cost, as well as improve performance, by building larger chips in a single package, in principle chips as large as a full wafer. Of course this is not easy: given the flaws on the wafers, a single large design printed onto a wafer would almost never work in its entirety. It has been an ongoing goal to develop methods that handle faulty areas of the wafer in logic, rather than sawing them out of the wafer. Generally, this approach uses a grid pattern of sub-circuits and "rewires" around the damaged areas using appropriate logic, as sketched below. If the resulting wafer has enough working sub-circuits, it can be used despite the faults.
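As a loose illustration of this rewiring idea (not a description of any particular vendor's logic), the sketch below links the working sub-circuits of a tested grid into one logical chain, skipping the faulty ones, and accepts the wafer only if enough of them survive. The serpentine ordering and the 75% acceptance threshold are arbitrary assumptions chosen for the example.

```python
# A minimal sketch of "rewiring around the damaged areas": sub-circuits sit in
# a grid, faulty ones are skipped, and the working ones are linked into one
# logical chain. The serpentine ordering and the 75% threshold are
# illustrative assumptions.

def build_chain(grid):
    """grid[r][c] is True if the sub-circuit at (r, c) passed test.
    Returns the working cells in serpentine (boustrophedon) order."""
    chain = []
    for r, row in enumerate(grid):
        cols = range(len(row)) if r % 2 == 0 else reversed(range(len(row)))
        chain.extend((r, c) for c in cols if row[c])
    return chain

def usable(grid, min_fraction=0.75):
    """Accept the wafer only if enough sub-circuits survive to be worth packaging."""
    total = sum(len(row) for row in grid)
    return len(build_chain(grid)) >= min_fraction * total

wafer = [
    [True,  True,  False, True],
    [True,  False, True,  True],
    [True,  True,  True,  True],
    [False, True,  True,  True],
]
print(build_chain(wafer))   # logical order, skipping the three faulty cells
print(usable(wafer))        # True: 13 of 16 sub-circuits work (81%)
```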
Production attempts
Many companies attempted to develop WSI production systems in the 1970s and 1980s, but all failed. TI and ITT both saw it as a way to develop complex pipelined microprocessors and re-enter a market where they were losing ground, but neither released any products.
Gene Amdahl also attempted to develop WSI as a method of making a supercomputer, starting Trilogy Systems in 1980 and garnering investments from Groupe Bull, Sperry Rand and Digital Equipment Corporation, who (along with others) provided an estimated $230 million in financing. The design called for a 2.5-inch-square chip with 1,200 pins on the bottom. The effort was plagued by a series of disasters, including floods that delayed the construction of the plant and later ruined the clean-room interior. After burning through about a third of the capital with nothing to show for it, Amdahl eventually declared that the idea would only work with a 99.99% yield, which would not happen for 100 years. He used Trilogy's remaining seed capital to buy Elxsi, a maker of VAX-compatible machines, in 1985. The Trilogy efforts were eventually ended and the company "became" Elxsi.
The last serious attempt to use WSI appears to have been Clive Sinclair's involvement at his MetaLab think tank. When looking for submissions of ideas, he received plans from Ivor Catt for a WSI scheme known as the Catt Spiral. Catt proposed to extend work he had previously done for Burroughs, which had been abandoned at about the time of Burroughs's merger with Sperry to become Unisys [http://www.ivorcatt.org/icr-ew47boole.pdf]. As the name implies, the Spiral was not laid out in a grid, but as a series of cylinders of small chips on the wafer, each connected to all of its neighbours.
After processing, the wafer was sent to a tester that connected in turn to the chips around the outside rim of the wafer, testing each one until it found one that worked. When it did, it would ask that working chip to pass the test signals along to one of its neighbours, starting with the "next" one on the same track. This process continued until it ran out of working chips, thereby eventually finding all of the working ones on the wafer, and the resulting map was written into NVRAM. The design required only one set of pins, connected to that first working chip, and did not require extensive packaging due to details of the wafer itself. A rough simulation of this configuration walk is sketched below.
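The following Python sketch simulates that configuration walk under assumptions not in the source: the chips are modelled as a simple grid, the tester probes the rim for the first working chip, and each working chip hands the chain on to a working, unused neighbour until none is left; the finished chain stands in for the map that would be written into NVRAM. A greedy walk like this can strand some working chips, so it is only meant to convey the idea, not Catt's actual circuit.

```python
# Rough simulation of the Catt Spiral configuration walk described above.
# Grid layout, neighbour-probing order and chain representation are all
# illustrative assumptions.

def neighbours(r, c, rows, cols):
    for dr, dc in ((0, 1), (1, 0), (0, -1), (-1, 0)):   # assumed probing order
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols:
            yield nr, nc

def build_spiral(works):
    """works[r][c] is True if the chip at (r, c) passes its self-test.
    Returns the chain of working chips, i.e. the map that would be stored."""
    rows, cols = len(works), len(works[0])
    rim = [(r, c) for r in range(rows) for c in range(cols)
           if r in (0, rows - 1) or c in (0, cols - 1)]
    start = next(((r, c) for r, c in rim if works[r][c]), None)
    if start is None:
        return []                       # no working chip on the rim: wafer unusable
    chain, used = [start], {start}
    while True:
        r, c = chain[-1]
        nxt = next(((nr, nc) for nr, nc in neighbours(r, c, rows, cols)
                    if works[nr][nc] and (nr, nc) not in used), None)
        if nxt is None:                 # ran out of working, unused neighbours
            return chain                # this chain stands in for the stored map
        chain.append(nxt)
        used.add(nxt)

wafer = [
    [True,  True,  True ],
    [False, True,  True ],
    [True,  True,  False],
]
print(build_spiral(wafer))   # chain of working chips, starting at the rim
```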
Sinclair saw a sweet spot in the market at a time when RAM prices were still fairly high and hard disk systems were very expensive. After preliminary meetings in 1983, Catt convinced Sinclair that the idea was "for real", and development started on a 512 kB memory that would reduce the cost of future Sinclair products; it was a similar low-cost development from Ferranti that had made the original Sinclair products so inexpensive. Sinclair eventually organized a new company, Anamartic, to produce the design, but a crash in RAM prices soon rendered the entire system uneconomic.
Ivor Catt continues to promote the idea today, as the basis for a spiral-like supercomputer system he calls the Kernel. A basic Kernel would include 1 million processors in a 1000 by 1000 grid.