Copper interconnects
Full of potential but fraught with dangers, copper emerged as a semiconductor standard through ingenuity and persistence
Copper interconnects in a microchip as seen through a scanning electron microscope.

IBM wowed the world in 1997 with a new breed of semiconductor made with copper, a metal whose successful application to chipmaking had eluded scientists for more than 30 years. The breakthrough yielded faster, cheaper chips and a new roadmap for advancements in microprocessors. Because copper wires are far more durable, 100 times more reliable, and can be shrunk to smaller sizes, the move away from aluminum ushered in an explosion of devices with computing capabilities, from smartphones to automobiles.

The announcement was a lightning bolt to a semiconductor industry desperate for a spark of inspiration. Locked in a technological arms race to make faster, smaller and more durable chips, manufacturers shrank transistors to ever more minuscule sizes and crammed more and more of them onto wafers, nudging up performance while finding creative ways to maximize space. Other components on the chip were having a hard time keeping up.

Interconnects, or the wires that pass electricity between transistors, were a particular area of concern. As chips became more tightly packed, interconnects had a bigger job to do at smaller sizes. Aluminum, an ideal material for interconnects and the industry standard until that point, was fast approaching the limits of its conductive capacity. What’s more, aluminum wires were more susceptible to breakage than copper ones. Enter copper, just in time.

Ensuring the safety of copper

For years, scientists had understood the enormous potential of copper as a highly conductive and resilient metal, and its benefits for chipmaking were clear. Copper wires conduct electricity with about 40% less resistance than aluminum, which translates into roughly a 15% boost in microprocessor speed. IBM also forecast that using copper wiring could reduce the cost of making chips by as much as 15%.
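That resistance figure lines up roughly with the bulk resistivities of the two metals, about 1.7 µΩ·cm for copper versus 2.7 µΩ·cm for aluminum. As a minimal, hypothetical illustration (not part of IBM’s announcement), the short Python sketch below computes the resistance of two wires of identical, assumed geometry from those textbook values; real interconnects at such small dimensions are more complicated.

```python
# Back-of-the-envelope comparison of copper vs. aluminum wire resistance,
# using approximate bulk resistivities (ohm-meters). Real interconnects at
# very small dimensions behave differently; this is only an illustration.
RHO_CU = 1.7e-8  # copper, approximate bulk resistivity
RHO_AL = 2.7e-8  # aluminum, approximate bulk resistivity

def wire_resistance(resistivity, length_m, cross_section_m2):
    """R = rho * L / A for a uniform wire."""
    return resistivity * length_m / cross_section_m2

# Hypothetical identical geometry for both wires: 1 mm long, 0.25 um x 0.5 um.
length = 1e-3
area = 0.25e-6 * 0.5e-6

r_cu = wire_resistance(RHO_CU, length, area)
r_al = wire_resistance(RHO_AL, length, area)

print(f"Copper:   {r_cu:.0f} ohms")
print(f"Aluminum: {r_al:.0f} ohms")
print(f"Copper resistance is about {(1 - r_cu / r_al) * 100:.0f}% lower")  # ~37%
```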

Aluminum’s days were clearly numbered, but despite decades of research no one had figured out how to safely incorporate copper into chip design. The primary hurdle was a troubling by-product of copper’s interaction with silicon: unlike aluminum, copper atoms readily diffuse into silicon, corrupting its electrical properties and potentially rendering a chip useless. “Copper was considered to be a killer of semiconductor devices,” said Lubomyr Romankiw, an IBM Fellow and expert on copper applications, in IBM Research Magazine. “The conventional wisdom was to stay as far away from copper as you could.”

There was also the challenge of mass production. Even a few stray copper atoms could contaminate tools in a shared factory that was making aluminum/silicon chips, potentially costing the company millions in lost time and product.

Copper wires as conductors: 40% less resistance than aluminum; 15% boost in microprocessor speed; up to 15% forecasted cost reduction in chipmaking
Three core challenges

Making copper interconnects viable required solving three problems: how best to chemically deposit the copper on the wafer, how to protect the silicon from being poisoned, and how to physically lay the copper out on the chip.

IBM tested several methods of applying the copper, including sputtering, which deposits solid copper from a vapor, and electroless plating, which deposits copper from a chemical bath without an external electrical current. While sputtering initially seemed promising, scientists settled on a third option: electrolytic plating, which uses an electrical charge to draw copper ions out of a liquid bath and onto the wafer. Seemingly impractical but familiar, the approach proved surprisingly successful.

To protect the silicon, IBM scientists deployed a stable metal that the company had researched in the mid-1980s as a diffusion barrier against stray copper ions, and devised a way to deposit that barrier on the wafers along with the copper. To pattern the wires themselves, researchers borrowed an etching technique that IBM had invented in the early 1980s for its DRAM program.

Named for the metallurgists of old Damascus, Syria, who perfected a process for inlaying metal, the so-called dual-damascene method of etching the wires and vias of the copper interconnects was critical to the project’s success. By eliminating deposition and polishing steps from the typical manufacturing process, it created a strong economic incentive to pursue a workable solution in copper.

A breakthrough that almost didn’t happen

Despite all the enthusiasm, money and effort going into copper research, the breakthrough almost didn’t happen. While the company remained committed to copper studies even during its financial challenges of the early 1990s — spending millions on a dozen different projects before 1994 — other priorities nearly eclipsed the work.

IBM was moving to a new, complementary metal-oxide semiconductor (CMOS) technology for chips and planned to discontinue its research on bipolar chip technology. And although an IBM Research team had devised a way to use copper, it wasn’t tenable in mass production. It was a familiar challenge: scaling and commercializing an invention that showed great promise in the lab. “We can’t work with possibilities; we need a practical solution,” management at IBM’s Burlington, Vermont, chip manufacturing plant told the team.

One small team remained committed. Having gone back to the drawing board, they made a last push to overcome production hurdles. John Heidenreich, a research manager at IBM Research, got a list from the Burlington plant outlining the nine obstacles to moving the solution out of development. The team had six weeks to prove it could address these “Nine Commandments,” as they became known, and deliver product into the real world.

Full-scale manufacturing
1997

Three years later, on September 22, the company announced that it would start full-scale manufacturing of chips using copper. It also launched a new service platform to help other electronics manufacturers design their own copper-based chips.

1998

In September, the company started shipping the copper technology in its PowerPC 740/750 microprocessor line, replacing the 750’s aluminum 300 MHz design with 400 MHz-capable copper chips. Hailing the copper chip’s strikingly small size, performance and power-saving characteristics, IBM also pledged to incorporate the breakthrough approach into the S/390, RS/6000, and AS/400 server families.

1999

IBM introduced the first enterprise server with copper chips. The S/390 G6 delivered a 50% performance boost over its predecessor by packing in two additional processors, a feat made possible by the new copper technology. With 31 chips, 14 microprocessors and more than 1.4 billion transistors on the 5-inch-square ceramic substrate, it was the world’s densest chip module.

2000s

In the ensuing decade, copper would become the industry standard for interconnects in chip design and drive continued gains in processing speeds and efficiency. In 2004, IBM received the US National Medal of Technology and Innovation for its decades of leadership in semiconductor technology and its contributions to the explosive growth in information technology and consumer products. IBM’s copper technology was among the cited achievements for the award.

Related stories

Silicon germanium chips
An inspired researcher overcame a nagging technical problem to pioneer massive opportunity in semiconductors

Dynamic random-access memory (DRAM)
A memory architecture sparked a revolution in computers, electronics and mobile phones

The IBM System/390
In 1990, IBM debuted a mainframe for an internet world