Dynamic random-access memory (DRAM)
The semiconductor-based memory architecture sparked a revolution in computers, consumer electronics and mobile phones

In 1966, Robert Dennard was lying on his living room couch in Westchester County, New York, when he had an idea that would come to change the trajectory of computing and provide the foundation for the tech-centric world we live in today. Dennard was part of an IBM team working to reimagine the way computers store information. He and his colleagues were fixated on a bulky, costly memory system that needed six transistors to store just 1 bit of data.

That evening, Dennard had a notion to reconfigure the contraption so it could perform the same feat far more quickly using a single transistor. His brainstorm planted the seed for the invention of dynamic random-access memory, or DRAM, the semiconductor-based memory architecture used by the majority of today’s computers, servers and consumer electronics, including mobile phones, game consoles and digital cameras.

Reimagining computer memory
Faster, more elegant and cheaper

In the mid-1960s, the basic element of computer memory was magnetic core storage, which comprised hundreds of thousands of doughnut-shaped ferrite elements, each about half a millimeter in diameter, laced into rectangular arrays with thin copper wires. Although their assembly had become highly automated, the achievable density, cost and performance of magnetic core systems were reaching their limits, even as the demand for computer memory was skyrocketing.

A practical alternative was emerging in the form of metal-oxide semiconductor (MOS) technology, in which data is stored on a memory chip. Dennard was working on an MOS project that, while promising, was highly complex — each memory cell used six transistors. The system worked, but it required considerable real estate on an integrated circuit to store just 1 bit of data. Worse, it was slow. The goal was to design something faster, more elegant and cheaper.

Dennard found inspiration in a presentation he had seen earlier that day by a group of IBM researchers who were trying to shrink magnetic memory to a compact 25-centimeter square. He followed their lead and sketched out a plan for a system that would hold a bit in a single transistor. It would store bits as positive or negative charges on capacitors (energy-storage devices within a miniaturized electronic circuit), which would be refreshed periodically with tiny amounts of energy. The charging process was a key stroke of brilliance. It enabled the capacitor to retain its data, and it made the RAM system “dynamic.”
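What “dynamic” means in practice is that each cell’s stored charge leaks away and must be rewritten before it fades below the point where a 1 can still be told apart from a 0. The toy Python sketch below illustrates that refresh cycle in the abstract; the leak rate, read threshold and refresh interval are invented purely for the example and bear no relation to Dennard’s actual circuit.

```python
# Toy model of the "dynamic" idea behind DRAM: a bit is held as charge on a
# capacitor that slowly leaks, so the memory must be refreshed periodically.
# All constants below are illustrative, not taken from any real DRAM part.

LEAK_PER_TICK = 0.05      # fraction of charge lost each time step
READ_THRESHOLD = 0.5      # charge above this level still reads as a 1
REFRESH_INTERVAL = 8      # refresh every N time steps

class DramCell:
    def __init__(self, bit: int):
        self.charge = 1.0 if bit else 0.0

    def leak(self):
        self.charge *= (1.0 - LEAK_PER_TICK)   # capacitor slowly loses charge

    def read(self) -> int:
        return 1 if self.charge > READ_THRESHOLD else 0

    def refresh(self):
        # Re-write whatever the cell currently reads, restoring full charge.
        self.charge = 1.0 if self.read() else 0.0

def run(bits, ticks=40, refresh=True):
    cells = [DramCell(b) for b in bits]
    for t in range(1, ticks + 1):
        for c in cells:
            c.leak()
        if refresh and t % REFRESH_INTERVAL == 0:
            for c in cells:
                c.refresh()
    return [c.read() for c in cells]

data = [1, 0, 1, 1, 0, 1]
print("with refresh:   ", run(data, refresh=True))    # data survives
print("without refresh:", run(data, refresh=False))   # the stored 1s fade to 0s
```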

Instantly obsolete: DRAM takes the market by storm
Dennard and IBM were issued the patent for DRAM in 1968. The technology was put into popular use in 1970 when Intel built a successful 1-kilobit DRAM chip using a three-transistor cell design. The simplicity, low cost and low power consumption of DRAM made earlier magnetic technologies obsolete almost instantly. DRAM immensely increased the capacity of digital information storage, leading to dramatic progress in information and telecommunications technology.
Under the oversight of IBM Fellow Dale L. Critchlow, Dennard and his team went on to propose guidelines for how to implement and scale DRAM. The team introduced the constant-field scaling theory, which provided a framework for making computers run faster on significantly less energy, thereby reducing operating costs. “The implications of scaling were remarkable,” Critchlow later wrote. “These were exactly the results we needed to develop a competitive low-cost memory.”
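For readers who want the gist of constant-field scaling, the rules from the 1974 Dennard et al. paper are commonly summarized as follows; this is a standard textbook restatement rather than a quotation, with κ > 1 as the scaling factor.

```latex
% Constant-field (Dennard) scaling: divide device dimensions and voltages by
% a factor \kappa > 1 and raise doping by \kappa; then, to first order:
\begin{align*}
\text{device dimensions } L,\ W,\ t_{ox} &\longrightarrow 1/\kappa \\
\text{supply voltage } V                 &\longrightarrow 1/\kappa \\
\text{drive current } I                  &\longrightarrow 1/\kappa \\
\text{gate capacitance } C               &\longrightarrow 1/\kappa \\
\text{circuit delay } \tau \sim CV/I     &\longrightarrow 1/\kappa \\
\text{power per circuit } P \sim VI      &\longrightarrow 1/\kappa^{2} \\
\text{power density } P/\text{area}      &\longrightarrow 1 \quad (\text{unchanged})
\end{align*}
```

Faster circuits at unchanged power density is exactly the combination that made scaled chips the competitive low-cost memory Critchlow described.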
The combination of DRAM and the first low-cost microprocessors accelerated the miniaturization of computing and opened the door to today’s high-powered handheld devices. In 1976, computer hobbyist Steve Wozniak built a user-friendly desktop computer containing 4 kilobytes of DRAM. It attracted the interest of Wozniak’s friend, Steve Jobs, and inspired them to start Apple Computer. Within a year, Wozniak had created the Apple II, improving upon his initial design by adding a faster microprocessor and more DRAM. The Apple II was the first commercially successful desktop computer built with DRAM. By 1984, sales of the Apple II reached 2 million.
A USD 100 billion market
DRAM gives rise to smartphones, ubiquitous photo and music libraries

Today’s highest-capacity DRAM chips can hold 4 billion bits, and the global DRAM market is expected to surpass USD 100 billion by 2026. More significantly, DRAM has become a foundational component of information technology and helped reshape the way we work, entertain ourselves, conduct business and educate our children. 

Some two-thirds of the human population rely on Dennard’s invention as an integral component of their smartphones, which use DRAM to store many gigabytes’ worth of movies, photos and music libraries. “I knew it was going to be a big thing,” Dennard said of DRAM in an interview after winning the Kyoto Prize in 2013, “but I didn’t know it would grow to have the wide impact it has today.”

Related stories

Robert Dennard
The inventor of DRAM laid the foundation for modern computing and received the US National Medal of Technology

The IBM PC
A USD 1,500 open-architecture machine became an industry standard and brought computing to the masses

Computer science
IBM fostered a highly consequential field to find new utility for its machines and to train the next generations of technical talent