Viewpoint: Why We Should Chase Nature's Most Advanced Computer
Jul 16, 2015


The study of the brain dates back to the ancient Egyptians. Through the Greeks, the Romans, and the medieval Muslim world, many contributions have been made to neuroscience, the scientific study of the nervous system. From organizations formed in the 1960s to provide forums and direction for neuroscientists and educators, to the most recent research programs in both the United States and Europe, the study of the brain not only serves national interests in science, medicine, economic growth, security, and well-being, but also offers us insight into nature’s most powerful “computer.”

“The human brain is the most complex mass of protoplasm on earth—perhaps even in our galaxy,” said Marian C. Diamond and Arnold B. Scheibel. Indeed, the brain packs 100 trillion (10^14) synapses into a two-liter volume and weighs just three pounds. But simulating even a simple mathematical model at the scale of the brain requires one of the largest supercomputers on earth, with 1.5 million processors, 1.5 petabytes of main memory, and 6.3 million threads, in a machine roughly the size of an Olympic swimming pool. Even then the simulation runs thousands of times slower than the brain. Thus, there is a billion-fold energy disparity between the brain and today’s advanced computers. There is a tremendous opportunity to mine Mother Nature’s patent portfolio and exploit the brain’s energy efficiency and scalability for perception, action, and cognition.
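To see where the “billion-fold” figure comes from, here is a rough back-of-envelope sketch. The brain’s power draw, the supercomputer’s power draw, and the exact slowdown factor are illustrative assumptions on my part (the brain is commonly estimated at roughly 20 watts, a machine of the class described draws on the order of several megawatts, and the text says the simulation runs thousands of times slower than real time); they are not figures taken from this article.

```python
# Back-of-envelope estimate of the brain-vs-supercomputer energy disparity.
# All three figures below are illustrative assumptions, not measurements
# quoted in the article.

BRAIN_POWER_W = 20.0          # assumed metabolic power of the human brain (~20 W)
SUPERCOMPUTER_POWER_W = 8e6   # assumed draw of a top-tier supercomputer (~8 MW)
SLOWDOWN_FACTOR = 1_500       # assumed slowdown of the simulation vs. real time

# Energy needed to cover one second of "brain time":
brain_energy_j = BRAIN_POWER_W * 1.0                     # brain runs in real time
sim_energy_j = SUPERCOMPUTER_POWER_W * SLOWDOWN_FACTOR   # simulator needs ~1,500 s

disparity = sim_energy_j / brain_energy_j
print(f"Energy disparity: ~{disparity:.1e}x")  # on the order of 10^8-10^9, i.e. roughly billion-fold
```

Under these assumptions the ratio lands in the hundreds of millions, which is the order of magnitude behind the billion-fold claim.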

The opportunity for brain-inspired computing has become increasingly palpable as neuroscience has steadily gathered data at finer spatial and temporal scales, using both imaging and in-vivo animal studies, leading to ever-deeper insights into dynamics, structure, function, and behavior. Translating these insights into applications and systems is key to deep learning, which promises to become vital to civilization’s need to deal with a tsunami of sensory data from video, audio, images, and text.

Today, brain-inspired computing has come of age. In August 2014, after a decade-long research effort under the auspices of DARPA’s SyNAPSE program, IBM demonstrated a novel, parallel, distributed, modular, scalable and fault-tolerant brain-inspired computing architecture that breaks with the 70-year-old von Neumann computer architecture. We have built an end-to-end ecosystem that consists of an index-card-sized board with 1 million neurons; a simulator; a programming language; an integrated programming environment; a library of algorithms and applications; firmware; a tool for composing neural networks for deep learning; a teaching curriculum; and cloud enablement.
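The programming model for such chips is organized around networks of spiking neurons rather than sequential instruction streams. The sketch below is not IBM’s programming language or API; it is a minimal, generic leaky integrate-and-fire neuron in Python, included only to illustrate the event-driven, spike-based style of computation these architectures favor. All names and parameter values here are hypothetical.

```python
# Minimal, generic leaky integrate-and-fire (LIF) neuron -- an illustrative
# sketch of event-driven, spike-based computation. This is NOT IBM's SyNAPSE
# programming model; names and parameters are hypothetical.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.95):
        self.potential = 0.0        # membrane potential
        self.threshold = threshold  # firing threshold
        self.leak = leak            # per-tick decay of the potential

    def tick(self, weighted_input_spikes):
        """Advance one discrete time step; return True if the neuron fires."""
        self.potential = self.potential * self.leak + sum(weighted_input_spikes)
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after emitting a spike
            return True
        return False

# Example: one neuron driven by three synapses over a few time steps.
neuron = LIFNeuron()
inputs_per_tick = [[0.4, 0.0, 0.3], [0.5, 0.2, 0.0], [0.0, 0.0, 0.1]]
spikes = [neuron.tick(x) for x in inputs_per_tick]
print(spikes)  # [False, True, False]
```

Because such neurons only communicate when they spike, most of the fabric sits idle most of the time, which is one intuition behind the energy-efficiency claims for this class of hardware.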

Because of its energy efficiency, this technology is ideal for always-on context generation in sensor hubs for smartphones and wearables. Due to its real-time multimodal sensor-fusion capability, the technology is well suited for embedded intelligence that provides contextual understanding in automobiles, robotics, cameras, and medical imaging. Owing to its inherent scalability, the technology is a natural fit for cloud data centers and synaptic supercomputers that can ingest a wide variety of sensory and binary inputs.

We now have an energy-efficient substrate for deep learning, which can become the foundation of a new generation of high-performance computing machines that can accelerate U.S. innovation and competitiveness.  Within a decade, it will become possible to build a “brain-in-a-box” Synaptic Supercomputer with ten billion neurons that consumes merely one kilowatt of power while occupying less than two liters of volume.  The economic and societal implications of such technological advances will touch every facet of civilization.

Today in Washington, D.C., Dharmendra Modha joined a high-level panel of experts convened to explore the intersection of neuroscience, advanced computing and the state of scientific research policy in the United States. Representatives from IBM, the Argonne and Lawrence Livermore National Laboratories, MIT and Penn State University showcased how studying the brain has pushed the envelope of computing capability. In turn, they pointed out how the latest generation of supercomputers is advancing progress in neuroscience.

The panel’s overarching message to policy makers was clear: federal funding, not only for neuroscience and advanced computing but for scientific research as a whole, is critical both to our development as a nation and to the advancement of human knowledge. In recent years, however, investment has been notably reduced and is now flat at best.

Talented technologists from industry, academia and the U.S. national labs have been working tirelessly to sustain the pace of discovery despite resource constraints. During today’s event, we announced that IBM will open its brain-inspired computing chip to a community of developers who can further advance its potential by leveraging its revolutionary capabilities as the foundation for new innovations. These R&D partnerships are crucial, but they cannot fully offset reductions in federal funding.

Now is the time to marshal our collective will and step up the national investment needed to fully realize the inherent potential of brain-inspired computing technology. The benefits will ripple throughout the country, and around the world. Lawmakers need to hear from all of us on the importance of continued federal investment in scientific research.
