The Day that Changed Everything for Cognitive Computing Pioneer Dharmendra Modha


Most people don’t remember the exact day they realized what they wanted to do with the rest of their lives. Maybe it was a crisp fall day halfway through high school, or college, or even middle school.
But that’s not the case for Dr. Dharmendra Modha. His “day” was July 16, 2004—and he remembers it vividly. That’s the day everything changed.
By 2004, Modha was already well on his way to being considered a computing pioneer. He joined IBM after receiving his bachelor’s degree in computer science from the Indian Institute of Technology and his Ph.D. in electrical and computer engineering from the University of California, San Diego.
Once at IBM, Modha led a series of extremely successful projects. He invented a code that went into every IBM disk drive; he invented algorithms to visualize data in tens of thousands of dimensions, which eventually became part of Watson; and he invented caching algorithms for large storage systems, which have generated billions of dollars for IBM over the years.
“But then, I became acutely aware of the finiteness of life,” Modha recalled to R&D Magazine. “I wanted to do something that could have a paradigm-shifting effect on the field of computing. Something that would make the world better in a deep sense. But it had to have maybe just a sliver of chance of working. A very high-risk, high-leverage project.”
After meditating for a year on what to do next, Modha came up with just what he wanted—the crazy, almost impossible idea to build a brain-inspired computer.
But, can someone really build a computer inspired by the brain? After all, the human brain boasts about 100 trillion (10¹⁴) synapses and 100 billion (10¹¹) neurons firing anywhere from five to 50 times per second.
The point was never to compete with existing computers, Modha explains. “It was always, how can we complement today’s computers?”
Cognitive computing, or brain-inspired computing, aims to emulate the human brain’s abilities for perception, action and cognition. Traditional computers are symbolic, fast and sequential with a focus on language and analytical thinking—much like the left brain.
The neurosynaptic chips Modha and his team design are much more like the right brain—slow, synthetic, capable of addressing the five senses as well as pattern recognition.
Today’s chip—called TrueNorth—features 1 million neurons and 256 million synapses, consumes 17 milliwatts of power, and is about 4 square centimeters in size.
Based on an innovative algorithm just published in September, TrueNorth can efficiently implement inference with deep networks to classify image data at 1,200 to 2,600 frames per second while consuming a mere 25 to 275 milliwatts. This means the chip can detect patterns in real-time from 50 to 100 cameras at once—each with 32x32 color pixels and streaming information at the standard TV rate of 24 fps—while running on a smartphone battery for days without recharging.
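As a rough back-of-the-envelope check, those camera and battery figures follow from simple division. In the sketch below, the frame rates and power numbers are the ones quoted above; the roughly 10-watt-hour smartphone battery is an assumed value, not a figure from IBM.

```python
# Back-of-the-envelope check on the throughput and power figures quoted above.
# Assumption (not from the article): a smartphone battery holds roughly 10 watt-hours.

CAMERA_FPS = 24                            # per-camera frame rate quoted above
chip_fps_low, chip_fps_high = 1200, 2600   # reported classification throughput
power_w_low, power_w_high = 0.025, 0.275   # reported power draw (25-275 mW), in watts
BATTERY_WH = 10.0                          # assumed smartphone battery capacity

cameras_low = chip_fps_low // CAMERA_FPS   # 50 cameras
cameras_high = chip_fps_high // CAMERA_FPS # ~108 cameras

hours_worst = BATTERY_WH / power_w_high    # ~36 hours at the highest power draw
hours_best = BATTERY_WH / power_w_low      # ~400 hours at the lowest power draw

print(f"cameras supported: {cameras_low}-{cameras_high}")
print(f"battery life: {hours_worst:.0f}-{hours_best:.0f} hours "
      f"({hours_worst / 24:.1f}-{hours_best / 24:.1f} days)")
```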
“The new milestone provides a palpable proof-of-concept that the efficiency of brain-inspired computing can be merged with the effectiveness of deep learning, paving the path towards a new generation of cognitive computing spanning mobile, cloud and supercomputers,” Modha explained.
The novel algorithm builds off the scaled-up platform IBM was able to deliver to Lawrence Livermore National Laboratory in March 2016. Called NS16e, the configuration consists of a 16-chip array of TrueNorth processors designed to run large-scale networks that do not fit on a single chip. The NS16e System interconnects TrueNorth chips via a built-in chip-to-chip message-passing interface that does not require additional circuitry or firmware.
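To get a feel for what “does not fit on a single chip” means, here is a small illustrative calculation. The per-chip capacities are the published TrueNorth figures; the example network size is purely hypothetical.

```python
import math

# Published per-chip capacity and the NS16e chip count.
NEURONS_PER_CHIP = 1_000_000
SYNAPSES_PER_CHIP = 256_000_000
CHIPS_IN_NS16E = 16

# Hypothetical network that is too big for one chip (example numbers only).
network_neurons = 12_000_000
network_synapses = 3_000_000_000

chips_needed = max(math.ceil(network_neurons / NEURONS_PER_CHIP),
                   math.ceil(network_synapses / SYNAPSES_PER_CHIP))
print(f"chips needed: {chips_needed}, fits on NS16e: {chips_needed <= CHIPS_IN_NS16E}")
```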
Both the algorithm and the scaled-up version of TrueNorth are the culmination of 12 ½ years of research and development, dating all the way back to that July day in 2004.
The beginning and the middle
Once the project received a green light and funding from IBM in 2006, Modha quickly identified three elements that were crucial to the success of his computer: neuroscience, supercomputing and architecture.
After all, to build a brain-inspired computer, one must first understand how the brain works. Modha and his team consumed every bit of published information available about the brain, including 30 years of research on neurons. They ended up mapping the largest long-distance wiring diagram of the brain—383 regions of the macaque monkey brain, illustrating 6,602 connections.
Besides being “the most beautiful illustration” Modha has ever seen, the map successfully provided the researchers with a platform to study the brain as a network.
The team turned to supercomputing simulations next. Luckily, they didn’t have to go far, as IBM owns some of the most important milestones in supercomputing history, including the development of the Blue Gene/L, Blue Gene/P and Blue Gene/Q.
Modha carried out a series of increasingly large and increasingly complex simulations on the largest Blue Gene supercomputers IBM had to offer. The largest simulation was done on the Blue Gene/Q—it was able to simulate a brain-like graph at the scale of 100 trillion (10¹⁴) synapses.
While that’s the same scale as the number of synapses in the human brain, there was a major discrepancy—the simulation ran 1,500x slower than real time, even when using much simpler connectivity and computation than the brain.
“We figured a hypothetical computer designed to run the brain’s 100 trillion synapses in real-time would require 12 gigawatts of power,” Modha said, explaining what he learned from the supercomputer simulations. “That’s enough to power NYC and LA. In contrast, the human brain consumes just 20 watts. So, there’s a billion-fold disparity between modern computers and what the brain can do. And that’s really what led us to the third element.”
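The arithmetic behind that estimate can be reconstructed roughly as follows. The 1,500x slowdown and the brain’s 20 watts come from the article; the multi-megawatt power draw assumed for the Blue Gene/Q machine is an outside ballpark, not a figure Modha quotes.

```python
# Rough reconstruction of the power extrapolation described above.
# Assumption (not in the article): the Blue Gene/Q machine used for the
# 100-trillion-synapse simulation draws on the order of 8 megawatts.

BLUE_GENE_Q_POWER_W = 8.0e6   # assumed supercomputer power draw, in watts
SLOWDOWN = 1500               # the simulation ran ~1,500x slower than real time
BRAIN_POWER_W = 20            # the human brain, in watts

# Scaling the machine up to run in real time scales power roughly linearly.
real_time_power_w = BLUE_GENE_Q_POWER_W * SLOWDOWN   # ~1.2e10 W, i.e. ~12 gigawatts
disparity = real_time_power_w / BRAIN_POWER_W        # ~6e8, i.e. of order a billion

print(f"hypothetical real-time machine: {real_time_power_w / 1e9:.0f} gigawatts")
print(f"disparity versus the brain: {disparity:.1e}x")
```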
The third element was perhaps the riskiest, and therefore potentially the most rewarding. Modha wanted to turn 70+ years of computing on its head by designing a brand-new architecture that was completely different from the traditional von Neumann architecture.
Described in 1945 and prevalent in most of today’s computers, the von Neumann architecture refers to an electronic digital computer in which program memory and data memory share a single bus. Because the processor cannot fetch an instruction and move data over that bus at the same time, throughput (the data transfer rate) between the CPU and memory is limited relative to the amount of memory, and power must increase as the communication rate (clock frequency) increases.
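A toy model makes the shared-bus idea concrete. The sketch below is not any real instruction set, just a minimal fetch-execute loop in which program and data live in one memory, so every step competes for the same channel.

```python
# Toy von Neumann machine: program and data share one memory, so every
# instruction fetch and every load/store goes over the same (simulated) bus.
memory = [
    ("LOAD", 8),    # addr 0: load memory[8] into the accumulator
    ("ADD", 9),     # addr 1: add memory[9] to the accumulator
    ("STORE", 10),  # addr 2: store the accumulator to memory[10]
    ("HALT", None), # addr 3
    None, None, None, None,
    2, 3,           # addrs 8-9: data operands
    0,              # addr 10: result
]

bus_accesses = 0
acc, pc = 0, 0
while True:
    op, arg = memory[pc]; bus_accesses += 1   # instruction fetch uses the bus
    pc += 1
    if op == "LOAD":
        acc = memory[arg]; bus_accesses += 1  # data access uses the same bus
    elif op == "ADD":
        acc += memory[arg]; bus_accesses += 1
    elif op == "STORE":
        memory[arg] = acc; bus_accesses += 1
    elif op == "HALT":
        break

print(f"result = {memory[10]}, bus accesses = {bus_accesses}")
# Every operation is serialized through that one memory channel -- the
# "von Neumann bottleneck" that TrueNorth's design sidesteps.
```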
Of course, Modha turned to the brain for inspiration on how to design a new architecture. His research turned up a neuroscience hypothesis that the brain is composed of canonical cortical microcircuits, or tiny circuits that compose the fabric of the cerebral cortex. Applying this to computing, Modha sought to design an architecture based on tiny modules that could be tiled to create an overall system—which is precisely what TrueNorth is.
“To prove the hypothesis, in 2011, we demonstrated a tiny little module, a neurosynaptic core with 256 neurons, the scale of a worm brain,” Modha explained. “This tiny little module formed the foundation. Then we shrank this core in area by an order of magnitude, in power by two orders of magnitude, then tiled 4,096 of these tiny cores to create the chip that is now called TrueNorth.”
TrueNorth’s brain-inspired architecture consists of a network of neurosynaptic cores that are distributed and operated in parallel. Unlike von Neumann architecture, TrueNorth’s computation, memory, and communication are integrated, which results in a cool operating environment (allowing the chips to be stacked) and low power operation. Individual cores can fail and yet, like the brain, the architecture can still function. Cores on the same chip communicate with one another via an on-chip event-driven network. Chips communicate via an inter-chip interface leading to seamless scalability.
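To make the contrast concrete, here is a deliberately simplified toy of the tiled, event-driven idea. It is not TrueNorth’s actual circuit or programming model, just an illustration of cores that keep neurons and synapses local and exchange nothing but spike events.

```python
import random
random.seed(0)

class NeurosynapticCore:
    """Toy tile: neurons, their synapses, and their state all live on the core."""
    def __init__(self, core_id, n_neurons=256, threshold=1.0, leak=0.05, weight=0.3):
        self.core_id = core_id
        self.n = n_neurons
        self.potentials = [0.0] * n_neurons                  # local neuron state
        # Local synapse table: each incoming spike line fans out to 8 neurons.
        self.fanout = {j: random.sample(range(n_neurons), 8) for j in range(n_neurons)}
        self.threshold, self.leak, self.weight = threshold, leak, weight

    def step(self, incoming):
        """Integrate this tick's incoming spike events and emit outgoing ones."""
        for j in incoming:
            for target in self.fanout[j]:
                self.potentials[target] += self.weight       # local memory access only
        outgoing = []
        for i in range(self.n):
            self.potentials[i] = max(0.0, self.potentials[i] - self.leak)
            if self.potentials[i] >= self.threshold:
                self.potentials[i] = 0.0                     # fire and reset
                outgoing.append(i)                           # only an address leaves the core
        return outgoing

# Tile four cores in a line; each forwards its spike events to the next core.
cores = [NeurosynapticCore(i) for i in range(4)]
inbox = [[] for _ in cores]
for tick in range(20):
    inbox[0] = random.sample(range(256), 32)    # "sensor" spikes feeding the first core
    new_inbox = [[] for _ in cores]
    for i, core in enumerate(cores):
        spikes = core.step(inbox[i])
        if i + 1 < len(cores):
            new_inbox[i + 1] = spikes           # event-driven, core-to-core traffic
    inbox = new_inbox
print("spike events arriving at the last core:", len(inbox[-1]))
```

Real TrueNorth chips are, of course, far richer than this toy, but the structural idea is the same: local memory, local computation, and spike events as the only traffic between tiles.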
