Computers versus Brain: Who's the Winner? Neuromorphic Computing [Part 2/3]
In the first part of this article, we set out to find the winner: the human brain or computers. Computers took the first point, as we discovered the huge impact new technologies have on our everyday tasks, and especially how AI is amplifying advances in medicine, from brain-tumor detection to cancer diagnosis.
The human brain is more complex than any other known structure in the universe.
One might think the result in the first part of the article wasn't entirely fair. To open another horizon, follow what comes next: we will dive deep into both brain and computer architectures and see how fascinating each one is.
The Magical Thing Between Our Ears
Lex Fridman said in one of his lectures talking about Deep learning:
"That visualization is just 3% of the neurons in our brain of the thalamocortical system, that magical thing between our ears that allow us all to see and hear and think and reason and hope and dream and fear our eventual mortality. All of that is the thing we wish to understand. That's the dream of artificial intelligence and recreating versions of it, echoes of it, in the engineering of our intelligence systems. That's the dream."
So why are scientists and engineers in the AI field so obsessed with understanding the complexity of the human brain?
The Energy Challenge
Training an AI algorithm to handle complex tasks, or even the easiest ones, like telling cat pictures from dog pictures, requires huge numbers of images as input to the neural network, and the training process can take anywhere from hours to several days.
And in reality, training a single AI model can in some cases consume over three times the yearly energy consumption of the average American.
Our brains, on the other hand, need only a few pictures and the animals' names, and the information is processed and memorized in a matter of seconds. That is powerful computing, and it is worth understanding how it works!
In order to understand what's going on inside both the human brain and computers, we need to take a look into both architectures and see what differences translate into different computational results.
Processor Architecture
In the classic Von Neumann architecture (the famous computer hardware architecture still used today), programs and data are held in memory. The processor and memory are separate, and data flows between the two. In that configuration, latency is unavoidable.
In the figure below, you can see the continuous connection between the central processing unit (CPU), where all the processing (mathematical and logical) happens, and memory, via buses (the green lines).
That means that no matter how fast a given processor can work, in effect it is limited to the rate of transfer allowed by the data buses.
The Von Neumann Bottleneck
As speeds have increased, the processor has spent an increasing amount of time idle, waiting for data to be fetched from memory.
And with all the different architectures that exist in our devices today, we can't even come close to brain performance (as shown in the figure below). More importantly, current processor architectures won't be able to run the AI of the future: even supercomputers struggle to run some AI models on the classic architecture.
The von Neumann bottleneck has often been considered a problem that can only be overcome through significant changes to computer or processor architectures.
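A back-of-the-envelope sketch in Python makes the bottleneck concrete. All the numbers below are illustrative assumptions, not measurements of any real chip:

```python
# Roofline-style estimate: effective throughput is capped by whichever is
# smaller, the processor's peak compute rate or the rate at which the memory
# bus can feed it operands. All numbers are illustrative assumptions.

peak_flops = 1e12          # processor peak: 1 TFLOP/s (assumed)
bus_bandwidth = 50e9       # memory bus: 50 GB/s (assumed)
bytes_per_flop = 8         # one 64-bit operand fetched per operation (assumed)

# If every operation needs fresh data from memory, the bus sets the ceiling:
memory_bound_flops = bus_bandwidth / bytes_per_flop
effective_flops = min(peak_flops, memory_bound_flops)

print(f"Peak compute:     {peak_flops:.2e} FLOP/s")
print(f"Memory-bound cap: {memory_bound_flops:.2e} FLOP/s")
print(f"Effective rate:   {effective_flops:.2e} FLOP/s")
```

With these assumed numbers, the bus caps the processor at well under 1% of its peak rate; the rest of the time it sits idle, waiting for data.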
That's why we need better machines!
Brain "Architecture"
The human brain is the most powerful supercomputer in the world. It helps us navigate our environment while carrying out an estimated billion billion (10¹⁸) operations per second. It's compact, uses less power than a lightbulb, and has enormous storage capacity.
So what's in the brain that makes it this complex and magical?
Parallel Processing
Brains compute in parallel as the electrically active cells inside them, called neurons, operate simultaneously and unceasingly. Neurons influence one another's electrical pulses via connections called synapses.
When information flows through a brain, it is processed as a fusillade of spikes that spread through its neurons and synapses.
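To make that "fusillade of spikes" concrete, here is a minimal leaky integrate-and-fire neuron, a standard simplified model of a spiking neuron. All constants are illustrative, not biologically calibrated:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane voltage
# integrates incoming current, leaks toward rest, and emits a spike when
# it crosses a threshold. Constants are illustrative.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Return the list of time steps at which the neuron spikes."""
    v = 0.0            # membrane potential
    spikes = []
    for t, i_in in enumerate(input_current):
        v = leak * v + i_in      # leak, then integrate the input
        if v >= threshold:       # threshold crossing -> spike
            spikes.append(t)
            v = 0.0              # reset after spiking
    return spikes

# Constant drive of 0.3 per step: the neuron charges up, fires, resets,
# and repeats -- producing a regular spike train.
print(simulate_lif([0.3] * 20))  # [3, 7, 11, 15, 19]
```

Note that the neuron only "costs" anything when it spikes; between spikes it is quietly integrating. Networks of such units, communicating sparsely by spikes, are the basic idea behind neuromorphic hardware.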
To illustrate: you are able to recognize the words in this paragraph because input from your eyes triggers a particular pattern of electrical activity in your brain.
Co-located Memory and Processing
The brain doesn't have separate modules for processing and memory like the Von Neumann architecture.
Instead, neurons combine (co-locate) memory and processing in the same unit. In simple words, the processing unit and memory in the human brain are not separated, but both exist in the same place.
This synaptic "plasticity" is the basis of both short-term and long-term memory and thus is fundamentally responsible for how we learn.
Power Efficiency
When we perform a task (at bottom, a computation in the brain), only a small fraction of our neurons are active at any given moment. That is one reason the human brain is extremely power efficient:
- Power consumption: 20 watts
- Computational capacity: 1 exaFLOP (10¹⁸ floating-point operations per second)
Meanwhile, with the rapid rise of machine learning (ML), we also meet another challenge: power consumption. Our journey toward true AI may be slowed down, or even stalled, by power limitations.
The Brain vs Supercomputers
Let's put this into perspective:
World's fastest supercomputer (IBM Summit):
- Power consumption: 30 megawatts
- Computational capacity: 200 petaflops
Human brain:
- Power consumption: 20 watts
- Computational capacity: 1 exaflop
That's five times the computational capacity of IBM Summit!
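The gap is even starker when measured in operations per watt. A quick check of the arithmetic, using only the figures quoted above:

```python
# Comparing computational efficiency (operations per second per watt)
# using the figures quoted above for IBM Summit and the human brain.

summit_flops = 200e15      # 200 petaflops
summit_watts = 30e6        # 30 megawatts

brain_flops = 1e18         # ~1 exaflop (estimate)
brain_watts = 20           # ~20 watts

summit_eff = summit_flops / summit_watts   # FLOP/s per watt
brain_eff = brain_flops / brain_watts

print(f"Summit: {summit_eff:.2e} FLOP/s per watt")
print(f"Brain:  {brain_eff:.2e} FLOP/s per watt")
print(f"Brain is roughly {brain_eff / summit_eff:,.0f}x more power-efficient")
```

By these (rough, estimated) numbers, the brain comes out millions of times more power-efficient than the world's fastest supercomputer.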
And without forgetting that the human brain deserves credit for creating AI, and the hardware that runs it, in the first place, the brain earns a point here (1–1).
The Verdict (So Far)
Arguably, we have just realized that however advanced our technologies and our CPUs/GPUs are, we are not yet capable of achieving what the human brain can do. That's why we need new hardware technologies to compete with this complicated biological organ and achieve the dream of future AI!
So obviously the brain is a big winner versus the computers at this point.
Score: Brain 2 – 1 Computer
Find out who's the winner in the third and last part of this article series!