Computers versus the Brain: Who's the Winner? [Part 3/3]
In Part 2, we talked about how amazing and sophisticated the human brain is, and how differently it works from our devices (computers). We also discussed how powerful a computational organ it is: it exceeds the computing performance of computers while consuming far less energy.
Why Don't We Take This Amazing Architecture and Put It in Hardware?
Why don't we take this amazing, sophisticated architecture of the human brain and put it in a solid state (the hardware of our devices)? Or is that even possible?!
Neuromorphic Computing: Hardware That Works Like the Brain!
Guess what! Yes, it is possible! And that is exactly what scientists did, creating a revolutionary new hardware technology called neuromorphic computing.
Inside a neuromorphic computer, the core component is completely different: an electronic brain. It is artificial, but it is a physical equivalent of neurons and synapses, communicating through electric signals. That makes it much closer to the biological brain, and much faster and more energy-efficient than a software emulation of it.
The Memristors
To build these synapse-like chips, the memristor provides the answer. It's a device consisting of two metal electrodes separated by a thin film of titanium dioxide (TiO₂).
The benefits of memristor technology include the following:
- Memristors do not consume power when idle and are compatible with CMOS interfaces
- Higher density allows more information to be stored
- Would allow for a faster boot-up since information is not lost when the device is turned off
- Memristor = Hard disk + RAM
- Uses less energy and generates less heat
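To make the "memory" in memristor concrete, here is a minimal Python sketch of the classic HP linear-drift model of the TiO₂ device described above. The parameter values are illustrative assumptions, not measurements of a real device:

```python
import numpy as np

# Illustrative (hypothetical) parameters for the HP linear-drift memristor model
R_ON, R_OFF = 100.0, 16_000.0  # fully doped / undoped resistances (ohms)
D = 10e-9                      # thickness of the TiO2 film (m)
MU_V = 1e-14                   # dopant drift mobility (m^2 s^-1 V^-1)

def simulate(voltage, dt, x0=0.5):
    """Integrate the doped fraction x = w/D under an applied voltage waveform."""
    x = x0                     # normalized width of the doped region, 0..1
    currents = []
    for v in voltage:
        r = R_ON * x + R_OFF * (1 - x)    # effective resistance depends on x
        i = v / r
        x += MU_V * R_ON / D**2 * i * dt  # linear drift of the dopant boundary
        x = min(max(x, 0.0), 1.0)         # the boundary cannot leave the film
        currents.append(i)
    return np.array(currents)

t = np.linspace(0, 1, 1000)
v = np.sin(2 * np.pi * 2 * t)             # 2 Hz sinusoidal drive
i = simulate(v, dt=t[1] - t[0])
```

Because the resistance depends on the charge that has already flowed through the device, plotting `i` against `v` would trace a pinched hysteresis loop: the device "remembers" its history even with no power applied, which is exactly the synapse-like property neuromorphic chips exploit.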
Is Neuromorphic Better for Machine Learning?
Tests show performance gains of up to 10 million times for machine learning, because the physical architecture of the neuromorphic paradigm is much closer to the abstractions that neural networks are built on.
On such chips, neural networks are no longer simulated in software: they are real, physical networks built in electronic hardware.
So with memristors and neuromorphic engineering, one era ends… and another begins… the era of powerful neural networks and machine learning!
Both Architectures Are Fascinating
Both architectures are fascinating. The former is one of the most complicated systems in the universe; even scientists cannot fully understand it, or decipher the mystery of how it really works, what makes it such a powerful computational processor, and such a great memory container.
That organ is the creator of the second architecture. The computer's concept might seem simple, yet it is one of the most complicated objects ever created by human beings, and it is what channeled a new human revolution into existence.
And the human brain gets a point for this reason.
The Human Vulnerability Factor
That said, I would like to hammer on how vulnerable the human brain is to the many things that can influence its function. We humans get agitated, restless, argumentative, and panicky under pressure.
Most of us worry about:
- How others see us
- Where our careers are headed
- Important things about our lives that we have forgotten
We crave love but are thoughtless and insensitive to those close to us. Our bodies have a range of shameful habits and vulnerabilities… it's what makes us human! But we can't deny that computers are free of these kinds of vulnerabilities.
For that reason, computers get a point.
Understanding the Brain: The Challenge of the 21st Century
Having said this, we have now seen how this complicated organ helped computer hardware improve; and with that improvement come many benefits for our brains in return!
"Better understanding the human brain is really one of the challenges of the 21st century. We have an increasing amount of people suffering from neurodegenerative diseases, suffering from major depression, other psychiatric diseases. We need to have new tools to diagnose and have better therapies for these brain diseases. And since we are living in an aging population, these diseases, of course, play a major role in the future."
Brain-Computer Interface Technology: The Technology That Will Change Everything
A brain-computer interface (BCI) is a technology that wires the human brain to a machine, using brain or nervous-system signals to control a device or perform a certain activity. It may sound like technology from a science fiction movie, but it's real.
This technology allows humans to control many devices, just by thinking about it. How cool, right?
BCI helps with:
- Human disabilities (functional and physical)
- Brain and mental diseases
- Communication difficulties
- A way to better human life and further change the course of human evolution
How Does BCI Work?
A BCI system has three components:
1. The Physical Hardware
Responsible for collecting data from the brain/nervous system.
The physical hardware can be invasive or non-invasive:
Non-Invasive BCI:
- Most systems use electroencephalogram (EEG) signals
- Detect electrical activity in the brain using small, metal discs (electrodes) attached to the scalp
- Easier, safe, and painless
- The BCI device is a headset that is portable and wearable
Invasive BCI:
- Implanted directly into the brain's gray matter
- Requires neurosurgery
- Devices are chips with thousands of pins/electrodes to detect brain signals
- Transmits data to computers wirelessly
- More efficient in reading signals - produces high-quality and precise brain signals
- Complicated and might not be safe even with advanced robotic surgical technologies
The most famous invasive BCI technology comes from Elon Musk's company, Neuralink.
2. The Software
Processes the collected data (using signal processing and machine learning algorithms).
The different brain frequency bands are classified according to what activity the user is performing at the moment; these become the input features for machine learning or deep learning algorithms. The goal is to decode the neural signals.
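As a rough illustration of that software stage, here is a minimal Python sketch (NumPy only) that extracts band-power features from one EEG window. The sampling rate, band edges, and the synthetic "recording" are typical assumed values for illustration, not tied to any specific headset or algorithm:

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz, typical for consumer EEG headsets

# Classic EEG frequency bands (Hz)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Return the mean spectral power in each band via a plain FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Synthetic 2-second window: a strong 10 Hz (alpha) rhythm plus noise
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(len(t))

features = band_powers(eeg)
```

In this window the alpha band dominates, so a downstream classifier could label it something like "relaxed / eyes closed". A real pipeline would add filtering, artifact removal, and many channels, but the shape is the same: raw signals in, band-power features out, decoded labels from a trained model.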
3. The Action
The deployment of the software into machines that will translate the results of the software into actions.
Once the algorithm can translate the input data using AI, the software can generate outputs such as:
- Moving a mechanical hand
- Writing and texting (the concept of Facebook/Meta BCI project)
- Controlling robots
- Playing video games
- But also, maybe, controlling memories and thoughts (a future Neuralink goal)
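The action stage above can be sketched as a simple dispatch from decoded intent labels to device commands. Everything here (the labels, the command strings, the `act` helper) is hypothetical, just to show the shape of the translation step:

```python
# Hypothetical mapping from decoded intent labels to device commands.
ACTIONS = {
    "imagine_left": lambda: "prosthetic.move_left()",
    "imagine_right": lambda: "prosthetic.move_right()",
    "rest": lambda: "prosthetic.hold()",
}

def act(decoded_label):
    """Dispatch one decoded intent class to a concrete device command.

    Falls back to the safe 'rest' action for unrecognized labels.
    """
    return ACTIONS.get(decoded_label, ACTIONS["rest"])()

print(act("imagine_left"))  # the prosthetic-hand command for that intent
```

The important design choice is the safe fallback: a BCI decoder will sometimes be wrong or uncertain, so the default action should do nothing rather than move a limb or a robot.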
The computer did a lot for us here. In this case, we can agree the point goes to the computer.
Score: Computer 3 – 3 Brain
The Competition Continues
This competition does not end here. It may seem unfair, since the human brain has been evolving for millions of years, while computer hardware and software are new but under continuous development and improvement!
But let's not forget: scientists are still trying to understand how the billions of neurons (and trillions of synapses) are wired inside our brains, and the more we learn, the more human technologies will improve as well.
The Final Questions
And the question here is:
- Will computers and AI one day take over, since some of today's technologies can already interface with our brains?
- Will that end the human era?
- What would it be like living in such a world?
Many questions, but I guess we will find out soon anyway!
Conclusion
In this three-part series, we've explored:
- Part 1: How big data and AI are transforming industries
- Part 2: The fascinating architecture of the human brain vs traditional computing
- Part 3: The future with neuromorphic computing and brain-computer interfaces
The final score? 3-3. A tie.
Perhaps the real winner is the symbiosis between human intelligence and artificial intelligence - working together to push the boundaries of what's possible.