Why Our Brain Power Might Be 100 Times Greater Than Believed

Last Updated: Tuesday, February 6, 2024

Image credit: BrainMDHealth

Have you ever used an old music amplifier? If so, you may have already experienced what a firing neuron sounds like: the swift burst of static, the prompt string of pops and the coarse, abrasive buzzzz that assaults your ears.

Neuroscientists have long tried to listen to the electrical chattering of neurons in rats in order to decode the neural code.

In cracking the code, we may be better able to imitate the way neurons communicate and consequently create powerful computers that work just like the human brain.

But we haven’t cracked the code yet, and we may be a long way from cracking it.

What we do know is that the brain is made up of about 77 percent water, weighs around 3 pounds, and contains roughly 100 billion neurons, each one chemically and electrically wired to thousands of others, forming the world’s most complex network.

It has about 1 quadrillion interconnections, called synapses, wiring the cells together: more than there are stars and planets in the Milky Way.

But how does it all work? How does the brain keep our hearts beating, let us breathe without thinking, dream, learn, remember, feel, fall in love, and smell, along with countless other daily and life functions?

Those are some of the grand questions that scientists and philosophers have grappled with since the beginning of time.

René Descartes even proposed that mind and matter were two different things and that human activity was the consequence of dualism in which the mind controlled the body.

Since then, the answers have been few and vague. Until a ground-breaking study by researchers at UCLA.

The researchers found that our brains may be 100 times more powerful than previously thought.

This post will present the main findings of the study and its implications for developments in medicine and artificial intelligence.

Let’s first take a look at artificial intelligence, since all we’ve been hearing about is the threat of AI and its impact on our life as we know it.

Artificial Intelligence and The Human Brain

The most powerful computer today is the human brain. Let’s compare the human brain to a computer for a minute.

People and computers appear to have complementary skills: computers are excellent at tasks that humans are terrible at, and humans are great at certain tasks that computers cannot perform yet.

Computers are exceptionally fast: they are much faster at math and repetitive tasks that humans simply have no time for or quickly get bored by. They can multiply 123,777 by 12,889,385 really fast.
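
To see just how fast, here is a minimal Python sketch; the numbers are the ones above, and the timing code is just for illustration:

```python
# Timing one big multiplication; any modern machine finishes in microseconds.
import time

start = time.perf_counter()
product = 123_777 * 12_889_385
elapsed = time.perf_counter() - start

print(f"{product:,} computed in {elapsed:.9f} seconds")
```

Run it and the elapsed time barely registers; doing the same by hand would take most of us minutes.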

Humans, on the other hand, are great at pattern recognition, creative thinking and language.

Computers are getting better at those things, especially pattern recognition, but they still can’t do it as well as children.

Take facial recognition, for example. Humans can recognize faces in various contexts; even faces that are disguised, aged or obscured. Computers cannot match our skills at such tasks. Yet.

That is the goal of artificial intelligence (AI): to create computer systems that can learn and process images the way we do. It is an approach to computing that tries to mimic the power of the human brain – to create machines that are human-like.

Scientists are interested in AI because it can be used for things like surveillance and facial recognition. Having computer systems that can solve problems autonomously and navigate new terrain can be beneficial in many ways, and in many industries.

However, mimicking the human brain is hard. Almost impossible at this point.

Because in order to mimic the human brain, we first have to understand how the brain works. Scientists would then have to re-engineer computers from the hardware to the software and everything in between, because our brains function, and are powered, in a fundamentally different way.

Think about it this way: supercomputers run on megawatts of power, while our brains run on water and salads (or burgers, for some).

Millions of years of evolution mean that our brains can work so efficiently on limited resources that we can outwork a supercomputer at processing complex information without depleting our energy banks.

Computers have a powerful core but execute their tasks one after another, in a long sequence. Our brains, on the other hand, have neurons connected in parallel. That gives us unique advantages in learning and recognition.
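
Here is a loose analogy in Python, using NumPy, of what that difference looks like; the "neurons" below are just numbers, not a model of real brain cells:

```python
# A loose analogy (not biology): sequential vs. parallel-style processing.
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.random(1_000)            # signals arriving at 1,000 toy "neurons"
weights = rng.random((1_000, 1_000))  # each neuron listens to every other one

# Sequential, CPU-style: handle one neuron at a time, in order.
outputs_seq = np.empty(1_000)
for i in range(1_000):
    outputs_seq[i] = weights[i] @ inputs

# Parallel-style: one vectorized step updates every neuron together,
# closer in spirit to how all neurons operate at once.
outputs_par = weights @ inputs

assert np.allclose(outputs_seq, outputs_par)  # same answer, different style
```

The results are identical; what differs is whether the work happens one piece at a time or all at once.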

That also means that we use only what we need, when we need it, and do not waste energy running the kinds of background processes that slow computers down.

We cannot emulate the same process in computers unless we understand how the brain really works, its structure and the processes it uses to send signals.

In an interview with AI expert Pascal Kaufmann, Ben Dickson of TechTalks philosophizes on how close we really are to creating AI and accurately replicating the human brain.

Kaufmann believes we are nowhere close, and won’t be until we understand the inner workings of the human brain much more deeply.

And much of it has to do with understanding the neural network.

Kaufmann says, “while in a classical artificial neural network, brain cell A is connected with brain cell B through one thick or thinner connection, often several hundreds of connections at differing lengths and strengths exist between the two biological brain cells.”

Those kinds of connections work really well in the biological world but have no direct equivalent in science or engineering. Add to that the fact that a human brain cell fires at a rate of around 20 hertz, while a CPU runs at several gigahertz.
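
To make Kaufmann’s contrast concrete, here is a rough, hypothetical sketch in Python; every number in it is invented, and real synapses are far richer than a strength and a delay:

```python
import random

random.seed(1)

# Classical artificial neural network: cell A reaches cell B through ONE weight.
signal_a = 0.8
weight_ab = 0.5
input_to_b = signal_a * weight_ab

# A biological pair might instead be joined by several hundred synapses,
# each with its own strength and signal delay (all numbers invented here).
synapses = [
    {"strength": random.uniform(0.0, 1.0),
     "delay_ms": random.uniform(0.5, 20.0)}
    for _ in range(300)
]
input_to_b_bio = sum(s["strength"] * signal_a for s in synapses)

print(f"single-weight input: {input_to_b:.2f}")
print(f"300-synapse input:   {input_to_b_bio:.2f}")
```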

Kaufmann goes on to say that “the brain, however, outnumbers our fast CPUs by the vast number of brain cells and synapses (connections between brain cells).”

In addition, our brains may be 100 times more powerful than previously thought, according to UCLA researchers. The finding presents both challenges and opportunities for AI engineers.

Let’s take a look at the study.

How Powerful Are Our Brains?

A new study, conducted by researchers at UCLA, could change our understanding of how our brains work, lead to advances in treating neurological disorders, and help us develop computers that think more like humans.

The researchers looked at the structure and function of dendrites, which are constituents of neurons (nerve cells in the brain). Specifically, neurons are large, tree-like structures that have a body, called the soma, and several branches called dendrites.

Dendrites and soma
Image credit: ScienceAlert

Somas create brief electrical pulses, referred to as “spikes”, in order to communicate and connect with one another.

Scientists had generally believed that soma spikes activate dendrites, which then passively send currents on to other neurons’ somas, but that had never been directly tested. That process is thought to be the basis for how memories are created and kept.

Scientists also believed that passively relaying currents was dendrites’ main function.

However, the UCLA researchers found that dendrites were not just inert channels. They discovered that dendrites are electrically active in animals that are moving freely and that they yield 10 times more spikes than somas do.

That finding challenges the belief that spikes in the soma are the main way learning, perception and memory formation occur.

Scientists had also previously thought that dendrites passively sent the currents they received from the junctions between two neurons (the synapses) to the soma, which then generates an electrical impulse.

Those bursts, known as somatic spikes, were thought to be at the center of neural computation and learning. The new study, however, showed that dendrites create their own spikes ten times more often than somas.

The researchers also discovered that dendrites create large oscillations in voltage as well.

Somas generate all-or-nothing spikes, like digital computers, while dendrites produce large, slowly fluctuating voltages that are even bigger than spikes. That suggests dendrites also perform analog computation.
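
A toy sketch in Python can illustrate the digital/analog distinction. This is a caricature in the spirit of a leaky integrate-and-fire model, not the researchers’ actual model, and all the constants are invented:

```python
import random

random.seed(42)

threshold = 1.0   # crossing this fires an all-or-nothing "digital" spike
voltage = 0.0     # membrane voltage, which drifts continuously ("analog")
leak = 0.9        # the voltage decays between inputs

for t in range(30):
    voltage = voltage * leak + random.uniform(0.0, 0.3)  # analog fluctuation
    if voltage >= threshold:
        print(f"t={t:2d}  SPIKE (all-or-nothing)")
        voltage = 0.0                                    # reset after firing
    else:
        print(f"t={t:2d}  analog voltage = {voltage:.3f}")
```

The printout shows both behaviors side by side: a continuously varying voltage most of the time, punctuated by discrete spike events.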

“We found that dendrites are hybrids that do both analog and digital computations, which are therefore fundamentally different from purely digital computers, but somewhat similar to quantum computers that are analog,” said Mayank Mehta, lead UCLA researcher and professor of physics and astronomy.

Neuron
Image credit: UCLA

Mehta continues, “a fundamental belief in neuroscience has been that neurons are digital devices. They either generate a spike or not. These results show that the dendrites do not behave purely like a digital device.

“Dendrites do generate digital, all-or-none spikes, but they also show large analog fluctuations that are not all or none. This is a major departure from what neuroscientists have believed for about 60 years.”

Since dendrites are nearly 100 times larger in volume than the neuronal centers, Mehta believes that the huge number of dendritic spikes could mean that the brain has more than 100 times the computational capacity previously attributed to it.

Jason Moore, the study’s first author, said, “many prior models assume that learning occurs when the cell bodies of two neurons are active at the same time.”

“Our findings indicate that learning may take place when the input neuron is active at the same time that a dendrite is active — and it could be that different parts of dendrites will be active at different times, which would suggest a lot more flexibility in how learning can occur within a single neuron.”
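
Here is a loose, hypothetical sketch of what branch-specific learning might look like in code; the branch counts, inputs, and the simple Hebbian-style rule are all invented for illustration:

```python
import random

random.seed(7)

N_BRANCHES = 5    # one toy neuron with five dendritic branches
N_INPUTS = 4      # four inputs arriving at each branch
weights = [[0.1] * N_INPUTS for _ in range(N_BRANCHES)]

def hebbian_step(branch, inputs, rate=0.05):
    """Strengthen one branch's weights when its inputs arrive while that
    branch (not the whole cell body) happens to be active."""
    for i, x in enumerate(inputs):
        weights[branch][i] += rate * x

# Different branches are active at different times, so a single neuron
# can store several independent associations.
for step in range(100):
    active_branch = random.randrange(N_BRANCHES)
    inputs = [random.random() for _ in range(N_INPUTS)]
    hebbian_step(active_branch, inputs)

for b, w in enumerate(weights):
    print(f"branch {b}: {[round(x, 2) for x in w]}")
```

Because each branch keeps its own weights, one cell ends up holding several separate patterns, which is the extra flexibility Moore describes.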

Their research has provided a framework for several medical and scientific questions, like diagnosing and treating diseases and building computers.

However, Mehta said that the framework is grounded in the understanding that the cell body makes the decisions and that the process is digital.

He said, “what we found indicates that such decisions are made in the dendrites far more often than in the cell body, and that such computations are not just digital, but also analog.

“Due to technological difficulties, research in brain function has largely focused on the cell body. But we have discovered the secret lives of neurons, especially in the extensive neuronal branches. Our results substantially change our understanding of how neurons compute.”

Brain
Image credit: NBC News

The study answers a lot of questions in the field of neuroscience but still leaves a few questions unanswered, especially for AI engineers.

Let’s refer back to the Kaufmann interview.

Science has also yet to prove how and where memories are stored. Kaufmann asks, “is it in the firing patterns of brain cells, are there certain memory proteins or do we even need to dive deep into the sub-atomic space to take into account sub-quantum effects?”

Kaufmann explains that “while a computer is fairly well understood, the brain harbors a number of secrets, a fact that turns neuroscience and AI into some of the most exciting research fields of our time.

“We do not need to understand the role and purpose of every cell in the brain, but to understand the fundamental principle of how our minds work and what that essence of intelligence is.

“I like to compare this to Newton and the apple—the first step to understanding the very complex sciences around the cosmos started when we first began to understand the principles of gravity.

“Once we understand the patterns and principles of the brain we can use that understanding and apply it to develop human-like AI.”

What we do know is that the study opens the way to new discoveries within the medical field. It could help neuroscientists to treat specific neurological disorders.

It is important to take a look at a more recent study that also unlocks the power of the human brain.

Our Brains Have Infinite Potential

Our brains generate new nerve cells well into old age. No, we didn’t make that up – it’s a finding of a recent study.

The study found that healthy people in their 70s have just as many nerve cells, or neurons, in the memory-related part of their brains as adolescents do.

It suggests that our hippocampus keeps generating new neurons throughout our lives.

The findings contradict an earlier study that suggested that neurogenesis in the hippocampus ceases in childhood. The findings also reinforce other bodies of research that show that the adult human brain can generate new neurons.

However, those studies indicated that the neuron-generating process tapers off over time. This new study shows that the process does not stop at all.

Researchers looked at the hippocampi of the autopsied brains of 17 men and 11 women between the ages of 14 and 79.

While past studies relied on donations from patients with no detailed medical histories, in this study the donors had no history of psychiatric or chronic illness and did not test positive for alcohol or drugs. They were healthy in every sense of the word.

Researchers were also able to examine whole hippocampi, as opposed to just a few slices. That enabled them to make more accurate counts of the neurons.

To find signs of neurogenesis, the researchers looked for specific proteins that are produced by neurons at certain stages of development. GFAP and SOX2 proteins are made by the stem cells that eventually develop into neurons, while the Ki-67 protein is made in greater amounts by newborn neurons.

The researchers found those newborn neurons in all of the brains.

The number of neural stem cells was a little lower in people in their 70s than in people in their 20s, but the older brains still had plenty of those cells. The number of young neurons in intermediate to advanced stages of development was the same across all participants’ brains.

However, the healthy older brains did show signs of degeneration. The researchers found less evidence of new blood vessel formation and fewer of the protein markers that signify neuroplasticity (the brain’s ability to make new connections between neurons).

It is too early to predict what those findings mean for brain function since the study was done on autopsied brains. So why are we mentioning this study?

Because it is also ground breaking in highlighting the power of the human brain and could also (potentially) lead to breakthroughs in neuroscience and AI.

Understanding how human brains change over time is very important to researchers studying mental conditions that affect older brains, like depression, memory loss and stress.

Wrapping Up

The discovery that dendrites are nearly 100 times larger in volume than somas, and far more electrically active, signifies that our brains could have 100 times more capacity than we believed. That also means we have a greater capacity to compute information.

That could possibly lead to more discoveries in the medical field; specifically, on how to treat neurological conditions.

The two studies we covered also show that we have much more to learn about the human brain and its complete capacity.

Truly, we cannot yet know everything the human brain is capable of, but these studies take us some steps closer to finding out.

The possibilities seem to be endless.

Ex Machina: artificial intelligence deceiving humans
Image credit: TechTalks
