As a follow-up to my “Google Brain” newsletter, I continue to pursue what defines ‘intelligence.’
“Yes, the smart brain is an efficient brain, energy-wise. Just Google it. Don’t remember it. If your mind knows it can 'look it up,' then it will not invest effort to encode a memory, only the code for searching it. Even having a smartphone in your line of sight will hinder memory formation and retrieval.”
Our brains consume quite a bit of energy when active. Some of the early research on localizing brain functions used [14C]2-deoxy-D-glucose imaging to ‘light up’ the areas that were currently on task. More recent techniques, such as fMRI, track blood flow, which correlates with oxygen consumption.
Biological optimization is a natural process, making our bodies and behavior as efficient as possible. Compare the energy consumption of the breaststroke performed by an Olympic competitive swimmer with that of a newbie learning in the Minnow class at a local YMCA: there is likely an order-of-magnitude difference. The Minnow flails and paddles his arms, inefficiently wasting ATP. But the Olympian’s smooth arm strokes are tightly controlled, allowing him to swim almost indefinitely without tiring. He uses ‘minimal energy’ to execute the task.
And so does the intelligent brain.
The maze comprises a discrete state space in which white and black cells indicate pathways and walls, respectively. Starting from the left, the agent needs to reach the right edge of the maze within a certain number of steps (time). The agent solves the maze using adaptive learning that follows the free-energy principle, which allows it to learn the correct route through trial and error in a statistically optimal manner. Credit: RIKEN
The free-energy principle is what drives this efficiency. It builds on a concept called Bayesian inference. In this scheme, an agent's beliefs are continually updated by new incoming sensory data, as well as by its own past outputs, or decisions. The neural networks self-organize by changing the strength of their connections, associating past decisions with future outcomes.
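To make this concrete, here is a minimal sketch in Python (my own illustration, not taken from any of the cited papers) of Bayesian belief updating: an agent holds a probability over two hidden states, 'wall' versus 'open path' in the spirit of the maze above, and revises it each time a noisy observation arrives. The prior, the likelihood values, and the observation stream are all invented for the example.

```python
import numpy as np

# Hypothetical likelihoods P(observation | hidden state).
# Rows: hidden state (0 = wall, 1 = open path); columns: observation (0 = dark, 1 = bright).
likelihood = np.array([[0.8, 0.2],
                       [0.3, 0.7]])

belief = np.array([0.5, 0.5])   # flat prior over the two hidden states
observations = [1, 1, 0, 1]     # made-up stream of noisy sensory data

for obs in observations:
    belief = likelihood[:, obs] * belief   # Bayes' rule: posterior is proportional to likelihood x prior
    belief /= belief.sum()                 # renormalize so the belief stays a probability
    print(f"after observing {obs}: P(wall)={belief[0]:.2f}, P(path)={belief[1]:.2f}")
```

Under the free-energy principle, this kind of update amounts to reducing surprise: with each observation the belief shifts toward whichever state makes the incoming data least surprising.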
Recurrent neural network (RNN) models have features that make them an ideal substrate for modeling the brain. Neural communication is energetically costly, so the need to conserve energy may constrain the behavior of any evolving neural network in an organism. In the RNN circuit, some artificial neurons act as prediction units representing a model of the expected inputs. Other neurons act as error units that are active when the prediction units have not yet learned to correctly anticipate the next number. The error units become subdued when the prediction units finally get it right. The network is compelled to minimize energy usage, and that pressure shapes the architecture that emerges.
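A toy version of that division of labor might look like the sketch below (again my own illustration, not the Isomura et al. network): a prediction unit learns to anticipate the next value of a repeating input, an error unit carries the mismatch, and the learning rule simply nudges the prediction to shrink that mismatch, so the error unit quiets down once the prediction unit gets it right. The input sequence and learning rate are arbitrary.

```python
# Toy predictive-coding loop: a prediction unit learns a repeating input
# until the error unit's activity (the prediction error) fades away.
inputs = [0.2, 0.9, 0.2, 0.9] * 25      # made-up repeating "sensory" sequence
prediction = {0.2: 0.5, 0.9: 0.5}       # one prediction per preceding input, initially agnostic
learning_rate = 0.2

prev = inputs[0]
for t, x in enumerate(inputs[1:], start=1):
    error = x - prediction[prev]               # error unit: mismatch between input and prediction
    prediction[prev] += learning_rate * error  # prediction unit: adjust to reduce future error
    if t % 20 == 0:
        print(f"step {t:3d}  |prediction error| = {abs(error):.3f}")
    prev = x
```

The 'energy' being conserved here is just the size of the prediction error; as it approaches zero the error units fall silent, a toy analogue of the network settling into an efficient configuration.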
Balance. Of stimulation with suppression. Just the right amount. Perfect control. No flailing and flapping around.
In studies of real live human brains:
Intelligence is a measure of general cognitive functioning capturing a wide variety of different cognitive functions. It has been hypothesized that the brain works to minimize the resources allocated toward higher cognitive functioning. Thus, for the intelligent brain, it may be that not simply more is better, but rather, more efficient is better. Energy metabolism supports both inhibitory and excitatory neurotransmission processes. Indeed, in glutamatergic and GABAergic neurons, the primary energetic costs are associated with neurotransmission. We tested the hypothesis that minimizing resources through the excitation-inhibition balance encompassing gamma-aminobutyric acid (GABA) and glutamate may be beneficial to general cognitive functioning using 7 T 1H-MRS in 23 healthy individuals (male/female = 16/7, 27.7 ± 5.3 years). We find that a higher working memory index is significantly correlated with a lower GABA to glutamate ratio in the frontal cortex and with a lower glutamate level in the occipital cortex. Thus, it seems that working memory performance is associated with the excitation–inhibition balance in the brain. - Anouk Marsman et al
A thermodynamic perspective sees the brain as operating in a state of high entropy. Studies of the thermodynamics of emotions may also illuminate the neurological origins of intellectual evolution. The energy profiles of positive and negative emotional states may hint at future psychological and health consequences.
The living state is low entropy, highly complex organization, yet it is part of the energy cycle of the environment. Due to the recurring presence of the resting state, stimulus and its response form a thermodynamic cycle of perception that can be modeled by the Carnot engine. The endothermic reversed Carnot engine relies on energy from the environment to increase entropy (i.e., the synaptic complexity of the resting state). High entropy relies on mental energy, which represents intrinsic motivation and focuses on the future. It increases freedom of action. The Carnot engine can model exothermic, negative emotional states, which direct the focus on the past. The organism dumps entropy and energy to its environment, in the form of aggravation, anxiety, criticism, and physical violence. The loss of mental energy curtails freedom of action, forming apathy, depression, mental diseases, and immune problems. Our improving intuition about the brain's intelligent computations will allow the development of new treatments for mental disease and find novel applications in robotics and artificial intelligence. - Eva Déli, Zoltán Kisvárday 2020
The intelligent brain must also be nimble, able to pivot quickly to adapt to change. Fixed states expend energy breaking down and then building back up; a flexible mind uses less.
The brain's network configuration varies based on current task demands. For example, functional brain connections are organized in one way when one is resting quietly but in another way if one is asked to make a decision. We found that the efficiency of these updates in brain network organization is positively related to general intelligence, the ability to perform a wide variety of cognitively challenging tasks well. Specifically, we found that brain network configuration at rest was already closer to a wide variety of task configurations in intelligent individuals. This suggests that the ability to modify network connectivity efficiently when task demands change is a hallmark of high intelligence. - Douglas H Schultz, Michael W Cole. 2016
Intelligence = Smooth Operator.
Diamond life, Lover boy
He move in space with minimum waste and maximum joy
Melts all your memories and change into gold
His eyes are like angels but his heart is cold.
REFERENCES
Takuya Isomura et al. “Canonical neural networks perform active inference” 2022, Communications Biology. DOI: 10.1038/s42003-021-02994-2
The Free-Energy Principle Explains the Brain – Optimizing Neural Networks for Efficiency https://scitechdaily.com/the-free-energy-principle-explains-the-brain-optimizing-neural-networks-for-efficiency/
Your Brain Is an Energy-Efficient 'Prediction Machine' https://www.wired.com/story/your-brain-is-an-energy-efficient-prediction-machine/
Anouk Marsman et al. Intelligence and Brain Efficiency: Investigating the Association between Working Memory Performance, Glutamate, and GABA. Front Psychiatry. 2017 Sep 15;8:154. https://pubmed.ncbi.nlm.nih.gov/28966597/
Déli E, Kisvárday Z. The thermodynamic brain and the evolution of intellect: the role of mental energy. Cogn Neurodyn. 2020 Dec;14(6):743-756. doi: 10.1007/s11571-020-09637-y. Epub 2020 Sep 25. PMID: 33101528
Eva Déli et al. The thermodynamics of cognition: A mathematical treatment. Comput Struct Biotechnol J. 2021;19:784–793. doi: 10.1016/j.csbj.2021.01.008. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7843413/
Douglas H Schultz, Michael W Cole. Higher Intelligence Is Associated with Less Task-Related Brain Network Reconfiguration. J Neurosci. 2016 Aug 17;36(33):8551-61. doi: 10.1523/JNEUROSCI.0358-16.2016.