How the human brain may show AI how to use less energy

By understanding how the human brain achieves its energy efficiency, researchers hope to develop AI that can do more computing while consuming far less energy.
Jeff Rowe

AI is poised to make significant contributions across healthcare and other sectors, but the technology chews up a lot of energy.

That’s one way of summing up the focus of a recent study by Penn State researchers that explored how brain cells known as astrocytes function, how their behavior can be emulated in the physics of hardware devices and, by extension, how that emulation could help AI and machine learning technologies consume much less energy than they currently do.

Named for their star shape, astrocytes are support cells for neurons in the brain and play a crucial role in brain functions such as memory, learning, self-repair and synchronization.  

"This project stemmed from recent observations in computational neuroscience, as there has been a lot of effort and understanding of how the brain works and people are trying to revise the model of simplistic neuron-synapse connections,” said Abhronil Sengupta, assistant professor of electrical engineering and computer science. “It turns out there is a third component in the brain, the astrocytes, which constitutes a significant section of the cells in the brain, but its role in machine learning and neuroscience has kind of been overlooked.” 

One of the challenges with AI, Sengupta said, is the amount of power it takes to run a system. That demand stems largely from the interplay of switches, semiconductors and other mechanical and electrical processes that happen in any computer processing, and it grows even greater when the processes are as complex as those AI and machine learning demand.  

By contrast, the human brain has evolved to be far more energy efficient than a computer, so one step forward is neuromorphic computing: computing that mimics brain functions.
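To give a rough sense of what "mimicking brain functions" means in practice, here is a minimal sketch (not the Penn State researchers' model) of a leaky integrate-and-fire neuron, the basic unit of most neuromorphic systems. The neuron only "computes" — that is, emits a spike — when its accumulated input crosses a threshold, so activity, and hence energy use, is sparse and event-driven rather than continuous. All names and parameter values below are illustrative assumptions.

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Return the spike train produced by a stream of input currents."""
    v = 0.0          # membrane potential
    spikes = []
    for current in inputs:
        v = leak * v + current   # integrate input while leaking charge
        if v >= threshold:       # fire only when the threshold is crossed
            spikes.append(1)
            v = 0.0              # reset the potential after a spike
        else:
            spikes.append(0)
    return spikes

# A weak constant input produces only occasional spikes:
print(lif_neuron([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Because the neuron stays silent most of the time, hardware built around units like this can idle between events instead of clocking every cycle, which is one source of the energy savings the article describes.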

In addition, said Sengupta, astrocytes play a crucial role in self-repairing the brain. “So perhaps we can draw insights from computational neuroscience based on how astrocyte glial cells are causing self-repair in the brain and use those concepts to possibly cause self-repair of neuromorphic hardware to repair these faults.” 

In Sengupta’s view, creating such energy-efficient, fault-resilient “astromorphic computing” could open the door for more sophisticated AI and machine learning work to be done on power-constrained devices such as smartphones.  

“AI and machine learning is revolutionizing the world around us every day, you see it from your smartphones recognizing pictures of your friends and family, to machine learning’s huge impact on medical diagnosis for different kinds of diseases,” Sengupta said. “At the same time, studying astrocytes for the type of self-repair and synchronization functionalities they can enable in neuromorphic computing is really in its infancy. There's a lot of potential opportunities with these kinds of components.” 

Photo by imaginima/Getty Images