Anaesthetized brains can still process podcasts

The brains of people under general anaesthesia continue to process words and sounds, a study finds. Credit: BSIP/UIG Via Getty

People given general anaesthesia fall into a coma-like state in which their memory and perception of pain are switched off. But new data reveal that the hippocampus — a deep brain structure crucial for memory — remains remarkably active, parsing the grammar and meaning of spoken words and even anticipating what will be said next.
The research, published today in Nature¹, challenges the assumption that complex cognition, such as grasping semantics and forecasting future events, can occur only if a person is fully conscious. By observing people’s individual neurons firing in real time while they are under anaesthesia, researchers discovered that the brain receives stimuli and actively processes what those signals mean.
“The brain has developed such amazing, sophisticated mechanisms for doing all these complex tasks all day long, that it can do some of these things even without us being aware,” says Sameer Sheth, a neurosurgeon at Baylor College of Medicine in Houston, Texas.
Probing the unconscious brain
Previous studies have shown² that the parts of the brain that first detect sensory input can still register simple sounds while a person is unconscious. But it has remained unclear whether other, deeper regions of the brain are capable of complex cognition in this state. To address this, Sheth and his colleagues recorded the brain activity of seven people anaesthetized with the drug propofol while they were undergoing surgery to treat epilepsy.
The researchers played a series of repetitive beeps interspersed with tones of a different pitch to three of the participants. Over the course of ten minutes, the brain recordings showed that the anaesthetized hippocampus became better at differentiating the tones from the beeps, suggesting a form of unconscious learning.
The team also played ten-minute podcast segments to four of the study participants and observed that specific neurons responded to certain parts of speech, distinguishing nouns from other words, for instance. The neurons also anticipated upcoming words on the basis of the context of the sentence. “They were literally predicting what the next word is going to be,” Sheth says.