Baylor College of Medicine researchers have found that the human brain is capable of sophisticated language processing while in an unconscious state from general anesthesia. The findings, published in the latest edition of Nature, challenge what we know about the role of consciousness and cognition, and could open new ways of understanding memory, language and brain-computer interfaces.
“Our findings show that the brain is far more active and capable during unconsciousness than previously thought,” said Dr. Sameer Sheth, professor and Cullen Foundation Endowed chair of neurosurgery and a McNair Scholar at Baylor. “Even when patients are fully anesthetized, their brains continue to analyze the world around them.”
Sheth, who is also a neurosurgeon at Baylor St. Luke’s Medical Center, and his collaborators first recorded neural activity from hundreds of individual neurons in the hippocampus, a part of the brain associated with memory, while patients were under general anesthesia during epilepsy surgery. Patients undergoing this type of surgery were recruited because the procedure gave researchers access to this particular part of the brain.
Using Neuropixels probes, a technology that had not previously been used in this part of the brain, the team collected data on how the brain processed sound and language without conscious awareness.
The study began by exposing patients to repetitive tones interrupted by an occasional different sound. Researchers found that hippocampal neurons could distinguish these unusual tones and that this ability improved over time, suggesting a form of learning or neural plasticity during anesthesia.
The researchers then conducted a more complex experiment, playing short stories to patients while recording neural responses. Surprisingly, the hippocampus demonstrated real-time processing of language: neural firing patterns distinguished parts of speech, such as nouns, verbs and adjectives.
Even more surprising, researchers found that neural signals could predict upcoming words in a sentence.
“The brain appears to anticipate what comes next in a story, even without conscious awareness,” said Sheth, who is also Director of The Gordon and Mary Cain Pediatric Neurology Research Foundation Laboratories within the Duncan Neurological Research Institute at Texas Children’s Hospital.
“This kind of predictive coding is something we associate with being awake and attentive, yet it’s happening here in an unconscious state,” said Dr. Benjamin Hayden, professor of neurosurgery and a McNair Scholar at Baylor.
These discoveries suggest that cognitive functions such as language comprehension and prediction do not require consciousness. Instead, consciousness may depend on broader coordination across brain regions rather than activity within a single structure like the hippocampus.