Footprints of Consciousness
Researchers collaborate to develop a model for understanding cognition
By Rob Emproto
During his residency as a neurosurgeon, Chuck Mikell, MD, found himself wondering, after patients suffered head or brain trauma, who would wake up and who wouldn’t, a question he and his fellow surgeons are asked all the time.
That introspection led him to contemplate the very nature of consciousness.
“It’s a hard question to answer,” said Mikell, clinical associate professor of neurosurgery at Renaissance School of Medicine at Stony Brook. “It’s really a philosophical question. But in the lab, we can’t ‘walk in the weeds.’ We have to focus on the clinical aspect of consciousness.”
In this context, the question takes on a more tangible meaning.
“Can the subject follow a command?” said Mikell. “Can they open their eyes when you ask? Can they wiggle their toes? Can they exhibit goal-directed behavior? Many people would describe it as an awareness of self and surroundings. Since we can’t ever have access to someone else’s awareness, we focus on the proxies. For example, can the patient achieve a behavioral goal? This is a pretty standard clinical technique.”
As a computational neuroscientist and physicist, I see the brain as a massive network of cells. It has two modes of activity, conscious and unconscious. Consciousness is the state of operation in which, given the external and internal inputs, cells can come together and form functional entities that encode cognitive functions. The unconscious state is passive: the cells are incapable of forming the effective connections and groups needed to encode those functions.
— Sima Mofakham
Sima Mofakham, Mikell’s partner in leading their lab on consciousness research and an assistant professor of neurosurgery, added, “We’re seeking answers to questions about the physical footprints of consciousness, about its origin and nature. What parts of the brain give rise to it? What are the minimal neuronal mechanisms that are sufficient to generate it? We are interested in furthering the fundamental understanding of cognition.”
A Bold Undertaking
While the concept of consciousness is at its core an abstract one, there’s nothing abstract about a unique multidisciplinary collaboration currently researching the topic at Stony Brook. The project is an ambitious one: searching for the visible indicators of cognitive awareness.
This understanding extends beyond the limits of any one discipline — even one as complex as neurosurgery. In a recently published paper titled “Microscale Multicircuit Brain Stimulation: Achieving Real-Time Brain State Control For Novel Applications,” Mikell and Mofakham, along with Petar Djuric, department chair and distinguished professor in the Department of Electrical and Computer Engineering in the College of Engineering and Applied Sciences, and the University of Wisconsin’s Yuri Saalmann, take on the nature of consciousness, highlighting a multidisciplinary effort that illustrates the scope of academic collaboration.
One can define consciousness as all that is experienced. However, in machine learning the definition of consciousness is completely different. Brain signals of a person who is conscious have completely different characteristics from the brain signals of someone who is not conscious. Machine-learning methods can differentiate the two types of signals.
— Petar Djuric
It all starts with Mikell, who gets the raw data. A literature major as an undergrad, Mikell has long had an interest in the humanity within the science. To that end, he’s always been drawn to understanding the brain and the mind. While in medical school, he was exposed to neuromodulation, a technique of altering nerve activity through targeted delivery of a stimulus. Shortly thereafter, in 2005, the first reports of doctors using brain stimulation for depression came out, documenting how electrodes implanted in the brain could deliver electrical current to treat patients. Mikell was immediately enamored.
At Stony Brook, he found a kindred spirit in Mofakham, who was pondering her own questions about consciousness.
“I have always been fascinated with and wanted to understand how interactions of individual brain cells shape who we are, what we think, what we do and if we can reverse-engineer these interactions,” she said. “For my PhD, I studied computational neuroscience at the University of Michigan in a lab focused on complex systems. The ultimate complex system that you can work with is the human brain. It’s a huge network of 200 billion cells with hundreds of trillions of interconnections between the cells. Because of this complexity, we still don’t know much about how it works when it’s broken.”
As a result, she said, they model the brain and its neural networks to try to understand how the interplay of dynamics and the structures of these networks shape its function.
“When I was studying for my PhD, I thought, ‘how can we develop models closer to the real brain activity to better understand how the human brain works and modulate its activity in pathological cases?’,” Mofakham said.
Unbeknownst to her, Mikell was walking down a parallel path. After finishing his residency, he got involved with functional neurosurgery, which involves implanting electrodes in the brain to record electrical activity and deliver current that can change and improve brain function. Eventually he became an expert in these techniques.
“Chuck has expertise in recording from deep cortical regions in recovering comatose patients,” said Mofakham. “These human recordings are very rare. I was interested in collaborating with him to see if we could come up with a model that illustrated how interactions of cells result in consciousness and what happens in traumatic brain injury patients when they lose consciousness. I wanted to come up with a model that might show us what we needed to do to bring them back.”
Since then, Mikell and Mofakham have run their joint lab where they study traumatic brain injury, loss of consciousness and more.
“The main theme of the lab is consciousness, but consciousness is the baseline for everything else,” said Mofakham. “When you know how consciousness arises from the activity of individual cells, then you have a lead on how cognition arises.”
Making Sense From Signals
Using Mikell’s brain recordings, Mofakham develops models showing how the interactions of different parts of the brain can bring about consciousness.
“We look at the structure and dynamics of the signals Chuck is recording to explore the state of consciousness,” she said. “But we have to modulate these signals in an intelligent way. Over time it became clear that we needed some engineering expertise and that’s when we connected with Petar Djuric.”
It’s a philosophical question. We have to focus on the clinical aspect of consciousness. Can the subject follow a command? Can they open their eyes when you ask? Can they wiggle their toes? Can they exhibit goal-directed behavior? Many people would describe it as an awareness of self and surroundings. Since we can’t ever have access to someone else’s awareness, we focus on the proxies. For example, can the patient achieve a behavioral goal?
— Chuck Mikell
Djuric, an electrical engineer, was also working on complex systems, but in a very different way. “I work in signal processing and machine learning,” he said. “Trying to predict what will happen next from signals and data is a big part of what we do in electrical and computer engineering.”
Djuric recalled meeting Mofakham and Mikell in 2019 and learning about their interest in studying patients with brain injuries.
“They wanted to better understand and predict outcomes,” he said. “They collect lots of signals from patients. And working with neural signals is something that is quite valuable in our field.”
Djuric explained that there’s a wide range of problems that Mofakham and Mikell are working on that require machine-learning expertise.
“For example, in studying the emergence of consciousness, understanding how brain areas interact to process information is crucial,” he said. “It’s important to discover causations or predict what will happen next or quantify dependencies.”
Three main ingredients play essential roles in learning about the phenomena under study: the measured signals, models of those signals, and methodologies for processing the signals based on the models.
“This process is at the center of machine learning and artificial intelligence,” he said. “It’s applied almost everywhere: in finance, in astronomy, computer science, civil engineering, mechanical engineering, bioengineering. There’s no field where it’s not applied.”
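Djuric’s observation that conscious and unconscious brains produce signals with completely different characteristics can be illustrated with a toy sketch. This is not the team’s actual pipeline; the simulated signals, the spectral-entropy feature, and the decision threshold are all illustrative assumptions. The intuition is that unconscious states are often dominated by slow, high-amplitude oscillations, while conscious states show desynchronized, broadband activity, so even a single spectral feature can separate the two:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250  # sampling rate in Hz (assumed)

def simulate(state, n=fs * 4):
    """Generate a 4-second toy signal for the given brain state."""
    t = np.arange(n) / fs
    if state == "conscious":
        # desynchronized, broadband activity: modeled here as white noise
        return rng.normal(0.0, 1.0, n)
    # unconscious: slow, high-amplitude delta-like oscillation plus small noise
    return 3.0 * np.sin(2 * np.pi * 2.0 * t) + rng.normal(0.0, 0.3, n)

def spectral_entropy(x):
    """Normalized entropy of the power spectrum.

    Near 1 for broadband signals (power spread across frequencies),
    near 0 when power is concentrated in a single oscillation.
    """
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum() / np.log(len(psd))

def classify(x, threshold=0.5):
    """Threshold the spectral-entropy feature (threshold is illustrative)."""
    return "conscious" if spectral_entropy(x) > threshold else "unconscious"
```

A real system would use richer features, many channels and a trained classifier rather than a hand-picked threshold, but the sketch captures the core idea of turning measured signals into a model-based decision.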
Djuric said it’s gratifying to be in a field where he can collaborate and contribute to colleagues in other areas, using mathematical equations to crunch the massive amounts of data generated in hopes of revealing the secrets hidden within.
“They collect signals and measurements, and they need to find the information within those measurements,” he said. “That’s exactly what I do with my students.”
Though physics is his first love, Djuric said he became interested in working with complex systems after joining Stony Brook in 1990.
“There’s nothing more complex than the brain,” he said. “I’m very lucky to be working with Sima and Chuck and to be in a position to contribute to a better understanding of it.”
The conclusion of Djuric’s work with Mofakham and Mikell may mark not the end of one road but the beginning of another.
“Challenging research in the years to come is to continue the development of methods that can successfully reveal causations in complex systems that can readily be applied to brain signals,” he said. “The results of this research may in turn lead to discoveries of better therapeutic approaches for disorders of consciousness and cognitive problems. The acquired knowledge can also be used to advance the science of artificial intelligence.”
Not surprisingly, the findings of the research team’s paper reflect the input of all, espousing the view that the next generation of microscale multicircuit brain stimulation technology will use a closed-loop, machine learning-based approach. These technologies will improve treatments for diseases currently treated by neuromodulation and enable the treatment of diseases that are not currently candidates for it. The collaborators also stressed that early translation into human studies should then be pursued, given the overwhelming need for new treatments for neurological and psychiatric diseases.
Such a conclusion reinforces the fact that in a world that gets more complex every day, meaningful advancement will come from collective efforts across a wide spectrum of specialties.
“This project cannot be done without this collaboration because it needs all the components,” said Mofakham. “Having someone who knows the brain’s different regions and can record the signals, as Chuck does, is essential. You also need someone who knows how these different areas connect to each other and how they interact, and who can develop the working model, which is what I do. And with engineers like Petar, we can bring it all together in a meaningful, intelligent way.”