Even before he was born, it was clear that the boy’s brain was unusual—so much so that his expectant parents flew from rural Alaska to Seattle, where specialists could attend to their son from birth. That is how James Bennett first met the boy, then a days-old infant struggling to breathe. The baby’s head was too big. The structures in his brain looked wrong. Bennett, a pediatric geneticist at Seattle Children’s, was tasked with figuring out why.
Microglia make up 10 percent of the brain’s cells, but they are not neurons and therefore have long been overlooked. The boy’s case makes their importance unmistakable. In the absence of microglia, the boy’s neurons still grew to fill his skull, but they ended up in the wrong places and made the wrong connections. Microglia, scientists have started to realize, guide the development of the brain.
“There wasn’t any part of the brain that wasn’t involved and affected in this child,” Bennett says. A part of the baby’s cerebellum jutted at an odd angle. His ventricles, normally small fluid-filled cavities in the brain, were too large. And a dense bundle of nerves that is supposed to connect the brain’s left and right hemispheres, called the corpus callosum, had entirely failed to develop.
In petri dishes and in animals, scientists had previously observed how microglia guide developing neurons to the right locations, creating the highly organized layers that make up the brain. They also prune connections between neurons. “Things get off track pretty quickly when you start manipulating the functions of microglia,” says Stephen Noctor, a developmental neurobiologist at the University of California at Davis who was not involved in examining the boy. To better understand the CSF1R gene, Bennett teamed up with zebra-fish biologists. In fish, turning off the gene disrupts a cellular pathway that, in humans, is necessary for corpus-callosum neurons to grow.
Kim Green, a neurobiologist at the University of California at Irvine, notes that mutant mice lacking microglia have broadly similar patterns of disorganization in their brains. These mouse models essentially predicted what would happen in a human. Green had just never expected to see a person without microglia. “It’s absolutely remarkable,” he says.
The boy’s brain helped unlock these scientific mysteries. But he was ultimately still a boy, a very sick one with worried young parents. Their son’s condition was so severe, it turns out, because he had inherited two faulty copies of the CSF1R gene—one from each parent. His parents happened to carry the same rare mutation because they are cousins.
In adults, just one copy of a CSF1R mutation can lead to a brain disorder called adult-onset leukoencephalopathy with axonal spheroids and pigmented glia, which causes memory loss and eventually dementia beginning in one’s 40s. When the boy’s DNA-sequencing results came back, Bennett realized that he had to explain to the parents their own CSF1R mutation and their risks of developing the disorder. They were relieved, he says, to understand what was wrong with their child, but perhaps too overwhelmed to fully take in what it meant for their lives. The couple spoke with a genetic counselor before their son’s DNA sequencing, and Bennett says he arranged to have them meet with another genetic counselor back in Alaska, where they returned home.
This story has no miracle cure or happy ending. The boy died in Alaska at 10 months old, of causes likely related to his condition, and Bennett says the family agreed to an autopsy. They have since lost touch. The phone numbers he has for them no longer work. He told me that he recently got hold of the mother’s sister, in an attempt to tell the family about the research made possible by their child. It’s a delicate balance: He feels a duty to inform, but he understands that the parents might not want to be reminded of their dead son.
A pediatric geneticist’s job, Bennett said, is often to diagnose extremely rare conditions, which push up against the limits of the human body. “On any day, you can find a patient you spend the rest of your career thinking about,” he said. The boy is one of them.
A study of over a hundred people’s brains suggests that abuse during childhood is linked to changes in brain structure that may make depression more severe in later life.
Nils Opel at the University of Münster, Germany, and his colleagues scanned the brains of 110 adults hospitalised for major depressive disorder and asked them about the severity of their depression and whether they had experienced neglect or emotional, sexual or physical abuse during childhood.
Statistical analysis revealed that those who experienced childhood abuse were more likely to have a smaller insular cortex – a brain region involved in emotional awareness.
Over the following two years, 75 of the adults experienced another bout of depression. The team found that those who had both a history of childhood abuse and a smaller insular cortex were more likely to have a relapse.
“This is pointing to a mechanism: that childhood trauma leads to brain structure alterations, and these lead to recurrence of depression and worse outcomes,” says Opel.
The findings suggest that people with depression who experienced abuse as children could need specialised treatment, he says.
Brain changes can be reversible, says Opel, and the team is planning to test which types of therapies might work best for this group.
Waking up. Working out. Riding the bus. Music is an ever-present companion for many of us, and its impact is undeniable. You know music makes you move and triggers emotional responses, but how and why? What changes when you play music, rather than simply listen? In the latest episode of Tech Effects, we tried to find out. Our first stop was USC’s Brain & Creativity Institute, where I headed into the fMRI to see how my brain responded to musical cues—and how my body did, too. (If you’re someone who experiences frisson, that spine-tingling, hair-raising reaction to music, you know what I’m talking about.) We also talked to researchers who have studied how learning to play music can help kids become better problem-solvers, and to author Dan Levitin, who helped break down how the entire brain gets involved when you hear music.
From there, we dove into music’s potential as a therapeutic tool—something Gabrielle Giffords can attest to. When the onetime congresswoman was shot in 2011, her brain injuries led to aphasia, a neurological condition that affects speech. Through the use of treatments that include melodic intonation therapy, music helped retrain her brain’s pathways to access language again. “I compare it to being in traffic,” says music therapist Maegan Morrow, who worked with Giffords. “Music is basically like [taking a] feeder road to the new destination.”
But singing or playing something you know is different from composing on the fly. We also wanted to get to the bottom of improvisation and creativity, so we linked up with Xavier Dephrepaulezz—who you might know as two-time Grammy winner Fantastic Negrito. At UCSF, he went into an fMRI machine as well, though he brought a (plastic) keyboard so he could riff along and sing to a backing track. Neuroscientist Charles Limb, who studies musical creativity, helped take us through the results and explain why the prefrontal cortex shuts down during improvisation. “It’s not just something that happens in clubs and jazz bars,” he says. “It’s actually maybe the most fundamental form of what it means to be human to come up with a new idea.”
If you’re interested in digging into the research from the experts in the video, here you go:
We all want to be included, to belong to the tribe. Our brains are constantly scanning our environment and our interactions to determine whether we “fit in” or not. That’s why the “like me” bias is so prevalent—because we feel most comfortable (the most safety and belonging) with people who are similar to us.
Who’s Special–And Thus Included?
I’m not going to talk about diversity here, as I’ve done before. Instead, I want to urge you to look at your organization, and to notice who is being excluded and why. Sometimes it’s easiest to first look at who is included, or who’s in the “in-group” (yes, just like in High School!). Ask yourself:
Who receives the high profile assignments/projects?
Who receives frequent public praise/is held up as an example of positive performance, attitude, etc.?
Who receives promotions?
Who has lunch/is invited to play golf with the key leaders?
Chances are really good that you thought of a smallish group of people. And I’ll bet they all have things in common with the leaders who offer them the above benefits. We’ll call them the “in-group”. That’s the “like me” cognitive bias at work, and beneath it, we’re subconsciously just trying to mitigate risk. Everyone else is the “out-group”.
Your brain has three to four times as much real estate devoted to identifying threats as to identifying opportunities and rewards. Since we are all naturally biased, there’s no need to feel ashamed of it. However, there’s a very profound business case for becoming more aware of exclusion and how it damages our performance, emotional engagement, health, and happiness at work and in life overall.
Your Brain On Exclusion
You’ve been left out of a group before: think back to Junior High or High School, the last round of promotions you weren’t part of, or the special meeting/project you weren’t included in. You know how emotionally painful it feels. Our belonging is threatened when we are ostracized or excluded, and we dive into Critter State (fight, flight, freeze). Now our brain literally cannot function the way it does when it feels safe and is in Smart State.
When we’re excluded, our brain releases an enzyme that attacks the hippocampus, which is responsible for regulating synapses. As a result, our brain does the following:
Reduces the field of view and focuses only on a narrow span of what it must do to survive. Myelin sheathing increases on existing neural pathways, and we are less likely to consider or try new solutions.
Shrinks its working memory, so that it is not distracted by other ideas, bits of information, or stray thoughts. This means we can’t problem solve optimally. Think of students panicked by a pop quiz: the information is there, but they cannot access it.
Is less creative. With less gray matter and modified synapses, we experience fewer ideas, thoughts, and information available to “bump into each other,” so our capacity to create is reduced.
Increases cell density in the amygdala, the area of the brain responsible for fear processing and threat perception, making us more likely to be reactive rather than self-controlled.
Is less likely to connect with others. Fight, flight, freeze, or faint is not a “sharing” type of activity. When the synapses have been modified in this way, we appear grumpy and unsociable.
As leaders, we must promote everyone’s Smart State by not just hiring diverse team members but including them. If your not-like-you team members don’t feel included, they’ll end up in Critter State, where no one wins.
The brain is profoundly impacted when a person feels excluded—and the person, their performance, their emotional engagement, and the organization overall suffer as a result.
Leaders must raise their awareness to identify who’s being excluded and why—then include them.
You make countless decisions every day that range from mundane to incredibly important, but what part of you is actually making those decisions? We all assume that our brains are focused on whatever task we’re tackling, but a new study suggests that your brain is usually working a few steps ahead all on its own, and it makes your decisions long before you consciously think about them.
The study, which was published in Scientific Reports, reveals that what we often think of as free will and our ability to make decisions on the fly isn’t nearly as cut-and-dried as it seems. Your brain, it turns out, might be running the show largely in the background.
The experiment was fairly straightforward, tasking volunteers with choosing between two patterns of different colors and orientations. Their brains were monitored in an fMRI machine while the images flashed before their eyes, and the researchers were able to match brain activity patterns with whatever choice the subject was making.
That part isn’t particularly surprising, since scientists have long known that repeatable brain patterns can correlate with decision making. But what’s interesting about this research is that the team found that the participants’ brain activity could predict their eventual choices before the participants were even asked to make a choice.
“We believe that when we are faced with the choice between two or more options of what to think about, non-conscious traces of the thoughts are there already, a bit like unconscious hallucinations,” Professor Joel Pearson, co-author of the study, said in a statement. “As the decision of what to think about is made, executive areas of the brain choose the thought-trace which is stronger. In other words, if any pre-existing brain activity matches one of your choices, then your brain will be more likely to pick that option as it gets boosted by the pre-existing brain activity.”
Put simply, the path you’re about to choose when you make a decision can sometimes be pre-determined before you even actively consider your options. The researchers found that they could predict the outcome up to 11 seconds before the subject began to weigh their decision.
“This would explain, for example, why thinking over and over about something leads to ever more thoughts about it, as it occurs in a positive feedback loop,” Pearson said.
Because sleep often becomes lighter and more disrupted as we get older, the study reinforces and potentially explains the links among aging, sleep deprivation, and heightened risk for Alzheimer’s disease.
“Sleep is critical to the function of the brain’s waste removal system and this study shows that the deeper the sleep the better,” says Maiken Nedergaard, codirector of the Center for Translational Neuromedicine at the University of Rochester Medical Center (URMC) and lead author of the study.
“These findings also add to the increasingly clear evidence that quality of sleep or sleep deprivation can predict the onset of Alzheimer’s and dementia.”
The study, which appears in the journal Science Advances, indicates that the slow and steady brain and cardiopulmonary activity associated with deep non-REM sleep is optimal for the function of the glymphatic system, the brain’s unique process of removing waste. The findings may also explain why some forms of anesthesia can lead to cognitive impairment in older adults.
WASHING AWAY WASTE
Nedergaard and her colleagues first described the previously unknown glymphatic system in 2012. Prior to that point, scientists didn’t fully understand how the brain, which maintains its own closed ecosystem, removed waste. The study revealed a system of plumbing that piggybacks on blood vessels and pumps cerebrospinal fluid (CSF) through brain tissue to wash away waste. A subsequent study showed that this system primarily works while we sleep.
Because the accumulation of toxic proteins such as beta amyloid and tau in the brain is associated with Alzheimer’s disease, researchers have speculated that impairment of the glymphatic system due to disrupted sleep could be a driver of the disease. This squares with clinical observations that show an association between sleep deprivation and heightened risk for Alzheimer’s.
In the current study, researchers conducted experiments with mice anesthetized with six different anesthetic regimens. While the animals were under anesthesia, the researchers tracked brain electrical activity, cardiovascular activity, and the cleansing flow of CSF through the brain.
The team observed that a combination of the drugs ketamine and xylazine (K/X) most closely replicated the slow and steady electrical activity in the brain and slow heart rate associated with deep non-REM sleep. Furthermore, the electrical activity in the brains of mice administered K/X appeared to be optimal for function of the glymphatic system.
“The synchronized waves of neural activity during deep slow-wave sleep, specifically firing patterns that move from the front of the brain to the back, coincide with what we know about the flow of CSF in the glymphatic system,” says Lauren Hablitz, a postdoctoral associate in Nedergaard’s lab and first author of the study.
“It appears that the chemicals involved in the firing of neurons, namely ions, drive a process of osmosis which helps pull the fluid through brain tissue.”
The study raises several important clinical questions. It further bolsters the link between sleep, aging, and Alzheimer’s disease. Researchers have known that as we age it becomes more difficult to consistently achieve deep non-REM sleep, and this study reinforces the importance of deep sleep to the proper function of the glymphatic system.
The study also demonstrates that enhancing sleep can manipulate the glymphatic system, a finding that may point to potential clinical approaches, such as sleep therapy or other methods to boost the quality of sleep, for at-risk populations.
Furthermore, because several of the compounds used in the study were analogous to anesthetics used in clinical settings, the study also sheds light on the cognitive difficulties that older patients often experience after surgery and suggests classes of drugs that could help avoid this phenomenon. Mice exposed to anesthetics that did not induce slow brain activity showed diminished glymphatic function.
“Cognitive impairment after anesthesia and surgery is a major problem,” says coauthor Tuomas Lilius with the Center for Translational Neuromedicine at the University of Copenhagen in Denmark. “A significant percentage of elderly patients that undergo surgery experience a postoperative period of delirium or have a new or worsened cognitive impairment at discharge.”
Additional researchers from the University of Rochester and the University of Copenhagen contributed to the study. The National Institute of Neurological Disorders and Stroke, the National Institute on Aging, the Adelson Foundation, the Sigrid Juselius Foundation, the Novo Nordisk Foundation, and the Lundbeck Foundation supported the research.
We’ve often heard about the negative ways technology can affect our mental health. For example, studies have shown that spending too much time on Facebook and comparing your life (and body) to those of others can cause or exacerbate depression, and most of us are constantly aware of how much faster (and more stressful) life has gotten with the advent of the smartphone.
However, in my experience, the internet is full of wonderful, easy-to-use tools for you to straighten out your brain. Here are some resources that I’ve found that can make technology work for you and your mental health.
You Feel Like Shit. You Feel Like Shit is my contribution to internet mental health: it’s a game-like self care guide that you can play through if you’re feeling bad. It asks questions and then gives recommendations based on your answers, including suggestions like playing with pets and drinking a glass of water.
Online counseling services. Therapy can be really inaccessible for a variety of reasons. Instead of traveling to a therapist near you, you can find counseling services on the web that work just like regular therapy. Check out this list from E-counseling.com to figure out which service is best for you.
Psychoeducation. Just learning about your symptoms can be a huge breakthrough, and there’s tons of information about every disorder on the internet. If you have a diagnosis, start by learning the basics and then look up your symptoms for more specific information. One app you can use for anxiety psychoeducation is SAM. Its features allow you to read about anxiety and then collect coping strategies that work for you.
To-do apps. If you struggle with stress caused by disorganization, to-do apps can change your life. I am personally a proponent of the Bullet Journal, but I recognize that it does have its flaws. (In particular, a paper journal cannot provide reminder alarms.) One that I recommend is ToDoist.
Online DBT courses. Dialectical Behavior Therapy, which was designed for people with Borderline Personality Disorder, is a selection of skills usually taught in a classroom-like setting. However, not everyone has the time for three-hour classes twice a week, even if they could really benefit from the material. Instead, try DBT Peer Connections, a YouTube channel made by a peer who wanted to bring DBT to the masses.
Guided meditation audio. Meditation, and the mindfulness that results, is a super important aspect of self care. With its budding popularity, there are tons of guided meditations out there for every use under the sun. You can find free meditation audio on YouTube, but if you’re into apps, Calm might be a great choice for you.
Crisis text line at 741 741. In a crisis, you may not be thinking clearly enough to use the skills and processes that you’ve been working on. That’s when the Crisis Text Line can step in: they will help guide you through your crisis and work on practical steps you can take to mitigate harm. All you have to do is send a text to 741 741. Another app you can use in a crisis is What’s Up. It includes grounding exercises, coping strategies, helpful information, a journal, and habit trackers (both positive and negative).
Communication apps. Regularly keep in touch with friends and loved ones who can help when you’re feeling down. You can use text messenger services (like Facebook Messenger) or video chat (like Skype). Either way, having a strong support system can make a difference in your mental health.
Woebot. Woebot is a robot that will help you with your woes. Using Cognitive Behavioral Therapy techniques, Woebot will respond intelligently to your messages and help you work on your mental health a day at a time. Woebot has Android and iOS apps, but you can also message it with Facebook Messenger.
Food trackers. Here we have to tread carefully: apps that help you lose weight are not going to do anything for your mental health and may, in fact, harm it. However, apps that help you get enough nutrition to keep your body running at its best will help you a lot, especially if you have a history of disordered eating. Some will even help you deal with urges to engage in disordered eating behavior. Some examples include RR Eating Disorder Management and Rise Up: Eating Disorder Help.
Mood trackers. Mood trackers can be particularly helpful when you’re gathering evidence so you can be diagnosed by a professional. Instead of guessing how many days a month you feel depressed, for example, you can have hard evidence. One mood tracker option is Daylio, which tracks a ton of data for you to use in your recovery.
Journal apps. Some of us, for better or worse, are glued to our phones. If a paper journal isn’t for you, you can always download a good journal app to talk out your feelings and record your insights. Paper journaling has been shown to increase mindfulness, but apps are more portable, giving you the opportunity to write at any time. If you’d like to keep a digital journal, try something designed for long-form writing like the Journey app.
Mental health games. Many app creators have taken the concept of gamification and applied it to mental health. Apps like SuperBetter or Habitica take your day-to-day activities and turn them into a game complete with achievements and rewards. If you’re a video game junkie, you can redirect your urge to win into meeting real-life goals.
Unlike a bicep or a quadricep, we can’t see or feel when our brain is turning into mush through either disuse or misuse. Instead, any atrophy will make itself known when we’re struggling to remember a very common word, getting hopelessly lost in a part of town we’re intimately familiar with, or being driven to tears trying to figure out how to set up a personal hotspot. That last one happened to me about 90 minutes ago.
While the brain isn’t literally a muscle, its function can be positively and negatively affected by the behaviors we engage in—and ones that we don’t—each and every day. Below is a litany of habits you can pick up that could help you stop fucking with your grey matter and help enhance its function instead. If you change your ways, your chances of regaining your sparkle are good.
As I’m sure you’ve noticed, sleep is extremely important to all aspects of our health. Unfortunately, we’re getting less of it than ever. As recently as the mid-1900s, people slept around nine hours per night. By 1970, that number had fallen to around 7.5 hours per night. According to the CDC, over a third of American adults get less than seven hours of shut-eye per night. “Sleep is essential for optimal neuropsychological ability,” says Virginia-based neurologist and sleep specialist W. Christopher Winter. He elaborates on this in his book, The Sleep Solution: Why Your Sleep is Broken and How to Fix It. “From interpreting nonverbal cues and emotional content to managing concentration and organizing information in our minds, sleep is vital—and restricted sleep can dramatically impact cognitive performance.”
Another sleep-related thing to consider: naps are not just for cranky toddlers. A small study from 2010 looked at the academic performance of two groups of young adults: nappers and non-nappers. In the experiment, every participant completed a rigorous learning task. After the first task, one group took a 90-minute nap while the other stayed awake until a second task was administered hours later. The participants who napped in between tasks did significantly better on the second task and also showed signs of improvement and learning.
The non-nappers, on the other hand, became worse at learning and their ability to retain information decreased. “Napping helps raise levels of alertness and can help with memory,” says clinical psychologist and sleep specialist Michael Breus. He explains that a 20 to 25 minute cat nap can help you to stay sharp when you just didn’t get enough sleep the night before, but that getting more nighttime sleep is the best solution.
Caffeinate (in moderation)
Many of us are well acquainted with coffee’s ability to get us moving in the morning, but it can also help you process things more quickly. Winter says that caffeine’s role as a performance-enhancing drug has long been known. “It helps with concentration, focus, and memory processing as well as recall,” he says. According to a study from 2012, 200 mg of caffeine (about as much as you’d find in a 12-ounce cup of coffee) can improve a person’s verbal processing speed. By providing a group of adults a 200 mg caffeine pill in the morning and then asking them to complete word-recognition tasks, researchers discovered improved speed and accuracy compared to when they completed these tasks without caffeine.
Put the bottle down once in a while
In a study in the British Medical Journal, researchers tracked the alcohol consumption and cognitive ability of more than 500 adults over 30 years. People who drank between 15 and 20 standard drinks per week turned out to be three times more likely to suffer from hippocampal atrophy—damage to the area of the brain involved in memory and spatial navigation.
Overall, drinking doesn’t “kill your brain cells,” but drinking too much too often can damage the part of your brain responsible for remembering things, which is almost as bleak. That actually leads me to my next suggestion.
Give Google a break
If you’re older than, say, 35, you can probably remember a time when you had at least a dozen phone numbers committed to memory. You may also recall certain mental tricks you employed to help you do so, such as associating certain number sequences with the location of their keys on the dial pad, or “clustering” the numbers into groups to help you retain them. Guess what? That’s called using your brain.
In today’s connected world, we’re storing information basically everywhere else. In a 2011 paper entitled Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips, researchers showed that college students recalled less information when they knew they could simply search for it instead. Winter says that a little stress can be helpful in memory formation, and offloading everything to a search engine removes that stress: knowing that you have access to all the information you’ll need “might reduce memory capacity,” he says.
Have more sex
Sometimes, after a long, hard day, the thought of energetic humping can seem so daunting that you and your partner agree to a half-assed snuggle instead. But if you’re not making sex a priority at all, it might be worth checking out some of the research that touts the benefits it might have on our brain function.
In a small 2017 study published in the Journals of Gerontology, researchers asked a group of older adults questions about their sex lives and then had them take a standardized test. This revealed a link between sex frequency and cognitive performance: people who claimed to engage in sexual activity weekly wound up having higher test scores than people who did not. It’s important to note that we can’t be certain of the direction of this effect—people who feel sharper might be more likely to be having more sex.
Still, other recent research has demonstrated a strong link between getting wild and getting smart. In 2017, another study, published in the Archives of Sexual Behavior, looked at the effect of sex on the cognitive abilities of 78 women aged between 18 and 29. Controlling for other factors such as menstrual phase and relationship length, the researchers found that women who had sex more often had better recall of abstract words on a memory test. In fact, the bulk of research done on the benefits of sex on the brain revolves around memory. People who are getting some on a regular basis may be less depressed and more emotionally satisfied too, Winter says. This, he adds, could line up with sex being cognitively beneficial and helpful with focus.
It began about a decade ago at Syracuse University, with a set of equations scrawled on a blackboard. Marc Howard, a cognitive neuroscientist now at Boston University, and Karthik Shankar, who was then one of his postdoctoral students, wanted to figure out a mathematical model of time processing: a neurologically computable function for representing the past, like a mental canvas onto which the brain could paint memories and perceptions. “Think about how the retina acts as a display that provides all kinds of visual information,” Howard said. “That’s what time is, for memory. And we want our theory to explain how that display works.”
But it’s fairly straightforward to represent a tableau of visual information, like light intensity or brightness, as functions of certain variables, like wavelength, because dedicated receptors in our eyes directly measure those qualities in what we see. The brain has no such receptors for time. “Color or shape perception, that’s much more obvious,” said Masamichi Hayashi, a cognitive neuroscientist at Osaka University in Japan. “But time is such an elusive property.” To encode that, the brain has to do something less direct.
Pinpointing what that looked like at the level of neurons became Howard and Shankar’s goal. Their only hunch going into the project, Howard said, was his “aesthetic sense that there should be a small number of simple, beautiful rules.”
They came up with equations to describe how the brain might in theory encode time indirectly. In their scheme, as sensory neurons fire in response to an unfolding event, the brain maps the temporal component of that activity to some intermediate representation of the experience—a Laplace transform, in mathematical terms. That representation allows the brain to preserve information about the event as a function of some variable it can encode rather than as a function of time (which it can’t). The brain can then map the intermediate representation back into other activity for a temporal experience—an inverse Laplace transform—to reconstruct a compressed record of what happened when.
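In symbols (my own notation for the scheme just described, not equations quoted from Howard and Shankar's papers): each unit in the population decays at its own rate $s$, so together the units hold a running Laplace transform of the input $f(t)$, and a derivative-based approximation of the inverse transform, known as Post's formula, reads a blurred timeline back out:

$$\frac{\partial F(s,t)}{\partial t} = -s\,F(s,t) + f(t) \quad\Longrightarrow\quad F(s,t) = \int_{-\infty}^{t} f(t')\,e^{-s(t-t')}\,dt',$$

$$\tilde{f}(\tau) \;\approx\; \frac{(-1)^k}{k!}\,s^{k+1}\,\frac{\partial^k F(s,t)}{\partial s^k}\bigg|_{s = k/\tau}.$$

Here $\tilde{f}(\tau)$ estimates what the input looked like $\tau$ seconds ago; the estimate blurs as $\tau$ grows and sharpens as $k$ grows, which gives the reconstructed past its compressed, fading character.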
Just a few months after Howard and Shankar started to flesh out their theory, other scientists independently uncovered neurons, dubbed “time cells,” that were “as close as we can possibly get to having that explicit record of the past,” Howard said. These cells were each tuned to certain points in a span of time, with some firing, say, one second after a stimulus and others after five seconds, essentially bridging time gaps between experiences. Scientists could look at the cells’ activity and determine when a stimulus had been presented, based on which cells had fired. This was the inverse-Laplace-transform part of the researchers’ framework, the approximation of the function of past time. “I thought, oh my god, this stuff on the blackboard, this could be the real thing,” Howard said.
“It was then I knew the brain was going to cooperate,” he added.
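To make both halves of the framework concrete, here is a small numerical sketch in Python. It is a toy illustration under the assumptions above, not the researchers' code, and every parameter (the range of decay rates, the stimulus timing, k = 4) is an arbitrary choice of mine:

```python
import math
import numpy as np

# Toy version of the scheme described above: a bank of leaky integrators,
# one per decay rate s, maintains a running Laplace transform F(s) of the
# input f(t); Post's inversion formula then reads out a blurred record of
# how long ago a stimulus happened (the "time cell" layer).

dt, T = 0.001, 10.0                  # time step and total duration (seconds)
t = np.arange(0.0, T, dt)

f = np.zeros_like(t)                 # input: a brief stimulus around t = 1 s
f[(t >= 1.0) & (t < 1.05)] = 1.0

s = np.logspace(-1, 1, 400)          # decay rates of the integrator bank
F = np.zeros_like(s)                 # running transform, one value per unit

for ft in f:                         # each unit integrates dF/dt = -s*F + f(t)
    F += dt * (-s * F + ft)

# Approximate inverse (Post's formula): the k-th derivative of F along the
# s axis, read out at s = k/tau, estimates the input tau seconds ago.
k = 4
dkF = F.copy()
for _ in range(k):
    dkF = np.gradient(dkF, s)        # numerical derivative on the s grid
f_past = ((-1) ** k / math.factorial(k)) * s ** (k + 1) * dkF
tau = k / s                          # each readout unit's preferred "time ago"

peak = tau[np.argmax(f_past)]
print(f"readout peaks at tau = {peak:.1f} s ago; true elapsed time is 9.0 s "
      f"(for small k the peak lands near k/(k+1) * 9 = {9 * k / (k + 1):.1f} s)")
```

The readout peaks a little early and is smeared out. In the theory that is not a numerical defect but the point: the record of the past is compressed, sharper for recent events and fuzzier for distant ones.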
Invigorated by empirical support for their theory, he and his colleagues have been working on a broader framework, which they hope to use to unify the brain’s wildly different types of memory, and more: If their equations are implemented by neurons, they could be used to describe not just the encoding of time but also a slew of other properties—even thought itself.
But that’s a big if. Since the discovery of time cells in 2008, the researchers had seen detailed, confirming evidence of only half of the mathematics involved. The other half—the intermediate representation of time—remained entirely theoretical.
Until last summer.
Orderings and Timestamps
In 2007, a couple of years before Howard and Shankar started tossing around ideas for their framework, Albert Tsao (now a postdoctoral researcher at Stanford University) was an undergraduate student doing an internship at the Kavli Institute for Systems Neuroscience in Norway. He spent the summer in the lab of May-Britt Moser and Edvard Moser, who had recently discovered grid cells—the neurons responsible for spatial navigation—in a brain area called the medial entorhinal cortex. Tsao wondered what its sister structure, the lateral entorhinal cortex, might be doing. Both regions provide major input to the hippocampus, which generates our “episodic” memories of experiences that occur at a particular time in a particular place. If the medial entorhinal cortex was responsible for representing the latter, Tsao reasoned, then maybe the lateral entorhinal cortex harbored a signal of time.
The kind of memory-linked time Tsao wanted to think about is deeply rooted in psychology. For us, time is a sequence of events, a measure of gradually changing content. That explains why we remember recent events better than ones from long ago, and why when a certain memory comes to mind, we tend to recall events that occurred around the same time. But how did that add up to an ordered temporal history, and what neural mechanism enabled it?
Tsao didn’t find anything at first. Even pinning down how to approach the problem was tricky because, technically, everything has some temporal quality to it. He examined the neural activity in the lateral entorhinal cortex of rats as they foraged for food in an enclosure, but he couldn’t make heads or tails of what the data showed. No distinctive time signal seemed to emerge.
Tsao tabled the work, returned to school and for years left the data alone. Later, as a graduate student in the Moser lab, he decided to revisit it, this time trying a statistical analysis of cortical neurons at a population level. That’s when he saw it: a firing pattern that, to him, looked a lot like time.
He, the Mosers and their colleagues set up experiments to test this connection further. In one series of trials, a rat was placed in a box, where it was free to roam and forage for food. The researchers recorded neural activity from the lateral entorhinal cortex and nearby brain regions. After a few minutes, they took the rat out of the box and allowed it to rest, then put it back in. They did this 12 times over about an hour and a half, alternating the colors of the walls (which could be black or white) between trials.
What looked like time-related neural behavior arose mainly in the lateral entorhinal cortex. The firing rates of those neurons abruptly spiked when the rat entered the box. As the seconds and then minutes passed, the activity of the neurons decreased at varying rates. That activity ramped up again at the start of the next trial, when the rat reentered the box. Meanwhile, in some cells, activity declined not only during each trial but throughout the entire experiment; in other cells, it increased throughout.
Based on the combination of these patterns, the researchers—and presumably the rats—could tell the different trials apart (tracing the signals back to certain sessions in the box, as if they were timestamps) and arrange them in order. Hundreds of neurons seemed to be working together to keep track of the order of the trials, and the length of each one.
“You get activity patterns that are not simply bridging delays to hold on to information but are parsing the episodic structure of experiences,” said Matthew Shapiro, a neuroscientist at Albany Medical College in New York who was not involved in the study.
The rats seemed to be using these “events”—changes in context—to get a sense of how much time had gone by. The researchers suspected that the signal might therefore look very different when the experiences weren’t so clearly divided into separate episodes. So they had rats run around a figure-eight track in a series of trials, sometimes in one direction and sometimes the other. During this repetitive task, the lateral entorhinal cortex’s time signals overlapped, likely indicating that the rats couldn’t distinguish one trial from another: They blended together in time. The neurons did, however, seem to be tracking the passage of time within single laps, where enough change occurred from one moment to the next.
Tsao and his colleagues were excited because, they posited, they had begun to tease out a mechanism behind subjective time in the brain, one that allowed memories to be distinctly tagged. “It shows how our perception of time is so elastic,” Shapiro said. “A second can last forever. Days can vanish. It’s this coding by parsing episodes that, to me, makes a very neat explanation for the way we see time. We’re processing things that happen in sequences, and what happens in those sequences can determine the subjective estimate for how much time passes.” The researchers now want to learn just how that happens.
Howard’s mathematics could help with that. When he heard about Tsao’s results, which were presented at a conference in 2017 and published in Nature last August, he was ecstatic: The different rates of decay Tsao had observed in the neural activity were exactly what his theory had predicted should happen in the brain’s intermediate representation of experience. “It looked like a Laplace transform of time,” Howard said—the piece of his and Shankar’s model that had been missing from empirical work.
“It was sort of weird,” Howard said. “We had these equations up on the board for the Laplace transform and the inverse around the same time people were discovering time cells. So we spent the last 10 years seeing the inverse, but we hadn’t seen the actual transform. … Now we’ve got it. I’m pretty stoked.”
“It was exciting,” said Kareem Zaghloul, a neurosurgeon and researcher at the National Institutes of Health in Maryland, “because the data they showed was very consistent with [Howard’s] ideas.” (In work published last month, Zaghloul and his team showed how changes in neural states in the human temporal lobe linked directly to people’s performance on a memory task.)
“There was a nonzero probability that all the work my colleagues and students and I had done was just imaginary. That it was about some set of equations that didn’t exist anywhere in the brain or in the world,” Howard added. “Seeing it there, in the data from someone else’s lab — that was a good day.”
Building Timelines of Past and Future
If Howard’s model is true, then it tells us how we create and maintain a timeline of the past—what he describes as a “trailing comet’s tail” that extends behind us as we go about our lives, getting blurrier and more compressed as it recedes into the past. That timeline could be of use not just to episodic memory in the hippocampus, but to working memory in the prefrontal cortex and conditioning responses in the striatum. These “can be understood as different operations working on the same form of temporal history,” Howard said. Even though the neural mechanisms that allow us to remember an event like our first day of school are different than those that allow us to remember a fact like a phone number or a skill like how to ride a bike, they might rely on this common foundation.
The discovery of time cells in those brain regions (“When you go looking for them, you see them everywhere,” according to Howard) seems to support the idea. So have recent findings—soon to be published by Howard, Elizabeth Buffalo at the University of Washington and other collaborators—that monkeys viewing a series of images show the same kind of temporal activity in their entorhinal cortex that Tsao observed in rats. “It’s exactly what you’d expect: the time since the image was presented,” Howard said.
He suspects that record serves not just memory but cognition as a whole. The same mathematics, he proposes, can help us understand our sense of the future, too: It becomes a matter of translating the functions involved. And that might very well help us make sense of timekeeping as it’s involved in the prediction of events to come (something that itself is based on knowledge obtained from past experiences).
Howard has also started to show that the same equations that the brain could use to represent time could also be applied to space, numerosity (our sense of numbers) and decision-making based on collected evidence — really, to any variable that can be put into the language of these equations. “For me, what’s appealing is that you’ve sort of built a neural currency for thinking,” Howard said. “If you can write out the state of the brain … what tens of millions of neurons are doing … as equations and transformations of equations, that’s thinking.”
He and his colleagues have been working on extending the theory to other domains of cognition. One day, such cognitive models could even lead to a new kind of artificial intelligence built on a different mathematical foundation than that of today’s deep learning methods. Only last month, scientists built a novel neural network model of time perception, which was based solely on measuring and reacting to changes in a visual scene. (The approach, however, focused on the sensory input part of the picture: what was happening on the surface, and not deep down in the memory-related brain regions that Tsao and Howard study.)
But before any application to AI is possible, scientists need to ascertain how the brain itself is achieving this. Tsao acknowledges that there’s still a lot to figure out, including what drives the lateral entorhinal cortex to do what it’s doing and what specifically allows memories to get tagged. But Howard’s theories offer tangible predictions that could help researchers carve out new paths toward answers.
Of course, Howard’s model of how the brain represents time isn’t the only idea out there. Some researchers, for instance, posit chains of neurons, linked by synapses, that fire sequentially. Or it could turn out that a different kind of transform, and not the Laplace transform, is at play.
Those possibilities do not dampen Howard’s enthusiasm. “This could all still be wrong,” he said. “But we’re excited and working hard.”