Brain Food
Alzheimer's breakthroughs of mice and men, the stranger-than-fiction phenomenon of comb jelly intelligence, RIP Dobby, and three recommendations.

Thank you for reading The Garden of Forking Paths. This edition is free, but if you’d like to support my work (and keep my research and writing sustainable), consider upgrading for $4/month. I’m working on some more ambitious essays (which I’m excited to share with you), but they take a lot of time, so if you value my work, perhaps you might also enjoy the feel-good feeling that comes with clicking this little button and supporting it:
Lithium, mice brains, and reversing Alzheimer’s
Twenty-five centuries ago, the philosopher and polymath Pythagoras chunked up human lifespans into six distinct categories.1 The final two—Old Age (63 to 79) and Advanced Age (80 to death)—he called the senium.
His assessment of the final period was unkind, a decidedly unjust blow struck against octogenarians: “the scene of mortal existence closes after a great length of time that very fortunately, few of the human species arrive at, where the mind is reduced to the imbecility of the first epoch of infancy.”
Solon, another intellectual of the ancient Greek world, noted that the mind could be impaired by a variety of factors, with aging just one among many, the chief culprits being: “physical pain, violence, drugs, old age, or the persuasion of a woman.”
Shakespeare, ever the bard of the human condition, frequently riffed on the potential mental toll of aging. King Lear lamented: “I fear I am not in my perfect mind.” In The Tempest, Prospero speaks of cognitive decline: “And as with age his body uglier grows, So his mind cankers.” In Much Ado About Nothing, Dogberry, known for his malapropisms and misquotations, swaps out the phrase “When the ale is in, the wit is out” for “When the age is in, the wit is out.”
For thousands of years, then, humanity grappled with the wisdom of aged experience pitted against the perils of mental decay. And yet, there were remarkably few typologies to explain widespread variation between people. Why were some old people seasoned but sharp, while others struggled to unlock the neurological filing cabinets of their past life?
Enter, stage right, the delightfully named Jean-Étienne Dominique Esquirol (1772-1840). Esquirol sought to identify what caused dementia. He reliably identified several unambiguous causes, among them:
Progression of age
Wine abuse
Unfulfilled ambitions
Unhappy love
Masturbation
Syphilis
Hemorrhoid surgery
Poverty
Fears
Since the latter half of the 20th century, the global prevalence of dementia has soared, primarily because average human lifespans have increased. Thankfully, medical science has advanced considerably since the days of Esquirol, so we need not chalk it up to a surge in unfulfilled ambitions or wine-drunk fears.
Instead, it turns out there’s some rather good news coming out of the brains of mice recently. This month, a new study in Nature pointed to intriguing results from experiments on cognitive decline in rodents—alongside new findings about how human brains start to lose their mental sharpness.
Researchers began by measuring the levels of 27 key metals, each a critical component of healthy brain activity, in post-mortem brains from three groups of people:
Healthy people;
People with mild cognitive impairment; and
People with diagnosed Alzheimer’s disease
They found a single outlier: lithium.
The other 26 metals were pretty consistent throughout the three groups, but lithium (Li) levels were substantially lower in those with cognitive impairment. As Lynne Peeples writes, the researchers showed that “when lithium concentrations in the brain decline, memory loss tends to develop, as do neurological hallmarks of Alzheimer’s disease called amyloid plaques and tau tangles.”
(As a brief aside, lithium has long been touted as a brain tonic. The soda 7-Up was originally called “7up Lithiated Lemon Soda” in 1930, as it claimed to contain lithium citrate. But the government got involved in 1936, failed to find any evidence of lithium in the drink, and so the name was changed to 7 Up a year later.)
Having spotted lithium deficiencies in the two groups with cognitive decline, the scientists then explored and experimented with lithium deficiency in two kinds of mice: “model mice” and “wild-type mice.” (The former is a catch-all term for mice that have been altered by researchers for a specific purpose in studying human disease—in this case, mimicking the traits/pathology of Alzheimer’s—while the latter are just normal mice.)
What they found was extraordinary.
First, lithium deficiency in mice led to mental impairment, loss of memory, and symptoms akin to dementia in humans. As lithium levels declined, plaque increased, which led to even lower levels of lithium—a vicious cycle of decline that could help explain the progression into Alzheimer’s disease.
Second, they had a breakthrough: they found a specific form of lithium that, when boosted, would not just slow cognitive decline, but possibly reverse it.
Previous studies mostly focused on lithium carbonate. That matters because when that form of lithium is introduced, existing plaques (and the dreaded amyloid-β peptides) in the brain are able to “trap” it. But when the scientists used a different form, lithium orotate, it didn’t get trapped—and made a striking difference in the cognitive functioning of the mice.
Rather than simply slowing or stopping the cognitive decay, the scientists found that the lithium orotate in the mice could “roll back memory loss, restoring the brain to a younger, healthier state…low doses of lithium orotate…reversed disease-related brain damage and restored the animals’ memory. Lithium carbonate did not have the same benefits, potentially helping to explain the mixed results of earlier clinical trials.”
This is a potentially extraordinary breakthrough finding. And yet, there are important caveats. Mice, as you may have noticed, are not the same as us. These findings may yet amount to nothing, a mere curiosity in how rodent brains work that intrigues us but does little to help with managing human disease.
Nonetheless, the researchers struck a tone of cautious optimism, particularly because they were able to show—mechanistically—how cognitive decline happens with depleted lithium on the level of neurons and brain chemistry. And it may be the case that, after robust clinical trials, future interventions involve topping up brain stocks of lithium as our minds age.
However, in perhaps the most predictably depressing bit of medical forecasting, some scientists worry that the challenge may be in getting someone to fund further research. US government grants are drying up during the White House’s crusade against science. And pharmaceutical companies aren’t eager to chip in, as they stand to gain nothing from an intervention that’s basically just a metal on the periodic table.
“No pharmaceutical company is going to make any profit on lithium,” Tomas Hajek of Dalhousie University told Nature. “Lithium is dirt cheap.” How profoundly dystopian it is that a major barrier to funding medical progress is that a potentially game-changing treatment is…too affordable.
The Life of Chuck
Here, I’ll be brief: The Life of Chuck is the best film I’ve seen in the last five years. I went to see it last weekend knowing literally nothing about it. I just bought my ticket, sat down, and watched. And that is precisely what you should do. Don’t watch the trailer. Don’t read reviews. Don’t even read this sentence! Just go see it. If you’re confused in the beginning, be patient. It’s beautiful cinema, a film that made me feel overjoyed to be alive.
Alien intelligence OR What’s it like to be a comb jelly?
Longtime readers will know that I’m interested in the evolution of intelligence (and my third most-read essay is about that subject through the prism of octopus intelligence):
A fascinating question of evolutionary biology—and one that continues to divide scientists—is whether our specific form of intelligence and, eventually, consciousness, was inevitable. Alternatively, was the specific trajectory toward cognition in the animal kingdom at least partially a byproduct of chance? If so, what would be an alternative form of a nervous system that could have emerged?
A good place to explore that question comes with ctenophores (pronounced “ten-o-fors”). More colloquially, these simple other-worldly animals are known as the “comb jellies.” And some researchers now believe that their unique nervous system is evidence that animal nervous systems evolved not once, but twice.
For decades, these bizarre creatures were so overlooked by scientists—often mistakenly lumped in with jellyfish, a totally distinct branch on the tree of life—that they were sometimes even drawn upside down in textbooks.
There are admittedly some similarities. Weird propulsion. Jelly-like bodies. Both animal types have diffuse nervous systems rather than centralized brains. And in October 2024, researchers discovered that one species of comb jelly can, like the so-called “immortal jellyfish,” engage in “reverse development,” effectively aging backward by reverting to a preceding stage of its life cycle.
There are, however, significant differences between comb jellies and jellyfish, as Douglas Fox writes:
Unlike the jellyfish, which uses muscles to flap its body and swim, the ctenophore uses thousands of cilia to swim. And unlike the jellyfish with its stinging tentacles, the ctenophore hunts using two sticky tentacles that secrete glue, an adaptation with no parallel in the rest of the animal kingdom. The ctenophore is a voracious predator, known for its ambush tactics. It hunts by spreading its branched, sticky tentacles to form something like a spiderweb, and catches its prey meticulously, one by one.
Moreover, jellyfish secrete waste through their mouth, which helpfully doubles as an anus. By contrast, comb jellies have an anus—and sometimes several! One species of comb jelly, the memorably named warty comb jelly or sea walnut, has evolved a unique feature known as—I’m not making this up—a “transient anus.”
That anus is quite literally constructed only when the animal needs to defecate. Then, like a magic trick, it disappears completely. (Sidney Tamm of the Marine Biological Laboratory in Woods Hole, Massachusetts, excitedly told New Scientist of this discovery: “It is not visible when the animal is not pooping,” Tamm explained. “There’s no trace under the microscope. It’s invisible to me.”)
But such findings—which Tamm believes may be the evolutionary missing link in the origin story of the anus—went undiscovered for centuries because ctenophores were wrongly presumed to be boring, simple, and ordinary. As Fox notes:
When scientists began examining the ctenophore nervous system in the late 1800s, what they saw through their microscopes seemed ordinary. A thick tangle of neurons sat near the animal’s south pole, a diffuse network of nerves spread throughout its body, and a handful of thick nerve bundles extended to each tentacle and to each of its eight bands of cilia. Electron microscope studies in the 1960s showed what seemed to be synapses between these neurons, with bubble-like compartments poised to release neurotransmitters that would stimulate the neighbouring cell.
Scientists injected the neurons of living ctenophores with calcium – causing them to fire electric pulses, just as happens in the nerves of rats, worms, flies, snails and every other animal. By stimulating the right nerves, researchers could even prompt its cilia to rotate in different patterns – causing it to swim forward or back.
In short, the ctenophore’s nerves seemed to look and act just like those of any other animal. So biologists assumed that they were the same.
But in the last few decades, the more these creatures have been studied, the more they’re challenging conventional wisdom about the evolution of nervous systems.
Leonid Moroz, a neuroscientist at the University of Florida, has discovered something truly perplexing: that ctenophores don’t use neurotransmitters such as serotonin, dopamine and nitric oxide, which are widely considered to be the “universal neural language of all animals.”
As science advanced, fresh tools gave Moroz the opportunity to subject these animals to ever-more rigorous genomic testing. By happenstance, in 2007, some ctenophores caught his eye in the water at Friday Harbor in Washington state; he grabbed a net, collected some, froze them, and shipped them back to his lab in Florida for analysis. The eventual findings, later published in Nature, were astonishing:
“‘We all use neurotransmitters,’ Moroz says. ‘From jellyfish to worms, to molluscs, to humans, to sea urchins, you will see a very consistent set of signalling molecules.’ But, somehow, the ctenophore had evolved a nervous system in which these roles were filled by a different, as-yet unknown set of molecules.”
Even more bizarre, the ctenophores didn’t have several other ingredients that were previously assumed to be universal to all animal life, from sponges to humans. There were no micro-RNA genes or HOX genes, presumed to be essential biological ingredients for executing the blueprint of animal life. Even the lowly Placozoa, described lovingly as “blob-like animals composed of aggregations of cells” have both.
The comb jellies seem to be unique.
More befuddlement was to come. In April 2023, researchers took an even closer look at the neurons of ctenophores and found something shocking, reviving a previously settled debate.
At the dawn of the 20th century, there was a disagreement between the biologist Camillo Golgi and the man who became the father of modern neuroscience, Santiago Ramón y Cajal. Golgi argued that neurons were continuous and conjoined—inseparable. Ramón y Cajal disagreed, insisting that they were discrete cells, connected only by synapses.
Electron microscopy made it possible to resolve the debate. Ramón y Cajal was correct. And there were no exceptions: every animal with neurons had synapses connecting them.
Now, fresh research is showing that ctenophores have some neurons that are continuous—an apparent evolutionary one-off.
The upshot is this: Moroz’s theory—that ctenophores are living evidence that nervous systems evolved independently twice—has gained a lot of steam in recent years. This is a big shift. “Most people said it was a crazy idea,” Moroz told Nature. “It was against common wisdom.”
What’s particularly intriguing about all this is that scientists are still in the dark about the origins of neurons—and how such complex cells came to exist. These questions, which stretch back deep into the evolutionary past, remain fundamentally unresolved. But it tickles me to no end that some of the most profound insights about the origins of complex life and the evolution of intelligence are coming from seemingly basic, iridescent little weirdos floating around the world’s oceans, forgotten, ignored, but with profound, enigmatic truths lurking within their gelatinous bodies.
Pipeline (Podcast)
On February 25, 2022, a group of five professional divers in Trinidad and Tobago were repairing an underwater oil pipeline. Suddenly, without warning, a huge pressure differential (caused by corporate negligence) sucked all five men deep into the hollow metal tube.
It was just 30 inches wide—and they were trapped inside it, slurping oxygen from a tiny air pocket as the clock ticked down on their lives.
As an enthusiastic SCUBA diver, I initially shuddered at the thought of a podcast about such a terrifying underwater disaster. But the podcast isn’t just about cramped spaces and running out of air; it’s a story of rich vs. poor power differentials, corporate corruption, cover-ups, company cultures that prioritize reputational damage control over saving lives, and how one deadly tragedy upended an island nation’s politics.
The Thief at the British Museum (Podcast)
Five years ago, an enthusiastic art collector was browsing ancient artefacts for sale and became perplexed: how could the item be for sale when he knew—for a fact—that it was in the permanent collection of the British Museum?
This was the beginning of one man’s solo crusade to get the museum to take him seriously as he alleged a widespread theft of thousands of priceless artefacts from the most impressive collection of human heritage on the planet. And as he persisted—and was eventually vindicated—a remarkable story of a strange, large-scale inside job began to take shape.
How could nobody notice, for years, that 1,500 museum items had gone missing—and were being sold for tiny amounts of money on eBay?
Dobby’s Grave: “RIP Dongby”
I went camping for a few nights on the southwest coast of Wales this week, hiking along soaring, striated cliffs overlooking shimmering waters. It was delightful. Zorro, my trusty canine companion, was suitably enthusiastic about the trip. (He has his own sleeping bag).
While walking along the beach, we stumbled across a large stack of decorated rocks, topped off with a makeshift shrine, complete with a long sock. It turned out this was the final resting place of Dobby, the beloved house-elf from the Harry Potter films, depicted as having been buried on this exact spot.
It was a strange place, full of heartfelt tributes—an outpouring of genuine grief and tenderness—for a fictional character that clearly was tied to so many people’s personal identities, their memories of childhood, perhaps a bittersweet reminder of the capacity of evil to kill good beings, whether they ever existed or not. There was something deeply, bizarrely human about all the people who had made the pilgrimage, painstakingly, lovingly decorating rocks just to give a final goodbye to a character that touched their lives.
And then, there was my favorite memorial, written with passion by someone who came all this way just to write this tear-jerking message:
“RIP Dongby.”
You had one job!
Thank you for reading The Garden of Forking Paths. If you enjoyed this edition, or learned something new, please consider upgrading to a paid subscription for just $4/month. Your support makes this possible—and a subscription also allows you to access the full archive of 210+ essays.
Life expectancy at birth in the ancient world was often between 20 and 30 years, but that’s because so many people died in childhood. Of those who survived into adulthood, a fair number would live beyond their 50s, and some would have truly long lives (even by today’s standards).