Hypochondria Never Dies

At breakfast the other week, I noticed a bulging lump on my son’s neck. Within minutes of anxious Googling, I’d convinced myself that he had a serious undiagnosed medical condition—and the more I looked, the more apprehensive I got. Was it internal jugular phlebectasia, which might require surgery? Or a sign of lymphoma, which my father had been diagnosed with before he died? A few hours and a visit to the pediatrician later, I returned home with my tired child in tow, embarrassed but also relieved: The “problem” was just a benignly protuberant jugular vein.

My experience was hardly unique. We live in an era of mounting health worries. The ease of online medical self-diagnosis has given rise to what’s called cyberchondria: concern, fueled by consulting “Dr. Google,” that escalates into full-blown anxiety. Our medical system features ever more powerful technologies and proliferating routine preventive exams—scans that peer inside us, promising to help prolong our lives; blood tests that spot destructive inflammation; genetic screenings that assess our chances of developing disease. Intensive vigilance about our health has become the norm, simultaneously unsettling and reassuring. Many of us have experienced periods of worry before or after a mammogram or colonoscopy, or bouts of panic like mine about my son’s neck. For some, such interludes become consuming and destabilizing. Today, at least 4 percent of Americans are known to be affected by what is now labeled “health anxiety,” and some estimates suggest that the prevalence is more like 12 percent.

And yet hypochondria, you may be surprised to learn, officially no longer exists. In 2013, the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders, the so-called bible of psychiatric conditions, eliminated hypochondriasis. The change reflected an overdue need to reconceive a diagnosis that people found stigmatizing because it implied that hypochondriacs are neurotic malingerers whose symptoms aren’t “real.” The DSM introduced two distinct new diagnoses, illness anxiety disorder and somatic symptom disorder, both of which aim to be neutrally clinical descriptions of people with “extensive worries about health.” What differentiates them is the presence or absence of physical symptoms accompanying those fears.

The efforts to delineate the spectrum of health anxiety, however, fall short of clarifying the murky nature of hypochondria. The ostensibly helpful new terms are anything but. Although we know more than ever before about the diseases and mental illnesses that afflict us, the body’s most obdurate mysteries remain. Doctors and patients must navigate them together. The only way to do so is by setting aside any impulse to moralize and by embracing uncertainty—the very thing that modern medicine is least equipped to do. The abyss between patients’ subjective experience of symptoms and medicine’s desire for objectivity is hard to bridge, as the scholar Catherine Belling notes in A Condition of Doubt. This is the space where hypochondria still lives.

The timing of the writer Caroline Crampton’s new book, A Body Made of Glass: A Cultural History of Hypochondria, couldn’t be better. What her belletristic account of hypochondria’s long and twisting lineage sometimes lacks in authoritative rigor, it makes up for in vivid evocations of being a patient. Her youthful experience with cancer and the anxiety she has suffered ever since propel her undertaking: a tour that includes a sampling of evolving medical science about the condition, as well as literary reflections (from, among others, John Donne, Molière, Marcel Proust, Virginia Woolf, and Philip Larkin) on the doubt and fear that are inseparable from life in a body that gets sick.

[Read: The psychology of irrational fear]

Hypochondria, as Crampton highlights, is not just a lay term for a tendency to worry about illness that isn’t there. It’s a diagnosis that has existed for hundreds of years. The attendant symptoms and meanings have shifted continually, always in step with changing conceptions of wellness and disease. In that sense, the history of hypochondria reflects one constant: Each era’s ideas track its limited understanding of health, and demonstrate a desire for clarity about the body and illness that again and again proves elusive. Knowing this doesn’t stop Crampton from dreaming of a “definitive test for everything, including health anxiety itself.”

Hippocrates, known as the father of medicine, used the term hypochondrium in the fifth century B.C.E. to identify a physical location—the area beneath the ribs, where the spleen was known to lie. Hippocratic medicine held that health depended on a balance among four humors—blood, black bile, yellow bile, and phlegm—that affected both body and mind. An excess of black bile, thought to collect in the organs of the hypochondrium, where many people experienced unpleasant digestive symptoms, could also cause responses such as moodiness and sadness. The term hypochondria thus came to be associated, as the humoral theory persisted into the Renaissance, not only with symptoms like an upset stomach but also with sluggishness, anxiety, and melancholy—a convergence of “two seemingly unrelated processes within the body: digestive function and emotional disorder,” as Crampton notes.

By the 17th century, the notion of hypochondria as a fundamentally physical condition that also had mental symptoms had been firmly established. In The Anatomy of Melancholy (1621), the English writer and scholar Robert Burton described it as a subset of melancholia, noting a “splenetic hypochondriacal wind” accompanied by “sharp belchings” and “rumbling in the guts,” along with feeling “fearful, sad, suspicious”—an illness that, as he put it, “crucifies the body and mind.” Physicians in the 18th century began to investigate hypochondria as a disorder of the recently discovered nervous system, accounting for symptoms not just in the gut but in other parts of the body as well. According to this view, the cause wasn’t imbalanced humors but fatigue and debility of the nerves themselves.

The story of Charles Darwin, which Crampton tells in her book, illustrates the transition between the period when hypochondria was still seen primarily as a physical disease and the period when it began to look like a primarily psychological condition. Darwin, who was born in 1809, suffered from intense headaches, nausea, and gastric distress, as well as fatigue and anxiety, all of which he chronicled in a journal he called “The Diary of Health.” Although various posthumous diagnoses of organic diseases have been proposed—including systemic lactose intolerance—Crampton observes that Darwin’s need to follow strict health regimens and work routines could be interpreted as a manifestation of undue worry. This blurred line between intense (and possibly useful) self-scrutiny and mental disorder became a challenge for doctors and patients to address.

A fundamental shift had taken place by the late 19th century, thanks to the emergence of views that went on to shape modern psychology, including the idea that, as Crampton puts it, “the mind … controlled the body’s experiences and sensations, not the other way around.” Distinguished by what the neurologist George Beard, in the 1880s, called “delusions,” hypochondria was reconceived as a mental illness: It was a psychological state of unwarranted concern with one’s health.

In the 20th century, the prototypical hypochondriac became the kind of neurotic whom Woody Allen plays in Hannah and Her Sisters: someone who obsessively thinks they are sick when they’re not. Freud’s view that unexplained physical symptoms can be the body’s expression of inner conflict—meaning that those symptoms could be entirely psychological in origin—played an influential role. The idea that stress or anguish could manifest as bodily distress, in a process that came to be called “somatization,” spread. So did 20th-century medicine’s new capacity to test for and rule out specific conditions. Consider Allen’s character in that film, fretting about a brain tumor, only to have his worries assuaged by a brain scan. This newly psychologized anxiety, juxtaposed with medical science’s objective findings, helped solidify the modern image of the hypochondriac as a comedic figure, easily caricatured as a neurotic who could, and should, just “snap out of it.”

Unlike some other forms of anxiety, health worries are a problem that neither better labels nor improved treatments can hope to completely banish. Hypochondria, the writer Brian Dillon pointedly notes in The Hypochondriacs: Nine Tormented Lives, ultimately “makes dupes of us all, because life, or rather death, will have the last laugh.” In the meantime, we doubt, wait, anticipate, and try to identify: Is that stabbing headache a passing discomfort, or a sign of disease? Our bodies are subject to fluctuations, as the medical science of different eras has understood—and as today’s doctors underscore. The trick is to pay enough attention to those changes to catch problems without being devoured by the anxiety born of paying too much attention.

In retrospect, Crampton, as a high-school student in England, wasn’t anxious enough, overlooking for months a tennis-ball-size lump above her collarbone that turned out to be the result of Hodgkin’s lymphoma, a blood cancer. Her doctor told her there was a significant chance that treatment would leave her cancer-free. After chemo, radiation, one relapse, and a stem-cell transplant, she got better. But the experience left her hypervigilant about her body, anxious that she might miss a recurrence. As she reflects, “it took being cured of a life-threatening illness for me to become fixated on the idea that I might be sick.” Her conscientious self-monitoring gave way to panicked visits to urgent care and doctors’ offices, seeking relief from the thought that she was experiencing a telltale symptom—a behavior that she feels guilty about as a user of England’s overstretched National Health Service. “At some point,” she writes, “my responsible cancer survivor behavior had morphed into something else.”

[From the January/February 2014 issue: Scott Stossel on surviving anxiety]

What Crampton was suffering from—the “something else”—seems to be what the DSM now labels “illness anxiety disorder,” an “excessive” preoccupation with health that is not marked by intense physical symptoms. It applies both to people who are anxious without apparent cause or symptoms and to people like Crampton, who have survived a serious disease that might recur and are understandably, but debilitatingly, apprehensive.

It can be hard to distinguish this term, Crampton finds, from the DSM’s other one, somatic symptom disorder, which describes a disproportionate preoccupation that is accompanied by persistent physical symptoms. It applies to people who catastrophize—the person with heartburn who grows convinced that she has heart disease—as well as those with a serious disease who fixate, to their detriment, on their condition. The definition makes a point of endorsing the validity of a patient’s symptoms, whatever the cause may be; in this, it embodies a 21st-century spirit of nonjudgmental acceptance. Yet because it is a diagnosis of a mental “disorder,” it inevitably involves assessments—of, among other things, what counts as “excessive” anxiety; evaluations like these can be anything but clear-cut. Medicine’s distant and not so distant past—when multiple sclerosis was often misdiagnosed as hysteria, and cases of long COVID were dismissed as instances of pandemic anxiety—offers a caution against confidently differentiating between psychological pathology and poorly understood illness.

In Crampton’s view, the DSM’s revision has turned out to be “an extensive exercise in obfuscation.” Some physicians and researchers agree that the categories neither lump nor split groups of patients reliably or helpfully. A 2013 critique argued that somatic symptom disorder would pick up patients with “chronic pain conditions [and] patients worrying about the prognosis of a serious medical condition (e.g., diabetes, cancer),” not to mention people with undiagnosed diseases. A 2016 study failed to provide “empirical evidence for the validity of the new diagnoses,” concluding that the use of the labels won’t improve the clinical care of patients suffering from “high levels of health anxiety.”

“Hypochondria only has questions, never answers, and that makes us perpetually uneasy,” Crampton writes. Still, she finds that she almost mourns the old term. Its imperfections fit her messy experience of anxiety—and help her describe it to herself and doctors, giving “edges to a feeling of uncertainty” that she finds overwhelming. But her position, she acknowledges, is a privileged one: As a former adolescent cancer patient, she gets care when she seeks it, and doesn’t really have to worry about being stigmatized by doctors or friends.

Crampton’s concerns and her experience, that is, are legible to the medical system—to all of us. But that is not true for the millions of patients (many of them young women) suffering from fatigue or brain fog who struggle to get doctors to take their symptoms seriously, and turn out to have a condition such as myalgic encephalomyelitis/chronic fatigue syndrome or an autoimmune disease. They, too, are pulled into the story of hypochondria—yet the DSM’s labels largely fail to solve the problem these patients encounter: In the long shadow of Freud, we are still given to assuming that what clinicians call “medically unexplained symptoms” are psychological in origin. Fifteen-minute appointments in which doctors often reflexively dismiss such symptoms as indicators of anxiety don’t help. How can doctors usefully listen without time—or medical training that emphasizes the bounds of their own knowledge?

This omission is the real problem with the DSM’s revision: It pretends to have clarity we still don’t have, decisively categorizing patients rather than scrutinizing medicine’s limitations. The challenge remains: Even as evidence-based medicine laudably strives to nail down definitions and make ever-finer classifications, patients and practitioners alike need to recognize the existential uncertainty at the core of health anxiety. Only then will everyone who suffers from it be taken seriously. After all, in an era of pandemics and Dr. Google, what used to be called hypochondria is more understandable than ever.

Someday we might have the longed-for “definitive test” or a better set of labels, but right now we must acknowledge all that we still don’t know—a condition that literature, rather than medicine, diagnoses best. As John Donne memorably wrote, in the throes of an unknown illness, now suspected to have been typhus, “Variable, and therefore miserable condition of man! This minute I was well, and am ill, this minute.”

This article appears in the June 2024 print edition with the headline “Hypochondria Never Dies.”

A Fundamental Stage of Human Reproduction Is Shifting

For a long time, having children has been a young person’s game. Although ancient records are sparse, researchers estimate that, for most of human history, women most typically conceived their first child in their late teens or early 20s and stopped having kids shortly thereafter.

But in recent decades, people around the world, especially in wealthy, developed countries, have been starting their families later and later. Since the 1970s, American women have on average delayed the beginning of parenthood from age 21 to 27; Korean women have nudged the number past 32. As more women have kids in their 40s, the average age at which women give birth to any of their kids is now above 30, or fast approaching it, in most high-income nations.

Rama Singh, an evolutionary biologist at McMaster University, in Canada, thinks that if women keep having babies later in life, another fundamental reproductive stage could change: Women might start to enter menopause later too. That age currently sits around 50, a figure that some researchers believe has held since the genesis of our species. But to Singh’s mind, no ironclad biological law is stopping women’s reproductive years from stretching far past that threshold. If women decide to keep having kids at older ages, he told me, one day, hundreds of thousands of years from now, menopause could—theoretically—entirely disappear.

Singh’s viewpoint is not mainstream in his field. But shifts in human childbearing behavior aren’t the only reason that menopause may be on the move. Humans are, on the whole, living longer now, and are in several ways healthier than our ancient ancestors. And in the past few decades, especially, researchers have made technological leaps that enable them to tinker like never before with how people’s bodies function and age. All of these factors might well combine to alter menopause’s timeline. It’s a grand experiment in human reproduction, and scientists don’t yet know what the result might be.

So far, scientists have only scant evidence that the age of onset for menopause has begun to drift. Just a few studies, mostly tracking trends from recent decades, have noted a shift on the order of a year or two among women in certain Western countries, including the U.S. and Finland. Singh, though, thinks that could be just the start. Menopause can come on anywhere from a person’s 30s to their 60s, and the timing appears to be heavily influenced by genetics. That variation suggests some evolutionary wiggle room. If healthy kids keep being born to older and older parents, “I could see the age of menopause getting later,” Megan Arnot, an anthropologist at University College London, told me.

Singh’s idea assumes that menopause is not necessary for humans—or any animal, for that matter—to survive. And if a species’ primary directive is to perpetuate itself, a lifespan that substantially exceeds fertility does seem paradoxical. Researchers have found lengthy post-reproductive lifespans in only a handful of other creatures—among them, five species of toothed whales, plus a single population of wild chimpanzees. But women consistently spend a third to half of their life in menopause, the largest share of post-reproductive life documented in any mammal.

In humans, menopause occurs around the time when ovaries contain fewer than about 1,000 eggs, at which point ovulation halts and bodywide levels of hormones such as estrogen plummet. But there’s no biological imperative for female reproductive capacity to flame out after five decades of life. Each human woman is born with some 1 to 2 million eggs—comparable to what researchers have estimated in elephants, which remain fertile well into their 60s and 70s. Nor do animal eggs appear to have a built-in expiration date: Certain whales, for instance, have been documented bearing offspring past the age of 100.

[Read: Why killer whales (and humans) go through menopause]

This disconnect has led some researchers to conclude that menopause is an unfortunate evolutionary accident. Maybe, as some have argued, menopause is a by-product of long lifespans evolving so quickly that the ovaries didn’t catch up. But many women have survived well past menopause for the bulk of human history. Singh contends that menopause is a side effect of men preferring to mate with younger women, allowing fertility-compromising mutations to accumulate in aged females. (Had women been the ones to seek out only younger men, he told me, men would have evolved their own version of menopause.) Others disagree: Arnot told me that, if anything, many of today’s men may prefer younger women because fertility declines with age, rather than the other way around.

But the preponderance of evidence supports the idea that menopause is beneficial to the species in which it has evolved, including ours, Francisco Úbeda de Torres, a mathematical biologist at Royal Holloway, University of London, told me. Certainly, menopause was important enough that it appears to have arisen multiple times—at least four separate times among whales alone, Samuel Ellis, a biologist at the University of Exeter, told me.

One of the most prominent and well-backed ideas about why it evolved revolves around grandmothering. Maybe menopause evolved to rid older women of the burden of fertility, freeing up their time and energy to allow them to help their offspring raise their own needy kids. In human populations around the world, grandmother input has clearly boosted the survival of younger generations; the same appears to be true among orcas and other toothed whales. Kristen Hawkes, an anthropologist at the University of Utah, argues that the influence of menopausal grandmothering was so immense that it helped us grow bigger brains and shaped the family structures that still govern modern societies; it is, she told me, sufficient to explain menopause in humans, and what has made us the people we are today.

[From the October 2019 issue: The secret power of menopause]

Some researchers suspect that menopause may have other perks. Kevin Langergraber, an ecologist at Arizona State University, points out that certain populations of chimpanzees can also live well past menopause, even though their species doesn’t really grandmother at all. In chimpanzees and some other animals, he told me, menopause might help reduce the competition for resources between mothers and their children as they simultaneously try to raise young offspring.

Regardless of the precise reasons, menopause may be deeply ingrained in our lineage—so much so that it could be difficult to adjust or undo. After all this time of living with an early end to ovulation, there is probably “no single master time-giver” switch that could be flipped to simply extend human female fertility, Michael Cant, an evolutionary biologist at the University of Exeter, told me.

Perhaps, though, menopause’s timeline could still change—not on scales of hundreds of thousands of years, but within generations. Malnutrition and smoking, for instance, are linked to an early sunsetting of menses, while contraceptive use may push the age of menopause onset back—potentially because of the ways in which these factors can affect hormones. Menopause also tends to occur earlier among women of lower socioeconomic status and with less education. Accordingly, interventions as simple as improving childhood nutrition might be enough to raise the average start of menopause in certain parts of the world, Lynnette Sievert, an anthropologist at the University of Massachusetts at Amherst, told me.

[Read: Why so many accidental pregnancies happen in your 40s]

Changes such as those would likely operate mostly on the margins—perhaps closing some of the gaps between poorer and richer nations, which can span about five years. Bigger shifts, experts told me, would probably require medical innovation that can slow, halt, or even reverse the premature aging of the ovaries, and maintain a person’s prior levels of estrogen and other reproductive hormones. Kara Goldman, an obstetrician-gynecologist and a reproductive scientist at Northwestern University, told me that one key to the ovarian fountain of youth might be finding drugs to preserve the structures that house immature eggs in a kind of dormant early state. Other researchers see promise in rejuvenating the tissues that maintain eggs in a healthy state. Still others are generating cells and hormones in the lab in an attempt to supplement what the aging female body naturally loses. Deena Emera, an evolutionary geneticist at the Buck Institute for Research on Aging, in California, thinks some of the best inspiration could come from species that stay fertile very late into life. Bowhead whales, for instance, can reproduce past the age of 100—and don’t seem to succumb to cancer. Maybe, Emera told me, they’re especially good at repairing DNA damage in reproductive and nonreproductive cells alike.

Some women may welcome an extended interval in which to consider having kids, but Goldman and Emera are most focused on minimizing menopause’s health costs. Studies have repeatedly linked the menopause-related drop in hormones to declines in bone health; some research has pointed to cardiovascular and cognitive issues as well. Entering menopause can entail years of symptoms such as hot flashes, urinary incontinence, vaginal dryness, insomnia, and low libido. Putting all of that off, perhaps indefinitely, could extend the period in which women live healthfully, buoyed by their reproductive hormones.

[Read: Women in menopause are getting short shrift]

Extending the ovaries’ shelf life won’t necessarily reverse or even mitigate menopause’s unwanted effects, Stephanie Faubion, the director of Mayo Clinic’s Center for Women’s Health, told me. Plus, it may come with additional risks related to later-in-life pregnancies. It could also raise a woman’s chances of breast or uterine cancer, blood clots, and stroke, Jerilynn Prior, an endocrinologist at the University of British Columbia, told me. And putting off menopause may also mean more years of menstruation and contraception, a prospect that will likely give many women pause, Nanette Santoro, an obstetrician-gynecologist and a reproductive scientist at the University of Colorado School of Medicine, told me.

But several researchers think some tweaking is worth a shot. Even if menopause once helped our species survive, Goldman said, “it’s hard to imagine” that’s still the case. Evolution may have saddled us with an odd misalignment in the lifespans of the ovaries and the other organs they live alongside. But it has also equipped us with the smarts to potentially break free of those limits.