
Is COVID Immunity Hung Up on Old Variants?

The Atlantic

www.theatlantic.com/health/archive/2023/01/covid-vaccine-immunity-variant-protection/672704

In the two-plus years that COVID vaccines have been available in America, the basic recipe has changed just once. The virus, meanwhile, has belched out five variants concerning enough to earn their own Greek-letter names, followed by a menagerie of weirdly monikered Omicron subvariants, each seeming to spread faster than the last. Vaccines, which take months to reformulate, just can’t keep up with a virus that seems to reinvent itself by the week.

But SARS-CoV-2’s evolutionary sprint might not be the only reason that immunity can get bogged down in the past. The body seems to fixate on the first version of the virus that it encountered, either through injection or infection—a preoccupation with the past that researchers call “original antigenic sin,” and that may leave us with defenses that are poorly tailored to circulating variants. In recent months, some experts have begun to worry that this “sin” might now be undermining updated vaccines. At an extreme, the thinking goes, people may not get much protection from a COVID shot that is a perfect match for the viral variant du jour.

Recent data hint at this possibility. Past brushes with the virus or the original vaccine seem to mold, or even muffle, people’s reactions to bivalent shots—“I have no doubt about that,” Jenna Guthmiller, an immunologist at the University of Colorado School of Medicine, told me. The immune system just doesn’t make Omicron-focused antibodies in the quantity or quality it probably would have had it seen the updated jabs first. But there’s also an upside to this stubbornness that we could not live without, says Katelyn Gostic, an immunologist and infectious-disease modeler who has studied the phenomenon with flu. Original antigenic sin is the reason repeat infections, on average, get milder over time, and the oomph that enables vaccines to work as well as they do. “It’s a fundamental part,” Gostic told me, “of being able to create immunological memory.”

This is not just basic biology. The body’s powerful first impressions of this coronavirus can and should influence how, when, and how often we revaccinate against it, and with what. Better understanding of the degree to which these impressions linger could also help scientists figure out why people are (or are not) fighting off the latest variants—and how their defenses will fare against the virus as it continues to change.

The worst thing about “original antigenic sin” is its name. The blame for that technically lies with Thomas Francis Jr., the immunologist who coined the phrase more than six decades ago after noticing that the initial flu infections people weathered in childhood could bias how they fared against subsequent strains. “Basically, the flu you get first in life is the one you respond to most avidly for the long term,” says Gabriel Victora, an immunologist at Rockefeller University. That can become something of an issue when a very different-looking strain comes knocking.

In scenarios like these, original antigenic sin may sound like the molecular equivalent of a lovesick teen pining over an ex, or a student who never graduates out of immunological grade school. But from the immune system’s point of view, never forgetting your first is logically sound. New encounters with a pathogen catch the body off guard—and tend to be the most severe. A deep-rooted defensive reaction, then, is practical: It ups the chances that the next time the same invader shows up, it will be swiftly identified and dispatched. “Having good memory and being able to boost it very quickly is sometimes a very good thing,” Victora told me. It’s the body’s way of ensuring that it won’t get fooled twice.

[Read: Annual COVID shots mean we can stop counting]

These old grudges come with clear advantages even when microbes morph into new forms, as flu viruses and coronaviruses often do. Pathogens don’t remake themselves all at once, so immune cells that home in on familiar snippets of a virus can still in many cases snuff out enough invaders to prevent an infection’s worst effects. That’s why even flu shots that aren’t perfectly matched to the season’s most prominent strains are usually still quite good at keeping people out of hospitals and morgues. “There’s a lot of leniency in how much the virus can change before we really lose protection,” Guthmiller told me. The wiggle room should be even bigger, she said, with SARS-CoV-2, whose subvariants tend to be far more similar to one another than, say, different flu strains are.

With all the positives that immune memory can offer, many immunologists tend to roll their eyes at the negative and bizarrely moralizing implications of the phrase original antigenic sin. “I really, really hate that term,” says Deepta Bhattacharya, an immunologist at the University of Arizona. Instead, Bhattacharya and others prefer to use more neutral words such as imprinting, evocative of a duckling latching onto the first maternal figure it spots. “This is not some strange immunological phenomenon,” says Rafi Ahmed, an immunologist at Emory University. It’s more a textbook example of what an adaptable, high-functioning immune system does, and one that can have positive or negative effects, depending on context. Recent flu outbreaks have showcased a little bit of each: During the 2009 H1N1 pandemic, many elderly people, normally more susceptible to flu viruses, fared better than expected against the late-aughts strain, because they’d banked exposures to a similar-looking H1N1—a derivative of the culprit behind the 1918 pandemic—in their youth. But in some seasons that followed, H1N1 disproportionately sickened middle-aged adults whose early-life flu indoctrinations may have tilted them away from a protective response.

[Read: COVID science is moving backwards]

The backward-gazing immune systems of those adults may have done more than preferentially amplify defensive responses to a less relevant viral strain. They might have also actively suppressed the formation of a response to the new one. Part of that is sheer kinetics: Veteran immune cells, trained up on past variants and strains, tend to be quicker on the draw than fresh recruits, says Scott Hensley, an immunologist at the Perelman School of Medicine at the University of Pennsylvania. And the greater the number of experienced soldiers, the more likely they are to crowd out rookie fighters—depriving them of battlefield experience they might otherwise accrue. Should the newer viral strain eventually return for a repeat infection, those less experienced immune cells may not be adequately prepared—leaving people more vulnerable, perhaps, than they might otherwise have been.

Some researchers think that form of imprinting might now be playing out with the bivalent COVID vaccines. Several studies have found that the BA.5-focused shots are, at best, moderately more effective at producing an Omicron-targeted antibody response than the original-recipe jab—not the knockout results that some might have hoped for. Recent work in mice from Victora’s lab backs up that idea: B cells, the manufacturers of antibodies, do seem to have trouble moving past the impressions of SARS-CoV-2’s spike protein that they got from first exposure. But the findings don’t really trouble Victora, who gladly received his own bivalent COVID shot. (He’ll take the next update, too, whenever it’s ready.) A blunted response to a new vaccine, he told me, is not a nonexistent one—and the more foreign a second shot recipe is compared with the first, the more novice fighters should be expected to participate in the fight. “You’re still adding new responses,” he said, that will rev back up when they become relevant. The coronavirus is a fast evolver. But the immune system also adapts. Which means that people who receive the bivalent shot can still expect to be better protected against Omicron variants than those who don’t.

Historical flu data support this idea. Many of the middle-aged adults slammed by recent H1N1 infections may not have mounted perfect attacks on the unfamiliar virus, but as immune cells continued to tussle with the pathogen, the body “pretty quickly filled in the gaps,” Gostic told me. Although it’s tempting to view imprinting as a form of destiny, “that’s just not how the immune system works,” Guthmiller told me. Preferences can be overwritten; biases can be undone.

Original antigenic sin might not be a crisis, but its existence does suggest ways to optimize our vaccination strategies with past biases in mind. Sometimes, those preferences might need to be avoided; in other instances, they should be actively embraced.

For that to happen, though, immunologists would need to fill in some holes in their knowledge of imprinting: how often it occurs, the rules by which it operates, what can entrench or alleviate it. Even among flu viruses, where the pattern has been best-studied, plenty of murkiness remains. It’s not clear whether imprinting is stronger, for instance, when the first exposure comes via infection or vaccination. Scientists can’t yet say whether children, with their fiery yet impressionable immune systems, might be more or less prone to getting stuck on their very first flu strain. Researchers don’t even know for certain whether repetition of a first exposure—say, through multiple doses of the same vaccine, or reinfections with the same variant—will more deeply embed a particular imprint.

It does seem intuitive that multiple doses of a vaccine could exacerbate an early bias, Ahmed told me. But if that’s the case, then the same principle might also work the other way: Maybe multiple exposures to a new version of the virus could help break an old habit, and nudge the immune system to move on. Recent evidence has hinted that people previously infected with an early Omicron subvariant responded more enthusiastically to a bivalent BA.1-focused vaccine—available in the United Kingdom—than those who’d never encountered the lineage before. Hensley, at the University of Pennsylvania, is now trying to figure out if the same is true for Americans who got the BA.5-based bivalent shot after getting sick with one of the many Omicron subvariants.

Ahmed thinks that giving people two updated shots—a safer approach, he points out, than adding an infection to the mix—could untether the body from old imprints too. A few years ago, he and his colleagues showed that a second dose of a particular flu vaccine could help shift the ratio of people’s immune responses. A second dose of the fall’s bivalent vaccine might not be practical or palatable for most people, especially now that BA.5 is on its way out. But if next autumn’s recipe overlaps with BA.5 in ways that it doesn’t with the original variant—as it likely will to at least some degree, given the Omicron lineage’s continuing reign—a later, slightly different shot could still be a boon.

Keeping vaccine doses relatively spaced out—on an annual basis, say, à la flu shots—will likely help too, Bhattacharya said. His recent studies, not yet published, hint that the body might “forget” old variants, as it were, if it’s simply given more time: As antibodies raised against prior infections and injections fall away, vaccine ingredients could linger in the body rather than be destroyed by prior immunity on sight. That slightly extended stay might offer the junior members of the immune system—lesser in number, and slower on the uptake—more of an opportunity to cook up an Omicron-specific response.

In an ideal world, researchers might someday know enough about imprinting to account for its finickiness whenever they select and roll out new shots. Flu shots, for instance, could be personalized to account for which strains babies were first exposed to, based on birth year; combinations of COVID vaccine doses and infections could dictate the timing and composition of a next jab. But the world is not yet living that reality, Gostic told me. And after three years of an ever-changing coronavirus and a fluctuating approach to public health, it’s clear that there won’t be a single vaccine recipe that’s ideal for everyone at once.

Even Thomas Francis Jr. did not consider original antigenic sin to be a total negative, Hensley told me. According to Francis, the true issue with the “sin” was that humans were missing out on the chance to imprint on multiple strains at once in childhood, when the immune system is still a blank slate—something that modern researchers could soon accomplish with the development of universal vaccines. Our reliance on first impressions can be a drawback. But the same phenomenon can be an opportunity to acquaint the body with diversity early on—to give it a richer narrative, and memories of many threats to come.

Late night hosts react to classified documents found at Biden's former office

CNN

www.cnn.com/videos/media/2023/01/11/kimmel-fallon-colbert-biden-classified-documents-contd-lon-orig-tp.cnn

Late night hosts poked fun at the reports that back in November President Biden's lawyers found government records, including classified documents, at a former office that he used when he was an honorary professor at the University of Pennsylvania.

Can We Talk About How Weird Baby Mammals Are?

The Atlantic

www.theatlantic.com/science/archive/2023/01/baby-mammals-comic-proportions-bat-feet/672699

As adults, bats—the only mammals in the world capable of bona fide flight—are all about their wings. The trademark appendages can span up to 66 inches; they help bats snag insects, climb trees, attract mates, even fan their bodies in the summer heat. But as babies, bats are all about their giant clown feet.

Most mammals exit the womb with hind limbs that measure only about 20 to 60 percent of their maximum size. But compare a newborn little Japanese horseshoe bat’s foot with its mother’s, and they’re “almost identical,” says Daisuke Koyabu, an evolutionary embryologist at the University of Tsukuba, in Japan—even while the newborn’s wings remain fragile and small.

The comically flipped proportions of newborn and adult go beyond bats. Other creatures, too, weather some major anatomical transitions as they pass through puberty—a reminder that young animals sometimes live lives entirely distinct from their elders’. The changes are aesthetic, but they’re also functional. The body parts that matter most to animals later in life aren’t necessarily the ones that help them survive when they first slide out of the womb.

In a world of weird-looking babies, infant bats may reign supreme. Koyabu and his colleagues have found that the newborn animals’ hind limbs already clock in at 70 to 95 percent of their adult length. And the babies’ honking feet aren’t just for kicks. For many days after a still-developing pup is born, its mother must haul it around more or less full-time. But bat moms can’t carry their babies in the traditional sense, because “their forelimbs are wings” that are busy flapping and soaring, says Nicole Grunstra, an evolutionary anthropologist at the University of Vienna. So the infants use their ginormous feet to cling to their mother’s fur. It’s an impressive operation for both parties, considering that the winged newborns can weigh up to 45 percent as much as their parents do, the rough equivalent of a 140-pound human delivering a 63-pound infant … who also happens to have size 8 or 9 feet. (As if that weren’t bad enough, some bats are born feet first.)

[Read: Pregnancy is a war; birth is a cease-fire]

Other species’ early-life anatomy is similarly practical, if not quite so strange. Newborn whales, dolphins, and other mammals have über-developed tails, so they don’t drown in the open ocean; cows, wildebeests, and other hoofed creatures are born with highly developed legs so they can sprint about, sometimes within minutes of birth, to keep up with their herd. And many nonhuman primates have sturdy, dexterous forelimbs at birth so they can hitch a ride on their mother’s front or back. (The newborns of a few monkey species have arms and hands so strong that they can hoist themselves out of their mother’s vaginal tract, then clamber up her front for milk.)

Then there’s outsize anatomy that doesn’t make babies that much more independent—but can still help them stay safe until they can hold their own. Adult animals of many species, humans among them, go gaga over the big eyes, large foreheads, and pudgy, kissable cheeks of their young. The reaction begets more caregiving, which protects the infants while they remain in their fragile, freshly birthed state. People are so into cute that they’ve bred some of these exaggerated traits into certain companion animals, such as dogs—though puppies’ sometimes hilariously large paws aren’t all that disproportionate at birth, especially compared with the feet of their distant bat kin.

[Read: The politically subversive power of puppies]

Wonky dimensions can come with costs. After a gestation that lasts only a month or so, red kangaroos emerge from the birth canal as little more than a jelly-bean-size nub, pink and hairless and blind. At this stage, they are about as developed as an 8-to-12-week-old human fetus; the brain and skull are shrunken, the backside tapers into near nothingness, and the lower legs—so important in adulthood—are “stubby and nonfunctional,” says Kathleen Smith, an evolutionary biologist at Duke University. These still-fetal structures leave the joey in a pretty perilous state. “It’s really extreme—they could die suddenly,” says Ingmar Werneburg, an evolutionary morphologist at the University of Tübingen, in Germany, who’s been collaborating with Koyabu on bat work. But that’s the cost the little creature must pay to ensure that its short gestation prioritizes the development of its front half: a strong, sucking mouth, flanked by some seriously jacked forepaws—the bits of anatomy it requires to crawl out of its mother’s vagina and into her pouch, where it can fuse itself to a life-sustaining teat.

Part of the kangaroo’s problem is its super-short stretch in the womb. But even bats, which spend an average of three to four months in utero—surprisingly long for mammals of their size—pay a tax for their adult-size feet. They’re also born hairless, with ultra-fragile wings just a third or so of their adult size, says Taro Nojiri, a biologist at Juntendo University, in Japan, who has been studying bats with Koyabu. And they, too, have to change their development strategies when they fully enter the world.

Eventually, the Picasso-esque extremes of childhood give way to a different set of proportions—and, sometimes, a very different way of living. Once bats have graduated out of toddlerhood, their feet take a functional backseat to their echolocating mouths, sensitive ears, and powerful wings; joeys, too, execute a switch-up, as their legs and tail undergo a massive growth spurt during their months suckling in their mother’s pouch. It’s a near reversal of the animal’s entire architecture—proof positive, Koyabu told me, that “the morphology of the newborn is not just a miniature morphology of the adult.” Infant animals are not just preludes to their elders, but their own entities, with unique needs, vulnerabilities, and experiences of the world—even if that uniqueness can saddle them with bodily proportions that feel comedically out of whack.

[Read: This is your brain on puppies]

Perhaps bats’ first days would be simpler if they came out with their wings already raring to go; maybe kangaroo joeys would have an easier journey from vagina to pouch if they could just use their hind legs. But gestation is a race to funnel resources to the organs the newborn will need most, and the sooner it’s over, the better for Mom. “Pregnancy in mammals is horribly dangerous,” Smith told me. The longer it stretches on, the greater the risk to the mother, and the more time she spends with her body not entirely her own. The good news, Smith said, is that limbs and organs are quite good at playing catch-up. The important thing is to build up the bits of the infant body that will give the kid the best start possible—no matter how odd it might look.

Science Has a Crummy-Paper Problem

The Atlantic

www.theatlantic.com/newsletters/archive/2023/01/academia-research-scientific-papers-progress/672694

This is Work in Progress, a newsletter by Derek Thompson about work, technology, and how to solve some of America’s biggest problems. Sign up here to get it every week.

We should be living in a golden age of creativity in science and technology. We know more about the universe and ourselves than we did in any other period in history, and with easy access to superior research tools, our pace of discovery should be accelerating. But, as I wrote in the first edition of this newsletter, America is running out of new ideas.

“Everywhere we look we find that ideas … are getting harder to find,” a group of researchers from Stanford University and MIT famously concluded in a 2020 paper. Another paper found that “scientific knowledge has been in clear secular decline since the early 1970s,” and yet another concluded that “new ideas no longer fuel economic growth the way they once did.”

In the past year, I’ve traced the decline of scientific breakthroughs and entrepreneurship, warned that some markets can strangle novelty, and investigated the domination of old movies and songs in the film and music industries. This year, a new study titled “Papers and Patents Are Becoming Less Disruptive Over Time” inches us closer to an explanation for why the pace of knowledge has declined. The upshot is that any given paper today is much less likely to become influential than a paper in the same field from several decades ago. “Our study is the first to show that progress is slowing down, not just in one or two places, but across many domains of science and technology,” Michael Park, a co-author and professor at the University of Minnesota, told me.

The researchers relied on a metric called the Consolidation-Disruption Index—or CD Index—which measures the influence of new research. For example, if I write a crummy literature review and no scientist ever mentions my work because it’s so basic, my CD Index will be extremely low. If I publish a paradigm-shifting study and future scientists exclusively cite my work over the research I rendered irrelevant, my CD Index will be very high.
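The intuition behind the metric can be made concrete. The sketch below follows the standard published formulation of the CD Index (later papers that cite a focal work while ignoring its references signal disruption; those that cite both signal consolidation); the function name and the tiny citation graphs are invented for illustration, not drawn from the study discussed here.

```python
# Toy sketch of the CD Index. For one focal paper, each later paper that
# engages with it (or with its references) is classified:
#   n_f: cites the focal paper but none of its references -> disruptive
#   n_b: cites the focal paper AND its references -> consolidating
#   n_r: cites only the focal paper's references, ignoring the focal work
# CD = (n_f - n_b) / (n_f + n_b + n_r), ranging from -1 to +1.

def cd_index(focal_refs, later_papers):
    """focal_refs: set of the focal paper's own references.
    later_papers: list of (cites_focal, refs) pairs, where refs is each
    later paper's reference set."""
    n_f = n_b = n_r = 0
    for cites_focal, refs in later_papers:
        cites_predecessors = bool(focal_refs & refs)
        if cites_focal and not cites_predecessors:
            n_f += 1  # cites focal only: the focal work displaced its sources
        elif cites_focal and cites_predecessors:
            n_b += 1  # cites focal alongside its sources: consolidating
        elif cites_predecessors:
            n_r += 1  # skips the focal work entirely
    total = n_f + n_b + n_r
    return (n_f - n_b) / total if total else 0.0

# A paradigm-shifting paper: later work cites it and drops its predecessors.
print(cd_index({"old1", "old2"},
               [(True, {"x"}), (True, {"y"}), (True, set())]))   # 1.0

# A basic literature review: later work cites it together with the originals.
print(cd_index({"old1", "old2"},
               [(True, {"old1"}), (True, {"old2", "z"})]))       # -1.0
```

A highly disruptive study scores near +1 because it renders its predecessors unnecessary to cite; a consolidating one scores near -1 because it travels with them.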

This new paper found that the CD Index of just about every academic domain today is in full-on mayday! mayday! descent. Across broad landscapes of science and technology, the past is eating the present, progress is plunging, and truly disruptive work is hard to come by. Despite an enormous increase in scientists and papers since the middle of the 20th century, the number of highly disruptive studies each year hasn’t increased.

Why is this happening?

One possibility is that disruptive science is becoming less productive as each field becomes more advanced and the amount of knowledge new scientists have to acquire increases. This is sometimes called the “burden of knowledge” theory. Just as picking apples from a tree becomes harder after you harvest the low-hanging fruit, science becomes harder after researchers solve the easiest mysteries. This must be true, in some cases: Calculating gravity in the 1600s basically required a telescope, pen, and paper. Discovering the Higgs boson in the 21st century required constructing a $10 billion particle collider and spending billions more firing subatomic particles at one another at near–light speed. Pretending these things are the same is not useful.

A related theory is Johan S. G. Chu’s concept of “durable dominance”—a phenomenon where highly competitive fields create a small number of dominant winners. Chu and the University of Chicago scholar James Evans found that progress has slowed in many fields because scientists are so overwhelmed by the glut of information in their domain that they’re reading and riffing on the same limited canon of famous papers. It’s more or less the same principle as a weekend couch potato overwhelmed by streaming options who opts to just watch the top-ranked TV show on Netflix. In both science and streaming, a surplus of options might be entrenching a small number of massive hits.

When I spoke with the disruption paper’s co-authors last week, they seemed interested in explanations beyond the burden-of-knowledge theory. “If the low-hanging-fruit theory were sufficient, then I think we’d expect to see the oldest fields stagnate most dramatically,” said Russell Funk, a co-author and professor at the Carlson School of Management. “But the fact that the decline in disruption is happening across so many fields of science and technology points to something broader about scientific practice, and the corporatization of science, and the decline of scientific exploration in the last few decades.”

In other words, if science is getting less productive, it’s not just because we know too much about the world. It’s because we know too little about science itself. Or, more specifically, we know too little about how to conduct research in a way that gets the best, most groundbreaking results.

According to the rules of modern academia, a young academic should build status by publishing as many papers in prestigious journals as she can, harvest the citations for clout, and solicit funding institutions for more money to keep it all going. These rules may have been created with the best intentions—to fund the most promising projects and ensure the productivity of scientists. But they have created a market logic that has some concerning consequences.

First, these rules might discourage truly free exploration. As the number of Ph.D. students has grown, National Institutes of Health funding has struggled to keep up. Thus, the success rate for new project grants has mostly declined in the past 30 years. As grants have become more competitive, savvy lab directors have strategically aimed for research that seems plausible but not too radical—optimally new rather than totally new, as one researcher put it. This approach may create a surplus of papers that are designed to advance knowledge only a little. A 2020 paper suggested that the modern emphasis on citations to measure scientific productivity has shifted rewards and behavior toward incremental science and “away from exploratory projects that are more likely to fail, but which are the fuel for future breakthroughs.” As attention given to new ideas has decreased, science has stagnated.

Second, at the far extreme, these incentives might create a surplus of papers that just aren’t any good—that is, they exist purely to advance careers, not science.

“I definitely think there’s something to the idea that there are just a lot more bullshit papers out there,” Funk told me. Rather than blame individual scientists, he said the fault lies in a system that encourages volume over quality: “There are journals, which I’d consider predatory journals, that make researchers pay money to publish their papers there, with only symbolic peer review, and then the journals play games by making the authors cite articles from the same journal.”

Funk’s predatory-journal story reminded me of the dark side of Moneyball: When any industry focuses too much on one metric, it can render the metric meaningless and warp the broader purpose of the industry. Just as we are living in a platinum age of television—more quantity but perhaps not more quality—we seem to be in a platinum age of science, in which the best you can say about the industry is that there certainly seems to be more of everything, including crap.

A year ago, I pitched the idea of an abundance agenda, arguing that the U.S. suffers from a scarcity mindset in health care, housing, and beyond. The crisis in science offers an interesting test of this thesis in that researchers are struggling with a superabundance of knowledge and studies. It’s a useful reminder that abundance is not a sufficient end point; rather, it’s an input. Science may have a deficit of disruption precisely because the industry doesn’t know how to navigate its crisis of plenty—too much knowledge to synthesize, and too many papers bolstering their authors’ reputations without expanding the frontier of science.

The Meaning of Dry January

The Atlantic

www.theatlantic.com/ideas/archive/2023/01/-dry-january-challenge-2023-drinking-meaning-benefits/672695

Edward Slingerland is a philosophy professor who wrote a book arguing that alcohol has helped humans create the world as we know it. But this January, he’ll be forgoing alcohol—at least for half of the month.

Slingerland, the author of Drunk: How We Sipped, Danced, and Stumbled Our Way to Civilization, is, for the first time, participating in Dry January, the annual tradition where drinkers go sober for the first month of the year. (Slingerland is doing just half the month.) In doing so, he’ll join a growing number of Americans (according to one poll, as much as one-fifth of the population) who participate in the annual campaign, which originated in the United Kingdom a decade ago.

[From the July/August 2021 issue: America has a drinking problem]

I reached out to Slingerland because I was curious to know what he made of the annual movement—and what it says about modern society. After all, as chronicled in Drunk, humans have spent thousands of years and countless brain cells trying to get wasted. Why are so many people now voluntarily abstaining, albeit temporarily? Does Dry January speak to something larger about our culture’s ever-evolving relationship with booze?

We discussed that and more over a beer. (Just kidding. This was over Zoom and by telephone.)

Our conversation has been condensed and edited for clarity.

Caroline Mimbs Nyce: What do you make of Dry January as a cultural phenomenon?

Edward Slingerland: I think it’s a response to a recognition of the danger of alcohol. Alcohol is a dangerous substance. But for most of our history, alcohol had built-in safety features.

First, there were limits to how strong alcohol was. Then we invented distillation and disabled that safety feature. This happened in the West relatively recently, like, 1600s to 1700s. So we now have alcohol in this incredibly dangerous form that we just aren’t equipped to deal with biologically.

And then the other safety feature is that all cultures that use alcohol have very elaborate—both formal and informal—rituals or cultural norms that help people drink safely. Typically, your access was mediated socially: It was in ritual context or at least some sort of feasting-meal context. Historically, it’s unprecedented to have private access to alcohol. Only relatively recently do we have this ability to drive our SUV to a drive-through liquor store, load it up with cases and cases of vodka, bring it home, and just have it in the house.

I call these two dangers the dangers of distillation and isolation. I think things like Dry January are ways for people to try to reassert some kind of control—to reestablish some safety features.

Nyce: There’s some evidence to suggest that Gen Z has a different relationship with alcohol. Do you think a change can happen that quickly—that within, say, 20 to 50 years, depending on how you measure, a generation could develop a very distinct relationship with the substance?

Slingerland: Absolutely. I mean, look at the way that attitudes toward tobacco have changed. I think the Gen Z thing is partly that alcohol is not as cool, because it’s what your parents or your uncle drinks. And so cannabis is cool—or microdosing psilocybin. But I think these are actually a bit of a fad.

I refer to alcohol as the king of intoxicants because it’s far and away the dominant intoxicant that’s used across the world throughout history. And there’s a good reason for that. It’s got some real downsides: It’s physiologically really harmful, and quite addictive physically. But then you get all of these features that make it an ideal social drug: It’s very easy to dose; it has very predictable effects across individuals; it’s easy to make; it goes well with food. We’ve had cannabis, for instance, for a very long time—probably at least 6,000 years, maybe longer. There’s a reason that when you go to a restaurant, you’re given a wine, not a cannabis, list.

With Gen Z, there’s this idea that alcohol isn’t cool, but it’s going to be difficult for them to find a functional substitute for it.

Nyce: Do you expect alcohol to be dethroned any time soon as sort of the king of substances?

Slingerland: No way. There’s just inertia, and it has a cultural significance as well. It’s really hard to imagine that in France, for example, they’re going to start serving food with cannabis on the side and not local white wine that’s been paired with the local food for hundreds of years. You see wine traditions co-evolving with culinary traditions in various parts of the world. And that co-evolution is really hard to undo.

Nyce: In Drunk, you describe many of the positive benefits of alcohol. So I was curious what you make of Dry January, whether you just see it as a check on the negative—or if you had any concerns about it, given the way that alcohol has helped us build civilizations and helped with creativity.

Slingerland: I think it’s a quite healthy attempt to check rising consumption. January is the beginning of the year. People have just been through the holiday season, where they’ve been probably drinking quite heavily at parties and family gatherings. So it just makes sense.

During Dry January, if you’re not drinking alcohol, you’re going to lose some of the functional effects. You’re going to lose the creativity boost and social bonding. But it makes sense to endure some costs occasionally if you need to course correct.

For instance, problem drinking during the pandemic became really serious. Once you up your consumption, it’s very, very hard to dial back down. And probably the most effective way to do that is a kind of hard stop for a bit to just let your physiology reset.

Nyce: With the pandemic in particular, as you say, there’s been a problem of overconsumption, but at the same time, there’s also been a lot of loneliness. It almost feels like alcohol—in moderation—could help us with the latter. How do you think about the overconsumption problem versus the social benefits?

Slingerland: It’s tricky. The pandemic was basically a natural experiment that you would never get human-subject approval for: Let’s see what happens if no one’s allowed to leave their house, but they can order a case of tequila from their local taqueria. It was the extreme version of drinking in isolation, which was really unhealthy. People tried to keep using alcohol in a social way with things like Zoom cocktail hours, but that didn’t work very well.

There’s a new study out by researchers including University of Pittsburgh’s Michael Sayette, one of the leading alcohol researchers. In face-to-face social interactions, alcohol is very helpful. It relaxes people. It makes them less self-conscious. It makes them bond better with other people. They found that in online interactions, it actually has a reverse effect. It makes you more self-conscious. In in-person interactions with alcohol, you get a mood increase that lasts afterwards—a kind of afterglow. You get the opposite with online drinking.

When I’m interacting with you right now on Zoom, I can see myself, which wouldn’t be the case if we were in person. You just focus on yourself in a way that is not good for your mood and for the smoothness of the social interaction.

Nyce: If you were to create a user guide to alcohol, what would be in it?

Slingerland: Mimic healthy cultures. So there are some cultures that have healthier drinking practices than others. Anthropologists refer to Northern versus Southern European drinking cultures. Northern drinking cultures tend toward binge drinking; people drink hard alcohol primarily, often in groups of just men or just women. Alcohol is forbidden to kids. It’s kind of taboo. The purpose of drinking is to get drunk.

[Read: America’s favorite poison]

Anglophone college culture is kind of the worst version of this, because it’s kids without fully developed prefrontal cortices doing it, and they’re drinking distilled liquors. If you want to design the unhealthiest drinking culture possible, it would be college drinking culture.

Whereas if you look at Southern European cultures like Italy or Spain, they’re drinking primarily wine and beer. They’re always drinking in the context of a meal, so it’s always around a meal table. It’s in mixed company—kids and grandparents and parents. To drink to the point of being visibly drunk is embarrassing and actually kind of shameful.

Nyce: If you had to name or describe this era of America’s relationship with alcohol, how would you do so?

Slingerland: I don’t know if this is a catchy name, but “cautious” is how I would characterize it. You think of the ’50s Mad Men era—it was just full speed ahead, three-martini lunches. I think now people have become more aware of the dangers of alcohol and the downsides. And so we’re just more wary or cautious when it comes to alcohol than we used to be.

Nyce: And how has studying and writing about it changed your perception of your own drinking? Do you think about the research when you go to imbibe with family and friends?

Slingerland: All the time. Yeah. I think about it constantly.

Nyce: Does it ruin the experience for you?  

Slingerland: I appreciate it more in some ways, because I am not just enjoying it phenomenologically as a person, but at a meta level, I can step back and think, Oh, this is what’s happening functionally. But I’ve changed my behavior in certain ways in response to my research.

Nyce: What ways are those?

Slingerland: One thing is I’ve never really liked beer, but I’ve started drinking beer occasionally. I had a get-together—like, a kickoff event for this new postdoc on this big project that I run. In the past, I would have ordered a couple of bottles of wine for the table, because that’s what I like—I prefer wine. But instead, I got beer, because one takeaway from my research is that lower-alcohol-content beverages are better. It’s easier in a social situation to drink and continue drinking and not worry about your consumption.

Most of the social benefits of alcohol that I talk about in the book come from moderate levels of intoxication—so, like, 0.08 blood-alcohol content, or about where you should not be operating heavy machinery. If you’re drinking, like, a 4 percent lager or something, you can drink that pretty much all night and never get past 0.08. If you want to deliver ethanol to the human brain, beer is the safest way to do that. So I started actually making a place for beer in my life where I never did before.

Nyce: Have you ever done Dry January? Or ever considered it?

Slingerland: Never in the past. But my partner and I decided last week we’re going to do Half-Dry January. We live long distance from each other, and we’re apart for two weeks of January. We’re going to do a Dry January when we’re apart so that we can indulge when we’re together.

As University of Idaho students return to classes, they say the arrest of a murder suspect brings peace of mind

CNN

www.cnn.com › 2023 › 01 › 11 › us › idaho-student-killings-campus-safety-wednesday › index.html

Classes resume Wednesday at the University of Idaho, just weeks after many students abandoned the campus amidst anxieties that the person responsible for the gruesome stabbing deaths of four students in November had yet to be found.