Is Moderate Drinking Okay?

The Atlantic

www.theatlantic.com/ideas/archive/2025/01/moderate-drinking-warning-labels-cancer/681322

Here’s a simple question: Is moderate drinking okay?

Like millions of Americans, I look forward to a glass of wine—sure, occasionally two—while cooking or eating dinner. I strongly believe that an ice-cold pilsner on a hot summer day is, to paraphrase Benjamin Franklin, suggestive evidence that a divine spirit exists and gets a kick out of seeing us buzzed.

But, like most people, I understand that booze isn’t medicine. I don’t consider a bottle of California cabernet to be the equivalent of a liquid statin. Drinking to excess is dangerous for our bodies and those around us. Having more than three or four drinks a night is strongly related to a host of diseases, including liver cirrhosis, and alcohol addiction is a scourge for those genetically predisposed to dependency.

If the evidence against heavy drinking is clear, the research on my wine-with-dinner habit is a wasteland of confusion and contradiction. This month, the U.S. surgeon general published a new recommendation that all alcohol come with a warning label indicating it increases the risk of cancer. Around the same time, a meta-analysis published by the National Academies of Sciences, Engineering, and Medicine concluded that moderate alcohol drinking is associated with a longer life. Many scientists scoffed at both of these headlines, claiming that the underlying studies are so flawed that to derive strong conclusions from them would be like trying to make a fine wine out of a bunch of supermarket grapes.

I’ve spent the past few weeks poring over studies, meta-analyses, and commentaries. I’ve crashed my web browser with an oversupply of research-paper tabs. I’ve spoken with researchers and then consulted with other scientists who disagreed with those researchers. And I’ve reached two conclusions. First, my seemingly simple question about moderate drinking may not have a simple answer. Second, I’m not making any plans to give up my nightly glass of wine.

Alcohol ambivalence has been with us for almost as long as alcohol. The notion that booze is enjoyable in small doses and hellish in excess was captured well by Eubulus, a Greek comic poet of the fourth century B.C.E., who wrote that although two bowls of wine brought “love and pleasure,” five led to “shouting,” nine led to “bile,” and 10 produced outright “madness, in that it makes people throw things.”

In the late 20th century, however, conventional wisdom lurched strongly toward the idea that moderate drinking was healthy, especially when the beverage of choice was red wine. In 1991, Morley Safer, a correspondent for CBS, recorded a segment of 60 Minutes titled “The French Paradox,” in which he pointed out that the French filled their stomachs with meat, oil, butter, and other sources of fat, yet managed to live long lives with lower rates of cardiovascular disease than their Northern European peers. “The answer to the riddle, the explanation of the paradox, may lie in this inviting glass” of red wine, Safer told viewers. Following the report, demand for red wine in the U.S. surged.

[Read: America has a drinking problem]

The notion that a glass of red wine every night is akin to medicine wasn’t just embraced by a gullible news media. It was assumed as a matter of scientific fact by many researchers. “The evidence amassed is sufficient to bracket skeptics of alcohol’s protective effects with the doubters of manned lunar landings and members of the flat-Earth society,” the behavioral psychologist and health researcher Tim Stockwell wrote in 2000.

Today, however, Stockwell is himself a flat-earther, so to speak. In the past 25 years, he has spent, he told me, “thousands and thousands of hours” reevaluating studies on alcohol and health. And now he’s convinced, as many other scientists are, that the supposed health benefits of moderate drinking were based on bad research and confounded variables.

A technical term for the so-called French paradox is the “J curve.” When you plot the number of drinks people consume along the x-axis and their risk of dying along the y-axis, most observational studies show a shallow dip at about one drink a day for women and two drinks a day for men, suggesting protection against all-cause mortality. Then the line rises—and rises and rises—confirming the idea that excessive drinking is plainly unhealthy. The resulting graph looks like a J, hence the name.

The J-curve thesis suffers from many problems, Stockwell told me. It relies on faulty comparisons between moderate drinkers and nondrinkers. Moderate drinkers tend to be richer, healthier, and more social, while nondrinkers are a motley group that includes people who have never had alcohol (who tend to be poorer), people who quit drinking alcohol because they’re sick, and even recovering alcoholics. In short, many moderate drinkers are healthy for reasons that have nothing to do with drinking, and many nondrinkers are less healthy for reasons that have nothing to do with alcohol abstention.

[Read: Not just sober-curious, but neo-temperate]

When Stockwell and his fellow researchers threw out the observational studies that were beyond salvation and adjusted the rest to account for some of the confounders I listed above, “the J curve disappeared,” he told me. By some interpretations, even a small amount of alcohol—as few as three drinks a week—seemed to increase the risk of cancer and death.

The demise of the J curve is profoundly affecting public-health guidance. In 2011, Canada’s public-health agencies said that men could safely enjoy up to three oversize drinks a night with two abstinent days a week—about 15 drinks a week. In 2023, the Canadian Centre on Substance Use and Addiction revised its guidelines to define low-risk drinking as no more than two drinks a week.

Here’s my concern: The end of the J curve has made way for a new emerging conventional wisdom—that moderate drinking is seriously risky—that is also built on flawed studies and potentially overconfident conclusions. The pendulum is swinging from flawed “red wine is basically heart medicine!” TV segments to questionable warnings about the risk of moderate drinking and cancer. After all, we’re still dealing with observational studies that struggle to account for the differences between diverse groups.

[Read: Is a glass of wine harmless? Wrong question.]

In a widely read breakdown of alcohol-health research, the scientist and author Vinay Prasad wrote that the observational research on which scientists are still basing their conclusions suffers from a litany of “old data, shitty data, confounded data, weak definitions, measurement error, multiplicity, time-zero problems, and illogical results.” As he memorably summarized the problem: “A meta-analysis is like a juicer, it only tastes as good as what you put in.” Even folks like Stockwell who are trying to turn the flawed data into useful reviews are like well-meaning chefs, toiling in the kitchen, doing their best to make coq au vin out of a lot of chicken droppings.

The U.S. surgeon general’s new report on alcohol recommended adding a more “prominent” warning label on all alcoholic beverages about cancer risks. The top-line findings were startling. Alcohol contributes to about 100,000 cancer cases and 20,000 cancer deaths each year, the surgeon general said. The guiding motivation sounded honorable. About three-fourths of adults drink once or more a week, and fewer than half of them are aware of the relationship between alcohol and cancer risk.

But many studies linking alcohol to cancer risk are bedeviled by the confounding problems facing many observational studies. For example, a study can find a relationship between moderate alcohol consumption and breast-cancer detection, but moderate consumption is correlated with income, and income with access to mammograms—so the apparent link may partly reflect who gets screened rather than who gets sick.

One of the best-established mechanisms for alcohol being related to cancer is that alcohol breaks down into acetaldehyde in the body, which binds to and damages DNA, increasing the risk that a new cell grows out of control and becomes a cancerous tumor. This mechanism has been demonstrated in animal studies. But, as Prasad points out, we don’t approve drugs based on animal studies alone; many drugs work in mice and fail in clinical trials in humans. Just because we observe a biological mechanism in mice doesn’t mean you should live your life based on the assumption that the same cellular dance is happening inside your body.

[Read: The truth about breast cancer and drinking red wine—or any alcohol]

I’m willing to believe, even in the absence of slam-dunk evidence, that alcohol increases the risk of developing certain types of cancer for certain people. But as the surgeon general’s report itself points out, it’s important to distinguish between “absolute” and “relative” risk. Owning a swimming pool dramatically increases the relative risk that somebody in the house will drown, but the absolute risk of drowning in your backyard swimming pool is blessedly low. In a similar way, some analyses have concluded that even moderate drinking can increase a person’s odds of getting mouth cancer by about 40 percent. But given that the lifetime absolute risk of developing mouth cancer is less than 1 percent, this means one drink a day increases the typical individual’s chance of developing mouth cancer by about 0.3 percentage points. The surgeon general reports that moderate drinking (say, one drink a night) increases the relative risk of breast cancer by 10 percent, but that merely raises the absolute lifetime risk of getting breast cancer from about 11 percent to about 13 percent. Assuming that the math is sound, I think that’s a good thing to know. But if you pass this information along to a friend, I think you can forgive them for saying: Sorry, I like my chardonnay more than I like your two percentage points with a wide confidence interval.
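To make the relative-versus-absolute distinction concrete, here is a minimal arithmetic sketch. The baseline risks are the approximate round figures cited above, not authoritative epidemiology, so the outputs won’t reconcile exactly with the article’s rounded numbers.

```python
# Relative-vs-absolute risk arithmetic, as described above. Baselines are
# the article's approximate figures; treat the outputs as illustrations.

def added_absolute_risk(baseline: float, relative_increase: float) -> float:
    """Percentage-point increase implied by a relative-risk increase."""
    return baseline * relative_increase

# Mouth cancer: lifetime baseline under 1%, ~40% relative increase.
mouth_added = added_absolute_risk(0.008, 0.40)  # ~0.003, i.e. ~0.3 points

# Breast cancer: ~11% lifetime baseline, ~10% relative increase per the
# surgeon general's report (the article rounds the result to ~13%).
breast_total = 0.11 + added_absolute_risk(0.11, 0.10)  # ~0.12

print(f"Mouth cancer: +{mouth_added:.2%} absolute lifetime risk")
print(f"Breast cancer: {breast_total:.1%} absolute lifetime risk")
```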

Where does this leave us? Not so far from our ancient-Greek friend Eubulus. Thousands of years and hundreds of studies after the Greek poet observed the dubious benefits of too much wine, we have much more data without much more certainty.

In her review of the literature, the economist Emily Oster concluded that “alcohol isn’t especially good for your health.” I think she’s probably right. But life isn’t—or, at least, shouldn’t be—about avoiding every activity with a whisker of risk. Cookies are not good for your health, either, as Oster points out, but only the grouchiest doctors will instruct their healthy patients to forswear Oreos. Even salubrious activities—trying to bench your body weight, getting in a car to hang out with a friend—incur the real possibility of injury.

[Read: A daily drink is almost certainly not going to hurt you]

An appreciation for uncertainty is nice, but it’s not very memorable. I wanted a takeaway about alcohol and health that I could repeat to a friend if they ever ask me to summarize this article in a sentence. So I pressed Tim Stockwell to define his most cautious conclusions in a memorable way, even if I thought he might be overconfident in his caution.

“One drink a day for men or women will reduce your life expectancy on average by about three months,” he said. Moderate drinkers should have in their mind that “every drink reduces your expected longevity by about five minutes.” (The risk compounds for heavier drinkers, he added. “If you drink at a heavier level, two or three drinks a day, that goes up to like 10, 15, 20 minutes per drink—not per drinking day, but per drink.”)

Every drink takes five minutes off your life. Maybe the thought scares you. Personally, I find great comfort in it—even as I suspect it suffers from the same flaws that plague this entire field. Several months ago, I spoke with the Stanford University scientist Euan Ashley, who studies the cellular effects of exercise. He has concluded that every minute of exercise adds five extra minutes of life.

When you put these two statistics together, you get this wonderful bit of rough longevity arithmetic: For moderate drinkers, every drink reduces your life by the same five minutes that one minute of exercise can add back. There’s a motto for healthy moderation: Have a drink? Have a jog.
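As a sanity check on the motto’s inputs, here is a rough back-of-the-envelope calculation. The five-minutes-per-drink figure is Stockwell’s; the 60-year drinking span is our assumption for illustration, not a number from the article.

```python
# Rough check of Stockwell's figures: five minutes of life expectancy
# lost per drink, at one drink a day.

MINUTES_PER_DRINK = 5
DRINKS_PER_DAY = 1
DRINKING_YEARS = 60  # assumed span of adult drinking (illustrative)

lost_minutes = MINUTES_PER_DRINK * DRINKS_PER_DAY * 365 * DRINKING_YEARS
lost_days = lost_minutes / (60 * 24)

print(f"~{lost_days:.0f} days, about {lost_days / 30:.1f} months of life")
# ~76 days -- in the ballpark of the "about three months" Stockwell cites.
```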

Even this kind of arithmetic can miss a bigger point. To reduce our existence to a mere game of minutes gained and lost is to squeeze the life out of life. Alcohol is not like a vitamin or a pill that we swiftly consume in the solitude of our bathrooms and that can be straightforwardly evaluated in controlled laboratory testing. At best, moderate alcohol consumption is enmeshed in activities that we share with other people: cooking, dinners, parties, celebrations, rituals, get-togethers—life! It is pleasure, and it is people. It is a social mortar for our age of social isolation.

[Read: The anti-social century]

An underrated aspect of the surgeon general’s report is that it is following, rather than trailblazing, a national shift away from alcohol. As recently as 2005, Americans were more likely to say that alcohol was good for their health than to say it was bad. Last year, they were more than five times as likely to call it bad as to call it good. In the first seven months of 2024, alcohol sales volume declined for beer, wine, and spirits. The decline seemed especially pronounced among young people.

To the extent that alcohol carries a serious risk of excess and addiction, less booze in America seems purely positive. But for those without religious or personal objections, healthy drinking is social drinking, and the decline of alcohol seems related to the fact that Americans now spend less time in face-to-face socializing than in any period in modern history. That some Americans are trading the blurry haze of intoxication for the crystal clarity of sobriety is a blessing for their minds and guts. But in some cases, they may be trading an ancient drug of socialization for the novel intoxicants of isolation.

MAGA’s Demon-Haunted World

The Atlantic

www.theatlantic.com/ideas/archive/2025/01/peter-thiel-maga-conspiracism/681310

Just two years ago, Dominion Voting Systems’ defamation lawsuit against Fox News showed that many right-wing influencers didn’t believe a word of the stuff they were peddling to their audiences. In text messages that surfaced during litigation, top Fox anchors and executives poured scorn on the idea that the 2020 presidential election had been stolen, even as the network amplified that conspiracy theory to its audience. “Our viewers are good people and they believe it,” Tucker Carlson wrote in one message.

Today, though, some of the country’s most mainstream, most influential conservatives are stoking paranoid conspiracism—and seem to genuinely believe what they’re saying.

The venture capitalist Peter Thiel, for example, could not be more of an establishment figure: He was an early investor in Facebook, is now a mentor of Vice President–Elect J. D. Vance, and has strong links to the U.S. defense industry through his company Palantir. But in a recent opinion column in the ultra-establishment Financial Times, Thiel sounds like The X-Files’ Fox Mulder after a long night in the Bigfoot forums. “The future demands fresh and strange ideas,” he writes.

[Read: Peter Thiel is taking a break from democracy]

After Donald Trump’s second inauguration, Thiel implies, we might finally know the truth about the assassination of President John F. Kennedy and whether the coronavirus was a bioweapon. Thiel notes that the internet also has questions about the death of the well-connected sex offender Jeffrey Epstein. “Trump’s return to the White House augurs the apokálypsis”—that is, a revealing—“of the ancien regime’s secrets,” he adds. (Two pretentious expressions in one sentence? Monsieur, watch out for hubris.) Thiel wants large-scale declassifications and a truth-and-reconciliation commission, in the model of South Africa’s reckoning with apartheid. “The apokálypsis cannot resolve our fights over 1619,” Thiel writes, referring to the year the first enslaved Africans arrived in Virginia, “but it can resolve our fights over Covid-19; it will not adjudicate the sins of our first rulers, but the sins of those who govern us today.”

Thiel portrays Trump’s resurgence as a defeat for the “Distributed Idea Suppression Complex,” or DISC—his friend and employee Eric Weinstein’s term for legacy media outlets and nongovernmental organizations that supposedly prevent politically inconvenient truths from reaching the public. Thanks to the internet, Thiel argues, information can no longer be suppressed.

Like most classic conspiracism, Thiel’s arguments contain grains of truth. As Naomi Klein’s Doppelganger wisely notes, “The line between unsupported conspiracy claims and reliable investigative research is neither as firm nor as stable as many of us would like to believe.” Thiel is right that some liberal commentators and mainstream outlets were too quick to dismiss the question of whether COVID-19 originated anywhere other than a Wuhan meat market. At one point, a New York Times reporter suggested on Twitter that racism underlay any suspicions that the virus had escaped from a research lab. None of this shows, however, that the coronavirus was a bioweapon, or that Anthony Fauci deserves to be prosecuted. (For that matter, the Times has run multiple articles that are open-minded about the origins of the virus.) In any case, who was president in 2020, when the COVID-origins debate was emerging? If any classified evidence existed that would have cleared up the controversy, Donald Trump had the power to disclose it.

[From the October 2023 issue: From feminist to right-wing conspiracist]

As for Epstein, his death, in 2019, was certainly very convenient for anyone who might have been embarrassed by what he knew. Nevertheless, the so-called mainstream media are not covering up the lingering questions; definitive answers simply aren’t available, and may never be. (CBS broadcast an episode of 60 Minutes with graphic photos of Epstein’s autopsy all the way back in January 2020.) Besides, if anyone is hiding the truth about Epstein, it’s probably not the left-wing blob. Who would benefit from killing a man who hung out with Trump, Prince Andrew, and various tech billionaires? Probably not blue-haired vegans from Portland, Oregon; racial-justice campaigners; or humanities academics. Meanwhile, the idea that anyone stifled doubts about the official explanation of the Kennedy assassination before social media is laughable. The subject has captivated Americans for more than six decades. Oliver Stone’s JFK, which highlighted florid conspiracy theories about the former president’s death, came out in 1991. The film starred Kevin Costner and won two Oscars.

Thiel’s quest for closure about the pandemic is noteworthy. Something happened during that period to drive influential, apparently rational people toward beliefs that were once associated with crackpots. Others suddenly lost trust in institutions and expertise. Bryan Johnson—a successful tech entrepreneur who is now pursuing literal immortality—went from boasting about receiving the Moderna vaccine in 2021, because he had invested in one of the companies involved in its development, to complaining that “vaccines are a holy war” and that he regretted getting a COVID shot because not enough data supported its use. This is a man who pops enough pills that if you shook him, he’d rattle.

[Read: I went to a rave with a 46-year-old millionaire who claims to have the body of a teenager]

High-profile figures across the political right have revealed their penchant for woo-woo. Skepticism of conventional medicine has become a staple of heterodox podcasts that simultaneously promote unproven, unregulated dietary supplements. My colleague Anne Applebaum has described this trend toward mysticism, fringe religious beliefs, and pseudo-spirituality as the New Obscurantism.

Until recently, I had assumed that the anti-establishment sentiments promoted by Thiel and others were merely opportunistic, a way for elites to stoke a form of anti-elitism that somehow excluded themselves as targets of popular rage. Thiel has always made a point of entertaining provocative heterodox opinions, but he has also demonstrated himself to be eloquent, analytical, and capable of going whole paragraphs without saying something unhinged. But reading his Financial Times column, I thought: My God, he actually believes this stuff. The entire tone is reminiscent of a stranger sitting down next to you on public transit and whispering that the FBI is following him.

The correct response to uncertainty is humility, not conspiracy. But conspiracy is exactly what many of those who are influential in Trump’s orbit have succumbed to—everything must be a product of the DISC, or the deep state, or the World Economic Forum, or other sinister and hidden controlling hands. The cynical Tucker Carlson of the Dominion era has given way to a more crankish version since his firing from Fox. When Carlson first went independent, he seemed to be hosting kooks for clicks. On his live tour, for example, he looked faintly embarrassed as Roseanne Barr told him that Democrats “love the taste of human flesh and they drink human blood.” And maybe he didn’t really believe the former crack user who claimed to have had a gay affair with Barack Obama, or the historian who asserts that Winston Churchill—not Adolf Hitler—was the “chief villain” in the Second World War. But at a certain point, I started to take Carlson at his word. Recently, he claimed that he’d woken up with scars and claw marks after being attacked by a demon in his bedroom. A few days before this, he said that America needed a “vigorous spanking” from Daddy Trump, and a few days after, Carlson revealed that he thought demons had invented the atom bomb. He’s clearly working through some stuff.

What can we learn from this kind of credulity? First, that maintaining an appropriate level of skepticism is the intellectual discipline needed to navigate the rest of the 2020s. Yes, the legacy media will get things wrong. But that doesn’t mean you should believe every seductive narrative floating around online, particularly when it’s peddled by those who are trying to sell you something.

The second lesson is that, no matter how smart a person might be in their business dealings, humans are all prone to the same lizard-brain preference for narratives over facts. That makes choosing your information sources carefully even more important. If you spend all day listening to people who think that every inexplicable event has a malevolent hand behind it, you will start to believe that too. The fact that this paranoia has eaten up America’s most influential men is an apokálypsis of its own.

How Sherlock Holmes Broke Copyright Law

The Atlantic

www.theatlantic.com/books/archive/2025/01/how-sherlock-holmes-broke-copyright-law/681223

Edmund Wilson hated mysteries. In the 1940s, one of the most respected literary critics in America outlined his objections to the genre in a pair of caustic essays, “Why Do People Read Detective Stories?” and “Who Cares Who Killed Roger Ackroyd?” After receiving a deluge of irate responses, Wilson conceded that he had recently been reading himself to sleep with the Sherlock Holmes series, enthralled by its “fairy-tale poetry of hansom cabs, gloomy London lodgings, and lonely country estates.” He contended, however, that Holmes’s cases occupied a special category: “They are among the most amusing of fairy-tales and not among the least distinguished.”

If Sherlock Holmes really is the last of the classic fairy-tale heroes, he may also be the first to have been protected by modern intellectual-property law. Sir Arthur Conan Doyle introduced Holmes to his loyal companion, Dr. Watson, in the 1887 novel A Study in Scarlet, but the final stories didn’t fall out of copyright until January 1, 2023. Now that the characters are unambiguously free to use, numerous Holmes projects are scheduled to premiere or begin filming in the coming year alone, including Guy Ritchie’s Young Sherlock series on Amazon; Watson, a CBS procedural starring Morris Chestnut; and Sherlock & Daughter with David Thewlis on the CW. Brendan Foley, the creator of Sherlock & Daughter, told me that “the escape of Holmes and Watson into the public domain” might not be the only explanation for the coming surge, but “it certainly didn’t hurt.”

The latest spin-offs can safely ignore the confusing rights issues that plagued earlier adaptations. For the big-budget action movies that Ritchie directed with Robert Downey Jr., Warner Bros. took the extraordinary step of signing agreements with two competing entities that claimed to own Holmes. Robert Doherty, the creator of Elementary, which reimagines Holmes in present-day New York, told me in an email that the rights situation for his CBS series was “murky” but that a deal was struck out of an abundance of caution: “I think the view on the studio side was that the characters were indeed in the public domain. At the same time, all parties wanted to tread very carefully.”

Although the Holmes copyrights have finally expired, the debacle offers a preview of even more contentious battles to come. Modern audiences have plenty of experience with the notion of a series character, developed over decades, who inspires both “fan works”—a concept that Holmes devotees essentially invented—and a seemingly endless string of reboots. For one glaring example, a little more than a decade from now, the public domain will welcome the earliest stories featuring another hero often called “the world’s greatest detective”: Batman. And his current owners will have every reason to study the playbook of the Doyle estate.

The confusion surrounding Holmes stands as a cautionary tale about the manipulation of copyright law—not by opportunists exploiting a valuable piece of intellectual property, but by the character’s official custodians. Last year marked the 50th anniversary of Nicholas Meyer’s novel The Seven-Per-Cent Solution, which imagined Sigmund Freud treating Holmes for his addiction to cocaine. Its release was delayed for months by negotiations with Doyle’s copyright holders, resulting in what Meyer now calls “no seven-per-cent solution, I promise you.”

After the novel became a best seller, Meyer wrote five sequels, including last year’s Sherlock Holmes and the Telegram From Hell, but he told me that he might never have attempted the first novel if he had foreseen the ensuing headaches: “I had done a back-of-the-envelope calculation and convinced myself that Holmes was in the public domain. Math is not my strong suit.” Yet even for the experts, untangling the facts of the case has always been a three-pipe problem—the kind of mystery that Holmes could solve only after three pipefuls of his favorite shag tobacco.

[Read: Sherlock Holmes, unlikely style icon]

Over the four decades during which Doyle wrote the original stories, international copyright law was rapidly evolving. After the author died in 1930, a colorful array of contenders fought over the rights, including his playboy sons, Denis and Adrian; Denis’s widow, the former Princess Nina Mdivani; and the producer Sheldon Reynolds and his wife, Andrea, who had a very public affair with the notorious socialite Claus von Bülow. Eventually, those rights coalesced under the Conan Doyle Estate Ltd., which is overseen by various family members. (There are no direct descendants.) But the confusion didn’t end there. The four novels and 46 short stories published before 1923 entered the public domain in 1998. Only the last 10 stories in the series were covered by the Copyright Term Extension Act—nicknamed the “Mickey Mouse Protection Act,” after its most famous beneficiary—that passed later that year, postponing future expirations of some copyrights by decades.

Yet the Doyle estate used those late stories as a wedge, arguing that it retained licensing rights for all works featuring Holmes and Watson during the remaining quarter of a century before the final tales—widely seen as the worst of the bunch—fell out of copyright. Their claim: The characters didn’t assume their definitive form until the series was complete. The estate based its argument on a distinction between “flat” and “round” fictional characters first proposed by E. M. Forster in his 1927 book, Aspects of the Novel, a concept frequently invoked in high-school literature classes but never previously tested in court.

In its legal filings, the estate drew a contrast between “flat” characters without depth—such as Superman and Amos and Andy—and “round” characters such as Holmes, who were capable of complexity and change. Doyle, it said, continued to develop Holmes to the very end, gradually transforming him from a reasoning machine into an empathetic figure who displays affection for women, dogs, and even his long-suffering partner. And it soon became clear that this argument would have enormous implications for copyright holders, who would be motivated to retain control over their characters by changing them incrementally for as long as possible.

In 2013, the estate was sued by the prominent Sherlockian Leslie S. Klinger, who refused to pay a licensing fee for an anthology of new Holmes stories by contemporary writers. Klinger said that all of the detective’s crucial components—including his “bohemian nature” and his “aptitude for disguise”—were established early in the series. (As other commentators have noted, some of Holmes’s most recognizable characteristics—the deerstalker cap, the distinctive curved calabash pipe, the phrase “Elementary, my dear Watson”—never appeared in Doyle’s stories at all.)

After the case was decided in Klinger’s favor, an appeal ended up before U.S. Circuit Judge Richard Posner, who upheld the ruling and ordered the estate to pay all legal costs, criticizing its strategy as “a form of extortion” against creators: “It’s time the estate, in its own self-interest, changed its business model.” Yet the lingering “fog of uncertainty,” as Klinger’s lawyer described it, allowed the estate to continue policing its claim on elements from the final run of stories, especially their alleged depiction of a more emotional Holmes.

In 2015, the estate filed suit against the makers of Mr. Holmes, an Ian McKellen film adapted from a novel by Mitch Cullin, who complained to a reporter, “It is cheaper for corporations to settle than go to court, and I believe the estate is not only keenly aware of that reality, but that they bank on it as an outcome.” Five years later, it went after the Netflix movie Enola Holmes, contending that the estate owned the stories that defined the version of Holmes “stamped in the public mind.” Both suits appear to have been settled privately, but with all rights now expired, the estate has turned to what its head of licensing, Tim Hubbard, described in an email as “authenticat[ing] projects and partnerships where our collaborators want to be connected to the source.” (The estate declined to address specific questions about its legal strategies or arguments.)

[Read: Generative AI is challenging a 234-year-old law]

It has also been relieved of the obligation to make an argument that Meyer, the author of The Seven-Per-Cent Solution, succinctly dismissed to me as “bullshit.” The estate created a false narrative about the character it was supposedly protecting, Meyer argued, ignoring the abundant earlier evidence of what Watson calls the “hidden fires” smoldering beneath the exterior of the otherwise rational Holmes, who displays humor, empathy, and emotion throughout the series.

One could plausibly claim, as Klinger suggested, that all of the important aspects of the character were there from the very beginning. In his seminal 1910 essay, “Studies in the Literature of Sherlock Holmes,” the theologian Ronald A. Knox identified 11 elements of the archetypal case. According to Knox, the only story that contained the full list was none other than A Study in Scarlet, which was published in the United States by J. B. Lippincott in 1890.

Betsy Rosenblatt, a law professor at Case Western Reserve University, told me that the novel’s U.S. copyright would have lasted a maximum of 56 years, meaning that the characters should have entered the public domain in America in 1946. If creators had been allowed to independently explore one of the most popular fictional characters in history eight decades sooner, our picture of Holmes might have been immeasurably enriched.

These issues aren’t merely historical or hypothetical. In 2034, the oldest Superman comics will enter the public domain, followed a year later by Batman. Jay Kogan, a senior vice president in charge of legal affairs at DC Comics, has advocated for protecting the company’s stake in its superheroes “by gradually changing the literary and visual characteristics of a character over time.” Whereas Holmes evolved organically—or so the estate has claimed—Bruce Wayne might don new costumes only so that a corporation can assert control over “the de facto standard” of the Dark Knight.

Creativity, however, doesn’t follow the logic of copyright law. Once a character becomes a cultural possession—with the “fairy-tale” quality that enchanted Edmund Wilson—even a rudimentary form will carry the aura of its other incarnations. As soon as the earliest version of Batman is freely available, creators will benefit from his full history, turning these associations to all kinds of surprising ends. This is exactly why copyrights expire. Holmes and Watson are eternal not because they are mysteriously “round,” but because they are flat enough to fit into new stories for every generation.

E. M. Forster, who defined these categories in the first place, saw that flatness can be enormously satisfying: “We all want books to endure, to be refuges, and their inhabitants to be always the same, and flat characters tend to justify themselves on this account.” Doyle himself knew that such reliability could be a source of comfort. In the 1917 story “His Last Bow,” which transposed the pair from the Victorian era into the Great War, Holmes offered a backhanded compliment to his faithful friend: “Good old Watson! You are the one fixed point in a changing age.”

Don’t Let Terror Shut America Down

The Atlantic

www.theatlantic.com/ideas/archive/2025/01/dont-let-terror-shut-us-down/681193

Updated on January 1, 2025, at 2:43 p.m. ET

Despite the devastating terror attack that killed at least 10 people on Bourbon Street in New Orleans in the early morning hours of New Year’s Day, it seemed at first as though the Sugar Bowl college-football playoff game would go ahead tonight in the city’s Superdome, less than two miles from the carnage. This afternoon, officials announced they would postpone the game for at least 24 hours.

Getting on with activities as normal, to whatever extent is possible, is the correct approach. Responses to terror or violent attacks need to be based on the specifics of the incident, but the default should always be to remain open. A nation, any nation, must have the capacity to mourn and move forward simultaneously.

The question isn’t whether proceeding with scheduled events is disrespectful to those who have been directly affected by terror. In some ways, it obviously is; the Sugar Bowl is only a college football game. But the decision should be based less on emotion and more on the level of ongoing risk, and the available security, for those who are asked to continue with their lives.

First, can the situation legitimately be described as no longer posing a continuing danger? In 2015 in Paris, a wave of terror attacks over one long night resulted in 130 deaths. The entire country was placed under what amounted to a three-month lockdown, with most public events canceled. That made some sense, given the sophistication and planning behind those attacks, and the fact that a concert hall and a sporting venue were targeted. In New Orleans, officials have struck a similarly cautious note. “People have come from all over the country,” Louisiana Representative Troy Carter told CBS, “but nothing is more important than public safety and making sure that we’re protecting the citizens and visitors alike.”

In a statement, the FBI identified the suspect as 42-year-old Shamsud-Din Jabbar, a U.S. citizen from Texas. He was killed at the scene by law-enforcement officers. An Islamic State flag was found in the vehicle, the FBI said, and law enforcement is working to determine the suspect’s affiliations. Although it remains unclear what additional information the FBI may have, the unified messaging suggests that officials are not overly concerned about a continuing risk.

Second, if a city chooses to close down or delay events, does it have clear standards for what will allow it to reopen? This was the dilemma after the Boston Marathon bombings on a Monday in 2013, when the two terrorists initially evaded law enforcement. After the Tsarnaev brothers, who had carried out the attack, killed an MIT police officer while making their escape, the governor asked residents of nearby towns to remain indoors as the search proceeded. The governor’s request, accepted by the scared public rather than enforced, ceased to be sustainable as the search dragged on for an entire day. European cities such as Brussels have faced the same issue after major attacks. It is easy to close down, but it is harder to set metrics for reopening, because perfect safety is an impossible standard.

Third, can public-safety resources and planning be redeployed or reassessed in light of the terror attack without forcing the city to a standstill? A preplanned sports event, such as the Sugar Bowl, already has in place safety and security protocols that can be amended in just a few hours to allow for more resources from other jurisdictions and changes to vehicle access. Indeed, just a day after Boston’s lockdown, the Red Sox played at Fenway with a ramped-up public-safety presence. The Hall of Fame slugger David Ortiz memorably welcomed the anxious crowd by saying, “This is our fucking city.” He was reflecting a sense that terrorists elevate their cause if they can affect entire populations, and the best response can be an insistent normalcy.

There is no perfect answer to the challenge posed by an attack, but asking the public to stay put can be unnecessary. In Maine in 2023, after a lone gunman shot and killed 18 people, the town of Lewiston and areas across southern Maine went into shelter-in-place mode for several days until he was found dead by suicide. Fear and isolation may have been unnecessarily amplified by the lockdown, which was originally issued for an indefinite period.

After the terror attacks of September 11, 2001, President George W. Bush tried to calm a grieving nation by telling citizens to still “go shopping for their families.” The quote has been mocked as both tone-deaf (the term consumer patriotism was coined) and insensitive, but the “for” is often forgotten in the retelling. No matter how terrible an attack, we still need to be there for one another—whether that means gathering or grieving or, when the time comes, just watching a football game.