
Humans Love Fireflies. Maybe Too Much.

The Atlantic

www.theatlantic.com › science › archive › 2023 › 07 › firefly-tourism-insect-species-threats › 674865


This article was originally published in bioGraphic.

One dusky June evening, two days before the 2022 Pennsylvania Firefly Festival, the biologist Sarah Lower sat on a back porch, watching the sky for a specific gradation of twilight. A group of Lower’s students from Bucknell University hung around her, armed with butterfly nets and stopwatches for counting the time between firefly flashes—a way to differentiate between the multiple lightning-bug species that live here at the edge of Pennsylvania’s Allegheny National Forest. This postindustrial expanse of second-growth trees and hills pimpled with oil wells also happens to rank among the world’s best places to see fireflies.

Once the cloudy sky blushed red from its last glimpse of the setting sun, I set out with Lower and her students toward the forest edge. Moving from habitat to habitat as the evening deepened, Lower narrated which species we saw and their different behaviors. Her students, meanwhile, netted their way down a wish list of research samples.


First up was Photinus macdermotti, a firefly species that emits two quick flashes. Just a few feet away, near a pond ringed by cattails where a beaver lazed face up, the students caught Photinus marginellus, a quick single flasher. Males buzzed around one patch of goldenrod, blinking quick winks at the sitting females who deigned to flash back. As in many other firefly species, P. marginellus males typically flash in flight, while females wait below on blades of grass, shooting answering flashes at only the most compelling suitors.

At first, these early-evening species looked almost like pixels of static. But the darker it got, the more they came to resemble dust motes twinkling in invisible sunbeams.

Half an hour later, we moved on. Heading across Pennsylvania Route 666 and past a modest farmhouse, we reached a small path leading down to Tionesta Creek, which parallels the road. By now the air had chilled. Twilight drained away the last notes of color, a dullness almost immediately punctuated by a yet-undescribed firefly species from the genus Photuris, nicknamed “Chinese lanterns” by Lower and her team. Each flash set the fireflies aglow for long beats of unearthly green so bright they illuminated surrounding vegetation. A student snagged one in a net, marveling at its size—several times larger than the species they’d already collected. Irritated or alarmed, the captured firefly switched to a faster pulse, reminiscent of a car alarm.

“These are the ‘I’m angry’ lights,” Lower explained.

Clumsy in the dark but reluctant to spoil our night vision with flashlights, we meandered along the creek to where a bridge spanned the water, overlooking an island spiked with conifers. From the base of the island to the tree canopy, a galaxy of fireflies shone in drifts or brief flashes, complemented by a starry sky overhead. Their flashes merged with the stars into a doubly scintillating reflection in the water below. It was a dazzling scene, and one that hundreds of people would soon flock here to see as the Firefly Festival got under way.

Around the world, firefly tourism is surging in popularity. The interest gives scientists like Lower hope that funding and conservation will follow, because fireflies—like other dark-dependent invertebrates—are succumbing to our society’s penchant for sterile lawns and careless nighttime lighting. But the choice to open any of the world’s most spectacular firefly sites to the public focuses these same pressures to a sharp point. When the founders of the Pennsylvania Firefly Festival chose to share their backyard’s magic with the world a decade ago, did they further imperil the local firefly population? Or, by giving people like me the chance to stand on a bridge, balanced between galaxies, did they play a small role in protecting one of our most beloved summer spectacles?


On another June night, in 2012, a group of visitors arrived at Ken and Peggy Butler’s bed-and-breakfast, out past reliable cell service in Forest County, Pennsylvania. Peggy was a school therapist, Ken a money manager, and they had moved out into the northwest corner of the state for the quiet and the fly-fishing.

These visitors were not the Butlers’ typical bed-and-breakfast guests. The roving band of firefly scientists lugged microscopes and butterfly nets into the Butlers’ garage, then spent the next six weeks venturing out in tick-proof gear each evening, surveying fireflies where the Butlers’ grassy backyard melted into the half-a-million-acre national forest. What they found was nothing short of astonishing—a wonderland of evolutionary biology amid the quiet, unimposing hills of rural Pennsylvania.

[Read: Will these be the last polar bears on Earth?]

One theory holds that bioluminescence emerged on Earth half a billion to 2 billion years ago in organisms to which oxygen was toxic. According to this idea, some life forms evolved a chemical process that could consume and detoxify any offending molecules while popping out a little bit of light as a harmless by-product.

Whatever its primordial purpose, bioluminescence has since emerged or reemerged at least 94 times across the tree of life, according to recent counts. The specifics of how different single-celled organisms and larger creatures accomplish their own glow-up tricks vary, but a general pattern holds across many examples. Bioluminescent organisms like fireflies have enzymes called luciferases (from the Latin lucifer, meaning “light-bringer”), which they apply inside specialized lantern organs, alongside a pinch of oxygen and a little bit of energy, to another class of compounds called luciferins. Et voilà: A photon of light comes out.

Most creatures that adapt this ancient chemistry to their own ends reside in the ocean: electric-blue crustaceans, fish that use dim lights to cloak themselves from predators, and deep-sea squid that scintillate like alien spacecraft. A few, like New Zealand’s glowworms, live in caves. Fireflies, by contrast, are easy to see, flickering at the edge of backyards, captured in jars, shining in the childhood memories of millions as a stand-in for nostalgia or wonder. Perhaps because they’re the type of bioluminescent creature people are most likely to encounter, fireflies hold a special allure—often they’re a gateway to an underappreciated, imperiled cosmos of nocturnal biodiversity.

To date, scientists have described more than 2,000 species of fireflies. Some are active during the day, communicating via pheromones. But the most well known come out during the evening or night to inscribe bursts of light into the air like species-specific autographs. The researchers who first came to survey the species in the Butlers’ backyard included Lower, who was then a graduate student, and Lynn Faust, an independent naturalist and firefly expert. The team reported at least 15 species in all, the insects living practically on top of one another.

Two species in particular stood out. The researchers spotted clouds of one famous and rare firefly, Photinus carolinus, which flashes in synchronous bursts, causing larger groups of them to light up in near unison in a wave that moves across the forest. Then they discovered what appeared to be a new species, the one they nicknamed “Chinese lanterns,” flying like lazy sparks above a campfire for long beats of electric lime green. Both these and the synchronizers, wrote Faust in the survey report, “easily reached the ‘WOW!’ level.”

For the Butlers, the choice now was whether the scientists should be vague or precise about the location of the firefly wonderland. “If you decide you don’t want to pursue anything with this, we will keep it quiet,” Faust told the Butlers. “You can just go about your lives as normal as possible.”

The Butlers evaluated their options. Make the report as specific as you like, they said. How many people could possibly come?

Faust knew the answer to that question. She had begun her own path to the forefront of firefly science not as a credentialed academic but as a young mother in 1992, when she invited scientists to her family’s cabin in the Great Smoky Mountains in Tennessee to study a spectacle her family had long referred to as “the light show.” As those scientists soon published, the family’s private light show was in fact a display of synchronous fireflies.

Before long, people wanted to see for themselves. Many people. The synchronizers in Great Smoky Mountains National Park became an annual event on par with Fourth of July fireworks, drawing more than 26,000 tourists a year. Visitors clomped through the forest, often crushing female fireflies underfoot or disorienting the insects with their flashlight beams. “I have crouched in the dark woods, illuminated by the rhythmic flashes, and wept over the unintended consequences,” Faust wrote in her 2017 book, Fireflies, Glow-worms, and Lightning Bugs, one of the few authoritative field guides to North American fireflies.

She also felt, however, that many of these clompers would otherwise never go out in the dark with eyes and hearts open to nature. Was sharing the Smoky Mountain fireflies with the world the right call? “It depends on which night you get me,” she told me recently.

Humans’ fascination with fireflies has long been smothering. In the early 20th century, hunters in the Japanese countryside stuffed fireflies into cages and shipped them to major cities such as Tokyo to glimmer out the rest of their lives as doomed mood lighting. Another wave of lightning-bug lust occurred in mid-century America, when a chemical company eager to harvest bioluminescent enzymes dispatched community groups and Boy Scouts as firefly collectors. And in China, 17 million fireflies were sold in 2016 alone, many over the eBay-like website Taobao, to customers who used them as living gifts, decorations, and Valentine’s Day–esque love tokens. (The chemical company stopped soliciting fireflies in the 1990s, and Taobao banned the sale of fireflies in 2017.)

Simply going to see fireflies poses a less obvious risk to them. But scientists have amassed some alarming reports. In Thailand, for example, where boats ferry tourists past mangrove-swamp forests pulsing with synchronous fireflies, scientists have documented shorelines eroding, gas leaking into the water, and camera flashes disturbing firefly courtship. At one popular Thai site, scientists have estimated that the population of one synchronizing-firefly species is down 80 percent since tourism began.

In a rural town in Mexico’s Tlaxcala state, where a new synchronizing-firefly species was formally recognized in 2012, tourism has since ballooned to some 120,000 visitors a year. And in North America, too, firefly tourism is on the rise. In Faust’s beloved Great Smokies, even after years of trying to throttle crowds—the National Park Service has instituted an online lottery to limit the number of visitors—some guests still head off into the forests and lie on the ground.

Tourism is far from the only threat to fireflies. As with many insects, data on lightning-bug populations are spotty, outside of a general, anecdotal sense that they’re blinking out. But insects overall are in crisis. Numerous studies suggest that within many insect groups, abundance is dwindling by 1 to 2 percent each year. An International Union for Conservation of Nature (IUCN) group found in 2020 that fireflies face three primary threats.

The first is habitat loss, which eradicates all but the hardiest lightning bugs from developed areas, leaving species like the big-dipper firefly—the pigeon of the firefly world. Second, like other insect populations, fireflies also seem to be suffering collateral damage from pesticides used in agriculture. And, finally, on top of that is light pollution: the glare of each streetlight, LED-outfitted billboard, front-porch lamp, and every other fixture left on in the night. A recent global study estimated that the collective glow of all this wasted light is making the night sky about 10 percent brighter each year, bathing ever more of the planet’s nighttime surface in light. Such artificial lights threaten to drown firefly bioluminescent courtship signals in much the same way loudspeakers blaring out static would disrupt birdsong. The entomologist Avalon Owens, who studied fireflies for her Ph.D. dissertation at Tufts University, has found that even ambient light pollution can cause some firefly species to blink less often, transforming what should be call-and-response dialogues into a series of missed connections.


Our effort to understand how quickly fireflies are disappearing is also hampered by our relative ignorance of them. North American fireflies spend much of their lives as larvae wriggling through soil, where they hunt down worms and snails, inject their prey with enzymes, and slurp up the resultant puddle of goo. Once they emerge as short-lived adults, some species are known only by a specific flash that a naturalist described seeing in a dark jungle decades ago. When the IUCN published its first firefly-conservation-status survey in 2021, focusing on 132 species in North America, it classified 18 as threatened. But it categorized 70 more only as data deficient, meaning we don’t know enough about them to say how imperiled they might be.

“Compared to what the monarch people can do, it’s so sad,” says Owens. Unlike butterfly hobbyists, who go out in clubs during the daytime and have collected decades of data on population abundances, firefly surveying has historically been a solitary activity. “Each couple of decades, you get, like, one eccentric person who spends every night in the middle of the woods,” she adds.

“Five years ago we basically knew nothing,” says Sara Lewis, a biologist at Tufts. For years, Lewis designed careful lab experiments to understand firefly reproductive structures and behaviors. Then “a switch went off in my head, and I was like, wait, what difference does it make to know [these specific details about] a group of animals that could be extinct in 50 or 100 years?” Today, Lewis co-leads the IUCN’s efforts to keep firefly populations alive.


As some firefly populations fade to black, though, general and scientific interest is swelling. More people want to see fireflies for themselves, driving firefly tourism, and more scientists want to better understand firefly biology both for its own sake and for future conservation work. Perhaps the Butlers didn’t have to make the same stark choice Lynn Faust made in the Great Smoky Mountains. Perhaps tourism and science could complement each other. Maybe people could love fireflies neither too little nor too much but just the right amount.



The Butlers’ path to sustainable firefly tourism was rocky. The summer after Lynn Faust’s report on the Allegheny National Forest fireflies was published, the Butlers hosted the first Pennsylvania Firefly Festival—a free, two-night event in the grassy field behind their house. They had food trucks, face painting, and music. Some 400 people came. The next year was similar. Then, in 2015, David Attenborough and his crew came to the property to film a documentary called Life That Glows, hiring Lower and Faust as on-site firefly wranglers. “Then we knew: This is serious,” Peggy says.

After Attenborough’s film, things got out of hand. A thousand people showed up in 2016. Cars filled the field, and as they pulled out, every pair of headlights beamed into the woods, grinding the synchronous display to a halt. “It was like, this is gonna break us,” Peggy says. “This is going to kill us because it’s going to kill the fireflies.”

[Read: The myth of the Galápagos cannot be sustained]

Since then, the Butlers have taken steps to rein in the enthusiasm. First they started charging admission, funneling the proceeds to a nonprofit called the Pennsylvania Firefly Festival, which supports research and sponsors graduate students. With advice from Lewis, they installed bleachers and red-rope lighting to keep visitors from trampling female fireflies and their habitat. After the pandemic forced a pause, they went even smaller: They sold just 100 tickets in 2022, divided into two nights.

At the same time, the Butlers built up closer ties with the scientific community, converting their bed-and-breakfast into something more like a hostel for visiting researchers. Among the scientists who kept coming back was Lower, who is studying the many firefly species that restrict their activities to the day and communicate with pheromones. Lower and her collaborators recently isolated the first known firefly pheromone, and she was at the Butlers’ in 2022 to determine what scents fireflies are using to attract one another, and whether light- and smell-based flirting are mutually exclusive.

The Butlers have also hosted research on how artificial light stifles fireflies. In recent years, ecologists have demonstrated that many species are more sensitive to blue colors of light. When Owens came here to test the least harmful colors of artificial light for fireflies in 2019, though, she found that amber-colored lights—darlings of the dark-sky environmental movement because most species, humans included, seem less bothered by them—are especially disruptive to fireflies. Red lights are still a good choice, Owens says, but the best strategy remains the most obvious: Just use light sparingly overall.

The research happening at the Butlers’ is just one part of a worldwide firefly renaissance. Setting aside habitat loss, light pollution, and pesticides, the known ranges of many firefly species seem to be expanding, Faust says, because more people are out looking. Starting from the “discovery” of synchronizing P. carolinus fireflies in the Smoky Mountains in the 1990s based on Faust’s reports, naturalists and scientists have recognized other P. carolinus outposts up and down the Appalachian Mountains. (The Xerces Society maintains a map of places that accept visitors to view these and other species.)

The same scientists whom Faust had summoned to Tennessee later documented synchrony in another American species, Photuris frontalis, which soon drew its own research scientists and crowds, which in turn helped spark the passion of new enthusiasts. After surviving a life-threatening car accident, for example, the North Carolina State University entomologist Clyde Sorenson told me he pursued research on fireflies for the pure joy of it. In 2019, Sorenson documented firefly synchrony on North Carolina’s Grandfather Mountain, and he has since been tracking down an undescribed “ghost” firefly species that emits faint green signals.

With firefly tourism on the rise as well, a team convened by Lewis published a set of recommendations in 2021 for how to manage the upswing of interest. Although tourism is unlikely to lead to global extinctions, it can certainly extirpate local populations, she says. The final report recommends robust habitat protection and education programs, including etiquette guides. For guests, that means carrying no artificial light sources and staying on marked trails; for hosts, it means limiting total visitor numbers, fencing off paths, and minimizing lighting. These are all steps that the Butlers have taken as part of their journey from wide-eyed enthusiasts to conservation advocates.

A few weeks before the 2022 Pennsylvania Firefly Festival, Ken and Peggy Butler visited their first international scientific conference, in Portugal. From the time their plane touched back down in the U.S. to the start of the festival, their days were packed with answering emails, wrangling volunteers, and accommodating an in-home guest list that had ballooned to festival presenters, interns, the troop of Bucknell researchers, and the latest visiting journalist.

Finally, a few hours before the 2022 festival’s first night, Ken and Peggy slowed down long enough to chat with me on their porch about their own learning experience. Sarah Lower listened in, pausing at one point to snatch another day-active, presumably pheromone-emitting firefly buzzing around us and slot it into a vial.

I asked whether the Butlers regretted the answer they had given to Faust a decade ago, when the choice to publish their location propelled the rest of their summers—and a sizable part of their lives—into firefly-land. “I’m a firm no,” Ken said, and Peggy agreed.

Once the festival began, local musician Matt Miskie played a set of songs, including one written for the event: “We’re out tonight,” the chorus goes, “beneath the Allegheny skies.” (He’s “the Jimmy Buffett of Western Pennsylvania,” Ken explained.) There was a merch table and exhibits: The astronomer Diane Turnshek, who had recently helped the city of Pittsburgh change its street lighting to limit light pollution, set up a booth promoting dark-sky environmentalism. Don Salvatore, a firefly naturalist and educator from New England, gave a Boston-accented presentation on firefly courtship. And then groups set out to see fireflies, guided by volunteers and the Bucknell students.

Even with the Butlers’ dedication to protecting fireflies and encouraging responsible tourism, nothing is perfect. That first evening got too cold, causing the synchronous fireflies to slow down and eventually stop flashing. One little girl, scared of the dark, had chosen to wear sneakers that burst out purple flashes with every step. An elderly woman sat in reverence and reminiscence at the edge of the woods, listening as Peggy explained firefly life history, but the car that fetched her back pierced the forest with its headlights.

And though the Butlers can control what happens on their own property, some of the most enticing firefly-viewing locations—like the magical bridge over Tionesta Creek—are public spaces subject to the choices of the entire community.

The nominal marquee show started at about 10 p.m. that evening, behind the house in the darker shadows of the woods. I stood in shivering silence, shoulder to shoulder with Miskie and a few other festival volunteers as a forest clearing’s worth of synchronous Photinus carolinus fireflies alternated between paparazzi bursts of quick white flashes and long, coordinated beats of collective quiet. A few straggling Chinese lanterns floated through their midst on their own tempo, unperturbed. Afterward, it was very, very hard to fall asleep.

Far right activists rally in Austria calling for end to ‘The Great Replacement’

Euronews

www.euronews.com › 2023 › 07 › 29 › far-right-activists-rally-in-austria-calling-for-end-to-the-great-replacement

"Natural Austrians" are becoming a minority in the country, according to far-right parties, who are marching through Vienna on Saturday to introduce the concept of "remigration" to the public.

Millions of Americans Have Stopped Attending Church. What Will Bring Them Back?

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 07 › christian-church-communitiy-participation-drop › 674843

Nearly everyone I grew up with in my childhood church in Lincoln, Nebraska, is no longer Christian. That’s not unusual. Forty million Americans have stopped attending church in the past 25 years. That’s something like 12 percent of the population, and it represents the largest concentrated change in church attendance in American history. As a Christian, I feel this shift acutely. My wife and I wonder whether the institutions and communities that have helped preserve us in our own faith will still exist for our four children, let alone whatever grandkids we might one day have.

This change is also bad news for America as a whole: Participation in a religious community generally correlates with better health outcomes and longer life, higher financial generosity, and more stable families—all of which are desperately needed in a nation with rising rates of loneliness, mental illness, and alcohol and drug dependency.

[Timothy Keller: American Christianity is due for a revival]

A new book, written by Jim Davis, a pastor at an evangelical church in Orlando, and Michael Graham, a writer with the Gospel Coalition, draws on surveys of more than 7,000 Americans by the political scientists Ryan Burge and Paul Djupe, attempting to explain why people have left churches—or “dechurched,” in the book’s lingo—and what, if anything, can be done to get some people to come back. The book raises an intriguing possibility: What if the problem isn’t that churches are asking too much of their members, but that they aren’t asking nearly enough?

The Great Dechurching finds that religious abuse and more general moral corruption in churches have driven people away. This is, of course, an indictment of the failures of many leaders who did not address abuse in their church. But Davis and Graham also find that a much larger share of those who have left church have done so for more banal reasons. The book suggests that the defining problem driving out most people who leave is … just how American life works in the 21st century. Contemporary America simply isn’t set up to promote mutuality, care, or common life. Rather, it is designed to maximize individual accomplishment as defined by professional and financial success. Such a system leaves precious little time or energy for forms of community that don’t contribute to one’s own professional life or, as one ages, the professional prospects of one’s children. Workism reigns in America, and because of it, community in America, religious community included, is a math problem that doesn’t add up.

Numerous victims of abuse in church environments can identify a moment when they lost the ability to believe, when they almost felt their faith draining out of them. The book shows, though, that for most Americans who were once a part of churches but have since left, the process of leaving was gradual, and in many cases they didn’t realize it was even happening until it already had. It’s less like jumping off a cliff and more like driving down a slope, eventually realizing that you can no longer see the place you started from.

Consider one of the composite characters that Graham and Davis use in the book to describe a typical evangelical dechurcher: a 30-something woman who grew up in a suburban megachurch, was heavily invested in a campus ministry while in college, then after graduating moved into a full-time job and began attending a young-adults group in a local church. In her 20s, she meets a guy who is less religiously engaged, they get married, and, at some point early in their marriage, after their first or second child is born, they stop going to church. Maybe the baby isn’t sleeping well and when Sunday morning comes around, it is simply easier to stay home and catch whatever sleep is available as the baby (finally) falls asleep.

In other cases, a person might be entering mid-career, working a high-stress job requiring a 60- or 70-hour workweek. Add to that 15 hours of commute time, and suddenly something like two-thirds of their waking hours in the week are already accounted for. And so when a friend invites them to a Sunday-morning brunch, they probably want to go to church, but they also want to see that friend, because they haven’t been able to see them for months. The friend wins out.

After a few weeks of either scenario, the thought of going to church on Sunday carries a certain mental burden with it—you might want to go, but you also dread the inevitable questions about where you have been. “I skipped church to go to brunch with a friend” or “I was just too tired to come” don’t sound like convincing excuses as you rehearse the conversation in your mind. Soon it actually sounds like it’d be harder to attend than to skip, even if some part of you still wants to go. The underlying challenge for many is that their lives are stretched like a rubber band about to snap—and church attendance ends up feeling like an item on a checklist that’s already too long.

What can churches do in such a context? In theory, the Christian Church could be an antidote to all that. What is more needed in our time than a community marked by sincere love, sharing what they have from each according to their ability and to each according to their need, eating together regularly, generously serving neighbors, and living lives of quiet virtue and prayer? A healthy church can be a safety net in the harsh American economy by offering its members material assistance in times of need: meals after a baby is born, money for rent after a layoff. Perhaps more important, it reminds people that their identity is not in their job or how much money they make; they are children of God, loved and protected and infinitely valuable.

But a vibrant, life-giving church requires more, not less, time and energy from its members. It asks people to prioritize one another over their careers, to prioritize prayer and time reading scripture over accomplishment. This may seem like a tough sell in an era of dechurching. If people are already leaving—especially if they are leaving because they feel too busy and burned out to attend church regularly—why would they want to be part of a church that asks so much of them?

[Read: American religion is not dead yet]

Although understandable, that isn’t quite the right question. The problem in front of us is not that we have a healthy, sustainable society that doesn’t have room for church. The problem is that many Americans have adopted a way of life that has left us lonely, anxious, and uncertain of how to live in community with other people.

The tragedy of American churches is that they have been so caught up in this same world that we now find they have nothing to offer these suffering people that can’t be more easily found somewhere else. American churches have too often been content to function as a kind of vaguely spiritual NGO, an organization of detached individuals who meet together for religious services that inspire them, provide practical life advice, or offer positive emotional experiences. Too often the church has not been a community that, through its preaching and living, bears witness to another way to live.

The theologian Stanley Hauerwas captured the problem well when he said that “pastoral care has become obsessed with the personal wounds of people in advanced industrial societies who have discovered that their lives lack meaning.” The difficulty is that many of the wounds and aches provoked by our current order aren’t of a sort that can be managed or life-hacked away. They are resolved only by changing one’s life, by becoming a radically different sort of person belonging to a radically different sort of community.

Last fall, I spent several days in New York City, during which time I visited a home owned by a group of pacifist Christians that lives from a common purse—meaning the members do not have privately held property but share their property and money. Their simple life and shared finances allow their schedules to be more flexible, making for a thicker immediate community and greater generosity to neighbors, as well as a richer life of prayer and private devotion to God, all supported by a deep commitment to their church.

This is, admittedly, an extreme example. But this community was thriving not because it found ways to scale down what it asked of its members but because it found a way to scale up what they provided to one another. Their way of living frees them from the treadmill of workism. Work, in this community, is judged not by the money it generates but by the people it serves. In a workist culture that believes dignity is grounded in accomplishment, simply reclaiming this alternative form of dignity becomes a radical act.

In the Gospels, Jesus tells his first disciples to leave their old way of life behind, going so far as abandoning their plow or fishing nets where they are and, if necessary, even leaving behind their parents. A church that doesn’t expect at least this much of its members isn’t really a church in the way Jesus spoke about it. If Graham and Davis are right, it also is likely a church that won’t survive the challenges facing us today.

The great dechurching could be the beginning of a new moment for churches, a moment marked less by aspiration to respectability and success, with less focus on individuals aligning themselves with American values and assumptions. We could be a witness to another way of life outside conventionally American measures of success. Churches could model better, truer sorts of communities, ones in which the hungry are fed, the weak are lifted up, and the proud are cast down. Such communities might not have the money, success, and influence that many American churches have so often pursued in recent years. But if such communities look less like those churches, they might also look more like the sorts of communities Jesus expected his followers to create.

The Great PowerPoint Panic of 2003

The Atlantic

www.theatlantic.com › technology › archive › 2023 › 07 › power-point-evil-tufte-history › 674797

The new media technology was going to make us stupid, to reduce all human interaction to a sales pitch. It was going to corrode our minds, degrade communication, and waste our time. Its sudden rise and rapid spread through business, government, and education augured nothing less than “the end of reason,” as one famous artist put it, for better or for worse. In the end, it would even get blamed for the live-broadcast deaths of seven Americans on national television. The year was 2003, and Americans were freaking out about the world-altering risks of … Microsoft PowerPoint.

Socrates once warned that the written word would atrophy our memory; the Renaissance polymath Conrad Gessner cautioned that the printing press would drown us in a “confusing and harmful abundance of books.” Generations since have worried that other new technologies—radio, TV, video games—would rot our children’s brains. In the past 15 years alone, this magazine has sounded the alarm on Google, smartphones, and social media. Some of these critiques seem to have aged quite well; others, not so well. But tucked among them was a techno-scare of the highest order that has now been almost entirely forgotten: the belief that PowerPoint—that most enervating member of the Office software suite, that universal metonym for soporific meetings—might be evil.

Twenty years later, the Great PowerPoint Panic reads as both a farce and a tragedy. At the time, the age of social media was dawning: MySpace and LinkedIn were newly founded, and Facebook’s launch was just months away. But even as the polarization machine hummed to life, we were fixated on the existential threat of bullet points. Did we simply miss the mark? Or, ridiculous as it may seem today, were we onto something?

Sixteen minutes before touchdown on the morning of February 1, 2003, the space shuttle Columbia disintegrated into the cloudless East Texas sky. All seven astronauts aboard were killed. As the broken shuttle hurtled toward Earth in pieces, it looked to its live TV viewers like a swarm of shooting stars.

The immediate cause of the disaster, a report from the Columbia Accident Investigation Board determined that August, was a piece of insulating foam that had broken loose and damaged the shuttle’s left wing soon after liftoff. But the report also singled out a less direct, more surprising culprit. Engineers had known about—and inappropriately discounted—the wing damage long before Columbia’s attempted reentry, but the flaws in their analysis were buried in a series of arcane and overstuffed computer-presentation slides that were shown to NASA officials. “It is easy to understand how a senior manager might read this PowerPoint slide and not realize that it addresses a life-threatening situation,” the report stated, later continuing: “The Board views the endemic use of PowerPoint briefing slides instead of technical papers as an illustration of the problematic methods of technical communication at NASA.”

PowerPoint was not then a new technology, but it was newly ubiquitous. In 1987, when the program was first released, it sold 40,000 copies. Ten years later, it sold 4 million. By the early 2000s, PowerPoint had captured 95 percent of the presentation-software market, and its growing influence on how Americans would talk and think was already giving rise to a critique. A 2001 feature in The New Yorker by Ian Parker argued that the software “helps you make a case, but it also makes its own case: about how to organize information, how much information to organize, how to look at the world.” Vint Cerf, one of the “fathers of the internet,” took to quipping that “power corrupts, and PowerPoint corrupts absolutely.”

By the start of 2003, the phrase death by PowerPoint had well and truly entered the popular lexicon. A Yale statistician named Edward Tufte was the first to take it literally: That spring, Tufte published a rip-roaring broadside titled The Cognitive Style of PowerPoint, including his analysis of the software’s role in the recent Columbia disaster. Its cover page, a political cartoon that Tufte designed himself, shows a photo of army battalions, standing in perfect columns, before a giant statue of Joseph Stalin in the center of Budapest. A speech bubble comes from one soldier’s mouth: “There’s no bullet list like Stalin’s bullet list!” Another calls out: “But why read aloud every slide?” Even Stalin speaks: “следующий слайд,” he says—“Next slide, please.”

The pamphlet’s core argument, channeling Marshall McLuhan, was that the media of communication influence the substance of communication, and PowerPoint as a medium had an obfuscatory, dumbing-down effect. It did not necessarily create vague, lazy presentations, but it certainly accommodated and sometimes even disguised them—with potentially fatal consequences. This is exactly what Tufte saw in the Columbia engineers’ slides. “The cognitive style of PP compromised the analysis,” he declared months before the NASA investigation report reached a very similar conclusion.

[Read: The Gettysburg Address as a Powerpoint]

Radical as Tufte’s position was, people took him seriously. He was already famous at the time as a public intellectual: His traveling one-day class on information design was more rock tour than lecture circuit. Hundreds of people packed into hotel ballrooms for each session. “They come to hear Edward R. Tufte,” one writer remarked at the time, “in the way the ancient Greeks must have gone to hear Socrates or would-be transcendentalists cut a path to 19th century Concord.” So when “the da Vinci of data” decided to weigh in on what would soon be called “the PowerPoint debate,” people listened.

Wired ran an excerpt from his pamphlet in September 2003, beneath the headline “PowerPoint Is Evil.” A few months later, The New York Times Magazine included Tufte’s assessment—summarized as “PowerPoint Makes You Dumb”—in its recap of the year’s most intriguing and important ideas. “Perhaps PowerPoint is uniquely suited to our modern age of obfuscation,” the entry read, noting that Colin Powell had just used the software to present evidence of Iraq’s weapons of mass destruction to the United Nations.

A few pages on was another notable entry in the magazine’s list of exciting new ideas: the social network. Even as PowerPoint was being linked with reality distortion and the rise of what Americans would soon be calling “truthiness,” the jury was still out on Friendster, LinkedIn, and other such networks. Maybe by supercharging social connection, they could alleviate our “profound national loneliness,” the write-up said. Maybe they would only “further fracture life into disparate spheres—the online and the offline.” Or maybe they wouldn’t be all that transformative—at least not compared with a technology as pervasive and influential as PowerPoint.

Tufte is now 81 years old and has long since retired. The “E.T. Tour,” which garnered, by his final count, 328,001 attendees, is over. These days, he mainly sculpts. But he is still himself: He still loathes PowerPoint. He still derives a kindergartner’s delight from calling it “PP.” And if you visit edwardtufte.com, you can still purchase his Stalin cartoon in poster form for $14.

In May, I emailed Tufte to ask how he thought his critique of PowerPoint had aged. True to form, he answered with a 16-page PDF, compiled specially for me, consisting of excerpts from his books and some blurbs about them too. He eventually agreed to speak by phone, but my first call to him went to voicemail. “In a land where time disappeared, E.T. is not available,” he incants in his outgoing message, with movie-trailer dramatics. “Your key to communication is voicemail. Or text message. Do it!” Beep.

When I finally reached E.T., I asked him whether, after 20 years of steady use, PowerPoint had really made us stupid. “I have no idea,” he said. “I’ve been on another planet. I’m an artist now.” In some sense, he went on, he’s the worst person to ask, because no one has dared show him a PowerPoint presentation since 2003. He also claimed that he hasn’t been “keeping score,” but he was aware—and appreciative—of the semi-recent revelation that his work helped inspire Jeff Bezos to ban the use of PowerPoint by senior Amazon executives.

Bezos was not the only one to see things Tufte’s way. Steve Jobs also banned PowerPoint from certain company meetings. At a 2010 military conference in North Carolina, former National Security Adviser H. R. McMaster, then an Army general, described PowerPoint as an internal threat; he had prohibited its use during the assault on the Iraqi city of Tal Afar in 2005. “PowerPoint makes us stupid,” General James Mattis said at the same conference. And in 2011, a former software engineer in Switzerland formed the Anti PowerPoint Party, a (sort of) real political party devoted to fighting slide-deck tyranny.

[Read: Is Google making us stupid?]

Tufte’s essay has faced its share of criticism too. Some accused him of having engineered a controversy in order to juice his course attendance. Others said he’d erred by mixing up the software with the habits of its users. “Any general opposition to PowerPoint is just dumb,” the Harvard psychologist Steven Pinker told The Wall Street Journal in 2009. “It’s like denouncing lectures—before there were awful PowerPoint presentations, there were awful scripted lectures, unscripted lectures, slide shows, chalk talks, and so on.” Gene Zelazny, the longtime director of business visual presentations at McKinsey, summed up Tufte’s argument as “blaming cars for the accidents that drivers cause.”

The problem with this comparison is that our transportation system does bear some responsibility for the 30,000 to 40,000 car-crash deaths that occur in the U.S. every year, because it puts drivers in the position to cause accidents. PowerPoint, Tufte told me, has an analogous effect by actively facilitating bad presentations. “It’s convenient for the presenter,” he said, “and it’s inconvenient and harmful to the audience and to the content.”

But if all of those bad presentations really led to broad societal ills, the proof is hard to find. Some scientists have tried to take a formal measure of the alleged PowerPoint Effect, asking whether the software really influences our ability to process information. Sebastian Kernbach, a professor of creativity and design at the University of St. Gallen, in Switzerland, has co-authored multiple reviews synthesizing this literature. On the whole, he told me, the research suggests that Tufte was partly right, partly wrong. PowerPoint doesn’t seem to make us stupid—there is no evidence of lower information retention or generalized cognitive decline, for example, among those who use it—but it does impose a set of assumptions about how information ought to be conveyed: loosely, in bullet points, and delivered by presenters to an audience of passive listeners. These assumptions have even reshaped the physical environment for the slide-deck age, Kernbach said: Seminar tables, once configured in a circle, have been bent, post-PowerPoint, into a U-shape to accommodate presenters.

When I spoke with Kernbach, he was preparing for a talk on different methods of visual thinking to a group of employees at a large governmental organization. He said he planned to use a flip chart, draw on blank slides like a white board, and perhaps even have audience members do some drawing of their own. But he was also gearing up to use regular old PowerPoint slides. Doing so, he told me, would “signal preparation and professionalism” for his audience. The organization was NASA.

The fact that the American space agency still uses PowerPoint should not be surprising. Despite the backlash it inspired in the press, and the bile that it raised in billionaires, and the red alert it caused within the military, the corporate-presentation juggernaut rolls on. The program has more monthly users than ever before, according to Shawn Villaron, Microsoft’s vice president of product for PowerPoint—well into the hundreds of millions. If anything, its use cases have proliferated. During lockdown, people threw PowerPoint parties on Zoom. Kids now make PowerPoint presentations for their parents when they want to get a puppy or quit soccer or attend a Niall Horan meet and greet. If PowerPoint is evil, then evil rules the world.

On its face at least, the idea that PowerPoint makes us stupid looks like a textbook case of misguided technological doomsaying. When I asked Tufte to revisit his critique, he demurred, but later in our conversation I pressed him on the matter more directly: Was it possible that his own critique of a new technology had missed the target, just as so many others had in the past? Were the worries over PowerPoint any different from those about the printing press or word processors or—

He cut in before I could finish the thought. The question, he said with evident exasperation, was impossible to answer. “I don’t do big think, big bullshit,” he told me. “I'm down there in the trenches, right in the act of communication.” By which he meant, I think, that he doesn’t engage in any kind of remotely abstract historical thinking.

I tried narrowing the question. Today’s concerns about social media bear a certain resemblance to the PowerPoint critique, I said. Both boil down to a worry that new media technologies value form over substance, that they are designed to hold our attention rather than to convey truth, and that they make us stupid. Could it be—was there any chance at all—that Tufte had made the right critique, but of the wrong technology? He wasn’t having it. The comparison between PowerPoint and social media, he said, is “hand-waving and bullshit and opportunism.”

[Read: Yes, social media really is undermining democracy]

This dismissal notwithstanding, it’s tempting to entertain counterfactuals and wonder how things might have played out if Tufte and the rest of us had worried about social media back in 2003 instead of presentation software. Perhaps a timely pamphlet on The Cognitive Style of Friendster or a Wired headline asserting that “LinkedIn Is Evil” would have changed the course of history. If the social-media backlash of the past few years had been present from the start, maybe Facebook would never have grown into the behemoth it is now, and the country would never have become so hopelessly divided.

Or it could be that nothing whatsoever would have changed. No matter what their timing, and regardless of their aptness, concerns about new media rarely seem to make a difference. Objections get steamrolled. The new technology takes over. And years later, when we look back and think, How strange that we were so perturbed, the effects of that technology may well be invisible.

Did the written word decimate our memory? Did radio shrink our attention span? Did PowerPoint turn us into corporate bureaucrats? If these innovations really did change the way we think, then we’re measuring their effects with an altered mind. Either the critiques were wrong, or they were so right that we can no longer tell the difference.

The Song That Made Tony Bennett a Star

The Atlantic

www.theatlantic.com › culture › archive › 2023 › 07 › tony-bennett-because-of-you › 674801

In recent days, Tony Bennett—who died Friday at the age of 96—reportedly sang one last song while sitting at his piano. It was “Because of You,” his first hit, released in 1951, and the single that propelled him to more than seven decades of fame, fortune, and legend. But it was always more than a stepping stone. Where many artists downplay their early work, Bennett kept “Because of You” close to his heart. There is much to remember Bennett for, from his civil-rights activism to his stewardship of classic American pop songs. Without “Because of You,” none of it might have happened.

When Bennett first recorded the song, he was a 24-year-old kid from Queens whose slim discography had yielded little success. He had fought in World War II, participating in the liberation of Nazi concentration camps. As part of the postwar occupying force, he sang in Army bands. His career began in earnest at Columbia Records (then the home of Frank Sinatra), but almost stalled out before it began. He was on the verge of being dropped by the label when, in 1951, the orchestra leader Percy Faith randomly picked “Because of You” out of a pile of sheet music for Bennett to record.

“Because of You” has an interesting provenance. It was co-written by a Hammerstein—but not Oscar, the lyricist who famously collaborated with Richard Rodgers. Instead, it was written by Oscar’s far less notable uncle, Arthur, along with his creative partner Dudley Wilkinson. At first, the song went nowhere, Hammerstein brand notwithstanding. But Faith’s chance selection changed all that. His advice to Bennett: “Just relax. Use your natural voice and sing the song.” Better counsel was never given. Prior to that, Bennett, by his own admission, had been unsuccessfully trying out an overwrought style. “Then,” he said, “we decided I would just sing honestly and sincerely.”

[Photos: Remembering Tony Bennett]

Bennett brought an eventful young life—childhood hardship, the horrors of war—to bear on the heartbroken lull of “Because of You.” Faith’s orchestra curls around Bennett’s trembling voice and weightless cadence, informed by the singer’s lifelong adoration of jazz. There’s gravity to it, though. When he sings, “And I can smile because of you,” the subtext is simple but crushing: The object of his love is the only thing that can keep his spirit from collapsing. But the strength of that love is enough.

In his book The B Side: The Death of Tin Pan Alley and the Rebirth of the Great American Song, Ben Yagoda calls Bennett “the most justly celebrated singer of standards”—yet “Because of You” was virtually unknown, as was Bennett, when he recorded it. That obscurity didn’t last long. The single sold more than a million copies. It floated for weeks from sandwich-shop jukeboxes and sitting-room radios across the country. Because of You became the title of Bennett’s debut album; it set the stage for his rise, and for the resurgence of a mature pop style whose appeal transcended teenybopper fads and reached a more world-weary audience.

Like many of his contemporaries, including his friend Sinatra, Bennett grudgingly capitulated to the commercial pressures of rock music—but only briefly. As he remarked in his memoir The Good Life, “I thought the world was losing its mind” when rock started conquering the pop charts in the ’60s. His two albums from 1970, Tony Sings the Great Hits of Today! and Tony Bennett’s “Something,” were bogged down by half-hearted interpretations of the Beatles and Stevie Wonder. From that low point, he decided to double down on his passion for jazz, even as the jazz world itself was pivoting away from dreamy crooners during the fusion-heavy ’70s. Likewise, he never gave up on “Because of You.” It remained a staple of his live sets and a fan favorite, a tender reminder of the delicate power and ageless warmth he possessed even in his youth.

Later decades grew kinder to Bennett; by the turn of the century, the world had fully embraced him once again. “Because of You” was, in part, responsible for the revival of his popularity; his rendition of this song with k. d. lang helped make his 2006 album, Duets: An American Classic, a platinum-selling triumph. It also paved the way for the final chapter of Bennett’s decorated career, in which he sang with younger artists such as Amy Winehouse and Lady Gaga, ensuring his resonance for generations to come.

Bennett might not have realized that “Because of You” would be his swan song, but it couldn’t have served as a better bookend. If by some twist of fate it had been his only hit, it would still echo with ache. As history would have it, though, the song was both the opening of and the epitaph to his career. “Because of You” made Bennett a star—and largely because of him, popular music retains a body of song whose romance will forever make us swoon.

Obsessed With the Life That Could Have Been

The Atlantic

www.theatlantic.com › books › archive › 2023 › 07 › august-blue-deborah-levy-novel › 674740

In the early days of the pandemic, it became harder for us to see one another. The human face, the ultimate marker of individuality, what the philosopher Emmanuel Levinas called “the first disclosure,” was suddenly sheathed in fabric. Strangers encountered on the street were even stranger—and the masks that covered their visage became a screen on which to project anxious thoughts.

In August Blue, the South African–born, North London–based novelist Deborah Levy’s latest, a concert pianist named Elsa Anderson glimpses a woman in a blue hospital mask at a flea market in Athens buying a kitschy bauble—a pair of toy mechanical horses—which, inexplicably, she badly wants for herself too. Unable to fully view the woman’s face, Elsa comes to believe she is actually seeing in the mysterious, attractive stranger some version of herself, or rather, a doppelgänger of sorts. “My startling thought at the moment was that she and I were the same person.”

Levy’s readers would be surprised if she didn’t set a novel in the aftermath of the Great Lockdown of 2020, when “everyone looked dazed and battered,” even as the worst of the pandemic had passed. She has always used the defining events of recent times—the collapse of the Berlin Wall, the financial crisis, Brexit—as the soundtrack for her stories. The sense of displacement and unease that comes with living amid historic disruption is what gives her books an edge of menace and suggests an ambition belied by their relative brevity.

The quintessential Levy subject is a member of the intelligentsia: a historian studying male tyrants, a poet, a doctoral student in anthropology. These characters are 21st-century Herzogs, who can’t help but channel their neuroses through the prism of their intellectual fixations. In Hot Milk, the anthropologist’s relationship with her mother is a kinship structure endlessly turned over. In The Man Who Saw Everything, the historian notes that Stalin would flirt with women by throwing bread at them—a habit of hurling carbs that we learn his own tyrannical father shares. These academic overlays are one of the playful pleasures of her books.

Elsa fits the Levy archetype. She is a prodigy, plucked from foster parents at the age of 6 so that a great teacher, Arthur Goldstein, can raise her to become a virtuoso. He is Elsa’s gay, pompously pedantic Henry Higgins, who trains her to detach her mind from the commonplace so that she can master the classical repertoire.

But when the book begins, and Elsa is rummaging through the market in Athens, she has just humiliatingly stumbled from the path to greatness that Goldstein plotted. While performing Rachmaninoff’s Piano Concerto No. 2 at the Golden Hall in Vienna, she begins subconsciously playing her own dissonant notes, which she eventually realizes is an assertion of her own creative impulses, and then walks off the stage, as the maestro disdainfully mocks her.  

[Read: Deborah Levy’s disorienting, captivating fiction]

An identity crisis—which begins just before the concert, when Elsa dyes her hair blue, an event that she describes as a severing of relations with the birth mother who abandoned her—swells to become debilitating. Like so many of Levy’s other protagonists, she finds herself bopping across locales in Europe’s touristed southern reaches, on meandering sun-drenched odysseys in search of healing.

Levy’s novels have an undeniable—and undeniably winning—eccentricity. The introduction of a doppelgänger is a typically atypical move. Levy doesn’t exactly practice magical realism; her books are too tethered to the practicalities of life to ever be described that way. But her plots turn on weird moments and comical misunderstandings—if not magic, exactly, then serendipity seems to infuse her fictional worlds. Small details are inflated with symbolic significance; words and phrases repeat with murky purpose.

But the presence of Levy’s double is one of the most overtly intentional ideas in her fiction. It is central to her feminism, the political commitment that subtly permeates her novels and less subtly shapes her nonfiction. And it’s a device that she has used to make sense of her own life’s struggles. The doppelgänger doesn’t just stalk Levy’s protagonist; it stalks the entirety of her work.  

In the United States, much of the popular affection for Levy rests on a superb trilogy of memoirs—what she called a “living autobiography”—which includes a slim volume, Things I Don’t Want to Know. The book posed as a feminist response to George Orwell’s famous essay “Why I Write.” Levy adopted what Orwell called his “four great motives for writing” and used them for her chapter titles, even if the substance of her argument was elliptical in a way Orwell’s was not. Casting Orwell as her foil wasn’t a gesture of aggressive iconoclasm. Rather, she exploited the template to explain herself, showing how the impulses propelling the female author were far different from those that moved Orwell.

To attach one’s memoir to Orwell might seem a touch brash, given that the essayist’s biography is the romantic definition of the independent writing life, with its shunning of material glories in the stubborn pursuit of righteous causes. But there’s a parallel that doesn’t feel strained: Levy also suffered critical neglect for much of her career. In her early 50s, she couldn’t find a major publisher for her novel Swimming Home, so she released it with a small nonprofit press, supported by the British government and reader subscriptions.

Swimming Home was her first novel in 15 years and the sort of midlife success that rarely happens. It caught critics by surprise and gave her the first of three consecutive turns as a Booker Prize finalist. That Levy’s flourishing came belatedly is not terribly surprising, given the story contained in her autobiography—a series of personal crises and one long search-and-rescue mission for her authentic self.  

When she was 5, a special branch of the South African police grabbed her father, an academic and activist, from the family bungalow in the middle of the night. He eventually stood trial alongside Nelson Mandela, his comrade in the African National Congress. During her father’s years in prison, Levy’s mother shipped her off to her godmother in Durban, where she attended a Catholic school and lived under the roof of a draconian patriarch.

When the apartheid government released her father—she was 9—the family sought sanctuary in the U.K. But exile exacerbated a growing sense of alienation, as she tried to assimilate into the dreary existence of 1970s England. Levy coped with the dislocation by reinventing herself as a teenage bohemian, even as she was sitting in working-class greasy spoons that didn’t have the faintest touch of Parisian sophistication. “I was a sad girl impersonating a sad girl,” she recalls.

Her sense of alienation tailed her into adulthood—when motherhood meant that she tended to her family at the expense of her own happiness and artistic fulfillment: “To not feel at home in her family home is the beginning of the bigger story of society and its female discontents.”

That Levy would ultimately fix on the idea of a doppelgänger is an understandable response to her personal history of tumult and her nagging sense of inauthenticity. To swerve from the expected course so often is to become inevitably fascinated by what Philip Roth once described as the “counterlife,” the alternative version of existence, where what-ifs are fully rendered in the imagination. In her memoirs, she imagines encountering her own double—her young émigré self visiting her in middle age, after her divorce, when she is sitting in her North London apartment block watching The Great British Bake Off.

The idea of the doppelgänger cuts to the essence of her feminism: Mothers are haunted by the life they had before children and by the concessions they have made to family. Liberation is the recovery of that alternative self, uninhibited by social strictures. It is “learning how to be a subject rather than a delusion.”

What’s thrilling about Levy’s novels is that they are alive with this relentless spirit of questing. Copying Orwell’s essay format is emblematic of her impish experimentalism. Her best novels take structural risks. The Man Who Saw Everything is divided into two parts, separated by 28 years, each repeating the same unlikely moment, when the book’s central subject steps into the crosswalk of Abbey Road and then gets knocked down by a car driven by a German man. It’s a nod to a famed image and a clever conceit. The reprise of the accident allows her to revisit events in the first half of the book. With the benefit of time, the narrator’s narcissistic misreading of his relationships is exposed.

[Read: Writing in the ruins]

Levy’s collected work is like a Freudian universe of symbols and phrases, which recur within her books—and across her books. She describes someone misquoting the famous line from The Communist Manifesto about a specter haunting Europe—and then the word specter begins to haunt the novel itself, provoking the reader every time it turns up, forcing deeper consideration of its meaning.

Certain questions she poses, using the same phrasing, appear verbatim in different books. In one of her volumes of memoir, she asks, “What do we do with the things that we don’t want to know?” The question inspired that book’s title—and it appears again in August Blue. A question that would obsess someone tormented by their counterlife.

August Blue has its share of invention, though less than Levy’s recent books. In the end, Elsa sits with Goldstein, her teacher and surrogate father, as he lies dying in a small house in Sardinia. He is a bit of a bully, but he is also the only source of affection in her life, however contingent that affection might be on her artistic success.

She has certainly lived the life that Goldstein selected for her. He changed her name—from Ann to Elsa—and charted her career, in part, to validate his own genius as a teacher. Only in the dark shadow of his impending death does Elsa set aside her fears and resentments to learn the identity of her birth parents. This is, in the end, an archetypal plot we’ve seen over and over at the multiplex: an adopted child confronting her terrifying longing for self-knowledge.

What’s more, Levy’s feminist critique of the classical-music world is uncharacteristically lumpy. She overworks the theme of a woman forced to master the scores of male geniuses while suppressing her own creativity. Elsa spends her free time watching YouTube videos of the dancer Isadora Duncan, envying her artistic freedom, a preoccupation that is a bit too crudely deployed as the liberatory counterpoint to Elsa’s sense of being shackled to the repertoire.

Yet even in this less fully realized novel—her best are The Man Who Saw Everything and Hot Milk—Levy showcases her idiosyncratic mind. If the ultimate aim of feminism, as she preaches it, is to reclaim individuality, to banish the haunting specter of a more fulfilled, more authentic version of one’s self, her prose models this idea. Her language is beautifully her own: She describes the entertaining of suicidal thoughts as “standing on the forbidden pasture”; she calls Elsa’s dyed mane “very expressed hair.” Her imagery is pungently original. She shows us Elsa’s capacity for cruelty by having her unflinchingly stab a sea urchin with a fork while on a diving trip.

Levy’s subjects are credible intellectuals, because she is too. When she casually inserts a riff about Nietzsche’s failed musical experiments into dialogue, it is organic and interesting. Her reading of Freud is never far beneath the surface of her prose—and it’s almost a Freudian joke that she repeats the Freudian phrase “things we don’t want to know” so often. As an observer, she’s able to conjure the historic moment that has just passed, describing the ennui of the pandemic with disturbing precision, capturing the awkwardness of everyday human interactions in the aftermath of quarantine.

Because of her feminism—and her eccentricity—Levy tends to be squeezed into niches by critics in a way that fails to capture the ambition of her books. The Financial Times recently dubbed her “a cult novelist.” But this feels stingy. Instead we should call her what she is: one of the most lively, most gratifying novelists of ideas at work today.

Goodbye to the Prophets of Doom

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 07 › economics-inequality-piketty-milanovic › 674702

During the Great Recession, public discourse about the economy underwent something of a Great Disappointment.

For much of the country’s history, most Americans assumed that the future would bring them or their descendants greater affluence. Despite periodic economic crises, the overall story seemed to be one of progress for every stratum of the population. Those expectations were largely borne out: The standard of living enjoyed by working-class Americans for much of the mid-20th century, for example, was far superior to that enjoyed by affluent Americans a generation or two earlier.

But after the 2008 financial crisis, those assumptions were upended by a period of intense economic suffering coupled with a newfound interest among economists in the topic of inequality. Predictions of economic decline took over the conversation. America, a country long known for its inveterate optimism, came to dread the future—in which it now appeared that most people would have less and less.

[Adam Ozimek: The simple mistake that almost triggered a recession]

Three arguments provided the intellectual foundation for the Great Disappointment. The first, influentially advanced by the MIT economist David Autor, was that the wages of most Americans were stagnating for the first time in living memory. Although the income of average Americans had roughly doubled once every generation for most of the previous century, wage growth for much of the population began to flatline in the 1980s. By 2010, it looked as though poorer Americans faced a future in which they could no longer expect any real improvement in their standard of living.

The second argument had to do with globalization’s impact on the worldwide distribution of income. In a graph that came to be known as the “elephant curve,” the Serbian American economist Branko Milanović argued that the world’s poorest people were experiencing only minor income growth; that the middle percentiles were benefiting mightily from globalization; that those in the upper-middle segment—which included many industrial workers and people in the service industry in rich countries, including America—had seen their incomes stagnate; and that the very richest were making out like bandits. Globalization, it seemed, was a mixed blessing, and a distinctly concerning one for the bottom half of wage earners in industrialized economies such as the United States.

[Chart: Branko Milanović’s “elephant curve”]
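For readers curious how such a chart is assembled, the sketch below is a purely illustrative, synthetic-data stand-in rather than Milanović’s actual household-survey analysis: for each global income percentile, it plots the cumulative real income growth between two survey years, which is the quantity the elephant curve places on its vertical axis.

```python
# Illustrative sketch of a growth-incidence ("elephant") curve.
# The incomes below are synthetic stand-ins, NOT Milanović's survey data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Two hypothetical cross-sections of global per-person income (constant dollars).
incomes_1988 = rng.lognormal(mean=7.5, sigma=1.2, size=100_000)
incomes_2008 = rng.lognormal(mean=7.8, sigma=1.1, size=100_000)

# Income level at each global percentile in each year.
percentiles = np.arange(1, 100)
p1988 = np.percentile(incomes_1988, percentiles)
p2008 = np.percentile(incomes_2008, percentiles)

# Cumulative real income growth at each percentile -- the curve's vertical axis.
growth = (p2008 / p1988 - 1) * 100

plt.plot(percentiles, growth)
plt.xlabel("Global income percentile")
plt.ylabel("Income growth, 1988-2008 (%)")
plt.title("Growth-incidence curve (synthetic data)")
plt.show()
```

With made-up lognormal incomes the resulting line is smooth; the distinctive “elephant” shape, with its hump in the global middle and upturned trunk at the very top, emerges only in the real survey data Milanović used.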

The final, and most sweeping, argument was about the nature and causes of inequality. Even as much of the population was just holding its own in prosperity, the wealth and income of the richest Americans were rising rapidly. In his 2013 surprise best seller, Capital in the Twenty-First Century, the French economist Thomas Piketty proposed that this trend was likely to continue. Arguing that the returns on capital had long outstripped those of labor, Piketty seemed to suggest that only a calamitous event such as a major war—or a radical political transformation, which did not appear to be on the horizon—could help tame the trend toward ever-greater inequality.
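Piketty’s own formulation in Capital in the Twenty-First Century compares the rate of return on capital, r, with the economy’s overall growth rate, g. What follows is a minimal sketch of that logic in my gloss, not a formula quoted from the article:

```latex
% A compressed gloss of Piketty's argument (my framing, not the article's wording).
% First "fundamental law of capitalism": the capital share of national income,
% alpha, equals the average return on capital, r, times the capital/income ratio, beta.
\[ \alpha = r \,\beta \]
% Second "fundamental law": in the long run, beta tends toward the saving rate s
% divided by the growth rate g.
\[ \beta \longrightarrow \frac{s}{g} \]
% The warning is usually compressed into the inequality
\[ r > g \]
% under which wealth accumulated in the past compounds faster than output and
% wages grow, so beta -- and with it alpha -- drifts upward absent a war-scale
% shock or a deliberate political correction.
```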

The Great Disappointment continues to shape the way many Americans think about the current and future state of the economy. But as the pandemic and the rise of inflation have altered the world economy, the intellectual basis for the thesis has begun to wobble. The reasons for economic pessimism have started to look less convincing than they once were. Is it time to revise the core tenets of the Great Disappointment?

One of the most prominent labor economists in the U.S., Autor has over the past decade provided much of the evidence regarding the stagnation of American workers’ incomes, especially for those without a college degree.

The U.S. economy, Autor wrote in a highly influential paper in 2010, is bifurcating. Even as demand for high-skilled workers rose, demand for “middle-wage, middle-skill white-collar and blue-collar jobs” was contracting. America’s economy, which had once provided plenty of middle-class jobs, was splitting into a highly affluent professional stratum and a large remainder that was becoming more immiserated. The overall outcome, according to Autor, was “falling real earnings for noncollege workers” and “a sharp rise in the inequality of wages.”

Autor’s past work on the falling wages of a major segment of the American workforce makes it all the more notable that he now sounds far more optimistic. Because companies were desperately searching for workers at the tail-end of the pandemic, Autor argues in a working paper published earlier this year, low-wage workers found themselves in a much better bargaining position. There has been a remarkable reversal in economic fortunes.

“Disproportionate wage growth at the bottom of the distribution reduced the college wage premium and reversed the rise in aggregate wage inequality since 1980 by approximately one quarter,” Autor writes. The big winners of recent economic trends are precisely those groups that had been left out in preceding decades: “The rise in wages was particularly strong among workers under 40 years of age and without a college degree.”

Even after accounting for inflation, Autor shows, the bottom quarter of American workers has seen a significant boost in income for the first time in years. The scholar who previously wrote about the “polarization” in the U.S. workforce now concludes that the American economy is experiencing an “unexpected compression.” In other words, the wage gap is narrowing with surprising speed.

Autor is not the only leading economist who is calling into doubt the underpinnings of the Great Disappointment. According to Milanović, his “elephant curve” proved so influential in part because it confirmed fears many people had about the effects of globalization. His famous graph was, he now admits, an “empirical confirmation of what many thought.” He is no longer so sure about that piece of conventional wisdom.

A few years ago, Milanović set out to update the original elephant curve, which was based on data from 1988 to 2008. The result came as a shock—a positive one. Once Milanović included data for another decade, to 2018, the curve changed shape. Instead of the characteristic “rise, fall, rise again” that had given the curve its viral name, its steadily falling gradient now seemed to paint a straightforward and much more optimistic picture. Over the three decades he now surveyed, the incomes of the poorest people in the world rose very fast, those of people toward the middle of the distribution fairly fast, and those of the richest rather sluggishly. Global economic conditions were improving for nearly everyone, and, contrary to conventional wisdom, it was the most needy, not the most affluent, who were reaping the greatest rewards.

[Chart: Milanović’s revised curve]

In a recent article for Foreign Affairs, Milanović goes even further. “We’re frequently told,” he writes, that “we live in an age of inequality.” But when you look at the most recent global data, that turns out to be false: In fact, “the world is growing more equal than it has been for over 100 years.”

To this day, Piketty remains the patron saint of the Great Disappointment. No thinker is invoked more often to justify the theory. But even Piketty’s pessimistic diagnosis, made a decade ago, has come to look much less dire.

In part, this is because Piketty’s work has come in for criticism from other economists. According to one influential line of argument, Piketty mistook why returns on capital were higher than returns to labor in many industrialized countries in the decades after World War II. Absent concerted pressure to prevent this, Piketty had argued, the nature of capitalism would always favor billionaires and giant corporations over ordinary workers. But according to Matthew Rognlie, an economist at Northwestern University, Piketty’s explanation for why inequality increased during that period was based on a misinterpretation of the data.

The outsize returns on capital during the latter half of the 20th century, Rognlie argues, were mainly due to the huge growth in house prices in metropolitan centers such as Paris and New York. If returns on capital were larger than returns to labor over this period, the reason was not a general economic trend but specific political factors, such as restrictive building codes. In addition, the main beneficiaries were not the billionaires and big corporations on which Piketty focused; rather, they were the kinds of upper-middle-class professionals who own the bulk of housing stock in major cities.

Economists continue to debate whether such criticisms hit the mark. But even as Piketty defended his work, he himself started to strike a more optimistic note about the long-term structure of the economy. In his 2022 book, A Brief History of Equality, he talks about the rise of inequality as an anomaly. “At least since the end of the eighteenth century there has been a historical movement towards equality,” he writes. “The world of the 2020s, no matter how unjust it may seem, is more egalitarian than that of 1950 or that of 1900, which were themselves in many respects more egalitarian than those of 1850 or 1780.”

Like Autor and Milanović, Piketty seems to have concluded that the thesis of the Great Disappointment was, in key respects, wrong.

It would be premature to put worries about stagnating incomes or rising inequality to rest. The three former prophets of doom all emphasize the role that social and political factors play in shaping economic outcomes. As a result, they see recent wage growth for poorer Americans as caused in part by the expansionary economic policies that both Donald Trump and Joe Biden pursued in response to the pandemic.

Similarly, the huge gains that some of the poorest people in the world have made in recent decades derive in part from their governments’ efforts to use industrial policy to shape the impact of globalization on their countries. Whether, as Piketty has argued, the returns on capital will in the long run outstrip the returns to labor depends on political decisions about taxation and redistribution, about the strength of trade unions and the rules governing labor markets.

[Oren Cass: The labor-shortage myth]

Recent good news about our economic prospects should not lead us to conclusions that could quickly turn out to be overexuberant. But we should also avoid perpetuating an instinctive pessimism that looks less and less warranted. Although pessimism may seem smart or shrewd, cynicism about our collective ability to build a better world only makes it harder to win support for the kind of economic policies we need to create that future.

Progressives sometimes seem to believe that they can mobilize people by making the future look scary. But when voters feel threatened, it is usually unscrupulous reactionaries, peddling unrealistic promises and scapegoating outsiders, who benefit. Wage stagnation and rising inequality are still real dangers about which we must remain vigilant—but the fact that a better economic future has come to look a good deal more achievable should be cause for full-throated celebration.