Will We Remember Succession or Ted Lasso More?

The Atlantic

www.theatlantic.com › culture › archive › 2023 › 06 › succession-ted-lasso-series-finales › 674276

This article contains spoilers through the series finales of Succession and Ted Lasso.

Succession ended on Sunday with a series finale whose title, like the three season finales before it, was taken from a John Berryman poem, “Dream Song 29.” Before the episode aired, there was widespread speculation about whether the poem alluded to any particular revelation. Would Kendall, whose death felt like it had been foreshadowed so many times on the show—all those vacant gazes down at the city from up high and baptismal engulfments in water—die by suicide, as Berryman did? He wouldn’t, it turned out. Although Jeremy Strong apparently improvised, while in character, a version of the last scene in which Kendall tries to hurl himself into the Hudson and is thwarted by his bodyguard, in the aired version, Kendall survives, albeit as a broken version of himself. You can decide for yourself whether the poem alludes to Kendall’s guilt over the covered-up death of a waiter (in a lake, no less), or to the revelation that his father believed he may have killed his own sister, or to both.

Berryman, though, could also bring to mind the other influential TV show that ended this week, Ted Lasso. In the Apple show’s second season, Ted reveals to his therapist that his father died by suicide, a devastating twist for a series that debuted as an ebullient cross-cultural sports comedy. In “Dream Song 145,” a different poem in the same series, Berryman recalls how his own father, “very early in the morning, / rose with his gun and went outdoors by my window / and did what was needed.” That moment, Berryman writes in another poem, “wiped out my childhood.” He was raised in Oklahoma; Ted comes from Kansas. Ted confesses to his therapist that he’s furious, still, that his father “quit on his family”; Berryman, in rage, recounts spitting on his father’s grave. Berryman’s alter ego is named Henry, like Ted’s son, whose presence motivates the series’s occupying question: Can people change? Can we be better than our parents? Or are we fated to keep passing the poison to the next generation?

This is, you’ll note, Succession’s central concern, too, answered in definitive fashion in the finale, when a pregnant Shiv potentially elects to preserve cursed power—and pain—for her future child while selling out her brothers. The show’s worldview is that people simply can’t escape their coding, an existential framework that, to my mind at least, made the last two seasons deeply claustrophobic, with everyone doomed to repeat the same steps of the same game with the same players. Succession is a clever show—so clever. It’s profane and caustic and, in the right moments, so tense you could gnaw your thumbs off. It’s exquisitely written in a way that makes the lumpen final season of Ted Lasso feel like something conceived of and drafted by an obliging AI. The two shows are functionally antithetical in style. (Call it the “snark versus smarm” debate transposed for 2020s television.) Succession is a study of American capitalism largely written by Brits; Ted Lasso is a riff on British culture mostly written by Americans. Both shows somehow feature Harriet Walter playing a mother who fails her children. Both have, at some point, incorporated lines from Philip Larkin’s poem “This Be the Verse,” which Succession’s creator, Jesse Armstrong, also paraphrased in an article for The Guardian on the origins of the HBO series.

[Read: The Succession plot point that explained the whole series]

If we’re talking about cultural impact, we have to acknowledge that Ted Lasso, at this point, is about as hip as Nickelback, whereas Succession has made its way into our idioms, our jokes, even our structural understanding of money and power. Here is my confession, though: As much as I admired Succession’s finale aesthetically and intellectually, I hated the experience of watching it. There was something deeply unsettling about its nihilism, its acceptance of art’s futility in the face of commerce. (“Drama can change minds,” Willa loftily tells Connor in Season 1, trying to argue that her play could be as significant as his campaign for president. “Yeah, but not really,” Connor replies.) It’s striking that a show with so many spectacular individual components—the writing, the god-level acting, the superlative direction, the production values—adds up in the end to something so hollow: a series about people who are nothing at all, stuck in hell, fated only to contaminate the rest of us.

Early in the coronavirus pandemic, I interviewed a professor of media psychology about the different ways people seek comfort in television—why some of us were kicking back with anodyne, escapist fare such as The Great British Baking Show while others of us were perversely streaming Outbreak and Contagion. Depending on how your brain processes hardship, it turns out, you’re either a person who wants to be uplifted with feel-good entertainment or a person who wants to be reminded that no matter how bad things get, they could still get worse.

It’s this theory, I think, that explains why both Succession and Ted Lasso found eager audiences over the past few years: There were viewers who wanted to be smothered in shortbread and viewers who found that they had new tolerance for the viperous Roys and their obscene infighting. “We wrote the first season in the belief that nobody would watch the show,” Georgia Pritchett, one of Succession’s executive producers, wrote this week. “And nobody did, really. Or the second season. It took a global pandemic, and the world’s population sitting at home wondering what they could do, for people to really start paying attention.” Back in 2018, I remember imploring people to watch the first season, cajoling them to stick with it through the sixth episode, when Kendall is absurdly thwarted by traffic in his first bid to topple his father, so they could get to the seventh, when the corporate tussling is finally traded out for some breathtaking emotional violence.

As inaccessible as Succession could be at first, Ted Lasso was the opposite: Almost embarrassingly eager to please, it wielded sincerity like a cudgel, and employed so many pop-cultural tropes that it could work as a matter of muscle memory alone. Ted was akin to a midwestern Mary Poppins, blowing in as the wind changed to help people connect with one another and be the best version of themselves. In a different climate, it might have sunk without a trace. But in late 2020, to audiences ground down by a surfeit of real-world suffering at a time of scant optimism or inspirational leadership, it stuck.

[Read: The new comedy of American decline]

Both shows are structured around central poles: If Logan Roy is the rapacious cancer spreading to everything and everyone he touches, Ted Lasso is the inversion, an egoless Wichita prophet who couldn’t care less about winning, and whose only governing principle is his belief in our capacity to be better. Both shows are consumed with fathers and sons, and the psychological morass that accompanies having an absent or abusive parent. Both are set in a world where masculinity has calcified into systems that make everyone miserable, and where victory is fleeting: You win, or you lose, and you carry on playing. (In an interview with Vulture, Mark Mylod, one of Succession’s main directors, even compared the show’s static environment to soccer.) Ted Lasso delivered sermons—to a fault, in its third season—through feel-good breakthroughs and teachable moments; whatever Succession had to say, it kept to itself.

And yet: When Ted Lasso was good, it could be wonderful. I laugh-cried, in the penultimate episode, when Jamie confessed that he’d stopped conditioning his hair because “what’s the point”—a joke about peacocking professional soccer players that is also a fairly neat summation of depression. I was fascinated, in Season 2, by the idea that Ted’s relentless positivity, his exhausting superpower, might be more of a defense mechanism sparked by his father’s suicide than an innate character trait. (In the end, it was both.) So much of the third season was excruciating, marked by weak writing, sloppy structure, and dubious, unearned twists. But the finale, written by the show’s co-creators, returned somewhat to form in revisiting the show’s fundamental conceit. “Can people change?” asks Roy Kent in a spontaneous meeting of the show’s consciousness-raising group for befuddled men. The show concluded that they can, if they’re big enough and humble enough to know that they need to.

I’m still not sure whether Succession was an inherently cynical show or, to use Tom Scocca’s summation of snark, a theory of cynicism. One of the reasons I loved the scenes of emotional annihilation that tended to anchor the end of each of its seasons was that they promised a catharsis that never came. If you’ve been to the theater, you’ve seen the sort of play in which secrets are laid bare and characters stripped of their defenses so that they can begin, afterward, to heal. Succession offered up the wastage but never the forward momentum. In an interview with Vanity Fair, Jeremy Strong summarized how Armstrong understands humanity, “which is that fundamentally, people don’t really change. They don’t do the spectacular, dramatic thing. Instead, there’s kind of a doom loop that we’re all stuck in.” I’ll never not be formally impressed by Succession. But it’s equally hard not to feel queasy, after everything, about its conception of mankind, in which we’re all so condemned by fate that there’s no point even trying to imagine our way out.

American women are leading the post-pandemic US jobs recovery

Quartz

qz.com › american-women-are-leading-the-post-pandemic-us-jobs-re-1850500104

The percentage of American women in their prime working years employed or looking for work reached an all-time high in May, while prime-age men are not yet participating in the labor force at their pre-pandemic levels, according to new data from the US Labor Department.


A Tragically American Approach to the Child-Care Crisis

The Atlantic

www.theatlantic.com › family › archive › 2023 › 06 › child-care-united-states-employer-based › 674269

For a brief moment, it looked like America could get a real child-care system—one that wasn’t defined by lengthy waitlists, sky-high fees, and crossed-fingers quality. When the House of Representatives passed the Build Back Better Act in 2021, it included $400 billion in funding, part of which would have paid programs enough to boost providers’ wages, in turn increasing the supply of available slots. The act also would have capped all but the wealthiest families’ child-care bills at 7 percent of their income. This overhaul would have put child care squarely in the same category as Social Security, Medicare, and other guaranteed supports: It would have, in other words, become a right. Since Joe Manchin and 50 Republican senators killed the bill, however, many policy makers have started following a tired old playbook: If at first you fail to make something a universal right, try making it an employee benefit.

The instinct to make for any policy port in a storm is understandable, and the American child-care system is stuck in a years-long hurricane. At its core is a financial paradox. Child-care providers have very high fixed costs due to the need for low child-to-adult ratios, so they can’t pay their staff well without significantly increasing parent fees (many child-care workers make less than parking attendants). In other words, child care simultaneously is too expensive for parents and brings in too little revenue for programs to operate sustainably. In fact, the industry is still down more than 50,000 employees from pre-pandemic levels. Centers have shut down for want of staff, long waitlists have stretched to the point of absurdity, and the rising cost of care continues to exceed inflation.

The system desperately needs a large infusion of permanent public money so that programs can compensate educators well, parent fees can be slashed, and supply can rise to meet demand. As Annie Lowrey wrote last year, “The math does not work. It will never work. No other country makes it work without a major investment from government.”

[Read: The reason child care is so hard to afford]

In his public remarks and his proposed budget for the 2024 fiscal year, President Joe Biden is certainly insistent about the need for a permanent answer to child-care funding. Democratic Senators Elizabeth Warren and Patty Murray, along with their House counterparts, have each submitted a major child-care bill in recent months. Yet in the face of congressional gridlock, Democrats and Republicans alike are turning to employers as a salve.

At the federal level, the Biden administration is nudging companies to offer employees child-care assistance, embedding such encouragement in the semiconductor CHIPS Act and a recent executive order on care. Red states such as Oklahoma and Missouri have proposed—along with other actions, like tax credits for donors to child-care programs—sweetening the incentive pot for employer child-care benefits. States such as Michigan and Kentucky are piloting programs in which child-care costs can be split among the employer, the employee, and the government.

The problem is that these are quarter measures at best. Millions of gig workers who don’t receive benefits will be left out by default. And employer-sponsored benefits are unreliable because people may switch or lose their jobs—and because employers can simply change their minds. According to a recent Care.com survey of 500 companies, nearly one-third said they might cut child-care benefits if a recession takes hold. Even putting all of that aside, none of these programs can ever hope to reach more than the barest fraction of the millions of families who want and need care.

[Read: America’s child-care equilibrium has shattered]

For instance, two years after its inception, Michigan’s well-intentioned Tri-Share initiative reaches a grand total of 277 families. On-site child-care centers can quickly fill up and may not meet the needs or preferences of blue-collar workers who require care during nontraditional hours. Moreover, none of these initiatives significantly addresses providers’ wages, and opening new programs when you can’t even find staff for existing ones is a bridge to nowhere. A child-care system that relies on the employer-employee relationship is fundamentally flawed. There is a reason we do not offer public schooling as part of a benefits package.

That is not to say that employers should be ignored. Some parents benefit greatly from having child care located where they work. However, those programs do not have to be funded and run by the employer; in a publicly funded system, on-site centers can be one option among many. Similarly, employers can and should be asked to contribute to the child care their employees rely on, but through taxation instead of fringe benefits. Vermont is set to become the first state to substantially increase child-care funding with a small payroll tax, at least 75 percent of which will be paid by employers. The resulting funds will allow the state to make many more families eligible for child-care assistance and help providers raise their wages.

We’ve been at this crossroads before, with health care. During World War II, companies began offering health insurance as a perk. This was done to get around wage caps established in 1942 to prevent the economy from going haywire as companies competed for the suddenly shrunken labor force. Coming out of the war, President Harry Truman proposed a national health-insurance system akin to what would become the U.K.’s National Health Service. The plan failed under opposition not just from business interests but from several major labor unions that had become invested in the idea of employer-sponsored insurance—a decision whose effects the country still feels today.

Child care itself serves up a cautionary tale. In the late 1960s and early ’70s, a wide-ranging coalition of advocates and elected officials pushed for a universal, affordable, choice-based child-care system. Their efforts culminated in the Comprehensive Child Development Act of 1971, which would have created a nationally funded, locally run network of child-care sites. The legislation passed Congress on a bipartisan basis before President Richard Nixon vetoed it. Soon thereafter, the coalition splintered. The historian Anna Danziger Halperin has written that, “following this narrowing of political possibilities and shift of the policy landscape to the right, by the 1980s advocates … no longer pressured policymakers for universal approaches. Instead they focused on more limited provisions, like tax incentives for employers to provide child care.”

The logic behind leaning on employer-sponsored child care is simple: Something is better than nothing. Yet this is not always the case, in life or in public policy. In the middle of a hurricane, handing out umbrellas is a waste of time and energy. As America learned with health care, if we get used to a service being tied to employment, that idea can become entrenched and very hard to change. Today’s stopgap measures become tomorrow’s status quo. Marching down such a path will make it even harder to gain the momentum needed to build and fund a child-care system that works for everyone.

Part of the difficulty in gathering that momentum is the lack of a popular child-care proposal that captures the public imagination. Murray’s plan has the most support within the Democratic Party and formed the basis for the Build Back Better child-care provisions. Although transformational, the bill uses a complicated income-based sliding fee scale and a bureaucratic “activity test” whereby parents must prove they are engaged in work or school, or have a legitimate reason not to be. One would be hard-pressed to summarize either Murray’s or Warren’s plan in a sentence, much less a viral sound bite.

The time and energy spent promoting employee child-care benefits, then, would be better spent developing a simply communicated, comprehensive reform plan. To maximize its popularity, such a plan should help with the early years as well as after-school and summer care, and follow the lead of some Nordic countries with stipends for stay-at-home parents. The simplest, strongest plan to capture the public's attention could be to mimic the public-school system, and propose universal and free child care. Ideally, any plan would be tied into a suite of pro-family policies that includes paid family leave and a monthly allowance for helping with general child-rearing costs. There is significant political upside to getting this right: The child-care pain point is deep and broad, and fixing child care is an astoundingly popular policy area that could be put front and center in a campaign.

The miserable state of American child care is not a given. In the past 30 years, Germany, Canada, Ireland, and other peer nations with market-based child-care systems have undergone tremendous reforms. Canada aims to halve child-care fees nationwide, and some families have already seen their bills reduced by thousands of dollars. Within the U.S., in addition to Vermont’s recent victory, New Mexico is proposing to make child care free for most families while boosting educator wages. The common thread? Large amounts of permanent public money.

In the end, the country must decide what child care is: a right that every family deserves and that is worth investing in, or a luxury to be purchased by those with means and bestowed upon a lucky few at their employers’ whim.

A Labor Shortage Is a Great Problem to Have

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 06 › labor-shortage-low-unemployment-inflation › 674263

Today’s jobs report from the Bureau of Labor Statistics shows the unemployment rate continuing to hold close to its lowest level in 70 years, despite a slight uptick last month. This might seem like good news, but it has two groups of Americans deeply troubled. One is the business community, which counts on a surplus of available workers to keep wages down. The other, unfortunately, is mainstream economists—and the policy makers who listen to them.

Federal Reserve Chair Jerome Powell has linked low unemployment to high inflation, publicly discussing the need to restore “balance” to the labor market—meaning increase unemployment and suppress wage growth—to tame consumer prices. A director at the American Enterprise Institute, a corporate-friendly think tank, recently called for “a pretty big increase in the unemployment rate.” Republicans in several states have introduced legislation to loosen child-labor restrictions as a way to expand the labor supply.

The Biden administration, meanwhile, seems to agree that low unemployment poses a problem, and to see immigration as an answer. In December, Axios reported that Biden’s “top economic aides are concerned that the lack of immigrant workers is leading to labor shortages.” Last month, Secretary of Homeland Security Alejandro Mayorkas called for immigration reform on the grounds that “there are businesses around this country that are desperate for workers” and “desperate workers in foreign countries that are looking for jobs in the United States.” Apparently our own workers aren’t desperate enough.

[Derek Thompson: What does the shocking unemployment report really mean?]

To the average person, opposition to low unemployment and rising wages is deeply counterintuitive. But it has long been central to economic policy. As Glenn Hubbard, a Columbia University economist who chaired George W. Bush’s Council of Economic Advisers, has written, “Since the dawn of their discipline, economists have understood the goal of the economic system to be optimizing consumption—producing goods and services as cheaply as possible and getting them in the hands of individuals who want them to improve living conditions.” In this way of thinking, labor is just another commodity, like wood or oil, and Americans are best off when it is plentiful and cheap.

American public policy has largely managed to keep things that way. Over the past 50 years, as both parties supported the entry of millions of unskilled immigrants and the offshoring of entire industries, America’s per capita gross domestic product more than doubled after adjusting for inflation. Productivity of labor rose by a similar amount, and corporate profits per capita nearly tripled. Yet over the same time period, the average inflation-adjusted hourly earnings of the typical worker rose by less than 1 percent.

In the coronavirus pandemic’s aftermath, for the first time in a long time, many employers are discovering that they can’t fill jobs at the low wages they’re accustomed to offering. “We hear from businesses every day that the worker shortage is their top challenge,” Neil Bradley, chief policy officer at the U.S. Chamber of Commerce, said last May. This is the precise circumstance under which wages might finally rise. Instead, the business community is looking to government to get them out of a jam, and leaders on both sides of the aisle seem only too eager to help.

This is a grave mistake—politically, economically, and morally. If employers are struggling to find workers, they should offer better pay and conditions. If that comes at the expense of some profits, or requires some prices to rise, well, that’s how markets are supposed to work. In most other contexts, capitalism’s proponents celebrate how the market creates incentives for businesses to solve problems. In that respect, a labor shortage is a great problem to have. Only by challenging employers to improve job quality and boost productivity will we find out what the market’s awesome power can achieve for American workers and their families.

The notion of a “labor shortage” in a market economy presents something of a puzzle. The basic principle of supply and demand suggests that, if employers can’t find enough workers, they’ll simply have to offer higher wages or better working conditions. Perhaps in the face of a sudden shock—say, in the middle of a pandemic—a temporary shortage might arise. The labor supply could shrink faster than businesses could adjust. But that’s not the situation today. Labor-force participation has returned to 2019 levels; real wages have been falling after a brief bump early in the pandemic. When employers say there isn’t enough labor, what they really mean is that they can’t find enough people willing to work under the terms that they want to offer—and that they're doing a poor job increasing productivity with the workers they have.

The irony is that the most fervent free-market economists and business leaders are often the first to complain about labor shortages and overheated labor markets. So they need some explanation for why supply and demand suddenly don’t apply. Thus the trope of “jobs Americans won’t do.” The idea is that wages are determined by some objective measure of productivity. You get paid what you’re worth to your employer, no more, no less. On this theory, certain jobs—like busing tables in a fast-food restaurant or picking crops in a hot field—just don’t command wages high enough for most Americans to want to do them.

In truth, there’s no such thing as objectively higher- or lower-value jobs. Those determinations are set by market conditions, which are in turn shaped by public policy. There is, therefore, a circularity to the dynamic: Wages are influenced by judgments about what a given job should be worth and thus whether a purported shortage should be remedied by policy makers. Although it’s tempting to say that the market has decided that software development pays $61 an hour while picking lettuce pays $16, that observation falters on the fact that farm owners can’t actually find workers at that low wage. (If you offered computer programmers $16 an hour and made them work in the hot sun, you would have trouble finding enough of them too.) That’s why the federal government created an H-2A agricultural-guest-worker program that has swelled from fewer than 50,000 annual visas in 2005 to more than 250,000 in 2021.

No one thinks twice when professionals in office buildings see their wages rise, or when employers have to woo them with free meals and comfy chairs. Only when lower-wage workers see gains, even briefly, do we suddenly have an economic crisis on our hands.

[Annie Lowrey: Why everyone is so mad about the economy]

The sober economists have an explanation for this too: inflation. Sure, everyone wants to see low-wage workers do better, in the abstract. But if we start paying people too much, employers will have to raise prices to cover rising wages, and we’ll get inflation. In the argot of the Federal Reserve, wage growth must be kept “consistent” with its target 2 percent inflation rate.

The first problem with this reaction is that, as an empirical matter, tight labor markets are not necessarily associated with high inflation. In the late 1970s, as inflation was surging, the unemployment rate was high too: 5 to 6 percent. Throughout both the late 1990s and the late 2010s, an unemployment rate below 4 percent coincided with low inflation. Over the past year, as inflation fell from its high of 9 percent to less than 5 percent—much closer to the Federal Reserve’s target of 2 percent—the unemployment rate fell along with it.

One reason for the disconnect is that market forces create a constant incentive for employers to do more with less. Faced with pressure to raise wages, the rational response is to seek productivity increases wherever possible—or even, gasp, to accept lower profits for shareholders. A remarkable illustration comes from the 1960s, when the United States decided to end the Mexican bracero program that provided farms with half a million low-wage guest workers each year. The result was not the proverbial $50 pint of strawberries, but rapid mechanization. In other words, instead of relying on many poorly paid jobs filled by guest workers, the industry created new, better jobs Americans would do—in equipment development, production, and operation. The lesson: If employers know that they’ll always have to pursue profit with a constrained labor supply, they will invest and innovate in ways that benefit workers. Bringing manufacturing back to American shores, for example, would not mean replicating Asian sweatshops, but rather creating capital-intensive, high-productivity factories with good jobs here at home.

We also should scrutinize the term inflation, which doesn’t mean the same thing at all times to all people. For workers at the low end of the income scale, wage increases are desirable even if they do partly translate to higher prices, because those workers will see their earnings grow faster than the prices they pay for consumer goods. (This is because labor is only one of many factors that determine prices.) Wage growth may fuel some inflation, but those receiving the raises see real net gains. Higher-income workers might lose out in this scenario, as they pay more for the same goods without getting a raise. After decades of widening inequality, market forces would finally work to the benefit of those who have been left behind.

And yet, the political and economic establishment sees this outcome as a cause for alarm, not excitement. Enthusiasm for “free markets” turns out to depend on which interests those markets are serving. As both The New York Times and The Wall Street Journal have recently reported, corporations seem to be taking advantage of the inflation story to raise prices beyond what their rising costs require. Yet conservative think tanks and op-ed columnists seem uninterested in calling on the government to tackle that issue. Equally damning for the left of center, meanwhile, is the embrace of immigration as a “solution” to inflation—which finally acknowledges the reality, long denied by liberals, that unskilled immigrants suppress the wages of low-wage workers already here. “When labor is in short supply relative to demand, employers offer higher wages,” explained the pro-immigration advocacy group FWD.us last month. “Immigration policy that responds quickly to market shifts can stabilize prices for consumers and offer relief to employers.”

Low prices for consumers and relief for employers, but no mention of existing workers: a fine summary of America’s economic agenda for the past half-century. One hopes that the spectacle of our leaders scrambling to keep poor workers from getting ahead will finally expose its absurdity. Good jobs that allow workers to support their families and communities can’t be just a hoped-for by-product of a market economy; they must be its purpose. Gains in consumption and material living standards are good, but achieving cheaper prices through lower wages is a losing proposition for working-class families and the nation as a whole.

The modern American economy has not failed with respect to the material standard of living. Its failures lie in the creation of insecure jobs that do not meet workers’ needs, in a shift in the distribution of income that has left working families struggling, and in a decay of social solidarity as the winners declare themselves the most valuable and the losers expendable. To reverse those trends, workers must have the power that comes from being needed.

Support for this article was provided by the William and Flora Hewlett Foundation.

How Europe Won the Gas War With Russia

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 06 › russia-ukraine-natural-gas-europe › 674268

The most significant defeat in Russia’s war on Ukraine was suffered not on a battlefield but in the marketplace.

The Russian aggressors had expected to use natural gas as a weapon to bend Western Europe to their will. The weapon failed. Why? And will the failure continue?

Unlike oil, which is easily transported by ocean tanker, gas moves most efficiently and economically through fixed pipelines. Pipelines are time-consuming and expensive to build. Once the pipeline is laid, over land or underwater, the buyer at one end is bound to the seller on the other end. Gas can move by tanker, too, but first it must be liquefied. Liquefying gas is expensive and technologically demanding. In the 2010s, European consumers preferred to rely on cheaper and supposedly reliable pipeline gas from Russia. Then, in 2021, the year before the Russian attack on Ukraine, Europeans abruptly discovered the limits of Russian-energy reliability.

The Russian pipeline network can carry only so much gas at a time. In winter, Europe consumes more than the network can convey, so Europe prepares for shortages by building big inventories of gas in the summertime, when it uses less.

Russian actions in the summer of 2021 thwarted European inventory building. A shortage loomed—and prices spiked. I wrote for The Atlantic on January 5, 2022:

In a normal year, Europe would enter the winter with something like 100 billion cubic meters of gas on hand. This December began with reserves 13 percent lower than usual. Thin inventories have triggered fearful speculation. Gas is selling on European commodity markets for 10 times the price it goes for in the United States.

These high prices have offered windfall opportunities for people with gas to sell. Yet Russia has refused those opportunities. Through August, when European utilities import surplus gas to accumulate for winter use, deliveries via the main Russian pipeline to Germany flowed at only one-quarter their normal rate. Meanwhile, Russia has been boycotting altogether the large and sophisticated pipeline that crosses Ukraine en route to more southerly parts of Europe.

I added a warning: “By design or default, the shortfalls have put a powerful weapon in [Russian President Vladimir] Putin’s hands.”

A month later, the world learned what Putin’s gas weapon was meant to do. Russian armored columns lunged toward the Ukrainian capital, Kyiv, on February 24. Putin’s gas cutoffs appear to have been intended to deter Western Europe from coming to Ukraine’s aid.

The day before the invasion, I tried to communicate the mood of fear that then gripped gas markets and European capitals:

In 2017, 2018, and 2019, Russia’s dominance over its gas customers in Western Europe was weaker, and its financial resources to endure market disruption were fewer. In 2022, Russia’s power over its gas customers is at a zenith—and its financial resources are enormous … One gas-industry insider, speaking on the condition of anonymity in order to talk candidly, predicted that if gas prices stay high, European economies will shrink—and Russia’s could grow—to the point where Putin’s economy will overtake at least Italy’s and perhaps France’s to stand second in Europe only to Germany’s.

That fear was mercifully not realized. Instead, European economies proved much more resilient—and Russia’s gas weapon much less formidable—than feared. The lights did not go out.

The story of this success is one of much ingenuity, solidarity, sacrifice, and some luck. If Putin’s war continues into its second winter and into Europe’s third winter of gas shortages, Western countries will need even more ingenuity, solidarity, sacrifice, and luck.

Over 12 months, European countries achieved a remarkable energy pivot. First, they reduced their demand for gas. European natural-gas consumption in 2022 was estimated to be 12 percent lower than the average for the years 2019–21. More consumption cuts are forecast for 2023.

Weather helped. Europe’s winter of 2022–23 was, for the most part, a mild one. Energy substitution made a difference too. Germany produced 12 percent more coal-generated electricity in 2022 than in 2021. The slow recovery from the coronavirus pandemic in China helped as well. Chinese purchases of liquid natural gas on world markets actually dropped by nearly 20 percent in 2022 from their 2021 level.

[David Frum: Putin’s big chill in Europe]

Second, European countries looked out for consumers, and for one another. European Union governments spent close to 800 billion euros ($860 billion) to subsidize fuel bills in 2022. The United Kingdom distributed an emergency grant of £400 ($500) a household to help with fuel costs. Germany normally reexports almost half of the gas it imports, and despite shortfalls at home through the crisis, it continued to reexport a similar proportion to EU partners.

Third, as European countries cut their consumption, they also switched their sources of supply. The star of this part of the story is Norway, which replaced Russia as Europe’s single largest gas supplier. Norway rejiggered its offshore fields to produce less oil and more gas, as I learned from energy experts during a recent visit to Oslo.

Norwegians also made sacrifices for their neighbors. Norway has an abundance of cheap hydroelectricity, and exports much of that power. During the 2022 energy crisis, those export commitments pushed up Norwegian households’ power bills and helped push down the approval rating of Norway’s governing Labor Party by more than a quarter from its level at the beginning of that year. Nevertheless, the government steadfastly honored its electricity-export commitments (although it has now moved to place some restrictions on future exports).

The redirection from Asia of shipments of liquid natural gas from the United States, the Persian Gulf, and West Africa also contributed to European energy security. In December 2022, Germany opened a new gas-receiving terminal in Wilhelmshaven, near Bremen, which was completed at record speed, in fewer than 200 days. Two more terminals will begin operating in 2023.

The net result is that Russian gas exports fell by 25 percent in 2022. And since the painful record prices set in the months before the February 2022 invasion, the cost of gas in Europe has steeply declined.

Russian leaders had assumed that their pipelines to Europe would make the continent dependent on Russia. They apparently did not consider that the same pipelines also made Russia dependent on Europe. By contrast, only a single pipeline connects Russia to the whole of China, and it is less valuable to Putin—according to a study conducted by the Carnegie Endowment for International Peace, the gas it carries commands prices much lower than the gas Russia pipes to Europe.

To reach world markets, Russia will have to undertake the costly business of liquefying its gas for shipment. A receiving terminal that turns the liquid back into gas, like the one swiftly constructed in Wilhelmshaven, costs about $500 million. Germany’s three newly built terminals to receive liquid natural gas will cost more than $3 billion. But the outbound terminals that liquefy the gas cost even more: $10.5 billion is the latest estimate for the next big project on the U.S. Gulf Coast. Russia depended on foreign investment and technology to compete in the liquid-natural-gas market. Under Western sanctions, the flow of both investment and technology to Russia has been cut.

[Eliot A. Cohen: It’s not enough for Ukraine to win. Russia has to lose.]

Russia lacks the economic and technological oomph to keep pace with the big competitors in the liquid-gas market, such as the U.S. and Qatar. In April, CNBC reported on a study by gas-industry consultants that projected growth of 50 percent for the liquid-natural-gas market by 2030. The Russian share of that market will, according to the same study, shrink to 5 percent (from about 7 percent), even as the American share rises to 25 percent (from about 20 percent).

If the war in Ukraine continues through the next winter, Europe will have to overcome renewed difficulties. For example, Germany’s nuclear-power plants, which eased the shock last year, went offline forever in April. And this time, the winter might be colder. But gas production by non-Russian producers keeps rising, outpacing demand in the rest of the world. The Chinese economy continues its slow recovery from COVID; India lags as a gas buyer.

Risks are everywhere—but so are possibilities. When this war comes to an end, the lesson will be clear: We have to hasten the planet’s transition to a post-fossil-fuel future—not only to preserve our environment but to defend world peace against aggressors who use oil and gas as weapons. Yet perhaps the most enduring lesson is political. Through the energy shock, Europe discovered a new resource: the power of wisely led cooperation to meet and overcome a common danger.

What It Takes to Win a War

The Atlantic

www.theatlantic.com › books › archive › 2023 › 06 › ernie-pyle-world-war-ii-soldiers › 674271

Most war correspondents don’t become household names, but as the Second World War raged, every American knew Ernie Pyle. His great subject was not the politics of the war, or its strategy, but rather the men who were fighting it. At the height of his column’s popularity, more than 400 daily newspapers and 300 weeklies syndicated Pyle’s dispatches from the front. His grinning face graced the cover of Time magazine. An early collection of his columns, Here Is Your War, became a best seller. It was followed by Brave Men, rereleased this week by Penguin Classics with an introduction by David Chrisinger, the author of the recent Pyle biography The Soldier’s Truth.

Pyle was one of many journalists who flocked to cover the Second World War. But he was not in search of scoops or special access to power brokers; in fact, he avoided the generals and admirals he called “the brass hats.” What Pyle looked for, and then conveyed, was a sense of what the war was really like. His columns connected those on the home front to the experiences of loved ones on the battlefield in Africa, Europe, and the Pacific. For readers in uniform, Pyle’s columns sanctified their daily sacrifices in the grinding, dirty, bloody business of war. Twelve million Americans would read about what it took for sailors to offload supplies under fire on a beachhead in Anzio, or how gunners could shoot enough artillery rounds to burn through a howitzer’s barrel. Pyle wrote about what he often referred to as “brave men.” And his idea of courage wasn’t a grand gesture but rather the accumulation of mundane, achievable, unglamorous tasks: digging a foxhole, sleeping in the mud, surviving on cold rations for weeks, piloting an aircraft through flak day after day after day.

We’ve become skeptical of heroic narratives. Critics who dismiss Pyle as a real-time hagiographer of the Greatest Generation miss the point. Pyle was a cartographer, meticulously mapping the character of the Americans who chose to fight. If a person’s character becomes their destiny, the destiny of the American war effort depended on the collective character of Americans in uniform. Pyle barely touched on tactics or battle plans in his columns, but he wrote word after word about the plight of the average frontline soldier because he understood that the war would be won, or lost, in their realm of steel, dirt, and blood.

In the following passage, Pyle describes a company of American infantrymen advancing into a French town against German resistance:

They seemed terribly pathetic to me. They weren’t warriors. They were American boys who by mere chance of fate had wound up with guns in their hands, sneaking up a death-laden street in a strange and shattered city in a faraway country in a driving rain. They were afraid, but it was beyond their power to quit. They had no choice. They were good boys. I talked with them all afternoon as we sneaked slowly forward along the mysterious and rubbled street, and I know they were good boys. And even though they weren’t warriors born to the kill, they won their battles. That’s the point.

I imagine that when those words hit the U.S. in 1944, shortly after D-Day, readers found reassurance in the idea that those “good boys” had what it took to win the war, despite being afraid, and despite not really being warriors. However, today Pyle’s words hold a different meaning. They read more like a question, one now being asked about America’s character in an ever more dangerous world.

[Read: Notes from a cemetery]

The past two years have delivered a dizzying array of national-security challenges, including the U.S.’s decision to abandon Afghanistan to the Taliban, Russia’s war in Ukraine, and the possibility of a Chinese invasion of Taiwan. A rising authoritarian axis threatens the West-led liberal world order birthed after the Second World War. Much as when Pyle wrote 80 years ago, the character of a society—whether it contains “brave men” and “good boys” willing to defend democratic values—will prove determinative of the outcomes of these challenges.

The collapse of Afghanistan’s military and government came as a surprise to many Americans. That result cannot be fully explained by any lack of dollars, time, or resources expended. Only someone who understood the human side of war—as Pyle certainly did—could have predicted that collapse, in which the majority of Afghan soldiers surrendered to the Taliban. Conversely, in Ukraine, where most experts predicted a speedy Russian victory, the Ukrainians have defied expectations. The character of the Ukrainian people, which most observers had not fully recognized, has been the driving factor.

Pyle often wrote in anecdotes, but his writing’s impact was anything but anecdotal. His style of combat realism, which eschews the macro and strategic for the micro and human, can be seen in today’s combat reporting from Ukraine. A new documentary film, Slava Ukraini, made by one of France’s most famous public intellectuals, Bernard-Henri Lévy, takes a Pyle-esque approach to last fall’s Ukrainian counteroffensive against the Russians. The film focuses on everyday Ukrainians and the courage they display for the sake of their cause. “And I’m amazed,” Lévy says, walking through a trench in eastern Ukraine, “that while weapons were not always their craft, these men are transformed into the bravest soldiers.”

Ernie Pyle at the front in 1944. (Bettmann / CORBIS / Getty)

War correspondents such as Thomas Gibbons-Neff at The New York Times and James Marson at The Wall Street Journal take a similar approach, with reporting that’s grounded in those specifics, which must inform any real understanding of strategy. The result is a style that’s indebted to Pyle and his concern with the soldiers’ morale and commitment to the cause, and reveals more than any high-level analyses could.

Pyle wasn’t the first to search for strategic truths about war in the granular reality of individual experiences. Ernest Hemingway, who didn’t cover the First World War as a correspondent but later reflected on it as a novelist, wrote in A Farewell to Arms:

There were many words that you could not stand to hear and finally only the names of the places had dignity. Certain numbers were the same way and certain dates and these with the names of the places were all you could say and have them mean anything. Abstract words such as glory, honor, courage, or hallow were obscene beside the concrete names of villages, the numbers of roads, the names of rivers, the numbers of regiments and the dates.

Pyle took this advice to heart when introducing characters in his columns. He would not only tell you a bit about a soldier, their rank, their job, and what they looked like; he would also make sure to give the reader their home address. “Here are the names of just a few of my company mates in that little escapade that afternoon,” he writes, after describing heavy combat in France. “Sergeant Joseph Palajsa, of 187 I Street, Pittsburgh. Pfc. Arthur Greene, of 618 Oxford Street, Auburn, Massachusetts …” He goes on to list more than a half dozen others. Pyle knew that “only the names of the places had dignity.” And sometimes those places were home.

As a combat reporter, Pyle surpassed all others working during the Second World War, outwriting his contemporaries, Hemingway included. This achievement was one of both style and commitment. Was there any reporter who saw more of the war than Pyle? He first shipped overseas in 1940, to cover the Battle of Britain. He returned to the war in 1942, to North Africa, and he went on to Italy, to France, and finally to the Pacific. On April 18, 1945, while Pyle was out with a patrol near Okinawa, a sniper shot him in the head, killing him instantly. His subject, war, finally consumed him.

[Read: The two Stalingrads]

Reading the final chapters of Brave Men, one senses that Pyle’s subject was consuming him even before he left for Okinawa. “For some of us the war has already gone on too long,” he writes. “Our feelings have been wrung and drained.” Brave Men ends shortly after the liberation of Paris. The invasion of western Europe—which we often forget was an enormous gamble—had paid off. Berlin stood within striking distance. The war in Europe would soon be over. Pyle, however, remains far from sanguine.

“We have won this war because our men are brave, and because of many other things.” He goes on to list the contribution of our allies, the roles played by luck, by geography, and even by the passage of time. He cautions against hubris in victory and warns about the challenges of homecoming for veterans. “And all of us together will have to learn how to reassemble our broken world into a pattern so firm and so fair that another great war cannot soon be possible … Submersion in war does not necessarily qualify a man to be the master of the peace. All we can do is fumble and try once more—try out of the memory of our anguish—and be as tolerant with each other as we can.”

America’s Dysfunction Has Two Main Causes

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 06 › us-societal-trends-institutional-trust-economy › 674260

How has America slid into its current age of discord? Why has our trust in institutions collapsed, and why have our democratic norms unraveled?

All human societies experience recurrent waves of political crisis, such as the one we face today. My research team built a database of hundreds of societies across 10,000 years to try to find out what causes them. We examined dozens of variables, including population numbers, measures of well-being, forms of governance, and the frequency with which rulers are overthrown. We found that the precise mix of events that leads to crisis varies, but two drivers of instability loom large. The first is popular immiseration—when the economic fortunes of broad swaths of a population decline. The second, and more significant, is elite overproduction—when a society produces too many superrich and ultra-educated people, and not enough elite positions to satisfy their ambitions.

These forces have played a key role in our current crisis. In the past 50 years, despite overall economic growth, the quality of life for most Americans has declined. The wealthy have become wealthier, while the incomes and wages of the median American family have stagnated. As a result, our social pyramid has become top-heavy. At the same time, the U.S. began overproducing graduates with advanced degrees. More and more people aspiring to positions of power began fighting over a relatively fixed number of spots. The competition among them has corroded the social norms and institutions that govern society.

The U.S. has gone through this twice before. The first time ended in civil war. But the second led to a period of unusually broad-based prosperity. Both offer lessons about today’s dysfunction and, more important, how to fix it.

To understand the root causes of the current crisis, let’s start by looking at how the number of über-wealthy Americans has grown. Back in 1983, 66,000 American households were worth at least $10 million. That may sound like a lot, but by 2019, controlling for inflation, the number had increased tenfold. A similar, if smaller, upsurge happened lower on the food chain. The number of households worth $5 million or more increased sevenfold, and the number of mere millionaires went up fourfold.

This article has been adapted from Turchin’s forthcoming book.

On its surface, having more wealthy people doesn’t sound like such a bad thing. But at whose expense did elites’ wealth swell in recent years?

Starting in the 1970s, although the overall economy continued to grow, the share of that growth going to average workers began to shrink, and real wages leveled off. (It’s no coincidence that Americans’ average height—a useful proxy for well-being, economic and otherwise—stopped increasing around then too, even as average heights in much of Europe continued climbing.) By 2010, the relative wage (wage divided by GDP per capita) of an unskilled worker had nearly halved compared with mid-century. For the 64 percent of Americans who didn’t have a four-year college degree, real wages shrank in the 40 years before 2016.
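
To make that definition concrete, here is a minimal worked calculation using hypothetical round numbers rather than the study’s actual series: with the relative wage defined as the real wage divided by GDP per capita, a flat wage combined with a doubling of GDP per capita is enough, on its own, to cut the relative wage roughly in half.

\[
w_{\mathrm{rel}} = \frac{w}{y}, \qquad
\frac{w_{\mathrm{rel}}^{\,2010}}{w_{\mathrm{rel}}^{\,1950}}
= \frac{w_{2010}/y_{2010}}{w_{1950}/y_{1950}}
\approx \frac{w_{1950}/(2\,y_{1950})}{w_{1950}/y_{1950}}
= \frac{1}{2},
\]

where \(w\) is the real wage, \(y\) is real GDP per capita, and the assumptions \(w_{2010}\approx w_{1950}\) (stagnant wages) and \(y_{2010}\approx 2\,y_{1950}\) (doubled output per person) are illustrative stand-ins for the trends described above, not figures from the underlying data.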

[From the December 2020 issue: The next decade could be even worse]

As wages diminished, the costs of owning a home and going to college soared. To afford an average house, a worker earning the median wage in 2016 had to log 40 percent more hours than she would have in 1976. And parents without a college degree had to work four times longer to pay for their children’s college.

Even college-educated Americans aren’t doing well across the board. They made out well in the 1950s, when fewer than 15 percent of 18-to-24-year-olds went to college, but not today, when more than 60 percent of high-school grads immediately enroll. To get ahead of the competition, more college graduates have sought out advanced degrees. From 1955 to 1975, the number of students enrolled in law school tripled, and from 1960 to 1970, the number of doctorate degrees granted at U.S. universities more than tripled. This was manageable in the post–World War II period, when the number of professions requiring advanced degrees shot up. But when the demand eventually subsided, the supply didn’t. By the 2000s, degree holders greatly outnumbered the positions available to them. The imbalance is most acute in the social sciences and humanities, but the U.S. hugely overproduces degrees even in STEM fields.

This is part of a broader trend. Compared with 50 years ago, far more Americans today have either the financial means or the academic credentials to pursue positions of power, especially in politics. But the number of those positions hasn’t increased, which has led to fierce competition.

Competition is healthy for society, in moderation. But the competition we are witnessing among America’s elites has been anything but moderate. It has created very few winners and masses of resentful losers. It has brought out the dark side of meritocracy, encouraging rule-breaking instead of hard work.

All of this has left us with a large and growing class of frustrated elite aspirants, and a large and growing class of workers who can’t make better lives for themselves.

The decades that have led to our present-day dysfunction share important similarities with the decades leading to the Civil War. Then as now, a growing economy served to make the rich richer and the poor poorer. The number of millionaires per capita quadrupled from 1800 to 1850, while the relative wage declined by nearly 50 percent from the 1820s to the 1860s, just as it has in recent decades. Biological data from the time suggest that the average American’s quality of life declined significantly. From 1830 to the end of the century, the average height of Americans fell by nearly two inches, and average life expectancy at age 10 decreased by eight years during approximately the same period.

This popular immiseration stirred up social strife, which could be seen in urban riots. From 1820 to 1825, when times were good, only one riot occurred in which at least one person was killed. But in the five years before the Civil War, 1855 to 1860, American cities experienced no fewer than 38 such riots. We see a similar pattern today. In the run-up to the Civil War, this frustration manifested politically, in part as anti-immigrant populism, epitomized by the Know-Nothing Party. Today this strain of populism has been resurrected by Donald Trump.

[From the January/February 2022 issue: Beware prophecies of civil war]

Strife grew among elites too. The newly minted millionaires of the 19th century, who made their money in manufacturing rather than through plantations or overseas trade, chafed under the rule of the southern aristocracy, as their economic interests diverged. To protect their budding industries, the new elites favored high tariffs and state support for infrastructure projects. The established elites—who grew and exported cotton, and imported manufactured goods from overseas—strongly opposed these measures. The southern slaveholders’ grip on the federal government, the new elites argued, prevented necessary reforms in the banking and transportation systems, which threatened their economic well-being.

As the elite class expanded, the supply of desirable government posts flattened. Although the number of U.S. representatives grew fourfold from 1789 to 1835, it had shrunk by mid-century, just as more and more elite aspirants received legal training—then, as now, the chief route to political office. Competition for political power intensified, as it has today.

Those were cruder times, and intra-elite conflict took very violent forms. In Congress, incidents and threats of violence peaked in the 1850s. The brutal caning that Representative Preston Brooks of South Carolina gave to Senator Charles Sumner of Massachusetts on the Senate floor in 1856 is the best-known such episode, but it was not the only one. In 1842, after Representative Thomas Arnold of Tennessee “reprimanded a pro-slavery member of his own party, two Southern Democrats stalked toward him, at least one of whom was armed with a bowie knife,” the historian Joanne Freeman recounts. In 1850, Senator Henry Foote of Mississippi pulled a pistol on Senator Thomas Hart Benton of Missouri. In another bitter debate, a pistol fell out of a New York representative’s pocket, nearly precipitating a shoot-out on the floor of Congress.

This intra-elite violence presaged popular violence, and the deadliest conflict in American history.

The victory of the North in the Civil War decimated the wealth and power of the southern ruling class, temporarily reversing the problem of elite overproduction. But workers’ wages continued to lag behind overall economic growth, and the “wealth pump” that redistributed their income to the elites never stopped. By the late 19th century, elite overproduction was back, new millionaires had replaced the defeated slave-owning class, and America had entered the Gilded Age. Economic inequality exploded, eventually peaking in the early 20th century. By 1912, the nation’s top wealth holder, John D. Rockefeller, had $1 billion, the equivalent of 2.6 million annual wages—roughly 100 times the corresponding figure for the richest American in 1790.

Then came the New York Stock Exchange collapse of 1929 and the Great Depression, which had much the same effect as the Civil War: Thousands of economic elites were plunged into the commoner class. In 1925, there were 1,600 millionaires, but by 1950, fewer than 900 remained. The size of America’s top fortune remained stuck at $1 billion for decades, inflation notwithstanding. By 1982, the richest American had $2 billion, which was equivalent to “only” 93,000 annual wages.
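A note on the arithmetic behind that comparison: measuring a fortune in annual wages simply means dividing its dollar value by the average yearly wage of the day. The short sketch below is illustrative only—it uses nothing but the figures quoted in the preceding paragraphs, and the “implied” wage levels it prints are back-calculated from those ratios rather than drawn from the book’s data.

```python
# Back-of-the-envelope check of the "top fortune measured in annual wages" metric.
# Fortune sizes and wage multiples are the figures quoted in the text above;
# the implied average annual wages are derived by division and are illustrative only.

figures = {
    1912: {"top_fortune": 1_000_000_000, "wage_multiple": 2_600_000},  # Rockefeller's $1 billion
    1982: {"top_fortune": 2_000_000_000, "wage_multiple": 93_000},     # the era's richest American
}

for year, f in figures.items():
    implied_wage = f["top_fortune"] / f["wage_multiple"]
    print(f"{year}: ${f['top_fortune']:,} ≈ {f['wage_multiple']:,} annual wages "
          f"(implied average annual wage ≈ ${implied_wage:,.0f})")

# How much larger the Gilded Age peak was than the 1982 figure, in wage terms.
print(f"Ratio of the two multiples: {2_600_000 / 93_000:.0f}x")  # ≈ 28
```

Measured this way, the 1912 fortune towers over the 1982 one by a factor of about 28, even though the nominal dollar amounts differ only by a factor of two—which is the point of quoting fortunes in wages rather than dollars.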

But here is where the two eras differed. Unlike in the post–Civil War period, real wages grew steadily in the mid-20th century. And high taxes on the richest Americans helped reverse the wealth pump. The tax rate on top incomes, which peaked during World War II at 94 percent, stayed above 90 percent all the way until the mid-1960s. Average height increased by a whopping three inches over roughly the first half of the 20th century. Life expectancy at age 10 increased by nearly a decade. By the 1960s, America had achieved a broad-based prosperity that was virtually unprecedented in human history.

The New Deal elites learned an important lesson from the disaster of the Civil War. The reversal of elite overproduction in both eras was similar in magnitude, but only after the Great Depression was it accomplished through entirely nonviolent means. The ruling class itself was an important part of this—or, at least, a prosocial faction of the ruling class, which persuaded enough of its peers to acquiesce to the era’s progressive reforms.

As the historian Kim Phillips-Fein wrote in Invisible Hands, executives and stockholders mounted an enormous resistance to the New Deal policies regulating labor–corporate relations. But by mid-century, a sufficient number of them had consented to the new economic order for it to become entrenched. They bargained regularly with labor unions. They accepted the idea that the state would have a role to play in guiding economic life and helping the nation cope with downturns. In 1943, the president of the U.S. Chamber of Commerce—which today pushes for the most extreme forms of neoliberal market fundamentalism—said, “Only the willfully blind can fail to see that the old-style capitalism of a primitive, free-shooting period is gone forever.” President Dwight Eisenhower, considered a fiscal conservative for his time, wrote to his brother:

Should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history. There is a tiny splinter group, of course, that believes you can do these things … Their number is negligible and they are stupid.

Barry Goldwater ran against Lyndon Johnson in 1964 on a platform of low taxes and anti-union rhetoric. By today’s standards, Goldwater was a middle-of-the-road conservative. But he was regarded as radical at the time, too radical even for many business leaders, who abandoned his campaign and helped bring about his landslide defeat.

The foundations of this broad-based postwar prosperity—and of the ruling elite’s eventual acquiescence to it—were established during the Progressive era and buttressed by the New Deal. In particular, new legislation guaranteed unions’ right to collective bargaining, introduced a minimum wage, and established Social Security. American elites entered into a “fragile, unwritten compact” with the working classes, as the United Auto Workers president Douglas Fraser later described it. This implicit contract included the promise that the fruits of economic growth would be distributed more equitably among both workers and owners. In return, the fundamentals of the political-economic system would not be challenged. Avoiding revolution was one of the most important reasons for this compact (although not the only one). As Fraser wrote in his famous resignation letter from the Labor Management Group in 1978, when the compact was about to be abandoned, “The acceptance of the labor movement, such as it has been, came because business feared the alternatives.”

We are still suffering the consequences of abandoning that compact. The long history of human society compiled in our database suggests that America’s current economy is so lucrative for the ruling elites that achieving fundamental reform might require a violent revolution. But we have reason for hope. It is not unprecedented for a ruling class—with adequate pressure from below—to allow for the nonviolent reversal of elite overproduction. But such an outcome requires elites to sacrifice their near-term self-interest for our long-term collective interests. At the moment, they don’t seem prepared to do that.

This article has been adapted from Peter Turchin’s forthcoming book, End Times: Elites, Counter-Elites, and the Path of Political Disintegration.