Boeing and the Dark Age of American Manufacturing

The Atlantic

www.theatlantic.com/ideas/archive/2024/04/boeing-corporate-america-manufacturing/678137

The sight of Bill Boeing was a familiar one on the factory floor. His office was in the building next to the converted boatyard where workers lathed the wood, sewed the fabric wings, and fixed the control wires of the Boeing Model C airplane. “There is no authority except facts. Facts are obtained by accurate observation,” read a plaque affixed outside the door. And what could need closer observation than the process of his aircraft being built? One day in 1916, Boeing spotted an imperfectly cut wing rib, dropped it to the floor, and slowly stomped it to bits. “I, for one, will close up shop rather than send out work of this kind,” he declared.

When David Calhoun, the soon-to-be-lame-duck CEO of the company Boeing founded, made a rare appearance on the shop floor in Seattle one day this past January, circumstances were decidedly different. Firmly a member of the CEO class, schooled at the knee of General Electric’s Jack Welch, Calhoun had not strolled over from next door but flown some 2,300 miles from Boeing’s headquarters in Arlington, Virginia. And he was not there to observe slipshod work before it found its way into the air—it already had. A few weeks earlier, the door of a Boeing 737 had fallen out mid-flight. In the days following his visit, Calhoun’s office admitted that it still didn’t know quite what had gone wrong, because it didn’t know how the plane had been put together in the first place. The door’s restraining bolts had either been screwed in wrong, or not at all. Boeing couldn’t say, because, as it told astonished regulators, the company had “no records of the work being performed.”

The two scenes tell us the peculiar story of a plane maker that, over 25 years, slowly but very deliberately extracted itself from the business of making planes. For nearly 40 years the company built the 737 fuselage itself in the same plant that turned out its B-29 and B-52 bombers. In 2005 it sold this facility to a private-investment firm, keeping the axle grease at arm’s length and notionally shifting risk, capital costs, and labor woes off its books onto its “supplier.” Offloading, Boeing called it. Meanwhile the tail, landing gear, flight controls, and other essentials were outsourced to factories around the world owned by others, and shipped to Boeing for final assembly, turning the company that created the Jet Age into something akin to a glorified gluer-together of precast model-airplane kits. Boeing’s latest screwups vividly dramatize a point often missed in laments of America’s manufacturing decline: that when global economic forces carried off some U.S. manufacturers for good, even the ones that stuck around lost interest in actually making stuff.

The past 30 years may well be remembered as a dark age of U.S. manufacturing. Boeing’s decline illustrates everything that went wrong to bring us here. Fortunately, it also offers a lesson in how to get back out.

In Bill Boeing’s day, the word manufactory had cachet. You could bank at the Manufacturers Trust. Philadelphia socialites golfed at the Manufacturers’ Club. Plans for the newly consecrated Harvard Business School called for a working factory on campus. The business heroes of the day—Ford, Edison, Firestone—had risen from the shop floor.

There, they had pioneered an entirely new way of making things. The American system of production—featuring interchangeable parts, specialized machine tools, moving assembly lines—was a huge leap beyond European methods of craft production. And it produced lopsided margins of victory for the likes of Ford, GM, and Boeing. To coordinate these complex new systems, two new occupations arose: the industrial engineer, who spoke the language of the shop floor, and the professional financial manager, who spoke the language of accounting.

[Charlie Warzel: Flying is weird right now]

At first the engineers held sway. In a 1930 article for Aviation News, a Boeing engineer explained how the company’s inspectors “continually supervise the fabrication of the many thousands of parts entering into the assemblage of a single plane.” Philip Johnson, an engineer, succeeded Bill Boeing as CEO; he then passed the company to yet another engineer, Clairmont Egtvedt, who not only managed production of the B-17 bomber from the executive suite, but personally helped design it.

After the Second World War, America enjoyed three decades of dominance by sticking with the methods it had used to win the war. At the same time, a successor was developing, largely unnoticed, amid the scarcities of defeated Japan. The upstart auto executive Eiji Toyoda had visited Ford’s works and found that, however much he admired the systems, they couldn’t be replicated in Japan. He couldn’t afford, for instance, the hundreds of machine tools specialized to punch out exactly one part at the touch of a button. Although his employees would have to make do with a few general-purpose stamping presses, he gave these skilled workers immense freedom to find the most efficient way to run them. The end result turned out to be radical: Costs fell and errors dropped in a renewable cycle of improvement, or kaizen.

What emerged was a different conception of the corporation. If the managerial bureaucrats in the other departments were to earn their keep, they needed a thorough understanding of the shop floor, or gemba (roughly “place of making value”). The so-called Gemba Walk required their routine presence at each step until they could comprehend the assembly of the whole. Otherwise they risked becoming muda—waste.

When the wave of Japanese competition finally crashed on corporate America, those best equipped to understand it—the engineers—were no longer in charge. American boardrooms had been handed over to the finance people. And they were hypnotized by the new doctrine of shareholder value, which provided a rationale for their ascendance but little incentive for pursuing long-term improvements or sustainable approaches to cost control. Their pay packages rewarded short-term spikes in stock price. There were lots of ways to produce those.

Which brings us to the hinge point of 1990, when a trio of MIT researchers published The Machine That Changed the World, which both named the Japanese system—“lean production”—and urged corporate America to learn from it. Just then, the Japanese economy crashed, easing the pressure on U.S. firms. In the years that followed, American manufacturers instead doubled down on outsourcing, offshoring, and financial engineering. This round of wounds was self-inflicted. Already infused with a stench of decay, manufacturing was written off as yesterday’s activity.

At GE, which produced three of Boeing’s last four CEOs, manufacturing came to be seen as “grunt work,” as the former GE executive David Cote recently told Fortune’s Shawn Tully. Motorola—founded as Galvin Manufacturing and famed for its religious focus on quality—lost its lead in mobile-phone making after it leaned into software and services. Intel’s bunny-suited fab workers were the face of high-tech manufacturing prowess until the company ceded hardware leadership to Asian rivals. “Having once pioneered the development of this extraordinary technology,” the current Intel CEO, Pat Gelsinger, wrote recently, “we now find ourselves at the mercy of the most fragile global supply chain in the world.”

Phil Condit, the talented engineer who had overseen design of the hugely successful 777, was atop Boeing when I visited the company in late 2000. He was no stranger to the shop floor. Traversing Boeing’s Everett plant in a golf cart, he pointed out the horizontal tail fin stretching above us. Hard to believe it was larger than the 737’s wing, he marveled. Waiting back in his office—still located on the bank of the Duwamish River but greatly swollen by the recent merger with McDonnell Douglas—was a different sort of glee. “Wow! Double wow!” his mother had emailed him, referring to Boeing’s closing stock price that day. And, it would soon emerge, he wanted to get some distance from what he described to the Puget Sound Business Journal as “how-do-you-design-an-airplane stuff.” The next year, he moved Boeing’s headquarters to Chicago, pulling the top brass away from the shop floor just as the company was embarking on a radically new approach to airplane assembly.

Its newest plane, the 787 Dreamliner, would not be an in-house production. Instead Boeing would farm out the designing and building to a network of “partner” companies—each effectively its own mini-Boeing with its own supply chain to manage. “It used to be you’d have some Boeing people develop the blueprints, then march over and say, ‘Hey, would you build this for me?’” Richard Safran, an analyst at Seaport Research Partners and a former aerospace engineer, told me. “Now, instead, you’re asking them to design it, to integrate it, to do the R&D.”

The allures of this “capital light” approach were many: Troublesome unions, costly machine shops, and development budgets would all become someone else’s problem. Key financial metrics would instantly improve as costs shifted to other firms’ balance sheets. With its emphasis on less, the approach bore a superficial resemblance to lean production. But where lean production pushed know-how back onto the shop floor, this pushed the shop floor and its know-how out the door altogether.

Beyond that were the problems that a Boeing engineer, L. J. Hart-Smith, had foreseen in a prescient white paper that he presented at a 2001 Boeing technical symposium. With outsourcing came the possibility that parts wouldn’t fit together correctly on arrival. “In order to minimize these potential problems,” Hart-Smith warned, “it is necessary for the prime contractor to provide on-site quality, supplier-management, and sometimes technical support. If this is not done, the performance of the prime manufacturer can never exceed the capabilities of the least proficient of the suppliers.”

Boeing didn’t listen. Wall Street dismissed Hart-Smith’s paper as a “rant,” and Boeing put each supplier in charge of its own quality control. When those controls failed, Boeing had to bear the cost of fixing flawed components. Most troubling was the dangerous feedback loop Hart-Smith foresaw. Accounting-wise, those fixes, which in reality are the costs of outsourcing, would instead appear as overhead—creating the impression that in-house work was expensive and furthering the rationale for offloading even more of the manufacturing process.

In the short term, this all worked wonders on Boeing’s balance sheet: Its stock rose more than 600 percent from 2010 to 2019. Then the true folly of this approach made its inevitable appearance when two strikingly similar crashes caused by faulty software on Boeing planes killed a total of 346 people.

[James Surowiecki: What’s gone wrong at Boeing]

Today, if you stand along the Seattle waterfront long enough, sooner or later you’ll catch sight of a train headed south carrying the distinctive shape of a Boeing 737. Though it’s colored a metallic green and missing its tail—clearly not the finished product—it’s the kind of thing you point to and say, Look kids, a Boeing plane’s on that train! Not so. The logomark on the side spells it out: Spirit AeroSystems of Wichita, Kansas, has built this fuselage, which isn’t coming from Boeing. It’s going to Boeing.

A plane is a complex system in which the malfunction of one piece can produce catastrophic failure of the whole. Assembly must be tightly choreographed. But now—especially with Boeing continually trying to wring costs from its suppliers—there were many more chances for errors to creep in. And when FAA investigators finally toured the premises of Spirit AeroSystems—maker of the blown-out door as well as the fuselage it was supposed to fit in—they did not find a tight operation. They found one door seal being lubricated with Dawn liquid dish soap and cleaned with a wet cheesecloth, and another checked with a hotel-room key card.

A dark age doesn’t descend all at once. The process of emerging from one also takes time. It must begin with a recognition that something has been lost. Boeing’s fall just might have provided that rush of clarity. You could be from the 12th century and still know that soap and cheesecloth aren’t for making flying machines. Boeing’s chief financial officer recently admitted that the company got “a little too far ahead of itself on the topic of outsourcing.” It is in talks to reacquire Spirit AeroSystems and is already making the composite wings of its next-gen plane, the 777X, in-house at a new, billion-dollar complex outside Seattle. “Aerospace Executives Finally Rediscover the Shop Floor,” Aviation Week declared on the cover of a recent issue.

As for the rest of corporate America, one of the strongest signals may be coming from the company Boeing has striven so hard to emulate: GE. Under operations-minded boss Larry Culp, the company is finally—only 40 or so years late—pushing itself through a crash course in lean manufacturing. It is belatedly yielding to the reality that workers on the gemba are far better at figuring out more efficient ways of making things than remote bureaucrats with spreadsheet abstractions.

In the crucial field of semiconductors, meanwhile, Intel has recognized that Moore’s Law (the doubling of computing power roughly every 18 months) flows not from above but from manufacturing advances it once dominated. It has undertaken a “death march,” in the words of CEO Pat Gelsinger, to regain its lost edge on the foundry floor. The CHIPS Act has put a powerful political wind at his back. Green and other incentives are powering a broader, truly seismic surge in spending on new U.S. factories, now going up at three times their normal rate. No other country is experiencing such a buildout.

Add all the capacity you want. It won’t reverse the country’s long decline as a manufacturing superpower if corporate America keeps gurgling its sad, tired story about the impossibility of making things on these shores anymore. It’s a story that helped pour a whole lot of wealth into the executive pockets peddling it. But half a century of self-inflicted damage is enough. The doors have fallen off, and it’s plain for all to see: The story was barely bolted together.

The Case for Miniatures

www.theatlantic.com/ideas/archive/2024/04/miniature-art-museums-thorne-rooms-bonsais/678133

Empires and nation-states are remembered for their monuments, but they also leave behind plenty of miniatures. Inside the Egyptian pyramids, within the chamber where the pharaoh’s mummy rests, stand collections of little statues—wooden figurines of mummified servants, clay hippos painted turquoise—to remind the ruler how the world once looked. Academics have complained that miniatures suffer from scholarly neglect. After carrying out the first comprehensive survey of more than 500 miniatures found in excavations along the Nile in 2011, the Italian archeologist Grazia Di Pietro felt compelled to remark in an essay that these were more than “simple toys.”

A miniature is a replica of something bigger, a distortion of scale that makes it wonderful in a way the merely small is not. Miniatures are not the same as models, which are didactic (an anatomical model of the heart to educate students, for example) or utilitarian (a model showing the plan for a skyscraper yet to be built). Miniatures imitate life but have no clear practical purpose. They can be harder to make than their full-size counterparts. But they are portable, like the tiny mannequins the French government commissioned from fashion houses when World War II ended and Parisians couldn’t afford human-size haute couture. The mannequins toured Europe, splendidly dressed ambassadors carrying the message that the French had skill, if not much fabric.

Miniatures seek detail rather than abstraction. They are competitive. Some strive to be ever smaller, like the diminutive books that surged in popularity during the Industrial Revolution, after the printing press had rendered mass production easier. The essayist Susan Stewart writes about this in her book On Longing: Narratives of the Miniature, the Gigantic, the Souvenir, the Collection. Maybe, she suggests, the guilds of printers and binders missed the challenges of craftsmanship. (Centuries before the printing press, Arab and Persian calligraphers figured out how to make Qurans smaller than their thumbs.) Other miniatures strive to be ever more perfect—consider the locket portraits once sold in England, each with its own magnifying glass.

[Watch: Forget tiny houses–try miniature sculptures]

“I think a lot about record books, like the Guinness Book of World Records,” Joan Kee, an art historian at the University of Michigan, told me. “There’s always the smallest and the biggest: two extremes of human achievements.” Monuments and miniatures both inspire awe, but the awe each inspires is of a different kind. The pyramids stand as testaments to the glory of great powers, pooled resources, and concerted human effort. They’re formidable. The Egyptian figurines conjure images of a single artisan’s obsession, squinting eyes, and precise fingers. They’re precious.  

Here’s an irony of time and size: Monuments, in their grandness, seem destined to last forever—but the unobtrusive miniature is often what survives the passage of centuries and the onslaughts of natural disasters. Today, museums are full of miniatures, though many institutions don’t seem to know what to do with them. Jack Davy, a British curator, coined the term miniature dissonance to criticize the practice of exhibiting them all together with little context, like souvenirs on a table. Museum collections are a kind of miniature themselves—a whole world made to fit inside a building. One way to tell the history of museums is that they evolved from the rooms in which noble families once displayed trinkets from their trips of conquest—dried butterflies, incense lamps, taxidermic birds, Chinese porcelains. The rooms were called cabinets of curiosities, or Wunderkammer in German—“wonder rooms.” The word cabinet then came to mean the piece of furniture that might contain such wonders; the word became its own miniature.

In the 1930s, Narcissa Thorne was a Chicago housewife and socialite, married to a scion of the Montgomery Ward department-store fortune. She mocked her ladylike education: “Knowing how to put on my hat straight was supposed to be enough.” Since childhood, she had relished traveling and collecting small objects, and liked to say that her miniatures were not a hobby but a mania. In 1933, hundreds of thousands of people lined up at the Chicago Century of Progress Exposition to see not some futuristic technology but an exhibit of 30 miniature rooms, imagined, commissioned, and furnished by Thorne. There was a Tudor hall, a Victorian drawing room, a Versailles-esque boudoir with a gilded bathtub. Some of the rooms had windows, through which the scenography of an outside landscape was visible and the light of a miniature sun seeped in. The audience found the realism uncanny, Kay Wells, an art historian at the University of Wisconsin at Milwaukee, told me. Some were so shocked by the view of all that intimate domesticity, they felt like voyeurs. Quite a few compared the rooms to peep shows.

Designed by Narcissa Thorne. E-14: English Drawing Room of the Victorian Period, 1840–70, 1937. Gift of Mrs. James Ward Thorne. (The Art Institute of Chicago.)

Miniatures are often said to be all about control: creating tiny utopias by shrinking what is big and intimidating. “You can control your dollhouse,” Leslie Edelman, the owner of the only dedicated dollhouse store left in New York City, told me, as he showed me a miniature fruit basket so exquisite that the bananas inside of it could actually be peeled. “I mean, the outside world these days is insane!” In Truman Capote’s In Cold Blood, the character who collects miniatures is a frail mother who falls into depression after the birth of her child. Poor woman, I thought when I read the book, making this little world for herself because she can’t handle the real one.

[Read: Dollhouses weren’t invented for play]

Narcissa Thorne, too, wanted to assert control—over the stubborn passage of time and what she saw as the ugliness of modern fads. Art Deco mixed influences from too many places in a pastiche she didn’t fancy. Instead, she liked the “period rooms” that were being added to museums in Detroit, New York, and Chicago to display the prettier interior design of bygone eras. She donated and volunteered at major institutions, but none big enough to accommodate a collection as comprehensive as she would have wished. By making her own compact period rooms, she could display the chronology of European domesticity at a manageable scale.

But miniatures can do more than provide an illusion of control. And perhaps, despite her intentions, Thorne’s rooms did something of the opposite. Great miniatures create the fantasy that they are part of a world that will never fully reveal itself to the viewer. This is the same fantasy, as Stewart observed in On Longing, that animates The Nutcracker, Pinocchio, and other fairy tales in which toys come alive. A reporter at the Chicago Tribune wrote that looking into the rooms made you feel like a Lilliputian in Gulliver’s Travels.

The Thorne rooms exert a power that preserved historic villas and museum period rooms cannot replicate. If a space can be inhabited, then the people inhabiting it can’t escape the presence of EXIT signs, plexiglass barriers, and one another. You always know you’re trapped in the present. You can’t walk into a miniature room, yet it feels somehow much more immersive. Thorne chose not to populate her rooms with tiny people. Ellenor Alcorn, the curator of Applied Arts of Europe at the Art Institute of Chicago, which holds the biggest collection of Thorne rooms, calls that a “really wise” decision. “The absence of figures means that we, as the visitor, become the human element in the room, and bring them to life,” she told me.

The Thorne rooms at the Art Institute remain something of a rarity: miniatures taken very, very seriously by a major American museum. The Metropolitan Museum of Art, in New York City, is home to the world’s biggest collection of American portrait miniatures—including a locket memento of George Washington and “Beauty Revealed,” a miniaturist’s self-portrait, which shows only her breasts—but only about 3 percent are on display at the moment. (A spokesperson for the museum told me these paintings are rotated every few months because they’re sensitive to the light.)

American miniature enthusiasts are used to thinking of their fascination as a quirk. Elle Shushan, who collects and sells 19th-century miniature portraits like those at the Met, told me her circle is “niche but passionate.” Carolyn LeGeyt, a Connecticut retiree who made dollhouses for all the girls in her family—10 nieces and two granddaughters—when they turned 9, told me that her favorite week of the year is when she goes to a summer school run by the International Guild of Miniature Artisans at Maine Maritime Academy. For that one week, she doesn’t need to explain her “love for small things.” (That’s also where she learned to paint and then sand down her dollhouses’ door knobs so that they look worn by use.)

It doesn’t help their reputation for quirkiness that, as a group, American miniaturists are drawn to old-fashioned things. Most American dollhouses are Victorian. The miniature railroad at the Brandywine Museum, near Philadelphia, emerged out of nostalgia for disappearing old trains. This needn’t be the case. In Germany, Miniatur Wunderland replicates Hamburg’s warehouse district. Niklas Weissleder, a young man who works for the museum, told me that curators are getting anxious because many of the city’s cars are now electric, and the tiny cars have not yet been updated to reflect this change.

Not all American miniatures are quaint idylls. Frances Glessner Lee, a contemporary of Narcissa Thorne, created detailed room boxes too, but hers were murder scenes, with blood stains and decomposing bodies. Glessner Lee liked to read Sherlock Holmes stories, donated money to fund the school of legal medicine at Harvard, and hoped the budding detectives there would use the rooms as puzzles to crack in 90-minute sessions. For her contributions, Glessner Lee earned the title of “godmother of forensic science” and became America’s first female police captain.

A few years ago, the Smithsonian Institution’s Renwick Gallery, in Washington, D.C., displayed Glessner Lee’s rooms in “Murder Is Her Hobby,” a three-month exhibition. Nora Atkinson, the show’s curator, told me that it had been a tough sell for her bosses: They were “skeptical that anybody would be interested in sort of dollhouses, as they put it.” She felt there was a sense that the miniature rooms were just a feminine hobby, and not particularly “innovative.” In fact, the exhibition was so popular that the museum extended its hours. (A spokesperson for the Smithsonian told me that the exhibition was part of a series “showcasing women artists” and “challenging the marginalization of creative disciplines traditionally considered feminine.”)

Is there a country in the world where miniatures are more than a strange little pastime? I’m talking about a place that could serve as a site of pilgrimage for miniature-lovers, or a first destination in the event that a team of scholars finally sets out to write the Unified Theory of Miniatures as an Important Category of Artistic Expression.

There are probably quite a few candidates, but I’d submit Japan, where a long tradition honors the fascination with all objects mijika (“close to the body”) or te ni ireru (“that fit in the hand”).

Ayako Yoshimura, now a librarian at the University of Chicago, told me that she doesn’t understand why collecting miniatures is seen as a bit weird in America; it was quite normal in Japan when she was growing up. When she moved to the United States for college, she brought along the miniatures from her childhood and has since kept a drawer for them in every place she has lived. She has all the makings of a miniature Japanese garden, with a fence and an ornamental water basin, but she rarely shows them to anyone.

Scholars I interviewed about the popularity of miniatures in Japan suggested that it might have to do with Japan itself being so small and dense, or with the nation’s tradition of decorative crafts. Yoshimura thinks her fellow Japanese have a “philosophy of concealment”; they are people who like owning little treasures to enjoy in private.

[From the January/February 2017 issue: Big in Japan–tiny food]

In the 1980s, the Korean professor and politician O-Young Lee wrote The Compact Culture, a best-selling book arguing that Japan’s love for small things, such as haiku and netsuke—tiny ivory sculptures concealed inside a kimono’s folds—led to its innovations in small-but-powerful industrial products such as the mighty microchip, and is by extension key to the nation’s economic success. Sushi, one of Japan’s most famous exports, is arguably a miniature—all the ingredients of a big plate, in a single bite.

Japan is also the master of what I believe to be the canonical miniature: bonsai trees, which are microcosms of nature outside nature. Originally from China, the practice of making miniature landscapes was supposed to teach students how to manipulate the elements. Individual pieces were called silent poems. When the art form spread to Japan, it conserved the meaning of an environment subdued. “A tree that is left growing in its natural state is a crude thing,” reads Utsubo monogatari, a 10th-century story. “It is only when it is kept close to human beings who fashion it with loving care that its shape and style acquire the ability to move one.”

There’s something cruel about a desire for control that necessitates trapping a tree with wires, for decades, to stunt its growth and sculpt its shape. Keiichi Fujikawa, a second-generation bonsai artist from Osaka, told me he strives to hide or remove the wires before the trees are exhibited, but that without them the bonsai is not “aesthetically viable.” The wires are the price of beauty. Crucial to the Buddhist belief system, Yukio Lippit, a professor of Japanese art at Harvard, told me, is the idea of “nestedness,” of universes contained infinitely within universes. Miniature trees can remind their beholders of a cosmology in which every small thing holds an entire world.

When I first saw them, in the Brooklyn Botanic Garden, I understood that bonsais are not small trees but enormous ones—all of the complexity is there, simply at a reduced scale. I struggled to define why this effect is so beautiful, but I met an academic who came close: “Bonsais show the respect the artist has for you as a viewer,” Robert Huey, a professor of Japanese literature at the University of Hawaii, told me. The bonsai artist knows that a miniature that simplifies would be an impostor, and bonsais are the opposite of impostors: not just miniature trees but real ones. They take decades to grow; they have leaves that blossom and fall with the seasons, and trunks that get sick and age. Like the trees that grow in the forest, they are fully alive.

The Politics of Pessimism

www.theatlantic.com/ideas/archive/2024/04/the-age-of-grievance-frank-bruni/678127

It had been clear for years that China was rising and rising—building rail lines and airports and skyscrapers at a rate that put the United States to shame, purchasing the favor of poorer countries, filling the world with its wares—when, in April 2014, I happened upon a bit of news. CNBC, citing a “new study from the world’s leading statistical agencies,” reported that China’s rapidly growing economy would rank first in the world, surpassing the United States’, by as soon as the end of the year. Our century-plus reign as the world’s wealthiest nation was over, or about to be. What a run we’d had!

But the study, which used debatable methodology, turned out to be wrong. It interested me less than something else I learned when I began poking around the internet to put it in some sort of context. I discovered that most Americans thought that China already had become our economic superior. And they’d thought that—erroneously—for several years.

In 2011, Gallup polled Americans on the question of whether the United States, China, the European Union, Japan, Russia, or India was the leading economic power in the world. More than 50 percent answered China, while fewer than 35 percent said the United States. Those numbers held when Gallup did the same polling the next year and the next and in 2014, when the portion of Americans choosing China rose to 52 percent and the portion choosing America dipped to 31 percent. That’s a whopping differential, especially considering its wrongness.

China’s economy still lags behind ours, although Americans have been reluctant to recognize that. In 2020, when China was pilloried as the cradle of the coronavirus pandemic, 50 percent of Americans indeed saw our economy as the mightier of the two. But that rediscovered swagger was short-lived. In 2021, 50 percent gave the crown back to China. Last year, Americans saw the economies as essentially tied.  

[From the May 1888 issue: What is pessimism?]

A fundamental misperception of global affairs by Americans isn’t surprising. Too many, if not most, of us are disinclined to look or think beyond our shores. But this particular misperception startled and fascinated me: We’d traditionally been such a confident, even cocky, nation, enamored of our military might (and often too quick to use it), showy with our foreign aid, schooled in stories—true ones—about how desperately foreigners wanted to make new lives here and what extraordinary risks they took to do so. We saw ourselves as peerless, and we spoke a distinctively American vocabulary of infinite possibility, boundless optimism, and better tomorrows.

American dream. American exceptionalism. Land of opportunity. Endless frontier. Manifest destiny. Those were the pretty phrases that I grew up with. We were inventors, expanders, explorers. Putting the first man on the moon wasn’t just a matter of bragging rights—though it was indeed that, and we bragged plenty about it. It was also an act of self-definition, an affirmation of American identity. We stretched the parameters of the navigable universe the way we stretched the parameters of everything else.

That perspective, obviously, was a romanticized one, achieved through a selective reading of the past. It discounted the experiences of many Black Americans. It minimized the degree to which they and other minorities were shut out from all of this inventing and exploring. It mingled self-congratulatory fiction with fact. And it probably imprinted itself more strongly on me than on some of my peers because of my particular family history. My father’s parents were uneducated immigrants who found in the United States exactly what they’d left Southern Italy for: more material comfort, greater economic stability, and a more expansive future for their children, including my father, who got a scholarship to an Ivy League school, went on to earn an M.B.A., and became a senior partner in one of the country’s biggest accounting firms. He put a heated in-ground pool in the backyard. He put me and my three siblings in private schools. He put our mother in a mink. And he pinched himself all the while.

It was nonetheless true that the idea of the United States as an unrivaled engine of social mobility and generator of wealth held sway with many Americans, who expected their children to do better than they’d done and their children’s children to do even better. That was the mythology, anyway. Sure, we hit lows, but we climbed out of them. We suffered doubts, but we snapped back. The tumult of the late 1960s, Richard Nixon’s degradation of the presidency, and the gas lines, international humiliation, and stagflation of Jimmy Carter’s presidency gave way, in 1980, to the election of Ronald Reagan, who declared that it was “morning again in America” and found an abundance of voters eager to welcome that dawn, to reconnect with an optimism that seemed more credibly and fundamentally American than deviations from it.

I don’t detect that optimism around me anymore. In its place is a crisis of confidence, a pervasive sense among most Americans that our best days are behind us, and that our problems are multiplying faster than we can find solutions for them. It’s a violent rupture of our national psyche. It’s a whole new American pessimism.

Well, maybe not entirely new. In Democracy in America, published in 1835, Alexis de Tocqueville noted a perpetually unsatisfied yearning in Americans, who, he wrote, “are forever brooding over advantages they do not possess.” He found Americans unusually attuned to their misfortunes, and that made (and still makes) sense: With big promises come big disappointments. Boundless dreams are bound to be unattainable.

Even in periods of American history that we associate with prosperity and tranquility, like the 1950s, there were rumblings and disenchantment: Rebel Without a Cause, The Man in the Gray Flannel Suit. And the late 1960s and early ’70s were an oxymoronic braid of surging hope for necessary change and certainty that the whole American enterprise was corrupt. There were headstrong and heady demands for dignity, for equality, for justice. There were also cities on fire and assassinations. But the overarching story—the general trend line—of the United States in the second half of the 20th century was progress.

[Read: The patron saint of political violence]

Then, in 2001, the Twin Towers fell. In 2008, the global economy nearly collapsed. By 2012, I noticed that our “shining city on a hill,” to use one of Reagan’s favorite terms for the United States, was enveloped in a fog that wouldn’t lift. In June of that year, Jeb Bush visited Manhattan; had breakfast with several dozen journalists, including me; and mused about the country’s diminished position and fortunes. Perhaps because his political life was then on pause—he’d finished his two terms as Florida governor and his 2016 presidential campaign was still years away—he allowed himself a bluntness that he might not have otherwise. “We’re in very difficult times right now, very different times than we’ve been,” he said, and while that was already more downbeat than mainstream politicians’ usual prognostications, his following words were even darker: “We’re in decline.”

In the years that followed, I paid greater and greater heed to evidence that supported his appraisal, which mirrored my own. I was struck by how tempered and tentative President Barack Obama seemed by the second year of his second term, when he often mulled the smallness, not the largeness, of his place in history, telling David Remnick, the editor of The New Yorker, that each president is just “part of a long-running story. We just try to get our paragraph right.” “Mr. President,” my New York Times colleague Maureen Dowd wrote in response, “I am just trying to get my paragraph right. You need to think bigger.”

Of course, when Obama had thought bigger, he’d bumped up against an American political system that was polarized and paralyzed—one that had turned “hope and change” into tweak and tinker. Obama’s longtime adviser David Axelrod told the Times’ Michael Shear: “I think to pretend that ‘It’s morning in America’ is a misreading of the times.”

That was in 2014, when I registered and explored the revelation that so many Americans thought China was wealthier than we were. Around the same time, I also noticed a long memo by the prominent Democratic political strategist Doug Sosnik in Politico. He observed that for 10 years running, the percentage of Americans who believed that the United States was on the wrong track had exceeded the percentage who thought it was on the right track. “At the core of Americans’ anger and alienation is the belief that the American dream is no longer attainable,” Sosnik wrote. “For the first time in our country’s history, there is more social mobility in Europe than in the United States.”

That “first time” turned out to be no fleeting aberration. Since then, the negative markers have multiplied, and the negative mood has intensified. The fog over our shining city won’t lift. Almost every year from 2000 to the present, the suicide rate has increased. A kind of nihilism has spread, a “rot at the very soul of our nation,” as Mike Allen wrote last year in his Axios newsletter summarizing a Wall Street Journal/NORC poll that charted both the collapse of faith in American institutions and the abandonment of tradition and traditional values. Only 38 percent of respondents said that patriotism was very important, in contrast with 70 percent of respondents from a similar Journal/NBC survey a quarter century earlier, in 1998.

To recognize those dynamics is to understand America’s current politics, in which so many politicians—presidential candidates included—whip up support less by talking about the brightness of the country’s future than by warning of the apocalypse if the other side wins. They’re not clarions of American glory. They’re bulwarks against American ruin.

This essay was adapted from the forthcoming The Age of Grievance.