The Limits of Utopia

This is an edition of Time-Travel Thursdays, a journey through The Atlantic’s archives to contextualize the present and surface delightful treasures.

Some 50 years ago, the architect and writer Peter Blake put himself on trial in the pages of The Atlantic. In a dramatic monologue equal parts polemic and confession, he pled guilty to having once upheld what he had come to see as the false precepts of architectural modernism: the insistence that a building’s design should express its function; the utopian faith in urban planning, giant public-housing towers, and prefabricated houses; even the presumption that cities—in new costumes of glass, steel, and concrete—would be the sites of an improved future civilization. A modernist by training, Blake believed that the movement had failed to produce either a more beautiful or a more equitable world in the postwar decades—and this failure necessitated a reconsideration of modernism’s basic tenets. Did form really follow function, or was that just a shibboleth? “The premises upon which we have almost literally built our world are crumbling,” he wrote, “and our superstructure is crumbling with them.”

The disillusionment had set in gradually. Blake, originally Blach, was born in 1920 to a Jewish family in Berlin. Following the rise of National Socialism, he, his mother, and his father all separately made their way to the United States; the Nazis eventually murdered many of their family members and neighbors. Before deploying in the war, Blake apprenticed as an architect in Philadelphia and began freelancing for Architectural Forum. In New York, the magazine’s headquarters, he became acquainted with the avant-garde: not just architects but painters, writers, furniture designers, and more.

Already there was grumbling about modernism. In 1948, responding to a takedown of the movement by The New Yorker’s architecture critic, Lewis Mumford, the young Blake sat on a Museum of Modern Art panel posing the question “What is happening to modern architecture?” A number of luminaries (all men) presented their case, but the report published in the museum’s bulletin concluded that the problem “remained unsolved.”

The issue became even more pressing in the next two decades as cities embraced programs of “urban renewal.” City officials, attracted by a veneer of novelty and efficiency, turned to modernist structures as a way to rehabilitate deteriorating low-income tracts of land—neighborhoods to which Black tenants were steadily relegated as the postwar federal government focused on subsidizing home ownership for white citizens. Public-housing projects, built on slum land that planners cleared using federal money, became avatars of modern design. (See the “tower in a park” units that became prime targets of Blake’s 1974 polemic.)

After the war, criticism of modernism festered. Mumford found the modernists cold and impersonal; their buildings were too much like machines, neglecting “the feelings, the sentiments, and the interests of the person who was to occupy” them, he wrote. In 1961, in The Death and Life of Great American Cities, Jane Jacobs, Blake’s former colleague at Architectural Forum, accused misguided planners of alienating cities from their “everyday diversity of uses and users.” Her field-upheaving book became the bible for skeptics of urban uniformity. “Does anyone suppose,” she wrote, “that, in real life, answers to any of the great questions that worry us today are going to come out of homogeneous settlements?”

In his 1974 essay for The Atlantic, Blake echoed Jacobs’s preference for density—and especially her disdain for the wide-open plazas that typically accompanied modern corporate skyscrapers. “The one sure way to kill cities,” he wrote, “is to turn their ground floors into great, spacious expanses of nothing.” But he also went further than Jacobs. In the essay’s final section, he wondered whether cities themselves were necessary to the future of humanity. In wealthy countries, he pointed out, developing technologies were rendering “many face-to-face communications unnecessary.” This wasn’t the world he was sure he desired, but in atoning for his generation’s sins, he pushed himself to the rhetorical limit:

Pretty soon the majority of Americans, and of people in other, industrialized nations, will be living in vast suburban tracts … our old downtown areas will become tourist attractions, probably operated by Walt Disney Enterprises, and kept much cleaner and safer and prettier by the Disney people than our present bureaucracies maintain them now.

His hypothetical became only more feverish:

They will become quaint historic sites, like Siena and Carcassonne and the mad castles of Ludwig of Bavaria, visited by suburbanites on package tours conducted by tape-recorded tourist guides. Rockefeller Center and other beauty spots will be viewed as quaint shrines erected by earlier and more primitive civilizations; and the only housing in these vacation spots will be Hilton Hotels or Howard Johnson’s Motor Inns, plus a few ghettos containing workers needed to clean the sidewalks and change the light bulbs.

Blake’s assault on modernism coincided with New York City’s economy teetering on the edge of collapse. The city had indebted itself precariously for years to balance the budget, but its then-mayor, Abe Beame, was running the city’s credit further into the ground with a spree of short-term borrowing. In November 1974, soon after The Atlantic published Blake’s essay, Beame announced the largest round of city-employee layoffs since the Great Depression.

Remarkably, Beame found time to personally respond to Blake. In a letter published in The Atlantic’s November 1974 issue, he expressed exasperation with several of Blake’s arguments. But Beame saved his greatest ire for Blake’s broader pessimism about cities. Electronic technology would never fully replace face-to-face communication, Beame knew from the regular walks he took around his neighborhood. “You can’t get that kind of human contact and enrichment out of a tube!”

Blake’s essay reflected the panicked condition of New York; it also marked the frenzied peak of a decades-long critique of modernism. In the years that followed, the movement’s shortcomings were deployed to justify the demolition of welfare programs, city planning, and (in the most literal sense) public housing. In his attempt to resuscitate New York’s economy, Beame’s successor, Ed Koch, poured money into private development, subsidizing the construction of luxury apartment buildings and corporate high-rises, some of which became New York’s classically “postmodern” structures.

More recently, some politicians in New York State have been debating legislation they hope will spark a construction boom akin to that of the modernist postwar decades; one recent bill proposes the creation of a “social housing” authority that would prioritize affordable units. In New York City, the linked crises of housing and homelessness are as pressing as ever, and many of the questions Blake and Jacobs wrestled with remain: Is more housing supply the way out? If so, who will build it? If private developers, can Americans trust them with our tax dollars?

Lingering as well is the question contained in the arc of Blake’s career: What does one find after turning away from the vision of an ideal city? In a memoir near the end of his life, Blake wrote fondly, if apprehensively, of the political idealism of the 1930s and 1940s, reserving his criticism for the excesses of corporate capitalism (to which some modern architects, he believed, had fallen prey) and authoritarianism (which he had come to see, in postwar-liberal fashion, as a symptom of idealism itself). By the end of his career, Blake was more than prepared to forfeit the dream of a perfectly built world in favor of reality’s chaotic and diverse one. He often invoked this paraphrase of Mumford: “Life is really more interesting than utopia.”

Who Really Has Brain Worms?

Earlier today, The New York Times broke some startling news about a presidential candidate. According to a 2012 deposition, Robert F. Kennedy Jr. once suffered from, in his own words, “a worm that got into my brain and ate a portion of it and then died.” The vague yet alarming description could apply to any number of parasitic ailments, among them angiostrongyliasis, baylisascariasis, toxocariasis, strongyloidiasis, and trichinosis. But some experts immediately suspected a condition called neurocysticercosis (NCC), in which the larvae of the pork tapeworm Taenia solium post up in the brain.

The condition might sound terrifying—and, to some observers, darkly hilarious. Literal brain worms! But it does not actually involve any brain-munching, or even a typical worm. The brain-invading culprit is instead the larval form of a tapeworm (a parasitic flatworm, or helminth) that ordinarily makes its home in pigs. As far as parasitic infections go, this is “the most common one in the brain,” Laila Woc-Colburn, an infectious-disease physician at Emory University, told me. And globally, it’s one of the most common causes of epilepsy in adults.

NCC typically begins after people have been exposed to feces that contain the eggs of a pork tapeworm, say, while on a pig farm or while handling uncooked, contaminated food. After the eggs are swallowed, they hatch into larvae in the gut. Because people aren’t the appropriate host for the young tapeworms, the larvae end up on a fruitless journey, meandering through the body in a desperate attempt to find pig muscle. A common final destination is the brain, where they enclose themselves in cysts in the hope of maturing; eventually, unable to complete their life cycle, they die, leaving behind little more than a calcified nub.

This is, to put it scientifically, some pretty gnarly stuff. But many cases are “completely asymptomatic,” Boghuma Kabisen Titanji, also an infectious-disease physician at Emory University, told me. In other people, though—especially those with a lot of larval cysts—the presence of the foreign invaders can spark a wave of inflammation, which in turn triggers swelling and tissue destruction. Individuals with cysts in their brain may develop headaches or seizures, though those problems can take years or even decades to manifest, Titanji said.

Experts estimate that millions of people may be afflicted with NCC worldwide, most of them concentrated in Latin America, sub-Saharan Africa, East Asia, and India. In the U.S., though, NCC is rather rare, with just a few thousand diagnoses made each year, many of them related to travel or immigration. “This is a disease of poverty,” Woc-Colburn told me. Which would make the multimillionaire Kennedy—if he had the infection at all—“an atypical patient.”

There is, at least, some comforting news. NCC is pretty easily preventable with solid hand-washing habits. And in the U.S., where CT scans are fairly accessible, “it can be diagnosed very easily,” Woc-Colburn said, particularly once doctors have a good sense of a patient’s exposure history. Doctors generally know to look for it in patients who come in with headaches and seizures. (Kennedy first sought help after experiencing memory loss and mental fogginess, though he recently told the Times that those symptoms have since resolved and that he hadn’t received treatment for the parasite.) The infection is also treatable with standard antiparasitics. And caught early, it isn’t expected to leave lingering damage. In more serious cases, though, years of severe, unmanaged seizures can lead to cognitive deficits.

None of this is to say that Kennedy definitely had NCC. All the public knows is that, in 2010, he said that he was battling neurological symptoms, and that an unusual blemish appeared on a brain scan. (The memory loss and mental fogginess may very well have been attributable to mercury poisoning from Kennedy’s diet at the time, which was high in tuna and perch, according to the same 2012 deposition.) Even if a parasite were definitely to blame, “at least six or seven” other parasites could have ended up in his brain, Titanji told me. Like the pork-tapeworm larvae, several of them would have ended up there accidentally, only to die a quick death without gulping down any brain tissue.

The most comforting news about NCC is that—again—it is uncommon in the United States. Still, now that this news has broken, Woc-Colburn worries that her clinic is going to fill up with people who think they’re afflicted. Given the odds, many of them will be wrong. If anyone’s really worried about their gray matter becoming lunch, they shouldn’t fear worms, but Naegleria fowleri, a rare amoeba that camps out in warm bodies of water. That one, I regret to report, really does eat your brain.