The Race-Blind College-Admissions Era Is Off to a Weird Start

The Atlantic

www.theatlantic.com › ideas › archive › 2025 › 02 › affirmative-action-yale-admissions › 681541

When colleges began announcing the makeup of their incoming freshman classes last year—the first admissions cycle since the Supreme Court outlawed affirmative action—there seemed to have been some kind of mistake. The Court’s ruling in Students for Fair Admissions v. Harvard had been almost universally expected to produce big changes. Elite universities warned of a return to diversity levels not seen since the early 1960s, when their college classes had only a handful of Black students.

And yet, when the numbers came in, several of the most selective colleges in the country reported the opposite results. Yale, Dartmouth, Northwestern, the University of Virginia, Wesleyan, Williams, and Bowdoin all ended up enrolling more Black or Latino students, or both. Princeton and Duke appear to have kept their demographics basically stable.

These surprising results raise two competing possibilities. One is that top universities can preserve racial diversity without taking race directly into account in admissions. The other, favored by the coalition that successfully challenged affirmative action in court, is that at least some of the schools are simply ignoring the Supreme Court’s ruling—that they are, in other words, cheating. Finding out the truth will likely require litigation that could drag on for years. Although affirmative action was outlawed in 2023, the war over the use of race in college admissions is far from over.

History strongly suggested that the end of affirmative action would be disastrous for diversity in elite higher education. (Most American colleges accept most applicants and therefore didn’t use affirmative action in the first place.) In the states that had already banned the practice for public universities, the share of Black and Latino students enrolled at the most selective flagship campuses immediately plummeted. At UC Berkeley, for example, underrepresented minorities made up 22 percent of the freshman class in 1997. In 1998, after California passed its affirmative-action ban, that number fell to 11 percent. Many of these schools eventually saw a partial rebound, but not enough to restore their previous demographic balance.

Something similar happened at many selective schools in the aftermath of the Supreme Court’s 2023 ruling. At Harvard and MIT, for example, Black enrollment fell by more than 28 and 60 percent, respectively, compared with the average of the two years prior to the Court’s decision. But quite a few institutions defied expectations. At Yale, Black and Latino enrollment increased, while Asian American enrollment fell by 16 percent compared with recent years. Northwestern similarly saw its Black and Latino populations increase by more than 10 percent, while Asian and white enrollment declined. (In Students for Fair Admissions, the Court had found that Harvard’s race-conscious admissions policies discriminated against Asian applicants.)

[Rose Horowitch: The perverse consequences of tuition-free medical school]

Figuring out how this happened is not easy. Universities have always been cagey about how they choose to admit students; the secrecy ostensibly prevents students from trying to game the process. (It also prevents embarrassment: When details have come out, usually through litigation, they have typically not been flattering.) Now, with elite-college admissions under more scrutiny than usual, they’re even more wary of saying too much. When I asked universities for further details about their response to the ruling, Dartmouth, Bowdoin, and Williams declined to comment, Yale and Northwestern pointed me toward their vague public statements, and a Princeton spokesperson said that “now race plays no role in admissions decisions.” Duke did not reply to requests for comment.

The information gap has led outside observers to piece together theories with the data they do have. One possibility is that universities such as Yale and Princeton are taking advantage of some wiggle room in the Supreme Court’s ruling. “Nothing in this opinion should be construed as prohibiting universities from considering an applicant’s discussion of how race affected his or her life, be it through discrimination, inspiration, or otherwise,” Chief Justice John Roberts wrote in his majority opinion. This seemed to provide an indirect way to preserve race-consciousness in admissions. “It’s still legal to pursue diversity,” Sonja Starr, a law professor at the University of Chicago, told me. Her research shows that 43 of the 65 top-ranked universities have essay prompts that ask applicants about their identity or adversity; eight made the addition after the Court’s decision.

Another theory is that universities have figured out how to indirectly preserve racial diversity by focusing on socioeconomic status rather than race itself. In 2024, Yale’s admissions department began factoring in data from the Opportunity Atlas, a project run by researchers at Harvard and the U.S. Census Bureau that measures the upward mobility of children who grew up in a given neighborhood. It also increased recruitment and outreach in low-income areas. Similarly, Princeton announced that it would aim to increase its share of students who are eligible for financial aid. “In the changed legal environment, the University’s greatest opportunity to attract diverse talent pertains to socioeconomic diversity,” a committee designed to review race-neutral admissions policies at the college wrote.

Some evidence supports the “socioeconomics, not race” theory. Dartmouth announced that it had increased its share of low-income students eligible for federal Pell grants by five percentage points. Yale has said that last year’s incoming freshman class would have the greatest share of first-generation and low-income students in the university’s history. Richard Kahlenberg, a longtime proponent of class-based affirmative action who testified on behalf of the plaintiffs challenging Harvard’s admissions policies, told me that, by increasing economic diversity as a proxy for race, elite colleges have brought in the low-income students of color whom purely race-based affirmative action had long allowed them to overlook. (In recent years, almost three-quarters of the Black and Hispanic students at Harvard came from the wealthiest 20 percent of those populations nationally.) “While universities had been claiming that racial preferences were the only way they could create racial diversity, in fact, if we assume good faith on the part of the universities, they have found ways to achieve racial diversity without racial preferences,” Kahlenberg said.

[Richard Kahlenberg: The affirmative action that colleges really need]

If we assume good faith—that’s a big caveat. Not everyone is prepared to give universities the benefit of the doubt. Edward Blum, the president of Students for Fair Admissions, the plaintiff in the case that ended affirmative action, has already accused Yale, Princeton, and Duke of cheating. And Richard Sander, a law professor at UCLA and a critic of affirmative action, said that if a university’s Black enrollment numbers are still above 10 percent, “then I don’t think there’s any question that they’re engaged in illegal use of preferences.”

The skeptics’ best evidence is the fact that the universities accused of breaking the rules haven’t fully explained how they got their results. Yale, for example, has touted its use of the Opportunity Atlas, but hasn’t shared how it factors information from the tool into admissions decisions. Before the Court’s ruling, a Black student was four times more likely to get into Harvard than a white student with comparable scores, and a Latino applicant about twice as likely.

To keep numbers stable, race-neutral alternatives would have to provide a comparable boost. According to simulations presented to the Supreme Court, universities would have to eliminate legacy and donor preferences and slightly lower their average SAT scores to keep demographics constant without considering race. (In oral arguments, one lawyer compared the change in test scores to moving “from Harvard to Dartmouth.”) With minor exceptions, selective universities have given no indication that they’ve made either of those changes.

Even the data that exist are not totally straightforward to interpret. Some universities have reported an uptick in the percentage of students who chose not to report their race in their application. If that group skews white and Asian, as research suggests it does, then the reported share of Black and Latino students could be artificially inflated. And then there’s the question of how many students choose to accept a university’s offer of admission, which schools have little control over. Wesleyan, for example, accepted fewer Black applicants than it had in prior years, Michael Roth, the university’s president, told me. But a larger share chose to matriculate—possibly, Roth said, because even-more-selective schools had rejected them. The University of Virginia similarly had an unusually high yield among Black students, according to Greg Roberts, its dean of admissions. He couldn’t tell whether this was thanks to the school’s outreach efforts or just a coincidence. “I think what we’re doing is important, but to the extent it will consistently impact what the class looks like, I have no idea,” he told me. (Both Roth and Roberts, the only university administrators who agreed to be interviewed for this article, assured me that their institutions had obeyed the Court’s ruling.)
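
One way to see the distortion is a toy calculation (the numbers here are invented for illustration, not drawn from any university's data): when the share of Black and Latino students is reported only among students who disclosed their race, a growing "unknown" group that skews white and Asian mechanically pushes the reported share up.

```python
# Toy illustration with invented numbers: how a growing "race unknown"
# group can inflate the *reported* share of Black and Latino students
# even when the underlying headcount stays the same.

class_size = 1000
black_latino = 200  # actual headcount, held constant in both scenarios

for unknown in (50, 150):           # students declining to report race
    reporting = class_size - unknown
    share = black_latino / reporting
    print(f"{unknown} unknown -> reported share {share:.1%}")

# Output:
# 50 unknown -> reported share 21.1%
# 150 unknown -> reported share 23.5%
```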

None of those alternative explanations is likely to sway the people who are convinced the schools cheated. With Donald Trump back in office, colleges that don’t see a meaningful uptick in Asian enrollees will likely face civil-rights investigations, says Josh Dunn, a law professor at the University of Tennessee at Knoxville. “If everything ends up looking exactly like it did prior to SFFA,” he told me, then the courts will “probably think that the schools were not trying to comply in good faith.”

Blum, the head of Students for Fair Admissions, has already threatened to sue Yale, Princeton, and Duke if they don’t release numbers proving to his satisfaction that they have complied with the law. (Blum declined to be interviewed for this article.) A new lawsuit could force universities to turn over their admissions data, which should reveal what’s really going on. It could also invite the Court to weigh in on new questions, including the legality of race-neutral alternatives to affirmative action that are adopted with racial diversity in mind. A resolution to any of these issues would take years to arrive.

In many ways, the endless fight over affirmative action is a proxy for the battle over what uber-selective universities are for. Institutions such as Harvard and Yale have long been torn between conflicting aims: on the one hand, creating the next generation of leaders out of the most accomplished applicants; on the other, serving as engines of social mobility for promising students with few opportunities. It will take much more than the legal demise of affirmative action to put that debate to rest.

Legal Weed Didn’t Deliver on Its Promises

The Atlantic

www.theatlantic.com › ideas › archive › 2025 › 01 › marijuana-legalization-drawbacks › 681519

In 2012, Colorado and Washington State legalized the commercial production and sale of cannabis for nonmedical use, and since then 22 other U.S. states have followed. The shift was viewed in many quarters as benign and overdue—involving an organic, even medicinal, intoxicant with no serious drawbacks. Advocates promised safe and accurately labeled products, reduced addiction to opioids, smaller prison populations, surging tax revenue, and a socially responsible industry that prioritized people over profits. But all of those promises have turned out to be overstated or simply wrong.  

Legalization has raised cannabis consumption dramatically, and also altered patterns of use. In the 1990s and early 2000s, most consumers smoked the drug and did so only occasionally or semi-regularly—say, on weekends with friends. Some people used more regularly, of course: In 2000, 2.5 million Americans reported daily or near-daily cannabis use. But by 2022, that had grown sevenfold to 17.7 million. Remarkably, that’s more than the 14.7 million who reported using alcohol that often. Today, more than 40 percent of Americans who use cannabis take it daily or near-daily, and these users consume perhaps 80 percent of all the cannabis sold in the U.S.

The drug’s potency has also risen sharply. Until the year 2000, the average potency of seized cannabis never exceeded 5 percent THC, the principal intoxicant in the plant. Today, smokeable buds, or flower, sold in licensed stores usually exceed 20 percent THC. Vapes, dabs, and shatter—all of which are forms of drug delivery that commercialization spread—are more potent still.

[Malcolm Ferguson: Marijuana is too strong now]

More frequent use of more potent products has led to a staggering rise in the typical consumer’s average weekly dose of THC. Back in the 1980s and ’90s, when potency averaged about 4 percent, someone consuming one 0.4-gram joint each weekend night—and none on weekdays—was averaging roughly 32 milligrams of THC a week. Average daily users today are consuming about 1.6 grams of high-potency flower a day, or its equivalent in other forms. That works out to more than 2,000 milligrams of THC a week—or about 70 times as much.
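
For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python using only the figures cited above; the 20 percent potency for today's flower comes from the earlier discussion of dispensary products and is an assumption, not a measurement.

```python
# Back-of-the-envelope check of the weekly THC figures in the text.

MG_PER_GRAM = 1000

# 1980s-'90s pattern: one 0.4 g joint at ~4% THC on each of two weekend nights
then_mg_per_week = 0.4 * MG_PER_GRAM * 0.04 * 2      # 32 mg THC per week

# Today's daily pattern: ~1.6 g of ~20% THC flower (or equivalent) every day
now_mg_per_week = 1.6 * MG_PER_GRAM * 0.20 * 7       # 2,240 mg THC per week

print(then_mg_per_week, now_mg_per_week, now_mg_per_week / then_mg_per_week)
# 32.0 2240.0 70.0
```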

The numbers are shocking, and yet this is what happens when frequency, potency, and quantity all rise in tandem. For some consumers, high potency itself encourages more frequent use by delivering a stronger effect.

Medical science can’t yet clarify the effects of long-term use of 300-plus milligrams of THC a day, because this consumption pattern is new. Most controlled studies work with short-term exposure to smaller doses, often in the 20-to-50 milligram range, and observational studies that followed users for years were examining a drug—low-strength, infrequently used cannabis—that barely exists anymore.

But high-frequency use of high-potency marijuana raises a range of concerns. For one thing, there is little question that cannabis intoxication can impair cognitive functions including concentration and memory formation. That was not a big worry when most people used only on weekends. Daily use, however, means using on work and school days. The drug also impairs perception and motor control; the availability of strong, legal marijuana has been followed by increases in automobile crashes and emergency-room visits.

And over the long term, although some people can handle a wake-and-bake lifestyle, just as some people with alcohol addiction remain functional, there are likely millions of users for whom couch lock impedes career advancement, academic success, or the ability to meet family responsibilities.

On surveys, 63 percent of high-frequency users report enough cognitive, emotional, employment, and social problems as a result of using the drug to be coded as meeting the criteria for a cannabis-use disorder (a condition defined by being unable to fully control drug-use behavior despite its negative consequences). For technical reasons, we think that figure overstates the problem, but there is no doubt that the problem exists: 17 percent of high-frequency users report wanting cannabis so badly that there are times they can’t think of anything else. Chronic use may lead to other health problems as well. Most notably, evidence is mounting that frequent use of high-strength products raises the risk of serious mental illnesses such as schizophrenia.  

To be clear, these risks and harms do not remotely add up to a “cannabis crisis” in the same way that we speak of an opioid crisis or a meth crisis—calamities marked by widespread premature mortality and shattered families. Many people who enjoy cannabis have no trouble managing their use. They can now buy it cheaply and without stigma, in a variety of forms. And everyone can be relieved that adolescents’ cannabis use has stayed roughly where it was since legalization began. By one standard measure, use rose just 3 percent among 12-to-17-year-olds from 2012 to 2022.

But use has soared for adults (up 155 percent), especially for those 35 and older (up 300 percent), and the increase cannot be characterized as entirely benign. Many assumptions made about what would follow legalization seem naive in retrospect.

Those assumptions extended beyond the nature of the relationship between pot users and the drug, and how it might change. Soaring cannabis use would still have been a win, from public-health and crime-control perspectives, if it had resulted in less use of even more dangerous drugs. But it hasn’t. Predictions that cannabis legalization would reduce consumption of alcohol, a drug much more strongly associated with physical aggression, were not realized; reductions observed in some groups or contexts were offset by increases in others.

Based on weak scientific evidence, many advocates likewise promised that legal cannabis would lead people to use fewer opioids. (Weedmaps—an online review site for pot—put up billboards all over the country promising reductions in opioid overdose, for instance.) Yet those early findings were reversed as more data became available, and recent reviews suggest that legalization is more likely to increase than reduce opioid-death rates. This should not be too surprising: Although the old “gateway drug” arguments of the 1970s and ’80s overstated the risk of merely trying marijuana, the commercialization of cannabis has clearly expanded high-frequency use, and dependence on any drug can increase the likelihood of using and developing dependence on other drugs.

[Read: Marijuana’s health effects are about to get a whole lot clearer]

Some promised criminal-justice benefits have also proved illusory, in part because advocates exaggerated the extent to which marijuana use entangled people in the criminal-justice system. “Discriminatory enforcement of marijuana laws is one reason that black and Latino Americans make up two-thirds of the U.S. prison population,” the progressive Center for American Progress noted in 2018, in a report advocating national legalization. But even before legalization, very few people were in prison for pot possession alone. There were a lot of pot-smoking burglars and robbers behind bars, but only about 2 percent of inmates were in prison solely for marijuana offenses, and most of those were traffickers or their employees.

That there had been too many marijuana-possession arrests is undoubtedly true. And legalization has cut them sharply, leaving mostly arrests of underage users and of residual illegal suppliers. But even here, the case for outright legalization of supply was oversold: States that merely decriminalized marijuana possession saw declines almost as large. In California, for example, converting marijuana possession from a misdemeanor to a civil infraction reduced possession arrests by 86 percent in just 12 months. Subsequent legalization had only a modest incremental effect.

Allowing commercial supply—as opposed to merely decriminalizing possession—has produced other unintended consequences, though these consequences could easily have been anticipated because businesses typically follow the laws of economics.    

Large producers run by MBAs have adopted industrial agricultural practices that are brutally efficient, dramatically outcompeting the artisanal production that many advocates foresaw. Before legalization, much high-quality cannabis was grown in small indoor facilities; one 2006 Dutch study of 77 illegal grows reported an average size smaller than 200 square feet. Now an average-size commercial grow might operate on 10,000 to 20,000 square feet, and an industry magazine lists one producer (Copperstate Farms) as operating almost 2,000,000 square feet of greenhouse grow space; mixed-mode growers are even larger.

Commercial production has driven down prices, and so the cannabis tax windfall touted by many supporters of legalization has also been underwhelming. In California, cannabis excise and sales taxes peaked in 2021; by the first quarter of 2023, they were reported as accounting for only 0.2 percent of total state tax collections. Not all taxes due even get collected; in 2023, for instance, 15 percent of the state’s cannabis firms defaulted on taxes they owed.  

Falling prices have thinned profit margins, adding to the commercial imperative to expand the market and attract new customers. Hence the proliferation of edibles and other products that are more accessible to nonsmokers. The industry is targeting women—who historically used cannabis less than men did—as a growth demographic, just as the cigarette and alcohol industries had before. From 2012 to 2022, high-frequency use grew strongly for men (up 137 percent), but exploded among women (up 300 percent).

Many commercial cannabis providers have proved difficult to regulate. Initially, regulatory enforcement efforts tended to be modest, and that was an error. Misleading labels are commonplace in the cannabis industry today, and some producers use unapproved pesticides or exploitative labor arrangements.  

The 2018 Farm Bill created further opportunities for bad behavior. The bill was supposed to legalize nonintoxicating uses of the cannabis plant, such as growing fiber for clothes or seed for food and oil. Unfortunately, loopholes let unscrupulous actors sell intoxicating products completely outside of most states’ regulatory systems. The Farm Bill permits the production and sale of “hemp”—defined as any cannabis product containing less than 0.3 percent of delta-9 THC, the primary THC variant in cannabis. But edibles, being relatively heavy, can contain a lot of delta-9 THC and still, by weight, remain under the 0.3 percent threshold. What’s more, the marijuana plant contains nonintoxicating cannabinoids that can be chemically transformed into intoxicating cousins such as delta-8 THC. The resulting array of products, which can appeal to youth, may have no labeling requirements (depending on what state they’re being sold in) and no protection against unfamiliar and potentially dangerous synthetic by-products. They may not have been tested for pesticides either.
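
A rough illustration of why the weight-based definition is so easy to exploit (the 10-gram gummy here is a hypothetical product, not one named in the bill): the heavier the edible, the more delta-9 THC it can carry while staying under 0.3 percent.

```python
# Hypothetical example: a heavy edible can stay under the 0.3%-by-weight
# "hemp" threshold while still delivering an intoxicating THC dose.

gummy_weight_mg = 10_000      # a 10 g gummy (hypothetical)
threshold = 0.003             # 0.3% delta-9 THC by weight

max_delta9_mg = gummy_weight_mg * threshold
print(max_delta9_mg)          # 30.0 mg, i.e., several typical 5-10 mg
                              # servings, yet still "hemp" by this definition
```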

Unsurprisingly, hemp producers who do not follow product-safety rules have in many cases been outcompeting those state-licensed cannabis companies that try to follow the regulations, contributing to high cannabis-business failure rates and less reliable products for consumers.

These ills and others—the sprouting of cannabis shops on seemingly every block in some city neighborhoods, the smell of pot that greets many riders of public transportation—have not gone unnoticed by the American people. The election in November underscored the degree of disappointment with the results of marijuana legalization. Though Nebraska did become the 39th state to approve the drug for medical purposes, North and South Dakotans voted down ballot initiatives to legalize recreational use. Floridians did the same—despite $150 million in campaign spending by the industry and an endorsement from Donald Trump.

This pause in what had seemed an inexorable movement toward wider—and eventually national—legalization is healthy. Leaping all the way from prohibition to the enthusiastic embrace of a for-profit, freewheeling, corporate cannabis industry has clearly created downsides and excesses that legalization advocates did not initially imagine (or, in some cases, admit). States still considering legalization—and those that may be reconsidering how legalization has worked out for them so far—would be wise to instead explore the ample middle ground, or what the late drug-policy expert Mark Kleiman called a “grudging toleration” of legal use and supply. Even a society that otherwise embraces free-market capitalism should be open to middle paths for addiction-inducing intoxicants, which are not ordinary commodities.  

[Read: I don’t want to smell you get high]

What might grudging toleration look like in practice? In addition to eliminating the Farm Bill loopholes that have contributed to a Wild West environment in many places, we would offer four specific suggestions.   

1. Restrain the power of large-scale producers.

The cannabis supply chain spans growers, manufacturers who process and package the plant material, and retailers. Regulation is needed for farmers (concerning which pesticides are allowed, for instance) and retailers (testing compliance with laws blocking sale to minors, for example)—but the bigger challenges involve the manufacturers who produce the concentrated products, control the brands, and dominate marketing and advertising. Two remedies can help.

First, there is no reason to allow for-profit corporations to participate in product manufacturing. For-profit businesses are fabulously efficient at developing new products and driving up consumption. That’s fine when the product is cornflakes or canola, but not when it involves addictive drugs. Cannabis is an addiction-inducing intoxicant for which rapidly expanding consumption has significant costs.  

Instead, legalization could restrict cannabis-product manufacturing to nonprofits or public-benefit corporations. Reliance on nonprofits is a norm in some other industries providing goods or services that in one way or another involve issues beyond pure commerce. Most hospitals and universities, for instance, are either nonprofit or government-owned. In the cannabis industry, these organizations could be chartered to undercut illegal supply by producing to meet existing demand, without promoting greater consumption.

Second, especially in places where for-profit manufacturers are still permitted, major manufacturers should be barred from owning, operating, or controlling either farms or retail outlets. Similar restrictions were part of many states’ plans when alcohol prohibition was repealed, and they might limit big corporations’ power—including their lobbying power. Likewise, they should be barred from merging with tobacco and alcohol companies.

2. Curtail high-potency products.

For many, the purpose of legalization was to replace the illegal market with legal, regulated supply. But legalization has also changed the market, bringing in a slew of more potent products. Drug-reform advocates sometimes invoke the so-called iron law of prohibition, which claims that prohibition begets more potent forms (because, being more compact, they are more easily hidden). But with cannabis, the opposite happened: It was legalization that spread higher-potency forms of the drug.  

Whether cheap, higher-potency products necessarily exacerbate health harms is much debated. But history is replete with examples of inexpensive, high-potency forms of drugs creating new problems, from the British Gin Craze of the first half of the 18th century in London (during which consumption increased eightfold to about one gallon per person per year) to the current fentanyl epidemic, which has killed more Americans than heroin ever did.

Some fear that banning higher-potency products will create or greatly expand illegal markets, but modern societies often ban certain forms of a product without creating big illegal markets—as long as other forms remain legal. For instance, throughout much of the 20th century, many countries banned the sale of absinthe, but there was no big illegal market for absinthe because other liquors were available. Likewise, today’s bans on caffeinated alcoholic drinks and flavored cigarettes are mostly honored.

Quebec already essentially bans dabs, butane hash oil, and other high-potency products, and it has considerably less cannabis use than other provinces of Canada.    

The U.S. should likewise ban such products, and maybe also the synthesis of artificial cannabinoids. And for products that stay within the potency limit, a further safeguard could be taxing more potent products at a higher rate, just as is done with alcoholic beverages.

3. Leave room for small-scale producers and other small businesses.

The primary challenge to public health does not come from the many small artisanal producers of marijuana, or from retailers. The greater problem is Big Marijuana. Applying the same rules to all parties burdens hobbyists and boutique producers while letting corporations run amok.

Most states have enacted cottage-food-production laws that exempt small-scale producers of craft-food products (baked goods, pickles, honey, etc.) from the strict scrutiny that is appropriate for agribusiness and food conglomerates. Cannabis policy could make similar distinctions.  

Small growers might be exempted from certain regulations when selling only flower—and also prohibited from selling refined or dangerous products, just as cottage-food producers are usually banned from selling meats or goods that need to be refrigerated.

4. Get public safety and public health off the sidelines.

In order to limit the damage done by legal addictive products, society needs effective public-health regulation. And in order to thrive, licensed legal industries need government enforcement against illegal suppliers. Both of these necessities have been lacking.

Neither regulators nor police have attacked illegal production, promotion, and sale with sufficient vigor, perhaps because any enforcement involving marijuana has become entangled—at least in the minds of many progressives—with concerns about the carceral state, or anti-police sentiment more generally.

[Jane C. Hu: Almost no one is happy with legal weed]

But when the legal risk of, say, operating an unlicensed weed shop drops to near-zero, the illegal industry grows and the legal industry suffers, undermining the legalization regime and tax revenue at the same time. Enforcement agencies, those who oversee them, and the activist community need to shake off the misperception that enforcing the rules of a legal industry is a revival of the War on Drugs.

Similarly, public-health departments must start informing the public more vigorously about the health risks of cannabis, just as they do those of tobacco, alcohol, and gambling. Thus far, they have generally failed to do this, perhaps because of misplaced fears of reenacting hysterias of prior eras. Today, a big, legal industry is selling a product with established health risks, and public health needs to embrace its traditional role as an advocate for health over profit.  

These and other reforms would better balance the trade-offs between profit and public interest. Naturally, the industry will fight them, but this should only increase urgency. Naive and self-serving advocates shaped (and, via the initiative process, sometimes wrote) many state-level legalization bills, with results that should trouble us. Legalization should be redesigned where it already exists, and efforts to expand it to other states or nationally should learn from the mistakes of the recent past.

What the Fires Revealed About Los Angeles Culture

The Atlantic

www.theatlantic.com › culture › archive › 2025 › 01 › los-angeles-wildfires-infrastructure › 681428

When wildfires broke out across Los Angeles earlier this month, many publications began to frame the incalculable tragedy through the lens of celebrity news. As flames engulfed the Palisades, a wealthy neighborhood perched along the Pacific Coast Highway, a steady influx of reports announced the growing list of stars who’d lost their homes: Paris Hilton. Billy Crystal. Rosie O’Donnell. These dispatches from celebrity evacuees have broadcast the scale and intractability of the damage, underscoring something most Southern Californians already know to be true: No one, not even the rich and famous, is safe from the danger of wildfires. “This loss is immeasurable,” the TV host Ricki Lake said in an Instagram post about her home burning. “I grieve along with all of those suffering during this apocalyptic event.”

In the most basic sense, the wildfires can be understood as equalizing. An ember doesn’t choose its path based on property value or paparazzi presence, and when one part of Los Angeles burns, foreboding smoke hangs over the whole metro area. Secluded neighborhoods like the Pacific Palisades, where multimillion-dollar houses overlook the ocean, typically have far fewer evacuation routes than urban areas do. But as fires continue to ravage the area, the blazes also reflect—and exacerbate—the disparities embedded in the most mundane tenets of L.A. life. In Southern California, sights as common as a crowded freeway help explain why wildfires have become a universal threat—and why some Angelenos are less equipped than others to recover from the devastation those fires cause.

Like other extreme-weather events, wildfires are now more common and more difficult to protect against, because of climate change. The state has made some inroads in addressing greenhouse-gas emissions, which drive extreme temperatures and drought, but one of the greatest accelerants is practically synonymous with California itself. Car culture not only undermines efforts to reduce the toxic pollution that fuels climate change—it also relies on infrastructure that creates and deepens drastic inequalities among the communities that live with the consequences of climate change. Modern Los Angeles depends on cars partly because of its sprawling geography, Anastasia Loukaitou-Sideris, an urban-planning professor and the interim dean of UCLA’s Luskin School of Public Affairs, explained to me. Yet these smog-producing cars became so central to Southern California life because of “transportation policy that has quite favored the automobile and given a tremendous amount of investment to build the freeways,” Loukaitou-Sideris said.

[Read: The GoFundMe fires]

In moments of tragedy or upheaval, not all Angelenos can take their freedom of mobility for granted, in part because of how Southern California infrastructure has developed over the past century. The multilane highways that now crisscross the area were first laid out in the late 1930s, not long after the idea of L.A. as “the city built for the automobile” emerged as a political campaign. (In the ’20s, an extensive transit network stretching from Venice well into the Inland Empire was the world’s largest electric-railway system; by the early ’60s, it had been completely dismantled to make room for freeways and buses.) Through the tail end of the 20th century, lawmakers prioritized suburban growth, enabled by car-friendly streets and expressways. Meanwhile, transit systems in urban areas—the ones that connect people in dense locations—received comparatively little funding. In the past decade, more funding has gone toward buses and rail systems, but ridership has decreased—in part because rising housing costs in transit-friendly neighborhoods have pushed out the low-income residents most likely to rely on transit.

Beyond favoring only people with cars, these freeway networks created further social stratification. Developers often chose to place major highways in low-income areas because wealthy, and often white, homeowners lobbied against their own neighborhoods being disrupted. In their research, Loukaitou-Sideris and her colleagues traced the historical impacts of several L.A. County and Bay Area freeways built during the 1960s and ’70s. For many Californians, these roads represented freedom of movement. But researchers found that their construction had—and still has—incredibly damaging effects on the (often poor and/or Black) neighborhoods they run through. Californians in communities of color are typically not the most frequent drivers, but they live with the highest concentration of vehicle emissions—and traffic-related pollution compounds the health risks of inhaling wildfire smoke.

Because so many displaced residents need shelter, some landlords and real-estate agents are now attempting to list apartments with sky-high rents, despite state laws against price gouging after disasters. The rise of this illegal exploitation points to a sobering reality: For many Californians, the onset of a destructive wildfire is an economic catastrophe, too. That’s part of why Rachel Morello-Frosch, an environmental-health scientist and a professor at UC Berkeley, insists that evacuation maps alone don’t tell a complete story. She referred to what she and her colleagues have called “the climate gap”: how extreme-weather events disproportionately affect communities of color and those that are poor, underinsured, and underinvested. One of the most brutal fires hit Altadena, an unincorporated town north of Pasadena where people of color sought refuge from racist housing policies, and where the percentage of Black homeowners eclipses that of other parts of the metro area. Restoring Altadena, and preserving its Black and Latino residents’ connections to the place where they’ve built a distinct cultural history, will undoubtedly be a complicated task.

Federal support for California’s efforts to prevent future wildfires is uncertain under the new administration—President Donald Trump has already signed several executive orders that undo climate regulations. During his first term, Trump reportedly refused to give disaster aid to California on partisan grounds—and changed his mind only when informed that a heavily Republican area had been affected by wildfires. Prior to Trump being sworn in for a second term on Monday, the president’s threats to place conditions on federal aid to California were said to be gaining traction, even as the fires continued to obliterate swaths of the state. In his inaugural speech, Trump lamented that the fires are “raging through the houses and communities, even affecting some of the wealthiest and most powerful individuals in our country.” Earlier this month, in posts on Truth Social, he cast blame on Governor Gavin Newsom for allegedly failing to deliver basic services to residents. (Newsom’s office disputed Trump’s characterization of the governor’s actions.)

But climate change poses an existential threat to all Californians, regardless of political affiliation, class, or celebrity. As I watch my home state anxiously from afar, checking my text messages constantly for updates from my loved ones, I’ve been heartened by the mutual-aid networks and community-led efforts that have sprung up. Amid so much destruction, the rare moments of hope come from seeing how many Angelenos recognize the stakes of building a different future together. Disaster response doesn’t have to look the way it did in New Orleans during Hurricane Katrina, when vulnerable groups were the slowest to recoup their losses (and, in some cases, never did). As Morello-Frosch put it to me, in order for Angelenos to “return, recover, and rebuild in a way that maybe helps fortify them against the next fire,” the government would need to be invested in the health and safety of all people—and proactively account for the inequities that vulnerable communities face before the next blazes hit.

How the Village People Explain Trump

The Atlantic

www.theatlantic.com › culture › archive › 2025 › 01 › trumps-village-people-inauguration › 681387

The first great image of the second Donald Trump administration emerged last night at a Washington, D.C., basketball arena, where the soon-to-be-inaugurated president danced with the Village People. After Trump finished one of his classic stem-winding speeches, he was joined by five hunks of disco infamy: the bare-armed construction worker, the denim-crotched cowboy, the chaps-wearing biker, the befringed Native American chief, and the vinyl-booted cop. With his suit and pendulous red tie, Trump looked like he was in the band, like just another shade in a rainbow of satirical American masculinity.

The president’s affinity for the Village People’s music used to seem trollish, but now it’s just logical. The band formed in the 1970s when two French producers, one of them gay, put out a casting call that read “Macho Types Wanted: Must Dance and Have a Moustache.” Today those founders are dead, but the band’s frontman, Victor Willis, is alive to deny, at every chance, that “YMCA” is a queer anthem. Over the past few years, he’s also moved from condemning the Trump campaign’s use of the song to embracing it, in part because, as he recently explained on Facebook, “The financial benefits have been great.” The Trumpified Village People now project what seemed to be the greater theme of this past inauguration weekend: a strange new dream of American unity, washed of anything but cosmetic difference, joined in spectacle and opportunism.

At his previous inauguration, Trump had trouble booking performers to celebrate the results of a brutally divisive, closely contested election. Headliners included the faded rock band 3 Doors Down, a drummer famous for a cameo in The Matrix Reloaded, and the late, game-for-whatever Toby Keith (who told me in 2017, “The president of the frickin’ United States asks you to do something and you can go, you should go instead of being a jack-off”). The festivities felt confused and limp.

This inauguration, by contrast, followed an election in which virtually every demographic had moved to the right. Trump now has a big tent, so he’s going to put on a circus. The rosters for the inaugural galas weren’t quite A-list in terms of musicians who matter right now, but they did feature recognizable names across a range of genres and constituencies—the rapper Nelly; the reggaeton star Anuel AA; various right-leaning, country-aligned stalwarts such as Jason Aldean and Kid Rock. The greatest reversal was for Snoop Dogg, who once made fun of rappers who palled around with the president but now seemed happy to DJ for tuxedoed bros celebrating the first crypto president.

The Capitol Rotunda, where the inauguration ceremony was moved because of freezing weather, made the big tent feel intimate. As the faces of America’s past looked down from busts, the ceiling painted with E Pluribus Unum, various oddities of the present—such as Melania’s sleek, eye-hiding Hamburglar hat—instantly looked historical. The chamber was so small that much of the audience watched from an overflow room; the Democrats (including four previous presidents and their spouses, sans Michelle Obama) were scrunched up close to the Republicans, as if at a courthouse wedding. Behind Trump stood the most important new members of his coalition: the tech moguls Elon Musk, Tim Cook, Jeff Bezos, and Mark Zuckerberg.

[Read: The Gilded Age of Trump begins now]

“The entire nation is rapidly unifying,” Trump said in his speech, before listing the many demographics—Black, Latino, old, young, and so forth—who’d helped deliver his victory. The speech had its dark passages, but it was no redux of 2017’s “American carnage” rant. Rather, Trump strung together positive, forward-looking statements about the country’s oncoming golden age—an endless summer on the “Gulf of America,” without crime or conflict, and our flag waving on Mars. He was followed by a bar joke’s worth of benedictions—from a rabbi, a Catholic priest, and a Black evangelical pastor. The last, Lorenzo Sewell, spoke with rumbling flamboyance, calling for freedom to ring “from the prodigious hilltops of New Hampshire” to “the curvaceous hilltops of California.”

As pageantry, the ceremony was effective. The opera singer Christopher Macchio bellowed “Oh, America” over military drums, with a hint of ’80s-metal righteousness. The repetitious nature of the president’s speech, stating and restating visions of prosperity and peace, served to distract from the various groups that may soon suffer: millions of immigrants he vowed to round up; trans and gender-nonconforming people navigating the government’s strict new definitions of gender; the “radical and corrupt establishment” whose leaders were sitting inches away, politely squinting at a man who’d vowed retribution against his rivals.

The spell created by pomp and circumstance broke a bit for one performance during the ceremony. Carrie Underwood, the 41-year-old American Idol star and country hitmaker, walked out to sing “America the Beautiful.” Something went wrong with her backing music, and she smiled in silence for nearly two minutes. Was this an omen? Would Trump’s promised golden age immediately turn out to be glitchy and underwhelming? But then Underwood told the Rotunda to just sing the words along with her. Everyone obliged—including Joe Biden and, by the end of the song, Kamala Harris. Democracy, it’s well understood, has been undergoing a trial. But, begrudgingly or not, the country’s still together.

The Coalition Collapse That Doomed Biden’s Presidency

The Atlantic

www.theatlantic.com › politics › archive › 2025 › 01 › coalition-collapse-biden-carter › 681254

Presidents whom most voters view as failures, justifiably or not, have frequently shaped American politics long after they leave office—notably, by paving the way for presidencies considered much more successful and consequential. As President Joe Biden nears his final days in office, his uneasy term presents Democrats with some uncomfortable parallels to their experience with Jimmy Carter, whose state funeral takes place this week in Washington, D.C.

The former Georgia governor’s victory in 1976 initially offered the promise of revitalizing the formidable electoral coalition that had delivered the White House to Democrats in seven of the nine presidential elections from 1932 (won by Franklin D. Roosevelt) to 1964 (won by Lyndon B. Johnson), and had enabled the party to enact progressive social policies for two generations. But the collapse of his support over his four years in office, culminating in his landslide defeat by Ronald Reagan in 1980, showed that Carter’s electoral victory was instead that coalition’s dying breath. Carter’s troubled term in the White House proved the indispensable precondition to Reagan’s landmark presidency, which reshaped the competition between the two major parties and enabled the epoch-defining ascendancy of the new right.

The specter of such a turnabout now haunts Biden and his legacy. Despite his many accomplishments in the White House, the November election’s outcome demonstrated that his failures—particularly on the public priorities of inflation and the border—eclipsed his successes for most voters. As post-election surveys made clear, disapproval of the Biden administration’s record was a liability that Vice President Kamala Harris could not escape.

Biden’s unpopularity helped Donald Trump make major inroads among traditionally Democratic voting blocs, just as the widespread discontent over Carter’s performance helped Reagan peel away millions of formerly Democratic voters in 1980. If Trump can cement in office the gains he made on Election Day—particularly among Latino, Asian American, and Black voters—historians may come to view Biden as the Carter to Trump’s Reagan.

In his landmark 1993 book, The Politics Presidents Make, the Yale political scientist Stephen Skowronek persuasively argued that presidents succeed or fail according to not only their innate talents but also the timing of their election in the long-term cycle of political competition and electoral realignment between the major parties.

Most of the presidents who are remembered as the most successful and influential, Skowronek showed, came into office after decisive elections in which voters sweepingly rejected the party that had governed the country for years. The leaders Skowronek places in this category include Thomas Jefferson after his election in 1800, Andrew Jackson in 1828, Abraham Lincoln in 1860, Roosevelt in 1932, and Reagan in 1980.

These dominating figures, whom Skowronek identifies as men who “stood apart from the previously established parties,” typically rose to prominence with a promise “to retrieve from a far distant, even mythic, past fundamental values that they claimed had been lost.” Trump fits this template with his promises to “make America great again,” and he also displays the twin traits that Skowronek describes as characteristic of these predecessors that Trump hopes to emulate: repudiating the existing terms of political competition and becoming a reconstructive leader of a new coalition.

The great repudiators, in Skowronek’s telling, were all preceded by ill-fated leaders who’d gained the presidency representing a once-dominant coalition that was palpably diminished by the time of their election. Skowronek placed in this club John Adams, John Quincy Adams, Franklin Pierce, James Buchanan, Herbert Hoover, and Carter. Each of their presidencies represented a last gasp for the party that had won most of the general elections in the years prior. None of these “late regime” presidents, as Skowronek called them, could generate enough success in office to reverse their party’s declining support; instead, they accelerated it.

The most recent such late-regime president, Carter, was elected in 1976 after Richard Nixon’s victories in 1968 and 1972 had already exposed cracks in the Democrats’ New Deal coalition of southerners, Black voters, and the white working class. Like many of his predecessors in the dubious fraternity of late-regime presidents, Carter recognized that his party needed to recalibrate its message and agenda to repair its eroding support. But the attempt to set a new, generally more centrist direction for the party foundered.

Thanks to rampant inflation, energy shortages, and the Iranian hostage crisis, Carter was whipsawed between a rebellion from the left (culminating in Senator Edward Kennedy’s primary challenge) and an uprising on the right led by Reagan. As Carter limped through his 1980 reelection campaign, Skowronek wrote, he had become “a caricature of the old regime’s political bankruptcy, the perfect foil for a repudiation of liberalism itself as the true source of all the nation’s problems.”

Carter’s failures enabled Reagan to entrench the electoral realignment that Nixon had started. In Reagan’s emphatic 1980 win, millions of southern white conservatives, including many evangelical Christians, as well as northern working-class white voters renounced the Democratic affiliation of their parents and flocked to Reagan’s Republican Party. Most of those voters never looked back.

The issue now is whether Biden will one day be seen as another late-regime president whose perceived failures hastened his party’s eclipse among key voting blocs. Pointing to his record of accomplishments, Biden advocates would consider the question absurd: Look, they say, at the big legislative wins, enormous job growth, soaring stock market, historic steps to combat climate change, skilled diplomacy that united allies against Russia’s invasion of Ukraine, and boom in manufacturing investment, particularly in clean-energy technologies.

In electoral terms, however, Biden’s legacy is more clouded. His 2020 victory appeared to revive the coalition of college-educated whites, growing minority populations, young people, and just enough working-class white voters that had allowed Bill Clinton and Barack Obama to win the White House in four of the six elections from 1992 through 2012. (In a fifth race over that span, Al Gore won the popular vote even though he lost the Electoral College.) But the public discontent with Biden frayed almost every strand of that coalition.

Biden made rebuilding his party’s support among working-class voters a priority and, in fact, delivered huge gains in manufacturing and construction jobs that were tied to the big three bills he passed (on clean energy, infrastructure, and semiconductors). But public anger at the rising cost of living contributed to Biden’s job-approval rating falling below 50 percent in the late summer of 2021 (around the time of the chaotic Afghanistan withdrawal), and it never climbed back to that crucial threshold. On Election Day, public disappointment with Biden’s overall record helped Trump maintain a crushing lead over Harris among white voters without a college degree, as well as make unprecedented inroads among nonwhite voters without a college degree, especially Latinos.