America’s ‘Marriage Material’ Shortage

The Atlantic

www.theatlantic.com › ideas › archive › 2025 › 02 › america-marriage-decline › 681518

Perhaps you’ve heard: Young people aren’t dating anymore. News media and social media are awash in commentary about the decline in youth romance. It’s visible in the corporate data, with dating-app engagement taking a hit. And it’s visible in the survey data, where the share of 12th graders who say they’ve dated has fallen from about 85 percent in the 1980s to less than 50 percent in the early 2020s, with the decline particularly steep in the past few years.

Naturally, young people’s habits are catnip to news commentators. But although I consider the story of declining youth romance important, I don’t find it particularly mysterious. In my essay on the anti-social century, I reported that young people have retreated from all manner of physical-world relationships, whether because of smartphones, over-parenting, or a combination of factors. Compared with previous generations of teens, they have fewer friends, spend significantly less time with the friends they do have, attend fewer parties, and spend much more time alone. Romantic relationships theoretically imply a certain physicality, so it’s easy to imagine that the collapse of physical-world socializing for young people would involve the decline of romance.

[From the February 2025 issue: The anti-social century]

Adults have a way of projecting their anxieties and realities onto their children. In the case of romance, the fixation on young people masks a deeper—and, to me, far more mysterious—phenomenon: What is happening to adult relationships?

American adults are significantly less likely to be married or to live with a partner than they used to be. The national marriage rate is hovering near its all-time low, while the share of women under 65 who aren’t living with a partner has grown steadily since the 1980s. The past decade seems to be the only period since at least the 1970s when women under 35 were more likely to live with their parents than with a spouse.

People’s lives are diverse, and so are their wants and desires and circumstances. It’s hard, and perhaps impossible, to identify a tiny number of factors that explain hundreds of millions of people’s decisions to couple up, split apart, or remain single. But according to Lyman Stone, a researcher at the Institute for Family Studies, the most important reason marriage and coupling are declining in the U.S. is actually quite straightforward: Many young men are falling behind economically.

A marriage or romantic partnership can be many things: friendship, love, sex, someone to gossip with, someone to remind you to take out the trash. But, practically speaking, Stone told me, marriage is also insurance. Women have historically relied on men to act as insurance policies—against the threat of violence, the risk of poverty. To some, this might sound like an old-fashioned, even reactionary, description of marriage, but its logic still applies. “Men’s odds of being in a relationship today are still highly correlated with their income,” Stone said. “Women do not typically invest in long-term relationships with men who have nothing to contribute economically.” In the past few decades, young and especially less educated men’s income has stagnated, even as women have charged into the workforce and seen their college-graduation rates soar. For single non-college-educated men, average inflation-adjusted earnings at age 45 have fallen by nearly 25 percent in the past half century, while for the country as a whole, average real earnings have more than doubled. As a result, “a lot of young men today just don’t look like what women have come to think of as ‘marriage material,’” he said.

In January, the Financial Times’ John Burn-Murdoch published an analysis of the “relationship recession” that lent strong support to Stone’s theory. Contrary to the idea that declining fertility in the U.S. is mostly about happily childless DINKs (dual-income, no-kid couples), “the drop in relationship formation is steepest among the poorest,” he observed. I asked Burn-Murdoch to share his analysis of Current Population Survey data so that I could take a closer look. What I found is that, in the past 40 years, coupling has declined more than twice as fast among Americans without a college degree, compared with college graduates. This represents a dramatic historic inversion. In 1980, Americans ages 25 to 34 without a bachelor’s degree were more likely than college graduates to get married; today, it’s flipped, and the education gap in coupling is widening every year. Marriage produces wealth by pooling two people’s income, but, conversely, wealth also produces marriage.

Contraception technology might also play a role. Before cheap birth control became widespread in the 1970s, sexual activity was generally yoked to commitment: It was a cultural norm for a man to marry a girl if he’d gotten her pregnant, and single parenthood was uncommon. But as the (married!) economists George Akerlof and Janet Yellen observed in a famous 1996 paper, contraception helped disentangle sex and marriage. Couples could sleep together without any implicit promise to stay together. Ultimately, Akerlof and Yellen posit, the availability of contraception, which gave women the tools to control the number and the timing of their kids, decimated the tradition of shotgun marriages, and therefore contributed to an increase in children born to low-income single parents.

The theory that the relationship recession is driven by young men falling behind seems to hold up in the U.S. But what about around the world? Rates of coupling are declining throughout Europe, as well. In England and Wales, the marriage rate for people under 30 has declined by more than 50 percent since 1990.

And it’s not just Europe. The gender researcher Alice Evans has shown that coupling is down just about everywhere. In Iran, annual marriages plummeted by 40 percent in 10 years. Some Islamic authorities blame Western values and social media for the shift. They might have a point. When women are exposed to more Western media, Evans argues, their life expectations expand. Fitted with TikTok and Instagram and other windows into Western culture, young women around the world can seek the independence of a career over the codependency (or, worse, the outright loss of freedom) that might come with marriage in their own country. Social media, a woman veterinarian in Tehran told the Financial Times, also glamorizes the single life “by showing how unmarried people lead carefree and successful lives … People keep comparing their partners to mostly fake idols on social platforms.”

[Read: The happiness trinity]

According to Evans, several trends are driving this global decline in coupling. Smartphones and social media may have narrowed many young people’s lives, pinning them to their couches and bedrooms. But they’ve also opened women’s minds to the possibility of professional and personal development. When men fail to support their dreams, relationships fail to flourish, and the sexes drift apart.

If I had to sum up this big messy story in a sentence, it would be this: Coupling is declining around the world, as women’s expectations rise and lower-income men’s fortunes fall; this combination is subverting the traditional role of straight marriage, in which men are seen as necessary for the economic insurance of their family.

So why does all this matter? Two of the more urgent sociological narratives of this moment are declining fertility and rising unhappiness. The relationship recession makes contact with both. First, marriage and fertility are tightly interconnected. Unsurprisingly, one of the strongest predictors of declining fertility around the world is declining coupling rates, as Burn-Murdoch has written. Second, marriage is strongly associated with happiness. According to General Social Survey data, Americans’ self-described life satisfaction has been decreasing for decades. In a 2023 analysis of the GSS data, the University of Chicago economist Sam Peltzman concluded that marriage was more correlated with this measure of happiness than any other variable he considered, including income. (As Stone would rush to point out here, marriage itself is correlated with income.)

The social crisis of our time is not just that Americans are more socially isolated than ever, but also that social isolation is rising alongside romantic isolation, as the economic and cultural trajectories of men and women move in opposite directions. And, perhaps most troubling, the Americans with the least financial wealth also seem to have the least “social wealth,” so to speak. It is the poor, who might especially need the support of friends and partners, who have the fewest close friends and the fewest long-term partners. Money might not buy happiness, but it can buy the things that buy happiness.

The Race-Blind College-Admissions Era Is Off to a Weird Start

The Atlantic

www.theatlantic.com › ideas › archive › 2025 › 02 › affirmative-action-yale-admissions › 681541

When colleges began announcing the makeup of their incoming freshman classes last year—the first admissions cycle since the Supreme Court outlawed affirmative action—there seemed to have been some kind of mistake. The Court’s ruling in Students for Fair Admissions v. Harvard had been almost universally expected to produce big changes. Elite universities warned of a return to diversity levels not seen since the early 1960s, when their college classes had only a handful of Black students.

And yet, when the numbers came in, several of the most selective colleges in the country reported the opposite results. Yale, Dartmouth, Northwestern, the University of Virginia, Wesleyan, Williams, and Bowdoin all ended up enrolling more Black or Latino students, or both. Princeton and Duke appear to have kept their demographics basically stable.

These surprising results raise two competing possibilities. One is that top universities can preserve racial diversity without taking race directly into account in admissions. The other, favored by the coalition that successfully challenged affirmative action in court, is that at least some of the schools are simply ignoring the Supreme Court’s ruling—that they are, in other words, cheating. Finding out the truth will likely require litigation that could drag on for years. Although affirmative action was outlawed in 2023, the war over the use of race in college admissions is far from over.

History strongly suggested that the end of affirmative action would be disastrous for diversity in elite higher education. (Most American colleges accept most applicants and therefore didn’t use affirmative action in the first place.) In the states that had already banned the practice for public universities, the share of Black and Latino students enrolled at the most selective flagship campuses immediately plummeted. At UC Berkeley, for example, underrepresented minorities made up 22 percent of the freshman class in 1997. In 1998, after California passed its affirmative-action ban, that number fell to 11 percent. Many of these schools eventually saw a partial rebound, but not enough to restore their previous demographic balance.

Something similar happened at many selective schools in the aftermath of the Supreme Court’s 2023 ruling. At Harvard and MIT, for example, Black enrollment fell by more than 28 and 60 percent, respectively, compared with the average of the two years prior to the Court’s decision. But quite a few institutions defied expectations. At Yale, Black and Latino enrollment increased, while Asian American enrollment fell by 16 percent compared with recent years. Northwestern similarly saw its Black and Latino populations increase by more than 10 percent, while Asian and white enrollment declined. (In Students for Fair Admissions, the Court had found that Harvard’s race-conscious admissions policies discriminated against Asian applicants.)

[Rose Horowitch: The perverse consequences of tuition-free medical school]

Figuring out how this happened is not easy. Universities have always been cagey about how they choose to admit students; the secrecy ostensibly prevents students from trying to game the process. (It also prevents embarrassment: When details have come out, usually through litigation, they have typically not been flattering.) Now, with elite-college admissions under more scrutiny than usual, they’re even more wary of saying too much. When I asked universities for further details about their response to the ruling, Dartmouth, Bowdoin, and Williams declined to comment, Yale and Northwestern pointed me toward their vague public statements, and a Princeton spokesperson said that “now race plays no role in admissions decisions.” Duke did not reply to requests for comment.

The information gap has led outside observers to piece together theories with the data they do have. One possibility is that universities such as Yale and Princeton are taking advantage of some wiggle room in the Supreme Court’s ruling. “Nothing in this opinion should be construed as prohibiting universities from considering an applicant’s discussion of how race affected his or her life, be it through discrimination, inspiration, or otherwise,” Chief Justice John Roberts wrote in his majority opinion. This seemed to provide an indirect way to preserve race-consciousness in admissions. “It’s still legal to pursue diversity,” Sonja Starr, a law professor at the University of Chicago, told me. Her research shows that 43 of the 65 top-ranked universities have essay prompts that ask applicants about their identity or adversity; eight made the addition after the Court’s decision.

Another theory is that universities have figured out how to indirectly preserve racial diversity by focusing on socioeconomic status rather than race itself. In 2024, Yale’s admissions department began factoring in data from the Opportunity Atlas, a project run by researchers at Harvard and the U.S. Census Bureau that measures the upward mobility of children who grew up in a given neighborhood. It also increased recruitment and outreach in low-income areas. Similarly, Princeton announced that it would aim to increase its share of students who are eligible for financial aid. “In the changed legal environment, the University’s greatest opportunity to attract diverse talent pertains to socioeconomic diversity,” a committee designed to review race-neutral admissions policies at the college wrote.

Some evidence supports the “socioeconomics, not race” theory. Dartmouth announced that it had increased its share of low-income students eligible for federal Pell grants by five percentage points. Yale has said that last year’s incoming freshman class would have the greatest share of first-generation and low-income students in the university’s history. Richard Kahlenberg, a longtime proponent of class-based affirmative action who testified on behalf of the plaintiffs challenging Harvard’s admissions policies, told me that, by increasing economic diversity as a proxy for race, elite colleges have brought in the low-income students of color whom purely race-based affirmative action had long allowed them to overlook. (In recent years, almost three-quarters of the Black and Hispanic students at Harvard came from the wealthiest 20 percent of those populations nationally.) “While universities had been claiming that racial preferences were the only way they could create racial diversity, in fact, if we assume good faith on the part of the universities, they have found ways to achieve racial diversity without racial preferences,” Kahlenberg said.

[Richard Kahlenberg: The affirmative action that colleges really need]

If we assume good faith—that’s a big caveat. Not everyone is prepared to give universities the benefit of the doubt. Edward Blum, the president of Students for Fair Admissions, the plaintiff in the case that ended affirmative action, has already accused Yale, Princeton, and Duke of cheating. And Richard Sander, a law professor at UCLA and a critic of affirmative action, said that if a university’s Black enrollment numbers are still above 10 percent, “then I don’t think there’s any question that they’re engaged in illegal use of preferences.”

The skeptics’ best evidence is the fact that the universities accused of breaking the rules haven’t fully explained how they got their results. Yale, for example, has touted its use of the Opportunity Atlas, but hasn’t shared how it factors information from the tool into admissions decisions. Before the Court’s ruling, a Black student was four times more likely to get into Harvard than a white student with comparable scores, and a Latino applicant about twice as likely.

To keep numbers stable, race-neutral alternatives would have to provide a comparable boost. According to simulations presented to the Supreme Court, universities would have to eliminate legacy and donor preferences and slightly lower their average SAT scores to keep demographics constant without considering race. (In oral arguments, one lawyer compared the change in test scores to moving “from Harvard to Dartmouth.”) With minor exceptions, selective universities have given no indication that they’ve made either of those changes.

Even the data that exist are not totally straightforward to interpret. Some universities have reported an uptick in the percentage of students who chose not to report their race in their application. If that group skews white and Asian, as research suggests it does, then the reported share of Black and Latino students could be artificially inflated. And then there’s the question of how many students choose to accept a university’s offer of admission, which schools have little control over. Wesleyan, for example, accepted fewer Black applicants than it had in prior years, Michael Roth, the university’s president, told me. But a larger share chose to matriculate—possibly, Roth said, because even-more-selective schools had rejected them. The University of Virginia similarly had an unusually high yield among Black students, according to Greg Roberts, its dean of admissions. He couldn’t tell whether this was thanks to the school’s outreach efforts or just a coincidence. “I think what we’re doing is important, but to the extent it will consistently impact what the class looks like, I have no idea,” he told me. (Both Roth and Roberts, the only university administrators who agreed to be interviewed for this article, assured me that their institutions had obeyed the Court’s ruling.)

None of those alternative explanations is likely to sway the people who are convinced the schools cheated. With Donald Trump back in office, colleges that don’t see a meaningful uptick in Asian enrollees will likely face civil-rights investigations, says Josh Dunn, a law professor at the University of Tennessee at Knoxville. “If everything ends up looking exactly like it did prior to SFFA,” he told me, then the courts will “probably think that the schools were not trying to comply in good faith.”

Blum, the head of Students for Fair Admissions, has already threatened to sue Yale, Princeton, and Duke if they don’t release numbers proving to his satisfaction that they have complied with the law. (Blum declined to be interviewed for this article.) A new lawsuit could force universities to turn over their admissions data, which should reveal what’s really going on. It could also invite the Court to weigh in on new questions, including the legality of race-neutral alternatives to affirmative action that are adopted with racial diversity in mind. A resolution to any of these issues would take years to arrive.

In many ways, the endless fight over affirmative action is a proxy for the battle over what uber-selective universities are for. Institutions such as Harvard and Yale have long been torn between conflicting aims: on the one hand, creating the next generation of leaders out of the most accomplished applicants; on the other, serving as engines of social mobility for promising students with few opportunities. It will take much more than the legal demise of affirmative action to put that debate to rest.

ADHD’s Sobering Life-Expectancy Numbers

The Atlantic

www.theatlantic.com › health › archive › 2025 › 02 › adhd-shortened-life-expectancy › 681554

When I was unexpectedly diagnosed with ADHD last year, it turned my entire identity upside down. At 37, I’d tamed my restlessness and fiery temper, my obsessive reorganization of my mental to-do list, and my tendency to write and rewrite the same sentence for hours. Being this way was exhausting, but that was just who I was, or so I thought. My diagnosis reframed these quirks as symptoms of illness—importantly, ones that could be managed. Treatment corralled my racing thoughts in a way that I’d never before experienced.

But knowing that I have ADHD, short for “attention-deficit hyperactivity disorder,” has also opened my eyes to a new issue: Apparently, I am at risk of an early death. According to a study published last week that analyzed the deaths of more than 30,000 British adults, ADHD is linked with a lifespan that’s nearly seven years shorter for men, and about nine years shorter for women. Nine years! The findings suggest that the life expectancy of people with ADHD is nearly on par with that of smokers, and about five years shorter than that of heavy drinkers. When I sent the study to my husband, who also has ADHD, he texted back: “Damn.”

The findings are foreboding for many Americans. As of 2022, about 7 million American children ages 3 to 17 had at one point received an ADHD diagnosis—1 million more than that same age group in 2016. And although ADHD may bring to mind kids bouncing off the walls, the number of adults with the condition has surged in recent years. ADHD’s rising prevalence has been met with some dismissiveness. As I wrote in 2023, questions have been raised about the validity of the recent spate of adult diagnoses, some of which were offered through dubious telehealth services that haphazardly doled out prescriptions. And ADHD is widely seen as mild, even mundane: Struggling to focus is hardly the same as, say, schizophrenia, which has been linked to declines in life expectancy. But ADHD is “not as innocent as some people think it is,” Margaret Sibley, a psychiatry professor at the University of Washington School of Medicine who is not associated with the new study, told me.

No one dies from ADHD itself. Rather, symptoms such as concentration issues, emotional instability, memory issues, and impulsivity can touch nearly every aspect of life. Researchers have long known that people with ADHD are more likely to engage in risky behaviors, including substance abuse, unsafe sex, gambling, criminal acts, and dangerous driving. They are at a higher risk of depression, anxiety, and suicide. Difficulties keeping up with healthy lifestyle habits, such as eating well and exercising, lead to higher rates of obesity. All of these risks can chip away at a person’s life: Around the world, having ADHD is associated with lower socioeconomic status.

Some of the most life-threatening impacts of ADHD may be the least conspicuous, experts told me. Missing doctor appointments, forgetting to take medications, and struggling to navigate the health-care system can make existing illnesses worse. What leads children to be scolded for poor behavior can snowball into difficulties keeping a job, maintaining healthy relationships, and even staying out of prison. Forgetting to pay rent can lead to eviction; the sudden urge to race down a freeway could end in a crash.

The new study points to these sorts of risks to explain how ADHD can cut someone’s life short. Besides the shocking findings, what makes the research so notable is that it is the first to directly quantify years lost to ADHD. By matching diagnoses with death records, the authors calculated the mortality rate of people with ADHD, which they used to estimate life expectancy. A previous study quantified the effect of a childhood diagnosis on lifespan by extrapolating the effect of known risks and came to similar findings. The new research shows that “we have data related to the mortality of individuals—true data,” Sibley said.

The calculations aren’t definitive. The top-line life-expectancy numbers are part of a range that incorporates a margin of error: 4.5 to 9.11 years lost for males, and 6.55 to 10.91 years lost for females. “The estimate is not super precise,” Joshua Stott, an author of the paper and a clinical-psychology professor at University College London, told me. Nevertheless, even the most optimistic scenario discussed in the paper—a reduction of 4.5 years for men—is “still a big difference” in lifespan, Stott said.

Another caveat, he added, is that the study population may have been skewed toward people with additional health issues, possibly inflating the mortality risk. ADHD underdiagnosis is common in the United Kingdom, so it’s possible that those who had a diagnosis had sought it and were overall more in touch with health services. Perhaps the biggest limitation of the study, however, is that it doesn’t show whether treatment helps. Good data on who in the study was treated just weren’t available, Stott said.

Among the researchers I spoke with, there was no question that treatment would help. In fact, the diagnosis alone is even more important—an idea that transformed my fears into hope. Once people are aware of their condition, they can learn about the risks and adjust their lifestyle accordingly, David Goodman, an ADHD expert and a psychiatry professor at the Johns Hopkins University School of Medicine, told me. If ADHD can be diagnosed and managed, “a lot of this would disappear,” Stott said. Certainly, research suggests that drugs like Adderall can reduce ADHD’s effect on life expectancy. That should provide solace for many Americans: In 2023, two-thirds of American adults with ADHD were on medication or in behavioral therapy, or both.

The earlier a person knows about their disorder and the risks associated with it, the better. In a way, ADHD is like diabetes, Goodman said. When it’s treated early, living a relatively healthy life is doable. The longer it isn’t treated, the more the comorbidities pile up: heart disease, vision problems, nerve damage, kidney disease, and so on. With ADHD and diabetes, treatment can involve both drugs and lifestyle changes.

At first, my diagnosis brought relief. Then anger and remorse—that I had spent nearly four decades feeling drained and frustrated with myself when I could have managed my disorder all along. When I shared this with Goodman, he replied: “You and everyone else who gets diagnosed in their adult years.” Underdiagnoses aren’t limited to the U.K.; globally, they are common, particularly among girls and women. Underdiagnosis partly accounts for the growing number of adult cases. Given Stott’s findings, the uptick in adult diagnoses is a positive thing: It means those people have a chance to claim the years they might otherwise have lost. With diagnosis, “the goal is to diminish the regret that you have in the future, given the information and decisions you make in the present,” Goodman said.

But that requires a new perspective on ADHD. Although it has long been classified as a mental-health disorder, it is often seen as a stage that can be outgrown; eventually, the hyperactive child learns to sit still. The notion that ADHD is a serious lifelong disorder remains underappreciated; it’s relatively new, even in the research community. The condition has a dubious reputation among the general public: Just yesterday, Senator Tommy Tuberville lamented the bygone days when, to manage their child’s ADHD, “parents didn’t use a drug, they used a belt.” It is sometimes seen as a path to the recreational use of stimulants. Some scientists still contest the validity of adult ADHD itself, Sibley said: One recently framed ADHD as a false epidemic sparked by an overmedicalized society and self-diagnosis. Indeed, during the coronavirus pandemic, TikTok creators self-diagnosing ADHD led their followers to do the same; whether their assessments were right is anyone’s guess.

At times, I still question my own diagnosis, wondering whether my attention span is just victim to a maelstrom of forces: Texts, social-media alerts, email notifications, and the endless onslaught of news can make anyone feel chronically discombobulated. But I know now that ADHD is more than just a problem of attention; the relief I experience with treatment—from impulsivity and recklessness, angry outbursts, and frantic thoughts—is undeniable. For people with ADHD, the hope is that diagnosis can help disentangle a serious condition from the frenzied realities of modern life. Both are exhausting, but one, at least, can be controlled.