A Very Silly Movie About Some Very Good Dogs

The Atlantic

www.theatlantic.com/culture/archive/2023/08/strays-movie-review/675052

Early on in the raunchy talking-animals comedy Strays, a montage plays of four dogs humping inanimate lawn ornaments, guzzling beer leaking from trash bags, and bonding over a plan to bite off a man’s genitals. It’s an inartfully staged sequence, packed with sophomoric jokes and enough f-bombs to rival a Quentin Tarantino film. On the other hand: Will you look at those sweet, scruffy faces! Those little paws! Sure, their CGI-ed mouths appear a bit strange and the canines do not seem to be making direct eye contact with one another, but they each deserve belly rubs and every single treat ever. How can anyone dislike a scene in which the goodest dogs are having the best time? Indeed, halfway through my screening, I glanced at my notes and realized that I’d drawn a series of smiley faces.

That’s all to say that Strays knows what it’s doing with its choice to follow a furry foursome, which saves the film from being an exercise in pure nonsense—at least for the dog-lovers in the audience. Hollywood makes plenty of absurd movies built on underbaked premises: This year, Cocaine Bear, Mafia Mamma, and 65 come to mind. Amid such a mediocre pack, you could do worse than a 93-minute film that, for all its obscene humor and gratuitous violence, contains a softhearted center about—what else?—the unconditional love of pets.

Then again, you could also do much better—and much funnier. Strays, despite being billed as an “R-rated comedy with bite,” is rather tame. (Sorry.) The story follows a Border Terrier named Reggie (voiced by Will Ferrell), who, after being abandoned by his pot-smoking loser owner, Doug (Will Forte), meets Bug (Jamie Foxx as a tough Boston terrier), Maggie (Isla Fisher as a smart Australian shepherd), and Hunter (Randall Park as a shy Great Dane). The group shows Reggie how to live without human supervision, and teaches him to accept that Doug was never kind to him—a revelation that kicks off a journey to make the former owner pay for his abuse.

Along the way, the four get involved in predictable misbehavior—drug-induced hijinks, gross-out gags—while indulging in endless dog-based jokes. The best ones involve highly specific jabs at dog-movie tropes, including a cameo that sends up A Dog’s Purpose and a scene involving a Homeward Bound–like, sentimental “narrator dog.” The worst involve asinine puns: At one point, the group debates what “regular style” means when it comes to dog sex.

[Read: What do dogs know about us?]

The director, Josh Greenbaum, isn’t trying to deliver the winsome charms of his last effort, Barb and Star Go to Vista Del Mar, a comedy about best friendship and coastal vacations that’s already becoming a cult classic. Instead, Strays strives for a not unpleasant brain-numbing effect on par with, say, falling down an online rabbit hole of cute animal videos. The film achieves that result, to an extent. By the final act, I had stopped taking notes altogether, defeated by the relentlessness of the movie’s profanity and poop-based imagery. I—and the audience I was with—laughed at a scene involving Maggie attempting to tell knock-knock jokes, only for the other dogs to respond with a chorus of woofs. I chuckled when Hunter said the word howling because he could not actually howl, and when Bug yelled “Fuck you, leaf!” at a leaf.

At the time, I could not really explain why this was so funny. In an attempt to pull myself together, I started thinking about what it meant that I was enjoying Strays: Is this what “original” means now, for films to be made out of scenes that seem destined to become memes? Are the movie gods balancing the scales of narrative richness after the highs of Barbie and Oppenheimer? Has the relentless crush of being too online made me the perfect target to appreciate the juvenile humor of cute characters cursing? Should every movie just star dogs? Would it work with cats? (Not if they’re played by humans.)

I know: I’ve overthought Strays. The movie is, in the end, deeply unserious and completely mindless, but still strangely sweet. It is late-summer schlock, featuring an ensemble of four-legged animals who have done nothing wrong ever in their lives. It’s a reminder, if nothing else, that an adorable protagonist embarking on a hero’s journey goes a long way. It doesn’t matter if Strays is good. Because those dogs? They’re very good dogs.

How America Got Mean

The Atlantic

www.theatlantic.com/magazine/archive/2023/09/us-culture-moral-education-formation/674765

Illustrations by Ricardo Tomás

Over the past eight years or so, I’ve been obsessed with two questions. The first is: Why have Americans become so sad? The rising rates of depression have been well publicized, as have the rising deaths of despair from drugs, alcohol, and suicide. But other statistics are similarly troubling. The percentage of people who say they don’t have close friends has increased fourfold since 1990. The share of Americans ages 25 to 54 who weren’t married or living with a romantic partner went up to 38 percent in 2019, from 29 percent in 1990. A record-high 25 percent of 40-year-old Americans have never married. More than half of all Americans say that no one knows them well. The percentage of high-school students who report “persistent feelings of sadness or hopelessness” shot up from 26 percent in 2009 to 44 percent in 2021.

My second, related question is: Why have Americans become so mean? I was recently talking with a restaurant owner who said that he has to eject a customer from his restaurant for rude or cruel behavior once a week—something that never used to happen. A head nurse at a hospital told me that many on her staff are leaving the profession because patients have become so abusive. At the far extreme of meanness, hate crimes rose in 2020 to their highest level in 12 years. Murder rates have been surging, at least until recently. Same with gun sales. Social trust is plummeting. In 2000, two-thirds of American households gave to charity; in 2018, fewer than half did. The words that define our age reek of menace: conspiracy, polarization, mass shootings, trauma, safe spaces.

We’re enmeshed in some sort of emotional, relational, and spiritual crisis, and it undergirds our political dysfunction and the general crisis of our democracy. What is going on?

Over the past few years, different social observers have offered different stories to explain the rise of hatred, anxiety, and despair.

The technology story: Social media is driving us all crazy.

The sociology story: We’ve stopped participating in community organizations and are more isolated.

The demography story: America, long a white-dominated nation, is becoming a much more diverse country, a change that has millions of white Americans in a panic.

The economy story: High levels of economic inequality and insecurity have left people afraid, alienated, and pessimistic.

I agree, to an extent, with all of these stories, but I don’t think any of them is the deepest one. Sure, social media has bad effects, but it is everywhere around the globe—and the mental-health crisis is not. Also, the rise of despair and hatred has engulfed a lot of people who are not on social media. Economic inequality is real, but it doesn’t fully explain this level of social and emotional breakdown. The sociologists are right that we’re more isolated, but why? What values lead us to choose lifestyles that make us lonely and miserable?

The most important story about why Americans have become sad and alienated and rude, I believe, is also the simplest: We inhabit a society in which people are no longer trained in how to treat others with kindness and consideration. Our society has become one in which people feel licensed to give their selfishness free rein. The story I’m going to tell is about morals. In a healthy society, a web of institutions—families, schools, religious groups, community organizations, and workplaces—helps form people into kind and responsible citizens, the sort of people who show up for one another. We live in a society that’s terrible at moral formation.

[Read: American shoppers are a nightmare]

Moral formation, as I will use that stuffy-sounding term here, comprises three things. First, helping people learn to restrain their selfishness. How do we keep our evolutionarily conferred egotism under control? Second, teaching basic social and ethical skills. How do you welcome a neighbor into your community? How do you disagree with someone constructively? And third, helping people find a purpose in life. Morally formative institutions hold up a set of ideals. They provide practical pathways toward a meaningful existence: Here’s how you can dedicate your life to serving the poor, or protecting the nation, or loving your neighbor.

For a large part of its history, America was awash in morally formative institutions. Its Founding Fathers had a low view of human nature, and designed the Constitution to mitigate it (even while validating that low view of human nature by producing a document rife with racism and sexism). “Men I find to be a Sort of Beings very badly constructed,” Benjamin Franklin wrote, “as they are generally more easily provok’d than reconcil’d, more dispos’d to do Mischief to each other than to make Reparation, and much more easily deceiv’d than undeceiv’d.”

If such flawed, self-centered creatures were going to govern themselves and be decent neighbors to one another, they were going to need some training. For roughly 150 years after the founding, Americans were obsessed with moral education. In 1788, Noah Webster wrote, “The virtues of men are of more consequence to society than their abilities; and for this reason, the heart should be cultivated with more assiduity than the head.” The progressive philosopher John Dewey wrote in 1909 that schools teach morality “every moment of the day, five days a week.” Hollis Frissell, the president of the Hampton Institute, an early school for African Americans, declared, “Character is the main object of education.” As late as 1951, a commission organized by the National Education Association, one of the main teachers’ unions, stated that “an unremitting concern for moral and spiritual values continues to be a top priority for education.”

The moral-education programs that stippled the cultural landscape during this long stretch of history came from all points on the political and religious spectrums. School textbooks such as McGuffey’s Eclectic Readers not only taught students how to read and write; they taught etiquette, and featured stories designed to illustrate right and wrong behavior. In the 1920s, W. E. B. Du Bois’s magazine for Black children, The Brownies’ Book, had a regular column called “The Judge,” which provided guidance to young readers on morals and manners. There were thriving school organizations with morally earnest names that sound quaint today—the Courtesy Club, the Thrift Club, the Knighthood of Youth.

Beyond the classroom lay a host of other groups: the YMCA; the Sunday-school movement; the Boy Scouts and Girl Scouts; the settlement-house movement, which brought rich and poor together to serve the marginalized; Aldo Leopold’s land ethic, which extended our moral concerns to include proper care for the natural world; professional organizations, which enforced ethical codes; unions and workplace associations, which, in addition to enhancing worker protections and paychecks, held up certain standards of working-class respectability. And of course, by the late 19th century, many Americans were members of churches or other religious communities. Mere religious faith doesn’t always make people morally good, but living in a community, orienting your heart toward some transcendent love, basing your value system on concern for the underserved—those things tend to.

[Arthur C. Brooks: Make yourself happy—be kind]

An educational approach with German roots that was adopted by Scandinavian societies in the mid-to-late 19th century had a wide influence on America. It was called Bildung, roughly meaning “spiritual formation.” As conceived by Wilhelm von Humboldt, the Bildung approach gave professors complete freedom to put moral development at the center of a university’s mission. In schools across Scandinavia, students studied literature and folk cultures to identify their own emotions, wounds, and weaknesses, in order to become the complex human beings that modern society required. Schools in the Bildung tradition also aimed to clarify the individual’s responsibilities to the wider world—family, friends, nation, humanity. Start with the soul and move outward.

The Bildung movement helped inspire the Great Books programs that popped up at places like Columbia and the University of Chicago. They were based on the conviction that reading the major works of world literature and thinking about them deeply would provide the keys to living a richer life. Meanwhile, discipline in the small proprieties of daily existence—dressing formally, even just to go shopping or to a ball game—was considered evidence of uprightness: proof that you were a person who could be counted on when the large challenges came.

Much of American moral education drew on an ethos expressed by the headmaster of the Stowe School, in England, who wrote in 1930 that the purpose of his institution was to turn out young men who were “acceptable at a dance and invaluable in a shipwreck.” America’s National Institute for Moral Instruction was founded in 1911 and published a “Children’s Morality Code,” with 10 rules for right living. At the turn of the 20th century, Mount Holyoke College, an all-women’s institution, was an example of an intentionally thick moral community. When a young Frances Perkins was a student there, her Latin teacher detected a certain laziness in her. She forced Perkins to spend hours conjugating Latin verbs, to cultivate self-discipline. Perkins grew to appreciate this: “For the first time I became conscious of character.” The school also called upon women to follow morally ambitious paths. “Do what nobody else wants to do; go where nobody else wants to go,” the school’s founder implored. Holyoke launched women into lives of service in Africa, South Asia, and the Middle East. Perkins, who would become the first woman to serve in a presidential Cabinet (Franklin D. Roosevelt’s), was galvanized there.

[Read: Students’ broken moral compasses]

These various approaches to moral formation shared two premises. The first was that training the heart and body is more important than training the reasoning brain. Some moral skills can be taught the way academic subjects are imparted, through books and lectures. But we learn most virtues the way we learn crafts, through the repetition of many small habits and practices, all within a coherent moral culture—a community of common values, whose members aspire to earn one another’s respect.

The other guiding premise was that concepts like justice and right and wrong are not matters of personal taste: An objective moral order exists, and human beings are creatures who habitually sin against that order. This recognition was central, for example, to the way the civil-rights movement in the 1950s and early 1960s thought about character formation. “Instead of assured progress in wisdom and decency man faces the ever present possibility of swift relapse not merely to animalism but into such calculated cruelty as no other animal can practice,” Martin Luther King Jr. believed. Elsewhere, he wrote, “The force of sinfulness is so stubborn a characteristic of human nature that it can only be restrained when the social unit is armed with both moral and physical might.”

At their best, the civil-rights marchers in this prophetic tradition understood that they could become corrupted even while serving a noble cause. They could become self-righteous because their cause was just, hardened by hatred of their opponents, prideful as they asserted power. King’s strategy of nonviolence was an effort simultaneously to expose the sins of their oppressors and to restrain the sinful tendencies inherent in themselves. “What gave such widely compelling force to King’s leadership and oratory,” the historian George Marsden argues, “was his bedrock conviction that moral law was built into the universe.”

A couple of obvious things need to be said about this ethos of moral formation that dominated American life for so long. It prevailed alongside all sorts of hierarchies that we now rightly find abhorrent: whites superior to Blacks, men to women, Christians to Jews, straight people to gay people. And the emphasis on morality didn’t produce perfect people. Moral formation doesn’t succeed in making people angels—it tries to make them better than they otherwise might be.

Furthermore, we would never want to go back to the training methods that prevailed for so long, rooted in so many thou shall nots and so much shaming, and riddled with so much racism and sexism. Yet a wise accounting should acknowledge that emphasizing moral formation meant focusing on an important question—what is life for?—and teaching people how to bear up under inevitable difficulties. A culture invested in shaping character helped make people resilient by giving them ideals to cling to when times got hard. In some ways, the old approach to moral formation was, at least theoretically, egalitarian: If your status in the community was based on character and reputation, then a farmer could earn dignity as readily as a banker. This ethos came down hard on self-centeredness and narcissistic display. It offered practical guidance on how to be a good neighbor, a good friend.

And then it mostly went away.

The crucial pivot happened just after World War II, as people wrestled with the horrors of the 20th century. One group, personified by the theologian Reinhold Niebuhr, argued that recent events had exposed the prevalence of human depravity and the dangers, in particular, of tribalism, nationalism, and collective pride. This group wanted to double down on moral formation, with a greater emphasis on humility.

Another group, personified by Carl Rogers, a founder of humanistic psychology, focused on the problem of authority. The trouble with the 20th century, the members of this group argued, was that the existence of rigid power hierarchies led to oppression in many spheres of life. We need to liberate individuals from these authority structures, many contended. People are naturally good and can be trusted to do their own self-actualization.

A cluster of phenomenally successful books appeared in the decade after World War II, making the case that, as Rabbi Joshua Loth Liebman wrote in Peace of Mind (1946), “thou shalt not be afraid of thy hidden impulses.” People can trust the goodness inside. His book topped the New York Times best-seller list for 58 weeks. Dr. Spock’s first child-rearing manual was published the same year. That was followed by books like The Power of Positive Thinking (1952). According to this ethos, morality is not something that we develop in communities. It’s nurtured by connecting with our authentic self and finding our true inner voice. If people are naturally good, we don’t need moral formation; we just need to let people get in touch with themselves. Organization after organization got out of the moral-formation business and into the self-awareness business. By the mid‑1970s, for example, the Girl Scouts’ founding ethos of service to others had shifted: “How can you get more in touch with you? What are you thinking? What are you feeling?” one Girl Scout handbook asked.

Schools began to abandon moral formation in the 1940s and ’50s, as the education historian B. Edward McClellan chronicles in Moral Education in America: “By the 1960s deliberate moral education was in full-scale retreat” as educators “paid more attention to the SAT scores of their students, and middle-class parents scrambled to find schools that would give their children the best chances to qualify for elite colleges and universities.” The postwar period saw similar change at the college level, Anthony Kronman, a former dean of Yale Law School, has noted. The “research ideal” supplanted the earlier humanistic ideal of cultivating the whole student. As academics grew more specialized, Kronman has argued, the big questions—What is the meaning of life? How do you live a good life?—lost all purchase. Such questions became unprofessional for an academic to even ask.

[Read: The benefits of character education]

In sphere after sphere, people decided that moral reasoning was not really relevant. Psychology’s purview grew, especially in family and educational matters, its vocabulary framing “virtually all public discussion” of the moral life of children, James Davison Hunter, a prominent American scholar on character education, noted in 2000. “For decades now, contributions from philosophers and theologians have been muted or nonexistent.” Psychology is a wonderful profession, but its goal is mental health, not moral growth.

From the start, some worried about this privatizing of morality. “If what is good, what is right, what is true is only what the individual ‘chooses’ to ‘invent,’ ” Walter Lippmann wrote in his 1955 collection, Essays in the Public Philosophy, “then we are outside the traditions of civility.” His book was hooted down by establishment figures such as the historian Arthur M. Schlesinger Jr.; the de-moralization of American culture was under way.

Over the course of the 20th century, words relating to morality appeared less and less frequently in the nation’s books: According to a 2012 paper, usage of a cluster of words related to being virtuous declined significantly. Among them were bravery (which dropped by 65 percent), gratitude (58 percent), and humbleness (55 percent). For decades, researchers have asked incoming college students about their goals in life. In 1967, about 85 percent said they were strongly motivated to develop “a meaningful philosophy of life”; by 2000, only 42 percent said that. Being financially well off became the leading life goal; by 2015, 82 percent of students said wealth was their aim.

In a culture devoid of moral education, generations grow up in a morally inarticulate, self-referential world. The Notre Dame sociologist Christian Smith and a team of researchers asked young adults across the country in 2008 about their moral lives. One of their findings was that the interviewees had not given the subject of morality much thought. “I’ve never had to make a decision about what’s right and what’s wrong,” one young adult told the researchers. “My teachers avoid controversies like that like the plague,” many teenagers said.

The moral instincts that Smith observed in his sample fell into the pattern that the philosopher Alasdair MacIntyre called “emotivism”: Whatever feels good to me is moral. “I would probably do what would make me happy” in any given situation, one of the interviewees declared. “Because it’s me in the long run.” As another put it, “If you’re okay with it morally, as long as you’re not getting caught, then it’s not really against your morals, is it?” Smith and his colleagues emphasized that the interviewees were not bad people but, because they were living “in morally very thin or spotty worlds,” they had never been given a moral vocabulary or learned moral skills.

Most of us who noticed the process of de-moralization as it was occurring thought a bland moral relativism and empty consumerism would be the result: You do you and I’ll do me. That’s not what happened.

“Moral communities are fragile things, hard to build and easy to destroy,” the psychologist Jonathan Haidt writes in The Righteous Mind. When you are raised in a culture without ethical structure, you become internally fragile. You have no moral compass to give you direction, no permanent ideals to which you can swear ultimate allegiance. “He who has a why to live for can bear with almost any how,” the psychiatrist (and Holocaust survivor) Viktor Frankl wrote, interpreting a famous Nietzsche saying. Those without a why fall apart when the storms hit. They begin to suffer from that feeling of moral emptiness that Émile Durkheim called “anomie.”

Expecting people to build a satisfying moral and spiritual life on their own by looking within themselves is asking too much. A culture that leaves people morally naked and alone leaves them without the skills to be decent to one another. Social trust falls partly because more people are untrustworthy. That creates crowds of what psychologists call “vulnerable narcissists.” We all know grandiose narcissists—people who revere themselves as the center of the universe. Vulnerable narcissists are the more common figures in our day—people who are also addicted to thinking about themselves, but who often feel anxious, insecure, avoidant. Intensely sensitive to rejection, they scan for hints of disrespect. Their self-esteem is wildly in flux. Their uncertainty about their inner worth triggers cycles of distrust, shame, and hostility.

“The breakdown of an enduring moral framework will always produce disconnection, alienation, and an estrangement from those around you,” Luke Bretherton, a theologian at Duke Divinity School, told me. The result is the kind of sadness I see in the people around me. Young adults I know are spiraling, leaving school, moving from one mental-health facility to another. After a talk I gave in Oklahoma, a woman asked me, “What do you do when you no longer want to be alive?” The very next night I had dinner with a woman who told me that her brother had died by suicide three months before. I mentioned these events to a group of friends on a Zoom call, and nearly half of them said they’d had a brush with suicide in their family. Statistics paint the broader picture: Suicide rates have increased by more than 30 percent since 2000, according to the CDC.

Sadness, loneliness, and self-harm turn into bitterness. Social pain is ultimately a response to a sense of rejection—of being invisible, unheard, disrespected, victimized. When people feel that their identity is unrecognized, the experience registers as an injustice—because it is. People who have been treated unjustly often lash out and seek ways to humiliate those who they believe have humiliated them.

Lonely eras are not just sad eras; they are violent ones. In 19th-century America, when a lot of lonely young men were crossing the western frontier, one of the things they tended to do was shoot one another. As the saying goes, pain that is not transformed gets transmitted. People grow more callous, defensive, distrustful, and hostile. The pandemic made it worse, but antisocial behavior is still high even though the lockdowns are over. And now we are caught in a cycle, ill treatment leading to humiliation and humiliation leading to more meanness. Social life becomes more barbaric, online and off.

If you put people in a moral vacuum, they will seek to fill it with the closest thing at hand. Over the past several years, people have sought to fill the moral vacuum with politics and tribalism. American society has become hyper-politicized.

[David Brooks: America is having a moral convulsion]

According to research by Ryan Streeter, the director of domestic-policy studies at the American Enterprise Institute, lonely young people are seven times more likely to say they are active in politics than young people who aren’t lonely. For people who feel disrespected, unseen, and alone, politics is a seductive form of social therapy. It offers them a comprehensible moral landscape: The line between good and evil runs not down the middle of every human heart, but between groups. Life is a struggle between us, the forces of good, and them, the forces of evil.

The Manichaean tribalism of politics appears to give people a sense of belonging. For many years, America seemed to be awash in a culture of hyper-individualism. But these days, people are quick to identify themselves by their group: Republican, Democrat, evangelical, person of color, LGBTQ, southerner, patriot, progressive, conservative. People who feel isolated and under threat flee to totalizing identities.

Politics appears to give people a sense of righteousness: A person’s moral stature is based not on their conduct, but on their location on the political spectrum. You don’t have to be good; you just have to be liberal—or you just have to be conservative. The stronger a group’s claim to victim status, the more virtuous it is assumed to be, and the more secure its members can feel about their own innocence.

Politics also provides an easy way to feel a sense of purpose. You don’t have to feed the hungry or sit with the widow to be moral; you just have to experience the right emotion. You delude yourself that you are participating in civic life by feeling properly enraged at the other side. That righteous fury rising in your gut lets you know that you are engaged in caring about this country. The culture war is a struggle that gives life meaning.

Politics overwhelms everything. Churches, universities, sports, pop culture, health care are swept up in a succession of battles that are really just one big war—red versus blue. Evangelicalism used to be a faith; today it’s primarily a political identity. College humanities departments used to study literature and history to plumb the human heart and mind; now they sometimes seem exclusively preoccupied with politics, and with the oppressive systems built around race, class, and gender. Late-night comedy shows have become political pep rallies. Hundreds of thousands of Americans died unnecessarily during the pandemic because people saw a virus through the lens of a political struggle.

This is not politics as it is normally understood. In psychically healthy societies, people fight over the politics of distribution: How high should taxes be? How much money should go to social programs for the poor and the elderly? We’ve shifted focus from the politics of redistribution to the politics of recognition. Political movements are fueled by resentment, by feelings that society does not respect or recognize me. Political and media personalities gin up dramas in which our side is emotionally validated and the other side is emotionally shamed. The person practicing the politics of recognition is not trying to get resources for himself or his constituency; he is trying to admire himself. He’s trying to use politics to fill the hole in his soul. It doesn’t work.

The politics of recognition doesn’t give you community and connection, certainly not in a system like our current one, mired in structural dysfunction. People join partisan tribes in search of belonging—but they end up in a lonely mob of isolated belligerents who merely obey the same orthodoxy.

If you are asking politics to be the reigning source of meaning in your life, you are asking more of politics than it can bear. Seeking to escape sadness, loneliness, and anomie through politics serves only to drop you into a world marked by fear and rage, by a sadistic striving for domination. Sure, you’ve left the moral vacuum—but you’ve landed in the pulverizing destructiveness of moral war. The politics of recognition has not produced a happy society. When asked by the General Social Survey to rate their happiness level, 20 percent of Americans in 2022 rated it at the lowest level—only 8 percent did the same in 1990.

[Read: What the longest study on human happiness found is the key to a good life]

America’s Founding Fathers studied the history of democracies going back to ancient Greece. They drew the lesson that democracies can be quite fragile. When private virtue fails, the constitutional order crumbles. After decades without much in the way of moral formation, America became a place where more than 74 million people looked at Donald Trump’s morality and saw presidential timber.

Even in dark times, sparks of renewal appear. In 2018, a documentary about Mister Rogers called Won’t You Be My Neighbor? was released. The film showed Fred Rogers in all his simple goodness—his small acts of generosity; his displays of vulnerability; his respect, even reverence, for each child he encountered. People cried openly while watching it in theaters. In an age of conflict and threat, the sight of radical goodness was so moving.

In the summer of 2020, the series Ted Lasso premiered. When Lasso describes his goals as a soccer coach, he could mention the championships he hopes to win or some other conventional metric of success, but he says, “For me, success is not about the wins and losses. It’s about helping these young fellas be the best versions of themselves on and off the field.”

That is a two-sentence description of moral formation. Ted Lasso is about an earnest, cheerful, and transparently kind man who enters a world that has grown cynical, amoral, and manipulative, and, episode after episode, even through his own troubles, he offers the people around him opportunities to grow more gracious, to confront their vulnerabilities and fears, and to treat one another more gently and wisely. Amid lockdowns and political rancor, it became a cultural touchstone, and the most watched show on Apple TV+.

Even as our public life has grown morally bare, people, as part of their elemental nature, yearn to feel respected and worthy of respect, need to feel that their life has some moral purpose and meaning. People still want to build a society in which it is easier to be good. So the questions before us are pretty simple: How can we build morally formative institutions that are right for the 21st century? What do we need to do to build a culture that helps people become the best versions of themselves?

A few necessities come immediately to mind.

A modern vision of how to build character. The old-fashioned models of character-building were hopelessly gendered. Men were supposed to display iron willpower that would help them achieve self-mastery over their unruly passions. Women were to sequester themselves in a world of ladylike gentility in order to not be corrupted by bad influences and base desires. Those formulas are obsolete today.

The best modern approach to building character is described in Iris Murdoch’s book The Sovereignty of Good. Murdoch writes that “nothing in life is of any value except the attempt to be virtuous.” For her, moral life is not defined merely by great deeds of courage or sacrifice in epic moments. Instead, moral life is something that goes on continually—treating people considerately in the complex situations of daily existence. For her, the essential moral act is casting a “just and loving” attention on other people.

Normally, she argues, we go about our days with self-centered, self-serving eyes. We see and judge people in ways that satisfy our own ego. We diminish and stereotype and ignore, reducing other people to bit players in our own all-consuming personal drama. But we become morally better, she continues, as we learn to see others deeply, as we learn to envelop others in the kind of patient, caring regard that makes them feel seen, heard, and understood. This is the kind of attention that implicitly asks, “What are you going through?” and cares about the answer.

I become a better person as I become more curious about those around me, as I become more skilled in seeing from their point of view. As I learn to perceive you with a patient and loving regard, I will tend to treat you well. We can, Murdoch concluded, “grow by looking.”

Mandatory social-skills courses. Murdoch’s character-building formula roots us in the simple act of paying attention: Do I attend to you well? It also emphasizes that character is formed and displayed as we treat others considerately. This requires not just a good heart, but good social skills: how to listen well. How to disagree with respect. How to ask for and offer forgiveness. How to patiently cultivate a friendship. How to sit with someone who is grieving or depressed. How to be a good conversationalist.

These are some of the most important skills a person can have. And yet somehow, we don’t teach them. Our schools spend years prepping students with professional skills—but offer little guidance on how to be an upstanding person in everyday life. If we’re going to build a decent society, elementary schools and high schools should require students to take courses that teach these specific social skills, and thus prepare them for life with one another. We could have courses in how to be a good listener or how to build a friendship. The late feminist philosopher Nel Noddings developed a whole pedagogy around how to effectively care for others.

A new core curriculum. More and more colleges and universities are offering courses in what you might call “How to Live.” Yale has one called “Life Worth Living.” Notre Dame has one called “God and the Good Life.” A first-year honors program in this vein at Valparaiso University, in Indiana, involves not just conducting formal debates on ideas gleaned from the Great Books, but putting on a musical production based on their themes. Many of these courses don’t give students a ready-made formula, but they introduce students to some of the venerated moral traditions—Buddhism, Judeo-Christianity, and Enlightenment rationalism, among others. They introduce students to those thinkers who have thought hard on moral problems, from Aristotle to Desmond Tutu to Martha Nussbaum. They hold up diverse exemplars to serve as models of how to live well. They put the big questions of life firmly on the table: What is the ruling passion of your soul? Whom are you responsible to? What are your moral obligations? What will it take for your life to be meaningful? What does it mean to be a good human in today’s world? What are the central issues we need to engage with concerning new technology and human life?

These questions clash with the ethos of the modern university, which is built around specialization and passing on professional or technical knowledge. But they are the most important courses a college can offer. They shouldn’t be on the margins of academic life. They should be part of the required core curriculum.

Intergenerational service. We spend most of our lives living by the logic of the meritocracy: Life is an individual climb upward toward success. It’s about pursuing self-interest.

There should be at least two periods of life when people have a chance to take a sabbatical from the meritocracy and live by an alternative logic—the logic of service: You have to give to receive. You have to lose yourself in a common cause to find yourself. The deepest human relationships are gift relationships, based on mutual care. (An obvious model for at least some aspects of this is the culture of the U.S. military, which similarly emphasizes honor, service, selflessness, and character in support of a purpose greater than oneself, throwing together Americans of different ages and backgrounds who forge strong social bonds.)

Those sabbaticals could happen at the end of the school years and at the end of the working years. National service programs could bring younger and older people together to work to address community needs.

These programs would allow people to experience other-centered ways of being and develop practical moral habits: how to cooperate with people unlike you. How to show up day after day when progress is slow. How to do work that is generous and hard.

Moral organizations. Most organizations serve two sets of goals—moral goals and instrumental goals. Hospitals heal the sick and also seek to make money. Newspapers and magazines inform the public and also try to generate clicks. Law firms defend clients and also try to maximize billable hours. Nonprofits aim to serve the public good and also raise money.

In our society, the commercial or utilitarian goals tend to eclipse the moral goals. Doctors are pressured by hospital administrators to rush through patients so they can charge more fees. Journalists are incentivized to write stories that confirm reader prejudices in order to climb the most-read lists. Whole companies slip into an optimization mindset, in which everything is done to increase output and efficiency.

Moral renewal won’t come until we have leaders who are explicit, loud, and credible about both sets of goals. Here’s how we’re growing financially, but also Here’s how we’re learning to treat one another with consideration and respect; here’s how we’re going to forgo some financial returns in order to better serve our higher mission.

Early in my career, as a TV pundit at PBS NewsHour, I worked with its host, Jim Lehrer. Every day, with a series of small gestures, he signaled what kind of behavior was valued there and what kind of behavior was unacceptable. In this subtle way, he established a set of norms and practices that still lives on. He and others built a thick and coherent moral ecology, and its way of being was internalized by most of the people who have worked there.

Politics as a moral enterprise. An ancient brand of amoralism now haunts the world. Authoritarian-style leaders like Donald Trump, Vladimir Putin, and Xi Jinping embody a kind of amoral realism. They evince a mindset that assumes that the world is a vicious, dog-eat-dog sort of place. Life is a competition to grab what you can. Force is what matters. Morality is a luxury we cannot afford, or merely a sham that elites use to mask their own lust for power. It’s fine to elect people who lie, who are corrupt, as long as they are ruthless bastards for our side. The ends justify the means.

Those of us who oppose these authoritarians stand, by contrast, for a philosophy of moral realism. Yes, of course people are selfish and life can be harsh. But over the centuries, civilizations have established rules and codes to nurture cooperation, to build trust and sweeten our condition. These include personal moral codes so we know how to treat one another well, ethical codes to help prevent corruption on the job and in public life, and the rules of the liberal world order so that nations can live in peace, secure within their borders.

Moral realists are fighting to defend and modernize these rules and standards—these sinews of civilization. Moral realism is built on certain core principles. Character is destiny. We can either elect people who try to embody the highest standards of honesty, kindness, and integrity, or elect people who shred those standards. Statecraft is soulcraft. The laws we pass shape the kinds of people we become. We can structure our tax code to encourage people to be enterprising and to save more, or we can structure the code to encourage people to be conniving and profligate. Democracy is the system that best enhances human dignity. Democratic regimes entrust power to the people, and try to form people so they will be responsible with that trust. Authoritarian regimes seek to create a world in which the strong do what they can and the weak suffer what they must.

Look, I understand why people don’t want to get all moralistic in public. Many of those who do are self-righteous prigs, or rank hypocrites. And all of this is only a start. But healthy moral ecologies don’t just happen. They have to be seeded and tended by people who think and talk in moral terms, who try to model and inculcate moral behavior, who understand that we have to build moral communities because on our own, we are all selfish and flawed. Moral formation is best when it’s humble. It means giving people the skills and habits that will help them be considerate to others in the complex situations of life. It means helping people behave in ways that make other people feel included, seen, and respected. That’s very different from how we treat people now—in ways that make them feel sad and lonely, and that make them grow unkind.

This article appears in the September 2023 print edition with the headline “How America Got Mean.”

College Football’s Power Brokers Are Destroying It

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 08 › college-football-greed-conference-alignment › 674930

The kickoff to the college-football season is a few weeks away, but fans are already seeing 2023’s biggest showdown—one that pits the long-term interests of schools and conferences against their own insatiable greed.

When a major football power switches from one conference to another—disrupting existing rivalries in favor of new opponents less familiar to fans—it’s always controversial. But numerous recent conference changes have disrupted the landscape to an unusual degree. Amid widespread complaints that college players’ newfound ability to profit from endorsement deals is harming a supposedly amateur sport, what’s really chewing college football to pieces are conference realignments fueled by schools’ and conferences’ avarice.

At the moment, the biggest sign of trouble is that the Pac-12 is being gutted amid a massive scramble across the NCAA Division I Football Bowl Subdivision for broadcast revenue. The venerable West Coast league has been unable to attract a major-network television deal, and as it struggles, marquee teams are abandoning the Pac-12 for bigger fortunes elsewhere.

[Jemele Hill: College football is cannibalizing itself]

The Big Ten is reportedly exploring the possibility of adding Oregon and Washington to its conference, a year after the conference gobbled up the University of Southern California and UCLA. Colorado doesn’t necessarily have the same national prominence as the two legendary California universities, but its announcement last month that it will return to the Big 12 after more than a decade in the Pac-12 is yet another blow for the latter.

The reason so many schools are on the move is that each member of a conference gets a share of its guaranteed television revenues. So the bigger the deal, the bigger each school’s allotment. Currently, the Big Ten and the Southeastern Conference (SEC) have the most lucrative television deals in college football. Disney, which owns ESPN, successfully landed all of the SEC’s media rights in 2020 with a 10-year, $3 billion deal that begins in 2024. The agreement will pay the SEC about $300 million a year—a huge bump from the $55 million a year that CBS was paying the conference. Especially now that Texas and Oklahoma are set to join the SEC in 2024, the conference appears to be set up for long-term success. So does the Big Ten, which last year secured a seven-year, $7 billion media-rights agreement with Fox, CBS, and NBC.

On some level, you have to sympathize with college-football fans as the conference-realignment version of Game of Thrones plays out. Traditions, history, and entrenched rivalries are what make college football so appealing. As these schools and conferences jockey for financial position, traditions and history become an afterthought.

The Big Ten and the SEC naturally have emerged as the most attractive destinations in college football, and schools aren’t shy about their willingness to abandon conference solidarity and tradition for a bigger paycheck elsewhere. The Pac-12 isn’t the only conference facing a harsh reality. As Sports Illustrated has reported, at least half of the Atlantic Coast Conference (ACC) schools are considering leaving.

Florida State University’s president, Richard McCullough, said this week that his school faces “a very difficult situation,” even “an existential crisis,” as schools outside the ACC score tens of millions of dollars more a year to build facilities, retain coaches, and maximize their recruited athletes’ ability to profit from their fame.

The imperative to take account of players’ needs is something new for colleges and conferences. For many years, college athletes could be compensated only with a scholarship, and their otherwise-unpaid labor became the basis of a hugely lucrative business. But when courts and state legislatures decided that college athletes should be allowed to make money off of their name, image, and likeness, the change added a new variable for colleges. In deciding where to enroll, athletes now consider which schools might offer them the greatest chance of landing endorsement deals and monetizing their social-media fame. (Signing with a Big Ten or SEC member school is a good way for athletes to get their face on TV.) These considerations rankle college-football traditionalists, who supposedly want to uphold the old ideal of student athletes.

“I am against anything that devalues education,” the Clemson University football coach Dabo Swinney told ESPN last year. “That’s what I’m against. I am for anything that incentivizes education. People will come after me because I’ve always said that I’m against the professionalism of college athletics, and I am. Kids don’t know what they don’t know.”

[Devin Gordon: America ruined college football. Now college football is ruining America.]

That’s brazen coming from Swinney, who is in the midst of a 10-year, $115 million contract extension that he signed in September 2022. And so much for sticking with the old ways: Clemson, a member of the ACC for many decades, is reportedly among the schools seriously considering leaving for more money.

That athletes can now make money from their likeness is largely irrelevant to the fundamental issue: The top conferences’ broadcast deals have simply become so lucrative that colleges can’t resist seeking their share.

“The old question of, ‘How long would it take TV money to destroy college football?’ Maybe we’re here,” the Washington State University coach Jake Dickert told reporters Thursday. “To think, even remotely, five years ago, [that] the Pac-12 would be in this position, it’s unthinkable to think that we’re here today. And to think that local rivalries are at risk … to me, is unbelievable.”

For so long, college-football power brokers spent a lot of time conjuring every excuse as to why a fair and equitable system for players just wasn’t feasible. Now colleges’ hypocrisy is being fully exposed. Athletes were simply seeking equity and fair market value, and they’re finally able to get it. Colleges have been beholden to money the whole time.

What It Was Like to Live in My Car

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 08 › california-vehicular-homelessness-car-dwelling-los-angeles › 674901

The month I moved to Los Angeles felt apocalyptic, even by the standards of a city forever being destroyed in film. It was the end of the summer of 2020; stores were closed, streets empty, and wildfires had enveloped the region in smoke, turning the sky orange. Yet after I parked the U-Haul, things got even bleaker.

Walking to my new apartment, I passed a car where a 20-something had passed out with the engine running. Folks, I noticed, were sleeping in nearly every car on the street—a mix, I would later learn, of UCLA students and construction workers.

I had never encountered vehicular homelessness before moving out West. Indeed, it hadn’t even registered to me as a possibility, as a thing one might do to avoid sleeping on the street. In New York City, most homeless people don’t own cars, and in any case, the city has a legal obligation to provide shelter. This is not true in California.

Nearly 20,000 Angelenos live in RVs, vans, or cars—a 55 percent increase since the count began, in 2016. As the housing shortage deepens, thousands more will likely be forced into this lifestyle. Many of these people do not have the mental-health or substance-abuse issues eagerly trotted out to dismiss the homelessness crisis. A significant minority have jobs—they’re people who stock shelves or install drywall but simply can’t afford a home.

Like most Angelenos, I was repulsed by the homelessness crisis, vehicular or otherwise. Early in the summer of 2021, I temporarily joined the 20,000. Amid COVID-19 lockdowns, I was paying half of my income for a bedroom in a shared student apartment furnished like a doctor's office waiting room. My lease was set to expire, and I had to travel for work, anyway. Moving into my Prius seemed like the best bad option.

Angelenos love their cars, the stereotype goes. Our city’s distinctive natural wonder is, after all, the tar pits: Los Angeles wants to be paved over. And many see a certain American romance in a stretch of living, free and unencumbered, on the road.

Search YouTube for living out of a Prius and the first thing you’ll find is a former Bachelor contestant and NFL cheerleader who has pulled in millions of views for her travels in a mint-green 2006 Prius. Hundreds of social-media accounts offer similar adventures. Their styles vary, but the pitch is consistent: Save money; see the country; live your best life.

Why the Prius in particular? Unlike vans or RVs, the Toyota hybrid offers escape at rock-bottom prices. A 10-year-old beat-up Prius can run as low as $7,500. The car requires minimal maintenance and gets high gas mileage, and thanks to the hybrid battery, you can leave it running overnight for heat or AC.

Online communities such as the r/priusdwellers subreddit celebrate novel builds—lifted Priuses, Priuses with solar panels, Priuses with more storage than an IKEA showroom. But my build was basic: Drop the rear seats, stack a 28-quart container on a 54-quart container on the floor, and put a pillow on top to create a flat, six-foot-long clearing. Lay down a yoga mat, a mattress topper, and a sleeping pad, and you have a bed more comfortable than any hotel mattress. You can add rods for hanging curtains and clothes, plus a sunshade and rain guards for privacy.

On my first day living out of my Prius, I whizzed up the Pacific Coast Highway before hopping over to the 101, which runs through the sleepy Salinas Valley of Steinbeck fame. As the sun started to set, I realized that I hadn’t planned out where I was going to camp for the night and was forced to make my first rookie mistake: sleeping at a highway rest area.

The parking lot was packed with people living out of vehicles—truckers in semis, middle-class retirees in RVs, Millennials in tricked-out vans, and quite a few people in cars poorly suited to vehicle living, with stacks of luggage filling passenger seats and shirts pinched into closed windows to serve as curtains.

As I lay in the back of my Prius, reading by headlamp, I looked over to see a family of four sleeping in an old Honda Accord. A man slept in a reclined driver’s seat. A child stretched across the back seat. In the front passenger seat, a woman cradled a sleeping toddler. I hoped it was only for the night—some mix-up or scheduling mistake—but I suspected otherwise.

At stops like this, I often talked with fellow travelers, quickly finding a surprising degree of camaraderie among vehicle dwellers. Of course, many just want to be left alone, but others share food, jump one another’s stalled-out vehicles, and—most important of all—swap notes on where it’s safe to park.

The next day, I drove through San Francisco up to southern Oregon. Using Free Campsites, a peer-to-peer platform for finding and reviewing camping locations, I picked a patch of Bureau of Land Management property just off I-5. For people living out of vehicles on the cheap, BLM land is the gold standard of campgrounds—parking is free for up to 14 days, and the sites are quiet, safe, and at least vaguely scenic.

After spending a few days with relatives in the Willamette Valley, I broke east toward Boise along Route 20, driving through a dust storm in the eastern Oregon Badlands. I stopped off in the foothills of the Boise National Forest, then beelined to a BLM campsite north of Yellowstone, where I spent a few days working off a mobile hotspot, free of distraction.

My experiment in vehicle dwelling was supposed to wrap up around this time. I had to get back to Los Angeles to help teach classes at UCLA. But the vacancy rate for apartments in the city was low, my Ph.D. stipend was paltry, and I was facing some unexpected debt. I realized I wouldn’t be moving out of the Prius anytime soon.

Sleeping in a car in the city is much grimmer than in remote areas. Many cities ban vehicle living entirely, though often a de facto ban is enforced through parking policies, such as permit requirements or limited hours.

Los Angeles deploys a zone system, dividing the city into a patchwork of areas where vehicle living isn’t tolerated and areas where it is. Places where it’s not tolerated tend to be nice and well lit—residential neighborhoods and parking lots. Streets where it is tolerated tend to be dark and isolated, the kinds of places where you risk being the victim of a break-in. Sleep on the wrong street at the wrong time, and you could be ticketed, towed, or woken by police officers knocking on the window in the middle of the night.

When I didn’t need to be close to campus, I often slept in the Angeles National Forest, just northeast of La Cañada Flintridge. Forest rangers there turn a mercifully blind eye to the dozens of families who sleep each night in dirt pullouts along Angeles Crest Highway. When I did need to be close to school, I slept among other UCLA students and construction workers a few blocks from campus—the exact scene that had so repulsed me when I first moved to Los Angeles.

There are three categories of vehicle living in Los Angeles. And thanks to citywide counts, we know exactly where each group clusters. Slightly more than half of the people living out of vehicles are in RVs. Large and conspicuous, RVs are typically tolerated only in industrial areas, where they line many streets. Roughly one in six live in vans. Thanks to the popularity of “van life” culture, they tend to concentrate in hip, beachside neighborhoods like Venice.

And then there are cars. By the official count, they house nearly a quarter of people who live out of vehicles, but this is almost certainly an undercount, because cars and their residents blend in. Relative to other people struggling with homelessness, car dwellers are more likely to be white, women, parents, and only temporarily homeless.

Of course, vehicle living can pose sanitation and public-health concerns. But criminalizing it, as so many cities effectively do, does nothing to address the obvious underlying cause of vehicular homelessness—a lack of housing. It just makes people’s already hard lives harder.

The good news is that some cities are reforming these policies. Starting with Santa Barbara in 2004, many cities have implemented “safe parking” programs, setting aside parking lots where people who live out of cars can park overnight free of harassment. The facilities are often hosted by faith groups, and the best ones provide security, bathrooms and showers, and access to case workers who can connect residents with social services.

But by one estimate, Los Angeles provides fewer than 500 such parking spots. Even if the city converted all 11,400 public parking spaces into safe parking, it still wouldn’t be enough.

Here at UCLA, where one in 20 students will at some point struggle with homelessness, administrators have rejected student-led requests for on-campus safe parking—a campaign organized in part by one of my former students who spent a few months living out of his car on the same Westwood street where I would occasionally sleep. Perhaps it would be embarrassing for the university to admit that many students live out of vehicles. But is the alternative any less embarrassing?

If the student-homelessness crisis has a silver lining, it’s that it seems to have created a generation of activists committed to reform. You can throw a rock at pro-housing YIMBY (“Yes in My Backyard”) gatherings and hit someone who has been forced to live out of a car. That includes Muhammad Alameldin, a researcher at the Terner Center for Housing Innovation. He was a student at Berkeley when a snafu with roommates and a brutal Bay Area housing shortage pushed him into his Prius for three months.

Like Alameldin, I moved back into an apartment after three months of living in my Prius, a period made manageable by the occasional stay in a cheap hotel or with friends and family.

Ask anyone living out of a car how they fell into this life, and they will likely say: “I wanted to live free”; “I wanted to see the country”; “I wanted to go on an adventure.” But let the conversation carry on for more than a few minutes, and you will inevitably bump into a sadder origin story: a layoff, a divorce, a death, a foreclosure, an eviction.

The urge to roam is human. But roaming is a lot more romantic when it isn’t done out of desperation.