My Father, My Faith, and Donald Trump

The Atlantic

It was July 29, 2019—the worst day of my life, though I didn’t know that quite yet.

The traffic in downtown Washington, D.C., was inching along. The mid-Atlantic humidity was sweating through the windows of my chauffeured car. I was running late and fighting to stay awake. For two weeks, I’d been sprinting between television and radio studios up and down the East Coast, promoting my new book on the collapse of the post–George W. Bush Republican Party and the ascent of Donald Trump. Now I had one final interview for the day. My publicist had offered to cancel—it wasn’t that important, she said—but I didn’t want to. It was important. After the car pulled over on M Street Northwest, I hustled into the stone-pillared building of the Christian Broadcasting Network.

All in a blur, the producers took my cellphone, mic’d me up, and shoved me onto the set with the news anchor John Jessup. Camera rolling, Jessup skipped past the small talk. He was keen to know, given his audience, what I had learned about the president’s alliance with America’s white evangelicals. Despite being a lecherous, impenitent scoundrel—the 2016 campaign was marked by his mocking of a disabled man, his xenophobic slander of immigrants, his casual calls to violence against political opponents—Trump had won a historic 81 percent of white evangelical voters. Yet that statistic was just a surface-level indicator of the foundational shifts taking place inside the Church. Polling showed that born-again Christian conservatives, once the president’s softest backers, were now his most unflinching advocates. Jessup had the same question as millions of other Americans: Why?

As a believer in Jesus Christ—and as the son of an evangelical minister, raised in a conservative church in a conservative community—I had long struggled with how to answer this question. The truth is, I knew lots of Christians who, to varying degrees, supported the president, and there was no way to summarily describe their diverse attitudes, motivations, and behaviors. They were best understood as points plotted across a spectrum. At one end were the Christians who maintained their dignity while voting for Trump—people who were clear-eyed in understanding that backing a candidate, pragmatically and prudentially, need not lead to unconditionally promoting, empowering, and apologizing for that candidate. At the opposite end were the Christians who had jettisoned their credibility—people who embraced the charge of being reactionary hypocrites, still fuming about Bill Clinton’s character as they jumped at the chance to go slumming with a playboy turned president.

[From the April 2018 issue: Michael Gerson on Trump and the evangelical temptation]

Most of the Christians I knew fell somewhere in the middle. They had to some extent been seduced by the cult of Trumpism, yet to composite all of these people into a caricature was misleading. Something more profound was taking place. Something was happening in the country—something was happening in the Church—that we had never seen before. I had attempted, ever so delicately, to make these points in my book. Now, on the TV set, I was doing a similar dance.

Jessup seemed to sense my reticence. Pivoting from the book, he asked me about a recent flare-up in the evangelical world. In response to the Trump administration’s policy of forcibly separating migrant families at the U.S.-Mexico border, Russell Moore, a prominent leader with the Southern Baptist Convention, had tweeted, “Those created in the image of God should be treated with dignity and compassion, especially those seeking refuge from violence back home.” At this, Jerry Falwell Jr.—the son and namesake of the Moral Majority founder, and then-president of Liberty University, one of the world’s largest Christian colleges—took great offense. “Who are you @drmoore?” he replied. “Have you ever made a payroll? Have you ever built an organization of any type from scratch? What gives you authority to speak on any issue?”

This being Twitter and all, I decided to chime in. “There are Russell Moore Christians and Jerry Falwell Jr. Christians,” I wrote, summarizing the back-and-forth. “Choose wisely, brothers and sisters.”

Now Jessup was reading my tweet on-air. “Do you really see evangelicals divided into two camps?” the anchor asked.

I stumbled. Conceding that it might be an “oversimplification,” I warned still of a “fundamental disconnect” between Christians who view issues through the eyes of Jesus and Christians who process everything through a partisan political filter.

[From the June 2022 issue: Tim Alberta on how politics poisoned the evangelical church]

As the interview ended, I knew I’d botched an opportunity to state plainly my qualms about the American evangelical Church. Truth be told, I did see evangelicals divided into two camps—one side faithful to an eternal covenant, the other side bowing to earthly idols of nation and influence and fame—but I was too scared to say so. My own Christian walk had been so badly flawed. And besides, I’m no theologian; Jessup was asking for my journalistic analysis, not my biblical exegesis.

Walking off the set, I wondered if my dad might catch that clip. Surely somebody at our home church would see it and pass it along. I grabbed my phone, then stopped to chat with Jessup and a few of his colleagues. As we said our farewells, I looked down at the phone, which had been silenced. There were multiple missed calls from my wife and oldest brother. Dad had collapsed from a heart attack. There was nothing the surgeons could do. He was gone.

The last time I saw him was nine days earlier. The CEO of Politico, my employer at the time, had thrown a book party for me at his Washington manor, and Mom and Dad weren’t going to miss that. They jumped in their Chevy and drove out from my childhood home in southeast Michigan. When he sauntered into the event, my old man looked out of place—a rumpled midwestern minister, baggy shirt stuffed into his stained khakis—but before long he was holding court with diplomats and Fortune 500 lobbyists, making them howl with irreverent one-liners. It was like a Rodney Dangerfield flick come to life. At one point, catching sight of my agape stare, he gave an exaggerated wink, then delivered a punch line for his captive audience.

It was the high point of my career. The book was getting lots of buzz; already I was being urged to write a sequel. Dad was proud—very proud, he assured me—but he was also uneasy. For months, as the book launch drew closer, he had been urging me to reconsider the focus of my reporting career. Politics, he kept saying, was a “sordid, nasty business,” a waste of my time and God-given talents. Now, in the middle of the book party, he was taking me by the shoulder, asking a congressman to excuse us for just a moment. Dad put his arm around me and leaned in.

“You see all these people?” he asked.

“Yeah.” I nodded, grinning at the validation.

“Most of them won’t care about you in a week,” he said.

The record scratched. My moment of rapture was interrupted. I cocked my head and smirked at him. Neither of us said anything. I was bothered. The longer we stood there in silence, the more bothered I became. Not because he was wrong. But because he was right.

“Remember,” Dad said, smiling. “On this Earth, all glory is fleeting.”

Now, as I raced to Reagan National Airport and boarded the first available flight to Detroit, his words echoed. There was nothing contrived about Dad’s final admonition to me. That is what he believed; that is who he was.

Once a successful New York financier, Richard J. Alberta had become a born-again Christian in 1977. Despite having a nice house, beautiful wife, and healthy firstborn son, he felt a rumbling emptiness. He couldn’t sleep. He developed debilitating anxiety. Religion hardly seemed like the solution; Dad came from a broken and unbelieving home. He had decided, halfway through his undergraduate studies at Rutgers University, that he was an atheist. And yet, one weekend while visiting family in the Hudson Valley, my dad agreed to attend church with his niece, Lynn. He became a new person that day. His angst was quieted. His doubts were overwhelmed. Taking Communion for the first time at Goodwill Church in Montgomery, New York, he prayed to acknowledge Jesus as the son of God and accept him as his personal savior.

Dad became unrecognizable to those who knew him. He rose early, hours before work, to read the Bible, filling a yellow legal pad with verses and annotations. He sat silently for hours in prayer. My mom thought he’d lost his mind. A young journalist who worked under Howard Cosell at ABC Radio in New York, Mom was suspicious of all this Jesus talk. But her maiden name—Pastor—was proof of God’s sense of humor. Soon she accepted Christ too.

When Dad felt he was being called to abandon his finance career and enter the ministry, he met with Pastor Stewart Pohlman at Goodwill. As they prayed in Pastor Stew’s office, Dad said he felt the spirit of the Lord swirling around him, filling up the room. He was not given to phony supernaturalism—in fact, Dad might have been the most intellectually sober, reason-based Christian I’ve ever known—but that day, he felt certain, the Lord anointed him. Soon he and Mom were selling just about every material item they owned, leaving their high-salaried jobs in New York, and moving to Massachusetts so he could study at Gordon-Conwell Theological Seminary.

For the next two decades, they worked in small churches here and there, living off food stamps and the generosity of fellow believers. By the time I arrived, in 1986, Dad was Pastor Stew’s associate at Goodwill. We lived in the church parsonage; my nursery was the library, where towers of leather-wrapped books had been collected by the church’s pastors dating back to the mid-18th century. A few years later we moved to Michigan, and Dad eventually put down roots at a start-up, Cornerstone Church, in the Detroit suburb of Brighton. It was part of a minor denomination called the Evangelical Presbyterian Church (EPC), and it was there, for the next 26 years, that he served as senior pastor.

Cornerstone was our home. Because Mom also worked on staff, leading the women’s ministry, I was quite literally raised in the church: playing hide-and-seek in storage areas, doing homework in the office wing, bringing high-school dates to Bible study, working as a janitor during a year of community college. I hung around the church so much that I decided to leave my mark: At 9 years old, I used a pocket knife to etch my initials into the brickwork of the narthex.

The last time I’d been there, 18 months earlier, I’d spoken to a packed sanctuary at Dad’s retirement ceremony, armed with good-natured needling and PG-13 anecdotes. Now I would need to give a very different speech.

Standing in the back of the sanctuary, my three older brothers and I formed a receiving line. Cornerstone had been a small church when we’d arrived as kids. Not anymore. Brighton, once a sleepy town situated at the intersection of two expressways, had become a prized location for commuters to Detroit and Ann Arbor. Meanwhile, Dad, with his baseball allegories and Greek-linguistics lessons, had gained a reputation for his eloquence in the pulpit. By the time I moved away, in 2008, Cornerstone had grown from a couple hundred members to a couple thousand.

Now the crowd swarmed around us, filling the sanctuary and spilling out into the lobby and adjacent hallways, where tables displayed flowers and golf clubs and photos of Dad. I was numb. My brothers too. None of us had slept much that week. So the first time someone made a glancing reference to Rush Limbaugh, it did not compute. But then another person brought him up. And then another. That’s when I connected the dots. Apparently, the king of conservative talk radio had been name-checking me on his program recently—“a guy named Tim Alberta”—and describing the unflattering revelations in my book about Trump. Nothing in that moment could have mattered to me less. I smiled, shrugged, and thanked people for coming to the visitation.

They kept on coming. More than I could count. People from the church—people I’d known my entire life—were greeting me, not primarily with condolences or encouragement or mourning, but with commentary about Limbaugh and Trump. Some of it was playful, guys remarking about how I was the same mischief-maker they’d known since kindergarten. But some of it wasn’t playful. Some of it was angry; some of it was cold and confrontational. One man questioned whether I was truly a Christian. Another asked if I was still on “the right side.” All while Dad was in a box a hundred feet away.

It got to the point where I had to take a walk. Here, in our house of worship, people were taunting me about politics as I tried to mourn my father. I was in the company of certain friends that day who would not claim to know Jesus, yet they shrouded me in peace and comfort. Some of these card-carrying evangelical Christians? Not so much. They didn’t see a hurting son; they saw a vulnerable adversary.

That night, while fine-tuning the eulogy I would give at Dad’s funeral the following afternoon, I still felt the sting. My wife perceived as much. The unflappable one in the family, she encouraged me to be careful with my words and cautioned against mentioning the day’s unpleasantness. I took half of her advice.

In front of an overflow crowd on August 2, 2019, I paid tribute to the man who’d taught me everything—how to throw a baseball, how to be a gentleman, how to trust and love the Lord. Reciting my favorite verse, from Paul’s second letter to the early Church in Corinth, Greece, I told of Dad’s instruction to keep our eyes fixed on what we could not see. Reading from his favorite poem, about a man named Richard Cory, I told of Dad’s warning that we could amass great wealth and still be poor.

Then I recounted all the people who’d approached me the day before, wanting to discuss the Trump wars on AM talk radio. I proposed that their time in the car would be better spent listening to Dad’s old sermons. I spoke of the need for discipleship and spiritual formation. I suggested, with some sarcasm, that if they needed help finding biblical listening for their daily commute, the pastors here on staff could help. “Why are you listening to Rush Limbaugh?” I asked my father’s congregation. “Garbage in, garbage out.”

There was nervous laughter in the sanctuary. Some people were visibly agitated. Others looked away, pretending not to hear. My dad’s successor, a young pastor named Chris Winans, wore a shell-shocked expression. No matter. I had said my piece. It was finished. Or so I thought.

A few hours later, after we had buried Dad, my brothers and I slumped down onto the couches in our parents’ living room. We opened some beers and turned on a baseball game. Behind us, in the kitchen, a small platoon of church ladies worked to prepare a meal for the family. Here, I thought, is the love of Christ. Watching them hustle about, comforting Mom and catering to her sons, I found myself regretting the Limbaugh remark. Most of the folks at our church were humble, kindhearted Christians like these women. Maybe I’d blown things out of proportion.

Just then, one of them walked over and handed me an envelope. It had been left at the church, she said. My name was scrawled across it. I opened the envelope. Inside was a full-page-long, handwritten screed. It was from a longtime Cornerstone elder, someone my dad had called a friend, a man who’d mentored me in the youth group and had known me for most of my life.

He had composed this note, on the occasion of my father’s death, to express just how disappointed he was in me. I was part of an evil plot, the man wrote, to undermine God’s ordained leader of the United States. My criticisms of President Trump were tantamount to treason—against both God and country—and I should be ashamed of myself.

However, there was still hope. Jesus forgives, and so could this man. If I used my journalism skills to investigate the “deep state,” he wrote, uncovering the shadowy cabal that was supposedly sabotaging Trump’s presidency, then I would be restored. He said he was praying for me.

I felt sick. Silently, I passed the letter to my wife. She scanned it without expression. Then she flung the piece of paper into the air and, with a shriek that made the church ladies jump out of their cardigans, cried out: “What the hell is wrong with these people?”

There has never been consensus on what, exactly, it means to be an evangelical. Competing and overlapping definitions have been offered for generations, some more widely embraced than others. Billy Graham, a man synonymous with the term, once remarked that he himself would like to inquire as to its true meaning. By the 1980s, thanks to the efforts of televangelists and political activists, what was once a religious signifier began transforming into a partisan movement. Evangelical soon became synonymous with conservative Christian, and eventually with white conservative Republican.

[Read: Defining evangelical]

My dad, a serious theologian who held advanced degrees from top seminaries, bristled at reductive analyses of his religious tribe. He would frequently state from the pulpit what he believed an evangelical to be: someone who interprets the Bible as the inspired word of God and who takes seriously the charge to proclaim it to the world.

From a young age, I realized that not all Christians were like my dad. Other adults who went to our church—my teachers, coaches, friends’ parents—didn’t speak about God the way that he did. Theirs was a more casual Christianity, less a lifestyle than a hobby, something that could be picked up and put down and slotted into schedules. Their pastor realized as much. Pushing his people ever harder to engage with questions of canonical authority and trinitarian precepts and Calvinist doctrine, Dad tried his best to run a serious church.

The author and his father in 2019 (Courtesy of Tim Alberta)

But for all his successes, Dad had one great weakness. Pastor Alberta’s kryptonite as a Christian—and I think he knew it, though he never admitted it to me—was his intense love of country.

Once a talented young athlete, Dad came down with tuberculosis at 16 years old. He was hospitalized for four months; at one point, doctors thought he might die. He eventually recovered, and with the Vietnam War escalating, he joined the Marine Corps. But at the Officer Candidates School in Quantico, Virginia, he fell behind in the physical work. His lungs were not healthy. After receiving an honorable discharge, Dad went home saddled with a certain shame. In the ensuing years, he learned that dozens of the second lieutenants he’d trained alongside at Quantico—as well as a bunch of guys he’d grown up with—were killed in action. It burdened him for the rest of his life.

This experience, and his disgust with the hippies and the drug culture and the war protesters, turned Dad into a law-and-order conservative. Marinating in the language of social conservatism during his time in seminary—this was the heyday of the Moral Majority—he emerged a full-spectrum Republican. His biggest political concern was abortion; in 1947, my grandmother, trapped in an emotionally abusive marriage, had almost ended her pregnancy with him. (She had a sudden change of heart at the clinic and walked out, a decision my dad would always attribute to holy intercession.) But he also waded into the culture wars: gay marriage, education curriculum, morality in public life.

Dad always told us that personal integrity was a prerequisite for political leadership. He was so relieved when Bill Clinton’s second term ended that he and Mom hosted a small viewing party in our living room for George W. Bush’s 2001 inauguration, to celebrate the return of morality to the White House. Over time, however, his emphasis shifted. One Sunday in early 2010, when I was home visiting, he showed the congregation an ominous video in which Christian leaders warned about the menace of Obamacare. I told him afterward that it felt inappropriate for a worship service; he disagreed. We would butt heads more regularly in the years that followed. It was always loving, always respectful. Yet clearly our philosophical paths were diverging—a reality that became unavoidable during the presidency of Donald Trump.

Dad would have preferred any of the other Republicans who ran in 2016. He knew that Trump was a narcissist and a liar; he knew that he was not a moral man. Ultimately Dad felt he had no choice but to support the Republican ticket, given his concern for the unborn and the Supreme Court majority that hung in the balance. I understood that decision. What I couldn’t understand was how, over the next couple of years, he became an apologist for Trump’s antics, dismissing criticisms of the president’s conduct as little more than an attempt to marginalize his supporters. Dad really did believe this; he believed that the constant attacks on Trump’s character were ipso facto an attack on the character of people like himself, which I think, on some subconscious level, created a permission structure for him to ignore the president’s depravity. All I could do was tell Dad the truth. “Look, you’re the one who taught me to know right from wrong,” I would say. “Don’t be mad at me for acting on it.”

To his credit, Dad was not some lazy, knee-jerk partisan. He was vocal about certain issues—gun violence, poverty, immigration, the trappings of wealth—that did not play to his constituency at Cornerstone.

Dad wasn’t a Christian nationalist; he wanted nothing to do with theocracy. He just believed that God had blessed the United States uniquely—and felt that anyone who fought to preserve those blessings was doing the Lord’s work. This made for an unfortunate scene in 2007, when a young congregant at Cornerstone, a Marine named Mark Kidd, died during a fourth tour of duty in Iraq. Public opinion had swung sharply against the war, and Democrats were demanding that the Bush administration bring the troops home. My dad was devastated by Kidd’s death. They had corresponded while Kidd was overseas and met for prayer in between his deployments. Dad’s grief as a pastor gave way to his grievance as a Republican supporter of the war: He made it known to local Democratic politicians that they weren’t welcome at the funeral.

“I am ashamed, personally, of leaders who say they support the troops but not the commander in chief,” Dad thundered from his pulpit, earning a raucous standing ovation. “Do they not see that discourages the warriors and encourages the terrorists?”

This touched off a firestorm in our community. Most of the church members were all for Dad’s remarks, but even in a conservative town like Brighton, plenty of people felt uneasy about turning a fallen Marine’s church memorial into a partisan political rally. Patriotism in the pulpit is one thing; lots of sanctuaries fly an American flag on the rostrum. This was something else. This was taking the weight and the gravity and the eternal certainty of God and lending it to an ephemeral and questionable cause. This was rebuking people for failing to unconditionally follow the president of the United States when the only authority we’re meant to unconditionally follow—particularly in a setting of stained-glass windows—is Christ himself.

I know Dad regretted it. But he couldn’t help himself. His own personal story—and his broader view of the United States as a godly nation, a source of hope in a despondent world—was impossible to divorce from his pastoral ministry. Every time a member of the military came to church dressed in uniform, Dad would recognize them by name, ask them to stand up, and lead the church in a rapturous round of applause. This was one of the first things his successor changed at Cornerstone.

Eighteen months after Dad’s funeral, in February 2021, I sat down across from that successor, Chris Winans, in a booth at the Brighton Bar & Grill. It’s a comfortable little haunt on Main Street, backing up to a wooden playground and a millpond. But Winans didn’t look comfortable. He looked nervous, even a bit paranoid, glancing around him as we began to speak. Soon, I would understand why.

Dad had spent years looking for an heir apparent. Several associate pastors had come and gone. Cornerstone was his life’s work—he had led the church throughout virtually its entire history—so there would be no settling in his search for a successor. The uncertainty wore him down. Dad worried that he might never find the right guy. And then one day, while attending a denominational meeting, he met Winans, a young associate pastor from Goodwill—the very church where he’d been saved, and where he’d worked his first job out of seminary. Dad hired him away from Goodwill to lead a young-adults ministry at Cornerstone, and from the moment Winans arrived, I could tell that he was the one.

Barely 30 years old, Winans looked to be exactly what Cornerstone needed in its next generation of leadership. He was a brilliant student of the scriptures. He spoke with precision and clarity from the pulpit. He had a humble, easygoing way about him, operating without the outsize ego that often accompanies first-rate preaching. Everything about this pastor—the boyish sweep of brown hair, his delightful young family—seemed to be straight out of central casting.

There was just one problem: Chris Winans was not a conservative Republican. He didn’t like guns. He cared more about funding anti-poverty programs than cutting taxes. He had no appetite for President Trump’s unrepentant antics. Of course, none of this would seem heretical to Christians in other parts of the world; given his staunch anti-abortion position, Winans would in most places be considered the picture of spiritual and intellectual consistency. But in the American evangelical tradition, and at a church like Cornerstone, the whiff of liberalism made him suspect.

Dad knew the guy was different. Winans liked to play piano instead of sports, and had no taste for hunting or fishing. Frankly, Dad thought that was a bonus. Winans wasn’t supposed to simply placate Cornerstone’s aging base of wealthy white congregants. The new pastor’s charge was to evangelize, to cast a vision and expand the mission field, to challenge those inside the church and carry the gospel to those outside it. Dad didn’t think there was undue risk. He felt confident that his hand-chosen successor’s gifts in the pulpit, and his manifest love of Jesus, would smooth over any bumps in the transition.

He was wrong. Almost immediately after Winans moved into the role of senior pastor, at the beginning of 2018, the knives came out. Any errant remark he made about politics or culture, any slight against Trump or the Republican Party—real or perceived—invited a torrent of criticism. Longtime members would demand a meeting with Dad, who had stuck around in a support role, and unload on Winans. Dad would ask if there was any substantive criticism of the theology; almost invariably, the answer was no. A month into the job, when Winans remarked in a sermon that Christians ought to be protective of God’s creation—arguing for congregants to take seriously the threats to the planet—people came to Dad by the dozens, outraged, demanding that Winans be reined in. Dad told them all to get lost. If anyone had a beef with the senior pastor, he said, they needed to take it up with the senior pastor. (Dad did so himself, buying Winans lunch at Chili’s and suggesting that he tone down the tree hugging.)

Winans had a tough first year on the job, but he survived it. The people at Cornerstone were in an adjustment period. He needed to respect that—and he needed to adjust, too. As long as Dad had his back, Winans knew he would be okay.

And then Dad died.

Now, Winans told me, he was barely hanging on at Cornerstone. The church had become unruly; his job had become unbearable. Not long after Dad died—making Winans the unquestioned leader of the church—the coronavirus pandemic arrived. And then George Floyd was murdered. All of this as Donald Trump campaigned for reelection. Trump had run in 2016 on a promise that “Christianity will have power” if he won the White House; now he was warning that his opponent in the 2020 election, former Vice President Joe Biden, was going to “hurt God” and target Christians for their religious beliefs. Embracing dark rhetoric and violent conspiracy theories, the president enlisted prominent evangelicals to help frame a cosmic spiritual clash between the God-fearing Republicans who supported Trump and the secular leftists who were plotting their conquest of America’s Judeo-Christian ethos.

People at Cornerstone began confronting their pastor, demanding that he speak out against government mandates and Black Lives Matter and Joe Biden. When Winans declined, people left. The mood soured noticeably after Trump’s defeat in November 2020. A crusade to overturn the election result, led by a group of outspoken Christians—including Trump’s lawyer Jenna Ellis, who later pleaded guilty to a felony charge of aiding and abetting false statements and writings, and the author Eric Metaxas, who suggested to fellow believers that martyrdom might be required to keep Trump in office—roiled the Cornerstone congregation. When a popular church staffer who had been known to proselytize for QAnon was fired after repeated run-ins with Winans, the pastor told me, the departures came in droves. Some of those abandoning Cornerstone were not core congregants. But plenty of them were. They were people who served in leadership roles, people Winans counted as confidants and friends.

By the time Trump supporters invaded the U.S. Capitol on January 6, 2021, Winans believed he’d lost control of his church. “It’s an exodus,” he told me a few weeks later, sitting inside Brighton Bar & Grill.

The pastor had felt despair—and a certain liability—watching the attack unfold on television. Christian imagery was ubiquitous: rioters forming prayer circles, singing hymns, carrying Bibles and crosses. The perversion of America’s prevailing religion would forever be associated with this tragedy; as one of the legislative ringleaders, Senator Josh Hawley, explained in a speech the following year, long after the blood had been scrubbed from the Capitol steps, “We are a revolutionary nation precisely because we are the heirs of the revolution of the Bible.”

That sort of thinking, Winans said, represents an even greater threat than the events of January 6.

“A lot of people believe there was a religious conception of this country. A biblical conception of this country,” Winans told me. “And that’s the source of a lot of our problems.”

For much of American history, white Christians have enjoyed tremendous wealth and influence and security. Given that reality—and given the miraculous nature of America’s defeat of Great Britain, its rise to superpower status, and its legacy of spreading freedom and democracy (and, yes, Christianity) across the globe—it’s easy to see why so many evangelicals believe that our country is divinely blessed. The problem is, blessings often become indistinguishable from entitlements. Once we become convinced that God has blessed something, that something can become an object of jealousy, obsession—even worship.

“At its root, we’re talking about idolatry. America has become an idol to some of these people. If you believe that God is in covenant with America, then you believe—and I’ve heard lots of people say this explicitly—that we’re a new Israel,” Winans said, referring to the Old Testament narrative of God’s chosen nation. “You believe the sorts of promises made to Israel are applicable to this country; you view America as a covenant that needs to be protected. You have to fight for America as if salvation itself hangs in the balance. At that point, you understand yourself as an American first and most fundamentally. And that is a terrible misunderstanding of who we’re called to be.”

Plenty of nations are mentioned in the Bible; the United States is not one of them. Most American evangelicals are sophisticated enough to reject the idea of this country as something consecrated in the eyes of God. But many of those same people have chosen to idealize a Christian America that puts them at odds with Christianity. They have allowed their national identity to shape their faith identity instead of the other way around.

Winans chose to be hypervigilant on this front, hence the change of policy regarding Cornerstone’s salute to military personnel. The new pastor would meet soldiers after the service, shaking their hand and individually thanking them for their service. But he refused to stage an ovation in the sanctuary. This wasn’t because he was some bohemian anti-war activist; in fact, his wife had served in the Army. Winans simply felt it was inappropriate.

“I don’t want to dishonor anyone. I think nations have the right to self-defense. I respect the sacrifices these people make in the military,” Winans told me. “But they would come in wearing their dress blues and get this wild standing ovation. And you contrast that to whenever we would host missionaries: They would stand up for recognition, and we give them a golf clap … And you have to wonder: Why? What’s going on inside our hearts?”

This kind of cultural heresy was getting Winans into trouble. More congregants were defecting each week. Many were relocating to one particular congregation down the road, a revival-minded church that was pandering to the whims of the moment, led by a pastor who was preaching a blood-and-soil Christian nationalism that sought to merge two kingdoms into one.

As we talked, Winans asked me to keep something between us: He was thinking about leaving Cornerstone.

The “psychological onslaught,” he said, had become too much. Recently, the pastor had developed a form of anxiety disorder and was retreating into a dark room between services to collect himself. Winans had met with several trusted elders and asked them to stick close to him on Sunday mornings so they could catch him if he were to faint and fall over.

I thought about Dad and how heartbroken he would have been. Then I started to wonder if Dad didn’t have some level of culpability in all of this. Clearly, long before COVID-19 or George Floyd or Donald Trump, something had gone wrong at Cornerstone. I had always shrugged off the crude, hysterical, sky-is-falling Facebook posts I would see from people at the church. I found it amusing, if not particularly alarming, that some longtime Cornerstone members were obsessed with trolling me on Twitter. Now I couldn’t help but think these were warnings—bright-red blinking lights—that should have been taken seriously. My dad never had a social-media account. Did he have any idea just how lost some of his sheep really were?

I had never told Winans about the confrontations at my dad’s viewing, or the letter I received after taking Rush Limbaugh’s name in vain at the funeral. Now I was leaning across the table, unloading every detail. He narrowed his eyes and folded his hands and gave a pained exhale, mouthing that he was sorry. He could not even manage the words.

We both kept quiet for a little while. And then I asked him something I’d thought about every day for the previous 18 months—a sanitized version of my wife’s outburst in the living room.

“What’s wrong with American evangelicals?”

Winans thought for a moment.

“America,” he replied. “Too many of them worship America.”

This article was adapted from Tim Alberta’s new book, The Kingdom, the Power, and the Glory: American Evangelicals in an Age of Extremism. It appears in the January/February 2024 print edition with the headline “The Church of America.”

The CRISPR Era Is Here

The Atlantic

www.theatlantic.com › health › archive › 2023 › 11 › crispr-sickle-cell-disease-cure › 676151

When Victoria Gray was still a baby, she started howling so inconsolably during a bath that she was rushed to the emergency room. The diagnosis was sickle-cell disease, a genetic condition that causes bouts of excruciating pain—“worse than a broken leg, worse than childbirth,” one doctor told me. Like lightning crackling in her body is how Gray, now 38, has described the pain. For most of her life, she lived in fear that it could strike at any moment, forcing her to drop everything to rush, once again, to the hospital.

After a particularly long and debilitating hospitalization in college, Gray was so weak that she had to relearn how to stand, how to use a spoon. She dropped out of school. She gave up on her dream of becoming a nurse.

Four years ago, she joined a groundbreaking clinical trial that would change her life. She became the first sickle-cell patient to be treated with the gene-editing technology CRISPR—and one of the first humans to be treated with CRISPR, period. CRISPR at that point had been hugely hyped, but had largely been used only to tinker with cells in a lab. When Gray got her experimental infusion, scientists did not know whether it would cure her disease or go terribly awry inside her. The therapy worked—better than anyone dared to hope. With her gene-edited cells, Gray now lives virtually symptom-free. Twenty-nine of 30 eligible patients in the trial went from multiple pain crises every year to zero in 12 months following treatment.

The results are so astounding that this therapy, from Vertex Pharmaceuticals and CRISPR Therapeutics, became the first CRISPR medicine ever approved, with U.K. regulators giving the green light earlier this month; the FDA appears prepared to follow suit in the next two weeks. No one yet knows the long-term effects of the therapy, but today Gray is healthy enough to work full-time and take care of her four children. “Now I’ll be there to help my daughters pick out their wedding dresses. And we’ll be able to take family vacations,” she told NPR a year after her treatment. “And they’ll have their mom every step of the way.”

The approval is a landmark for CRISPR gene editing, which was just an idea in an academic paper a little more than a decade ago—albeit one already expected to cure incurable diseases and change the world. But how, specifically? Not long after publishing her seminal research, Jennifer Doudna, who won the Nobel Prize in Chemistry with Emmanuelle Charpentier for their pioneering CRISPR work, met with a doctor on a trip to Boston. CRISPR could cure sickle-cell disease, he told her. On his computer, he scrolled through DNA sequences of cells from a sickle-cell patient that his lab had already edited with CRISPR. “That, for me, personally, was one of those watershed moments,” Doudna told me. “Okay, this is going to happen.” And now, it has happened. Gray and patients like her are living proof of gene editing’s power. Sickle-cell disease is the first disease—and unlikely to be the last—to be transformed by CRISPR.

All of sickle-cell disease’s debilitating and ultimately deadly effects originate from a single genetic typo. A small misspelling in Gray’s DNA—an A that erroneously became a T—caused the oxygen-binding hemoglobin protein in her blood to clump together. This in turn made her red blood cells rigid, sticky, and characteristically sickle shaped, prone to obstructing blood vessels. Where oxygen cannot reach, tissue begins to die. Imagine “if you put a tourniquet on and walked away, or if you were having a heart attack all the time,” says Lewis Hsu, a pediatric hematologist at the University of Illinois at Chicago. These obstructions are immensely painful, and repeated bouts cause cumulative damage to the body, which is why people with sickle cell die some 20 years younger on average.

Not everyone with the sickle-cell mutation gets quite so sick. As far back as the 1940s, a doctor noticed that the blood of newborns with sickle-cell disease, surprisingly, did not sickle very much. Babies in the womb actually make a fetal version of the hemoglobin protein, whose higher affinity for oxygen pulls the molecule out of their mother’s blood. At birth, a gene that encodes fetal hemoglobin begins to turn off. But adults do sometimes still make varying amounts of fetal hemoglobin, and the more they make, scientists observed, the milder their sickle-cell disease, as though fetal hemoglobin had stepped in to replace the faulty adult version. Geneticists eventually figured out the exact series of switches our cells use to turn fetal hemoglobin on and off. But there, they remained stuck: They had no way to flip the switch themselves.

Then came CRISPR. The basic technology is a pair of genetic scissors that makes fairly precise cuts to DNA. CRISPR is not currently capable of fixing the A-to-T typo responsible for sickle cell, but it can be programmed to disable the switch suppressing fetal hemoglobin, turning it back on. Snip snip snip in billions of blood cells, and the result is blood that behaves like typical blood.

Sickle cell was a “very obvious” target for CRISPR from the start, says Haydar Frangoul, a hematologist at the Sarah Cannon Research Institute in Nashville, who treated Gray in the trial. Scientists already knew the genetic edits necessary to reverse the disease. Sickle cell also has the advantage of affecting blood cells, which can be selectively removed from the body and gene-edited in the controlled environment of a lab. Patients, meanwhile, receive chemotherapy to kill the blood-producing cells in their bone marrow before the CRISPR-edited ones are infused back into their body, where they slowly take root and replicate over many months.

It is a long, grueling process, akin to a bone-marrow transplant with one’s own edited cells. A bone-marrow transplant from a donor is the one way doctors can currently cure sickle-cell disease, but it comes with the challenge of finding a matched donor and the risks of an immune complication called graft-versus-host disease. Using CRISPR to edit a patient’s own cells eliminates both obstacles. (A second gene-based therapy, using a more traditional engineered-virus technique to insert a modified adult hemoglobin gene into DNA semi-randomly, is also expected to receive FDA approval for sickle-cell disease soon. It seems to be equally effective at preventing pain crises so far, but development of the CRISPR therapy took much less time.)

In another way, though, sickle-cell disease is an unexpected front-runner in the race to commercialize CRISPR. Despite being one of the most common genetic diseases in the world, it has long been overlooked because of whom it affects: Globally, the overwhelming majority of sickle-cell patients live in sub-Saharan Africa. In the U.S., about 90 percent are of African descent, a group that faces discrimination in health care. When Gray, who is Black, needed powerful painkillers, she would be dismissed as an addict seeking drugs rather than a patient in crisis—a common story among sickle-cell patients.

For decades, treatment for the disease lagged too. Sickle-cell disease has been known to Western medicine since 1910, but the first drug did not become available until 1998, points out Vence Bonham, a researcher at the National Human Genome Research Institute who studies health disparities. In 2017, Bonham began convening focus groups to ask sickle-cell patients about CRISPR. Many were hopeful, but some had misgivings because of the history of experimentation on Black people in the U.S. Gray, for her part, has said she never would have agreed to the experimental protocol had she been offered it at one of the hospitals that had treated her poorly. Several researchers told me they hoped the sickle-cell therapy would make a different kind of history: A community that has been marginalized in medicine is the first in line to benefit from CRISPR.

Doctors aren’t willing to call it an outright “cure” yet. The long-term durability and safety of gene editing are still unknown, and although the therapy virtually eliminated pain crises, Hsu says that organ damage can accumulate even without acute pain. Does gene editing prevent all that organ damage too? Vertex, the company that makes the therapy, plans to monitor patients for 15 years.

Still, the short-term impact on patients’ lives is profound. “We wouldn’t have dreamed about this even five, 10 years ago,” says Martin Steinberg, a hematologist at Boston University who also sits on the steering committee for Vertex. He thought it might ameliorate the pain crises, but to eliminate them almost entirely? It looks pretty damn close to a cure.

In the future, however, Steinberg suspects that this currently cutting-edge therapy will seem like only a “crude attempt.” The long, painful process necessary to kill unedited blood cells makes it inaccessible for patients who cannot take months out of their life to move near the limited number of transplant centers in the U.S.—and inaccessible to patients living with sickle-cell disease in developing countries. The field is already looking at techniques that can edit cells right inside the body, a milestone recently achieved in the liver during a CRISPR trial to lower cholesterol. Scientists are also developing versions of CRISPR that are more sophisticated than a pair of genetic scissors—for example, ones that can paste sequences of DNA or edit a single letter at a time. Doctors could one day correct the underlying mutation that causes sickle-cell disease directly.

Such breakthroughs would open CRISPR up to treating diseases that are out of reach today, either because we can’t get CRISPR into the necessary cells or because the edit is too complex. “I get emails now daily from families all over the world asking, ‘My son or my loved one has this disease. Can CRISPR fix it?’” says Frangoul, who has become known as the first doctor to infuse a sickle-cell patient in a CRISPR trial. The answer, usually, is not yet. But clinical trials are already under way to test CRISPR in treating cancer, diabetes, HIV, urinary tract infections, hereditary angioedema, and more. We have opened the book on CRISPR gene editing, Frangoul told me, but this is not the final chapter. We may still be writing the very first.

‘Post-Victimhood’ Storytelling

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 11 › chase-hall-paintings-coffee-cotton › 676083

In his Surrealist Manifesto of 1924, André Breton wrote, “The marvelous is always beautiful, anything marvelous is beautiful, in fact only the marvelous is beautiful.” That line came to mind when I stood before Mother Nature, a giant canvas depicting a killer whale lifting a naked man into the air, eye level with a flock of gulls. The image was a highlight of “The Bathers,” Chase Hall’s standout debut at the David Kordansky Gallery in Chelsea this fall.

The show, of mostly immense paintings priced from $60,000 to $120,000, was billed as an investigation into “nature, leisure, public space, and Black adventurism.” The playful and enigmatic scenes involved men swimming, surfing, and sometimes levitating, in solitude or among a living bounty of fish and birds. They were at once beautiful and formally striking meditations on the richness and versatility of a single color: brown.

At the time of the exhibition, Hall was on the cusp of turning 30. He floated through the gallery in a white tank and loose gray slacks that broke over a pair of leather house shoes, looking like a West Coast rapper from the G-funk era who’d recently studied abroad in Florence. His aura was jubilant. The crowd, full of name-brand artists, collectors, and old friends, seemed both taken by the artwork and genuinely happy for the artist—two responses that do not always mesh.

As recently as three years ago, Hall was working odd jobs, scrounging for free materials, dumpster diving for the stretcher bars from NYU students’ cast-off canvases, discovering the signs and symbols of his own vernacular. Now he teaches at the university as an adjunct professor, and his work has landed in the permanent collections of numerous major institutions, including the Brooklyn Museum and the Whitney.

Brother’s Keeper, 2023

During the after-party for his opening, at a dive in Lower Manhattan, Hall slipped away from the crowd to an empty stool beneath a flat-screen. The third set of the second men’s U.S. Open quarterfinal was happening, Ben Shelton losing to Frances Tiafoe. “Shelton’s gonna get it,” Hall said, smiling confidently, seconds before the floppy-haired 20-year-old ripped a merciless forehand down the line, saving the set and quickly snatching the match in the next. I realized my mixed-kid radar had failed me. It was only through Hall’s attention that I recalled Shelton’s biography and saw the tangled ancestry in the young player’s triumphant face—an ancestry like Hall’s, like mine.

[Conor Friedersdorf: Unraveling race]

“Boy, you look like the weather!” is how a woman once described Hall’s own tawny complexion as the sun bounced off it. The son of a Black father who was raised by his single white mother, Hall had a peripatetic childhood, even spending his seventh-grade school year in Dubai. When I first met him last spring—at a blue-chip artist’s opening where it seemed like every other attendee I encountered, whether a collector, party hopper, or critic, wanted to talk about Hall—he spoke of his younger self and the experience of being a “Black” kid with “white-looking” features as a kind of racial “acne” marring his appearance. Today, he notes an evolution, not only in his artistic practice but in his sense of himself in the world. “Over the last 10 years of really trying to navigate life and family and career and mixedness,” he told me, he’s been asking himself: “Am I the conduit for my own experience, or am I just going to try and hope no one says I have a white mom? It was like, ‘Why don’t I just stand up for my own shit?’”

As James McBride once wrote in The Color of Water, “being mixed is like that tingling feeling you have in your nose just before you sneeze—you’re waiting for it to happen but it never does.” You’re waiting to become one thing or another, but you never do. Hall has chosen to embrace this irresolvable aspect of his identity, not to oversimplify it. He’s grappling with what he calls “these dichotomies of genetic shame and genetic valor”—playing with them through the juxtaposition of color and blankness.

Hall uses acrylic paint but primarily relies on an extensive scale of brown tones realized through the medium of brewed coffee. It is an art-making process he’s been tweaking since he was a teenager in Southern California working after school at Starbucks, “smearing receipts” with doodles to cope with the boredom.

Over time, it has become a method of calibrating a sophisticated spectrum of beige, brown, and nearly ebony tones. Darker browns are finer ground; lighter ones are coarser. Through trial and error, and many, many gallons of coffee, he’s developed 26 distinct hues out of a single bean and cultivated extensive relationships with the various baristas of his East Village neighborhood. He can easily purchase 200 to 300 shots of espresso in an outing, which his contacts have learned to pull to his precise specifications, and which he takes home and applies to untreated cotton canvas through a technique he describes metaphorically as “melanin being soaked into the cotton.”

Coffee beans and cotton bolls are not just representational opposites of lightness and darkness; they are also emblematic of the legacies of Africa and Europe colliding in the New World through slavery. To this day, these materials represent often invisible ecosystems of poverty and coerced labor—smallholder farmers in Ethiopia, sweatshop workers in China. (The Brazilian artist Vik Muniz has also made art out of the materials of the slave economy—in his case, coffee beans and sugar.)

Hall’s vision is achieved not only by the melanated fields of expression he superimposes over whiteness through the process of, as he puts it, “corralling and containing a water-based form,” but to a significant degree by the preservation of raw open spaces he pointedly leaves unpainted. Many of these voids of “conceptual white paint” are also interspersed within his subject’s bodies—white noses, kneecaps, even genitalia—making explicit the base-level hybridity we are conditioned to deny or gloss over.

It is a technique he has pursued to such lengths that, as he explained in a talk at Kordansky, he now owns a small part of a craft-coffee company. The coffee the audience was gratefully sipping that morning in the gallery was derived from the same source that was used to make the surrounding artworks. He spent three years developing a process to reclaim even the grounds left over from the paintings, which he then turned into a series of gorgeous prints that went on display two blocks away at Pace Prints the same week as “The Bathers.” Nothing is wasted.

Despite his medium’s symbolism, and despite the arduous physical labor that goes into making it art—the pouring and repouring of a crop-based liquid onto crop-based surfaces—the images that result are not overtly political. This sets him apart from many other minority artists he is sometimes compared to (and from whom he draws inspiration)—people such as Henry Taylor—who are thriving in a time of revived interest in figurative painting. His male subjects are decidedly not responding to tragedy; they are, he said, “liberated figures outside of stereotypical Black spaces.” In a painting called Whitewash (Pelicanus Occidentalis), a ripped nude man stands astride a longboard. His face is drawn tight not with worry or heavyheartedness but with the deep concentration of diligent focus: His only struggle is to remain vertical atop the water.

Whitewash (Pelicanus Occidentalis), 2023

Hall himself concedes that, in terms of technical drawing, some NYU students he teaches are more capable than he is. But Rashid Johnson, another Kordansky artist, told me he sees in Hall a young painter with “a real willingness to evolve and develop, and to build a language.” Art, he says, is more than “strokes on a canvas.”

[Read: Political art isn’t always better art]

The effect, both of Hall’s individual paintings and, in a more profound way, of his cumulative work, is a refreshing challenge. Hall forces us to meet the people he depicts on their own terms, without the usual lens—or crutch—of our inherited, fetishizing, or condescending projections.

One of his central goals, as he put it to me later, is “redefining our relationship with the landscape, outside of basketball, enslavement.” He’s concerned with matters of agency, world building, and individuality—what he calls “post-victimhood” storytelling. “I really believe in life,” he told me. “I go out and try to make the best of it.”

Why America Abandoned the Greatest Economy in History

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 11 › new-deal-us-economy-american-dream › 676051

If there is one statistic that best captures the transformation of the American economy over the past half century, it may be this: Of Americans born in 1940, 92 percent went on to earn more than their parents; among those born in 1980, just 50 percent did. Over the course of a few decades, the chances of achieving the American dream went from a near-guarantee to a coin flip.

What happened?

One answer is that American voters abandoned the system that worked for their grandparents. From the 1940s through the ’70s, sometimes called the New Deal era, U.S. law and policy were engineered to ensure strong unions, high taxes on the rich, huge public investments, and an expanding social safety net. Inequality shrank as the economy boomed. But by the end of that period, the economy was faltering, and voters turned against the postwar consensus. Ronald Reagan took office promising to restore growth by paring back government, slashing taxes on the rich and corporations, and gutting business regulations and antitrust enforcement. The idea, famously, was that a rising tide would lift all boats. Instead, inequality soared while living standards stagnated and life expectancy fell behind that of peer countries. No other advanced economy pivoted quite as sharply to free-market economics as the United States, and none experienced as sharp a reversal in income, mobility, and public-health trends as America did. Today, a child born in Norway or the United Kingdom has a far better chance of outearning their parents than one born in the U.S.

This story has been extensively documented. But a nagging puzzle remains. Why did America abandon the New Deal so decisively? And why did so many voters and politicians embrace the free-market consensus that replaced it?

Since 2016, policy makers, scholars, and journalists have been scrambling to answer those questions as they seek to make sense of the rise of Donald Trump—who declared, in 2015, “The American dream is dead”—and the seething discontent in American life. Three main theories have emerged, each with its own account of how we got here and what it might take to change course. One theory holds that the story is fundamentally about the white backlash to civil-rights legislation. Another pins more blame on the Democratic Party’s cultural elitism. And the third focuses on the role of global crises beyond any political party’s control. Each theory is incomplete on its own. Taken together, they go a long way toward making sense of the political and economic uncertainty we’re living through.

“The American landscape was once graced with resplendent public swimming pools, some big enough to hold thousands of swimmers at a time,” writes Heather McGhee, the former president of the think tank Demos, in her 2021 book, The Sum of Us. In many places, however, the pools were also whites-only. Then came desegregation. Rather than open up the pools to their Black neighbors, white communities decided to simply close them for everyone. For McGhee, that is a microcosm of the changes to America’s political economy over the past half century: White Americans were willing to make their own lives materially worse rather than share public goods with Black Americans.

From the 1930s until the late ’60s, Democrats dominated national politics. They used their power to pass sweeping progressive legislation that transformed the American economy. But their coalition, which included southern Dixiecrats as well as northern liberals, fractured after President Lyndon B. Johnson signed the Civil Rights Act of 1964 and the Voting Rights Act of 1965. Richard Nixon’s “southern strategy” exploited that rift and changed the electoral map. Since then, no Democratic presidential candidate has won a majority of the white vote.

Crucially, the civil-rights revolution also changed white Americans’ economic attitudes. In 1956, 65 percent of white people said they believed the government ought to guarantee a job to anyone who wanted one and to provide a minimum standard of living. By 1964, that number had sunk to 35 percent. Ronald Reagan eventually channeled that backlash into a free-market message by casting high taxes and generous social programs as funneling money from hardworking (white) Americans to undeserving (Black) “welfare queens.” In this telling, which has become popular on the left, Democrats are the tragic heroes. The mid-century economy was built on racial suppression and torn apart by racial progress. Economic inequality was the price liberals paid to do what was right on race.

The New York Times writer David Leonhardt is less inclined to let liberals off the hook. His new book, Ours Was the Shining Future, contends that the fracturing of the New Deal coalition was about more than race. Through the ’50s, the left was rooted in a broad working-class movement focused on material interests. But at the turn of the ’60s, a New Left emerged that was dominated by well-off college students. These activists were less concerned with economic demands than with issues like nuclear disarmament, women’s rights, and the war in Vietnam. Their methods were not those of institutional politics but civil disobedience and protest. The rise of the New Left, Leonhardt argues, accelerated the exodus of white working-class voters from the Democratic coalition.

[David Leonhardt: The hard truth about immigration]

Robert F. Kennedy emerges as an unlikely hero in this telling. Although Kennedy was a committed supporter of civil rights, he recognized that Democrats were alienating their working-class base. As a primary candidate in 1968, he emphasized the need to restore “law and order” and took shots at the New Left, opposing draft exemptions for college students. As a result of these and other centrist stances, Kennedy was criticized by the liberal press—even as he won key primary victories on the strength of his support from both white and Black working-class voters.

But Kennedy was assassinated in June that year, and the political path he represented died with him. That November, Nixon, a Republican, narrowly won the White House. In the process, he reached the same conclusion that Kennedy had: The Democrats had lost touch with the working class, leaving millions of voters up for grabs. In the 1972 election, Nixon portrayed his opponent, George McGovern, as the candidate of the “three A’s”—acid, abortion, and amnesty (the latter referring to draft dodgers). He went after Democrats for being soft on crime and unpatriotic. On Election Day, he won the largest landslide since Franklin D. Roosevelt in 1936. For Leonhardt, that was the moment when the New Deal coalition shattered. From then on, as the Democratic Party continued to reflect the views of college graduates and professionals, it would lose more and more working-class voters.

McGhee’s and Leonhardt’s accounts might appear to be in tension, echoing the “race versus class” debate that followed Trump’s victory in 2016. In fact, they’re complementary. As the economist Thomas Piketty has shown, since the ’60s, left-leaning parties in most Western countries, not just the U.S., have become dominated by college-educated voters and lost working-class support. But nowhere in Europe was the backlash quite as immediate and intense as it was in the U.S. A major difference, of course, is the country’s unique racial history.

The 1972 election might have fractured the Democratic coalition, but that still doesn’t explain the rise of free-market conservatism. The new Republican majority did not arrive with a radically new economic agenda. Nixon combined social conservatism with a version of New Deal economics. His administration increased funding for Social Security and food stamps, raised the capital-gains tax, and created the Environmental Protection Agency. Meanwhile, laissez-faire economics remained unpopular. Polls from the ’70s found that most Republicans believed that taxes and benefits should remain at present levels, and anti-tax ballot initiatives failed in several states by wide margins. Even Reagan largely avoided talking about tax cuts during his failed 1976 presidential campaign. The story of America’s economic pivot still has a missing piece.

According to the economic historian Gary Gerstle’s 2022 book, The Rise and Fall of the Neoliberal Order, that piece is the severe economic crisis of the mid-’70s. The 1973 Arab oil embargo sent inflation spiraling out of control. Not long afterward, the economy plunged into recession. Median family income was significantly lower in 1979 than it had been at the beginning of the decade, adjusting for inflation. “These changing economic circumstances, coming on the heels of the divisions over race and Vietnam, broke apart the New Deal order,” Gerstle writes. (Leonhardt also discusses the economic shocks of the ’70s, but they play a less central role in his analysis.)

Free-market ideas had been circulating among a small cadre of academics and business leaders for decades—most notably the University of Chicago economist Milton Friedman. The ’70s crisis provided a perfect opening to translate them into public policy, and Reagan was the perfect messenger. “Government is not the solution to our problem,” he declared in his 1981 inaugural address. “Government is the problem.”

Part of Reagan’s genius was that the message meant different things to different constituencies. For southern whites, government was forcing school desegregation. For the religious right, government was licensing abortion and preventing prayer in schools. And for working-class voters who bought Reagan’s pitch, a bloated federal government was behind their plummeting economic fortunes. At the same time, Reagan’s message tapped into genuine shortcomings with the economic status quo. The Johnson administration’s heavy spending had helped ignite inflation, and Nixon’s attempt at price controls had failed to quell it. The generous contracts won by auto unions made it hard for American manufacturers to compete with nonunionized Japanese ones. After a decade of pain, most Americans now favored cutting taxes. The public was ready for something different.

[Eric Posner: Milton Friedman was wrong]

They got it. The top marginal income-tax rate was 70 percent when Reagan took office and 28 percent when he left. Union membership shriveled. Deregulation led to an explosion of the financial sector, and Reagan’s Supreme Court appointments set the stage for decades of consequential pro-business rulings. None of this, Gerstle argues, was preordained. The political tumult of the ’60s helped crack the Democrats’ electoral coalition, but it took the unusual confluence of a major economic crisis and a talented political communicator to create a new consensus. By the ’90s, Democrats had accommodated themselves to the core tenets of the Reagan revolution. President Bill Clinton further deregulated the financial sector, pushed through the North American Free Trade Agreement, and signed a bill designed to “end welfare as we know it.” Echoing Reagan, in his 1996 State of the Union address, Clinton conceded: “The era of big government is over.”

Today, we seem to be living through another inflection point in American politics—one that in some ways resembles the ’60s and ’70s. Then and now, previously durable coalitions collapsed, new issues surged to the fore, and policies once considered radical became mainstream. Political leaders in both parties no longer feel the same need to bow at the altar of free markets and small government. But, also like the ’70s, the current moment is defined by a sense of unresolved contestation. Although many old ideas have lost their hold, they have yet to be replaced by a new economic consensus. The old order is crumbling, but a new one has yet to be born.

The Biden administration and its allies are trying to change that. Since taking office, President Joe Biden has pursued an ambitious policy agenda designed to transform the U.S. economy and taken overt shots at Reagan’s legacy. “Milton Friedman isn’t running the show anymore,” Biden quipped in 2020. Yet an economic paradigm is only as strong as the political coalition that backs it. Unlike Nixon, Biden has not figured out how to cleave apart his opponents’ coalition. And unlike Reagan, he hasn’t hit upon the kind of grand political narrative needed to forge a new one. Current polling suggests that he may struggle to win reelection.

[Franklin Foer: The new Washington consensus]

Meanwhile, the Republican Party struggles to muster any coherent economic agenda. A handful of Republican senators, including J. D. Vance, Marco Rubio, and Josh Hawley, have embraced economic populism to some degree, but they remain a minority within their party.

The path out of our chaotic present to a new political-economic consensus is hard to imagine. But that has always been true of moments of transition. In the early ’70s, no one could have predicted that a combination of social upheaval, economic crisis, and political talent was about to usher in a brand-new economic era. Perhaps the same is true today. The Reagan revolution is never coming back. Neither is the New Deal order that came before it. Whatever comes next will be something new.  

What Happens When Real People Play Squid Game?

The Atlantic

www.theatlantic.com › culture › archive › 2023 › 11 › squid-game-the-challenge-netflix › 676099

People clad in green tracksuits stand nervously in a circle. They’re participating in a “test” on Squid Game: The Challenge, Netflix’s new reality competition series based on the streamer’s hit South Korean drama Squid Game, but they’re really just playing a game of chance. Each player must nominate someone to be eliminated, and then roll a die. If they roll a six, the person they chose is eliminated. And so, over the course of 10 long minutes, they roll and roll and keep on rolling. Some inevitably roll sixes. Relieved players sigh; friends of eliminated players cry. Meanwhile, sitting on my couch, I hover my thumb over my remote’s fast-forward button, wishing they’d hurry up.

This was not the reaction I had to the original Squid Game. Two years ago, I watched from behind my fingers, gasped at the twists, teared up for the characters as they risked their lives for a chance to win a massive fortune. The show’s casual hyper-violence was shocking, but its poignant relationships—the way they formed and fell apart as the prize money grew more within reach—elicited the most visceral responses.

[Read: In Netflix’s Squid Game, debt is a double-edged sword]

The Challenge, as a reality program involving civilians, is a non-murderous and far tamer version of Squid Game. Black ink is used to represent blood, and talking-head interviews compensate for the lack of scripted dialogue. For the players, there’s no actual sense of danger—only the fear of missing out on the $4.56 million jackpot, reportedly the largest in reality-TV history, which means that The Challenge actively rejects Squid Game’s trenchant commentary on wealth inequality and the hell of living in a late-capitalist society. But the series’ willful ignorance of its parent show’s message is the least of its problems. Like most competition programs, it’s highly bingeable, even addictive. Yet it’s too obviously packaged, its games carelessly designed. The Challenge ditches the elements that made Squid Game Netflix’s most-watched series of all time—the unexpected tenderness between characters, the brutality and scale of its set pieces—and delivers a rote reality show.

Even the meticulously re-created sets—the Escher-like staircases, the towers of bunk beds, the shadowy control room in which the “guards” observe the contestants—are silly rather than chilling. The Challenge, with its players cheerfully pretending to collapse when they’re eliminated and applauding when more cash is added to the prize money, is too goofy to make Squid Game’s production design work as anything more than a familiar backdrop. The same goes for the games themselves: Many players, having watched the drama, know what to expect. When they play Honeycomb—the game in which they must use a needle to extract a shape from a sheet of candy without breaking it—most of them lick their candy to soften the sugar, just as the protagonist of Squid Game did. And because The Challenge doesn’t have all of the contestants participate at the same time, the episode yields several tedious rounds of people repeatedly licking and poking their candy. It’s excruciatingly boring to watch.

The show’s greatest failure, though, is its shortage of memorable characters. In keeping with Squid Game, the program begins with 456 players, any of whom could be eliminated at any moment. With so many people to follow, The Challenge reduces most contestants to reality-TV archetypes: the overconfident villain, the loner not there to make friends, the underdog with a heart of gold. Though friendships form and players antagonize one another over time, the show doesn’t focus on any of the contestants long enough to make their stories resonate. A mother-and-son pair receives more screen time than most, because their motivation—to spend more time with each other—is so simple.

In some ways, The Challenge suffers because it’s been branded a Squid Game spin-off. Hearing players say “This is wild!” and “You got this!” over and over just made me think about how sharp the original show’s dialogue could be. The glimpses of deeper personal reasons for playing—shown in brief interviews conducted before the games in “player processing rooms”—made me miss the complex portraits of Squid Game’s fictional characters. The Challenge amounts to one episode after another of strangers hastily trying to accomplish tasks, greedily putting themselves ahead of their competition, and being surprised when anything happens that didn’t happen in Squid Game. It’s the epitome of why television has been reduced to “content” lately; it’s opportunistic programming capitalizing on recognizable IP and delivering something thoughtless and lazy. I felt guilty watching so much of it.

But then again, I couldn’t stop. Every time an episode of The Challenge ended, I found myself compelled to watch more. The show barely scratches the surface of what Squid Game interrogated—what people are like when pushed to the edge, what they’d truly sacrifice for a fortune—but The Challenge is classic reality TV: Contestants show off for the cameras, play mind games, engage in petty squabbles, form alliances that turn out to be as brittle as a sheet of candy. I can’t claim to care about any of them, but I do want to know who wins—which is why I never watched The Challenge from behind my fingers. Instead, the show is the kind of superficial entertainment that I find terribly hard to look away from.

The Decision That Could End Voting Rights

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 11 › voting-rights-act-section-2-court › 676060

The right to vote free of racial discrimination was won by blood and sacrifice, those of both the soldiers who fought to preserve the Union and the enslaved and formerly enslaved, and inscribed in the Constitution as the Fifteenth Amendment, so that sacrifice would not be in vain. But that right is also very inconvenient for the modern Republican Party, which would like to be able to discriminate against Black voters without interference from the government.

Yesterday, a three-judge panel from the Eighth Circuit Court of Appeals concluded that Section 2 of the Voting Rights Act, the law that made America a true democracy for all of its citizens, does not allow private parties to bring lawsuits challenging racial discrimination in voting, which is how the law has worked since it was passed. The decision would effectively outlaw most efforts to ensure that Americans are not denied the right to vote on the basis of race as the Fifteenth Amendment demands.

“It’s hard to overstate how important and detrimental this decision would be if allowed to stand: the vast majority of claims to enforce section 2 of the Voting Rights Act are brought by private plaintiffs, not the Department of Justice with limited resources,” the election-law expert Rick Hasen wrote on his website. “If minority voters are going to continue to elect representatives of their choice, they are going to need private attorneys to bring those suits.”

The Fifteenth Amendment and the Voting Rights Act were made necessary by the long and ongoing history of political parties seeking to disenfranchise voters on the basis of race. Lawmakers, given free rein, will do their best to draw districts to their party’s advantage. When racially polarized voting is present, the temptation will be to engage in racial discrimination against a rival party’s constituency. For example, if your party mostly relies on support from white voters, you might try to draw a district that minimizes the political power of Black voters, a practice called racial gerrymandering. This is what Democrats did in the aftermath of Reconstruction, and what Republicans are now accused of doing, though less overtly, in Arkansas, the subject of this lawsuit. The Voting Rights Act bans practices that have the purpose or effect of discriminating on the basis of race, a standard that prevents lawmakers from benefiting from discrimination as long as they can cover their tracks. During Arkansas’s 2021 redistricting process, the state chapter of the NAACP alleges, lawmakers there drew state-district lines that dilute Black voting strength.

[Kimberly Wehle: How the Court became a voting-rights foe]

The Constitution is supposed to forbid such discrimination, but that sounds simpler than it is. In practice, if you have enough judges or justices willing to find unconstitutional the laws adopted to enforce that right, or willing to rule in such a way that nullifies the ability of those laws to function, you can simply render the Fifteenth Amendment useless. This is what the Supreme Court did after Reconstruction, when Black people were still trying to assert their right to vote and the justices decided it was a right they could not or would not defend.

The majority’s reasoning is simple, if absurd. Although acknowledging that “Congress had ‘clearly intended’ all along to allow private enforcement,” it argues that because the text does not say so explicitly, Congress’s intentions, Supreme Court precedent, and decades of practice are irrelevant. The fact that this would allow lawmakers to discriminate against their Black constituents without interference from pesky civil-rights groups is an innocent coincidence. This interpretation of the law was teed up for the judges by Justices Neil Gorsuch and Clarence Thomas in another 2021 voting-rights case in which the conservative-dominated high court weakened prohibitions against voting discrimination.

All of this is part of a long-standing campaign by the Republican Party to undo one of its greatest accomplishments, the Fifteenth Amendment. It is a cause that Chief Justice John Roberts has championed since he was a 20-something lawyer in the Reagan Justice Department. As chief justice, Roberts has eviscerated voting-rights protections time and time again, in keeping with an ideological belief that prohibitions on racial discrimination are themselves morally tantamount to racial discrimination.

[From the March 2021 issue: American democracy is only 55 years old—and hanging by a thread]

Until recently. In June, Roberts and Justice Brett Kavanaugh unexpectedly sided with the Court’s Democratic appointees in upholding a lower-court order forcing Alabama to stop discriminating against the state’s Black voters. Alabama originally defied this order, perhaps because it was so out of character with Roberts’s past jurisprudence. The state’s recalcitrance forced the Supreme Court to rebuke Alabama again and tell it to follow the law. Not having done so, after all, would have sanctioned broader defiance of the Court's power, making Alabama’s behavior a direct threat to the justices’ authority, something none of the justices will countenance.

The Arkansas case does not pose such a threat, and therefore it raises the question of whether, this time, Roberts and Kavanaugh will go along with such an obvious attempt to allow Republican lawmakers to violate the voting rights of their nonwhite constituents with near-impunity. The fate of the right to vote free of racial discrimination is in the hands of powerful conservative men who, like the justices at the twilight of Reconstruction, have never considered it all that significant.

You Don’t Need to ‘Earn’ Thanksgiving Dinner

The Atlantic

www.theatlantic.com › family › archive › 2023 › 11 › turkey-trot-tradition-american-diet-culture › 676056

Every Thanksgiving, while many people are preparing stuffing or frantically Googling how long turkeys take to defrost, others rise early, don commemorative T-shirts (and maybe turkey-shaped hats), and gather for a chilly morning run.

This is the turkey trot, typically a 5- to 10-kilometer race, perhaps done for charity, which has become a delightfully contentious holiday tradition much like crack-of-dawn Black Friday lines and marshmallow topping on sweet potatoes. Participants look forward to the goofy costumes and collective endorphin rush; detractors consider the type of person who would voluntarily trade the extra holiday sleep for a cold jog that costs money to be a different species entirely. A host of memes, which feature pictures of festively clad runners in miserable weather, mock the race and those who run it with captions such as “Imagine meeting your soulmate and then finding out their family runs 5ks on holidays?” My husband has sent me many iterations of them ever since he was blindsided by the horrifying discovery that my mostly sedentary family was, in fact, full of “trotters.” But we aren’t alone: Thanksgiving is the most popular time to race all year. Though estimates vary, nearly 1 million people participate annually. That a day whose centerpiece is feasting has become one that many start by running might seem like a contradiction. However, the custom actually fits quite snugly into the American tendency to pit excess against repentance—especially when it comes to food.

[Read: The dark side of fitness culture]

I started trotting as a child, and I dreaded it each year—not just because of the cold Missouri weather. My parents had to drag me to the starting line, and I’d cross the finish only after they bribed me with a brand-new journal somewhere around the first water station. I resented being freezing and sore and covered in the weird liquefied snot that seems to come only from running in the wind. I also couldn’t shake the feeling that the race was a punishment. Indeed, for a long time, thanks to the diet and exercise culture of the 1990s and early aughts, I internalized the notion that exercise was not a pleasure in itself but, above all, a means for getting skinnier and counteracting the food I ate. My stance has evolved over the years. Now I see working out as something that helps me calm down, feel strong, and enjoy what my body is capable of. I think of trots similarly, with the added bonus of free swag and fun outfits. Still, as much as I celebrate the families and kids who pin on their bibs in pursuit of playful competition, I ache for the ones who might race as I used to: through gritted teeth, seeking absolution for the perceived sin of having a body.

Despite my hard-won personal enlightenment, turkey trots around the country are still sometimes touted as ways to “earn your Thanksgiving dinner,” “burn some pre-feast calories,” or feel “guilt melt away.” These messages imply that at least some people are motivated to run on Thanksgiving because of a pernicious myth: that eating is shameful rather than sustaining, and that we must run as redress for our caloric sins. This idea of “earning” your food can be, in some ways, traced to the early 20th century. At that time, the calorie became the go-to tool for quantifying how much one ate, and calorie restriction became a predominant weight-loss method. Lulu Hunt Peters, the author of the best-selling 1918 book Diet and Health: With Key to the Calories, is widely credited with introducing this attitude, writing, “Hereafter you are going to eat calories of food. Instead of saying one slice of bread, or a piece of pie, you will say 100 calories of bread, 350 calories of pie.” Adrienne Bitar, a food-studies scholar at Cornell University and the author of Diet and the Disease of Civilization, told me that slowly people began to talk about exercise in the same numerical way and started doing more physically demanding workouts, such as aerobics and jogging, as a result. The thinking switched from “I’m gonna go on a run” to “I’m gonna go on a 2.2-mile run and I’m gonna burn 300 calories,” she explained. By 1976, Weight Watchers had incorporated exercise into its weight-loss program.

[Read: Does overindulgence make you happy?]

It’s hardly a surprise, then, that a day dedicated to indulgence for some came to feel like it required a bit of compensatory exertion. Perhaps it’s down to this country’s puritanical sensibilities, but some of us still like a little suffering in our success, a little hard work in our happiness, a little rigor in our relaxation. As Bitar noted, “There’s this uneasy tension in American culture where it’s like control, excess, control, excess, and the pendulum swings back and forth and we’re constantly compensating for it.” Although the body-positivity movement has certainly been gaining cultural ground, the idea of food as something to be “earned” or “atoned for” lingers. Even though Weight Watchers, for instance, has rebranded as the more wellness-oriented WW, the program still counts “activity points” and “food points”; in its system, working out earns you the right to eat more or helps you make up for eating too much. Exercise deserves better—and so do we.

There are so many wonderful reasons to race on Thanksgiving Day, but I would argue that “earning” dinner is not one of them. If you want to run, do so to join a tradition people have enjoyed every year since 1896. Lace up to support local charities. Head to the start line with a (cotton) stuffed turkey on your head, observe your neighbors wearing leotards and feathers, and giggle to yourself wondering if this is what Ben Franklin had in mind when he called the turkey “a much more respectable bird” than the eagle. Run because exercise might reduce the stress of cooking a meal for your entire picky family in a preelection year. Wind yourself through blocked-off roads and corporate campuses because running (yes, even just a little bit) has been shown to help people live longer. Trot in pursuit of that post-activity appetite that makes everything taste a little better. Or, heck, don’t run at all! If turkey trots should be anything, they should be completely optional.

I last trotted in 2019, a little more than a month after having my cancerous thyroid removed. I let my brand-new fiancé (and recently converted trotter) run ahead while I blasted Katy Perry through my AirPods and marveled that a body that had so recently been at risk was now joyfully at work. It wasn’t punishment, and it wasn’t preparation; it was simply possible. And as I ate my complimentary pumpkin pie at the finish line, waiting to meet up with my five favorite people in the world so that we could go home to cook and eat and watch football and bicker and laugh, I didn’t feel more deserving of or less guilty about the food, fun, and rest that awaited me. I just felt thankful.