Can a Million Chinese People Die and Nobody Know?

The Atlantic

www.theatlantic.com/international/archive/2023/02/china-million-covid-deaths-communist-party/673177

Can a million people vanish from the planet without the world knowing? It seems impossible in this age of instant digital communications, ubiquitous smartphones, and global social-media platforms that anything of comparable consequence can go unnoticed and unrecorded—no matter how remote the country or how determined its rulers might be to hide the truth.

Yet that’s apparently what has happened in China over the past two and a half months. After the Chinese leader Xi Jinping removed his draconian restrictions to contain COVID-19 in December, the virus rampaged across the nation with explosive speed. According to one of the government’s top scientists, 80 percent of the populace has now been infected. But we don’t know the full impact of this surge. The Chinese government’s secrecy has managed to obscure what really happened during the country’s latest and worst COVID wave.

Independent experts, skeptical of Beijing’s official data on COVID deaths, have been forced to calculate their own estimates—which indicate much higher and more disturbing numbers than the government claims. These estimates range from about 1 million to 1.5 million deaths, suggesting that, in absolute terms, China may have suffered more fatalities from COVID in two months than the U.S. did in three years.

[Read: Zero COVID’s failure is Xi’s failure]

By any reckoning, a terrible tragedy unfolded in China in recent weeks. That we’re left guessing about its scale is important as well. If the Chinese leadership can hide a million dead, what else can it conceal from the world? Authoritarian states have a notorious history of shielding the sufferings they inflict from the eyes of the world. The number of Chinese people who perished in the famine caused by the Great Leap Forward (1958–60), Mao Zedong’s crash modernization program, is still a subject of debate. In the era of Stalin’s gulags and Nazi concentration camps, limited technology and limitless repression helped dictators screen their atrocities.

Today, information wants to be free, as the netizen slogan goes. In a time when everyone carries the equivalent of a TV camera in their pocket and when satellites whir over a world saturated with open-source data, a new tech-empowered army of civic-minded citizens and whistleblowers was supposed to keep a closer watch on the bad guys. With greater transparency would come greater freedom.

The controversy over China’s death count, however, shows how much control autocratic states still wield over information. As China rises, its leadership’s inherent secrecy is a problem for all of us. Decisions made in Beijing and events in China have ramifications for global economic growth, jobs, prices, the environment, the stock market, and global security. But all too often, the world has to rummage through scraps of anecdotal evidence, opaque official pronouncements, and glimpses provided by outsiders to guess at how Beijing chooses its policies, and at the effect they have on the country—and thus China’s impact on our lives.

The Chinese Communist Party likes it that way. All governments have their secrets, of course, and try to spin news cycles and narratives. But in open societies, the debates of congresses and election campaigns expose the policy-making process to public view. Journalists, activists, and regular citizens are always poking around, asking uncomfortable questions, and posting on Instagram and TikTok.

[Read: Mourning becomes China]

No authoritarian state could survive that scrutiny, and the Chinese Communist Party has erected an extensive security state to make sure it doesn’t face such examination. China’s party-controlled legislature—the National People’s Congress, which is due to meet in early March—is more a pep rally than a debating society. In the absence of a free press, no watchful estate of reporters exists to keep tabs on the powerful. The intrepid and inquisitive are usually silenced: The citizen-journalist Zhang Zhan, for instance, was sentenced to four years in prison for documenting the original COVID outbreak in Wuhan in 2020.

As powerful as China’s police state is, however, it does not control every source of information. During the recent wave of infections, satellite imagery revealed heightened activity at cremation centers. In addition, domestic footage of overwhelmed hospital wards found its way onto Chinese social media. But once detected, such glimpses are quickly removed by China’s platoons of censors.

Concealing inconvenient truths is an industrial enterprise for the Communist Party. The Tiananmen massacre of 1989, common knowledge to much of the world, has been scrubbed from the domestic record. More recently, the Chinese government has worked to hide its mass detentions and torture of China’s minority Uyghur community in the Xinjiang region.

Chinese authorities have been obfuscating on COVID-19 since the pandemic began. The World Health Organization’s investigation into the origins of the coronavirus has stalled because of China’s lack of cooperation, but the Chinese foreign ministry continues to deflect responsibility by regurgitating an old conspiracy theory that the virus originated in the United States. Now the government appears to be trying to erase the country’s entire COVID experience from national consciousness. The zero-COVID policy, which employed large-scale quarantines and business shutdowns to contain the virus, had been heralded by authorities as a “magic weapon” to protect the people. But since the restrictions were removed, the term zero COVID has vanished from official discourse.

Next to disappear is COVID itself. China’s equivalent of the CDC determined recently that a new cycle of mass infection is “unlikely to occur” in coming months.

The government has tried to bury the memory of the dead with as much dispatch as it ditched its lockdowns. When COVID began to spread rapidly in December, the initial death counts released by the health authorities were so unbelievable that even the party brass seemed to realize they lacked credibility. That led to a mid-January announcement that 60,000 people had perished in the latest wave, and, more recently, the official tally of deaths has risen to a touch more than 83,000. We can’t say with absolute certainty that the Chinese government’s data are false, but public-health experts and other specialists have looked askance at these figures. Yanzhong Huang, a senior fellow for global health at the Council on Foreign Relations, called them a “vast undercount.”

[Jiwei Xiao: My mother just died of COVID in Wuhan]

One possibility is that China’s top leaders do not themselves know the real total. Contrary to the widely held image of China’s autocrats as a surveillance state’s all-knowing supervisors, the leadership in Beijing can, by the very nature of its rule, be left in the dark about what’s happening in the country.

“We think of China as a very high-capacity authoritarian regime, where the center is in control,” Jennifer Pan, a political scientist at Stanford University, told me. “The reality is that every authoritarian government that does not have free media faces a problem where they have trouble gathering reliable and accurate information.”

“Governance is delegated to local governments, and local governments have very strong incentives to keep bad news from being seen by the center,” she went on. “In order to be promoted, they have to show they are doing well.” In the case of COVID deaths, “we shouldn’t necessarily assume the central Chinese government has an accurate handle of what is going on.”

In the absence of reliable information, the Communist Party can conjure its own version of reality—and the leadership has embraced a narrative to fit the data it has, last week declaring that it had achieved a “major and decisive victory” over COVID and “effectively protected the people’s lives and health.” With this success, “China has created a miracle in human history.” A commentary published by the official news agency, Xinhua, added that the triumph was evidence of the party’s “governance capacity.”

Whether the Chinese people swallow this swill is another unknown. Without freedom of speech, the government also denies itself any true picture of public opinion, including criticism that might otherwise reach the Communist Party. But the party has little choice but to promote this kind of narrative. Because its leaders present themselves as infallible, they can never admit to the full extent of the COVID catastrophe.

“Shocking the public presents a threat to the government,” Eric Harwit, an Asian-studies professor at the University of Hawaii at Manoa who studies Chinese social media, told me. The authorities “think it’s something the people can’t handle—they can’t handle the truth.”

Beyond that, the state and its supporters had previously promoted China’s comparatively low death count to claim that the country’s authoritarian system, which had rigidly imposed the zero-COVID policy, was superior to other forms of government, especially liberal democracy. Revealing the real death count would not only damage the party’s reputation at home but also embarrass its leadership abroad.

Yet the party’s penchant for misinformation comes with risks. Average Chinese citizens have more information at their fingertips than they’ve had in the past, thanks to smartphones and social media. As extensive and effective as the authorities’ censorship operation is, it can still be caught off guard by popular expression on social platforms, which allow individuals to connect with a broad network of people and sources. Over time, the gap between what the officials say and what the public sees could damage the party’s credibility.

The case of the missing million is a chilling reminder that the Communist Party can still make people disappear, and the world may never know. Beijing’s secrecy creates other, immediate problems. How can a government unwilling to reveal COVID deaths be trusted to share other vital information, such as China’s greenhouse-gas emissions, crucial to tackling global warming? Beijing’s resistance to cooperating with the international community in a search for answers about the origins of this pandemic does nothing to help world leaders prevent the next one. The fact that we barely understand how Xi and his team have made COVID-related decisions is an indication of how cloistered the party’s inner sanctum remains.

The chasm between what we know about China and what we need to know about China is much too large. The party will keep it that way.

An Unlucky President, and a Lucky Man

The Atlantic

www.theatlantic.com/ideas/archive/2023/02/jimmy-carter-accomplishments-james-fallows/673146

Life is unfair, as a Democratic president once put it. That was John F. Kennedy, at a press conference early in his term.

Jimmy Carter did not go through as extreme a range of the blessings and cruelties of fate as did Kennedy and his family. But I think Carter’s long years in the public eye highlighted a theme of most lives, public and private: the tension between what we plan and what happens. Between the luck that people can make for themselves and the blind chance they cannot foresee or control.

In the decades of weekly Bible classes he led in his hometown of Plains, Georgia, Jimmy Carter must have covered Proverbs 19:21. One contemporary translation of that verse renders it as: “Man proposes, God disposes.”

Not everything in his life happened the way Jimmy Carter proposed or preferred. But he made the very most of the years that God and the fates granted him.

Americans generally know Jimmy Carter as the gray-haired retiree who came into the news when building houses or fighting diseases or monitoring elections, and whose political past became shorthand for the threadbare America of the 1970s. Most of today’s Americans had not been born by the time Carter left office in 1981. Only about one-fifth are old enough to have voted when he won and then lost the presidency. It is hard for Americans to imagine Jimmy Carter as young—almost as hard as it is to imagine John F. Kennedy as old.

But there are consistent accounts of Carter’s personality throughout his long life: as a Depression-era child in rural Georgia, as a hotshot Naval Academy graduate working in Hyman Rickover’s then-futuristic-seeming nuclear-powered submarine force, as a small businessman who entered politics but eventually was forced out of it, as the inventor of the modern post-presidency.

What these accounts all stress is that, old or young, powerful or diminished, Jimmy Carter has always been the same person. That is the message that comes through from Carter’s own prepresidential campaign autobiography, Why Not the Best?, and his many postpresidential books, of which the most charming and revealing is An Hour Before Daylight: Memories of a Rural Boyhood. It is a theme of Jonathan Alter’s insightful biography, His Very Best. It is what I learned in two and a half years of working directly with Carter as a speechwriter during the 1976 campaign and on the White House staff, and in my connections with the Carter diaspora since then.

Whatever his role, whatever the outside assessment of him, whether luck was running with him or against, Carter was the same. He was self-controlled and disciplined. He liked mordant, edgy humor. He was enormously intelligent—and aware of it—politically crafty, and deeply spiritual. And he was intelligent, crafty, and spiritual enough to recognize inevitable trade-offs between his ambitions and his ideals. People who knew him at one stage of his life would recognize him at another.

Jimmy Carter didn’t change. Luck and circumstances did.

Jimmy Carter made his luck, and benefited from luck, when he ran for president. He couldn’t have done it without his own discipline and commitment, and his strategy. He seemed to shake every hand in Iowa—but his team was also the first to recognize that the new Iowa caucus system opened the chance for an outsider to leap into the presidency. At a time when his national name recognition was 1 percent, he spent all day walking up to strangers and saying, “My name is Jimmy Carter, and I’m running for president.” Stop and imagine doing that yourself, even once. Carter was easier to admire—when delivering his stump speech to a rapt crowd, when introducing himself at a PTA meeting or in a diner—than he was to work for. But that is probably true of most public figures with such a drive to succeed.

Because he was so engaging in person, and made such a connection in countless small-group meetings across Iowa, he won the caucuses and went on to win the nomination and the presidency. No other candidate has gone from near-invisibility to the White House in so short a time. (Barack Obama became a Democratic Party star with his famous convention speech in 2004, four years before he won the presidency. Donald Trump had been a celebrity for decades.)

This is how Carter and his team helped themselves. Other developments they hadn’t planned affected the race—mainly to their benefit.

By early 1976, Carter had become the new thing. He embraced rock music and quoted Bob Dylan. He was as powerful and exciting a fusion of cultures as any candidate who came after him. He was a Naval Academy graduate and an Allman Brothers fan. He was deeply of the South and of the Church. He also spoke about Vietnam as a racist war. He quoted poems by Dylan Thomas. He was, yes, cool. He appeared at a Law Day meeting at the University of Georgia’s law school and upbraided the audience about the injustice of America’s legal system. Here’s just one sample of the speech, which would now be considered part of the Sanders-Warren platform:

I grew up as a landowner’s son. But I don’t think I ever realized the proper interrelationship between the landowner and those who worked on a farm until I heard Dylan’s record … ”Maggie’s Farm.”

It’s worth reading the whole thing.

But what if Hunter S. Thompson had not noticed this speech and announced that he “liked Jimmy Carter” in an influential article in Rolling Stone? What if Time and Newsweek, also very influential then, had not certified him as a serious potential leader with their coverage? What if the civil-rights figures Martin Luther King Sr. and Andrew Young had not endorsed Carter to Black audiences around the country, and reassured white liberals that he was the southern voice an inclusive America needed? (As governor of Georgia, Carter had placed a portrait of MLK Jr. in the state capitol.) What if Jerry Brown had not waited so long to enter the primaries? What if Teddy Kennedy had dared to run? What if Mo Udall had figured out the Iowa-caucus angle before Carter did? What if Scoop Jackson had not been so dull? Or George Wallace so extreme?

And for the general election, what if Gerald Ford had not pardoned Richard Nixon, turning Watergate into Ford’s own problem? (The Carter team knew that this was a campaign plus. But in the first sentence of his inaugural address, Carter thanked Ford for all he had done “to heal our land.”) What if Saturday Night Live, then in its first season and itself hugely influential, had not made Ford the butt of ongoing jokes? What if Ford had not blundered in a crucial presidential debate? What if Carter’s trademark lines on the stump—I’ll never lie to you and We need a government as good as its people—had not been so tuned to the battered spirit of that moment, and had been received with sneers rather than support?

What if, what if. There are a thousand more possibilities. In the end the race was very close. Luck ran his way.

Then he was in office. Intelligent, disciplined, self-contained, spiritual. President Carter made some of his own luck, good and bad—as I described in this magazine 44 years ago. There is little I would change in that assessment, highly controversial at the time, except to say that in 1979 Carter still had nearly half of his time in office ahead of him, and most of his adult life. I argued then that his was a “passionless” presidency. He revealed his passions—his ideals, his commitments—in the long years to come.

In office he also had the challenge of trying to govern a nearly ungovernable America: less than two years after its humiliating withdrawal from Saigon, in its first years of energy crisis and energy shortage, on the cusp of the “stagflation” that has made his era a symbol of economic dysfunction. It seems hard to believe now, but it’s true: The prime interest rate in 1980, the year Carter ran for reelection, exceeded 20 percent. You never hear, “Let’s go back to the late ’70s.”

Probably only a country as near-impossible to lead as the United States of that time could have given someone like Jimmy Carter a chance to lead it.

Despite it all, Carter had broader support during his first year in office than almost any of his successors, except briefly the two Bushes in wartime emergencies. Despite it all, most reckonings have suggested that Carter might well have beaten Ronald Reagan, and held on for a second term, if one more helicopter had been sent on the “Desert One” rescue mission in Iran, or if fewer of the helicopters that were sent had failed. Or if, before that, Teddy Kennedy had not challenged Carter in the Democratic primary. Or if John Anderson had not run as an independent in the general election. What if the ayatollah’s Iranian government had not stonewalled on negotiations to free its U.S. hostages until after Carter had been defeated? What if, what if.

Carter claimed for years that he came within one broken helicopter of reelection. It’s plausible. We’ll never know.

Because we do know, in retrospect, that Reagan had two landslide victories, over Carter and then Walter Mondale, and that the 1980 election broke heavily in Reagan’s favor in its final weeks, it’s natural to believe that Carter never had a chance. But it looked so different at the time. History changed, through effort and luck, when Carter arrived on the national stage in 1976. And it changed, through effort and luck, when he departed four years later.

Effort and luck combined for Jimmy Carter’s first two acts: becoming president, and serving in office.

Luck played a profoundly important role in his third act, allowing him to live, mostly vigorously, until age 98, and to celebrate his 76th wedding anniversary with his beloved wife, Rosalynn. He had 42 full years in the postpresidential role—10 times longer than his term in office, by far the most of any former president.

This extended span mattered for reasons within Carter’s control, and beyond it. Good fortune, medical science, and a lifetime history as a trim, fit athlete (he was a good tennis player, a runner, and a skillful softball pitcher) helped Carter survive several bouts of cancer and other tolls of aging. But his faith, will, idealism, and purpose allowed him to invent and exemplify a new role for former presidents, and to see his own years in office reconsidered.

Suppose that, like Lyndon B. Johnson, he had died of a literal and figurative broken heart at age 64. His record and achievements would have concluded with Ronald Reagan still in office, and his story would have been summarized as ending on a loss. Carter could never have received the Nobel Peace Prize, which he won while nearing age 80, in 2002. (Nobel Prizes cannot be given posthumously.)

With health like Lyndon Johnson’s, Jimmy Carter would not have had a chance to establish his new identity—and to see prevailing assessments of his role as president change as profoundly as those of Harry Truman did. As with Truman, the passing years have made it easier to see what Carter achieved, and to recognize what he was trying to do even when unsuccessful. But Truman was no longer alive to see that happen. For Carter I think the process of reassessment will go on.

It is hard for most Americans to imagine the Jimmy Carter of those days. It is hard even for me to recognize how different the country is as a whole.

Just to talk about politics: The South was then the Democrats’ base, and the West Coast was hostile territory. Jimmy Carter swept all states of the old Confederacy except Virginia, and lost every state west of the Rockies except Hawaii. In Electoral College calculations, the GOP started by counting on California.

The Democrats held enormous majorities in both the Senate and the House. Carter griped about dealing with Congress, as all presidents do. But under Majority Leader Robert Byrd, the Democrats held 61 seats in the Senate through Carter’s time. In the House, under Speaker Tip O’Neill, they had a margin of nearly 150 seats (not a typo). The serious legislative dealmaking was among the Democrats.

In culture and economics—well, you just need to watch some movies from the 1970s, Rocky, Taxi Driver, The Conversation, Dog Day Afternoon (or, if you prefer, Saturday Night Fever and Star Wars). The United States was a country fraying on all its edges, just beginning to absorb the shock of the Vietnam years, in its first wave of grappling with globalization and environmental constraints.

Prevailing memories reached back far beyond Vietnam to the Korean War, World War II, and the Great Depression. In campaign speeches, Carter talked about the difference it made to him, as a boy, when Franklin Roosevelt’s Rural Electrification Administration brought electric power to small communities like his. We on the speechwriting staff could rely on the story for applause. Enough people remembered.

There were no cellphones then, nor even bulky “portable” phones. Computers meant behemoths at major data centers.

And in civic life, Richard Nixon’s downfall seemed to have reinforced the idea that there was such a thing as public shame. It was construed as embarrassing for Jimmy Carter that his hard-luck brother, Billy, was in a penny-ante way cashing in on the family fame by promoting six-packs of his own “Billy Beer.” Carter, from a small-town business-owning background, felt that he had to sell the family peanut mill to avoid even the appearance of impropriety. After Nixon’s scandals and Spiro Agnew’s resignation, “doing the right thing” mattered, and Carter did so.

Jimmy Carter took office in the “before” times. We live in an unrecognizable “after.” He did his best, in office and out, to promote the values he cared about through it all.

What did he do in office? He did a lot. He was visionary about climate and the environment. He changed the composition of the federal courts. For better and worse he deregulated countless industries, from craft brewing to the airlines. I direct you to Stuart Eizenstat’s detailed and authoritative President Carter: The White House Years for specifics. I’ll just add:

Jimmy Carter did more than anyone else, before or since, to bring peace to the Middle East, with his Camp David accords. The agreement between Menachem Begin and Anwar Sadat could not possibly have been reached without Carter’s all-in, round-the-clock involvement. I was there and saw it. Any other witness would agree. (This was also the theme of Lawrence Wright’s excellent Thirteen Days in September.) Jimmy Carter saved the United States decades of woe with his Panama Canal Treaty. Jimmy Carter bought the United States several generations’ worth of respect with his human-rights policy. Can such an approach be no-exceptions or absolute? Of course not. Carter recognized as clearly as anyone the tension between ideals and reality. But does even imperfect idealism make a difference? That is the case Carter made in a speech at Notre Dame in 1977. I think it stands up well. Its essence:

We have reaffirmed America’s commitment to human rights as a fundamental tenet of our foreign policy …

This does not mean that we can conduct our foreign policy by rigid moral maxims. We live in a world that is imperfect and which will always be imperfect—a world that is complex and confused and which will always be complex and confused.

I understand fully the limits of moral suasion. We have no illusion that changes will come easily or soon. But I also believe that it is a mistake to undervalue the power of words and of the ideas that words embody. In our own history, that power has ranged from Thomas Paine’s “Common Sense” to Martin Luther King Jr.’s “I Have a Dream.”

In the life of the human spirit, words are action.

Jimmy Carter spoke to the “values” and “engagement” crises decades before demagogues like Trump or healers like Obama. In the summer of 1979, he gave an unusually sober and sermonlike address on the national “crisis of confidence.” This is generally known as the “malaise” speech, and is widely considered a downbeat marker of a down era. But as Kevin Mattson points out in his entertaining What the Heck Are You Up to, Mr. President?, the speech was well received at the time. Carter’s popularity rating went up nearly 10 points in its wake. (Also, the speech didn’t include the word malaise.) Things again started going wrong for Carter soon after that—he made mistakes, and was unlucky—but the speech deserves respect. It was a leader’s attempt to express the fears and hard truths many people felt, and to find a way forward.

Jimmy Carter survived to see many of his ambitions realized, including near eradication of the dreaded guinea worm, which, unglamorous as it sounds, represents an increase in human well-being greater than most leaders have achieved. He survived to see his character, vision, and sincerity recognized, and to know that other ex-presidents will be judged by the standard he has set.

He was an unlucky president, and a lucky man.

We are lucky to have had him. Blessed.

Father recounts moment his United flight plunged toward the ocean

CNN

www.cnn.com/videos/travel/2023/02/14/united-777-plunge-hawaii-cohen-dnt-ebof-vpx.cnn

A United Airlines 777 leaving Hawaii in December plunged toward the ocean for 21 seconds shortly after takeoff and came within 800 feet of sea level, flight-tracking data show. CNN's Gabe Cohen has more.

Teenagers stun surfing great Stephanie Gilmore at Women's Championship Tour opener in Hawaii

CNN

www.cnn.com/2023/02/04/sport/stephanie-gilmore-pipeline-hawaii-spt-intl/index.html

Australia's Stephanie Gilmore is widely regarded as the greatest female surfer of all time, so it came as a surprise when the eight-time world champion started her title defense with an early exit at Pipeline in Hawaii.

Police Reform Is Not Hopeless

The Atlantic

www.theatlantic.com/books/archive/2023/02/tyre-nichols-police-reform-books-consent-decree-qualified-immunity/672900

Most Americans want to see the police reformed. A Gallup poll conducted in May, two years after the murder of George Floyd, found that 50 percent of adults favored “major changes” to policing, 39 percent wanted “minor changes,” and only 11 percent thought no changes were required. Despite this general consensus and a patchwork of recent policy shifts in communities across the country, injustices continue to accumulate, and it would be easy to see the problems with policing as intractable.

Three high-profile deaths just since the start of this year would seem to confirm this feeling. On January 3, Keenan Anderson, a 31-year-old Black high-school teacher (and cousin of Patrisse Cullors, a co-founder of Black Lives Matter), died after Los Angeles police shocked him repeatedly with a Taser. The next day, cops in Cambridge, Massachusetts, shot and killed Sayed Faisal, a 20-year-old Bangladeshi American college student who allegedly approached them with a knife. And less than a week after that, another Black man, 29-year-old Tyre Nichols, died following a beating by Memphis police officers. Video footage of the incident, released this past Friday, led to mass protest in many cities and an anguished response to yet another senseless death. Nothing we’re doing to fix policing seems to be working—or so it might appear.

Against this backdrop, two new books chronicle horrific incidents of police abuse, cover-ups, and intransigence. But they also offer something else: light pouring through the cracks, concrete evidence that police departments can change for the better.

In The Riders Come Out at Night: Brutality, Corruption, and Cover-Up in Oakland, the journalists Ali Winston and Darwin BondGraham tell the story of the police department of Oakland, California. The title refers to a small group of officers who allegedly brutalized residents of impoverished, high-crime, largely Black West Oakland starting in the late 1990s. The actions of these cops became known only because a rookie named Keith Batt was assigned to train with one of them. Batt was deeply troubled by what he observed—behavior that Batt said included kidnapping, assault, and filing false police reports. He contacted internal-affairs investigators and became the main witness in a criminal case against the officers (three of whom stood trial; none was convicted).

[Read: No such thing as a bad apple]

In harrowing detail, Winston and BondGraham describe the terror that Batt said Oaklanders endured at the hands of the Riders, as well as the ostracism Batt faced when he refused to honor the “blue wall of silence” that has long characterized cop culture.

While the Riders’ actions may have been extreme, Winston and BondGraham view them as symptomatic of larger issues. As Oakland underwent deindustrialization in the 1970s and ’80s, poverty and crime rose. Turning away from local jobs initiatives, city leaders embraced ill-fated redevelopment efforts and pressed their often-racist police department to “clean up the streets.” When rogue cops took things too far, their supervisors looked the other way, knowing perfectly well what their marching orders were.

The Riders were significant in another respect: A lawsuit brought by the group’s alleged victims became the catalyst for a consent decree, a potentially powerful weapon for effecting change within police departments. Consent decrees are legally binding settlement agreements. In the usual course of affairs, after the Department of Justice has investigated a police agency and found that it has systematically violated people’s rights, the feds spell out changes in policy and procedure that the agency must undertake, changes that would bring it into line with established best practices. An independent monitor reports periodically to a judge on whether the department is meeting its marks.

Although the DOJ never investigated Oakland, the consent-decree model appealed to the civil-rights attorneys John Burris and Jim Chanin. In 2003, representing victims in the Riders case, they were able to maneuver the city into an unusual “negotiated” consent decree, which committed Oakland PD to a range of tasks, from better documenting the use of force to enhanced field training for young officers.

Consent decrees have been used to improve policing in cities such as Detroit and New Orleans, but they are expensive to administer and don’t always work. Winston and BondGraham show how the Oakland police resisted the required reforms at every turn. Top brass, middle management, frontline officers, and the police union displayed an “obstructionist mindset.” Oakland cops continued to shoot people at a furious pace. A poster in the department’s firing range was captioned You shut the fuck up. We’ll protect America. Keep out of our fucking way, liberal pussies.

The Riders Come Out at Night is a longish book, and its story is largely a condemnation of the Oakland police. But readers who stick with it to the end will discover something surprising. Although change was slow to come to Oakland, it did come. The turning point was the ascension of a reform-oriented police chief. Under Sean Whent, a longtime Oakland cop who led the department from 2013 to 2016, internal-affairs complaints dropped dramatically, the police did a better job protecting protesters’ rights, and the agency tackled racial bias.

Winston and BondGraham don’t put it in these terms, but Whent was arguably able to make progress because he helped shift the department’s culture. My own research on other cities suggests that the key to successful police reform is to pair sensible legal and policy restrictions on police behavior with new models of what it means to be a good cop, so that the hyperaggressive, “us versus them” culture of the profession bends in a different direction.

Whent believed not only that Oakland residents had a right to respectful policing, but that such policing would help the department control crime; the resulting trust would lubricate the all-important flow of information between cops and the community. Unlike his predecessors, he leaned into the consent decree (there was also intense legal pressure on him to do so), and enough of his cops followed suit that on the streets, things began to change.  

“The reforms that began in 2003 … have profoundly changed the Oakland police, and the city, for the better,” Winston and BondGraham conclude. “Today OPD officers are involved in far fewer deadly use-of-force incidents.” What’s more, where “Oakland cops were once known for abusive, explicit language,” now “audits of police body camera footage rarely flag instances in which officers curse or show impatience or anger.” The police have also “been able to steadily dial back their most problematic enforcement activities,” so that “Oakland is one of the only law enforcement agencies in America that could actually show (before the George Floyd protests) that it took action to reduce racial profiling.”

A similarly hopeful lesson might be drawn from Shielded: How the Police Became Untouchable, by the UCLA law professor Joanna Schwartz. Many cops perform their difficult job admirably, but part of the problem with reforming the police is that when this isn’t the case, officers aren’t always held to account for their misdeeds. Schwartz’s focus is on understanding why this should be, and she lands on 11 areas where law, policy, and politics have converged to make it hard for victims of police abuse to get justice.

Among Schwartz’s insights: There aren’t enough lawyers with the expertise to file federal civil-rights cases against police, especially outside large urban centers. This is partially a function of the fee structure allowed by the courts; only rarely can plaintiffs’ attorneys recoup their full costs, so relatively few lawyers find this kind of work financially viable.  

Schwartz’s special expertise is qualified immunity. This arcane legal doctrine dictates that a public official can’t be held responsible for violating someone’s rights unless the courts have already established that the particular circumstances do in fact constitute a violation. Although that sounds reasonable—you shouldn’t hold an official liable unless they knew that what they were doing was wrong—judges have interpreted this in a bizarro fashion.

Schwartz describes a case from Hawaii. A woman in an argument with her husband asked her daughter to call the cops and was Tasered when she accidentally bumped one of them. The Taser was used in so-called dart mode, in which the weapon shoots out electrified probes. Her case against the officer ended up getting dismissed because, according to the appellate court, there had never before been a relevant ruling concerning Tasers, much less Tasers in dart mode, and therefore the officer couldn’t be held liable. Dart mode or not, the officer should have known better than to Tase her.

Schwartz’s research shows that qualified-immunity defenses are raised in about 37 percent of lawsuits against the police. Although they’re successful only about 9 percent of the time, they gum up the litigation process because each qualified-immunity claim must be resolved before a case can proceed. The doctrine is a farce in any event, because police officers aren’t regularly updated on the intricacies of federal case law. Schwartz favors ending qualified immunity and argues that this won’t open the door to endless litigation.

Far more common than plaintiffs winning cases in court is cities settling with the victims of police abuse. (Settlements and legal awards cost Chicago nearly half a billion dollars from 2010 to 2020.) Usually cities pay these settlements out of their general funds. Police-department budgets don’t take the hit, so departments have little reason to retrain their officers and improve operating procedures. Schwartz urges cities to change this budgeting practice, giving police departments a financial incentive to learn from their mistakes.

Where’s the cause for hope? Schwartz observes that several of the changes she favors around qualified immunity were enshrined in state law in Colorado in 2020. It’s too early to tell what the effects of the Colorado law will be, but in theory, greater legal liability should deter police abuse. Other states may soon follow Colorado’s lead.

[Read: The state where protests have already forced major police reforms]

Many more levers need to be pulled to get police accountability to where it should be, but we are seeing progress. Even Schwartz, a fierce critic of law enforcement, acknowledges that over the past half century, “departments as a whole have become more professional and have improved their policies and trainings,” if only “to a degree,” in part because civil-rights attorneys and others in the community have kept the pressure on. The cops who were seen beating Tyre Nichols last month in Memphis? They were promptly fired by Memphis Police Chief Cerelyn Davis. They’ve now been arrested and charged with second-degree murder. There was a time not long ago when neither of those things would have happened so quickly.

The narrative that nothing ever gets better in policing isn’t just wrong; it’s an abdication of responsibility. It’s easier to lose oneself in resignation and despair than to bear down—motivated by a belief in the possibility of change—and put in the hard work of reforming a flawed but essential institution.