
Outdoor Dining Is Doomed

The Atlantic

www.theatlantic.com/health/archive/2023/01/restaurants-outdoor-dining-winter-covid/672904

These days, strolling through downtown New York City, where I live, is like picking your way through the aftermath of a party. In many ways, it is exactly that: The limp string lights, trash-strewn puddles, and splintering plywood are all relics of the raucous celebration known as outdoor dining.

These wooden “streeteries” and the makeshift tables lining sidewalks first popped up during the depths of the coronavirus pandemic in 2020, when restaurants needed to get diners back in their seats. It was novel, creative, spontaneous—and fun during a time when there wasn’t much fun to be had. For a while, outdoor dining really seemed as though it could outlast the pandemic. Just last October, New York Magazine wrote that it would stick around, “probably permanently.”

But now someone has switched on the lights and cut the music. Across the country, something about outdoor dining has changed in recent months. With fears about COVID subsiding, people are losing their appetite for eating among the elements. This winter, many streeteries are empty, save for the few COVID-cautious holdouts willing to put up with the cold. Hannah Cutting-Jones, the director of food studies at the University of Oregon, told me that, in Eugene, where she lives, outdoor dining is “absolutely not happening” right now. In recent weeks, cities such as New York and Philadelphia have started tearing down unused streeteries. Outdoor dining’s sheen of novelty has faded; what once evoked the grands boulevards of Paris has turned out to be a janky table next to a parked car. Even a pandemic, it turns out, couldn’t overcome the reasons Americans never liked eating outdoors in the first place.

For a while, the allure of outdoor dining was clear. COVID safety aside, it kept struggling restaurants afloat, boosted some low-income communities, and cultivated joie de vivre in bleak times. At one point, more than 12,700 New York restaurants had taken to the streets, and the city—along with others, including Boston, Los Angeles, Chicago, and Philadelphia—proposed making dining sheds permanent. But so far, few cities have actually adopted any official rules. At this point, whether they ever will is unclear. Without official sanction, mounting pressure from outdoor-dining opponents will likely lead to the destruction of existing sheds; already, people keep tweeting disapproving photos at sanitation departments. Part of the issue is that as most Americans’ COVID concerns retreat, the potential downsides have gotten harder to overlook: less parking, more trash, tacky aesthetics, and, oh God, the rats. Many top New York restaurants have voluntarily gotten rid of their sheds this winter.

The economics of outdoor dining may no longer make sense for restaurants, either. Although it was lauded as a boon to struggling restaurants during the height of the pandemic, the practice may make less sense now that indoor dining is back. For one thing, dining sheds tend to take up parking spaces needed to attract customers, Cutting-Jones said. The fact that most restaurants are chains doesn’t help: “If whatever conglomerate owns Longhorn Steakhouse doesn’t want to invest in outdoor dining, it will not become the norm,” Rebecca Spang, a food historian at Indiana University Bloomington, told me. Besides, she added, many restaurants are already short-staffed, even without the extra seats.

In a sense, outdoor dining was doomed to fail. It always ran counter to the physical makeup of most of the country, as anyone who ate outside during the pandemic inevitably noticed. The most obvious constraint is the weather, which is sometimes pleasant but is more often not. “Who wants to eat on the sidewalk in Phoenix in July?” Spang said.

The other is the uncomfortable proximity to vehicles. Dining sheds spilled into the streets like patrons after too many drinks. The problem was that U.S. roads were built for cars, not people. This tends not to be true in places renowned for outdoor dining, such as Europe, the Middle East, and Southeast Asia, which urbanized before cars, Megan Elias, a historian and the director of the gastronomy program at Boston University, told me. At best, this means that outdoor meals in America are typically enjoyed with a side of traffic. At worst, they end in dangerous collisions.

Cars and bad weather were easier to put up with when eating indoors seemed like a more serious health hazard than breathing in fumes and trembling with cold. It had a certain romance—camaraderie born of discomfort. You have to admit, there was a time when cozying up under a heat lamp with a hot drink was downright charming. But now outdoor dining has gone back to what it always was: something that most Americans would like to avoid in all but the most ideal of conditions. This sort of relapse could lead to fewer opportunities to eat outdoors even when the weather does cooperate.

But outdoor dining is also up against more existential issues, ones that have outlasted nearly three years of COVID life. Eating at restaurants is expensive, and Americans like to get their money’s worth. When safety isn’t a concern, shelling out for a streetside meal may simply not seem worthwhile for most diners. “There’s got to be a point to being outdoors, either because the climate is so beautiful or there’s a view,” Paul Freedman, a Yale history professor specializing in cuisine, told me. For some diners, outdoor seating may feel too casual: Historically, Americans associated eating at restaurants with special occasions, like celebrating a milestone at Delmonico’s, the legendary fine-dining establishment that opened in the 1800s, Cutting-Jones said.

Eating outdoors, in contrast, was linked to more casual experiences, like having a hot dog at Coney Island. “We have high expectations for what dining out should be like,” she said, noting that American diners are especially fussy about comfort. Even the most opulent COVID cabin may be unable to override these associations. “If the restaurant is going to be fancy and charge $200 a person,” said Freedman, most people can’t escape the feeling of having spent that much for “a picnic on the street.”

Outdoor dining isn’t disappearing entirely. In the coming years there’s a good chance that more Americans will have the opportunity to eat outside in the nicer months than they did before the pandemic—even if it’s not the widespread practice many anticipated earlier in the pandemic. Where it continues, it will almost certainly be different: more buttoned-up, less lawless—probably less exciting. Santa Barbara, for example, made dining sheds permanent last year but specified that they must be painted an approved “iron color.” It may also be less popular among restaurant owners: If outdoor-dining regulations are too far-reaching or costly, cautioned Hayrettin Günç, an architect with Global Designing Cities Initiative, that will “create barriers for businesses.”

For now, outdoor dining is yet another COVID-related convention that hasn’t quite stuck—like avoiding handshakes and universal remote work. As the pandemic subsides, the tendency is to default to the ways things used to be. Doing so is easier, certainly, than coming up with policies to accommodate new habits. In the case of outdoor dining, it’s most comfortable, too. If this continues to be the case, then outdoor dining in the U.S. may return to what it was before the pandemic: dining “al fresco” along the streetlamp-lined terraces of the Venetian Las Vegas, and beneath the verdant canopy of the Rainforest Cafe.

Public Outrage Hasn’t Improved Policing

The Atlantic

www.theatlantic.com/newsletters/archive/2023/01/public-outrage-hasnt-improved-policing/672840

This is an edition of Up for Debate, a newsletter by Conor Friedersdorf. On Wednesdays, he rounds up timely conversations and solicits reader responses to one thought-provoking question. Later, he publishes some thoughtful replies. Sign up for the newsletter here.

Question of the Week

What is the best way forward for Americans who want to improve policing and the criminal-justice system?

Send your responses to conor@theatlantic.com or simply reply to this email.

Conversations of Note

Earlier this month, a Black man named Keenan Darnell Anderson died at a Southern California hospital hours after he was repeatedly Tasered by LAPD officers as they attempted to arrest him following a traffic accident. In video footage where he alternately seems to be asking for help and confusedly resisting arrest, “the officers tell Anderson that if he does not stop resisting, they will Taser him,” MSNBC reported. “The video shows one officer, who appears to be Black, placing his elbow on Anderson's neck to pin him to the ground. At one point, Anderson yells, ‘They’re trying to George Floyd me.’” The story continues, “Police Chief Michel Moore said Anderson had committed a felony hit-and-run and tried to ‘get into another person's car without their permission.’”

I have no idea how to apportion blame in this particular death, but in an opinion article, also at MSNBC, Ja’han Jones contrasted “the widespread public outrage over Floyd’s death” and the dearth of attention paid to the death in Los Angeles. “What are we to make of this difference?” he wrote. “Has the public gotten busier since then? Crueler? More fickle? More tolerant of violence? More futile in our response to it? Where are the black Instagram squares, the corporate news releases claiming to stand for racial justice, the social media posts about white folks listening and learning about their privilege?” But Jones neglects to acknowledge that none of those responses did anything to lessen the number of police killings.

A subsequent Slate article titled “What Happened to the National Outrage Over Police Killings?” offered variations on the same theme. Its author, Shirin Ali, began by asserting that “an ongoing analysis by The Washington Post found Black Americans are killed by police at more than twice the rate of white Americans—and in 2022, police killed the highest number of people on record.” That’s misleading, as the criminologist Peter Moskos pointed out: There were more police killings in 2022 than in any other year in the Washington Post database of fatal police shootings, but the newspaper has only been keeping track since 2015.

There is evidence to suggest police killings are much lower today than in the past. Moskos has found historical data on 18 major cities showing a 69 percent drop in police shootings since the early- to mid-1970s. Police in New York City and Los Angeles both shoot fewer people than they did then, even though the cities’ populations are now much bigger.

Nevertheless, police in America still kill far more people than in other liberal democracies. The Yale professor Phillip Goff, the co-founder and CEO of the Center for Policing Equity, told Slate that although periodic reforms to American policing have improved it over the decades, police reform has also been stymied. The culprit, in his telling, is “people who think the best way to manage vulnerable Black communities is to lock them up or commit acts of violence whenever they are in a place where they shouldn’t be, where they violate a law that was made to give them opportunities to lock the folks up.”

Reading both articles, I was struck not so much by what was said as by what was neglected: hugely significant factors that are obviously influencing how Americans respond to police shootings compared with how they responded in 2013, when protesters marked the killings of Trayvon Martin and Michael Brown; or during ensuing years, as #BlackLivesMatter began growing from a hashtag into an international movement; or in 2020, when Floyd was killed and the Black Lives Matter movement exploded in America and abroad.

What happened to the national outrage over police killings? It has been muted, in part, by a spike in gun homicides that dwarfs police killings in the number of Black lives that it has destroyed. The outrage has also been muted, in part, by trepidation after the weeks in 2020 when several anti-racist protests were marred by incidents of arson, vandalism, and looting, resulting in as much as $2 billion in damage and as many as 19 people killed. If history is any guide, affected neighborhoods will suffer for decades, disproportionately harming Black and brown communities and businesses.

And although it has always been hard to disentangle the exact relationship between the hearteningly widespread, decentralized activist movement Black Lives Matter and the coalition of groups called the Movement for Black Lives, the Black Lives Matter Global Network Foundation, the Black Lives Matter PAC, and more, outrage is more muted now in part because of infighting among some prominent activists within these groups. Several individuals have come under scathing criticism from some of the very families they purported to champion, or are doing who-knows-what-exactly (some bought luxury real estate) with an unprecedented windfall of grassroots contributions.

Those of us who still want to improve policing need to face reality: Probing why Americans are reacting differently to the most recent death of a Black man after an encounter with police, without at least grappling with all that went wrong in recent years, is doomed to fail.  

Long before Black Lives Matter’s ascent, I was among those inveighing against policing injustices and America’s catastrophic War on Drugs, and trying and failing to significantly reduce police misconduct. Black Lives Matter arose in part because most of us who came before it largely failed. When it did, I hoped it would succeed spectacularly in reducing police killings and agreed with at least its premise that the issue warranted attention.

But it is now clear that the Black Lives Matter approach has largely failed too.

Despite an awareness-raising campaign as successful as any in my lifetime, untold millions of dollars in donations, and a position of influence within the progressive criminal-justice-reform coalition, there are just as many police killings as before Black Lives Matter began. Politically, a powerful faction inside the movement sought to elect more radical progressives; Donald Trump and Joe Biden won the next presidential elections. That same faction sought to “defund the police”; police budgets are now rising, and “defund” is unpopular with majorities of every racial group.

Whether or not you think those reforms should have prevailed, they did not. If impact matters more than intent, the criminal-justice-reform movement needs an alternative to Black Lives Matter that has better prospects for actually improving real lives. Today, almost every American is aware of police killings as an issue. Awareness has been raised, and returns are diminished.

I wish I knew the best way forward. I lament the breakup of the constructive alliance of libertarians, progressives, and religious conservatives who cooperated during the Obama Administration to achieve some worthy criminal-justice reforms, and I continue to be impressed with the ethos Jill Leovy sketched out in the book Ghettoside, offering one strategy that would (in my estimation) dramatically increase equity in American policing. (I also urge everyone to revisit this newsletter’s previous installments on the death penalty, which highlight the powerful abolitionist arguments of my colleague Elizabeth Bruenig, and the war on drugs, which keeps imposing staggering costs while failing to prevent pandemic opioid deaths.)

This week’s question is “What is the best way forward for Americans who want to improve policing and the criminal-justice system?” I hope to air perspectives as diverse as the country, and perhaps plant seeds that grow into constructive new approaches.

Civilian Oversight and Its Discontents

At the Marshall Project, Jamiles Lartey describes the political battle in many municipalities over police-oversight boards, and argues that police unions frequently try to undermine their mission:

Resistance to oversight boards comes primarily from pro-law enforcement groups, especially police unions, who often make concerted efforts to dilute the power of the boards. Law enforcement voices frequently argue that civilians, by definition, don’t have the right knowledge to evaluate police actions. “It would be akin to putting a plumber in charge of the investigation of airplane crashes,” Jim Pasco, executive director of the national Fraternal Order of Police, told the Washington Post in 2021. When they can’t stop these oversight agencies, or weaken their powers, police unions sometimes seek to have allies placed in vacant board positions. In Chicago, where proponents recently won passage of a new oversight structure, WBEZ reported this week that the largest local police union is spending money “in an attempt to extend the union’s power into a domain created specifically to oversee the officers who make up the union’s membership.”

It’s common for negotiations about oversight bodies to include debate on whether people with close ties to the police (like former officers or family members of officers) are eligible to serve.

On the other side of the spectrum, some police abolitionists push back against these boards, arguing that they work “against deeper change.” It’s also not uncommon for community activists who initially back oversight boards to turn against them over time, frustrated by a lack of results. That’s how things are playing out in Dallas, where activists and board members are both expressing frustration with a board that had its powers expanded after the 2018 killing of Botham Jean by then-officer Amber Guyger. One board member told Bolts Magazine that their efforts were being “stonewalled,” “marginalized” and “put in a corner” by the department’s non-cooperation. The political wrangling about oversight boards is only one way that police departments and unions push back on accountability. In Boston, which rolled out its own independent watchdog body in 2021 (to mixed reviews), Mayor Michelle Wu is currently locked in a battle over the police union contract, and her desire to strengthen the disciplinary process for officer misconduct.

Continuing the DEI Conversation

In our last installment, I promised to run additional reader responses to the Question of the Week about diversity training and associated initiatives within organizations. Today’s collection explores how readers feel about the intersection of corporate Diversity, Equity, and Inclusion (DEI) goals and hiring practices.

Andy feels frustrated by a lack of specificity about what is expected of him––and a climate where open conversation and debate seem too risky to engage in:

In my company, we have a VP of Diversity, who has made a couple of presentations about how we “need” to be more diverse. But what does that look like? I’m in software. I’m a manager who has 10 people reporting to me. Five are white men (one an Orthodox Jew––how does he fit in?). One is an Asian man, one is an Asian woman, two are Indian women, and one is an Indian man. One of the Indian women is my highest-paid employee, deservedly. So, how much work do I have to do in order to make my team diverse?

So instead, we focus on “underrepresented,” which means women, Black, and Hispanic. Maybe gay or trans. How many “groups” do we put on the underrepresented list? Which ones? By the way, the other development manager working with me is a Black man, and our testing and product managers are Hispanic men. I’ve hired maybe 20 employees over my career. The majority are Indian, then Asian, men. My last few openings, I’ve had women recruiters, which, research says, is supposed to tilt the candidates toward women. Not working, I guess. Or maybe it’s actually reflective of the pool? Of course, there isn’t much room for discourse. I’m debating whether I should post this article in our “random” Slack channel. Will I just get in trouble?

Jack hypothesizes that diversity work is less appealing when resources are scarce:

I took the all-day diversity class as a middle manager. The company was going through downsizing, which creates a zero-sum mentality that is not a good companion to confessions of moral turpitude, the holy grail of the day. Then the multimillion-dollar fee charged by the consultant came up, igniting two-way hostility. A total fiasco. I concluded that movies would do a better job helping people internalize the diversity concepts.

D. believes that, for some positions, job candidates from historically underrepresented groups should get hired over white candidates for the sake of diversity, as opposed to a policy of strict nondiscrimination. But he is frustrated by his perception that his employer won’t admit that preference:

I am a card-carrying liberal teaching at a Canadian university. All members of hiring committees are mandated to do periodic equity training in order to sit on the committee, so I’ve done this at least twice. My experience is that the training is as good or as bad as the trainers: my second time was competent, boring, professional; it explained Canadian law and provincial law and university policies, and gave a few decent tips on how to balance the three when they are in conflict, which is pretty often.

But the first time was so insulting to our intelligence. What I most remember is the trainer’s complete ignorance of, or refusal to be honest about, affirmative action (which I support, by the way). The message was you must hire the best candidate, but make sure the best candidate is from an equity-deserving group. Our question: “Can we advertise that for diversity reasons we are only looking for, say, an Indigenous person to teach Indigenous studies?” The answer: “No, you can’t do that.” Our question: “So we have to accept applications from people who in reality have no chance of making the short list?” Their answer: “Hire the best person,” but with the implication that it would be a bad outcome to have a non-Indigenous instructor of Indigenous studies. I actually support the idea of diversity-oriented searches to address historical exclusion and present underrepresentation. Again, I’m a liberal. But I don’t support lying in job ads.

It’s the exact equivalent, in reverse, of the NFL mandate to give no-chance-in-hell interviews to minority head-coach candidates. So is the problem the training, or is it Canadian law, which refuses to call diversity preference or compensatory preference by its name, and just calls it “equity”? I’m not sure, but the English language weeps either way. To be clear, though, my awful experience was years back, and the second time, the trainers were pretty honest with us about the contradictions between laws at various levels.

Paul argues that the current approach to DEI generates a backlash from people who feel discriminated against:

I am a Ph.D. candidate at a flagship state university in the Midwest, and recently, a call was put out for scholarships and research funding. At the beginning of the application was the caveat that “priority will be given to underrepresented groups.” Although I am a military veteran, a “nontraditional” student (i.e., middle-aged), and come from a rural and “underprivileged” background (whatever that means), I am quite persuaded that none of these “underrepresented” categories is what they meant. And that’s the problem.

In modern academic circles, DEI initiatives engage in a good deal of coy linguistic posturing that is intended to signal “justice” but that actually sows confusion and resentment. It is well understood on campus that racial and sexual identities trump all other aspects of background and character, and that the commanding heights of student and faculty ambitions are occupied by a class of technocrats engaged in setting historical injustices straight. They do so, paradoxically, by engaging in precisely the kind of arbitrary and capricious discrimination that caused the historical injustices in the first place. And one daren’t lift so much as an eyebrow of critical inquiry (“Can we have a list of the groups to be favored and why?”) without risking professional sanction and social animus.

And even if these DEI programs were models of carefully and individually tailored merit-apportioning, it would hardly matter, since the general perception is quite the opposite. Like the Irish who “need not apply,” talented and ambitious men and women (if they are the wrong identity) quietly skulk to the sidelines to wait for the madness to end.

They don’t even look one another in the face.

Mike has concluded that it’s a waste of time for him to apply for jobs at an employer that is emphasizing certain kinds of DEI initiatives:

I was part of a layoff last week with a nearly universal demographic makeup: straight, white-looking men. The company was already 60 percent female. I have an MBA and a bunch of technical certifications. I look at data and can do analysis. Before I even respond to an inbound request from a prospective employer, I look at the DEI targets. If those targets require significant headcount growth or layoffs to meet goals based on historical trends … I will not apply or interview. I will point my POC and female friends their way.

It’s purely a numbers game.

The leaders are telling me they don’t want people like me … so they don’t get people like me. The shift from meritocracy to equity is going to cause businesses not focused on DEI to gain an advantage in the long term. I’m not less talented than I used to be; I am just the wrong race—and DEI is clear that being white makes me lower quality. There was one company I did accept an inbound with. They put their DEI targets against proportional talent metrics, and they wanted to promote proportionally. It was more work and didn’t look as good as the aggressive virtue signal, but I know if I land there, I just have to execute to win. TLDR: As a white male, when I see DEI, I know it normally means “We don’t want you, we don’t like you, and we will promote or hire literally anyone else if we can.”

James feels discarded by organizations with what he sees as an insufficient commitment to diversity and inclusion:

In my experience, as a visibly queer, Indigenous person in various leadership roles over the past decade, all that is being fulfilled by many diversity efforts––classes, webinars, newsletters, certification programs, and the like––is the documentation of completion rather than the work that should and must be done in order to actually effect change.

The people we should be listening to are Asian women, Black women, Indigenous women, queer women, and femmes of color—they are often at the bottom of the wage pool, subjected to microaggressions and outright discrimination. I’ve had a nonprofit leader ask me why we needed “another DEI class” when she had a certificate from just two or three years ago; I’ve had an instructor who touts a certification of excellence granted by some national institution or other use slurs and derogatory language about Indigenous people like it’s industry jargon. Because it is: Microaggressions; belittling remarks based on race, gender, identity, presentation, hair, makeup, clothes, body type; and the expectation of willingness to step into a stereotype are what we see. The closest thing many of us come to “inclusion” is that we’re all discarded in equal measure.

In an essay that takes aim at TikTok, Cory Doctorow puts forth a general theory of tech giants:

Here is how platforms die: First, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die … This is enshittification: Surpluses are first directed to users; then, once they're locked in, surpluses go to suppliers; then once they're locked in, the surplus is handed to shareholders and the platform becomes a useless pile of shit. From mobile app stores to Steam, from Facebook to Twitter, this is the enshittification lifecycle.

That’s all for this week––see you on Monday.

Thanks for your contributions. I read every one that you send. By submitting an email, you’ve agreed to let us use it—in part or in full—in the newsletter and on our website. Published feedback may include a writer’s full name, city, and state, unless otherwise requested in your initial note.

Why We Just Can’t Quit the Handshake

The Atlantic

www.theatlantic.com/health/archive/2023/01/handshakes-unhygienic-spreads-germs-covid/672752

Mark Sklansky, a pediatric cardiologist at UCLA, has not shaken a hand in several years. The last time he did so, it was only “because I knew I was going to go to the bathroom right afterwards,” he told me. “I think it’s a really bad practice.” From where he’s standing, probably a safe distance away, our palms and fingers are just not sanitary. “They’re wet; they’re warm; they’re what we use to touch everything we touch,” he said. “It’s not rocket science: The hand is a very good medium to transmit disease.”

It’s a message that Sklansky has been proselytizing for the better part of a decade—via word of mouth among his patients, impassioned calls to action in medical journals, even DIY music videos that warn against puttin’ ’er there. But for a long time, his warnings were met with scoffs and skepticism.

So when the coronavirus started its sweep across the United States three years ago, Sklansky couldn’t help but feel a smidgen of hope. He watched as corporate America pocketed its dealmaking palms, as sports teams traded end-of-game grasps for air-fives, and as The New Yorker eulogized the gesture’s untimely end. My colleague Megan Garber celebrated the handshake’s demise, as did Anthony Fauci. The coronavirus was a horror, but perhaps it could also be a wake-up call. Maybe, just maybe, the handshake was at last dead. “I was optimistic that it was going to be it,” Sklansky told me.

[Read: Good riddance to the handshake]

But the death knell rang too soon. “Handshakes are back,” says Diane Gottsman, an etiquette expert and the founder of the Protocol School of Texas. The gesture is too ingrained, too beloved, too irreplaceable for even a global crisis to send it to an early grave. “The handshake is the vampire that didn’t die,” says Ken Carter, a psychologist at Emory University. “I can tell you that it lives: I shook a stranger’s hand yesterday.”

The base science of the matter hasn’t changed. Hands are humans’ primary tools of touch, and people (especially men) don’t devote much time to washing them. “If you actually sample hands, the grossness is something quite exceptional,” says Ella Al-Shamahi, an anthropologist and the author of the book The Handshake: A Gripping History. And shakes, with their characteristic palm-to-palm squeezes, are a whole lot more prone to spread microbes than alternatives such as fist bumps.

Not all of that is necessarily bad: Many of the microscopic passengers on our skin are harmless, or even beneficial. “The vast majority of handshakes are completely safe,” says David Whitworth, a microbiologist at Aberystwyth University, in Wales, who’s studied the griminess of human hands. But not all manual microbes are benign. Norovirus, a nasty diarrheal disease infamous for sparking outbreaks on cruise ships, can spread easily via skin; so can certain respiratory viruses such as RSV.

The irony of the recent handshake hiatus is that SARS-CoV-2, the microbe that inspired it, isn’t much of a touchable danger. “The risk is just not very high,” says Jessica Malaty Rivera, an infectious-disease epidemiologist at the Johns Hopkins Center for Health Security. Despite early pandemic worries, this particular coronavirus is more likely to use breath as a conduit than contaminated surfaces. That’s not to say that the virus couldn’t hop from hand to hand after, say, an ill-timed sneeze or cough right before a shake. But Emily Landon, an infectious-disease physician and hand-hygiene expert at the University of Chicago, thinks it would take a hefty dose of snot or phlegm, followed by some unwashed snacking or nose-picking by the recipient, to really pose a threat. So maybe it’s no shock that as 2020’s frantic sanitizing ebbed, handshakes started creeping back.

[Read: The great pandemic hand-washing blooper]

Frankly, that doesn’t have to be the end of the world. Even when considering more shake-spreadable pathogens, it’s a lot easier to break hand-based chains of transmission than airborne ones. “As long as you have good hygiene habits and you keep your hands away from your face,” Landon told me, “it doesn’t really matter if you shake other people’s hands.” (Similar rules apply to doorknobs, light switches, subway handrails, phones, and other germy perils.) Then again, that requires actually cleaning your hands, which, as Sklansky will glady point out, most people—even health-care workers—are still pretty terrible about.

For now, shakes don’t seem to be back to 2019 levels—at least, not the last time researchers checked, in the summer of 2022. But Gottsman thinks their full resurgence may be only a matter of time. Among her clients in the corporate world, where grips and grasps are currency, handshakes once again abound. No other gesture, she told me, hits the same tactile sweet spot: just enough touch to feel personal connection, but sans the extra intimacy of a kiss or hug. Fist bumps, waves, and elbow touches just don’t measure up. At the pandemic’s worst, when no one was willing to go palm-to-palm, “it felt like something was missing,” Carter told me. The lack of handshakes wasn’t merely a reminder that COVID was here; it signaled that the comforts of routine interaction were not.

If handshakes survive the COVID era—as they seem almost certain to do—this won’t be the only disease outbreak they outlive, Al-Shamahi told me. When yellow fever pummeled Philadelphia in the late 18th century, locals began to shrink “back with affright at even the offer of a hand,” as the economist Matthew Carey wrote at the time. Fears of cholera in the 1890s prompted a small cadre of Russians to establish an anti-handshake society, whose members were fined three rubles for every verboten grasp. During the flu pandemic that began in 1918, the town of Prescott, Arizona, went so far as to ban the practice. Each time, the handshake bounced back. Al-Shamahi remembers rolling her eyes a bit in 2020, when she saw outlets forecasting the handshake’s untimely end. “I was like, ‘I can’t believe you guys are writing the obituary,’” she told me. “That is clearly not what is happening here.”

Handshakes do seem to have a knack for enduring through the ages. A commonly cited origin story for the handshake points to the ancient Greeks, who may have deployed the behavior as a way to prove that they weren’t concealing a weapon. But Al-Shamahi thinks the roots of handshaking go way further back. Chimpanzees—from whom humans split some 7 million years ago—appear to engage in a similar behavior in the aftermath of fights. Across species, handshakes probably exchange all sorts of sensory information, Al-Shamahi said. They may even leave chemical residues on our palms that we can later subconsciously smell.

[Read: What a handshake smells like]

Handshakes aren’t a matter of survival: Plenty of communities around the world get by just fine without them, opting instead for, say, the namaste or a hand over the heart. But palm pumping seems to have stuck around in several societies for good reason, outlasting other customs such as curtsies and bows. Handshakes are mutual, usually consensual; they’re imbued with an egalitarian feel. “I don’t think it’s a coincidence that you see the rise of the handshake amongst all the greetings at a time when democracy was on the rise,” Al-Shamahi told me. The handshake is even, to some extent, built into the foundation of the United States: Thomas Jefferson persuaded many of his contemporaries to adopt the practice, which he felt was more befitting of democracy than the snobbish flourishes of British court.

American attitudes toward handshakes still might have undergone lasting, COVID-inspired change. Gottsman is optimistic that people will continue to be more considerate of those who are less eager to shake hands. There are plenty of good reasons for abstaining, she points out: having a vulnerable family member at home, or simply wanting to avoid any extra risk of getting sick. And these days, it doesn’t feel so strange to skip the shake. “I think it’s less a part of our cultural vernacular now,” Landon told me.

Sklansky, once again in the minority, is disappointed by the recent turn of events. “I used to say, ‘Wow, it took a pandemic to end the handshake,’” he told me. “Now I realize, even a pandemic has failed to rid us of the handshake.” But he’s not ready to give up. In 2015, he and a team of his colleagues cordoned off part of his hospital as a “handshake-free zone”—an initiative that, he told me, was largely a success among health-care workers and patients alike. The designation faded after a year or two, but Sklansky hopes that something similar could soon return. In the meantime, he’ll settle for declining every proffered palm that comes his way—although, if you go for something else, he’d rather you not choose the fist bump: “Sometimes,” he told me, “they just go too hard.”


Why Mayors Are So Unpopular

The Atlantic

www.theatlantic.com/ideas/archive/2023/01/city-mayor-unpopular-job-covid-crime-housing/672711

“I’m not going to sit here and tell you we did everything perfectly. We haven’t,” Lori Lightfoot, Chicago’s mayor, says in a campaign ad released late last year. “But we’ve tried our darndest to make sure we got it right, and when we haven’t—you pick yourself up and you listen and you’re humble and you learn from your mistakes.”

That might not be the most triumphant message for the incumbent to send Windy City voters as they decide whether to reelect her. But it is perhaps an honest one. Poll after poll has shown Chicagoans to be in a “sour” mood: A mere 9 percent believe that the city is headed in the right direction. Underwater on her approval rating, Lightfoot is not expected to win reelection next month.

It’s not just her. Eric Garcetti was term-limited and could not run for reelection in Los Angeles last year, but Angelenos probably would not have voted him in again even if he had been eligible; his approval rating had sagged nearly 20 points in the prior two years. In New York, Eric Adams’s approval rating fell more than 30 points in his first six months in office, though a majority of city voters said they still liked the guy’s style. Just a quarter of San Francisco residents rate London Breed’s performance as excellent or good, per a Chronicle poll in September; her popularity has “plummeted.” And in New Orleans, where the public is more dissatisfied with city leadership than at any time since the Hurricane Katrina era, LaToya Cantrell is facing a potential recall.

[Read: How to run for president while you’re running a city]

The Anna Karenina principle applies here: Each of these unpopular big-city mayors is unpopular in his or her own way. Yet sweeping national trends are stirring up public dissatisfaction with city executives across the country, driving down favorability ratings, ginning up recalls, and increasing retirements. Indeed, what had been one of the best perches in American politics is becoming one of its worst. The overwhelmingly liberal denizens of the country’s cities are disaffected and are holding their local leaders accountable for problems far beyond any one officeholder’s capacity to repair. That’s a trend that might get worse in the coming years.

Mayors, as a general point, have it good. They’re often well-liked, and not infrequently beloved. Their approval ratings tend to run high. Many of them have more formal power than, say, members of the House of Representatives do, and your average mayor has much more influence over the city she leads than the president has over domestic policy. They frequently win reelection. “Once a mayor’s in office, unless something disastrous happens, it’s hard to get rid of them,” Katherine Levine Einstein, a political scientist at Boston University, told me, discussing the frequency of long tenures among big-city executives.

Yet, at the moment, any number of mayors are struggling. Adams, Breed, and Lightfoot all have significantly lower approval ratings than the governor in their respective states, for instance, as did Garcetti before he left office. Even many well-liked mayors, such as Muriel Bowser of Washington, D.C., have watched their approval ratings drop of late.

In surveys, mayors themselves have expressed frustration as their community’s problems have become more intractable. They “feel like they’re being forced to deal with these big, macro problems, whether it’s crime, inflation, homelessness, housing costs, COVID, climate change,” Einstein told me. “They’re struggling with these issues. Their citizens feel really frustrated by these issues. But they often can’t do a whole lot about it.”

The early phase of the coronavirus pandemic burned a lot of mayors out, leading to a wave of retirements. City executives felt tasked with managing a public-health disaster far outside their normal purview; many struggled to design and implement masking-and-distancing mandates, initiatives to help small businesses, and educational policies that worked for parents, kids, and teachers’ unions. Those pressures might have eased, but new ones have taken their place.

The coronavirus crisis drags on. In big cities such as New York and San Francisco, the shift to working from home has left downtowns empty, destroying local businesses, making homelessness more conspicuous, and deepening residents’ sense of their vulnerability to crime. That exodus, now seemingly permanent, has decreased property-tax revenues and sapped public-transit systems of funds too, something urbanists are warning might turn into a “doom loop” of declining service and declining ridership.

At the same time, mayors are struggling with a surge in certain kinds of crimes. Homicides increased sharply in many American cities in 2020 and 2021, a trend that generated lots of media coverage and dampened many local officials’ favorability ratings. (Thankfully, the homicide wave has crested in many cities.) One study focused on New York found that an increase of 20 homicides in the city reduced mayoral approval by half a percentage point. “Successive months of increasing homicides could seriously damage a mayor’s standing with the public,” the authors note, with sustained increases proving “devastating.” Yet elected officials—including mayors and district attorneys—have a very limited impact on crime rates; even the police have less of an effect than you might think.

A third problem is the long-simmering housing crisis. Rents have increased relentlessly in big cities over the past two decades as a result of an undersupply of millions of units, squeezing residents’ budgets and leading to surges in homelessness while also jacking up the cost of services such as day care. Urban life has become an incessant, unaffordable grind, even for people with healthy incomes.

[Annie Lowrey: The U.S. needs more housing than almost anyone can imagine]

Mayors do seem to have some effect on housing prices, and they often have some control over real-estate development. But city executives cannot conjure up transitional housing units and affordable apartment buildings for low- and moderate-income residents. They often need to work, slowly and painstakingly, with planning commissions, city-council members, and neighborhood groups to get projects approved. And because cities typically must balance their budgets and have many claims on their dollars, mayors can’t easily raise billions to get their constituents off the streets. In Boston University’s latest Menino Survey of Mayors, three in four mayors said they were held accountable for homelessness, but only one in five said they had much control over the issue. “Limited funding is a serious obstacle to effectively reducing local homelessness,” the survey found.

The politics of housing development is tricky for mayors too: Although many city residents are desperate for more apartment construction to bring prices down, many others are NIMBYs who do not want to see their property values stagnate, long-term residents who do not want to see their neighborhoods change, or both. Stopping development makes people angry. Pushing development makes people angry. And mayors get held to task one way or the other.

Some mayors, of course, add to their own burdens: Adams recently faced criticism for having left his city for the Caribbean during a deadly winter storm, and Lightfoot has engaged in a bruising fight with the city’s teachers’ union. But mayors’ burdens are great, and have gotten greater. The overwhelmingly Democratic residents of America’s cities have high expectations. And mayors have limited resources and power to meet them. Until these places become more vibrant, cheaper, and more livable, a mayor’s job won’t be getting any easier.

Science Has a Crummy-Paper Problem

The Atlantic

www.theatlantic.com/newsletters/archive/2023/01/academia-research-scientific-papers-progress/672694

This is Work in Progress, a newsletter by Derek Thompson about work, technology, and how to solve some of America’s biggest problems. Sign up here to get it every week.

We should be living in a golden age of creativity in science and technology. We know more about the universe and ourselves than we did in any other period in history, and with easy access to superior research tools, our pace of discovery should be accelerating. But, as I wrote in the first edition of this newsletter, America is running out of new ideas.

“Everywhere we look we find that ideas … are getting harder to find,” a group of researchers from Stanford University and MIT famously concluded in a 2020 paper. Another paper found that “scientific knowledge has been in clear secular decline since the early 1970s,” and yet another concluded that “new ideas no longer fuel economic growth the way they once did.”

In the past year, I’ve traced the decline of scientific breakthroughs and entrepreneurship, warned that some markets can strangle novelty, and investigated the domination of old movies and songs in the film and music industries. This year, a new study titled “Papers and Patents Are Becoming Less Disruptive Over Time” inches us closer to an explanation for why the pace of discovery has slowed. The upshot is that any given paper today is much less likely to become influential than a paper in the same field from several decades ago. “Our study is the first to show that progress is slowing down, not just in one or two places, but across many domains of science and technology,” Michael Park, a co-author and professor at the University of Minnesota, told me.

The researchers relied on a metric called the Consolidation-Disruption Index—or CD Index—which measures the influence of new research. For example, if I write a crummy literature review and no scientist ever mentions my work because it’s so basic, my CD Index will be extremely low. If I publish a paradigm-shifting study and future scientists exclusively cite my work over the research I rendered irrelevant, my CD Index will be very high.
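To make the metric concrete, here is a minimal sketch of how a disruption score of this kind can be computed, assuming toy data. It is a simplification, not the exact formula from the study: the published CD Index also counts later papers that cite only the focal paper’s predecessors, and it is computed over a fixed citation window. Every paper name below is hypothetical.

def cd_index(focal_refs, citing_refs):
    # Toy Consolidation-Disruption score for one focal paper.
    # focal_refs: set of works the focal paper itself cites.
    # citing_refs: one reference set per later paper that cites the focal paper.
    # A citing paper that skips the focal paper's sources counts as
    # disruptive (+1); one that cites them too counts as consolidating (-1).
    if not citing_refs:
        return 0.0
    scores = [(-1 if refs & focal_refs else 1) for refs in citing_refs]
    return sum(scores) / len(scores)

# A paradigm-shifting paper: later work cites it instead of its sources.
print(cd_index({"old_a", "old_b"}, [{"focal"}, {"focal", "new_x"}]))  # 1.0

# A literature review: later work keeps citing the originals alongside it.
print(cd_index({"old_a", "old_b"}, [{"focal", "old_a"}]))  # -1.0

On a measure like this, the study’s finding is that the average paper’s score has drifted steadily toward the consolidating end, across nearly every field, since the middle of the 20th century.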

This new paper found that the CD Index of just about every academic domain today is in full-on mayday! mayday! descent. Across broad landscapes of science and technology, the past is eating the present, progress is plunging, and truly disruptive work is hard to come by. Despite an enormous increase in scientists and papers since the middle of the 20th century, the number of highly disruptive studies each year hasn’t increased.

Why is this happening?

One possibility is that disruptive science is becoming less productive as each field becomes more advanced and the amount of knowledge new scientists have to acquire increases. This is sometimes called the “burden of knowledge” theory. Just as picking apples from a tree becomes harder after you harvest the low-hanging fruit, science becomes harder after researchers solve the easiest mysteries. This must be true, in some cases: Calculating gravity in the 1600s basically required a telescope, pen, and paper. Discovering the Higgs boson in the 21st century required constructing a $10 billion particle collider and spending billions more firing subatomic particles at one another at near–light speed. Pretending these things are the same is not useful.

A related theory is Johan S. G. Chu’s concept of “durable dominance”—a phenomenon where highly competitive fields create a small number of dominant winners. Chu and the University of Chicago scholar James Evans found that progress has slowed in many fields because scientists are so overwhelmed by the glut of information in their domain that they’re reading and riffing on the same limited canon of famous papers. It’s more or less the same principle as a weekend couch potato overwhelmed by streaming options who opts to just watch the top-ranked TV show on Netflix. In both science and streaming, a surplus of options might be entrenching a small number of massive hits.

When I spoke with the disruption paper’s co-authors last week, they seemed interested in explanations beyond the burden-of-knowledge theory. “If the low-hanging-fruit theory were sufficient, then I think we’d expect to see the oldest fields stagnate most dramatically,” said Russell Funk, a co-author and professor at the Carlson School of Management. “But the fact that the decline in disruption is happening across so many fields of science and technology points to something broader about scientific practice, and the corporatization of science, and the decline of scientific exploration in the last few decades.”

In other words, if science is getting less productive, it’s not just because we know too much about the world. It’s because we know too little about science itself. Or, more specifically, we know too little about how to conduct research in a way that gets the best, most groundbreaking results.

According to the rules of modern academia, a young academic should build status by publishing as many papers in prestigious journals as she can, harvest the citations for clout, and solicit funding institutions for more money to keep it all going. These rules may have been created with the best intentions—to fund the most promising projects and ensure the productivity of scientists. But they have created a market logic that has some concerning consequences.

First, these rules might discourage truly free exploration. As the number of Ph.D. students has grown, National Institutes of Health funding has struggled to keep up. Thus, the success rate for new project grants has mostly declined in the past 30 years. As grants have become more competitive, savvy lab directors have strategically aimed for research that seems plausible but not too radical—optimally new rather than totally new, as one researcher put it. This approach may create a surplus of papers that are designed to advance knowledge only a little. A 2020 paper suggested that the modern emphasis on citations to measure scientific productivity has shifted rewards and behavior toward incremental science and “away from exploratory projects that are more likely to fail, but which are the fuel for future breakthroughs.” As attention given to new ideas has decreased, science has stagnated.

Second, at the far extreme, these incentives might create a surplus of papers that just aren’t any good—that is, they exist purely to advance careers, not science.

“I definitely think there’s something to the idea that there are just a lot more bullshit papers out there,” Funk told me. Rather than blame individual scientists, he said the fault lies in a system that encourages volume over quality: “There are journals, which I’d consider predatory journals, that make researchers pay money to publish their papers there, with only symbolic peer review, and then the journals play games by making the authors cite articles from the same journal.”

Funk’s predatory-journal story reminded me of the dark side of Moneyball: When any industry focuses too much on one metric, it can render the metric meaningless and warp the broader purpose of the industry. Just as we are living in a platinum age of television—more quantity but perhaps not more quality—we seem to be in a platinum age of science, in which the best you can say about the industry is that there certainly seems to be more of everything, including crap.

A year ago, I pitched the idea of an abundance agenda, arguing that the U.S. suffers from a scarcity mindset in health care, housing, and beyond. The crisis in science offers an interesting test of this thesis in that researchers are struggling with a superabundance of knowledge and studies. It’s a useful reminder that abundance is not a sufficient end point; rather, it’s an input. Science may have a deficit of disruption precisely because the industry doesn’t know how to navigate its crisis of plenty—too much knowledge to synthesize, and too many papers bolstering their authors’ reputation without expanding the frontier of science.

Opinion: The staggering mistake Hamline University made is no isolated incident

CNN

www.cnn.com/2023/01/09/opinions/hamline-university-prophet-mohammed-academic-freedom-ctrp-perry/index.html

Well over a decade ago, I found myself teaching about abortion, eugenics, evolution, holy war and the history of the papacy at a Catholic university in the Chicago area. If you've been inundated with stories about the threats of campus culture to free speech, you might have expected me to have been worried. But although I had students who opposed my beliefs on every issue, I knew that at Dominican, everyone — from the chair of my department to the president of the university — had my back. If a student felt that my teaching somehow violated their beliefs and complained, I always knew that so long as I performed with integrity and care, I'd be fine. And I was, even when teaching Darwin to a creationist.