‘It’s Really First-Class Work’

The Atlantic

www.theatlantic.com › culture › archive › 2023 › 07 › oppenheimer-richard-rhodes-interview › 674828

This article contains spoilers for the film Oppenheimer.

Few authors have written as insightfully about the life of J. Robert Oppenheimer as Richard Rhodes, whose 1986 book, The Making of the Atomic Bomb, is widely regarded as the definitive account of the Manhattan Project. Rhodes’s comprehensive history, which won a Pulitzer Prize, is both a massive work of scholarship—the main text alone runs nearly 800 pages—and a literary feat that he conceived as “the tragic epic of the twentieth century.” Over the years, according to Rhodes, it has been optioned many times, but no film or television version has ever been made. “It’s quite obvious why,” Rhodes told me. “It’s just too big a story.”

Over the weekend, along with millions of other moviegoers, Rhodes saw Oppenheimer, Christopher Nolan’s three-hour biopic of the physicist known as the father of the atomic bomb. The next day, curious about his reaction, I spoke with Rhodes by phone. He was deeply impressed by the film, especially in light of earlier attempts to adapt the same material. “It’s really first-class work,” Rhodes said, comparing it favorably with Roland Joffé’s Fat Man and Little Boy (“badly done,” from a technological perspective) and specifically praising Cillian Murphy’s performance in the title role. “If anything, he was a little too confident. But Oppenheimer was pretty confident.”

We also discussed aspects of the story that weren’t covered by the film, which Nolan adapted from the biography American Prometheus, by Kai Bird and Martin J. Sherwin. Given its relentless concentration on Oppenheimer, the movie necessarily leaves a lot out, including plenty of what Rhodes called “drama on the industrial side” and the perspectives of scientists and victims who fall outside its protagonist’s circle of consciousness. For the rest, viewers may need to return to Rhodes’s own wide-ranging work, which expands beyond even the largest IMAX screen.

This conversation has been edited for length and clarity.

Alec Nevala-Lee: Do you think that the film’s picture of Oppenheimer is accurate?

Richard Rhodes: One time I asked [the physicist] Bob Serber if my portrait of Oppenheimer was anywhere close to the real human being. And Serber, who had a very dry wit, said, “It’s the least wrong of all those I’ve seen.”

And I think that applies here, because the difficult edges to Oppenheimer were, to some degree, sanded off. But there have been several Oppenheimers in past versions. The BBC did a series with Sam Waterston. He was wonderful, but he was much too nice. Then when the next version [the 2009 PBS docudrama The Trials of J. Robert Oppenheimer] came around, [David Strathairn] played Oppenheimer as a hand-wringing neurotic, which really pissed me off when I watched it. You could not possibly have someone who did what Oppenheimer did in his life who was just sitting around shaking all the time with anxiety.

[Read: Oppenheimer is more than a creation myth about the atomic bomb]

Nevala-Lee: Most viewers are probably encountering figures such as Lewis Strauss (the government official who orchestrated the notorious hearing that revoked Oppenheimer’s security clearance, played by Robert Downey Jr. in a towering performance) and Leslie Groves (the military head of the atomic-weapons program, played by Matt Damon) for the first time.

Rhodes: Yeah, I think Strauss, if anything, was depicted somewhat more pleasantly than he really was. I think he was even more nasty. And I was really surprised by Matt Damon, who did a damn good job. In fact, it gave me a different sort of perspective on Groves. I had pictured him as stuffier than he was depicted here, and I think this is probably closer to the truth. Groves was really a superb leader, and also anxious and insecure around the scientists. Which was a funny combination, because he drove them to get the job done anyway.

Nevala-Lee: Was there anything else about the movie that surprised you?

Rhodes: Mostly minor things. I’d read about the arrival of the shock wave after the light [from the Trinity test], but my God—when you see it in IMAX, it really hits you; it resonates in your chest. We were just knocked back in our chairs. I wish [Edward] Teller [Oppenheimer’s nemesis in the debate over the hydrogen bomb] had been a little different. I spent an interesting 30 minutes with Teller and had some sense of what he was like. That guy [Benny Safdie] was a little too oily, not quite as sinister as Teller really was.

Nevala-Lee: Nolan has always struck me as a pretty cerebral guy who also makes movies on the largest possible scale. It tracks to me that Oppenheimer, a theorist who found himself in charge of this incredible industrial operation, would appeal to him.

Rhodes: That makes sense. My experience with writing books is, your best books are the ones that you have a deep emotional investment in. And there’s an automatic tendency when you’re writing a biography to turn the character in the biography into oneself.

Nevala-Lee: Nolan, who is willing to play with structure, seems like a good choice for this story, because it allows him to deliver so much information. He can cut between the Manhattan Project period, the Oppenheimer hearing, and the hearing for Strauss’s nomination as secretary of commerce, and use the dynamic to explain things to the audience.

Rhodes: I had never thought about the parallel between Strauss and Oppenheimer before, but the story is structured so that both of them are destroyed by the forces of Washington, D.C. And that’s really a wonderful sort of parallel. Oppenheimer’s kind of a tragic hero, and I wouldn’t give that credit to someone like Strauss. But in a kind of corrupt way, he followed the same arc across his life. That was a real insight that I haven’t seen—maybe it’s in the biography [American Prometheus].

Nevala-Lee: I read it recently, and Strauss’s hearing takes up just a single paragraph. But Nolan decides to make it a fifth of the movie, for the reasons you’re saying. There’s this fascinating parallel that is possible only in a movie—the thematic echoes and the rhythm of the editing provide a sense of closure that would be much more difficult in book form.

Rhodes: You can do that, but you’d have to have that insight. And Nolan had that insight. When you do research for a book, often you’ll come across something that can be expanded upon. When I was working on The Making of the Atomic Bomb, I read a history of the development of physics in the United States. And in a footnote at the end of a chapter deep in the book, there’s this note about [Enrico] Fermi, one day going up to the window and looking down at the gray winter length of Manhattan Island—alive with crowds—and cupping his hands together and saying, “A little bomb like that and it would all disappear.” The historian who wrote this book threw that away into a footnote. I made it the end of the whole first third of my book.

Nevala-Lee: The movie for the most part is very realistic, but dreamlike moments visualize Oppenheimer’s psychological state, which reminded me of your book. The opening paragraph starts with the physicist Leo Szilard—whom you use in your book as a “clothesline” character, someone the audience can follow across a complex narrative—crossing the street, with a description of what the weather was like in London, and then it ends with a passage out of John Milton. And that elevates the tone in a way that tells you something about the material.

Rhodes: That’s what I was trying to do, of course. But I thought it was really off tone when [Nolan, in one of those dream sequences] had Oppenheimer and Jean Tatlock screwing on the table in the security hearing. That was, I think, maybe a bit of an overreach. It’s curious and interesting that they decided not to visit Hiroshima.  

Nevala-Lee: I was wondering whether Nolan would show that, but every scene that’s not about Strauss is from Oppenheimer’s point of view. So instead of the bombing, you see him waiting for a phone call, because he has no control over how the weapon is used. There’s an earlier scene where the characters talk about saving lives by heading off an American invasion of Japan. Is this something that would have been discussed before the bombing, or is this a rationalization that defenders of the decision arrived at after the fact?

Rhodes: Well, George Marshall [the U.S. Army chief of staff during World War II] said we knew that the Japanese were getting their people trained to fight us. And we thought that if we could bomb the beaches with atomic bombs, we might shock the Japanese into surrender. In fact, there were plans to keep going. I found a memorandum from Oppenheimer to Groves saying if we make a design that uses both plutonium and uranium, we can have six bombs a month by October.

Nevala-Lee: If the movie had included that suggestion, it would have really changed the viewer’s sense of Oppenheimer.

Rhodes: There was also discussion in ’43 or ’44 of making radiation bombs stuffed with cobalt 60 or something that would just spread radioactive particles all over the place. We did some tests down in New Mexico, and I remember someone’s comment afterward—it was the most god-awful stuff you could imagine. Which, yes, it would be, wouldn’t it?

[Read: The real lesson from The Making of the Atomic Bomb]

Nevala-Lee: I recently watched the opera Doctor Atomic, where one character is Oppenheimer’s Native American maid. That’s the kind of voice you don’t hear in the movie.

Rhodes: It was certainly a valid perspective. These guys came in and swept the mesa clean, and they used the Native American people to clean their houses.

Nevala-Lee: Nolan is so focused on Oppenheimer—but with a movie like this, you have to find, as you’ve said, the clothesline.

Rhodes: And there was so much drama on the industrial side that’s basically just left out. It’s compressed into something that is really very brilliant—those big open jars in which they keep dropping marbles [to track the supply of uranium and plutonium]. That’s as close as we come to seeing the Hanford complex, with its huge production reactors, or the Oak Ridge complex, with one factory that was [about] a mile long, so the supervisors inside rode around on bicycles.

Nevala-Lee: How do you feel about the impact this movie will have on how a mass audience understands this immensely complicated story?

Rhodes: I’ve been living with this story now for 40, 50 years. So what I’m most excited by—you will consider this crass, but this is where I am in my life—is we’ve got an option from a German company for The Making of the Atomic Bomb to be made into a multipart television series. And I’m just hoping that this will cause enough stir that these people will finally, for the first time in all these years, actually pick up the option. That would be just wonderful, and I’d pay off my mortgage, and my version of the story would be out there.

Why Elon Killed the Bird

The Atlantic

www.theatlantic.com › technology › archive › 2023 › 07 › twitter-musk-x-rebrand › 674818

In May, Elon Musk presided over an uncharacteristically subtle tweak to Twitter’s home page. For years, the prompt in the text box at the top of the page read, “What’s happening?,” a friendly invitation for users to share their thoughts. Eight months after the billionaire’s takeover, Twitter changed the prompt ever so slightly to match the puzzling, chaotic nature of the platform under the new regime: “What’s happening?” became “What is happening?!”

This question, with its exclamatory urgency, has never been more relevant to Twitter than in the past 48 hours, when Musk decided to nuke 17 years’ worth of brand awareness and rename the thing. The artist formerly known as Twitter is now X. What is happening?! indeed.

I have three answers to that question, beyond the simple “Nothing much.” (Even with its new name, the site is pretty much the same as ever; the blue bird logo in the left-hand corner of the website is now a black X, and … that’s about it.) This X boondoggle may simply be the flailing of a man who doesn’t want to own his social network and was pressured via lawsuit to buy it, but Musk and the Twitter (X?) CEO, Linda Yaccarino, would like you to believe that much bigger things are coming.

1. Musk wants to build the “Internet of Elon.”

The first theory requires taking Musk’s ambitions somewhat seriously. In a tweet last October, he declared that “buying Twitter is an accelerant to creating X, the everything app.” Versions of these “super apps” already exist and are popular in Asia, Africa, and Latin America, so we know what this means: X would function as a holistic platform that includes payment processing and banking, ride-sharing, news, communication with friends, and loads and loads of commerce. Think of it as the internet, but without leaving Musk’s walled garden. In a leaked recording of a Twitter town hall back in June 2022, Musk hinted that China’s WeChat was a model of sorts for X. “There’s no WeChat movement outside of China,” he said. “And I think that there’s a real opportunity to create that. You basically live on WeChat in China because it’s so useful and so helpful to your daily life. And I think if we could achieve that, or even close to that with Twitter, it would be an immense success.”

If you squint, you can see the logic. Plenty of people across Twitter—influencers, freelancers, small-business owners—use the platform to sell things. Many of these people use various “link in bio” pages to direct people to their work and receive payment. X, the theoretical everything app, could streamline and consolidate these exchanges, and, like WeChat, generate revenue from them. Musk has already launched a program to share ad revenue with some of Twitter’s larger “creator” accounts, the early beneficiaries of which included many prominent right-wing shock jocks. And if he could manage to get hundreds of millions of people to live and shop and bank on his app, instead of, say, shitpost memes and argue politics with white nationalists, that would be an immense success.

To do that, X would need to build out an advanced, secure payment platform; apply for the appropriate regulatory licenses to legally process payments and store money; and, of course, recruit businesses and financial institutions to use the platform. According to the Financial Times, the company began filing applications for those licenses early this year and has been at work building parts of a payment infrastructure. The bad news is that the person Musk tasked with spearheading the project was laid off—along with about 80 percent of Twitter’s workforce.

[Read: I watched Elon Musk kill Twitter’s culture from the inside]

Musk does have experience in the payments business. He founded an online bank—also called X.com—in 1999, and it shortly thereafter merged with Confinity to become PayPal. This might lend his super-app idea some credibility, were it not for the fact that Musk has spent the past 15 months blundering through his latest business venture in full view of the public. He has alienated advertisers with his reactionary and conspiratorial political opinions, sent the company deep into debt, and, at times, rendered the platform unusable by limiting how many tweets users can see. The platform appears to be shrinking under Musk’s leadership, according to third-party traffic data. Spam is rampant, and the most satisfied users seem to be previously banned racists and anti-Semites who have regained access to their megaphone.

It’s already a hard sell for Musk to convince people that his sputtering platform is the best place for posting Barbenheimer memes, let alone the right home for … everything, including our money. But the cognitive dissonance between Musk’s reputational hemorrhaging and his grand ambitions is easier to bridge when you consider the second way to explain X, which is that it’s a desperate shot in the dark.

2. X is pseudoware—just buzzwords and a logo.

Unlike Facebook’s pivot to Meta, which was oriented around a real (though flawed and unappealing) virtual-reality product, X is a rebrand built on little more than a vague collection of buzzwords cobbled together to form a complete sentence. On Sunday, Yaccarino described the forthcoming app as “the future state of unlimited interactivity” that is “centered in audio, video, messaging, payments/banking—creating a global marketplace for ideas, goods, services, and opportunities.” She also noted that, “powered by AI, X will connect us all in ways we’re just beginning to imagine.” (Yaccarino did not respond to a request for comment.)

Her tweet is a near-perfect example of business-dude lorem ipsum—corporate gibberish that sounds superficially intelligent but is actually obfuscating. What is “unlimited interactivity”? How will X be “powered by AI”? Does she mean generative AI like ChatGPT, or standard algorithms of the sort that have powered Twitter’s “For you” feed for years? The particulars are irrelevant: The words just need to sound like something when strung together.

Musk, too, is guilty of such blabber. He has argued that his hypothetical project, if built correctly, could “become half of the global financial system.” This is the empty language of a dilettante, the equivalent of me telling you that this article, if written correctly, is on pace to win the Pulitzer Prize for explanatory journalism, or that I am the executive wordsmith for The Atlantic’s Words About Computers section.

Musk’s and Yaccarino’s descriptions of X don’t just suggest vaporware, an industry term for a hyped-up product that never materializes. They feel like something else, too: I’d call it pseudoware. Like a pseudo-event, pseudoware masquerades as something newsworthy, even though it is not. Yaccarino’s Mad Libs–ian tweetstorm is the press release for this pseudo-event: big talk for a pivot that has so far manifested in ornamental changes. Musk’s first orders of business have been to adopt a new logo and to project it in conference rooms at headquarters. When Yaccarino tweeted that “X will connect us all in ways we’re just beginning to imagine,” I take her at her word: No one seems to have spent much time thinking about any of this.

It’s natural to wonder why the world’s richest man would spend his time dismantling one of the world’s most recognizable social-media brands in favor of an inscrutable super app nobody asked for. He could, at any moment, jet off to a private island and drink bottomless piña coladas while giggling about Dogecoin instead! Herein lies the third and most important explanation for X, which is that it is a reputational line of credit for Musk.

3. Musk needs to save face.

Musk’s reputation is dependent on people believing that he can gin up new categories of industry and see around corners to build futuristic stuff. His $44 billion acquisition of Twitter was marketed with a visionary framework: Musk would realize Twitter’s dreams of being a truly global town square and solve the intractable problems of free speech at scale. Having failed at that, Musk is, in essence, going back to basics with X—it’s an opportunity to remake the internet in his own image.

[Read: Elon Musk really broke Twitter this time]

As Bloomberg’s Max Chafkin pointed out, Musk’s original X.com brand was a total failure. He was obsessed with the name and the web address, but consumers, not illogically, associated the brand with adult-entertainment sites. Musk was ousted by X.com’s board after the merger in 2000, a year before the service was renamed PayPal, but the payment company remains an important part of his legacy. X, then, is a callback to a younger version of the entrepreneur—one who is associated with success, products that work, and generous investor returns. It also represents unfinished business and a chance to rewrite an earlier phase of his career.

In this sense, X is less a brilliant vision than it is an act of desperation. Musk is behaving much like a start-up with an unsustainable burn rate—he needs a cash injection. And in order to raise some reputational capital, he needs a good idea, one that seems plausible and scalable. X checks all the boxes for such an idea. Its ambitions are so vague as to be boundless, which signals perpetual growth and moneymaking—it is, quite literally, the everything app. In this way, X is an inkblot test for anyone who still wants to believe in Musk: Squint the right way and X takes on the form of whatever hopes and dreams that person has for the future of the internet. This is the fleeting genius of pseudoware: The best feature of marketing an app for everything is that you can get away with saying mostly nothing.

But just how many people still have unwavering faith in Musk is an open question. How many people are willing to give the benefit of the doubt to the man who can’t pay his rent and server bills on time? Who is going to help build this technological behemoth for the man who fired the majority of his company and allegedly owes $500 million in severance payments? And, perhaps most important, who is excited about what Musk is focusing his time, energy, and money on?

Even though X wants to be big, a WeChat clone is conceptually small in comparison with the projects that have allowed Musk to market himself as a savant: space exploration, rewiring the human brain, revolutionizing transportation to save the Earth. The most cynical view of X is that the people who still respect Musk are reactionaries who delight only when their enemies have been gleefully trolled. Perhaps, then, Musk’s dismantling of the social network is a success, so long as it makes the right people miserable. Left with few options, Musk has decided to mortgage the Twitter brand to save his own. Even by his standards, it’s a risky bet.

The Atlantic Hires Michael Powell and Zoë Schlanger as Staff Writers

The Atlantic

www.theatlantic.com › press-releases › archive › 2023 › 07 › atlantic-hires-michael-powell-and-zoe-schlanger › 674739

The journalists Michael Powell and Zoë Schlanger will join The Atlantic as staff writers, editor in chief Jeffrey Goldberg announced today. Michael has been a reporter at The New York Times since 2007, and will begin with The Atlantic next month. Zoë will start this fall, covering issues of climate and writing the newsletter The Weekly Planet, which tells the story of life on a changing planet.

In a note to staff, Jeff wrote: “Michael and Zoë are brilliant additions to our growing roster of staff writers. Michael is one of the preeminent reporters working today, and Zoë is a young journalist of exceptional promise. I am committed to providing our readers with the best journalism from the best writers, and Michael and Zoë will help us achieve this goal.”

At The New York Times, Michael covered presidential campaigns, reported on the economy, wrote the “Gotham” column for the Metro section, and for six years was the “Sports of the Times” columnist. Most recently, he was a national reporter covering issues around free speech and expression, and stories capturing intellectual and campus debate. He and two colleagues won the George Polk Award for reporting on a corrupt police detective—stories that led to more than a dozen exonerations, including freeing a man who had served 22 years for a murder he did not commit—and he was part of a team that won the Pulitzer for breaking-news reporting on Eliot Spitzer. Before joining the Times, Michael worked for The Washington Post from 1996 to 2006, where he covered the 2000 presidential campaign and later served as New York bureau chief.

Zoë is a distinguished science reporter and the author of a forthcoming book, The Light Eaters, about plant intelligence. She has contributed to The Atlantic, The New York Times, New York Review of Books, and Audubon magazine, among other publications. She was previously a staff reporter at Newsweek and later Quartz, reporting on climate change, the environment, health, and science policy. She has received several awards for her reporting, including for her coverage of the global plastic trade and air pollution in Detroit.

Other recent journalists to join The Atlantic are Hanna Rosin, as a senior editor and host of the Radio Atlantic podcast; the Pulitzer Prize winner Stephanie McCrummen as a staff writer, after nearly two decades at The Washington Post; and Laura Secor as a senior editor directing coverage of global issues and foreign policy. Laura was a features editor for The Wall Street Journal’s Weekend Review, and previously a deputy editor at Foreign Affairs.

The Writers Who Went Undercover to Show America Its Ugly Side

The Atlantic

www.theatlantic.com › books › archive › 2023 › 07 › detective-books-wwii-racism-anti-semitism › 674658

In the years during and after World War II, the battle against fascism spread to an unanticipated front line: the national conscience of the United States. The warriors in this fight, many of them Black and Jewish veterans of combat abroad, insisted that America confront and rectify its homegrown racial hierarchy and religious intolerance. “Double V” was the slogan coined by the African American newspaper The Pittsburgh Courier, meaning victory over Hitler abroad and over Jim Crow at home.

The seeds of what would eventually become the civil-rights movement included not only mass protest and political mobilization but a wide array of cultural and artistic expressions. Some of them—Frank Sinatra’s song and short film The House I Live In; a Superman radio serial pitting the Man of Steel against a thinly veiled version of the Ku Klux Klan—sought nothing less than a redefinition of American identity that would embrace racial and religious minorities. In his 1945 film, Sinatra came to the defense of a Jewish boy menaced by a gentile mob. On the radio serial a year later, Superman protected a Chinese American teenager from the lethal assault of the “Clan of the Fiery Cross.” The lyrics of The House I Live In captured the new ethos: “The faces that I see / All races and religions / That’s America to me.”

Alongside these sunnier affirmations of inclusion, there appeared a withering critique of American bigotry in the form of a very specific subset of books. All of them, whether fictional or factual, employed the identical device of a writer going undercover to discover and expose the bigoted netherworld of white Christian America. Within the finite period of six years beginning in 1943, these books became both commercial phenomena and effective goads to the national soul. They explicitly sought a mass audience by employing devices borrowed from detective novels, espionage fiction, and muckraking journalism: the secret search, the near-escape from being found out, the shocking revelation of the rot hiding just below the surface of normal life. Whatever these books may have lacked in sentence-to-sentence literary elegance, they made up for with page-turning drama.

Unfortunately, for the most part, they have since been forgotten, or simply overwhelmed by the volume of World War II self-congratulation, however well deserved. But in their own time period, when these books were reaching millions of readers, a victorious America was by no means presumed to be an innocent America. Within a year of V-J Day, the investigative journalist John Roy Carlson released his exposé of domestic right-wing extremism, The Plotters, and laid out the stakes starkly:

We’ve won the military war abroad but we’ve got to win the democratic peace at home. Hitlerism is dead, but incipient Hitlerism in America has taken on a completely new star-spangled face. It follows a ‘Made in America’ pattern which is infinitely subtler and more difficult to guard against than the crude product of the [pro-fascist German American] Bundists. It is found everywhere at work in our nation. It’s as if the living embers had flown over the ocean and started new hate fires here while the old ones were dying in Europe.

Carlson did not need Nazi Germany to alert him to the perils of mass bigotry. His real name was Avedis Derounian, and as a boy, he had fled the Turkish genocide against Armenians. Having mastered English as a high-school student on Long Island and an undergraduate at New York University, Derounian found his way during the late 1930s into Friends of Democracy, an anti-fascist organization led by a Unitarian minister. With the title of chief investigator and a salary of $50 a week, Derounian developed a cover as the publisher of a pro-fascist newspaper, the Christian Defender, and soon found settings where he could immerse himself among the purveyors of hate he meant to expose: a pro-Nazi summer camp on Long Island, the “Christian Mobilizers” militia formed by the right-wing radio priest Charles Coughlin, and a Bund rally in Madison Square Garden that flanked a portrait of George Washington with a pair of swastikas.

[Read: The new anarchy]

Derounian inhabited his doppelgänger so deftly that sometimes he even joined in the shouting. His Christian Defender newspaper looked so genuine that the U.S. State Department launched an investigation of it and Derounian hurriedly stopped publishing. All this derring-do led to some trenchant and disturbing conclusions. “My experience convinced me,” Derounian wrote, “that under the slogans of ‘patriotism’ they were inoculating innocent Americans with the virus of hate, undermining confidence in our leaders, promoting hate and suspicion.”

When his book Under Cover landed—all 521 pages, not counting index, illustrated with dozens of reproduced extremist documents—it was impossible to ignore. According to a compilation by Daniel Immerwahr, a historian of ideas, Under Cover was the best-selling nonfiction book in America in 1943, ultimately going through 20 printings. The Army Air Forces had Derounian speak to enlisted men on the theme “The Enemy Within.”

At the book’s end, Derounian promised readers (and himself), “I am going back to the world I left behind … to live in the sunshine again.” He did no such thing. Instead, he cloaked himself in the character of Robert Thompson, a disgruntled war veteran, and extended his stealthy inquiry from America’s wartime traitors to its peacetime demagogues. Most prominent among them was Gerald L. K. Smith, the minister who founded the America First political party (the name an homage to the isolationist movement that featured the aviation hero Charles Lindbergh) as the electoral vehicle for his virulent racism and anti-Semitism. But Derounian also found extremism in women’s groups with such anodyne names as “United Mothers.”        

“The conclusion is inescapable,” Derounian wrote, “that while we have won a war of democracy over fascist evil abroad, we have allowed hate and prejudice to gain a firm foothold at home.” A page later, he continued, “The grim fact is that they have infiltrated into the warp and woof of American life.”        

Given the massive attention that Derounian’s books received, it seems entirely possible, even probable, that the novelist Laura Z. Hobson took note of his methodology. Though her married surname obscured the fact, Hobson was the daughter of two Jewish immigrants of socialist leanings, and the Z stood for her family patronymic of Zametkin. Her novel Gentleman’s Agreement—excerpted in Cosmopolitan magazine in late 1946 and published in early 1947—inverted Derounian’s tactic of pretending to be an extremist by having a gentile journalist, Philip Green, purport to be Jewish in order to write a magazine exposé about anti-Semitism. And whereas Derounian had revealed the bellicose, violent style of Jew-hating embodied by Silver Shirts, the German American Bund, and their ilk, Hobson used the fictive Green to unveil the polite, socially acceptable anti-Semitism of the country club and exclusive hotels and neighborhoods. Eventually Green’s own fiancée shows herself to be one of those refined bigots, or at least an apologist for them, and the revelation ruptures the couple’s engagement.

“It’s just that I’ve come to see that lots of nice people who aren’t [anti-Semites] are their unknowing helpers and connivers,” Green lectures his fiancée. “People who’d never beat up a Jew or yell kike at a child. They think antisemitism is something way off there, in a dark crackpot place with low-class morons. That’s the biggest thing I’ve discovered about this whole business.”

Hobson’s message clearly struck a chord. Gentleman’s Agreement went through three printings before its official publication date and ultimately sold 1.6 million copies. As a manual of moral instruction, Gentleman’s Agreement was released in a special Armed Services Edition for the American military. Magnifying the novel’s impact, a film adaptation written by Moss Hart, directed by Elia Kazan, and starring Gregory Peck as Philip Green received eight Oscar nominations in 1948 and won three, including for Best Picture and Best Director. A straight line can easily be drawn from Peck playing one version of the ethical role model in Gentleman’s Agreement and another 15 years later as Atticus Finch in To Kill a Mockingbird.

[Read: Is Holocaust education making anti-Semitism worse?]

In the same year when the fictive Philip Green loomed so large in American popular culture, an award-winning journalist was undertaking a real-life version of passing. Ray Sprigle of the Pittsburgh Post-Gazette had already won a Pulitzer Prize for revealing that Supreme Court Justice Hugo Black had belonged to the Ku Klux Klan. For another investigative scoop, Sprigle had disguised himself as a psychiatric patient in order to expose an abusive state hospital. But to similarly report on racism in the South, Sprigle, who was white, needed to fake his way across the color line. He failed in several attempts to chemically dye his skin, because the substances could cause illness or even death if he kept using them, before settling on shaving his scalp to leave no telltale straight hairs and then tanning for three weeks in Florida. His success at the deception depended on the “one-drop rule” of racial identity, in which any American with the slightest fraction of African ancestry, regardless of pigment, was categorized as Black. In a way, Sprigle was reversing the passing formula deployed by Walter White, the executive director of the NAACP, who used his fair skin and hair to pretend to be white while courageously researching racist attacks, many of them against Black war veterans returning to the South.

With the pseudonym of James R. Crawford and a backstory about being “a light-skinned Negro from Pittsburgh,” Sprigle crossed the Mason-Dixon line—the “Smith and Wesson line to us black folk”—in one of the all-Black railroad carriages known as a “Jim Crow car.” During four “fear-filled weeks,” Sprigle embedded himself in the very heart of the former Confederacy: Georgia, Alabama, and Mississippi. He bore witness to the financial exploitation of the sharecropping system, the miserly funding for Black schools, the refusal of white hospitals to admit a Black woman needing an emergency Cesarean section, who ultimately died untreated. Sprigle also paid sympathetic attention to the echelon of Black professionals—dentists, professors, doctors, lawyers, NAACP activists, real-estate developers—who nonetheless found their social status to be relegated below the poorest, least-educated white person.

“These whites … were a people entirely alien to me, a people set far apart from me and my world,” Sprigle wrote in his Black persona. “The law of this new land I had entered decreed that I had to eat apart from these pale-skinned men and women—behind that symbolic curtain.” At the same time, he added perceptively, “Not that I wanted to ride with these whites or eat with them. What I resented was their impudent assumption that I wanted to mingle with them, their arrogant and conceited pretense that no matter how depraved and degenerate some of them might be, they [were] … of a superior breed.”

Sprigle produced a 21-part series for the Post-Gazette, “I Was a Negro in the South for 30 Days,” which began running in August 1948. Newspapers as wide-ranging as the Pittsburgh Courier, The Seattle Times, and the New York Herald Tribune reprinted the series, providing national exposure. Then, in 1949, Simon & Schuster collected the articles in book form under the title In the Land of Jim Crow.

The effect that Derounian, Hobson, and Sprigle had on American public opinion and policy cannot be quantified. But it also seems more than accidental that their books—along with Sinatra’s song and film; the Superman radio series; and such works as Richard Wright’s memoir, Black Boy (1945), and Gunnar Myrdal’s sociological tome, An American Dilemma (1944)—coincided with a surge of activism against racism and anti-Semitism during the 1940s. One need not employ the term woke to suggest that these books, movies, songs, and comics roused many Americans from a complacent moral slumber.

The Democratic Party embraced civil rights for the first time in its platform at the 1948 convention, driving the bloc of southern segregationists to form their Dixiecrat third party. Within weeks of the convention, President Harry Truman issued executive orders desegregating the military and the federal workforce. Also in 1948, the Supreme Court unanimously ruled in Shelley v. Kraemer that restrictive covenants, the sort routinely used to keep Black people, Jews, and other minority groups out of certain neighborhoods, were unconstitutional. These efforts amounted to a kind of proto–civil-rights movement, anticipating what we know as the civil-rights movement that launched in the mid-1950s with the Supreme Court’s decision outlawing school segregation in Brown v. Board of Education and the Montgomery bus boycott led by Martin Luther King Jr.

Yet Ray Sprigle’s book about his time being Black in the South sold only modestly, and that disappointing outcome may well have reflected more than the endemic capriciousness of the publishing industry. The historical moment during and immediately after the war years, when America belatedly began to redress its own deep-seated prejudices, ended as abruptly as one could say the words Cold War. By 1949, the anti-fascist alliance between the United States and the Soviet Union had mutated into global ideological and military rivalry. As Derounian had presciently foreseen in The Plotters, the specter (and partial but exaggerated reality) of communism in the United States had supplanted the actually existing presence of American right-wing extremists as public enemy No. 1. To express the belief that America was imperfect, indeed hypocritical, in its claims of equality, was to risk being branded disloyal and caught up in the Red Scare.

[Read: America has had it worse]

None of Hobson’s subsequent novels nearly equaled the sales of Gentleman’s Agreement. Derounian wrote only one more book in the remaining decades of his life, dying in 1991 at the age of 82. Sprigle died in a car accident in 1957. Four years later, the white writer John Howard Griffin basically adopted Sprigle’s idea and method of traversing the Jim Crow South as a Black man. (Unlike Sprigle, Griffin was able to dye his skin dark without medical risks.) With the civil-rights movement compelling America to once again regard itself in the moral mirror, Griffin’s book Black Like Me sold more than 1 million copies and was adapted for a film. More recently, one of Sprigle’s successors at the Pittsburgh Post-Gazette, Bill Steigerwald, recounted the race series in a 2017 book, 30 Days a Black Man. And Rachel Maddow’s 2022 podcast, Ultra, which focused on the pro-Nazi movement in 1940s America, made reference to Derounian’s work in Under Cover.

Among these authors of the 1940s, Hobson has fared best. But the lingering impact of Gentleman’s Agreement surely owes more to the film adaptation, which neatly pruned away some of the novel’s formulaic subplots, than the book itself. The works of Derounian and Sprigle, so daring in their time, fit very awkwardly within current norms. ABC News lost a federal court case (though the verdict was reversed on appeal) for planting reporters with false résumés as workers in Food Lion supermarkets to expose unsafe practices. The Chicago Sun-Times was denied a Pulitzer Prize in 1978 for a series about corrupt city inspectors that involved creating a phony bar, wryly called the Mirage, that was staffed by journalists and equipped with hidden cameras. As for a journalist or nonfiction author pretending to be a Black person, even for the sake of chronicling discrimination, the gambit would assuredly be reviled as cultural appropriation at best and its own form of liberal racism at worst.

And in Trumpian America, the excretions of racism, anti-Semitism, homophobia, and on down the list hardly feel the need to hide. Yet, for that very reason, there is immense value in cracking open the books of Derounian and his fellow truth detectives from nearly 80 years ago. They provide a piercing reminder of the deep roots, indeed the nearly identical vocabulary and populist demagoguery, of the hatred on such lurid display today.

Hip-Hop’s Midlife Slump

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 07 › hip-hop-mainstream-evolution-puff-daddy-hamptons-white-party › 674575

In the summer of 1998, the line to get into Mecca on a Sunday night might stretch from the entrance to the Tunnel nightclub on Manhattan’s 12th Avenue all the way to the end of the block; hundreds of bodies, clothed and barely clothed in Versace and DKNY and Polo Sport, vibrating with anticipation. Passing cars with their booming stereos, either scoping out the scene or hunting for parking, offered a preview of what was inside: the sounds of Jay-Z and Busta Rhymes and Lil’ Kim. These people weren’t waiting just to listen to music. They were there to be part of it. To be in the room where Biggie Smalls and Mary J. Blige had performed. To be on the dance floor when Funkmaster Flex dropped a bomb on the next summer anthem. They were waiting to be at the center of hip-hop.

What they didn’t realize was that the center of hip-hop had shifted. Relocated not just to another club or another borough, but to a beachfront estate in East Hampton. Although Sundays at the Tunnel would endure for a few more years, nothing in hip-hop, or American culture, would ever be quite the same again.

It’s been 25 years since Sean Combs, then known as Puff Daddy, hosted the first of what would become his annual White Party at his home in the Hamptons. The house was all white and so was the dress code: not a cream frock or beige stripe to be seen. Against the cultural landscape of late-’90s America, the simple fact of a Black music executive coming to the predominantly white Hamptons was presented as a spectacle. That summer, The New York Times reported, “the Harlem-born rap producer and performer had played host at the Bridgehampton polo matches, looking dapper in a seersucker suit and straw boater. The polo-playing swells had invited him and he had agreed, as long as the day could be a benefit for Daddy’s House, a foundation he runs that supports inner-city children.”

To be clear, hip-hop was already a global phenomenon whose booming sales were achieved through crossover appeal to white consumers. Plenty of them were out buying Dr. Dre and Nas CDs. Combs was well known to hip-hop aficionados as an ambitious music mogul—his story of going from a Howard University dropout turned wunderkind intern at Uptown Records to a mega-successful A&R executive there was the kind of thing that made you wonder why you were paying tuition. But to those young white Americans, in 1998, he was just the newest rap sensation to ascend the pop charts. When Combs’s single “Can’t Nobody Hold Me Down” hit No. 1 on the Billboard Hot 100 the year before, it was only the tenth rap track to do so. The genre was still viewed as subversive—“Black music” or “urban music,” music that was made not for the polo-playing swells, but for the inner-city children whom their charity matches benefited.

Hip-hop was born at a birthday party in the Bronx, a neglected part of a neglected city. The music and culture that emerged were shaped by the unique mix of Black and Puerto Rican people pushed, together, to the margins of society. It was our music. I was a Nuyorican girl in Brooklyn in the ’80s and ’90s; hip-hop soundtracked my life. If Casey Kasem was the voice of America, on my radio, Angie Martinez was the voice of New York.

When I went to college in Providence, I realized all that I’d taken for granted. There was no Hot 97 to tune into. There were no car stereos blasting anything, much less the latest Mobb Deep. Hip-hop became a care package or a phone call to your best friend from home: a way to transcend time and space. It also became a way for the few students of color to create community.

You could find us, every Thursday, at Funk Night, dancing to Foxy Brown or Big Pun. Sundays, when the school’s alternative-rock station turned the airwaves over to what the industry termed “Black music,” were a day of revelry. Kids who came back from a trip to New York with bootleg hip-hop mixtapes from Canal Street or off-the-radio recordings from Stretch Armstrong and Bobbito Garcia’s underground show were lauded like pirates returning home with a bounty. We knew that hip-hop was many things, but not static. We understood that it was going to evolve. What we weren’t perhaps ready for was for it to go truly mainstream—to belong to everyone.

The media were quick to anoint Combs a “modern-day Gatsby,” a moniker Combs himself seems to have relished. “Have I read The Great Gatsby?” he said to a reporter in 2001. “I am the Great Gatsby.” It’s an obvious comparison—men of new money and sketchy pasts hosting their way into Long Island polite society—but a lazy one. Fitzgerald’s character used wealth to prove that he could fit into the old-money world. Combs’s White Party showcased his world; he invited his guests to step into his universe and play on his terms. And, in doing so, he shifted the larger culture.

Would frat boys ever have rapped along to Kanye West without the White Party? Would tech bros have bought $1,000 bottles of $40 liquor and drunkenly belted out the lyrics to “Empire State of Mind”? Would Drake have headlined worldwide tours? Would midwestern housewives be posting TikToks of themselves disinfecting countertops to Cardi B songs? It’s hard to imagine that a single party (featuring a Mister Softee truck) could redefine who gets to be a bona fide global pop star but, by all accounts, Puffy was no ordinary host.

The man had a vision. “I wanted to strip away everyone’s image,” Combs told Oprah Winfrey years after the first White Party, “and put us all in the same color, and on the same level.” That the level chosen was a playground for the white and wealthy was no accident. Upon closing a merger of his Bad Boy record label with BMG for a reported $40 million in 1998, he told Newsweek, “I’m trying to go where no young Black man has gone before.”

“It was about being a part of the movement that was a new lifestyle behind hip-hop,” Cheryl Fox told me. Now a photographer, she worked for Puffy’s publicist at the time of the first White Party. The Hamptons, the all-white attire: It was Puffy’s idea. But the white people, she said, were a publicity strategy. “He was doing clubs, and he was doing parties that did not have white people,” she told me. “I brought the worlds together, and then I was like, ‘You got to step out of the music. You can’t just do everything music.’” She meant that he should expand the guest list to include actors and designers and financiers—the kinds of people who were already flocking to the Hamptons.

In the end, “I had the craziest mix,” Combs told Oprah. “Some of my boys from Harlem; Leonardo DiCaprio, after he’d just finished Titanic. I had socialites there and relatives from down south.” Paris Hilton was there. Martha Stewart was there. “People wanted to be down with Puff,” Gwen Niles, a Bad Boy rep at the time, told me about that first party. “People were curious: Who is this rap guy?”

Hip-hop was already popular. The message the party sent was that hip-hop, and the people who made it, were also “safe.”

Rap music was for so long cast by white media as dangerous, the sonic embodiment of lawlessness and violence. This narrative was so sticky that it kept hip-hop confined to the margins of pop culture despite its commercial success.

Hip-hop didn’t always help itself out here. Artists screwed up in the ways artists in all genres do—with drug addictions, outbursts, arrests—but when it came to hip-hop, those transgressions were used to reinforce cultural stereotypes. Misogyny had been embedded in the lyrics of hip-hop nearly since its inception. A heartbreaking 2005 feature by Elizabeth Méndez Berry in Vibe exposed the real-world violence inflicted upon women by some of hip-hop’s most beloved artists, including Biggie Smalls and Big Pun. Homophobia in hip-hop perpetuated anti-queer attitudes, particularly in communities of color. And although lyrical battles have always been a thing, rhetorical fights never needed to become deadly physical ones.

This was the context in which Puffy headed to the Hamptons. Though only 28, he had baggage. While a young executive at Uptown in 1991, he had organized a celebrity basketball game at CUNY’s City College to raise money for AIDS charities. Tickets were oversold, and a stampede left nine people dead and many more injured. The tragedy stayed in the headlines for weeks. (Years later, Puffy would settle civil suits with victims.)

In 1993, Combs launched Bad Boy Records, with a roster of stars such as Biggie. The label met with immediate success, but also controversy, after a shooting involving the California rapper Tupac Shakur embroiled Bad Boy in a contentious battle between East and West. By the spring of 1997, Biggie and Tupac were dead—Biggie gunned down in Los Angeles in what appeared to be retribution for the killing of Tupac the year before. Biggie was shot while stopped at a red light; Combs was in another car in the entourage. (Neither murder has been solved.) That fall, Combs performed “I’ll Be Missing You,” his tribute to Biggie, live at MTV’s Video Music Awards. With a choir in the rafters, Combs danced through his grief. It was a moment of rebirth, of reinvention. Combs and the gospel singers wore white.

To be clear, most of what Puffy was making as an artist and producer in this era was accessible to a white, affluent fan base. These were the kind of tracks that sampled songs your parents would have danced to, spliced and sped up so that you wanted to dance to them now. Outside of “I’ll Be Missing You” and a few songs about heartbreak, many of the lyrics were about getting, having, and spending money.

But Puffy made possible the crossover explosion of more substantial artists such as Lauryn Hill and OutKast and Jay-Z, the first generation of hip-hop superstars.

You could also say that Puffy took a musical neighborhood—one that held history and heritage and layers of meaning—and gentrified it. Cleaned it up for whiter, wealthier patrons to enjoy, people who had no idea of what the “old ’hood” was about. Both things can be true.

The summer of 1998 was also the summer before my last year of college. Up in Providence, a local copycat of Hot 97 had cropped up and gained traction: WWKX, Hot 106, “the Rhythm of Southern New England.” Seemingly overnight, the frat houses added DMX to their rotation. A classmate—a white socialite from the Upper East Side—came back senior year with box braids, describing herself as a real “hip-hop head.” Funk Night became a campus-wide phenomenon, and then it ceased to exist. Nobody needed a hip-hop night when every night was hip-hop night.

In rap, the feeling was “I’m keeping it real. I’m gonna stay on this block,” Jay-Z recounts of this era in the Bad Boy documentary, Can’t Stop, Won’t Stop. “And our feeling was like, Yeah? I’ll see you when I get back.” Emotions around this ran hot at the time—the idea that hip-hop had left its true fans behind. But in the end, more of us were happy to see hip-hop conquer the world than were grouching in the corner about the good ol’ days.

In 2009, Puffy, by then known as Diddy, relocated his White Party to Los Angeles; hip-hop’s new mecca was the land of celebrity. The vibe, according to people who were there, just wasn’t the same. But hip-hop itself was moving on to bigger and bigger arenas. In 2018, hip-hop dominated streaming and accounted for more than 24 percent of record sales. That same year, Eminem headlined Coachella, Drake dominated the Billboard Hot 100 for months, and Kendrick Lamar won a Pulitzer Prize.

Then something shifted again. This year isn’t just the 25th anniversary of the first White Party. It’s the 50th anniversary of hip-hop itself. And although it’s come a long way since Kool Herc deejayed a Bronx basement dance party, the genre appears to be suffering a midlife slump.

For the first time in three decades, no hip-hop single has hit No. 1 yet this year. Record sales are down. According to one senior music executive I spoke with, who asked to remain anonymous because she wasn’t authorized to speak, festivals have been reluctant to book rappers as headliners since 2021. That’s the year that eight people were crushed to death at the Astroworld Festival in Houston; two more died later of their injuries. The performer Travis Scott was accused (fairly or unfairly) of riling up the crowd. (Coachella hasn’t had a true hip-hop headliner since Eminem.)

But the other question is: Which headliners would they even book? Kendrick Lamar is winding down his 2022 tour. Nicki Minaj doesn’t have a new album coming out until the fall. Staple acts such as J. Cole probably won’t release an album this year at all. Megan Thee Stallion, who got shot a few years ago and has been feeling burned out by the industry, is taking a break from music. As the legendary artists Too $hort and E-40 wrote in this magazine, since 2018, hip-hop has lost at least one rapper a year to violence. The careers of Gunna and Young Thug—two major acts on the rise—have stalled while they’ve been caught up in RICO charges in Atlanta. (Perhaps sensing an opportunity, Drake just announced that a new album and tour would be coming soon.)

Recently, The New York Times ran an article about how the Hamptons have lost their cool. Too affluent. Too old. Too out of touch. Maybe hip-hop, for the first time, is suffering from similar doldrums. But obituaries to the genre have been written before. It’s only a matter of time before a new Gatsby shows up, ready to throw a party.