
Opinion: The political tightrope that Trump's Republican rivals must walk

CNN

Progressives may be eager to see former President Donald Trump potentially fingerprinted and told he has the right to remain silent. But let's be clear — no one knows how the possible criminal charges against Trump will play out politically. Or even if prosecutors in New York, Georgia and Washington, DC will ultimately obtain indictments against the former president.

The Crisis of the Intellectuals

The Atlantic

In 2017, I was trying to write How to Be an Antiracist. Words came onto the page slower than ever. On some days, no words came at all. Clearly, I was in crisis.

I don’t believe in writer’s block. When words aren’t flowing onto the page, I know why: I haven’t researched enough, organized the material enough, thought enough to exhume clarity, meticulously outlined my thoughts enough. I haven’t prepared myself to write.

But no matter how much I prepared, I still struggled to convey what my research and reasoning showed. I struggled because I was planning to challenge traditional conceptions of racism, and to defy the multiracial and bipartisan consensus that race neutrality was possible and that “not racist” was a definable identity. And I struggled because I was planning to describe a largely unknown corrective posture—being anti-racist—with long historical roots. These departures from tradition were at the front of my struggling mind. But at the back of my mind was a more existential struggle—a struggle I think is operating at the front of our collective mind today.

[Ibram X. Kendi: The mantra of white supremacy]

It took an existential threat for me to transcend my struggle and finish writing the book. Can we recognize the existential threat we face today, and use it to transcend our struggles?

As I tried to write my book, I struggled over what it means to be an intellectual. Or to be more precise: I struggled because what I wanted to write and the way in which I wanted to write it diverged from traditional notions of what it means to be an intellectual.

The intellectual has been traditionally framed as measured, objective, ideologically neutral, and apolitical, superior to ordinary people who allow emotion, subjectivity, ideology, and their own lived experiences to cloud their reason. Group inequality has traditionally been reasoned to stem from group hierarchy. Those who advance anti-racist, anti-sexist, anti-classist, and anti-homophobic ideas have historically been framed as anti-intellectual.

The traditional construct of the intellectual has produced and reinforced bigoted ideas of group hierarchy—the most anti-intellectual constructs existing. But this framing is crumbling, leading to the crisis of the intellectual.

Behind the scenes of the very public anti–critical race theory, anti-woke, and anti–anti-racism campaign waged mostly by Republican politicos is another overlapping and more bipartisan campaign waged mostly by people who think of themselves as intellectuals. Both campaigns emerged in reaction to the demonstrations in the summer of 2020 that carried anti-racist intellectuals to the forefront of public awareness.

These intellectuals not only highlighted the crisis of racism but, in the process, started changing the public conception of the intellectual. Their work was more in line with that of medical researchers seeking a cure to a disease ravaging their community than with philosophers theorizing on a social disease for theory’s sake from a safe remove. We need the model these new intellectuals pursued to save humanity from the existential threats that humans have created, including climate change, global pandemics, bigotry, and war.

But this new conception of the intellectual and those who put it into practice face all sorts of resistance. Opponents denounce the “illiberal” dangers of identity politics and proclaim the limits of “lived experience.” They argue that identity politics makes everything about identity, or spurs a clash of identities. In fact, the term identity politics was coined in the 1970s, a time when Black lesbian women in organizations like Boston’s Combahee River Collective were being implored to focus their activist work on the needs of Black men, in Black power spaces; white women, in feminist areas; and gay men, in gay-liberation struggles—on everyone’s oppression but their own. They were determined to change that. “This focus upon our own oppression is embodied in the concept of identity politics,” Demita Frazier, Beverly Smith, and Barbara Smith wrote in the 1977 Combahee River Collective statement. It is common sense for people to focus on their own oppression, but these activists did not wish to focus only on their own oppression. The Combahee River Collective was “organizing Black feminists as we continue to do political work in coalition with other groups.”

Forty-six years later, when intellectuals of all races produce work on matters primarily affecting white people, the assumed subject of intellectual pursuits, these thinkers are seldom accused of engaging in identity politics. Their work isn’t considered dangerous. These thinkers are not framed as divisive and political. Instead, they are praised, for example, for exposing the opioid crisis in white America, praised for pushing back against blaming the addicted for their addictions, praised for enriching their work with lived experiences, praised for uncovering the corporations behind the crisis, praised for advocating research-based policy solutions, praised for seeking truth based on evidence, praised for being intellectuals. As they all should be. But when anti-racist intellectuals expose the crisis of racism, push back against efforts to problematize people of color in the face of racial inequities, enrich our essays with lived experiences, point to racist power and policies as the problem, and advocate for research-based anti-racist policy solutions, the reactions couldn’t be more different. We are told that “truth seeking” and “activism” don’t mix.

American traditions do not breed intellectuals; they breed propagandists and careerists focusing their gaze on the prominent and privileged and powerful and on whatever challenges are afflicting them. Intellectuals today, when focused on the oppression of our own groups—as embodied in the emergence of Queer Studies, Women’s Studies, African American Studies, Native American Studies, Critical Whiteness Studies, Disability Studies, Latino Studies, Jewish Studies, Middle Eastern Studies, and Asian American Studies—are ridiculed for pursuing fields that lack “educational value,” and our books, courses, programs, and departments are shut down and banned by the action of Republicans and the inaction of Democrats. We are told to research, think, and write about people, meaning not our people. We are told to let our people die. We are told to die.

[Jarvis R. Givens: What’s missing from the discourse about anti-racist teaching]

Think about the gaslighting of it all. We are told that white people are being replaced in society and in their jobs within the “intellectual” class. One of the most successful living authors, James Patterson, claimed that white men are experiencing “another form of racism” as they, according to Patterson, struggle to break through as writers in publishing, theater, TV, and film.

Aggrieved white people and their racist propagandists are offering similarly dangerous replacement theories across the “intellectual” class. If white people are being replaced by Black and Latino people, then why are Black and Latino people still underrepresented across many sectors of the “intellectual” class—among authors, in publishing, among full-time faculty, in newsrooms? (Such evidence likely compelled James Patterson to backtrack and apologize.) With all of this evidence, other commentators have focused on the extent of “self-censorship” or “cancel culture” affecting white people (as if people of color aren’t self-censoring or being canceled at least as often). Worst of all, the racist perpetrators of these theories, like Donald Trump, frame themselves as the victims. When Scott Adams has his comic dropped after he called Black people a “hate group” and told his white listeners “to get the hell away from Black people,” they claim that the real problem is anti-whiteness.

And then, when anti-racist intellectuals historicize these white-supremacist talking points about anti-racism being anti-white and give evidence of their long and deep and violent history, when we historicize disparities like the racial wealth gap that are as much the product of the past as the present, when new research and thinking allow us to revise present understandings of the past, when we use the past to better understand the present and the future, we are told to keep the past in the past. We are told not to change the inequitable present, and not to expect anything to change in the future. We are told to look away as the past rains down furiously on the present. Or we are told that intellectuals should focus only on how society has progressed, a suicidal and illogical act when a tornado is ravaging your community. Yet again, we are told to let our people die. We are told to die.

“Above all, historians should make us understand the ways in which the past was distinct,” the New York Times columnist Bret Stephens wrote. When we are told that historical writings should be irrelevant to our contemporary debates, it is not hard to figure out why. History, when taught truthfully, reveals the bigotry in our contemporary debates. Which is why the conservators of bigotry don’t want history taught in schools. It has nothing to do with the discomfort of children. It is uncomfortable for the opponents of truthful history to have the rest of us see them, to have their kids see them. They don’t want anyone to clearly see how closely they replicate colonizers, land stealers, human traders, enslavers, Klansmen, lynchers, anti-suffragists, robber barons, Nazis, and Jim Crow segregationists who attacked democracy, allowed mass killings, bound people in freedom’s name, ridiculed truth tellers and immigrants, lied for sport, banned books, strove to control women’s reproduction, blamed the poor for their poverty, bashed unions, and engaged in political violence. Historical amnesia is vital to the conservation of their bigotry. Because historical amnesia suppresses our resistance to their bigotry.

Or, for others, it is about conserving tradition. James Sweet, while serving as the American Historical Association president last year, challenged what he calls “presentism” in the profession. He recently clarified that his target was the “professional historians who believe that social justice should be their first port of entry, which is not the way that we’ve traditionally done history.” And yet, throughout most of the history of history as a discipline, historians have centered Europe, white people, men, and the wealthy in their accounts and composed tales of their superiority. That is the way historians have traditionally done history until recent decades, all of this social injustice entering our collective consciousness clothed in neutrality and objectivity. So now, abolishing the master’s narrative and emancipating the truth must be one of our first ports of entry. To be an intellectual is to know that the truth will set humanity free to gain the power to make humanity free.

Maybe I did have writer’s block when I started composing How to Be an Antiracist back in 2017. I did not suffer from that sort of blockage when writing Stamped From the Beginning, several years earlier. Writing that book was like writing in a cave, to the cave. I didn’t think many people would read the book, let alone think of me as an intellectual. All I cared about was writing history.

But when Stamped From the Beginning won a National Book Award, I began to think about my standing as an intellectual. Suddenly, I was writing in the public square, to the public square. The traditional strictures kept blocking the writing. Be objective. Be apolitical. Be balanced. Be measured. Your primary audience should be others in your field. Keep them in mind. Do not defy the orthodoxy they created. Reinforce it. Satisfy them to advance your career. I faced a blockade of old and fraught traditions regarding what it means to be an intellectual that had nothing to do with the process of truth finding and telling.

[Ibram X. Kendi: The double terror of being Black in America]

Traditional notions of the intellectual were never meant to include people who looked like me or who had a background like mine, who came from a non-elite academic pedigree, emerged proudly from a historically Black university, earned a doctorate in African American Studies. Traditional notions of the intellectual were never meant to include people who researched like me, thought like me, wrote like me—or who researched, thought, or wrote for people like me. Traditional notions of the intellectual were never meant to include people who are not ranking groups of people in the face of inequity and injustice. Traditional notions of the intellectual were never meant to include those of us who are fixated and focused wholly and totally on uncovering and clarifying complex truths that can radically improve the human condition. Traditional notions of the intellectual were never meant to include our conception of the intellectual.

I knew this. I knew about the equation of the Enlightenment and “reason” and “objectivity” and “empiricism” with whiteness and Western Europe and masculinity and the bourgeoisie. I knew that Francis Bacon, the father of “empiricism” in the sciences, held anti-Black racist ideas, and that his work became the basis for “empirical” quests among eugenicists to assert natural human hierarchy that climaxed in the mass sterilization of Black and Latina and disabled and low-income women in the United States and in the Holocaust of Jews and other “undesirables” in Nazi Germany. I knew that the originator of “objectivity” in history, Leopold von Ranke, believed that the “world divinely ordered” meant Europeans, Christians, and the wealthy at the top. I knew that bigoted academics, who obscured their bigotry behind their objectivity, founded almost every academic discipline in the United States. I knew that objectivity and the construct of “balance” migrated from the U.S. academy to U.S. journalism as professional ideals after World War I, when a wave of newspaper mergers and closings compelled reporters to appeal to wide swaths of the public. (Sound familiar?) I knew that the Hutchins Commission, organized in 1947 to report on the proper function of the media, had warned against objective and balanced reporting that was “factually correct but substantially untrue.” I knew that traditional conceptions of the intellectual serve the status quo of injustice.

Intellectuals who are people of color, women, non-Christian, LGBTQ, or working class—indeed intellectuals of all identities who have challenged the status quo, especially traditional and bigoted conventions—have historically been cast aside as nonintellectuals. Commentators lambasted the investigative journalist and educator Ida B. Wells as “partisan” and “a licentious defamer” for the “obscene filth that flows from her pen”—all for finding and telling the hard truths about lynchings. Scholars described W. E. B. Du Bois, a pioneering historian, sociologist, and editor, as “bitter” after he wrote The Souls of Black Folk and his magnum opus, Black Reconstruction. In his landmark book, An American Dilemma, the Swedish Nobel laureate and economist Gunnar Myrdal dismissed the work of Carter G. Woodson—the father of Black History Month—and other Black scholars studying “Negro history and culture” as “basically an expression of the Negro protest,” in spite of its “scholarly pretenses and accomplishments.”

Gay professors were among those harassed and arrested by the U.S. Park Police’s “Pervert Elimination” campaign in Washington, D.C., in 1947—just as LGBTQ teachers are being harassed and censored today. Spelman College fired the Jewish professor Howard Zinn in 1963 for “radicalizing” Black women students by telling them the truth about U.S. history—and firings or threats of firing continue today at other schools and colleges. In 2021, the University of North Carolina’s board of trustees denied tenure to the Pulitzer Prize–winning journalist and 1619 Project creator Nikole Hannah-Jones over “politics.”

When the traditionalists today disagree with the evidence-based findings of intellectuals—or envy the prominence of our work—too often they do not contest our findings with their own evidence. They do not usually engage in intellectual activity. They misrepresent our work. They play up minor typos or small miscues to take down major theses. They call us names they never define, like “leftist” or “Marxist” or “woke” or “socialist” or “prophet” or “grifter” or “political” or “racist.” All to attack our credibility as intellectuals—to reassert their own credibility. In politics, they say, when you can’t win on policy, you smear the candidate. In intellectualism, when you can’t win on evidence, you smear the intellectual.

I knew the smears were coming, because I knew history. What blocked my writing bound my intellectualism. What finally set me free to be an intellectual was the face of death, a face I still stare at to amass the courage to be an intellectual.

It took me all of 2017 to write six chapters of How to Be an Antiracist. A slog. But when doctors diagnosed me with Stage 4 colon cancer in January 2018, when I figured I probably wouldn’t survive a disease that kills 86 percent of people in five years, when I decided that this book would be my last major will and testament to the world, everything that blocked my writing wilted away, along with my prospects for living. I no longer cared about those traditional conceptions of the intellectual—just like I no longer cared about the orthodoxy of racial thinking. I no longer cared about the backlash that was likely to come. All I cared about was telling the truth through the lens of research and evidence, reaction be damned. And just like that, between chemotherapy treatments, the words started flowing, furiously: 13 chapters in a few months.

Since I wasn’t going to live, I wanted to write a book that could help prevent our people from dying at the hands of racism. Yes, I was told I would die, but I wanted to tell my people to live. Like an intellectual.

The Many Pieces of Catherine Lacey

The Atlantic

My favorite work by the artist X, An Account of My Abduction, depicts a kidnapping. For part of the 87-minute video, a woman lies taped up on the floor, writhing, while a voice off camera hisses threats at her. The woman on the floor is named Věra. The one off camera is named Yarrow Hall. The video is disturbing for multiple reasons. It captures suffering and vulnerability. It presents brutality as art. And both of the women are actually characters inhabited by X. The abduction is staged, performed, fabricated, whatever word you prefer. But its first viewers didn’t know what they were looking at, or whether it was real or invented. And once they realized it was the latter, they were confused by what felt like deception—a reaction that seems to have been the point.

I’ve never actually seen An Account of My Abduction. No one has, or will. But you can “view” it yourself in Catherine Lacey’s genre-quaking new novel, Biography of X, which invents X, and her assumed identities, and her big, brash, occasionally stunty body of work. X is a creation in the vein of David Bowie and Kathy Acker and Cindy Sherman and Andrea Fraser—a shape-shifter who encourages her fictional selves to metastasize until they kick her out of her own life, an iconoclast with many noms de plume but no answers about her own childhood or upbringing. “It only seems to be a simple question—Where are you from? It can never be sufficiently answered,” she enigmatically tells a magazine interviewer, posing the question that animates every inch of Biography of X.

This is Lacey’s fourth novel, and she has shown a keen streak of inventiveness and ambition that’s been rewarded with much recognition: She’s won the New York Public Library Young Lions Fiction Award, a Guggenheim, and a Whiting. But Biography of X revels in the kind of identity theft that artists (and writers) employ to build the stories of their work and themselves. Lacey fashioned this enigma that is X—a woman known for her “uncommon brutality” and venomous disdain—out of dozens of artists and provocateurs and hucksters who inhabit our world, but she also made her something inimitable, a vehicle for exploring Lacey’s favorite theme: the fungibility of identity. “I think because I’m an artist,” X says, “my image will always come before me.” In creating this character made up of characters, Lacey has posed an unanswerable question about whether an artist can bury herself so far under work that it becomes impossible to find the traces of an authentic self.

Sitting at a downtown-Manhattan restaurant on a warm, gusty winter afternoon, Lacey came across as more contemplative and unencumbered than enfant terrible—she was wearing a fluffy, forest-green coat and looked at me through wide blue eyes; large paper-clip tattoos on each wrist appeared to secure her hands to her body. She looked slightly perplexed when I came at her with sharp-angled questions, like I was trying to pry open a shell for a pearl that was already strung on a necklace.

X hides herself so well that her own wife doesn’t even know her birthplace. But the Catherine Lacey who wrote Biography of X and produced its brilliant, vicious, capricious protagonist—an unstable new element in the periodic table of literature—doesn’t believe in a unified theory of the self, so she was happy to hand me remnants of her own life and let me create some Cubist version of her. Under a photo of us that she posted to Instagram right after we met, she wrote that she still has “no idea how to properly organize past selves,” an idea she explained to me that day: You contain multiple people, from different periods of your life, and you lose some of them along the way. “There’s a part of me that feels really troubled by [that] separation of identities,” she noted. “I don’t know; isn’t it troubling?” It seems to me that it is: The lifelong project of making a self is, by nature, hopeless. Turning Lacey into one firmly outlined person seems against the spirit of her project.

A few facts anyway: Lacey is 37. She was born in Tupelo, Mississippi, but hasn’t lived in the state since she left for boarding school at age 14. She pinballs around: Right now she’s living in the Ditmas Park neighborhood of Brooklyn, swapping houses with a friend for the place she shares in Mexico City. “I don’t have a region,” she told me. “I’m not of any one place.” She’s been married (to a performance artist) and partnered and un-partnered and re-partnered again. In the past nine years, she’s emerged as the rare young writer who has successfully produced a true oeuvre: Her novels vary thematically—they include a hypnotic road-trip tale (Nobody Is Ever Missing, 2014), a speculative pseudo-satire of dating and mating (The Answers, 2017), and a Shirley Jackson–esque race-and-gender fable (Pew, 2020)—but they all share Lacey’s particular ability to build sturdy narratives that point to the flimsiness of narrative itself.

Lacey is an open book but a profiler’s riddle, even though that’s the kind of writing she once hoped to produce: “I wanted to be doing what you’re doing,” she said, with a look of wonder—she wanted to write nonfiction and coerce artists into sharing their lives. Biography of X, a true magnum opus, plants real lives—like Bowie’s and Acker’s, along with figures as varied as Connie Converse, Frank O’Hara, Richard Serra, and Susan Sontag—alongside the fictional, spirographing the two together. It’s almost a form of profile writing, but she’s suitably busted up the whole thing to retrofit those “real” lives to her protagonist’s purposes.

[Read: Understanding your past won’t liberate you]

Biography of X serves as the title of two books, actually: Lacey’s novel and the biography “inside” that novel (“published” by Farrar, Straus and Giroux in 2005), written by CM Lucca, a lapsed journalist. She is also X’s widow—the story is told in retrospect by a grieving spouse using biography to make sense of the unknowable person she loved. CM (alternately Charlotte Marie or Cynthia Malone, depending) obsessively roots through paperwork and gallery slides, interviews old friends and enemies, tries to fill in the broad gaps in the personal history of a woman who appeared seemingly out of nowhere in New York in 1972 and ended up with a retrospective at MoMA two decades later. X was the kind of artist who provoked conversation whenever she exhibited new work—less a lightning rod than the lightning itself. She had several personas: Clyde Hill, a cult novelist with New Directions; Martina Riggio, a feminist small-press founder; the aforementioned Yarrow Hall and Věra, who each put out work of their own.

Lacey’s power as a mimic is on full display here: Her creations are all as believable as X is, even when we know they are Cindy Sherman–like roles, pulled on as a kind of winking game. Longing rises up from every crack. X, CM explains, “lived in a play without intermission in which she cast herself in every role.” But who was she? CM can gather all sorts of information on her wife—through Vanity Fair profiles, towers of notebooks in her study, critics’ takedowns. But she yearns to identify the precipitating event that turned her into X: a name that signifies no name, a woman who claimed, “It’s not that I am a private person; I am not a person at all.” CM wants to know where X came from in order to make sense of her.

Lacey is happy to disclose bits from her own past. As we scanned our menus, she told me that she’s been a vegetarian ever since she read Leviticus during her church-intensive childhood and decided that no matter what her mother said, she’d likely spend eternity in the furnaces of hell if she mixed milk and meat. Her attachment to her Christianity was fierce and then suddenly gone: “I had a total certitude about why the world was put together, the way that it was put together, what happens after you die. It made all these answers completely clear.” She left her faith and Mississippi around the same time, and wound up with a hole that those identities used to occupy.

Her answerless fiction is a new way of working through those big questions; it’s also gorgeously anti-solution—those “viewers” who witness An Account of My Abduction have been conned by the art world into believing that revelation is the end point of any narrative. “I’m constructing this whole fictional thing because it feels like the only way to clearly convey something that I’m feeling,” Lacey noted with a head shake and some laughing exasperation, “which is ridiculous.”

As if its main conceit isn’t distorting enough, Biography of X also steps through a side door to present a bizarro alternative version of American history. Just months after X’s birth, in 1945, the novel’s America split into three big chunks: the libertarian Western Territory, the socialist Northern Territory, and the theocratic Southern Territory, which covertly built a wall and locked itself in. The latter didn’t reunite with the rest of the country until weeks after X’s death, in 1996—“as if her very existence were tethered to that dangerous, doomed boundary.” This lets Lacey imagine a South that could physically trap X as a child, and hint that X might be so powerful as to bend the world, once out of her control, to her whim.

Lacey’s characters usually don’t escape the South intact. (“I felt wrong there,” Lacey offered—an idea she repeated to me over and over.) In The Answers, a girl is entirely isolated on a farm in Tennessee with her radical-Christian father and simpering mother, fed Bible verses and kept blind to pop culture, that American golden calf. She wanders adulthood in search of experiences that might make her a full person. Pew revolves around someone with no discernible age, gender, or race, no background or history—found in a church in a small, unnamed southern town, where citizens fight to decide whether the vulnerable stranger should be sheltered, as Jesus commands, or rejected. Pew ends up “alone” and “gone” and entirely unaccounted for. In Lacey’s South, the region’s external pressure to conform produces irreparably cracked identities.

[Read: She never meant to write a memoir]

Early on, CM learns that X is one of a very small number of people who escaped the Southern Territory, where, as along the Berlin Wall, armed guards shot down anyone caught crossing. X’s birth identity, it turns out, is Carrie Lu Walker of Byhalia, Mississippi (75 miles from Lacey’s Tupelo); her childhood of purged libraries, global isolation, church-house education, and female submission radicalized her into rebellion and then escape. X’s manifestos embrace the notion that “art is an expression of the society from which it emerges.” And the revelations about X’s childhood give CM the feeling that she is making progress toward understanding her wife’s work, now “more folded with meaning and complication.” For X, a refugee from religious tyranny, the act of self-creation is about addition, not subtraction.

This alternative America is a distorted version of our own, ratcheted up just enough that it reads like a dream state. The result is pleasantly disorienting; it gives the feeling that history is operated by a series of levers, and that fiction can yank on some of them to spit out varied, unruly results. If art kick-starts a “total, ongoing delusion,” as X writes, then Lacey understands that setting her work inside a prototype of a slightly different world—with the socialist Emma Goldman as an architect of the American economy, with Wassily Kandinsky and Jackson Pollock and Alexander Calder killed off so that “women were seen as the sex to whom ‘art’ belonged,” with reparations paid for the descendants of the enslaved—keeps the ground just unsteady enough that certainty floats away.

Janet Malcolm has called biography “the arrogant desire to impose a narrative on the stray bits and pieces of a life.” Biography of X came from an experiment designed to amplify that notion by turning it on its head: Use a novel to create a fake biography, then splice in enough of those “bits and pieces of a life” to make it seem real even as Lacey never loses sight of the artifice of it all.

Readers might feel the impulse to parse CM’s reporting for some base truth, but they’d be missing the point. Practically speaking—and Lacey is a devotee of practicality, meticulously explaining to me how each decision in the novel resulted from a set of what she called “enticing boundaries” she’d set for herself—the book is a highly stylized crossbreed of genres. A set of footnotes cites imaginary magazine articles, interviews, and profiles about X by real-life writers such as Joshua Rivkin, Naomi Fry, Hermione Hoby, and Renata Adler. Some of them swirl our reality with the book’s—Chris Kraus’s After Kathy Acker, her biography of the punk writer, is cited, but with a publication date 15 years early; there’s a magazine (perhaps a cousin of this one) called The Atlantic Coast; the artist Alex Prager films a documentary about a seditious librarian in the Southern Territory. The second set of annotations is Lacey’s 13 pages of endnotes. They cite the parts of our world that she’s collaged into hers: a Tom Waits speech that X recites verbatim; a character’s murder that’s modeled on the assassination of Kim Jong Un’s brother; a quote from Lacey’s own first novel, Nobody Is Ever Missing, that she attributes to another one of X’s personas, called Angel Thornbird.

The writer David Shields, whose book Reality Hunger employed written collage to illustrate the power of creative borrowing, counseled Lacey to leave those annotations out and let the audience wonder. “But I’m not really trying to get away with something,” she said, while we ordered tea after lunch. It’s vital to her that readers see the wires she crossed and the easy co-opting of one reality for another. What better material to screw with than what people already believe to be indubitable? For Lacey, fiction—and biography—aren’t precious little feats to be preserved in formaldehyde. “The more you buy into the idea that you are somehow the entity that’s really responsible for your work, the unhappier you are,” she said. She wishes her own name weren’t on her novels, and claims she isn’t the authority on them. The self can’t be siphoned off from the work, but it needn’t be the work. “X believed that making fiction was sacred,” CM writes, “and she wanted to live in that sanctity, not to be fooled by the flimsiness of perceived reality, which was nothing more than a story that had fooled most of the world.” Biography of X moves past autofiction: The reality of a personal history is no more reliable than the uncertainty of fiction.

[Read: Six books that will change how you look at art]

By sheer luck, the artist Alex Prager, known for her staged, cinematic photos, and now grafted into Lacey’s book, had a new multimedia show that had just opened, and Lacey and I took the C train up to Chelsea to catch it. Part Two: Run! was set in one of Prager’s signature simulacrums: a movie set so luminous and sharp-cornered that it was obviously constructed for the camera. In a short film, four aggressively wigged and costumed actors—to me, all Xs in their invented selves—pushed a giant pinball down the set’s street; it mowed down everyone in its path, but they were miraculously resurrected, standing back up, brushing themselves off. There was a pinball machine too, so observers could implicate themselves: Neither of us was any good. And in the corner, there was a sculptural installation in which a life-size “body,” in a demure gingham dress and sensible heels, lay crushed under the movie’s mammoth mirrored ball. Where the head ought to have been was the ball, and the reflection created a continuation of the body, another body, another self.

While we watched the film, Lacey wondered out loud how the camera wasn’t captured in the ball’s reflection—an artist concerned with the technicality of craft. Standing in front of the orb, we could easily see ourselves, made small but still present, us and another version of us. For a moment, I could imagine that Lacey’s reflection would simply walk away without her.

What Really Broke the Banks

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 03 › svb-collapse-covid-relief-measures-interest-rates-inflation › 673481

When the Federal Reserve board last met, at the end of January, its main concern was whether it needed to continue hiking interest rates aggressively in order to bring down inflation. When it met yesterday, it had a whole new pile of concerns, including, most importantly, whether further interest-rate hikes would destabilize more banks and aggravate the mini banking crisis we’ve been living through since the failure of Silicon Valley Bank on March 10. Those concerns help explain why, even with inflation still high, the Fed chose to raise rates only a quarter of a point.

The fact that six weeks ago almost no one was talking about banks’ balance sheets, let alone bank runs, and today everyone is makes it seem as though this crisis came out of nowhere. But its true origins go back almost exactly three years, to spring 2020. The banking system’s current woes are in a real sense a product of the pandemic.

After COVID-19 hit the U.S., bank deposits soared. The pandemic-relief measures—including stimulus payments, expanded unemployment insurance, and Paycheck Protection Program funds—put more money in people’s hands, even as consumer spending fell. At the same time, businesses cut back sharply on spending and investment. The result was a flood of money into the banking system. In 2020 alone, according to the Federal Deposit Insurance Corporation, bank deposits rose by 21.7 percent, the largest increase since the 1940s. The following year, deposits rose by another 10.7 percent. At the end of 2021, total bank deposits were an astonishing $4.4 trillion greater than they’d been just two years earlier.

[James Surowiecki: Don’t read his lips]

You might think that would have been a good thing for banks, because it meant they had more money to play with. The problem was that they didn’t have anything useful to do with much of that money. Deposits, it’s important to remember, aren’t capital invested in a bank’s business; they’re loans from depositors. For deposits to be profitable for banks, the banks need to reinvest the money.

Unfortunately for bankers, business demand for loans plummeted in 2020, owing to the uncertainty created by the pandemic, and demand recovered only slowly in 2021. And although the mortgage market bounced back quickly, there were only so many 30-year mortgages—which also happened to be at historically low interest rates—that banks could write.

Banks could have stopped accepting deposits, or started paying negative interest rates—actually charging customers for having the bank hold their money. But they didn’t. So they ended up with huge piles of cash sitting in their virtual vaults, which they wanted to put to work.

The low-risk, and most sensible, strategy would have been to put most of that money into highly liquid, low-interest-rate short-term investments (such as Treasury notes). But that would have reduced banks’ interest income, and therefore their profits. Instead, a lot of banks put many billions of dollars into long-term bonds or safe mortgage-backed securities, which offered somewhat higher yields and had no risk of defaulting. As the headline on a November 2021 New York Times article put it, banks were “bingeing on bonds.”

This was not an especially lucrative strategy, but it seemed like the best of banks’ not-good options. As the subheading of that same article noted, banks “have little choice but to buy up government debt, even if it means skimpy profits.”

[Annie Lowrey: Silicon Valley Bank’s failure is now everyone’s problem]

The strategy had one obvious downside: It exposed banks to a huge amount of what economists call “interest-rate risk.” When interest rates rise, the value of bonds falls. If inflation—and therefore interest rates—spiked, all of those low-interest government bonds and mortgage-backed securities were going to be worth a lot less than the banks had paid for them. But in 2020, and even in early 2021, that outcome seemed to almost everyone, including the Federal Reserve itself, very unlikely.
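The inverse relationship between rates and bond values that this paragraph describes can be sketched with the standard present-value formula for a fixed-rate bond. The numbers below are purely illustrative (a hypothetical 10-year bond with a 1.5 percent coupon, repriced after yields jump to 4.5 percent), not the actual holdings of any bank:

```python
def bond_price(face, coupon_rate, years, yield_rate):
    """Present value of a bond paying annual coupons: discount each
    coupon and the final principal repayment at the prevailing yield."""
    coupon = face * coupon_rate
    price = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    price += face / (1 + yield_rate) ** years
    return price

# Bought at par when yields were ~1.5 percent:
price_then = bond_price(100, 0.015, 10, 0.015)  # 100 (par)

# Repriced after yields rise to ~4.5 percent: the fixed 1.5 percent
# coupons are now discounted more heavily, so the bond trades well
# below par -- in this illustration, a loss of roughly a quarter of
# its value for anyone forced to sell before maturity.
price_now = bond_price(100, 0.015, 10, 0.045)
```

The longer the bond's maturity, the more discounting periods each payment passes through, which is why long-term bonds carry the most interest-rate risk — and why holding to maturity (as the article notes later) still returns the full face value.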

Banks, you might say, had been lulled into a false sense of security by years of low inflation and near-zero interest rates: They were operating on the assumption that, for many years to come, inflation would remain quiescent, and interest rates would stay low. Accordingly, banks made what now seems like an obviously foolish decision: taking hundreds of billions of dollars in deposits and putting them into long-term bonds yielding only a couple of percentage points. Now that inflation has returned and the Fed has jacked up interest rates, banks find themselves sitting on piles of bonds that are worth far less than they once were. As a result, their balance sheets are much weaker than they had previously appeared to be.

This doesn’t mean the banking system as a whole is in crisis. In contrast to the situation in 2008, when banks had made trillions of dollars’ worth of bad loans, the government bonds and agency-backed securities that banks own today are not in danger of default: Whoever holds them to maturity will get their money back. And the system as a whole is still reasonably well capitalized and has plenty of cash on hand. But individual banks, particularly those that, like the already failed Silicon Valley Bank and Signature Bank, took in lots of money from companies that now need cash and from depositors who will pull out their money at the slightest sign of trouble, are at risk. In turn, what regulators are obviously most concerned about is the specter of more bank runs, which can bring down even well-capitalized banks.

[Derek Thompson: The end of Silicon Valley Bank—and a Silicon Valley myth]

There’s plenty of blame to go around for this situation. The Fed was late in recognizing the risk of inflation, which has forced it to raise interest rates steeply over the past year. Banks, meanwhile, weren’t forced to buy long-term bonds: They chose to, because they were largely oblivious of the interest-rate risk they were running. And the banks that have already collapsed were especially reckless in the way they concentrated their business in the tech and crypto industries—seemingly with no thought of what would happen if the investment bubbles in those businesses burst. Finally, bank regulators did not do enough to intervene to force mid-tier banks such as SVB to manage their exposure better, something they’d neglected to do on their own.

Yet this is not just a story of bad decisions made out of greed or carelessness. It’s really the story of how the pandemic brought an end to the era of low inflation and near-zero interest rates, and how long it took for even savvy financial institutions to realize how much things had changed. The coronavirus outbreak, it turns out, was a colossal shock to not just our public-health system but also our financial system. We’re still feeling its effects today.