People Aren’t Falling for AI Trump Photos (Yet)

The Atlantic

www.theatlantic.com/technology/archive/2023/03/ai-generated-fake-trump-indictment-images/673513

On Monday, as Americans considered the possibility of a Donald Trump indictment and a presidential perp walk, Eliot Higgins brought the hypothetical to life. Higgins, the founder of Bellingcat, an open-source investigations group, asked the latest version of the generative-AI art tool Midjourney to illustrate the spectacle of a Trump arrest. It pumped out vivid photos of a sea of police officers dragging the 45th president to the ground.

Higgins didn’t stop there. He generated a series of images that became more and more absurd: Donald Trump Jr. and Melania Trump screaming at a throng of arresting officers; Trump weeping in the courtroom, pumping iron with his fellow prisoners, mopping a jailhouse latrine, and eventually breaking out of prison through a sewer on a rainy evening. The story, which Higgins tweeted over the course of two days, ends with Trump crying at a McDonald’s in his orange jumpsuit.

[Embedded tweet by Eliot Higgins (@EliotHiggins), March 21, 2023: pic.twitter.com/V6Y8hHnGRN]

All of the tweets are compelling, but only the scene of Trump’s arrest went mega viral, garnering 5.7 million views as of this morning. People immediately started wringing their hands over the possibility of Higgins’s creations duping unsuspecting audiences into thinking that Trump had actually been arrested, or leading to the downfall of our legal system. “Many people have copied Eliot’s AI generated images of Trump getting arrested and some are sharing them as real. Others have generated lots of similar images and new ones keep appearing. Please stop this,” the popular debunking account HoaxEye tweeted. “In 10 years the legal system will not accept any form of first or second hand evidence that isn’t on scene at the time of arrest,” an anonymous Twitter user fretted. “The only trusted word will be of the arresting officer and the polygraph. the legal system will be stifled by forgery/falsified evidence.”

This fear, though understandable, draws on an imagined dystopian future that’s rooted in the concerns of the past rather than the realities of our strange present. People seem eager to ascribe to AI imagery a persuasion power it hasn’t yet demonstrated. Rather than imagine emergent ways that these tools will be disruptive, alarmists draw on misinformation tropes from the earlier days of the social web, when lo-fi hoaxes routinely went viral.

These concerns do not match the reality of the broad response to Higgins’s thread. Some people shared the images simply because they thought they were funny. Others remarked on how much better AI-art tools have gotten in such a short amount of time. As the writer Parker Molloy noted, the first version of Midjourney, which was initially tested in March 2022, could barely render famous faces and was full of surrealist glitches. Version five, which Higgins used, launched in beta just last week and still has trouble with hands and small details, but it was able to re-create a near-photorealistic imagining of the arrest in the style of a press photo.

[Read: The Trump AI deepfakes had an unintended side effect]

But despite those technological leaps, very few people seem to genuinely believe that Higgins’s AI images are real. That may be a consequence, partially, of the sheer volume of fake AI Trump-arrest images that filled Twitter this week. If you examine the quote tweets and comments on these images, what emerges is not a gullible reaction but a skeptical one. In one instance of a junk account trying to pass off the photos as real, a random Twitter user responded by pointing out the image’s flaws and inconsistencies: “Legs, fingers, uniforms, any other intricate details when you look closely. I’d say you people have literal rocks for brains but I’d be insulting the rocks.”

I asked Higgins, who is himself a skilled online investigator and debunker, what he makes of the response. “It seems most people mad about it are people who think other people might think they’re real,” he told me over email. (Higgins also said that his Midjourney access has been revoked, and BuzzFeed News reported that users are no longer able to prompt the art tool using the word “arrested.” Midjourney did not immediately respond to a request for comment.)

The attitude Higgins described tracks with research published last month by the academic journal New Media & Society, which found that “the strongest, and most reliable, predictor of perceived danger of misinformation was the perception that others are more vulnerable to misinformation than the self”—a phenomenon called the third-person effect. The study found that participants who reported being more worried about misinformation were also more likely to share alarmist narratives and warnings about misinformation. A previous study on the third-person effect also found that increased social-media engagement tends to heighten both the third-person effect and, indirectly, people’s confidence in their own knowledge of a subject.

The Trump-AI-art news cycle seems like the perfect illustration of these phenomena. It is a true pseudo-event: A fake image enters the world; concerned people amplify it and decry it as dangerous to a perceived vulnerable audience that may or may not exist; news stories echo these concerns.

There are plenty of real reasons to be worried about the rise of generative AI, which can reliably churn out convincing-sounding text that’s actually riddled with factual errors. AI art, video, and sound tools all have the potential to create basically any mix of “deepfaked” media you can imagine. And these tools are getting better at producing realistic outputs at a near-exponential rate. It’s entirely possible that the fears of future reality-blurring misinformation campaigns or impersonation may prove prophetic.

But the Trump-arrest photos also reveal how conversations about the potential threats of synthetic media tend to draw on generalized fears that news consumers can and will fall for anything—tropes that have persisted even as we’ve become used to living in an untrustworthy social-media environment. These tropes aren’t all well founded: Not everyone was exposed to Russian trolls, not all Americans live in filter bubbles, and, as researchers have shown, not all fake-news sites are that influential. There are countless examples of awful, preposterous, and popular conspiracy theories thriving online, but they tend to be less lazy, dashed-off lies than intricate examples of world building. They stem from deep-rooted ideologies or a consensus that forms in one’s political or social circles. When it comes to nascent technologies such as generative AI and large language models, it’s possible that the real concern will be an entirely new set of bad behaviors we haven’t encountered yet.

[Read: The prophecies of Q]

Chris Moran, the head of editorial innovation at The Guardian, offered one such example. Last week, his team was contacted by a researcher asking why the paper had deleted a specific article from its archive. Moran and his team checked and discovered that the article in question hadn’t been deleted, because it had never been written or published: ChatGPT had hallucinated the article entirely. (Moran declined to share any details about the article. My colleague Ian Bogost encountered something similar recently when he asked ChatGPT to find an Atlantic story about tacos: It fabricated the headline “The Enduring Appeal of Tacos,” supposedly by Amanda Mull.)  

The situation was quickly resolved but left Moran unsettled. “Imagine this in an area prone to conspiracy theories,” he later tweeted. “These hallucinations are common. We may see a lot of conspiracies fuelled by ‘deleted’ articles that were never written.”

Moran’s example—of AIs hallucinating, and accidentally birthing conspiracy theories about cover-ups—feels like a plausible future issue, because this is precisely how sticky conspiracy theories work. The strongest conspiracies tend to allege that an event happened. They offer little proof, citing cover-ups from shadowy or powerful people and shifting the burden of proof to the debunkers. No amount of debunking will ever suffice, because it’s often impossible to prove a negative. But the Trump-arrest images are the inverse. The event in question hasn’t happened, and if it had, coverage would blanket the internet; either way, the narrative in the images is instantly disprovable. A small minority of extremely incurious and uninformed consumers might be duped by some AI photos, but chances are that even they will soon learn that the former president has not (yet) been tackled to the ground by a legion of police.

Even though Higgins was allegedly booted from Midjourney for generating the images, one way to look at his experiment is as an exercise in red-teaming: the practice of using a service adversarially in order to imagine and test how it might be exploited. “It’s been educational for people at least,” Higgins told me. “Hopefully make them think twice when they see a photo of a 3-legged Donald Trump being arrested by police with nonsense written on their hats.”

AI tools may indeed complicate and blur our already fractured sense of reality, but we would do well to have a sense of humility about how that might happen. It’s possible that, after decades of living online and across social platforms, many people may be resilient against the manipulations of synthetic media. Perhaps there is a risk that’s yet to fully take shape: It may be more effective to manipulate an existing image or doctor small details than to invent something wholesale. If, say, Trump were to be arrested out of the view of cameras, well-crafted AI-generated images claiming to be leaked law-enforcement photos may very well dupe even savvy news consumers.

Things may also get much weirder than we can imagine. Yesterday, Trump shared an AI-generated image of himself praying—a minor fabrication with some political aim that’s hard to make sense of, and that hints at the subtler ways that synthetic media might worm its way into our lives and make the process of information gathering even more confusing, exhausting, and strange.

The Trump AI Deepfakes Had an Unintended Side Effect

The Atlantic

www.theatlantic.com/culture/archive/2023/03/fake-trump-arrest-images-ai-generated-deepfakes/673510

The former president is fighting with the police. He’s yelling. He’s running. He’s resisting. Finally, he falls, that familiar sweep of hair the only thing rigid against the swirl of bodies that surround him.

When I first saw the images, I did a double take: The event they seem to depict—the arrest of Donald Trump—has been a matter of feverish anticipation this week, as a grand jury decides whether to indict the former president for hush-money payments allegedly made on his behalf to the adult film star Stormy Daniels. (Trump, that canny calibrator of public expectation, himself contributed to the fever.) Had the indictment finally come down, I wondered, and had the arrest ensued? Had Trump’s Teflon coating—so many alleged misdeeds, so few consequences—finally worn away? Pics or it didn’t happen, people say, and, well, here were the pics.

My wonderings were brief, though. Looking more closely, I noticed the blurry unreality of the people in the images: the faces that seemed, up close, only loosely face-like; the hands with not-quite fingers; the extra appendages; the missing ones. The images were not photos, but rather the results of artificial intelligence responding to that most human of prompts: impatience. Speculation over the possibility of a “perp walk” grew so intense, the British journalist Eliot Higgins decided to imagine the event using the text-to-image generator Midjourney. (His inputs: “Donald Trump falling over while getting arrested. Fibonacci Spiral. News footage.”) Higgins posted the AI’s responses on Twitter, making the just-a-joke fakery clear. Soon, they went viral. Some posts sharing the images acknowledged their AI origins; others were notably less clear. “#BREAKING: Donald J. Trump has been arrested in #Manhattan this morning!” one post read, teetering between credulity and parody. The result was an absurdity fit for the era that is shaped, still, by Trump. The AI renderings, meant to capture him at the moment of accountability, instead serve as reminders of his ongoing power. Attention is the one currency that Donald Trump has never squandered. The images of his “comeuppance” have now been viewed more than 5 million times.

The crucial element of the images is not the fact that they are misleading. It is that they are melodramatic. They present Trump’s imagined arrest in maximally cinematic terms: the fight, the flight, the fall. They lie with such swagger that, even after you realize the fakery, it becomes difficult to look away. The images channel one of the showman’s abiding insights: that spectacle, wielded well, will not merely complement reality. It will compete against it. The deepfakes, those hyperreal renderings of a thing that hasn’t happened, are arguably harmless fun, obvious jokes that bide time until real news breaks. But attention being what it is, the images put a dent in any events to come. They are agents of preemptive—and false—catharsis. Trump’s arrest hasn’t happened. Nonetheless, we’ve already seen it.

“Behind closed doors at Mar-a-Lago,” The New York Times reported this week, “the former president has told friends and associates that he welcomes the idea of being paraded by the authorities before a throng of reporters and news cameras.” He has wondered how he should play the scene—should he smile for the cameras?—and how the audience of the American public might take in the show. This speculation, too, is revealing: “We likely won’t see a classic perp walk,” my colleague David Graham noted yesterday; still, the notion of a scenic arrest has been a common one across the media coverage of Trump’s legal woes.

[Read: The cases against Trump: A guide]

As a broader news story, the probability of the former president’s arrest has similarly pitted the doubtful-but-cinematic against the probable-but-dull. Reports about the potential event have been peppered with artful cushions and caveats (“likely indictment,” “expected arrest,” etc.), vacillating between the conditional and the future tenses. Would it happen on Tuesday, as Trump himself had predicted? (No.) How about Wednesday? (No again.) Trying to keep track of it all, as a news consumer, meant being caught in unending whiplash between what has already happened, what will happen, and what merely might.

The AI images neatly channel the maybes. They also capture one of the tensions at play in an event that is both an ongoing legal proceeding and an anticipated spectacle: the public desire for catharsis chafing against the prosecutor’s desire for a winning case. Both desires, though, play games of expectation. Both rely, in their way, on shock in the moment and sustained attention in the long run. And when cinematic images are pitted against dutiful, uncertain realities, you can usually predict the victor. The pictures are very obviously fake; to see them at all, though, is to have an emotional reaction to them. If you’re one of those millions who have seen the fake arrest, the real one, if it happens, may seem like a letdown—a matter of been-there-done-that, already experienced, felt, filed away.

The hype cycle is a fickle thing. And now, as the fake images remind us, its movements can be shaped not only by human spectacles, but also by AI-generated ones. Donald Trump, wielder of fakeries, is broadly akin to AI in the threats he both poses and represents. And the images that claim to depict him in his moment of humble humanity hint at those commonalities. The savvy marketer and the savvy algorithm both eviscerate long-standing, and load-bearing, norms. They are both shocks to the system, in the near term and the long. They treat reality as merely the opening bid in an endless negotiation. And they highlight one of the truths that shapes American politics as readily as it shapes everything else: Shock is a finite resource. Because of that, even the specter of Trump’s arrest, deprived of its ability to surprise, can become the thing that Trump himself never seems to: old news.