Andrew Tate Is Haunting YouTube

Andrew Tate, the anti-feminist influencer with a reputation for hating women, smoking enormous cigars all the time, and claiming to punch a piece of wood 1,000 times a day, was banned from Facebook, Instagram, TikTok, Twitch, and YouTube last August for hate speech; he was then arrested in December and charged with various crimes, including rape, in connection with an alleged sex-trafficking operation. (Tate has insisted he’s innocent.) Weirdly, he has not really gone away. Teachers are still worried about the influence his horrible ideas about women-as-property are having on teenage boys. And his face is still all over the internet, because his fans (and some detractors) simply keep re-uploading it, over and over.

In particular, users of YouTube’s new-ish short-form video service (obviously built to compete with TikTok) say they haven’t been able to get away from Tate. Although YouTube doesn’t allow users to repost old videos from Tate’s banned channel, people are free to share clips of him from other sources. On Reddit, you can scroll through many versions of the same question: “Is there any way to stop seeing any Andrew Tate content?” You might find some commiseration (“every other clip is one from this moron”), but you won’t find many satisfying answers. Many of the people posting about Tate’s ability to lurk in the YouTube Shorts feed claim they are not doing anything that would indicate they are interested in seeing him. (“Most of the things I watch on YouTube are related to music production, digital painting, some fashion history, asmr and light content to relax,” one Reddit commenter wrote, perplexed.) Others said they are giving explicit feedback that seems to be ignored: “I press ‘do not recommend’ every time I get his content recommended but nothing works. Do I just need to stay off social media until he dies?”

Tate’s continued presence is a mystery to users, but it makes some sense in the context of YouTube’s current struggles. Shorts are part of YouTube’s effort to win video watchers over from TikTok and reverse a decline in ad revenue. Tate was a huge, rich star, which means that reposting clips of him, or even clips of criticism or parodies of him, is a reliable way for low-effort content creators to win engagement and potentially profit.

Last fall, YouTube announced that it would bring ads to Shorts and share revenue with creators, which was necessary if it was going to woo talent that could meaningfully compete with TikTok’s. This revenue-sharing program finally got started at the beginning of February, as noted in a Media Matters report that argued Shorts was “rife with anti-LGBTQ vitriol, racism, misogyny, and COVID-19 misinformation.” The report quoted a tweet from the popular YouTuber Hank Green, whose channel, Vlogbrothers (which he shares with his brother, the author John Green), has more than 3.5 million subscribers. Green explicitly said that he felt the YouTube Shorts recommendation algorithm was worse than TikTok’s: “It’s like, ‘we’ve noticed you like physics, might I interest you in some men’s rights?’”

TikTok, for all its problems, has strong norms of creativity and is always evolving. YouTube Shorts, its users seem to be saying, is somehow glitching and has gotten stuck on the recent, disgusting past. And though regurgitated clips of Tate can certainly still be found on other platforms, none of them has created the impression of unduly stalking its users with his content. (It’s impossible to know whether this is because their algorithms are actually superior or simply because these sites have less Tate content relative to everything else.) The YouTube spokesperson Elena Hernandez wrote, in an emailed statement, “YouTube’s recommendation system takes into account a wide range of signals both personalized for the viewer and at scale from activity across the platform. We also offer people control over their recommendations, including the ability to block a specific video or channel from being recommended to them in the future. Because of this, no two viewers’ experiences are the same. We’re always improving our recommendation system to help people find content they want to watch and find valuable.”

Although YouTube has published some details about how its recommendation system works, users are still left to make guesses, based on their own anecdotal experiences, about what’s going on behind the screen—a practice some researchers refer to as the creation of “algorithmic folklore.” And when they share these guesses in public, they contribute to the shared impression that the YouTube Shorts algorithm is inexplicably dogged in its efforts to show users offensive content. (“YouTube shorts are nasty. You simply can’t downvote the misogyny off the algorithm.”) Brad Overbey, a 35-year-old YouTuber with a popular gaming channel, told me that he thinks he sees Tate and other misogynistic content in his YouTube Shorts feed because of his demographic profile: white, high-income, tech-savvy, from Texas. “That puts me in the misogyny pipeline,” he said. He spent about a week trying to correct the recommendations by disliking things and blocking accounts, but he didn’t notice a change. “I don’t even fool with it anymore,” he told me.

Overbey at least had a theory as to why he was getting Tate fan content. Lux Houle, a 22-year-old YouTube user who mostly watches comedy sketches and cooking videos, told me she had no idea why she was seeing it. “I started disliking and hiding the accounts and saying ‘Don’t show this stuff to me,’ but it just kept going and going and going,” she said. “It’s always these really small accounts with 7,000 followers, but it will have 100,000 likes on the Short. I’m always really confused by that.”

I told her that I wondered whether part of the problem might be that people react emotionally to recommendations that a semi-anthropomorphized algorithm makes specifically for them: What does it say about me that “my” algorithm thinks I want to see this? I asked if she had engaged with Tate videos by watching them out of morbid curiosity or hate-sharing them with friends, which could have given the system signals that she didn’t intend or wouldn’t remember. “I think at first I did watch one or two, because I just didn’t know what it was,” she said. “Now I’ve gotten to the point where I can detect his voice or his face. I’ll scroll past immediately.”

After speaking with Houle, I checked my own YouTube Shorts recommendations to see if they would be equally strange, but they were fine—almost all clips of celebrity interviews, probably because I mostly use YouTube to watch music videos and Architectural Digest tours of actors’ homes. It wasn’t until I logged out of my account and used Shorts as a generic first-time user of YouTube that I saw any creepy content. As Overbey had told me, it was about once every eight to 10 videos. I would get a clip of a cool soccer trick, then one of an old lady cooking, then somebody painting a very detailed portrait of Scrooge McDuck. Kittens, skateboarders, a Pomeranian in a bathtub. Then, after “Watch my mouse grow up!” there was Tate, eating a cake shaped like a Bugatti—22 million views.

I’m creating my own algorithmic folklore here, but my best guess is that, because Tate was so popular, accounts posting Tate content have ended up among the default categories the algorithm will pull from if it doesn’t know much about what a user wants to see. It fits right in with the rest of the lowest-common-denominator content—it’s just more surprising and memorable.
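To make that guess a bit more concrete, here is a minimal, purely hypothetical sketch of how a cold-start fallback like the one described above might behave: with no viewing history to go on, a toy recommender samples from globally popular default pools in proportion to past engagement. The pool names, weights, and function are invented for illustration and say nothing about how YouTube's system actually works.

```python
# Illustrative sketch only: a toy cold-start recommender that falls back to
# globally popular "default" pools when it knows nothing about the viewer.
# All pool names and weights are hypothetical, not YouTube's.
import random

# Hypothetical default pools, each weighted by aggregate past engagement.
DEFAULT_POOLS = {
    "sports_tricks": 0.20,
    "cooking": 0.20,
    "art_and_crafts": 0.15,
    "cute_animals": 0.25,
    "viral_personalities": 0.20,  # heavy past engagement keeps this pool in rotation
}

def recommend_for_new_user(history: list[str], n: int = 10) -> list[str]:
    """Return n pool labels; with no history, sample purely by global popularity."""
    if not history:  # cold start: nothing personal to go on
        pools = list(DEFAULT_POOLS)
        weights = [DEFAULT_POOLS[p] for p in pools]
        return random.choices(pools, weights=weights, k=n)
    # With history, a real system would personalize; omitted in this sketch.
    return history[:n]

if __name__ == "__main__":
    print(recommend_for_new_user(history=[], n=10))
```

Run as-is, it prints ten pool labels; with those made-up weights, roughly one slot in five comes from the high-engagement pool, regardless of who is watching.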

When I spoke with Manoel Horta Ribeiro, a fourth-year Ph.D. student at the Swiss Federal Institute of Technology in Lausanne who studies online misogyny and radicalization, he told me it would be very difficult, without access to YouTube’s data, to say whether anything is off with regard to Shorts’ recommendations. No matter how many people complain that Tate is being shown to them for no reason, there’s no way for an onlooker to know for sure. “You suggest that there’s, like, a baseline value that is how much this content should be amplified, and that this content was amplified above this baseline level,” he said. “But I think that the big problem is that it’s very hard to define the baseline level.” This is one reason researchers are still debating the precise role that YouTube’s recommendation algorithms may play in promoting extremism, misinformation, and other problematic content.
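Ribeiro's baseline problem is easy to illustrate with invented numbers: the same recommendation share can look like amplification against one baseline and suppression against another. Everything in the snippet below is hypothetical; it is only meant to show that the choice of baseline, not the data, does most of the work.

```python
# Illustrative sketch of the baseline problem, with made-up numbers
# (not real YouTube data): measured "amplification" depends entirely
# on which baseline you divide by, and reasonable baselines disagree.

tate_share_of_uploads = 0.002          # 0.2% of all Shorts uploaded
tate_share_of_watch_time = 0.010       # 1.0% of all watch time
tate_share_of_recommendations = 0.008  # 0.8% of recommendation slots

# Baseline 1: recommendations should be proportional to uploads.
amp_vs_uploads = tate_share_of_recommendations / tate_share_of_uploads        # 4.0x "amplified"

# Baseline 2: recommendations should be proportional to what people already watch.
amp_vs_watch_time = tate_share_of_recommendations / tate_share_of_watch_time  # 0.8x "suppressed"

print(f"vs. uploads baseline:    {amp_vs_uploads:.1f}x")
print(f"vs. watch-time baseline: {amp_vs_watch_time:.1f}x")
```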

The explanation behind the lingering ghost of Andrew Tate is probably simpler than it appears to the people who are sick of seeing him: It’s just the long half-life of internet trash. A ban doesn’t solve a content-moderation problem immediately; it just makes it more convoluted. Tate’s star rose on TikTok, which has had its own problems getting rid of his content even after it banned him, but YouTube Shorts now has the reputation of being his post-arrest home. YouTube has always been criticized for misogyny so pervasive that it is difficult or impossible to check, even with stricter and more specific rules. Now it’s also getting dinged for trying and failing to capture the magic of TikTok-style algorithmic recommendation—in mimicking something contemporary and popular and applying it to a site with nearly two decades of thorny history, it has inadvertently highlighted just how central misogyny has always been to its own culture.

To some extent, Ribeiro argued, the recommendation of distasteful content is “core to the concept of social media”: Anyone with any interest, no matter how niche, can find creators and content that pertain to that interest; everything will be surfaced. The problem with that is how difficult it becomes to understand, by working backwards, what people care about. You don’t want to see Andrew Tate, and you hope that no one else does, either. But do you really know whether his ideas are unpopular?