
Should You Delete Your Kid’s TikTok This Week?

The Atlantic

This week, a teenager might open up their TikTok feed and immediately be served a video about a hairbrush that promises to gently detangle the roughest of tangles. Or a clip about Travis Kelce and Taylor Swift’s rumored romance. Or the app could show them a scene from the Israeli Supernova music festival, where on Saturday a young woman named Noa Argamani was put on the back of a motorcycle as her boyfriend was held by captors.

Footage from Hamas’s surprise attack on Israel, and the retaliatory strikes it has prompted, is appearing in social-media feeds across the world. Videos about the conflict have drawn billions of views on TikTok alone, according to The Washington Post, and queries related to it have appeared in the app’s trending searches. Hamas reportedly posted footage of the murder of one grandmother to her own Facebook page.

Hamas reportedly captured about 150 hostages, and has threatened to execute them. Some schools in Israel and the United States have asked that parents preemptively delete social-media apps from their children’s devices in order to protect them from the possibility of seeing clips in which hostages beg for their lives. “Together with other Jewish day schools, we are warning parents to disable social media apps such as Instagram, X, and Tiktok from their children’s phones,” reads one such statement, posted by The Wall Street Journal’s Joanna Stern. “Graphic and often misleading information is flowing freely, augmenting the fears of our students.”

Parents have good reason to be concerned. Psychologists don’t fully know how watching graphic content online can affect kids. But “there’s enough circumstantial evidence suggesting that it’s not healthy from a mental-health standpoint,” Meredith Gasner, a psychologist at Boston Children’s Hospital, told me, citing research on the viral videos of George Floyd’s death in police custody.

Of course, kids have long been at risk of encountering disturbing or graphic content on social media. But the current era of single feeds serving short videos selected by algorithms, sometimes with little apparent logic, potentially changes the calculus. Firing up TikTok feels like pulling the lever of a content slot machine; every time a user opens the app, they don’t necessarily know whether they’ll find comedy or horror. Lots of kids are pulling the lever many times a day, sometimes spending hours in the app. Nor is this just a TikTok problem: Instagram and YouTube, among other platforms, have their own TikTok-like feeds. Much of the material on these platforms is benign, but on weeks like this one, when even adults may have trouble stomaching the visuals they encounter, the idea that children are all over social media is particularly unsettling.

If hostage videos appear, the social-media platforms are hypothetically in a position to prevent them from going viral. A spokesperson for TikTok did not respond to a request for comment, but the platform’s community guidelines forbid use of the platform “to threaten or incite violence, or to promote violent extremism,” and the website says that the company works to detect and remove such content. Instagram, for its part, also moderates “videos of intense, graphic violence,” and has established a special-operations center staffed with experts to monitor the situation in Israel, a spokesperson for Meta said in an email. Both platforms offer safety tools for parents. Still, social-media platforms’ track record when it comes to content moderation is abysmal. Some videos that are upsetting to children may find their way onto the apps, especially those posted by reputable news outlets.

I talked to eight experts on children and the internet who told me that deleting social-media apps unilaterally might not work. For one thing, TikTok and Instagram videos are often cross-posted on other platforms, like YouTube Shorts, so you’d have to delete a lot of apps to create a true bubble. (And even so, that might not be impenetrable.) Kicking your teen off social media, even temporarily, may also feel like a punishment to your kid, who did nothing wrong.

But that doesn’t mean that parents are helpless. A better approach, experts told me, is for parents to be more open and communicative with their kids. “Having that open dialogue is key because they’re not really going to be able to escape what’s going on,” Laura Ordoñez, head of digital content and curation at Common Sense Media, a nonprofit that advocates for a safer digital world for kids and families, told me. Even if children can avoid videos of violence, the realities those videos represent still exist.

Families with a direct connection to the region may have a tougher time navigating the next few days than those without one. And age matters a lot, the experts said. Younger kids, particularly those in second grade or below, should be protected from watching upsetting videos as much as possible, said Heather Kirkorian, the director of the Cognitive Development and Media Lab at the University of Wisconsin at Madison. They’re too young to understand what’s happening. “They don’t have the cognitive and emotional skills to understand and process,” she told me.

At those younger ages, parents can realistically keep kids off certain platforms and sites. That’s not to say, though, that they won’t hear about the war at school or have questions about it. When discussing the war with younger children, experts advise talking in kid-friendly language and, when appropriate, letting them know that they personally are safe. If the child is under 7, Ordoñez advises using “very simple and concrete explanations” like “Someone was hurt” or “People are fighting.” She also recommends that adults avoid watching or listening to news in front of children, who may overhear material that upsets them.

For older children, quarantining them from life online is rarely plausible. If you do delete TikTok from their phone, kids may just download it again or find another way to view it—by, say, using another kid’s device or a school computer. As Diana Graber, the author of Raising Humans in a Digital World, pointed out: “The minute you tell a child you can’t look at something, guess what they’re going to do?” Experts told me that a more productive approach is to ask kids questions about what they know, what they’ve seen, and how they feel. Warn them that the content they encounter may upset them, and talk to them about how it might affect them. Graber notes that a lot of kids these days are fluent in the language of mental health. If you’ve seen graphic content on your feeds, you can assume that your kid might see it, too. Julianna Miner, the author of Raising a Screen-Smart Kid, notes that “it’s important to give your kids a heads up” and to “prepare them for what they might see.” After that, you can “give them the choice of logging off or changing settings or taking some steps to potentially limit the types of things they could be exposed to.” This way, you’re on the same team.

In tense moments like this one, kids—like everyone else—are likely to encounter misinformation and disinformation, some of which began circulating even as the attacks were first being carried out. Bloomberg reported that a video from a different music festival in September was making the rounds on TikTok and had gotten almost 200,000 likes. For this reason, Sarita Schoenebeck, a professor at the University of Michigan who directs its Living Online Lab, recommends reminding kids that we don’t always know whether what we see online is real or fake.

In general, experts advise that parents personalize their approach to their children. Some kids are more sensitive than others, and parents know best what their own children can handle. More broadly, monitor for signs that they’re upset; that might look different depending on the child. One good rule of thumb Schoenebeck gives when advising parents about whether kids are ready for smartphones is to think about how well your child is able to self-regulate around technology. “When you say, ‘Oh, time to turn the TV off!’ or whatever, are they able to self-regulate and do that without having a fit?” she asked. Are they capable of getting through dinner without phones, or do they sneak a peek under the table? The same questions may show how ready they are to self-regulate their social-media use in upsetting times.