Spotify Doesn't Know Who You Are

The Atlantic

When I think of last year, I hear “Northeast Texas Women,” by Willis Alan Ramsey. I’ve carried it with me since my father played it in the car in January 2022, him drumming the steering wheel and me the dashboard. Our windows were open to the desert outside, the air all sagebrush and sunburnt dirt. I was nostalgic for the moment even as I lived it, five minutes and 51 seconds of country music in the Mojave.

But to Spotify, my favored music-streaming app, this was nothing more than a single “play,” just one data point among countless others. When the company released its annual “Wrapped” feature—an interactive slideshow that analyzes a user’s listening habits throughout the year and packages them into cheeky graphics that are meant to be shared on social media—Willis Alan Ramsey was nowhere to be seen. The platform had crunched my numbers, aggregating thousands of minutes spent streaming on the subway, at the gym, and in the office, to arrive at its assessment of my character. It produced a withering phrase: “Pumpkin Spice.”

Having my musical taste compared to a mass-market fall flavoring didn’t sit right. I’ve since learned that I’m not alone in being unsettled by what Spotify Wrapped—the latest version of which was released yesterday—seemed to say about my character. There are lengthy Reddit threads of people worrying about their results and trying to guess how they’re calculated in the first place. Articles deliver tips on how to expunge “cringe” music from the roundup. One college newspaper reported that listeners could create an impressive playlist and loop it silently while they slept. A friend of mine recently suggested, in earnest, that I could get ahead of things by requesting my personal listening data from Spotify. I would have to wait up to a month for the company to email it to me in a spreadsheet file; then I could put that into ChatGPT and ask the chatbot questions about my music habits. What were my top five artists? Which genres did I listen to most? What are some words that describe the musical aesthetic of the dataset (DO NOT SAY “PUMPKIN SPICE”)? I didn’t, but I was tempted.

[Read: How Starbucks perfected autumn]

All of these ideas amount to algorithmic “folk theories,” as researchers call them—stories we tell ourselves about the technology that collects our data and presents some kind of compelling, inscrutable output in response. Users might assume that commenting on a TikTok prompts the algorithm to serve them similar content; I might speculate that looping Steely Dan will keep the soundtrack to High School Musical: The Musical: The Series off my Wrapped slideshow. But I’ll never really know if I’m right.

As Robyn Caplan, a social-media scholar at the Data & Society Research Institute, told me, “The gap between what we believe about algorithms and how they actually work will always remain.” Companies keep their algorithmic secret sauce under lock and key, and its precise recipe changes constantly anyway, as developers tweak and morph it for whatever mysterious reasons. But folk theories help us feel like we’re narrowing the gap between ourselves and the technology that shapes our online experience; they offer us a sense of control. And maybe when it comes to a platform like Spotify, which tries to capture something as personal as taste, people feel compelled to close another gap—between how the algorithm sees them and how they see themselves.

The central premise uniting these theories is that we can’t really tell an algorithm who we are; we have to show it. Platforms used to offer recommendations based on clear user inputs (consider that Netflix used to ask you to rate a movie out of five stars); now things have gotten murkier as our behavior is tracked and collated in complex, opaque ways. Consumers have learned to adjust their actions to get the content they want, according to Nick Seaver, an anthropology professor at Tufts University and the author of Computing Taste: Algorithms and the Makers of Music Recommendation. “You were much more in control of how you represented yourself under those [earlier] systems,” Seaver told me. Now our behavior—even the embarrassing kind—generates our unique media world.

[Read: What will happen to my music library when Spotify dies?]

Just as the machine tailors itself to us, we try to tailor ourselves to it. This could mean deliberately streaming an album so that Spotify “knows” we like that artist or listening to some “tasteful” music to remind it that our preferences extend beyond guilty pleasures. It’s not just Spotify: People do this kind of thing on TikTok all the time, lingering on a BookTok video if they want more literary content, and so on.

This preoccupation with tweaking the algorithm reflects a belief that it’s saying something meaningful about who we are. Jeff Hancock, a communications professor at Stanford University, told me he calls it the “algorithmic mirror”—the assumption that whatever the technology spits back at us tells us something true about ourselves. A user might get served an ad for knitting needles, something they’d never used before, and think, I actually would like to knit!

Research has shown that people can be so swayed by an algorithm’s read of their personality that they will justify complete mischaracterizations. Motahhare Eslami, a computer-science professor at Carnegie Mellon University, conducted a study in 2018 on how we process algorithmic communication. She explained to me that one person she spoke with kept receiving ads that were tailored toward living in New York City, even though the participant did not. Rather than assume that the program was making a mistake, the individual contrived an explanation: The algorithm thought she was interested in New York because she’d been watching a lot of Friends.

Spotify Wrapped leans into this mystique. It’s not just a calculation of the songs you’ve played—it’s the “real, the realer, and the realest listening moments” from your year, according to the 2023 marketing campaign. Of course, that’s not really true. The realest moments aren’t the ones when Spotify and its algorithmically curated playlists are just filling dead air or getting me through my commute. Maybe I’m pumpkin spice in the in-between moments of my life that I dot with John Mayer (apologies to my fellow women) and Olivia Rodrigo. But when I look away from the algorithmic mirror, I see my dad in the driver’s seat and hear Willis Alan Ramsey coming through the speakers. And I feel like myself again.