If you opened Facebook, Twitter or Instagram about a decade ago, you'd likely see posts from friends and family, in chronological order. Nowadays, users are hit with a barrage of content curated by an algorithm. Passionate about plants? Sports? Cats? Politics? That's what you're going to see.

"[There] are equations that measure what you're doing, surveil the data of all the users on these platforms and then try to predict what each person is most likely to engage with," New Yorker writer Kyle Chayka explains. "So rather than having this neat, ordered feed, you have this feed that's constantly trying to guess what you're going to click on, what you're going to read, what you're going to watch or listen to."

In his new book, Filterworld, Chayka examines the algorithmic recommendations that dictate everything from the music, news and movies we consume, to the foods we eat and the places we go. He argues that all this machine-guided curation has made us docile consumers and flattened our likes and tastes.

"For us consumers, they are making us more passive just by feeding us so much stuff, by constantly recommending things that we are unlikely to click away from, that we're going to tolerate [but] not find too surprising or challenging," Chayka says.

What's more, Chayka says, the algorithms pressure artists and other content creators to shape their work in ways that fit the feeds. For musicians working through Spotify or TikTok, this might mean recording catchy hooks that occur right at the beginning of a song — when a user is most likely to hear it.

Though the algorithms can feel inescapable, Chayka says increased regulation of social media companies can mitigate their impact. "I think if Meta, Facebook's parent company, was forced to spin off some of its properties, like Instagram or WhatsApp, and those properties were made to compete against each other, then maybe users would have more agency and more choices for what they're consuming," he says.


Interview highlights

On how the internet takes power away from gatekeepers

There's this huge power of the internet to let anyone publish the art that they make or the songs that they write. And I think that's really powerful and unique. ... [In] the cultural ecosystem that we had before, there were these gatekeepers, like magazine editors or record executives or even radio station DJs, who you did have to work through to get your art heard or seen or bought. And so these were human beings who had their own biases and preferences and social networks, and they tended to block people who didn't fit with their own vision.

Now, in the algorithmic era, let's say rather than seeking to please those human gatekeepers or figure out their tastes, the metric is just how much engagement you can get on these digital platforms. So the measure of your success is how many likes did you get? How many saves did you get on TikTok or bookmarks? How many streams did you get on Spotify?

So I think there are advantages and disadvantages to both of these kinds of regimes. Like, on the internet, anyone can put out their work and anyone can get heard. But that means to succeed, you also have to placate or adapt to these algorithmic ecosystems that, I think, don't always let the most interesting work get heard or seen.

On the difficulty of knowing what's going on outside your specific algorithm

These digital platforms and feeds, they kind of promise a great communal experience, like we're connecting with all the other TikTok users or all of the other Instagram users, but I think they're actually kind of atomizing our experiences, because we can never tell what other people are seeing in their own feeds. We don't have a sense of how many other people are fans of the same thing that we are fans of or even if they're seeing the same piece of culture that we're seeing, or experiencing an album or a TV show, in the same way. So I think there's this lack of connection ... this sense that we're alone in our consumption habits and we can't come together over art in the same way, which I think is kind of deadening the experience of art and making it harder to have that kind of collective enthusiasm for specific things.

On how success on social media determines who gets book deals, TV shows and record deals

Every publisher will ask a new author, "What is your platform like? How big of a platform do you have?" Which is almost a euphemism for, "How many followers do you have online?" — whether that's [on] Twitter or Instagram or an email newsletter. They want to know that you already have an audience going into this process, that you have a built-in fan base for what you're doing. And culture doesn't always work that way. I don't think every idea should have to be so iterative that you need fans already for something to succeed, that you have to kind of engage audiences at every point in the process of something to have it be successful. So for a musician, maybe you'll get a big record deal only if you go viral on TikTok. Or if you have a hit YouTube series, maybe you'll get more gigs as an actor. There's this kind of gatekeeping effect here too, I think, where in order to get more success on algorithmic platforms, you have to start with seeding some kind of success on there already.

On how some film and TV shows lean into becoming internet memes

You can see how TV shows and movies have adapted to algorithmic feeds by the kind of one-liner, GIF-ready scenes that you see in so many TV shows and movies now. You can kind of see how a moment in a film is made to be shared on Twitter or how a certain reaction in a reality TV show, for example, is made to become a meme. And I think a lot of production choices have been influenced by that need for your piece of content to drive more pieces of content and to inspire its own reactions and riffs and more memes.

On how algorithms impact journalism

Algorithmic feeds, I think, took on the responsibility that a lot of news publications once had. ... In decades past, we would see the news stories that we consumed on a daily basis from The New York Times front page in the print paper or on The New York Times homepage on the internet. Now, instead of the publication choosing which stories are most important, which things you should see right away, the Twitter, or X, algorithmic feed is sorting out what kinds of stories you're consuming and what narratives are being built up. We now have TikTok talking heads and explainers rather than news anchors on cable TV. So the responsibility for choosing what's important, I think, has been ported over to algorithmic recommendations rather than human editors or producers.

On how passive consumption affects how deeply we think about culture

I think passive consumption certainly has its role. We are not always actively consuming culture and thinking deeply about the genius of a painting or a symphony. ... It's not something we can do all the time. But what I worry about is the passivity of consumption that we've been pushed into, the ways that we're encouraged not to think about the culture we're consuming, to not go deeper and not follow our own inclinations. ... And I suppose that when I really think about it ... the kind of horror that's at the end of all this, at least for me, is that ... we'll never have the Fellini film that's so challenging you think about it for the rest of your life or see the painting that's so strange and discomforting that it really sticks with you. Like I don't want to leave those masterpieces of art behind just because they don't immediately engage people.

Sam Briger and Susan Nyakundi produced and edited this interview for broadcast. Bridget Bentz, Molly Seavy-Nesper and Beth Novey adapted it for the web.

Copyright 2024 Fresh Air.

Transcript

TONYA MOSLEY, HOST:

This is FRESH AIR. I'm Tonya Mosley. Depending on what corners of social media you're on, chances are good you've heard this earworm of a song by the group Ocean Alley.

(SOUNDBITE OF SONG, "CONFIDENCE")

OCEAN ALLEY: (Singing) It's all about confidence, baby. She was a confident lady.

MOSLEY: The song is called "Confidence," and the Australian indie band released it five years ago. But thanks to going viral, it's having a moment right now. But whether it's having a moment on your feed, well, that's all up to the algorithm. Writer Kyle Chayka has been thinking about this for several years. In his new book, "Filterworld: How Algorithms Flattened Culture," he writes about how we are fed algorithmic recommendations that dictate what music we like, how we interpret the news, what movies we consume, even what foods we eat, clothes we wear, the language we use and the places we go. And Chayka argues that all of this machine-guided curation has made us docile consumers and flattened our likes and tastes. Kyle Chayka is a staff writer for The New Yorker, covering technology and culture on the internet. His work has also appeared in The New Republic, The New York Times Magazine and Harper's, among other publications. Chayka's first nonfiction book, "The Longing For Less: A History Of Minimalism," was published in 2020.

Kyle Chayka, welcome to FRESH AIR.

KYLE CHAYKA: Thanks so much for having me here.

MOSLEY: This is a conversation I wanted to have for the longest time, so I'm really excited that you're here. So almost about a decade ago, I guess, we could basically go on Facebook or Instagram or Twitter and scroll through the posts of everyone we followed almost chronologically, especially in those early days. Now most of what we engage in, as you write in this book, is content flowing through the algorithm, optimized for engagement and pretty much devoid of the human touch. What changed about eight or nine years ago? I guess that was around 2015, 2016?

CHAYKA: Yeah. In the earlier era of social media, most of the feeds that we were interacting with were linear, so that just meant they were chronological. They ordered all the posts that you saw from most recent to oldest, and that was just how everything was filtered. You could see it on Facebook or Instagram or whatever. And over the past decade, most of those feeds have switched to being more algorithmic or more driven by algorithmic recommendations. So these are equations that measure what you're doing, surveil the data of all the users on these platforms and then try to predict what each person is most likely to engage with. So rather than having this neat, ordered feed, you have this feed that's constantly trying to guess what you're going to click on, what you're going to read, what you're going to watch or listen to. And it feels like a kind of intrusive mind reading sometimes.

MOSLEY: I could see how all of this can make us passive consumers, but what do you mean when you say the algorithms are flattening culture?

CHAYKA: I think algorithmic recommendations are kind of influencing us in two different directions. For us consumers, they are making us more passive just by, like, feeding us so much stuff, by constantly recommending things that we are unlikely to click away from, that we're going to tolerate, not find too surprising or challenging. And then I think those algorithmic feeds are also pressuring the creators of culture, like visual artists or musicians or writers or designers, to kind of shape their work in ways that fits with how these feeds work and fits with how the algorithmic recommendations promote content.

MOSLEY: Yeah. That's why I thought that bringing up music is a really good way - a good example of how filterworld can feel like it's both expanding and contracting culture. Because...

CHAYKA: Yeah.

MOSLEY: ...You know, I never would have learned about a group like Ocean Alley otherwise. But there are these other elements that you're talking about then tailoring the work based on the algorithm and trying to go viral.

CHAYKA: Yeah. Yeah. I mean, because we consumers, like, really consume so much culture through these feeds, in order to reach audiences, creators also have to work through these feeds. Like, a musician has to work through Spotify or TikTok and kind of mold their work in a way that fits with TikTok. So that might mean, like, a really catchy hook that occurs right at the beginning of a song or packing every sound possible into the, like, 10 seconds that you have for a viral TikTok sound.

MOSLEY: One other thing I was thinking about is what I also see, though, is that the digital space has lessened the potency and power of gatekeepers, so we're no longer relying on a handful of media that dictate what is good art, what is good fashion and culture and music. Couldn't it be argued that algorithms in the digital space, more broadly, have opened up the world, though, in ways that we've never had access to before?

CHAYKA: I think they really have. Like, there's this huge power of the internet to let anyone publish the art that they make or the songs that they write. And I think that's really powerful and unique. Like, in the ecosystem - the cultural ecosystem that we had before, there were these gatekeepers like magazine editors or record executives or even radio station DJs who you did have to work through to get your art heard or seen or bought. And so these were human beings who had their own biases and preferences and social networks, and they tended to block people who didn't fit with their own vision.

And now, in the algorithmic era, let's say, rather than seeking to please those human gatekeepers or figure out their tastes, the metric is just how much engagement you can get on these digital platforms. So the measure of your success is how many likes did you get? How many saves did you get on TikTok or bookmarks? How many streams did you get on Spotify? So I think there are advantages and disadvantages to both of these kinds of regimes. Like, on the internet, anyone can put out their work, and anyone can get heard. But that means to succeed, you also have to placate or adapt to these algorithmic ecosystems that I think don't always let the most interesting work get heard or seen.

MOSLEY: I was especially fascinated by your chapter on personal taste and the ways that algorithms have disrupted our taste. You explored this by first asking the question, what is taste? It is a very human thing.

CHAYKA: Yeah. I think - I mean, taste gets a bad rap sometimes as something that can be pretentious or elitist, but I think we all have taste. Like, we all...

MOSLEY: Right.

CHAYKA: ...Have things we like and don't like, and we all think about what we like and don't like, and that's what our taste is. I think, like, what we like is also what we identify with, and it's how we connect with other people and how we build communities around culture. So I think taste is really important, and it's something that algorithmic feeds and these big digital platforms kind of allow us to ignore or encourage us to ignore just so they can keep us listening and clicking and watching.

MOSLEY: Well, as part of your exploration of taste, you wanted to see if a digital space could actually identify your taste. So in 2017, Amazon created something called the Amazon Echo Look, which tried to approximate taste by making fashion decisions for the user. And you tried it out. How...

CHAYKA: Yes.

MOSLEY: ...Did it go?

CHAYKA: Well, this was a little camera that stood on your shelf, and it could take these full-body selfies for you that would show you your full outfit. And you could have the app - the Echo Look app - send out the images and kind of algorithmically analyze them, with some human help as well, and the machine would tell you how stylish you were being or not. Like, it would purport to give you an analysis of how your outfit was. And I found that it didn't really work for me. I mean, this pushed on me - I think popped collars it was a big fan of, which I think were last fashionable when I was in middle school. It really didn't like my choice of monochrome outfits - like an all-gray outfit - which, you know, maybe that's true. Maybe that's not cool, but it's part of my personal choice, my style. To me, the biggest problem with the Echo Look was that it just kind of gave you this grade of your outfit. Like, it told you, oh, this is 75% stylish, but it couldn't really tell you why or it didn't give you the logic behind its analysis. It just kind of, like, told you whether you were going in the right direction or the wrong direction.

And that's just so antithetical to what we think of as personal style or even what we want to communicate via fashion. Like, how is this algorithm to know what you are trying to communicate with your clothes that day or how you're trying to feel out in the world? So just - I found it kind of useless as a style analysis and also just almost actively misleading or distorting the purpose of fashion, which is actually to communicate something about yourself, not to conform to some data-driven standard.

MOSLEY: And that was in 2017. I mean, several years later, now the big conversation is around generative AI and its ability to be - to predict what we like, to offer more specificity. How does that play into this conversation?

CHAYKA: Yeah. I feel like AI is, like, the looming question for all of this technology. My feeling is that algorithmic feeds and recommendations have kind of guided us into conforming to each other and kind of having this homogenization of culture where we all accept the average of what everyone's doing. We all kind of fit into these preset molds. And now AI is kind of promising to just spit out that average immediately - like to - it'll digest all of the data in the world. It'll take in every song, every image, every photograph and produce whatever you command it to. But that output will just be a complete banal average of what already exists. Like, that almost signals to me, like, a death of art or a death of innovation.

MOSLEY: If you're just joining us, I'm talking to journalist Kyle Chayka. He's a staff writer for The New Yorker and has written a new book called "Filterworld: How Algorithms Flattened Culture," which explores the impact of algorithmic technology on culture. We'll continue our conversation after a short break. This is FRESH AIR.

(SOUNDBITE OF RUDY ROYSTON'S "BED BOBBIN'")

MOSLEY: This is FRESH AIR. I'm Tonya Mosley, and today we're talking to Kyle Chayka about his new book, "Filterworld: How Algorithms Flattened Culture." Chayka is a staff writer for The New Yorker covering technology and culture on the internet. His work has also appeared in The New Republic, The New York Times Magazine and Harper's, among other publications. Chayka's first nonfiction book, "The Longing For Less," is about the history of minimalism.

OK, I want to talk about some of the other platforms where we're guided by the algorithm. In the case of streaming services, Netflix pioneered the filtering of culture through recommendation engines. What does the Netflix algorithm factor in?

CHAYKA: It factors in a lot of different things, including what movies or shows you've already watched, what other users are watching and clicking on, and also just what Netflix chooses to prioritize in a given moment. So the Netflix homepage now is really driven by algorithmic recommendations. It's always personalized to try to present you things that you are likely to watch. And that's always measuring the kinds of genres that you're watching or the actors you like or, you know, other favorites that you've shown to the system.

MOSLEY: The problem, as you write in the book and as one scholar wrote, is that it's taking away the process of making cultural meaning through decision making. We make meaning through making our own decisions about what we want to see and what we like.

CHAYKA: Yeah, I think so. I mean, the act of choosing a piece of culture to consume is a really powerful one. Like, it is an internal decision. That means we're giving our attention to a specific thing. It means we're interested in a specific category of culture. And I think it can be really empowering. But I think in the context of a Netflix homepage, it can also be completely manipulative. On the Netflix homepage, there's this problem called corrupt personalization, which is the appearance of personalization without the reality of it. And that happens with Netflix because Netflix is always changing the thumbnails of the shows and movies that you are watching in order to make them seem more appealing to you. So if you're someone...

MOSLEY: Oh, give me an example, yeah.

CHAYKA: Yeah, it's - an academic did a long-term study of this by creating a bunch of new accounts and then kind of giving them their own personalities. Like, one character, let's say, only watched romantic comedies. One character only watched sports documentaries. One only watched thrillers. And then one, like, test version watched everything at random times. But what this academic found was that Netflix would change the thumbnails of the shows to conform to that category that the user watched, even if the show was not of that category. Like, it would - say you're the sports viewer. Netflix would take a romantic comedy and put, like, the one sports scene as the thumbnail to kind of encourage you to watch it. Or, you know, in a thriller, maybe if you're a romantic comedy watcher, it would take the one frame where, like, two characters are going to kiss or something in order to make it look like this is the kind of culture you want to consume, even though it's actually not. So it's - the algorithm, in that way, is kind of manipulative and using your tastes against you.
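The mechanism that study describes can be sketched in a few lines of code; the titles, genre tags and file names below are invented for illustration, not Netflix's actual system.

    # Each title ships with several candidate thumbnails, tagged by the
    # kind of scene they depict. All names here are invented.
    THUMBNAILS = {
        "Love in the Outfield": {"romance": "kiss.jpg", "sports": "home_run.jpg"},
        "Midnight Chase": {"thriller": "standoff.jpg", "romance": "embrace.jpg"},
    }

    def pick_thumbnail(title: str, viewer_top_genre: str) -> str:
        """Show the frame that matches the viewer's habits, even when it
        misrepresents what the show is mostly about."""
        options = THUMBNAILS[title]
        # Fall back to an arbitrary frame if no genre-matched one exists.
        return options.get(viewer_top_genre, next(iter(options.values())))

    # A sports-documentary fan sees the one baseball scene in a rom-com:
    print(pick_thumbnail("Love in the Outfield", "sports"))  # home_run.jpg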

MOSLEY: You know, I'm just wondering about something, and you're someone who follows art and culture - this is what you do for a living, writing about it. If everything is recommended for us or tailored to the kinds of movies we like or the news that I like to consume or the music I like to listen to, how do I really know what's happening culturally in the world? So how do I know what's happening around me to understand if my tastes and sensibilities are running in parallel or up against what's happening?

CHAYKA: Yeah. I think that's really hard to do right now. Like, the - these digital platforms and feeds, they kind of promise a great communal experience, like we're connecting with all the other TikTok users or all of the other Instagram users, but I think they're actually kind of atomizing our experiences, because we can never tell what other people are seeing in their own feeds. We don't have a sense of how many other people are fans of the same thing that we're fans of or even if they're seeing the same piece of culture that we're seeing or experiencing an album or a TV show in the same way. So I think there's this lack of connection, like, as you're saying, this sense that we're alone in our consumption habits and we can't come together over art in the same way, which I think is kind of deadening the experience of art and making it harder to have that kind of collective enthusiasm for specific things.

MOSLEY: On the other hand, I'm someone, for instance, who - I'm a plant lover. I'm a plant mom. I'm obsessed with plant life. And so through the algorithm, it feeds me lots of content around caring for plants and facts about plants. And so there is also another community, though, I'm tapping into through that.

CHAYKA: Yeah. I think there's always - I think algorithms are essentially an ambivalent force. Like, I think you can use them to great effect. You can use them to find the people or pieces of culture that you like. But I think when we rely on them too much, that's when it becomes so overwhelming and flattening of our experiences. So in the plant department, I think it's been really cool to see communities develop around these different trends, like plants. But then you kind of see the same plants being popular everywhere you go. Like the...

MOSLEY: It's so true.

CHAYKA: ...Unavoidable fiddle leaf fig or, you know, a pothos plant. And I think, I don't know, it's hard to sustain both that community building and a sense of diversity and, like, a sense that everyone can pursue their own paths within it. It's like within these feeds, I feel like there's always one correct way to approach a thing or one correct mode of consumption. And so in plants that might be, oh, I have to go get the fiddle leaf fig or, you know, in films, I have to go see the "Barbie" movie or something...

MOSLEY: Right.

CHAYKA: ...Like that.

MOSLEY: I mean, you write about this in the book about how the flattening of culture has impacted, quote-unquote, "the real world," right? Every coffee shop has a fiddle leaf plant. And so, like, you give the example of the hipster coffee shop. You...

CHAYKA: Yes.

MOSLEY: ...Noticed something when you were traveling around the world about how they're influenced by the algorithm.

CHAYKA: Yes. I think this was in the mid-2010s or so when I was traveling around as a freelance magazine writer. I would always find a coffee shop to work in in whatever city I landed in. So whether that was Tokyo or Los Angeles or Copenhagen or Berlin, I would kind of land, go to my Airbnb, open Yelp or Google Maps and search hipster coffee shop and see where the nearest place was that I could go. And it struck me that all of these places looked increasingly similar to each other.

So I could reliably land in any city in the world, open one of these apps and easily find my way to a coffee shop with a fiddle leaf fig and cappuccinos with nice latte art and white subway tiles on the walls and minimalist reclaimed wood furniture. And it just struck me as so strange because no one was forcing these cafes to design themselves this way. There was no, like, corporate parent like a Starbucks mandating a particular style. Instead, it was just that all of these independent cafe owners around the world had kind of gravitated toward the same style and these same set of symbols, like the fiddle leaf fig.

MOSLEY: Our guest today is journalist Kyle Chayka. He's a staff writer for The New Yorker and has written a new book called "Filterworld: How Algorithms Flattened Culture," which explores the impact of algorithmic technology on the ways we live. We'll continue our conversation after a short break. I'm Tonya Mosley, and this is FRESH AIR.

(SOUNDBITE OF VIKINGUR OLAFSSON'S "ETUDES: NO. 9")

MOSLEY: This is FRESH AIR. I'm Tonya Mosley, and today I'm talking to Kyle Chayka. He's a staff writer for The New Yorker covering technology and culture on the internet. His work has also appeared in The New Republic, The New York Times Magazine and Harper's, among other publications. Chayka's first nonfiction book, "The Longing For Less," published in 2020, is a history of minimalism. We're talking about his new book, "Filterworld: How Algorithms Flattened Culture," which explores the impact of algorithmic technology on the ways we live. Meta, the parent company for Facebook, announced this month that it will begin removing some sensitive and age-inappropriate content from teenagers' feeds, even if the content was posted by people they follow.

Influencers have been very vocal over the years about how the algorithm is against them, working against them in many instances, how some have had great success, and then all of a sudden, their likes have gone down, their views have gone down. It does seem like a vicious cycle in the way that you're talking about in the, quote-unquote, "real world" with retail owners who say they're constantly chasing something. How and why is this happening, where people are feeling like they may have a career, actually, as an influencer online, and then all of a sudden, they've lost it?

CHAYKA: It's so capricious in a way. Like, I mean, I've felt it myself as a journalist or someone who's been on Twitter for a decade or so. Like, sometimes you feel like you're in tune with the algorithm. You know exactly what to do and how to tune your content so that it gets popular and gets a lot of likes. And then at other times, it's just not hitting. Like, the feed is suddenly prioritizing other things or other formats.

So I think there's this seduction of algorithmic platforms where they can really deliver your content to a huge audience and suddenly get you a big fan base, like, as a TikTok creator or a proto influencer. And then all of a sudden, just as you're gaining that audience and starting to maybe become addicted to the attention, then the algorithm kind of goes away, or your solution to the equation stops working quite as well as you thought it was. And that's - I mean, tastes fluctuate, and maybe your content can stay the same and people just are not as interested in it anymore. But also, the algorithm of the feed changes, and the forms of content that digital platforms emphasize change over time. So you kind of do have to play this constant catch-up game of figuring out what the feed likes today or this week and what it might prioritize in the future.

MOSLEY: Well, you write quite a bit about your own experience, how early in your career you learned to judge your success on the internet almost entirely in terms of numbers. So many of us do this - how many thumbs up on Facebook or how many likes on Twitter.

CHAYKA: Yeah.

MOSLEY: How does this impact how you see yourself as a writer, how you measure what is good?

CHAYKA: Oh, man. I mean, it's tough to separate your success from attention online, I think. I mean, this is just a real fact of existing in the past decade or more. So much of how we consume things is routed through these feeds that if your work is doing badly in the feeds, then you kind of feel bad about it. Like, as a journalist, particularly in the early 2010s, I would feel bad that my article - my brand-new article - didn't get as many likes as the last one, or my pithy observation on Twitter didn't have enough response, and that would make me feel embarrassed or that I had done the wrong thing. So it kind of puts you in this trap of self-consciousness, I think, where it's like you're always being graded on everything you put out into public.

MOSLEY: And in a way, you are, because I'm thinking about how hard it is for maybe filmmakers and writers and visual artists, creators of all kind, to find footing in filterworld. Book publishers, for instance, want to know how many platforms you're on, how many followers you have as part of your proposal for a book deal. How much of that assessment actually changes the kinds of books and movies and music that we have access to?

CHAYKA: Oh, I think it really does change it. I mean, every publisher will ask a new author, what is your platform? Like, how big of a platform do you have? Which is almost a euphemism for how many followers do you have online, whether that's Twitter or Instagram or an email newsletter. They want to know that you already have an audience going into this process, that you have a built-in fan base for what you're doing.

And culture doesn't always work that way. I don't think every idea should have to be so iterative that you need fans already for something to succeed, that you have to kind of engage audiences at every point in the process of something to have it be successful. So for a musician, you know, maybe you'll get a big record deal only if you go viral on TikTok. Or if you have a hit YouTube series, maybe you'll get more gigs as an actor. There's this kind of gatekeeping effect here, too, I think, where in order to get more success on algorithmic platforms, you have to start with seeding some kind of success on there already.

MOSLEY: Have television shows or movies used algorithms to help them greenlight or shape content?

CHAYKA: I think so. I mean, I think you can see how TV shows and movies have adapted to algorithmic feeds by the kind of, like, one-liner, GIF-ready scenes that you see in so many TV shows and movies now. You can kind of see how a moment in a film is made to be shared on Twitter or how a certain reaction in a reality TV show, for example, is made to become a meme. And I think a lot of production choices have been influenced by that need for your piece of content to drive more pieces of content.

MOSLEY: How would you say journalism has been impacted and shaped by these algorithmic forces?

CHAYKA: Well, algorithmic feeds, I think, took on the responsibility that a lot of news publications once had. So, say, in decades past, we would see the news stories that we consumed on a daily basis from The New York Times' front page on the print paper, or then The New York Times homepage on the internet. Now, instead of the publication choosing which stories are most important, which things you should see right away, the Twitter or X algorithmic feed is sorting out what kinds of stories you're consuming and what narratives are being built up. Or, you know, TikTok - we now have kind of TikTok talking heads and explainers rather than news anchors on cable TV. So the kind of responsibility for choosing what's important, I think, has been ported over to algorithmic recommendations rather than human editors or producers.

MOSLEY: This feels like we're now venturing into what is dangerous territory, because we've been talking for many years about misinformation. You write about how algorithms can speed up ideological radicalization because they feed users more extreme content in a single category. How would regulation - federal regulation of these tech companies impact maybe the power that the algorithm has?

CHAYKA: I mean, I think part of the problem with filterworld is that it feels inescapable. Like, we've been talking about so many negative qualities of this environment. I think there are ways out of it and ways that we can, like, break down the grip of filterworld. And that is - a lot of that is through regulation. I mean, social media is a very unregulated space right now, especially compared to traditional media. And so I think monopolization is one thing that we can be very wary of. So I think if Meta, Facebook's parent company, was forced to spin off some of its properties, like Instagram or WhatsApp, and those properties were made to compete against each other, then maybe users would have more agency and more choices for what they're consuming.

MOSLEY: There are also real consequences that go beyond what we focused our conversation on. You tell the story of a 14-year-old from London named Molly Russell, who died by suicide. An audit of her online consumption found that she had been fed content on suicide. Can you explain how something like that happens?

CHAYKA: Molly Russell was using Twitter and Pinterest and Instagram to consume content that was about depression and about self-harm - but it wasn't stopping there. The platforms were then recommending her more and more of that content. So at one point, Pinterest actually sent her an email of suggestions of self-harm images to add to a collection that she had made. And I think - I mean, it's just so blatant why that's dangerous and bad. It's an algorithmic acceleration of content, but the content is harmful. And we - I don't think we want it to be pushed at people.

MOSLEY: What could tech companies do to filter or slow down or block access to harmful material like this?

CHAYKA: In the case of Molly Russell, I think there are ways that could pretty easily have prevented it. I mean, better moderation would make platforms more wary of promoting such negative content, like, about depression or other things, like anorexia or mental health problems. And there's another strategy that just excludes certain subjects and content matter from algorithmic promotion. So if a tech company's filter has detected that something was about self-harm or depression, then maybe that content would not be algorithmically promoted, and that would kind of slow down the acceleration that might have caused problems for Molly Russell.
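The second strategy Chayka mentions - keeping certain content visible but ineligible for algorithmic promotion - reduces to a simple gate in code. The label set and function name here are hypothetical; real moderation pipelines classify posts with trained models rather than hand-written tags.

    # Hypothetical set of labels a content classifier might attach to a post.
    DO_NOT_AMPLIFY = {"self-harm", "depression", "eating-disorders"}

    def eligible_for_recommendation(post_labels: set[str]) -> bool:
        """Flagged posts stay visible to direct followers but are never
        pushed into recommendation feeds or suggestion emails."""
        return not (post_labels & DO_NOT_AMPLIFY)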

MOSLEY: We need to take a short break here. But before we do, I just want to say that if you are in a state of despair and are having thoughts of self-harm, you can get help by calling or texting the Suicide & Crisis Lifeline at 988 at any time. The number again to text or call is 988. If you're just joining us, my guest is journalist Kyle Chayka. He's a staff writer for The New Yorker and has written a new book called "Filterworld: How Algorithms Flattened Culture," which explores the impact of algorithmic technology on the ways we live. We'll continue our conversation after a short break. This is FRESH AIR.

(SOUNDBITE OF DAVID NEWMAN'S "YOU CAN'T ALWAYS GET WHAT YOU WANT")

MOSLEY: This is FRESH AIR, and today I'm talking to Kyle Chayka about his new book, "Filterworld: How Algorithms Flattened Culture." Chayka is a staff writer for The New Yorker covering technology and culture on the internet. His work has also appeared in The New Republic, The New York Times Magazine and Harper's, among other publications. Chayka's first nonfiction book, "The Longing For Less," is about the history of minimalism.

So, Kyle, what else can we do? You're recommending an algorithmic cleanse, which you actually did for a few months. How did you do that and how did it go?

CHAYKA: Yes. I mean, I think regulation can help these situations. But in the end, users are kind of responsible for themselves for the time being. And one thing you can do is just opt out of these systems entirely. Like, you can log off Instagram. You can log off Twitter. You can not go on TikTok, even though it might feel impossible. And that's what I wanted to do, I think in September of 2022. I just felt, like, totally overwhelmed. I was nearing the end of writing this book. I needed to just cut myself off completely from the influence of algorithmic feeds. Not coincidentally, this was around the time that Elon Musk was acquiring Twitter, and that was kind of damaging my experience with one of my favorite platforms. So I was feeling some dread anyway. And I just decided one Friday that I was going to completely log off all of these things, reroute my consumption habits away from digital platforms and almost figure out a different way of existing in the world than what I had been used to the past decade or more.

MOSLEY: That had to be hard because this is what you do. You cover the internet.

CHAYKA: Oh, yeah. It was very difficult. I had to really push myself to do it. There were weeks and weeks where I said, OK, I'm definitely going to do it this week. I'll do it next week. I'll do it the following week. It was compulsive in a way because I had spent years and years, you know, every day being on Twitter, looking at Instagram a dozen times a day, being on TikTok, especially during quarantine. So to cut myself off, I mean, it felt like breaking an addiction. And when I did cut myself off and suddenly not have all of that algorithmically recommended stimulus, I did feel my brain kind of, like, gasping for oxygen and grasping for more stimulus that I was missing.

MOSLEY: How did you fill it?

CHAYKA: (Laughter) With some difficulty. My first - the first attempt I made was actually putting these fidget apps on my phone. Like, I found that my thumb was twitchy and, like, uncomfortable because I wasn't scrolling through things on my phone. Like, that familiar scrolling motion where you flip your thumb upward, that was totally missing. And so what I did was download these apps where you can, like, flip a light switch to turn a light on, or, like, sweep a digital floor or spin a dial. Like, accomplish these totally meaningless tasks just in order to, like, soothe your brain with these motions. And it worked for a little while. It was a good interim solution.

MOSLEY: What are you afraid of with the flattening of culture? Like, what is the future that you see that is really concerning when we think about all of this, because this sounds great for us as an audience and for those who will read your book, but for the vast number of those who are online, they are passively consuming.

CHAYKA: I mean, I think passive consumption certainly has its role. Like, we are not always actively consuming culture and, like, thinking deeply about the genius of a painting or a symphony or something. Like, it's not something we can do all the time. But I think what I worry about is this - just the, I suppose, the passivity of consumption that we've been pushed into, the ways that we're encouraged not to think about the culture we're consuming, to not go deeper and not follow our own inclinations. And I worry that that passivity along with the ways that algorithmic feeds pressure creators to conform, too, it kind of leads to this vision of all culture that's, like, the generic coffee shop. That's, like, it looks good, it might be nice, you might be comfortable in it, but ultimately, there's, like, nothing deeply interesting about it. It doesn't lead anywhere. It's just this kind of, like, ever-perfecting perfect boredom of a place. Like, a perfectly ambient culture of everything.

And I suppose, when I really think about it, the kind of horror at the end of all this, at least for me, is that we'll never have anything but that. We'll never have the Fellini film that's so challenging you think about it for the rest of your life or see the painting that's so, like, strange and discomforting that it really sticks with you. Like, I don't want to leave those masterpieces of art behind just because they don't immediately engage people.

MOSLEY: You know what I'm scared about is the younger generations who know nothing else.

CHAYKA: I - you know, I was not born into the era of algorithmic feeds. Like, the internet that I grew up on as an early teenager was more about self-directed creation of stuff and kind of exploring. But now I feel like so much of internet experience is driven through an algorithmic feed or you're forced to fit into a mold - like the ideal of an Instagram influencer or a creator on TikTok - that younger people maybe don't have the freedom to just kind of figure out what they're into without being pushed in one direction or another.

MOSLEY: So what are you hopeful about in regards to this topic? We've seen that there have been hearings around regulation but none of them have really pushed us far enough where we're going to see it in the way that we see the changes in the U.K. What are some things that bring you hope?

CHAYKA: I think what makes me the most hopeful is that people are starting to get bored of this whole situation. Like, we, as users of the internet, have spent a solid decade or more, you know, experiencing these things, existing within algorithmic feeds and it feels increasingly decrepit and kind of bad. Like, we're realizing that these feeds aren't serving us as well as they could, and who they really benefit are the tech companies themselves. So I think as users start to realize that they're not getting the best experience, I think people will start to seek out other modes of consumption and just build better ecosystems for themselves.

MOSLEY: Kyle Chayka, thank you so much for this conversation and your book.

CHAYKA: It was a great discussion. Thank you.

MOSLEY: Kyle Chayka is a staff writer for The New Yorker, covering technology and culture on the internet. His new book is "Filterworld: How Algorithms Flattened Culture." Coming up, book critic Maureen Corrigan reviews Stephen McCauley's new novel, "You Only Call When You're In Trouble." This is FRESH AIR.

(SOUNDBITE OF THE BENNIE MAUPIN QUARTET'S "PROPHET'S MOTIFS")

Transcript provided by NPR, Copyright NPR.
