Many Americans get their news through social media. But the stories that surface in our Facebook feeds are heavily curated by an unseen force: an algorithm that purports to select the most relevant stories and postings for each user. In contrast, Twitter users access an unfiltered stream of postings by people in their network. But recent comments by company executives hinted that this could soon change. Tech Tuesday examines how algorithmic filtering works, and why some people worry it is negatively impacting social media users' access to information.

Guests

  • Zeynep Tufekci, Assistant Professor, School of Information, University of North Carolina; and Faculty Associate, Harvard Berkman Center for Internet and Society

Transcript

  • 12:06:43

    MR. KOJO NNAMDI: From WAMU 88.5 at American University in Washington, welcome to "The Kojo Nnamdi Show," connecting your neighborhood with the world. Later in the broadcast, a high-profile hack of celebrities' racy images has many people wondering just how secure cloud storage services are. We'll get some tips for protecting your content online. It's "Tech Tuesday." First, unseen forces influencing what you see on social media. Millions of Americans get their news from Facebook.

  • 12:07:22

    MR. KOJO NNAMDI: They use the site's front page to figure out what's going on in the world and what's happening to people in their social network. But Facebook's news feed may not be serving up as much news as you think it does. The social network uses a computer program to comb through all the postings from your friends and select the stories it thinks are most relevant to you. It's called algorithmic filtering. And some people think it's having a negative impact on social media users by narrowing the news and views we see.

  • 12:07:52

    MR. KOJO NNAMDI: So they were especially alarmed when an executive recently seemed to indicate that Twitter could introduce filtering. Joining us to talk about this is Zeynep Tufekci. She's a professor in the School of Information at the University of North Carolina and a faculty associate at the Harvard Berkman Center for Internet and Society. Zeynep Tufekci joins us from studios at UNC Chapel Hill. Thank you for joining us.

  • 12:08:17

    MS. ZEYNEP TUFEKCI: Thank you for inviting me.

  • 12:08:18

    NNAMDI: You too can join the conversation. Give us a call at 800-433-8850. How would you assess the user experience on sites like Facebook? Do you think the stories on your news feed are giving you an accurate view of the news of the day? 800-433-8850. You can send email to kojo@wamu.org. Shoot us a tweet @kojoshow using the hashtag #TechTuesday. Or go to our website, kojoshow.org, and join the conversation there.

  • 12:08:46

    NNAMDI: Zeynep Tufekci, the average Facebook user has about 200 friends, according to the Pew Research Center, but that does not mean that we actually see postings from 200 people on our news feeds. In fact, Facebook employs a computer program to comb through all that user data and bring a set of stories to our front page. What do we know about how this algorithm works?

  • 12:09:08

    TUFEKCI: Right. So, the algorithm is not released. So what I'm going to tell you is what we think we know from trying to reverse engineer this. The things we've been able to tell are that it allows things that are commented upon or liked to rise to the surface, along with things that contain pictures or links. And also, very interestingly, things that have the word congratulations in the comments, right? So, if somebody says, here, I'm getting engaged or I'm having a baby.

  • 12:09:48

    TUFEKCI: And people start congratulating them. The word congratulations seems to trigger the algorithm to put that on top of everybody's feeds. So if you're wondering why your Facebook is full of baby news and engagements and marriages, that's partly why. And recently, my friends who've learned of this have started hacking it. One of them was trying to sell her camera equipment, and she just went and said, can you all congratulate me so that, you know, my camera equipment sale is visible?

  • 12:10:16

    TUFEKCI: And we tested this, and sure enough, it was on top of a lot of people's news feeds. So, it's really -- it's opaque. It's gameable. And again, remember, this is an ad-based platform, so it's also kind of tailored to make things easier to sell ads and also to show you content that Facebook thinks is going to keep you coming back and clicking on like.
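
    A minimal sketch, in Python, of the kind of scoring heuristic described here. Every signal and weight below is a guess inferred from the reverse engineering Tufekci mentions (likes, comments, pictures, links, and the word "congratulations"); Facebook's actual algorithm is not public.

        # Illustrative engagement-based feed ranker -- NOT Facebook's algorithm.
        def score(post):
            s = post.get("likes", 0) * 1.0 + post.get("comments", 0) * 2.0
            if post.get("has_photo"):
                s += 5.0  # media posts seem to be favored
            if post.get("has_link"):
                s += 3.0
            # The "congratulations" effect: a keyword in the comments
            # appears to boost a post for everyone.
            if any("congratulations" in c.lower() for c in post.get("comment_texts", [])):
                s *= 2.0
            return s

        def rank_feed(posts):
            # Engagement order, not chronological order.
            return sorted(posts, key=score, reverse=True)

        # The camera-sale "hack": congratulated comments beat raw likes.
        camera_sale = {"likes": 3, "comments": 12, "has_photo": True,
                       "comment_texts": ["Congratulations!!", "Way to go!"]}
        plain_post = {"likes": 40, "comments": 2, "comment_texts": []}
        assert rank_feed([plain_post, camera_sale])[0] is camera_sale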

  • 12:10:39

    NNAMDI: Last week, news stories indicated that Twitter was considering its own foray into filtering, which set off a chorus of outrage among many Twitter users. Since its inception, Twitter has been unfiltered, meaning it reads as a sort of chronological, objective document of news as it breaks. Would filtering change the nature of Twitter?

  • 12:11:02

    TUFEKCI: Sure, it would. Right now, it's not true that there is no filtering. There is filtering in the sense that the people I choose to follow on Twitter filter for me, right, by retweeting or by mentioning other things. So it's kind of like this human network that I've chosen to follow that shows me, here, this is what I think is important. And I get to choose whether I like what these people are choosing or not. Now, if a computer program started doing that, I really don't know whether it would pick the kinds of things that my friends and the people I follow would pick.

  • 12:11:36

    TUFEKCI: And I also don't know if it would just pick a few very popular things. You know, my Twitter feed right now has very little, you know, fluff, or Kardashians, or other things, but maybe the algorithm might think, hey, look, all these other people are clicking on that, and start showing me those things, which is why I kind of worry. There's a way in which there's a human intelligence at work on Twitter, which makes it kind of cumbersome. That's why they're trying to change the algorithm, I think.

  • 12:12:04

    TUFEKCI: Because for a new user, it's a little confusing. But it's very rewarding after you've put in the labor to create a list of people whose judgment you trust. But of course, being a company, they're thinking, how do we make this quicker and faster and get a lot more engagement? And what Facebook does is say, we know better what you want, here you go. And that kind of works in some way, in that you get stuff that other people liked and commented on, or that has pictures or congratulations. But who knows what you're missing? Maybe you're missing...

  • 12:12:35

    NNAMDI: Has there been any survey conducted, at all, about how people respond to what we might be missing? How people respond to the knowledge that they're not necessarily seeing everything that their friends are posting, which, of course, would be a whole lot of information.

  • 12:12:48

    TUFEKCI: Yes. Yes.

  • 12:12:48

    NNAMDI: But, yes. There have?

  • 12:12:49

    TUFEKCI: People don't even know. No. Two friends of mine, Christian Sandvig and Karrie Karahalios, did a study. They were trying to test just this. And what they found was that 62 percent of the people didn't even know, didn't even realize, that Facebook was algorithmically curating their feed. Most people turn it on and, here it is, you know, this is what my friends said. And they're not really giving it a lot of thought. Algorithms are really interesting because they're invisible, right?

  • 12:13:20

    TUFEKCI: So Facebook doesn't advertise the fact that it's curating. It doesn't advertise the fact that it's hiding things from you. You have this thing that kind of appears, almost like magic: here you go. And I think what has happened is, as people learn, they might start looking and thinking, well, what is it that I'm missing? So I sometimes do it. I sometimes go and check through my Facebook feed as much as I can, you know, in chronological order. But even then I found that it doesn't show me everything.

  • 12:13:50

    TUFEKCI: And on Facebook, one of the downsides, I think, of algorithmic curation is that the only signal you can send is to like something. And we saw this, I think, in the Ferguson news. How do you like a news story like that?

  • 12:14:04

    NNAMDI: I was about to get to that. The Ferguson...

  • 12:14:05

    TUFEKCI: Yeah, I dislike it, you know? So I couldn't really say -- it's hard for us to tell Facebook, hey, I don't like this, but it's important, right? So Facebook, I think, because it only allows like as a signal, seems to be surfacing things that are happy -- I'm guessing, because once again, they're not showing us the data. They're not allowing us control, which I would really like. And my Facebook feed has a large number of people announcing babies, partly because of my age, of course.

  • 12:14:39

    TUFEKCI: Because everybody's congratulating them, and then Facebook is showing it to more people. And the more people see it, it becomes this, you know, feedback cycle. The more people see it, the more people comment. And the more Facebook decides, oh, I'm gonna show it to even more people. Whereas maybe, if there's some more sad news that people aren't necessarily clicking like on, I don't even know -- you know, I don't know what I'm not seeing.

  • 12:15:02

    NNAMDI: In case you're just joining us, we're talking with Zeynep Tufekci. She's a professor in the School of Information at the University of North Carolina and a faculty associate at the Harvard Berkman Center for Internet and Society. We're taking your comments and questions at 800-433-8850. Do you trust the algorithms used by Facebook and Google to accurately guess what you want when you log in? 800-433-8850. You can send us email at kojo@wamu.org. Zeynep, I'm so glad you mentioned the issue of congratulations and how it tends to attract attention.

  • 12:15:37

    NNAMDI: This summer, there were two huge stories that caught fire, first on social media, then ended up becoming huge stories across traditional media. You mentioned Ferguson, Missouri, the shooting of Michael Brown, an unarmed black teenager, which sparked outrage and then days of unrest. Also, the ALS ice bucket challenge encouraged people to dump ice water on themselves to raise money to fight a horrible disease. And some people began to notice a curious pattern on their social media accounts.

  • 12:16:06

    NNAMDI: The ice bucket challenge was blowing up on Facebook while everyone on Twitter was talking about Ferguson. Was this algorithmic filtering at work?

  • 12:16:17

    TUFEKCI: I suspect very much so. Again, since Facebook doesn't tell us, we really don't know, but I really suspect so. Because the ALS bucket challenge is geared -- it's made perfectly for Facebook's algorithm. It's got a video, which, as far as we can tell, the Facebook algorithm seems to prioritize. You tag other people, you know, you say, here, I pass this challenge on to you. And, you know, since people are trying to raise money to fight a horrible disease, everybody says, oh, congratulations, great, good for you. So it draws a lot of comments.

  • 12:16:46

    TUFEKCI: So it's just the kind of thing that is designed, almost, to be prioritized by Facebook's algorithm. And I had the same experience. You know, I've seen post after post after post on the ALS bucket challenge. And also, Facebook keeps showing the same posts to me, again and again -- you know, more people see it and more people comment. So I've still got, you know, an ALS bucket challenge from a week ago showing up in my feed. And it just perpetuates this.

  • 12:17:16

    TUFEKCI: Whereas in the early days of the demonstrations, my own Twitter feed -- the people I follow, which has a lot of overlap with my Facebook friends, although, of course, the groups are very different in size -- was very heavily geared towards Ferguson. And Ferguson didn't surface in my Facebook profile, you know, my Facebook page, until the next day or two, and then I started seeing more posts. And still, they were greatly outnumbered by the ALS bucket challenge.

  • 12:17:47

    TUFEKCI: So, it's probably partially a function of who posts what where. But also, since when you post something to Facebook you don't really know who's gonna see it, one might be more reluctant. The algorithm changes our behavior, too. It doesn't just filter what we see. It kind of encourages and discourages us, you know, to take the time to post here or there, because you don't know if your friends are gonna see it. But I do want to say, if you go on Facebook and post something and none of your friends comment, and you're thinking, why aren't they saying something?

  • 12:18:19

    TUFEKCI: Maybe they're not seeing it at all. It's totally possible. So this is this great new divergence: instead of an editor, like in traditional newspapers, or people you choose to follow -- human intelligence, on Twitter -- on Facebook it's an opaque algorithm, whose functioning is a secret, that's deciding what's important and what's not. In some ways, it makes life easier, because you don't have to make decisions. In other ways, you have a lot less control and you don't really know what's going on.

  • 12:18:48

    NNAMDI: We got a tweet from Howard, who says, I've got an idea. Why can't Twitter let us pick our algorithms? A filter might be nice if we could enable it as we choose. Which brings me to the issue of transparency. Facebook caught a lot of heat earlier this year when it was revealed that it had intentionally manipulated the news feeds of some users as part of a research project. That raised some thorny questions about consent and research ethics. But it also highlighted how little we know about how the algorithms work. So when Howard says, let us pick our own algorithms, what would or what should transparency look like here?

  • 12:19:26

    TUFEKCI: I absolutely agree. I mean, algorithms aren't some evil demons, right? They're computer programs. If you have an ABS brake system in your car, you've got an algorithm figuring out how to best brake. So they can be quite useful. And on Facebook, obviously, if you have 200 friends and they're all posting three things a day, that's, you know, 600 right there. And a lot of people post more. So which ones is Facebook gonna show you? There is a question of how you prioritize. The thing, as you highlight, with the backlash to the study, is that, especially on Facebook, this is you and your friends and your family.

  • 12:20:03

    TUFEKCI: It's a couple hundred people that you feel close enough to mutually friend. So people do want more control and more transparency, as far as I can tell. And I think that's what the backlash to the experiment was about. There are a lot of ways there could be algorithmic choices that were transparent and that you could pick. And you could say, here, you know, this is how I want it to work, and you could go back and tweak it if it didn't work for you. And there would be a lot more transparent ways for you to go, you know, find what you were missing.
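
    What Howard's suggestion might look like in practice: a hypothetical sketch in which the platform exposes ranking as a strategy the user picks and can change. The strategy names and post fields are invented for illustration; no platform offers this today.

        # Hypothetical user-selectable feed strategies -- invented for
        # illustration; this is not an existing platform API.
        STRATEGIES = {
            "chronological": lambda posts: sorted(
                posts, key=lambda p: p["timestamp"], reverse=True),
            "most_engaged": lambda posts: sorted(
                posts, key=lambda p: p["likes"] + p["comments"], reverse=True),
            "close_friends_first": lambda posts: sorted(
                posts, key=lambda p: p["closeness"], reverse=True),
        }

        def build_feed(posts, strategy="chronological"):
            # The user -- not the platform -- decides how the feed is
            # ordered, and can go back and tweak the choice at any time.
            return STRATEGIES[strategy](posts)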

  • 12:20:35

    TUFEKCI: You could -- there are so many ways to do it, but I see no move from Facebook, which is kind of worrisome in and of itself. And again, remember, the platform lives by delivering ads to you. So you're not its actual customer. The people who are buying the ads are its actual customers. So I feel like, you know, how do we know the algorithm is tweaked more to our benefit, not to the delivery of ads? And now Twitter has, you know, potentially moved in this direction -- it's unclear -- and I think that caused the backlash, because, more and more, our online spaces are crucial to our civic spaces.

  • 12:21:13

    TUFEKCI: You know, for Ferguson, it's civic. For the ALS bucket challenge, it's important. And, you know, our friends, our families -- if you're an immigrant, your family is probably communicating with you on Facebook and other platforms. These are really integral, really important, both public and private spaces of the 21st century. And we've got this invisible layer of algorithms that are opaque, not transparent. We have no idea how they work. And these stand between us and our news.

  • 12:21:41

    TUFEKCI: They stand between us and our friends and family. And I think, given how much money these companies have made recently, how big they've grown, because they're so important to us, I think they have a moral obligation to say, wait, this algorithm isn't just for us to decide. This is something that's determining how people see, you know, what they see of their friends. And I fear the day that, you know, somebody's gonna post a cryptic suicide note to Facebook and nobody's gonna think it's important. That algorithm's not gonna think it's important enough, because...

  • 12:22:12

    NNAMDI: 'Cause nobody's gonna say congratulations.

  • 12:22:14

    TUFEKCI: It -- nobody's gonna say congratulations, and maybe it's not gonna have the key words, you know?

  • 12:22:17

    NNAMDI: Exactly.

  • 12:22:18

    TUFEKCI: You know, I'm sure Facebook has some key words for that -- if you say, I'm going to, you know, commit suicide, I'm sure Facebook has key words for that. I'm not saying they're irresponsible. But what if it's an opaque, roundabout thing only the friends would understand, but it's never shown? It's gonna happen.
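
    The limit she's pointing at is easy to see. Below is a toy keyword trigger (the word list is invented; whatever Facebook actually uses is unknown) that matches explicit phrasing but misses exactly the roundabout message only friends would understand.

        # Toy keyword trigger -- the word list is invented, not Facebook's.
        ALERT_PHRASES = {"suicide", "kill myself"}

        def flags(text):
            t = text.lower()
            return any(phrase in t for phrase in ALERT_PHRASES)

        print(flags("I'm going to commit suicide"))                  # True
        print(flags("Goodbye, everyone. You know what this means."))  # False: cryptic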

  • 12:22:32

    NNAMDI: Here is Bob in Chillum, Maryland. Bob, you're on the air. Go ahead, please.

  • 12:22:37

    BOB: Hey. Good afternoon, folks. I want to say, kind of in a nutshell, that this is really a highly depressing conversation. And that what we're already seeing in our society is increasing polarization because of algorithms. People are only getting news that's tailored to agree with what they already believe. And it's heightening, exponentially, this degree of polarization in this country, to the degree that we're not able to do anything. And yet, more and more, we rely on this and these technologies. And at the very least, we need to stop referring to this as a platform for free speech, because it's obviously not. Anyway...

  • 12:23:24

    NNAMDI: You raise a fascinating point, Bob. Because, Zeynep Tufekci, for a number of years now, scholars have observed a tendency among us internet users to aggregate among people who see the world in similar ways to us. Some have called this the filter bubble. Some have called it homophily. Do you think sites like Facebook, with their curated stories, make it worse, which is what Bob seems to be suggesting?

  • 12:23:46

    TUFEKCI: Well, we had recent research from my friend at Rutgers, Keith Hampton, whose previous research had found that social media increases political participation but seems to be dampening deliberation, exactly for this reason -- because of polarization, and because we are sort of afraid of arguments. Now, this comes from a real human tendency, right? Wanting to go find people who think like you is not the fault of technology. This is a human trait.

  • 12:24:20

    TUFEKCI: But what happens online is that this already existing human tendency to seek the comfort of agreement is made worse by the fact that the algorithm's gonna feed that sugar to you. It's going to show you things you clicked like on before, because they don't want you to get upset and storm off the platform. They want you to say, oh, here are things that I agree with. Now, an interesting twist to this would be, since we, as humans, have this tendency, if we had more control over our algorithms, maybe we could put them to work the other way.

  • 12:24:54

    TUFEKCI: I mean, if somebody offered me, here are 20 smart, awesome people who disagree completely with you and really make good points, I'd actually want to know. Right now, you know, it's not easy to find things to read that are good and smart and that disagree with you. Because, by definition, you think, I'm so smart, therefore if they disagree with me, they're not. Maybe our algorithms could be used to nudge us, push us in the opposite direction.

  • 12:25:23

    TUFEKCI: Almost like, sort of, nutrition labels, right, that you could look at and say, hey, this is good food for me, and this is not. But right now, because the internet is unfortunately based on ads, what they're asking is, how do we create the pleasant, mellow environment that makes us click on ads and buy stuff?
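
    A sketch of the "nudge the other way" idea Tufekci floats: deliberately mix a few well-argued posts from outside the reader's usual viewpoint into the feed. The quality scores and the split into agreeable and disagreeing posts are assumed to come from some upstream model; everything here is hypothetical.

        import random

        # Hypothetical diversity nudge: reserve a couple of feed slots
        # for high-quality posts the reader is likely to disagree with.
        def nudged_feed(agreeable, disagreeing, size=10, challenges=2):
            best_challenges = sorted(disagreeing, key=lambda p: p["quality"],
                                     reverse=True)[:challenges]
            feed = agreeable[:size - challenges] + best_challenges
            random.shuffle(feed)  # don't bury the challenging items at the bottom
            return feed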

  • 12:25:42

    NNAMDI: Glad you...

  • 12:25:43

    TUFEKCI: Which is a pity, given the, you know, potential.

  • 12:25:45

    NNAMDI: Glad you made that point, because my final question is this: these algorithms are good at making a quantitative assessment of what's important. They can quickly identify the stories that are garnering the most clicks on the web and surface them in our news feeds. But what about important stories that cannot be quantified? Stories that have not been read by very many people, but should be. You say that these algorithms reward content that has already been rewarded.

  • 12:26:10

    TUFEKCI: Absolutely.

  • 12:26:11

    NNAMDI: What do you mean?

  • 12:26:11

    TUFEKCI: Absolutely. I mean, these are rich-get-richer systems. If you're already, sort of, on the rise, then you're gonna be seen by more people, and more people are gonna react to it. And then Facebook's gonna show it to more people. So if you take off, you're gonna dominate, like the ALS bucket challenge. Once something spikes, it can completely take over the feed, because, you know, it's just a feedback loop. On the other hand, if something doesn't spike, it could be completely buried.

  • 12:26:39

    TUFEKCI: So it increases the distance between the things that, you know, dominate the news and the things that maybe should have a chance but never even had one. If you never get that chance, you just get buried very quickly. So there are very steep inequality regimes, so to speak. Which is unfortunate, once again, because it's an open system. It has the potential to do so much more in terms of bringing important and essential news and information from each other to each other.
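
    The rich-get-richer dynamic she describes can be shown with a toy simulation: each round, one post is shown with probability proportional to the engagement it has already accumulated, so small early leads compound. The parameters are arbitrary; this illustrates the feedback loop, not Facebook's actual system.

        import random

        # Toy rich-get-richer loop: exposure is proportional to prior
        # engagement, so early spikes compound and laggards get buried.
        def simulate(n_posts=5, rounds=10000, engage_prob=0.1, seed=42):
            rng = random.Random(seed)
            engagement = [1] * n_posts  # every post starts with one seed view
            for _ in range(rounds):
                # Pick one post to show, weighted by accumulated engagement.
                shown = rng.choices(range(n_posts), weights=engagement)[0]
                if rng.random() < engage_prob:
                    engagement[shown] += 1  # a like or comment feeds the loop
            return engagement

        print(simulate())  # typically wildly unequal: a spike dominates, the rest stay buried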

  • 12:27:10

    NNAMDI: Zeynep Tufekci is a professor in the School of Information at the University of North Carolina and a faculty associate with the Harvard Berkman Center for Internet and Society. Thank you so much for joining us.

  • 12:27:21

    TUFEKCI: Thank you for inviting me.

  • 12:27:23

    NNAMDI: We're gonna take a short break. When we come back, a high-profile hack of celebrities' racy images has many people wondering just how secure cloud storage services are. We'll get some tips for protecting your content online. It's "Tech Tuesday." I'm Kojo Nnamdi.
