
Facebook's Newsfeed Study: Was It Ethical Or A Violation Of Privacy?

MICHEL MARTIN: Let's turn to an issue in social media now. If you were a Facebook user in January of 2012, you might remember logging on, scrolling through your newsfeed and feeling especially upbeat or maybe a little more down than usual. If so, you are not alone. Back then, Facebook's data scientists manipulated the newsfeed. More than 600,000 unwitting users were shown either mostly upbeat or more depressing news stories in an effort to determine if this affected their moods. Facebook recently published their findings, but a lot of people are not liking this experiment. We wanted to talk about it, so we called Rey Junco. He's a fellow at the Berkman Center for Internet and Society at Harvard. And Adrienne LaFrance is a senior associate editor at The Atlantic, and she wrote about the study and people's reaction to it. Welcome to you both. Thank you so much for joining us.

ADRIENNE LAFRANCE: Thank you.

REYNOL JUNCO: Thanks, Michel.

MARTIN: Adrienne, you wrote that there was something new level creepy about this study, but you're also trying to figure out - because you pointed out that, you know, we know that the newsfeeds are curated or manipulated by somebody. And you were trying to investigate what it is that felt new level creepy about it. So what'd you come up with?

LAFRANCE: I think there's a number of issues to unpack here and one of them is sort of the extent to which this manipulation has to do with consent. So it's sort of an ethical question as well as a legal question. But I thought it was telling - one of the authors who works at Facebook posted a statement over the weekend - something to the effect of, you know, we're just trying to better give you what you want. And to me there's a real contradiction there in them saying, we're trying to give you what you want, but, you know, we're not even going to give you the chance to give consent to let us know.

MARTIN: So the first issue was they didn't ask people whether they were willing to, you know, participate in this. But what else? Is it the emotional manipulation, trying to evaluate your feelings as opposed to what you're interested in - is that what you think?

LAFRANCE: I think so. I mean, yes, there is a contextual question here. So people are accustomed to being manipulated with the messages of say, like, an advertiser for instance. But when you go to your Facebook page, unless you've read the terms of service extensively - or even if you have frankly - you're not necessarily going to understand that there's sort of this puppet-master on the other side affecting what you see with the purpose of trying to see if it makes you happier or more sad.

MARTIN: Rey Junco, what's your reaction to the study - what was your reaction to it?

JUNCO: Well, my reaction was, oh, I have to take a look and see. But it's clear that there was no independent ethical review, so that at least wasn't thought of when planning this study. Cornell released a statement saying that they took a look, and what was told to them was that the researcher was using pre-existing data, so they didn't need to do a review. I mean, the data was already there. Facebook pointed to their internal review process, which clearly wasn't enough because they really didn't plan for what would happen if this adversely affected people. I think the thing that bugs me the most about it is that you're toying with people's psychological states, and that could have some pretty significant consequences. I mean, imagine if somebody who is depressed - who is going on Facebook to garner support from some friends - is exposed to a more negative newsfeed. They're probably more likely to be even more depressed because of that. So they really didn't consider the risk-reward balance, which you have to do ethically when you're planning a study: you have to purposely and thoroughly consider the risk inherent in the study and whether that risk is justified given what you might find. And then you also have to put in place protections for people who might experience some negative effects.

MARTIN: Adrienne, you contacted Susan Fiske, who edited the study before it was published in the Proceedings of the National Academy of Sciences. And presumably you asked her these questions about what ethical considerations were made before utilizing this technique. What did she say in response to these questions?

LAFRANCE: This is actually an area where some confusion emerged, because when I first talked to her, she described the process in which - when the paper first crossed her desk, it raised some concerns for her. And so she went back to the authors and asked them what kind of review process the paper had gone through. And her understanding was that it had been held to the scrutiny of a local institutional review board. And what that means is - institutional review boards, or IRBs as a lot of people call them, are boards that consider the ethics and sort of hold research to a standard, particularly when human subjects are involved. And so at first it seemed from my conversation with her as though there was this level of scrutiny. And then it later came out that no, a university IRB, or institutional review board, hadn't been used. And in fact, Facebook used an internal review process - the details of which are not public and don't appear to be (laughing) anywhere near being made public, given Facebook's reluctance to talk in detail about this.

MARTIN: So, Rey, is the issue for users that you think you're creating a newsfeed based on your interests, and then you find out that actually the person on the other end is deciding what you see based on what - a desire to affect your mood, right? And that's different from, say, advertising, where you turn on your favorite show and you realize that the advertisers are trying to get you to buy something. That's the point of advertising, right? Or to do something, which is the point of political advertising. And in this instance, you're being manipulated but you don't know why. Is that really kind of the issue here?

JUNCO: Yeah, I think - I think that's it. I mean, hey, we live in a very consumerist society. We're kind of used to this, right? We're kind of used to billboards and advertising pushing us to buy a product. But this isn't about buying a product, this is about messing with people's emotions. I mean, I don't want to blow this out of proportion - their effect sizes were small. However, that being said, the Facebook team had little way of knowing that the effect sizes were going to be small. So - so, you know, before the fact they should've planned this out. So I think people are justifiably upset because they are messing with your emotions without your permission. I mean, you know, who needs more drama in their life, right?

MARTIN: (Laughing) Unless you want it - unless you're seeking the drama. In which case...

JUNCO: Hey, if that's your thing...

MARTIN: ...You can turn on "Scandal." Adrienne, what's Facebook's response to this? As I mentioned it - when you talked to Susan Fiske, who edited the study, she said that this was really Facebook's issue to decide how they are allowing people to look at their data. So what did Facebook have to say about this?

LAFRANCE: Facebook has issued a couple of statements, including a public statement from one of the authors of the study who's a Facebook employee. But they haven't, at least to me - and I know several other reporters at other outlets - they haven't been willing to answer follow-up questions asking for more specifics. So they've really circled the wagons. One of the things that keeps sort of sticking out to me - it's a quote from the study that they found they were able to lead people to experience the same emotions without their awareness. So, sort of taking a step back from this specific study: to know that a company like Facebook, which doesn't necessarily have much interest in giving public disclosure about its intentions, is trying to lead people to experience emotions without their awareness - I think that raises much larger concerns, especially when they have access to data about everybody at levels that we don't fully understand.

MARTIN: Do you think that there will be consequences going forward now that this has come to light?

LAFRANCE: Well, I mean, Facebook's stock price was not affected (laughing), so it doesn't appear to be hurting the company in the immediate short term. You know, I'm hopeful that it will launch a larger conversation, and an overdue one, about data use and data privacy and the intersection of these companies that actually wield enormous power over information but don't adhere to ethics in the professional sense that we as journalists might think of, or even the federal regulations that academics are held to. So hopefully the conversation will continue. But, you know, it's hard to say if Facebook has any incentive to change.

MARTIN: Rey Junco, what about you? What - do you think there will be consequences as a result of this?

JUNCO: I - well, I hope so. I hope that Facebook and other technology companies will implement more stringent independent reviews of their research. I think it's important for these companies to understand the power they hold over their users. And when you are doing social science research, you should follow the same ethics as all other social science researchers - the ethics recommended by the federal government. And even ethics that are not recommended (laughing) by the federal government, just the basic ethical business of doing no harm.

MARTIN: Well, Rey, can I just push back on this for one second? Now, what would you say to those who argue that - first of all, users are giving their consent to these kinds of experiments when they sign up for Facebook and then they click "I agree" on the user agreement, which is very expansive. And other people who say, well, you know, maybe I as a journalist don't agree with this statement, but a lot of people feel that there are certain cable channels who are doing the same thing. In essence, they are stacking the story selection in an effort to achieve a certain response in their viewers, so what's the difference?

JUNCO: Well, I think - I certainly wouldn't suggest that we limit the ability of companies to analyze their data. I think where we need to draw the line is when there is a manipulation of something about the user - what they might feel, what they might experience, if there is a possibility of some negative effect. That needs to be thought through and there need to be some kind of protections in place.

MARTIN: Rey Junco is an associate professor in the School of Education at Iowa State University. He's a fellow at the Berkman Center for Internet and Society at Harvard. We go to him often to talk about matters related to the Internet and society and social media. Adrienne LaFrance is a senior associate editor at The Atlantic. Thank you both so much for speaking with us.

JUNCO: Thanks, Michel.

LAFRANCE: Thanks so much for having us.

Transcript provided by NPR, Copyright NPR.
