
Facebook's secret user manipulation experiment

Jul 7, 2014, 10:18am   #1
Thread Starter
bag-mania
Member
With apology, Facebook tries to defuse growing backlash


Quote:
Did Facebook go too far this time?

Facebook sought to defuse a sharp backlash against the giant social network on Wednesday by publicly apologizing for running a psychology experiment on hundreds of thousands of people without their knowledge or consent.

Facebook's No. 2 executive, Chief Operating Officer Sheryl Sandberg, said the company communicated "poorly" about the experiment, which tested whether Facebook could manipulate users' emotions.

Her mea culpa came as British regulators said they had begun investigating the Facebook experiment.

Nearly a week after a report about the experiment appeared in the New Scientist magazine, the torrent of outrage shows no signs of abating. Protests have quickly spread on Facebook and social media.

"We are not experimental rats in a laboratory. What gives them the right to run experiments without our knowledge?" said Kiley Smith, a 31-year-old blogger and a daily Facebook user from Fairfax, Va. "I believe that is a personal invasion of privacy. They have definitely overstepped the bounds there."

In the week-long experiment, nearly 700,000 users were exposed either to positive or negative posts to see if the feelings would spread on the social network.

"The experiment manipulated the extent to which people were exposed to emotional expressions in their news feed," researchers wrote in their report.

"These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."

The research conducted with two universities was published in March.

"Given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences," the researchers concluded. "Online messages influence our experience of emotions, which may affect a variety of offline behaviours."

Facebook says it conducts this type of research to improve its service. It also says none of the information used was associated with a specific individual's Facebook account.

"This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated," Sandberg said of the experiment, which was conducted in 2012. "And for that communication we apologize. We never meant to upset you."

But that has not mollified many users who say they were not aware that Facebook experimented on them or just how much power Facebook had to monitor and influence their behavior.

Denise Dorman, a 50-year-old writer, producer and digital content strategist from Carpentersville, Ill., says the experiment was "Orwellian."

"What's next? Will my bank lose my account balance for a day to test my reaction? Will big pharma start doling out placebos to unaware control groups for less serious maladies?" Dorman said.

Karl Volkman, chief technology officer of SRV Network in Chicago, says Facebook may not be able to shake its new public image as a digital age "Big Brother."

"If Facebook hasn't already crossed the line, they are getting very close," Volkman said.
Feeding the furor: the Facebook news feed — and what appears in it — was already a hotly contested subject.

People have strong feelings about the news feed. It's the main way they keep up with friends and family on Facebook.

Facebook filters the content in the news feed, and that has given rise to regular complaints that people do not see the updates they want to see.

Janet Cash, a 40-year-old middle school teacher from Davenport, Fla., says she relies on Facebook to stay in touch with far-flung family and friends and with subscribers to her newsletter. She says she has grown increasingly troubled that Facebook's computer algorithm decides what she sees in her news feed.

"I don't like Facebook manipulating my relationships," she said.

Grover Welch, a 42-year-old high school English teacher from Jonesboro, Ark., says he thinks it's more than that.
The revelation that Facebook was running tests on unsuspecting users has gotten many people to realize for the first time just how much power the giant social network has amassed, he said.

With nearly 1.3 billion users, Facebook has established itself as one of the primary means of communication and social interaction around the globe.
"People feel like they own what they put on Facebook, but they don't," Welch said. "We are investing all of our personal time into something that does not belong to us. This is the first time we have gotten a real glimpse of that."

And that — unlike previous flaps over privacy — has captured people's attention and shaken their faith in the service, Welch said.

"Finally people are seeing through the ruse," he said. "Facebook bills itself as being so user friendly, but they are not in it for us. This is a company, and we need to treat it like we do every other company. Everybody wants to feel like Facebook serves them, but it doesn't. Facebook serves Facebook and its investors. It has only its own self interest at heart."
http://www.usatoday.com/story/tech/2...lash/12078327/
Jul 7, 2014, 12:00pm   #2
boxermom
Member
I know in their fine print they say they can do this, but it goes against all guidelines for ethical studies. I'm surprised anyone published it considering the subjects in the study didn't give express permission as they must for any study I've ever heard of. I was part of an educational study and am currently part of a clinical trial for a heart valve. The amount of paperwork I had to read, have explained and had to sign was significant before the study/trial could proceed.

Facebook went too far and an apology doesn't cut it with me. They are not research scientists or educators and need to leave it to the professionals who follow the rules.
Jul 7, 2014, 12:27pm   #3
Thread Starter
bag-mania
Member
Any time you see something odd in your Facebook news feed you have to wonder "why is that here?"

Their halfhearted apology was entirely to get the media off their back so the controversy will die down. You will notice that in the apology they never said they would not do it again. That's because they have been altering users' news feeds to manipulate them all along, and they are still doing it right now. We will never know the true extent of it, because I doubt any more of the results will be published in scientific journals.
Jul 7, 2014, 2:05pm   #4
HauteMama
.
And there was a temporary flap about privacy rights once, too, and that died down without much damage. I expect the same here. People are addicted to FB, and most people don't care what FB is doing as long as they can spend their free time there. If people cared about their rights or their privacy, they wouldn't have anything to do with FB, but it is clear that they DON'T care. I doubt this will have any lasting effects.
Jul 7, 2014, 2:33pm   #5
Thread Starter
bag-mania
Member
Originally Posted by HauteMama
And there was a temporary flap about privacy rights once, too, and that died down without much damage. I expect the same here. People are addicted to FB, and most people don't care what FB is doing as long as they can spend their free time there. If people cared about their rights or their privacy, they wouldn't have anything to do with FB, but it is clear that they DON'T care. I doubt this will have any lasting effects.
You're probably right. By now most people take it for granted that they have no privacy on Facebook. But I don't think many people really considered the potential for manipulation. Imagine your news feed is suddenly bombarded with stories about a political candidate or a particular social agenda. Not because one of your Facebook friends is on a rant, but because someone paid Facebook to promote it (or perhaps the executives at Facebook wanted to promote it).

This positive vs. negative experiment was trying to figure out how much influence they could have. All the better to determine how much to charge advertisers and others for their services.

Facebook is a powerful tool and the potential for abuse is high.
Jul 7, 2014, 3:02pm   #6
HauteMama
.
Originally Posted by bag-mania
You're probably right. By now most people take it for granted that they have no privacy on Facebook. But I don't think many people really considered the potential for manipulation. Imagine your news feed is suddenly bombarded with stories about a political candidate or a particular social agenda. Not because one of your Facebook friends is on a rant, but because someone paid Facebook to promote it (or perhaps the executives at Facebook wanted to promote it).

This positive vs. negative experiment was trying to figure out how much influence they could have. All the better to determine how much to charge advertisers and others for their services.

Facebook is a powerful tool and the potential for abuse is high.
I agree, especially with your last line. There is no question that it is a very powerful vehicle for potentially all sorts of things. However, you know that and I know that, and presumably everyone else knows that, too. Indeed, it is too powerful a tool NOT to use, and I doubt that any similar site with as many users would not engage in something like this (or worse).

When a person "signs" or okays the box after the pages of information outlining what FB can and cannot do, they attest that they are fine with FB's practices. Just because they didn't think FB would do that doesn't mean they can't, and it doesn't mean they won't continue to do so. It just means that people are mostly fine with it. Until there is a viable alternative for staying plugged in that is as easy to use as FB, there will be no mass exodus away from the site. There are a few people screaming loudly about this, but for the most part people neither really understand nor care about what happened.

People CAN affect what FB does, but the only way to do that is for many people to stop using it. That's the only response FB cares about because it will affect their advertising revenue. And it is pretty clear that people are not yet willing to do that.
Jul 7, 2014, 3:12pm   #7
boxermom
Member
bag-mania and HauteMama, you both are so right about people possibly expressing outrage but in the end they keep using FB. I recall learning that a value (for example, believing in the right to privacy) is something you will act on to protect; forget about what we say---what do we DO? For myself, I've had situations that challenge how strong my values really are. Sometimes I'm too lazy or selfish to actually stop shopping or eating someplace even though I disagree with the other party. So will I fight for that value or just talk about it? It's a mixed bag, as with most people.

I'm still disgusted with Zuckerberg and his FB minions, though.
Jul 7, 2014, 5:26pm   #8
limom
Member
Were the names of the universities revealed?
Jul 7, 2014, 5:29pm   #9
Echoes
NoWhere Atoll
Originally Posted by boxermom
I know in their fine print they say they can do this, but it goes against all guidelines for ethical studies.
There are no ethics involved in this 'company' anywhere.

I'll never understand why anyone uses them.
Jul 7, 2014, 5:43pm   #10
Thread Starter
bag-mania
Member
Originally Posted by limom
Were the names of the Universities revealed?
I believe they were Cornell and Princeton, based on their being named in this article.

Quote:
Cornell ethics board did not pre-approve Facebook mood manipulation study

Facebook’s controversial study that manipulated users’ newsfeeds was not pre-approved by Cornell University’s ethics board, and Facebook may not have had “implied” user permission to conduct the study as researchers previously claimed.

In the study, researchers at Facebook tweaked what hundreds of thousands of users saw in their news feeds, skewing content to be more positive or negative than normal in an attempt to manipulate their mood. Then they checked users’ status updates to see if the content affected what they wrote.

They found that, yes, Facebook users’ moods are affected by what they see in their news feeds. Users who saw more negative posts would write more negative things on their own walls, and likewise for positive posts.

Ethics board consulted after the fact
As reported by The Post and other news outlets, Princeton University psychology professor Susan Fiske told the Atlantic that an independent ethics committee, Cornell University’s Institutional Review Board (IRB), had approved use of Facebook’s “pre-existing data set” in the experiment. Fiske edited the study, which was published in the June 17 issue of Proceedings of the National Academy of Sciences.

A statement issued Monday by Cornell University clarified the experiment was conducted before the IRB was consulted. A Cornell professor, Jeffrey Hancock, and doctoral student Jamie Guillory worked with Facebook on the study, but the university made a point of distancing itself from the research. Its statement said:
Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” published online June 2 in Proceedings of the National Academy of Science-Social Science.

Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.
User consent called into question
Facebook researchers claimed the fine print users agreed to when they signed up was tantamount to “informed consent” to participate in the study. Facebook’s current data use policy says user information can be used for “internal operations” including “research.” However, that’s not what it said in 2012 when the study was conducted. According to Forbes:
In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that ‘research’ is something that might happen on the platform.

Four months after the study, in May 2012, Facebook made changes to its data use policy, and that’s when it introduced this line about how it might use your information: ‘For internal operations, including troubleshooting, data analysis, testing, research and service improvement.’ Facebook helpfully posted a ‘red-line’ version of the new policy, contrasting it with the prior version from September 2011 — which did not mention anything about user information being used in ‘research.’
“When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer,” a Facebook spokesman told Forbes. “To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word ‘research’ or not.”

This revelation will likely further rile critics already angered that Facebook fell short of the standards imposed by the government and professional associations for informed consent in studies conducted on humans. Informed consent involves disclosing information about the study before it takes place and giving subjects a chance to opt out – and Facebook did neither. Since Facebook is a private company, it isn’t held to those standards, according to legal experts interviewed by the International Business Times, but that hasn’t stopped some from feeling violated and angry.
http://www.washingtonpost.com/news/m...n-you-thought/
Jul 7, 2014, 5:46pm   #11
limom
Member
Originally Posted by bag-mania
I believe they were Cornell and Princeton, based on their being part of this article.
Thanks, The Atlantic seems to concur.
http://www.theatlantic.com/technolog...riment/373648/
Jul 7, 2014, 6:43pm   #12
boxermom
Member
So much for thinking universities are following educational study guidelines--rules that any beginning graduate student should know. I wasn't going into research but was required to take a class that covered issues like these so we could distinguish between valid studies and those that aren't well-designed. No excuse, IMO, for Princeton or Cornell.
Jul 7, 2014, 7:09pm   #13
*schmoo*
Member
Quote:
Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” published online June 2 in Proceedings of the National Academy of Science-Social Science.

Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.
Hmm, wonder how other researchers would feel about this.