Facebook faces criticism for conducting psychological experiment on users

Facebook is under fire after it was revealed that the company secretly altered the news feeds of nearly 700,000 members as part of a psychology experiment on how negative and positive news moves through social networks and affects users.

The news adds to the growing concerns about social media websites and how they might infringe on privacy.

The study was recently published in the Proceedings of the National Academy of Sciences, according to Verge.com. A summary of the study from The Guardian:

Facebook filtered users' news feeds - the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened. The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."

The Guardian reports that lawyers, politicians and Internet activists are calling the experiment "scandalous," "spooky" and "disturbing":

Jim Sheridan, a member of the Commons media select committee, said the experiment was intrusive. "This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he said. "They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."

Adam D.I. Kramer, a co-author of the study, posted an explanation on his Facebook page Sunday:

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

Researchers did point out that this kind of data manipulation is permitted under the Facebook user agreement, the Verge reports:

"Based on what Facebook does with their newsfeed all of the time and based on what we've agreed to by joining Facebook, this study really isn't that out of the ordinary. The results are not even that alarming or exciting."

