Facebook turns 700,000 users into lab rats for a psychology experiment

Gaze not overlong into the Facebook abyss, for it gazes also into you.  And messes with your timeline, to manipulate your emotional state and monitor the results.  

One of my all-time favorite "Dilbert" cartoons highlighted the danger of not reading every last word in those impenetrable blocks of text known as "end-user agreements":

Facebook apparently took this as an inspiration, and slipped a little something-something into their Terms of Service that would permit them to monkey around with the news feeds of 700,000 users over the course of one week in January 2012.  Basically, they changed the mix of positive and negative posts the selected users were seeing, then studied the behavior of the unwitting lab-rat users, and concluded that people who saw a lot of bad news tended to write more negative Facebook posts, while people fed a heavier diet of positive stories wrote more positive posts.

(Actually, according to the New York Times, Facebook conducted this experiment before altering the Terms of Service, then made the alteration so they would be empowered to conduct more such experiments in the future.  A group called the Electronic Privacy Information Center is suing Facebook, which contends that its old policies gave it all the authorization it needed to conduct the mood-swing study.  The privacy group counters that Facebook's earlier Terms of Service "did not tell users that their personal information could be used for research purposes," and that the company "failed to inform users that their personal information would be shared with researchers.")

In addition to anger from user groups over the clandestine study, Facebook could be looking at some heat from government regulators in Ireland and Britain, as well as the United States.  The company has been apologizing all over the place, not always in ways that unhappy users find reassuring.  The UK Daily Mail quotes Facebook data scientist Adam Kramer explaining that "the reason we did this research is because we care about the emotional impact of Facebook and the people that use our product... we felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out."

"It's clear that people were upset by this study and we take responsibility for it," said Facebook's European policy director, Richard Allen.  "We want to do better in the future and are improving our process based on this feedback.  The study was done with appropriate protections for people's information, and we are happy to answer any questions regulators might have."

COO Sheryl Sandberg, the second-highest-ranking Facebook exec, probably took the biggest misstep by trying to assure irate users the whole mess was just a result of poor communication.  "This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated," Sandberg said.  "And for that communication we apologize. We never meant to upset you."

Personally, I've never been able to get an angry person to calm down by patronizing them.  The people upset by this caper aren't angry because Facebook failed to communicate clearly; they're angry because they were turned into subjects in a psychological experiment without their consent or knowledge.  The phrase you hear over and over again in forums is "betrayal of trust."  It is no longer safe for users to assume that Facebook is accurately representing what they and their friends enter into the system.

It doesn't matter that the January 2012 manipulation was subtle, another defense the company has been floating.  Facebook was one of the Internet's great success stories, and that success was based on trust.  Now users have reason to suspect they could be manipulated for the edification of Facebook data analysts, or its corporate clients, or (according to one theory) military psy-ops researchers.  If enough people get upset about this, or regulators on either side of the Atlantic make the proverbial federal case out of it, it might be difficult for Facebook to ever regain that level of trust.  Sandberg treating users like a bunch of upset children isn't going to help matters.  Speaking of which... were any children used as subjects in this experiment?

Facebook could easily have avoided all this trouble by offering some modest incentive to those willing to take part in such experiments.  Obviously the experiment doesn't work if people know exactly what's going on, but if the company tossed out some electronic goodies in exchange for permission to make subtle changes to the user experience without notice and monitor the results, it would probably get plenty of takers.  Volunteers have different parameters of trust than people who get unwittingly drafted into experiments.

 
