Facebook performed secret psychological experiments on users, intentionally making them sad

Users were primed with posts depicting specific emotions

Facebook sneaky, sneaky
News broke this weekend that Facebook secretly performed psychological experiments on 689,000 people by altering their news feeds in hopes of inciting specific emotions. The news went viral after the Proceedings of the National Academy of Sciences published the paper detailing the study's results this Sunday.

Working in collaboration with Cornell University and the University of California at San Francisco, Facebook sought to gauge whether exposure to specific emotions led people to change their posting behavior. It tested whether increasing a user's exposure to negative posts (by reducing the number of positive news feed posts) increased the user's likelihood of posting something negative. The same was tested for the reverse scenario using positive posts.

Posts were classified using software that scanned snippets of text for positive and negative words. Participants were then randomly fed either a more negative or a more positive mix of information taken from their friends' posts. Afterwards, the participants' own posts were evaluated to measure their tone.
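The word-counting approach described above can be sketched in a few lines. This is a minimal illustration, not the study's actual software; the word lists and function name here are hypothetical stand-ins for the lexicon the researchers used.

```python
# Hypothetical word lists standing in for the real lexicon used in the study.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}

def classify_post(text):
    """Label a post 'positive', 'negative', or 'neutral' by counting
    emotion words found in the text, as the study's software reportedly did."""
    words = [w.strip(".,!?'\"") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("Feeling so happy and excited today!"))  # positive
print(classify_post("This week has been awful and sad."))    # negative
```

A real classifier of this kind simply tallies hits against much larger curated word lists; no machine learning is required, which is part of why critics questioned how much emotional nuance it could capture.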

The results? 
Well, the results were unsettling, to say the least. Facebook deliberately made thousands of people sad and demonstrated that social networks can instill both positive and negative feelings in their users.

Public reaction
Not surprisingly, thousands of users took to Twitter with pitchforks, calling the experiment unethical and dangerous. User Kate Crawford tweeted, “Let's call the Facebook experiment what it is: a symptom of a much wider failure to think about ethics, power and consent on platforms.” Another user, Lauren Weinstein, posted: “Facebook secretly experiments on users to try and make them sad. What could go wrong?”

Psychologists, on the other hand, are not the slightest bit surprised. “Based on what Facebook does with their news feed all of the time and based on what we've agreed to by joining Facebook, this study really isn't that out of the ordinary,” explained Katherine Sledge Moore, a psychology professor at Elmhurst College in Elmhurst, IL.

Did the research go too far? 
James Grimmelmann, professor of technology and the law at the University of Maryland, says that this kind of research definitely requires some form of informed consent, considering it encroaches on criteria laid out in federal law and human rights declarations. The sole “informed consent” cited in the paper was that the experiment was “consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” Notably, this is not how informed consent is defined by social scientists; the policy makes no mention of negative repercussions whatsoever.

The Proceedings of the National Academy of Sciences dictates specific ethical guidelines for all authors who wish to publish their studies in the journal, yet oddly enough, Facebook did not abide by these regulations. The rule reads as follows: “Authors must include in the Methods section a brief statement identifying the institutional and/or licensing committee approving the experiments.” Facebook did not spell this out in any way, shape, or form, yet it still managed to publish its findings. Another PNAS requirement states that studies must abide by the Declaration of Helsinki, a mandate requiring that all human subjects be adequately informed of the aims, methods, benefits, and risks or discomforts associated with the study.

So what’s the takeaway from all this? Users beware. We’ve already seen that social media companies sell personal information to marketers, so use social media with caution.

Via BBC, Slate, DiscoveryNews
