Facebook Performs Psychology Experiment On Users Without Informed Consent: Facebook Responds To Public Concerns, Claiming Terms And Conditions Equal Consent To Be Part Of Experiment

Facebook has gotten into some hot water recently after it published a paper revealing that it performed a psychology experiment on users without their informed consent.

Scientists at Facebook have published a paper showing that they manipulated the news feed content of more than 600,000 users to determine whether this would affect their emotional states. The paper, "Experimental evidence of massive-scale emotional contagion through social networks," was published in the Proceedings of the National Academy of Sciences.

The paper shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users' news feeds. The researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the subsequent postings of those users over the course of a week to see whether people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. As it turns out, showing people negative content puts them in a negative mood. Who would have thought? Oh, and Facebook can now control your emotions, so there's that.
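To make the mechanics concrete, here is a minimal sketch of how such a filter could work. The study classified posts using the LIWC word lists; the toy word lists, sample posts, and omission probability below are illustrative stand-ins, not Facebook's actual code.

```python
import random

# Toy stand-ins for the LIWC word lists used to label posts (illustrative only).
POSITIVE_WORDS = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "angry"}

def polarity(post: str) -> str:
    """Label a post by simple word-list lookup (a crude proxy for LIWC)."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress: str, omit_prob: float = 0.5, seed: int = 0):
    """Return one rendering of the feed with posts of one polarity
    probabilistically omitted. As in the study, posts are only filtered
    from this rendering of the feed, not deleted."""
    rng = random.Random(seed)
    return [p for p in posts
            if polarity(p) != suppress or rng.random() >= omit_prob]

feed = ["I love this awesome day", "What a terrible commute",
        "Meeting at noon", "So sad about the news"]
# With a high omission probability, negative posts are likely dropped.
print(filter_feed(feed, suppress="negative", omit_prob=0.9))
```

The researchers would then run the same word-counting step over each user's own subsequent posts to measure whether the skewed feed shifted what people wrote.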

Facebook has covered its ethical behind by saying the experiment was not unethical because users agreed to the Facebook Data Use Policy, which users are required to accept (and, in practice, not read) in order to join the site, and which allows the company to access and use the information users post. Somehow, by agreeing to this, you consent to being experimented on.

Facebook data scientist Adam Kramer is listed as the study's lead author. Kramer is quoted as saying he joined Facebook because "Facebook data constitutes the largest field study in the history of the world."

Kramer also wrote a response arguing that the experiment was totally not a violation of public trust and the ethical codes of experimentation. He reminds us that over the course of the week the experiment was conducted, no posts were actually hidden from users; they were only filtered out of certain renderings of the news feed by the algorithms that already choose what to display whenever the feed is loaded. Kramer also downplayed the project's effect, saying, "The result was that people produced an average of one fewer emotional word, per thousand words." The study itself, however, concluded that this effect was statistically significant.
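For a sense of what that rate means, here is a small back-of-the-envelope calculation; the word count and baseline rate are hypothetical, chosen only to illustrate the scale of a one-per-thousand shift.

```python
# Hypothetical numbers illustrating "one fewer emotional word per thousand words".
words_written = 10_000   # assumed total words a user posts during the week
baseline_rate = 47.0     # assumed emotional words per 1,000 words before the tweak
shifted_rate = baseline_rate - 1.0  # the reported shift

baseline_emotional = words_written * baseline_rate / 1000  # 470.0
shifted_emotional = words_written * shifted_rate / 1000    # 460.0
print(baseline_emotional - shifted_emotional)  # 10.0 fewer emotional words
```

Tiny per user, but averaged across hundreds of thousands of users, a shift this small is easily detectable, which is why the paper could report it as statistically significant.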

Kramer also issued an apology, saying, "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
