“For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.
“This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.”
Read more from ROBINSON MEYER at The Atlantic here: http://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/
Cenk Uygur (http://www.twitter.com/cenkuygur) and Ana Kasparian (http://www.twitter.com/anakasparian) break down this creepy Facebook study. Do you think the experiment was ethical? Are you comfortable with Facebook studying your status updates in this way? Tell us what you think in the comment section below.