By Aaisha Zafar Islam

photo credit: johnscotthaydon via photopin cc

This past month, reports of Facebook’s covert psychological experiment saw the social media giant make headlines and set off yet another ‘invasion of privacy’ furore.

Facebook, with more than a billion users worldwide, is touted as the third most populous state in the world, albeit a virtual one. It crossed that threshold in October 2012. In the same year, just over 689,000 of its users were subjected to a carefully orchestrated news feed as part of a study conducted by Facebook’s in-house Core Data Science team, Cornell University and the University of California, San Francisco.

Facebook uses a ‘ranking algorithm’ to filter the news feed content on its users’ homepages. For this study, the algorithm identified ‘positive’ and ‘negative’ content.
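According to the published paper, posts counted as positive or negative if they contained at least one emotion word from the LIWC word lists. The snippet below is a simplified, hypothetical sketch of that kind of word-count tagging; the word lists and function are invented for illustration and are not Facebook’s actual code.

```python
# Hypothetical sketch of word-count sentiment tagging, loosely modelled on the
# LIWC-style approach described in the published paper. The word lists here
# are made up for illustration; the real study used the LIWC2007 dictionaries
# inside Facebook's own infrastructure.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "lonely"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative' or 'neutral' by simple word counts."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

print(classify_post("So happy about the wonderful news!"))  # -> positive
```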

The ‘subjects’ of this study were divided into three groups. One was a control group, shown a random selection of their friends’ posts in their news feed. One group was shown more ‘happy’ content, while the other had more ‘sad’ posts on their homepages.
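As a rough sketch of that three-group design, the example below buckets users and filters what each group would see. The group names, probabilities and filtering rule are assumptions made for illustration; the real experiment worked by probabilistically omitting posts inside Facebook’s own news feed ranking system.

```python
import random

# Hypothetical sketch of the three-group design described above. Group names,
# the omission probability and the filtering rule are illustrative assumptions,
# not Facebook's implementation.

GROUPS = ("control", "happier_feed", "sadder_feed")

def assign_group(user_id: int) -> str:
    """Deterministically bucket a user into one of the three groups."""
    return GROUPS[user_id % len(GROUPS)]

def filter_feed(posts, group, omit_prob=0.5):
    """Return the feed a user in the given group would see.

    posts: list of (text, label) pairs, where label comes from a classifier
    such as the word-count sketch above.
    """
    kept = []
    for text, label in posts:
        # The 'happier' group has some negative posts withheld, and vice versa;
        # the control group sees everything in this simplified version.
        if group == "happier_feed" and label == "negative" and random.random() < omit_prob:
            continue
        if group == "sadder_feed" and label == "positive" and random.random() < omit_prob:
            continue
        kept.append(text)
    return kept
```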

The findings of this psychological experiment were presented as a report in the Proceedings of the National Academy of Sciences.

According to the researchers, this experiment ‘manipulated the extent to which people were exposed to emotional expressions in their news feed.’

These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.

Previous studies have examined how social media affects our moods in real life. Two of them, one conducted in the US and another in Germany around the same time as Facebook’s experiment, noted that seeing more self-promotional content on friends’ Facebook profiles left users feeling unsatisfied with their own lives. The German research identified this as the ‘self-promotion-envy spiral.’

These studies, however, differed in method from Facebook’s experiment: both gathered data from volunteers through questionnaires and other statistical analysis tools. The ‘subjects,’ if they can be labelled so, knew of the experiments and were willing participants in the studies, unlike those on Facebook.

In fact, Susan Fiske, a psychologist at Princeton University who reviewed and edited the Facebook research, was reportedly ‘creeped out’ by its nature.

Talking to The Atlantic, Fiske said that she was concerned until she queried the authors, who told her their local institutional review board had approved the study, apparently on the grounds that Facebook manipulates people’s News Feeds all the time.

photo credit: M i x y via photopin cc

At the heart of this controversial episode is how Facebook manipulates users, not just as part of a scientific study, but in general.

A report by The Wall Street Journal cited Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. He told the WSJ:

There’s no review process, per se. Anyone on that team could run a test. They’re always trying to alter people’s behavior.

It is interesting to note that Facebook’s Data Science group has had free rein all this while.

How can they foist such studies onto users? Facebook’s Terms of Service is a long, wordy document that no one bothers much with, yet almost everyone clicks their consent to it. A clause in these terms states that users’ data ‘could be used to improve Facebook’s products.’ The terms have since been updated to state that user data may be used for research.

John Gapper, writing for the Financial Times, has suggested that we, the users, are the product Facebook has been testing:

Facebook has made no secret of the fact that its news feed is a manipulated version of reality. It selects the posts and links to display prominently that it has found through testing are the most likely to interest users, and encourage them to return and post themselves. These tests are not sinister experiments; they are product development.

All true, but there is one big difference: we are the product that Facebook has been testing. Perhaps we should grow up and accept that this is how the world works when we use an advertising-funded social network stuffed with details of our lives and those of family and friends. But if we find it creepy, that is what the experiment proves.

Concluding their research, Adam Kramer of Facebook’s Core Data Science team, Jamie Guillory, a postdoctoral fellow at the University of California, San Francisco, and Jeffrey Hancock, a professor at Cornell University, wrote:

Given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences. Online messages influence our experience of emotions, which may affect a variety of offline behaviours.

At the end of the day, we do realize that by posting and sharing our lives online, we are subjecting ourselves to an ongoing social experiment. And while Facebook might have sketchy ‘legal’ grounds to validate its study, ethical boundaries were crossed. No one gets to play God and mess with people’s emotions.