Facebook Deserves Retribution For Psychological Tests On Humans

For once, Facebook is not in the line of fire for violating user privacy. Instead, it finds itself in an even bigger, hotter mess: 689,003 users were subjected to psychological testing by the social network, without their knowledge or consent. And that's not the worst part of it.

A “scientific” study was commissioned by Facebook and carried out by academics from Cornell University and the University of California. The study, titled “Experimental evidence of massive-scale emotional contagion through social networks”, filtered content in test subjects’ news feeds to measure their reactions to more negative and more positive stories.

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.”

In other words, when Facebook toyed with its users’ emotions by showing them more positive emotional posts, those users tended to post happier thoughts. On the flip side, when Facebook showed more negative emotion in posts, comments, videos, pictures and links, those human test subjects felt negative emotions – through a method of transference they call “emotional contagion” – and as a result they tended to post more negative content themselves.

Scientifically, and from a business point of view, that is great information to have. Ethically, though, the methods used were questionable at best:

  1. Subjects were chosen at random, with no regard for their mental well-being. If an already depressed person was selected and shown negative content, they could potentially have become suicidal. Did Facebook care? Apparently not.
  2. Scientific experimentation on human beings requires researchers to obtain informed consent from their test subjects beforehand. Did Facebook obtain permission from any of its users before using them as human guinea pigs? No.
  3. This study was limited to toying with simple emotions. What is to stop Facebook, in future, from subliminally influencing buying behaviour for the right price? Or voter behaviour for the right political party? Not much!

This time, Facebook really should be punished. Google was fined by European courts for merely snooping on private Wi-Fi networks. What Facebook did is in many ways worse than the practices Monsanto regularly comes under fire for nowadays.

The motivations behind such a study are fairly transparent, too. Facebook’s advertising drives more than $2 billion of revenue per quarter. At that rate, it is well worth the company’s while to demonstrate to advertisers the profound impact their ads might have on the consumer psyche.

Television advertising is regulated in the USA, the UK, Europe and Asia. Regulation for Internet advertising, especially at new frontiers such as this, is made up as we go, and much rests on the shoulders of giants such as Facebook, Google, LinkedIn, Yahoo! and Twitter to ensure that the practices they follow are ethical. However, when one of these corporations shamelessly crosses the line, shouldn’t it be stopped, even though the laws to do so might not currently exist?