
“Facebook reveals news feed experiment to control emotions” – The Guardian
“Facebook conducted secret psychology experiment on users’ emotions” – The Telegraph
“Facebook Manipulated 689,003 Users’ Emotions For ‘Creepy’ Secret Experiment” – The Huffington Post
We learnt yesterday that in 2012 Facebook, alongside academics from Cornell and the University of California, ran a psychological experiment manipulating some users’ timelines to see what would happen. The results showed that what people read had an impact on their own emotions. An influx of positive stories in a person’s timeline led to an increased likelihood that the user would post positive stories themselves, while an influx of negative stories led to a higher likelihood of negative posts.
According to the Guardian, the study concluded: “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”
Media uproar
Unsurprisingly, people were unhappy. In the UK, politicians called for an investigation into how “Facebook and others manipulate people’s thoughts in politics and other areas”. The head of Obama’s 2008 online campaign speculated that the “CIA (could) incite revolution in Sudan by pressuring Facebook to promote discontent.” Regardless of political persuasion, the media appeared horrified by the study’s implications. How could we allow a private corporation to get away with the blatant manipulation of people’s emotions? Should anybody be allowed that power? Was this the beginning of a dystopian society where thoughts were manipulated to devastating effect?
Before I go any further, I should explain that I’m not a fan of Facebook. It’s not that I don’t find their service a useful way to keep in contact with friends and family, especially as most of one side of my family lives abroad, but they have a very poor record on data privacy. They continually change their settings to allow people (and companies) access to your personal data by default, rather than as an option. This is not an accident. Your data is valuable, and Facebook is beholden to shareholders to make a return on their investment. Yes, users should expect to make some form of payment to use the Facebook service, and access to their data costs them nothing financially, but Facebook should make this clear from the beginning rather than sneaking it through the back door.
Still, in the case of this experiment I am siding slightly with Facebook.
Was it really so bad?
In one respect, yes. There is one thing Facebook did which was very wrong. As James Grimmelmann, professor of law at the University of Maryland, said on his blog, Facebook had failed to gain “informed consent” as defined by the US federal policy for the protection of human subjects. It is illegal to experiment on people without their consent. The press should take Facebook to task for this, not just because it is wrong, but because it is consistent with their behaviour in other areas, including privacy.
However, I am less concerned by the experiment itself. Why? Because as secret experiments go, this one gave us a valuable insight into something very important. Also, despite what I said above, it was overseen by a third party, and the results were published. Had they wanted to, Facebook could have run this test in-house, kept the information to themselves and used it for whatever purposes they liked. Instead they made it available to all. This should be applauded.
The other point is that the experiment did nothing more than what global media corporations do on a daily basis. They feed you a selection of information, portrayed in a certain way, to manipulate your emotions and opinions. What is the difference between a corporation experimenting on its users to understand whether it can manipulate emotions, and a media corporation selecting which news stories to promote, or which slant to put on them, to further the political and financial goals of its owner? Politicians, the very people who want to ‘control’ Facebook and others from manipulating emotions, have been happy for the media to do so on their behalf for years. Perhaps the hysteria is less about the experiment itself and more to do with the fact that people outside the cosy political / media establishment now have the ability to do what the establishment has been doing for years.
The reason I’m happy this experiment took place is that it may re-open the debate, now with the science to back it up, about just how much the public are being manipulated by the media – not with the purpose of controlling the media, but with the goal of educating the public not to take everything they read at face value.
Lessons for writers
There is also a lesson here for writers (and no, it’s not that it would make a good plot starting point). We too manipulate people’s thoughts and opinions, and with that power comes responsibility. I am not saying that we should only write happy, positive messages, but that we should be fully aware of the potential impact of what we write and the messages we convey, whether explicit or implied, before publishing our work.