Was the Facebook experiment really wrong?

“Facebook reveals news feed experiment to control emotions” – The Guardian

“Facebook conducted secret psychology experiment on users’ emotions” – The Telegraph

“Facebook Manipulated 689,003 Users’ Emotions For ‘Creepy’ Secret Experiment” – The Huffington Post

We learnt yesterday that in 2012 Facebook, alongside academics from Cornell and the University of California, ran a psychological experiment manipulating some users’ timelines to see what would happen. The results showed that what people read had an impact on their own emotions: an influx of positive stories in a person’s timeline made the user more likely to post positive stories themselves, while an influx of negative stories led to a higher likelihood of negative posts.

According to the Guardian, the study concluded: “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”

Media uproar

Unsurprisingly, people were unhappy. In the UK, politicians called for an investigation into how “Facebook and others manipulate people’s thoughts in politics and other areas”. The head of Obama’s 2008 online campaign speculated that the “CIA (could) incite revolution in Sudan by pressuring Facebook to promote discontent.” Regardless of political persuasion, the media appeared horrified by the study’s implications. How could we allow a private corporation to get away with the blatant manipulation of people’s emotions? Should anybody be allowed that power? Was this the beginning of a dystopian society where thoughts were manipulated to devastating effect?

Before I go any further, I should explain that I’m not a fan of Facebook. I do find their service a useful way to keep in contact with friends and family, especially as most of one side of my family live abroad, but they have a very poor record on data privacy. They continually change their settings so that people (and companies) have access to your personal data by default, rather than as an option. This is not an accident. Your data is valuable, and Facebook is beholden to its shareholders to make a return on their investment. Yes, users should expect to make some form of payment to use the Facebook service, and access to their data costs them nothing financially, but Facebook should make this clear from the beginning rather than sneaking it through the back door.

Still, in the case of this experiment I am siding slightly with Facebook.

Was it really so bad?

In one respect, yes. There is one thing Facebook did which was very wrong. As James Grimmelmann, professor of law at the University of Maryland, said on his blog, Facebook failed to gain “informed consent” as defined by the US federal policy for the protection of human subjects. It is illegal to experiment on people without their consent. The press should take Facebook to task for this, not just because it is wrong, but because it is consistent with their behaviour in other areas, including privacy.

However, I am less concerned by the experiment itself. Why? Because as secret experiments go, this one gave us a valuable insight into something very important. Also, despite what I said above, it was being overseen by a third party, and the results were published. If they had wanted to, Facebook could have run this test in-house, kept the information to themselves and used it for whatever purposes they liked. Instead they made it available to all. This should be applauded.

The other point is that the experiment did nothing more than what global media corporations do on a daily basis. They feed you a selection of information, portrayed in a certain way, to manipulate your emotions and opinions. What is the difference between a corporation experimenting on its users to see whether it can manipulate their emotions and a media corporation selecting which news stories to promote, or which slant to put on them, to further the political and financial goals of its owner? Politicians, the very people who now want to stop Facebook and others from manipulating emotions, have been happy for the media to do so on their behalf for years. Perhaps the hysteria is less about the experiment itself and more to do with the fact that people outside the cosy political / media establishment now have the ability to do what the establishment has been doing for years.

The reason I’m happy this experiment took place is that perhaps it will re-open the debate, now with the science to back it up, about just how much the public are being manipulated by the media – not with the purpose of controlling the media, but with the goal of educating the public to not necessarily take everything they read at face value.

Lessons for writers

There is also a lesson here for writers (and no, it’s not that this would make a good starting point for a plot). We too manipulate people’s thoughts and opinions, and with that power comes responsibility. I am not saying that we should only write happy, positive messages, but that we should be fully aware of the potential impact of what we write and the messages we convey, whether explicit or implied, before publishing our work.

 


12 thoughts on “Was the Facebook experiment really wrong?”

  1. This experiment seems to have been a multi-source, constant-drip method. So – while I agree authors should consider the impact of their books – I don’t think a single book will have the same impact, so an author’s responsibility is not much larger than a non-author’s.

    • I understand your point, Dave, but I’m not so sure we writers can get off so lightly. We may not be multi-source, but given most people’s reading habits we are a constant drip, plus we also tend to hold a reader’s attention for longer and arguably at a deeper level.

  2. A few thoughts float up here, Dylan. To manipulate emotions is unkind and disrespectful, and the end does not justify the means. It feeds off the premise that watching and recording how people react, through reality television and what passes for entertainment these days, is justifiable. It is not. It fosters unkindness, not love for our fellow man. On your last point, we choose whether to pick up a book or not and then make further choices on whether we continue to read it. Our responsibility as writers is to be authentic and to write from the heart.

    • Hi Jane, first of all thank you for your comments. I may be talking semantics here, but I feel that as a writer I do manipulate my readers to a certain extent. I create characters to like and characters to loathe. If I write well, my characters will change; those that were liked earlier may, through their actions, become less likeable. All of these come from conscious choices I make as a writer. My point is that we need to take care that there are consequences for what our characters do. For example, if a character solves a problem through violence (which happens a lot in literature, film and television), what are the consequences? Are we saying violence is a good thing, a bad thing, or any of the variables that fit in between? Or if we have characters playing roles that support gender stereotypes, are we happy that that is the case? While each of our actions as a writer may be small, it can support a larger cultural narrative that we need to be aware of and consciously decide whether or not to support. I agree totally with your last point, but we also need to understand with our heads.

      • Yes, I’m with you Dylan. It’s the logical thinking, but also the ability to reflect on where our actions might lead. The characters we create and write about are but reflections of our perspectives on the world and how we live. We evolve by thinking, reflecting with our heads and always acting with heart…I think! 🙂

  3. Good post, and very thought provoking. My response to Facebook’s actions was initially negative, as I don’t like Facebook for exactly the same reasons as you. But then I thought, what has this ‘experiment’ revealed? That people are sensitive/empathetic to the emotions expressed by their friends and associates. And that’s supposed to be a revelation? Really? I would call it a waste of time. I thought our levels of empathy and sensitivity are what put us (humans) above the vast majority of the animal kingdom in evolutionary terms. The fact that a ‘study’ of this comes as any kind of a surprise is the surprising thing.

    • I think the difference here is between believing and knowing. What the test did was provide evidence to support our belief that you can manipulate people’s emotions this way. My surprise is that the media are either oblivious to, or actively avoiding, the implications for their own behaviour.

  4. Informed consent is certainly a key issue here when it comes to published works. The back cover blurb, the categories (Horror, Fantasy, Crime) that label books, the sections in libraries and bookshops (Real Life Stories, Westerns, SF) tend to be guides to our tastes, and it’s rare that someone would pick up an explicit novel under the impression it was a light read.

    But the news media, including social media, are in a particular position of trust: they won’t tell lies, they won’t omit relevant facts, they won’t disguise opinion as accurate unbiased reporting. And yet they do, all the time. And we deliberately go along with it. You can’t protect people from themselves.

    Soapbox moment over. Great post. Fine discussion after too.

  5. I think this might not be the most popular approach, but I fully agree that there is one positive aspect about disclosing the experiment: showing how much the public is being manipulated.

    The funniest part is that the study didn’t technically go much further than what Facebook or Google do every day. It’s just that now people have fully realized how the web works, and its dark secrets.

    But the speculation about Facebook pushing for a revolution in Sudan seems a little bit too far-fetched. Was its author being serious?
