Over the last couple of weeks the internet has reacted with varying emotions to Facebook carrying out an experiment on its users. For those of you who missed the furore, Facebook ran an experiment, during a week in January 2012, on 700,000 English-speaking users to test whether their emotions were influenced by what they saw on Facebook.
The aim was to identify whether seeing positive content would lead to FOMO (fear of missing out!) or negative feelings in users who weren't involved. Those negative feelings could lead to users spending less time on Facebook, and needless to say, Facebook does not want that.
You can read a more thorough explanation of the hypothesis and results below: one of the researchers, having been harangued and abused across the internet, felt the urge to share his thoughts and feelings about the experiment.
When Sheryl Sandberg was asked about the experiment, she shrugged and said that it was simply poorly communicated. I guess she’s right: it *was* poorly communicated to those who feel they need explanations for events like this.
When I first heard about the experiment, my initial reaction was, “oh, Facebook has done an experiment.” I wasn’t fazed; I was naturally interested in the results, but didn’t feel violated, shocked or offended. When a member of the team here posted his rage about the experiment on, where else, Facebook – a lively debate broke out in the office.
Let’s be clear: when you sign up to a free service like Facebook (where I'm guessing you don’t read the terms and conditions!) you give consent for Facebook to do almost whatever it likes with your data. There is no precedent for something like this – Facebook IS the precedent. And let’s not forget, you've given that data to Facebook willingly; no one forced you into liking that Mothercare page. (You can read a good piece about Facebook being the precedent here.)
Facebook is a business, not just a social platform, and I think this might be something users forget when they’re posting pictures of cats, looking at ex-boyfriends, hiding photos of babies. It’s a business, businesses need to make money, and adverts make Facebook money. For adverts to show, users need to be *on* Facebook, and for users to be on Facebook – Facebook needs to be interesting to them. Facebook wasn’t trying to make you sad; Facebook is trying to make you happy, because you being happy makes Facebook money.
The News Feed is the pillar of Facebook, it’s where users spend the majority of their time, and Facebook (not unlike Google – let’s not forget!) is always playing with its algorithm to make it more relevant, more interesting and more engaging. The more engaging the platform is, the more time people spend on there, the more ads are shown, the more money Facebook makes.
I think that the tough pill to swallow is that Facebook *can* influence our emotions. Facebook can show us content that will make us feel a certain way, it could influence our actions, sway elections even – allegedly. But who chose to sign up to Facebook? Who is making the choice to stay there? You, you the user have the power here. If you want to stop Facebook from using your data to make your social experience more engaging and relevant – deactivate your account.
But I bet you don’t.