Chief Science Correspondent
Every day, millions of people go online and post about their families, their breakfast, their hometowns and even the details of their less-than-honorable escapades. Their lives are on display for friends, coworkers, relatives and the rest of the world to see.
When internet users take part in social media, they implicitly accept a loss of privacy. And most of them do accept it. True, a Facebook user can adjust his privacy settings in the hope of keeping his online life away from, say, an ex-girlfriend. Or a boss. Or a complete stranger.
But the reality is that most of us understand that whatever we post online effectively becomes public information.
So, with that sentiment in mind, does a mega-corporation such as Facebook have the right to use this information for scientific studies? More importantly, does it have the right to manipulate its users, and specifically their emotions, without anyone's knowledge?
This question arose earlier this month, when researchers from Facebook's Core Data Science Team published a study in PNAS titled "Experimental evidence of massive-scale emotional contagion through social networks."
In the paper, the team revealed that it had manipulated the News Feeds of nearly 700,000 users to show either more positive or more negative content from those users' friends. The researchers then monitored the emotional tone of the users' own posts and found that it tended to mirror the emotional states of their friends.
The results are fairly intuitive. Read depressing content, and you'll be more down on yourself. Read more "uplifting" content, and the world doesn't seem like such a terrible place. Emotional contagion is not a new psychological concept.
However, as critics point out, the troubling aspect is that Facebook ran a mass experiment on hundreds of thousands of people without their knowledge or consent. We'll probably never know who those users were, but if you have a Facebook account, you could easily have been a test subject.
"Nobody has ever had this sort of power before," New Statesman writer Laurie Penny wrote. "No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively."
Facebook's influence on everyday life is undeniable. In 2012, the company published a study finding that its ads, and the way they were presented, supposedly influenced an extra 340,000 people to vote.
This is significant because Facebook could target these ads at a specific group of people who report particular political views on their profiles, and potentially change the outcome of an election.
The emotional contagion study is most likely not a major issue on its own. The test subjects may have suffered temporary emotional distress, but the reality is that Facebook is not the first media company to manipulate the emotions of consumers. Emotional responses serve as the foundation for advertising.
And as a Facebook user myself, I'm not surprised that something like this was taking place. I automatically assume that anything I post is public content, visible even to Facebook itself. The internet is not a private safe haven. I know my News Feed is curated, theoretically to display the information most relevant to me. It's why my closest friends' statuses tend to come up more often than those of the girl I friended in high school, and why half my science feed is science memes.
We all consented to be monitored and manipulated as soon as we clicked "I Have Read the Terms and Conditions" back in 2007. And while none of us actually read the document, it shouldn't come as a surprise that Facebook does these things.
Does that make it right? Despite my acceptance of Facebook's power, I do recognize the danger of this experiment and of others like it in the future. I don't think the "mind control" some users predict is likely, but this is a practice we should be more aware of.
But chances are, Facebook won't be trying something like this again anytime soon. Although no one has taken action against the company yet, it does appear that its actions were illegal.
Two of the paper's authors came not from Facebook but from universities: UC San Francisco and Cornell. As any social research student knows, universities have very strict protocols for research on human subjects. My gut feeling is that those standards were not met.
If so, the team could find themselves in quite a bit of trouble.
In the meantime, remember that Facebook is, at its heart, a company that holds significant power over its users. Every time you post online, you are a test subject - if not for Facebook, then probably for Google.
These companies truly hold power over us only as long as we remain oblivious to their effects.