Facing ethics in a data-driven world

I have previously blogged about how the dark side of our mood skews the sentiment analysis of customer feedback toward the negative, since we usually provide feedback only when we have had a bad experience with a product or service. Reading only negative reviews from its customers could make a company sad, but could reading only negative status updates from your friends on a social network make you sad?

Facebook decided to find out by running an experiment on almost 700,000 users in early 2012. In June 2014, the results were published in the Proceedings of the National Academy of Sciences (PNAS) in a paper entitled Experimental evidence of massive-scale emotional contagion through social networks. Facebook first used sentiment analysis to determine the general mood of each status update (separate from the mood status Facebook lets users attach to an update to convey how they are feeling). Facebook then used that mood to intentionally filter what users saw in their news feeds for one week. Based on the mood of the status updates these users posted during the experiment, the research revealed that users who saw fewer positive updates went on to post sadder updates of their own, while users who saw fewer negative updates posted happier ones.
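
To make the mechanics concrete, here is a minimal sketch in Python of how mood-based feed filtering could work in principle. The word lists, function names, and suppression rate are hypothetical illustrations only; Facebook's actual pipeline, which the published paper describes as based on LIWC word counting at a much larger scale, is not shown here.

    # Illustrative sketch only: a toy version of mood-based status classification
    # and feed filtering. Word lists, names, and rates are hypothetical.
    import random

    POSITIVE_WORDS = {"happy", "great", "love", "awesome", "fun"}
    NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "angry"}

    def post_mood(text):
        """Classify a post as 'positive', 'negative', or 'neutral' by word counts."""
        words = [w.strip(".,!?") for w in text.lower().split()]
        pos = sum(w in POSITIVE_WORDS for w in words)
        neg = sum(w in NEGATIVE_WORDS for w in words)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return "neutral"

    def filter_feed(posts, suppress="positive", rate=0.5, seed=42):
        """Randomly drop a fraction of posts with the suppressed mood from a feed."""
        rng = random.Random(seed)
        return [p for p in posts if post_mood(p) != suppress or rng.random() >= rate]

    feed = ["I love this sunny day!", "Traffic was awful and I am angry.",
            "Great news about the project.", "Just had lunch."]
    print(filter_feed(feed, suppress="positive", rate=1.0))
    # ['Traffic was awful and I am angry.', 'Just had lunch.']

The experiment then compared the mood of what each group of users subsequently posted, which is where the contagion effect was measured.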

The reaction to these published findings, however, was neither happy nor sad but mad, since these Facebook users were experimented on without their knowledge.

Facebook defended itself by pointing out that its Data Use Policy, which every user must agree to when creating an account on Facebook, constitutes informed consent for users of its free service to be used as guinea pigs in research. Although the legalese obviously phrases it differently, it basically boils down to this: if you click like a guinea pig, update your status like a guinea pig, and press “Like” like a guinea pig, then you’re a guinea pig.

As Alex Hern reported, “at least when a multinational company, which knows everything about us and controls the very means of communication with our loved ones, acts to try and maximize its profit, it’s predictable. There’s something altogether unsettling about the possibility that Facebook experiments on its users out of little more than curiosity.” As Hern noted, Facebook’s use of A/B testing is not unusual, since almost every major tech firm does it. The issue was what this particular A/B test was testing. “In most A/B tests, the dependent variable (the thing the study is trying to affect) is something like click rates, or time on page. In this case, it was the emotion of the users.”
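
The distinction Hern draws can be made concrete with a small Python sketch contrasting a conventional A/B-test dependent variable (click-through rate) with an emotion-style one (the positive tone of what users later post). All names, data, and numbers below are invented for illustration and are not from the study.

    # Illustrative sketch only: two kinds of dependent variables for an A/B test.
    # The data and word list are invented for illustration.
    from statistics import mean

    def click_through_rate(clicks, impressions):
        """Conventional dependent variable: fraction of impressions that were clicked."""
        return clicks / impressions

    def positive_word_rate(posts, positive_words=frozenset({"happy", "great", "love"})):
        """Emotion-style dependent variable: average share of 'positive' words per post."""
        shares = []
        for text in posts:
            words = text.lower().split()
            if words:
                shares.append(sum(w in positive_words for w in words) / len(words))
        return mean(shares) if shares else 0.0

    # Group A saw an unaltered feed; group B saw fewer positive posts.
    group_a_posts = ["great day with friends", "love this song"]
    group_b_posts = ["long tired day", "nothing great happening"]

    print(click_through_rate(clicks=45, impressions=1000))  # 0.045
    print(positive_word_rate(group_a_posts))  # ~0.29
    print(positive_word_rate(group_b_posts))  # ~0.17

Measuring the first kind of variable optimizes a page; measuring the second kind treats the users’ feelings themselves as the experimental outcome, which is exactly what made this study feel different.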

Why has the Facebook experiment struck such a nerve? As Josh Constine reported, “the impact of this experiment on manipulating emotions was tiny, but it raises the question of where to draw the line on what’s ethical with A/B testing. This study purposefully sought to manipulate people’s emotions positively and negatively for the sake of proving a scientific theory about social contagion.”

“When you look to the future,” Hern concluded, “it gets even scarier. Facebook’s research offers the spectre of a company paying to only advertise to people already in a bad mood — or even a company paying to improve the mood. And yet, even manipulating emotions gets a pass in other situations. From TV news to political speeches, and, of course, advertisements, organizations have been trying to do it for years, and largely succeeding. I think what we’re feeling, as we contemplate being rats in Facebook’s lab, is the deep unease that what we think of as little more than a communications medium is actually something far more powerful.”

“The world has quickly become data-driven,” Constine concluded. “It’s time ethics caught up.”

About Author

Jim Harris

Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ)

Jim Harris is a recognized data quality thought leader with 25 years of enterprise data management industry experience. Jim is an independent consultant, speaker, and freelance writer. Jim is the Blogger-in-Chief at Obsessive-Compulsive Data Quality, an independent blog offering a vendor-neutral perspective on data quality and its related disciplines, including data governance, master data management, and business intelligence.
