The Problem
Facebook has long conducted digital experiments on various aspects of its website. For example, just before the 2012 election, the company conducted an experiment on the News Feeds of nearly 2 million users so that they would see more “hard news” shared by their friends. In the experiment, news articles that Facebook users’ friends had posted appeared higher in their News Feeds. Facebook claimed that the news stories being shared were general in nature and not political. The stories originated from a list of 100 top media outlets, ranging from the New York Times to Fox News. Industry analysts claim that the change may have boosted voter turnout by as much as 3 percent.
Next, Facebook decided to conduct a different kind of experiment, one that analyzed human emotions. The social network has observed that people’s friends often produce more News Feed content than they can read. As a result, Facebook filters that content with algorithms to show users the most relevant and engaging content. For one week in 2012, Facebook changed the algorithms it uses to determine which status updates appeared in the News Feeds of 689,000 randomly selected users (about 1 of every 2,500 Facebook users). In this experiment, the algorithm filtered content based on its emotional content. Specifically, it identified a post as “positive” or “negative” if it used at least one word previously identified by Facebook as positive or negative. In essence, Facebook altered the regular News Feeds of those users, showing one set of users happy, positive posts while displaying dreary, negative posts to another set.
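To make that filtering rule concrete, here is a minimal sketch of a word-list classifier of the kind the case describes. The word lists are illustrative placeholders, not Facebook’s actual dictionaries (the published study used the LIWC word-counting software), and the function name is hypothetical.

```python
# Minimal sketch of the classification rule the case describes: a post is
# labeled "positive" or "negative" if it contains at least one word from the
# corresponding list. The word lists here are illustrative placeholders.

POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible", "angry"}

def classify_post(text: str) -> set[str]:
    """Return the emotion labels a post qualifies for (may be both or neither)."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    labels = set()
    if words & POSITIVE_WORDS:
        labels.add("positive")
    if words & NEGATIVE_WORDS:
        labels.add("negative")
    return labels

print(classify_post("So happy about the great news!"))  # {'positive'}
print(classify_post("What an awful, terrible day"))     # {'negative'}
```

Note that under this rule a post can qualify as both positive and negative if it mixes emotional words, and a post containing no listed words receives neither label.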
Previous studies had found that the largely positive content that Facebook tends to feature has made users feel bitter and resentful. The rationale for this finding is that users become jealous over the success of other people, and they feel they are not “keeping up.” Those studies, therefore, predicted that reducing the positive content in users' feeds might actually make users less unhappy. Clearly, Facebook would want to determine what types of feeds will make users spend more time on its site rather than leave the site in disgust or despair. Consequently, Facebook designed its experiment to investigate the theory that seeing friends' positive content makes users sad.
The researchers—one from Facebook and two from academia—conducted two experiments, with a total of four groups of users. In the first experiment, they reduced the positive content of News Feeds; in the second experiment, they reduced the negative content. In both experiments, these treatment conditions were compared with control groups in which News Feeds were randomly filtered without regard to positive or negative content.
The results were interesting. When users received more positive content in their News Feed, a slightly larger percentage of words in their status updates were positive, and a smaller percentage were negative. When positivity was reduced, the opposite pattern occurred. The researchers concluded that the emotions expressed by friends, through online social networks, elicited similar emotions from users. Interestingly, the results of this experiment did not support the hypothesis that seeing friends' positive content made users sad.
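The outcome the researchers measured was of this kind: the percentage of words in participants’ own status updates that were positive or negative. Below is a hedged sketch of such a metric, again with placeholder word lists and a hypothetical function name.

```python
# Sketch of the outcome metric described above: the percentage of words in a
# user's status updates that are positive or negative. The word lists are
# illustrative placeholders, as in the earlier sketch.

POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible", "angry"}

def emotion_word_percentages(updates: list[str]) -> tuple[float, float]:
    """Return (% positive words, % negative words) across all the updates."""
    words = [w.strip(".,!?").lower() for u in updates for w in u.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

updates = ["Feeling happy today", "That movie was terrible", "Great weekend!"]
print(emotion_word_percentages(updates))  # approximately (22.2, 11.1)
```

Comparing these percentages between the treatment and control groups is what allowed the researchers to detect the small contagion effect.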
Significantly, Facebook had not explicitly informed the participants that they were being studied. In fact, few users were aware of this fact until the study was published in a paper titled “Experimental evidence of massive-scale emotional contagion through social networks” in the prominent scientific journal Proceedings of the National Academy of Sciences. At that point, many people became upset that Facebook had secretly performed a digital experiment on its users. The only warning that Facebook had issued was buried in the social network's one-click user agreement. Facebook's Data Use Policy states that Facebook “may use the information we receive about you . . . for internal operations, including troubleshooting, data analysis, testing, research, and service improvement.” This policy led to charges that the experiment violated laws designed to protect human research subjects.
Some lawyers urged legal action against Facebook over its experiment. While acknowledging the potential benefits of digital research, they asserted that online research such as the Facebook experiment should be held to some of the same standards required of government-sponsored clinical trials. What makes the Facebook experiment unethical, in their opinion, was that the company did not explicitly seek subjects' approval at the time of the study.
Some industry analysts challenged this contention, arguing that clinical research requirements should not be imposed on Facebook. They placed Facebook's experiment in the context of manipulative advertising—on the web and elsewhere—and news outlets that select stories and write headlines in a way that is designed to exploit emotional responses by their readers.
On July 3, 2014, the privacy group Electronic Privacy Information Center (EPIC) filed a formal complaint with the Federal Trade Commission claiming that Facebook had broken the law when it conducted the experiment without the participants’ knowledge or consent. EPIC alleged that Facebook had deceived its users by secretly conducting a psychological experiment on their emotions.
Facebook's Response
Facebook Chief Operating Officer Sheryl Sandberg defended the experiment on the grounds that it was part of ongoing research that companies perform to test different products. She conceded, however, that the experiment had been poorly communicated, and she formally apologized. The lead author of the Facebook experiment also stated, “I can understand why some people have concerns about it [the study], and my co-authors and I are very sorry for the way the [academic] paper described the research and any anxiety it caused.”
For its part, Facebook conceded that the experiment should have been “done differently,” and it announced a new set of guidelines for how the social network will approach future research studies. Specifically, research that relates to content that “may be considered deeply personal” will go through an enhanced review process before it can begin.
The Results
At Facebook, the experiments continue. In May 2015, the social network launched an experiment called Instant Articles in partnership with nine major international newspapers. This new feature allowed Facebook to host articles from various news publications directly on its platform, an option that the social network claimed would deliver a richer multimedia experience and faster page-loading times.
The following month Facebook began experimenting with its Trending sidebar, which groups news and hashtags into five categories among which users can toggle: all news, politics, science and technology, sports, and entertainment. Facebook maintained that the objective is to help users discover which topics they may be interested in. This experiment could be part of Facebook's new effort to become a one-stop news distributor, an approach that would encourage users to remain on the site for as long as possible.
A 2016 report asserted that Facebook’s list of top trending topics was not entirely objective. For example, one source stated that Facebook’s news curators routinely excluded trending stories from conservative media sites from the Trending section. Facebook strongly denied the claim.
Questions
1. Discuss the ethicality and legality of Facebook’s experiment with human emotions.
Facebook conducted an experiment that analyzed human emotions. Because people’s friends often produce more News Feed content than they can read, Facebook filters that content with algorithms to show users the most relevant and engaging content. For one week in 2012, Facebook changed those algorithms for 689,000 randomly selected users, filtering content based on its emotional content: a post was identified as “positive” or “negative” if it used at least one word previously identified by Facebook as positive or negative. In essence, Facebook altered the regular News Feeds of those users, showing one set of users happy, positive posts while displaying dreary, negative posts to another set.

Ethically, the experiment is hard to defend: Facebook deliberately manipulated users’ emotional states without their knowledge and without explicit informed consent, which is a standard requirement for research on human subjects. The only notice was a general clause buried in the one-click Data Use Policy stating that Facebook “may use the information we receive about you . . . for internal operations, including troubleshooting, data analysis, testing, research, and service improvement.” Legally, the picture is contested: critics, including the Electronic Privacy Information Center in its FTC complaint, charged that the experiment deceived users and violated laws designed to protect human research subjects, while industry analysts countered that clinical-research requirements should not be imposed on Facebook, likening the experiment to manipulative advertising and emotionally targeted news headlines.
2. Was Facebook’s response to criticism concerning that experiment adequate? Why or why not?
Facebook Chief Operating Officer Sheryl Sandberg defended the experiment on the grounds that it was part of the ongoing research that companies perform to test different products, but she conceded that the experiment had been poorly communicated and formally apologized, as did the study’s lead author. Facebook also conceded that the experiment should have been “done differently,” and it announced a new set of guidelines for future research studies: research that relates to content that “may be considered deeply personal” will go through an enhanced review process before it can begin. This response was only partially adequate. The enhanced review process is a genuine improvement, but the apologies addressed how the experiment was communicated and described, not the underlying decision to manipulate users’ emotions without their knowledge or consent.
3. Consider the experiments that Facebook conducted in May and June 2015. Is there a difference between these two experiments and Facebook’s experiment with human emotions? Why or why not?
In May 2015, Facebook launched Instant Articles in partnership with nine major international newspapers, a feature that hosts articles from news publications directly on Facebook’s platform to provide a richer multimedia experience and faster page-loading times. The following month, Facebook began experimenting with its Trending sidebar, which groups news and hashtags into five categories to help users discover topics they may be interested in. Both experiments aimed to encourage users to remain on the site for as long as possible.

There is a clear difference between these experiments and the experiment with human emotions. The 2015 experiments changed visible product features and did not attempt to manipulate users’ emotional states, whereas the 2012 experiment secretly altered users’ News Feeds, without their knowledge or consent, to test how friends’ emotions affected them. That experiment found that emotions expressed by friends through online social networks elicited similar emotions from users; its results did not support the hypothesis that seeing friends’ positive content makes users sad.
4. Should the law require companies to inform their users every time they conduct experiments? Why or why not?
Yes, the law should require companies to inform their users every time they conduct experiments on them. Performing experiments that use people’s private information, or that manipulate what they see, without their knowledge or consent violates their privacy and may run afoul of laws designed to protect human research subjects. Requiring disclosure and consent would preserve the benefits of digital research while holding companies to standards comparable to those imposed on government-sponsored clinical trials.