Facebook collects data from 1.6 billion people, on everything from “likes” to social connections, to establish behavioral patterns. That data collection went further, some would say too far, when in June 2014 it emerged that the company had run a psychological experiment on nearly 700,000 users, filtering posts containing “positive” or “negative” words from their news feeds to see whether doing so altered their moods. The resulting controversy over the company’s ethics prompted Facebook to adopt an internal review policy in October 2014, but it is only now publishing details of how it conducts that research.
The Wall Street Journal reports that Facebook created a five-person “standing group” of employees, including law and ethics experts, to “assess the ethical impact of every research effort,” although Facebook would not identify those individuals. This ethics board, modeled on the academic institutional review boards (IRBs) charged with assessing the ethics of research involving human subjects, can consult outside experts when necessary.
If a manager believes a research project “deals with sensitive topics such as mental health,” that group reviews the study in detail “to weigh risks and benefits, as well as to consider whether it is in line with consumers’ expectations of how their information is stored.” Managers determine which projects get full review and can simply approve any proposal they “deem more innocuous.”
Former Stanford University IRB manager Lauri Kanerva now heads Facebook’s research review process. Facebook public-policy research manager Molly Jackman notes that, “the issues of how to deal with research in an industry setting aren’t unique to Facebook.” Microsoft senior compliance program manager Joetta Bell calls research ethics “an emerging field that everyone in the industry is struggling with.”
To meet the challenge, tech companies are hiring academics, such as Northwestern University communication professor Jeremy Birnholtz, who worked at Facebook last year, to study data that tech companies believe can ultimately improve products and ads and even track health trends. For example, Microsoft just published a study “that suggests that search queries could provide clues that a person might have cancer, even before a diagnosis” and, in 2013, used searches “to detect adverse effects of drugs.”
Meanwhile, some government officials propose changes to the Common Rule, the set of federal policies regulating human-subjects research; the changes would “make it easier for research participants to give scientists broad consent to use their data and tissue samples for future studies.”