
There are plenty of reasons to be upset at the news that a voter-profiling firm secretly harvested private information from 50 million Facebook profiles. Beyond the big-picture questions about Facebook’s nonstop surveillance of our daily activities and the use of this information to influence U.S. elections, another part of this story should cause particular disquiet for academics: Facebook is using academic pedigrees to whitewash unethical corporate behavior.

In 2014, Facebook let Aleksandr Kogan, a psychology researcher at Cambridge University, use its platform to collect information. Kogan enlisted Facebook users to take a personality quiz and download an app that collected information not only from the users themselves but also from their Facebook friends. Participants in this study were paid a small fee. From this information, Kogan developed “psychographic profiles” of millions of Facebook users. The information collected supposedly revealed whether someone was shy or extroverted, liberal or conservative, and a host of other personality traits that could be used to deliver targeted political ads.

This isn’t the first time researchers have used Facebook to survey the emotions and behavioral quirks of its users. In June 2014, it came to light that the social networking site had allowed researchers to manipulate the news feeds of nearly 700,000 users to see if they could be made to feel happier or sadder. (They could.) Last summer, a leaked memo showed Facebook executives boasting that they could monitor the posts and photos of teenagers in real time to determine if they were in the midst of vulnerable mental states like “stressed,” “defeated” and “insecure.”

The difference here is that Kogan turned over all of his psychological data to a private business built for the purpose of swaying voters. Facebook contends Kogan never revealed that he was sharing this data with Cambridge Analytica. In high dudgeon, Facebook now labels Kogan and Cambridge Analytica’s experiment a “scam” and a “fraud.”

Facebook appears to be shocked to find that gambling is going on in Casablanca. It apparently did nothing to verify that Kogan’s research project was actually for the “academic purposes” he claimed. It imposed no safeguards to ensure that the private information Kogan collected remained with him alone and did not fall into other hands.

Nor did Facebook take steps to prevent Kogan or Cambridge Analytica from collecting information on the friends of the users who took Kogan’s personality quiz. In fact, Facebook’s terms of service appear to have permitted just this kind of hijacking of personal information. Facebook was not a dupe; it was an enabler.

Other parties have been complicit in Facebook’s use of scholars to justify abuses of its users’ trust. The 2014 study that tweaked users’ news feeds to see whether they could be made to post happier or sadder content was conducted by researchers from Cornell University. Normally, academic-run experiments on human subjects require approval from a university institutional review board. But Cornell contended that IRB review was unnecessary because the experiment itself was conducted by Facebook, and its researchers had access only to the results.

Facebook benefits from this less-than-rigorous relationship with academe. The Cornell experiment may have probed an interesting scientific question about human psychology. Yet the study also offered impressive evidence of Facebook’s value to advertisers. You can bet that corporate CEOs took note that a study of over half a million Facebook users proved the social media platform can change how people feel with just a couple of small algorithmic tweaks.

Likewise, the Cambridge Analytica revelations may be a public relations nightmare for Facebook, but they may also be a corporate relations dream. What better proof of Facebook’s power to persuade shoppers than steering the results of an election?

This is not to say that academics should be excluded from using Facebook for research. There has long been a tension between purely academic study and applied market research. It is unrealistic to call for a complete separation of one from the other. A blanket ban on academic participation would make Facebook even more of a black box than it is already. Done the right way, academic experiments using the social media giant’s enormous data set can make its influence on us more transparent, not less. And Facebook deserves some credit for finally starting to develop a more rigorous review process for in-house research.

But the best practices of academia need to find more purchase at Facebook. In the university setting, studies on human subjects require informed consent. As a private business, Facebook is not obligated to comply with this standard, and it doesn’t. Instead, it need only make sure that the terms of any potential human experimentation are covered under its capacious and unreadable terms of service.

By contrast, in the realm of academic research, scientists cannot wave a bunch of impenetrable legalese under a test subject’s nose and receive a blank check to do what they want. Moreover, university institutional review boards act as a safeguard, making sure that even when consent is informed, the benefits of any proposed research outweigh its costs to the participants. University IRBs need to make sure they fulfill their responsibilities when it comes to experimenting on social media users.

More importantly, it is time for Facebook to start following academics’ best practices instead of using them for cover.
