LinkedIn Test Raises Ethics Questions Over Parsing Big Data

LinkedIn’s experiments on users have drawn scrutiny from a new study that says the platform may have crossed a line into “social engineering.” The tests, run over five years from 2015 to 2019, varied the “People You May Know” algorithm, alternating between weak and strong contacts when recommending new connections. Affecting an estimated 20 million users, the tests were designed to yield insights that would improve the Microsoft-owned platform’s performance, but may also have affected people’s career opportunities. The study was co-authored by researchers at LinkedIn, Harvard Business School, MIT and Stanford and appeared this month in Science.

LinkedIn’s algorithmic experiments came as a surprise because the company had not publicized the fact that tests were underway. That is not uncommon: Big Tech firms routinely run large-scale experiments that expose different features, algorithms and designs to different user segments, a practice called A/B testing. Rarely are these internal evaluations publicly discussed, and rarely are users asked for permission to participate.
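For readers unfamiliar with the mechanics, A/B testing typically assigns each user to an experiment arm by deterministically hashing their ID, so the same user always sees the same variant. The sketch below is a generic illustration of that idea; the function and experiment names are hypothetical and do not reflect LinkedIn’s actual systems.

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically map a user to an experiment arm by hashing their ID.

    Hashing (experiment, user_id) together means the same user always lands
    in the same arm for a given experiment, without storing per-user state,
    while different experiments split the population independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Hypothetical example: split users between two recommendation algorithms.
arm = assign_variant("user-12345", "pymk-weak-ties")
```

Because the assignment is a pure function of the IDs, no randomness needs to be persisted, which is one reason such tests can run silently at platform scale.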

“But the changes made by LinkedIn are indicative of how such tweaks to widely used algorithms can become social engineering experiments with potentially life-altering consequences for many people,” writes The New York Times.

Experts on the societal impacts of computing said that “conducting long, large-scale experiments on people that could affect their job prospects in ways that are invisible to them,” NYT reports, “raised questions about industry transparency and research oversight.”

The study was designed to assess a sociological theory called “the strength of weak ties,” the thesis of which is that people have the best chance at job and life opportunities through casual acquaintances, not close friends. LinkedIn’s research indicated that “relatively weak social ties on LinkedIn proved twice as effective in securing employment as stronger social ties,” NYT reports.

LinkedIn says that “during the study it had ‘acted consistently with’ the company’s user agreement, privacy policy and member settings,” according to NYT, which describes the platform as “the world’s largest professional network.” But Marquette University’s Center for Data, Ethics and Society director Michael Zimmer warns there could be long-term consequences of such studies, arguing that the repercussions “need to be contemplated when we think of the ethics of engaging in this kind of big data research.”

To weigh the strength of connections — weak or strong, acquaintance or stranger — the researchers “analyzed factors like the number of messages they sent back and forth or the number of mutual friends they shared, gauging how these factors changed over time after connecting on the social media platform,” Ars Technica explains. “LinkedIn says these results will lead to changes in the algorithm to recommend more relevant connections to job searchers.”
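The measurement described above — combining interaction intensity and shared-network overlap into a tie-strength signal — can be sketched as a simple scoring function. This is an illustrative toy, not the researchers’ actual model: the weights, threshold, and transforms are made up for the example.

```python
import math

def tie_strength(messages_exchanged: int, mutual_connections: int) -> float:
    """Toy tie-strength score from two of the factors the article mentions.

    Weights and transforms are illustrative only; the study derived its
    weak/strong tie measures from LinkedIn's internal data.
    """
    # Use a saturating transform so early interactions count more:
    # going from 0 to 10 messages signals more than going from 100 to 110.
    interaction = math.log1p(messages_exchanged)
    overlap = math.log1p(mutual_connections)
    return 0.6 * interaction + 0.4 * overlap

def classify(score: float, threshold: float = 2.0) -> str:
    """Label a connection as a weak or strong tie by a score cutoff."""
    return "strong" if score >= threshold else "weak"
```

Under this toy scoring, a pair with frequent messages and many mutual connections scores as a strong tie, while a rarely contacted acquaintance scores as a weak one — the distinction the experiment manipulated in its recommendations.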
