LinkedIn conducted secret social experiments on 20 million users for half a decade

A new study analyzing data from more than 20 million LinkedIn users over a five-year period reveals that our acquaintances can be more helpful in finding a new job than close friends.

The researchers behind the study say the results will improve career mobility on the platform, but since users were unaware that their data was being studied, some people might find the lack of transparency concerning.

Published this month in Science, the study was conducted by researchers from LinkedIn, Harvard Business School and MIT between 2015 and 2019. The researchers conducted “several large-scale randomized experiments” on the platform’s “People You May Know” algorithm, which suggests new connections to users.

In a practice known as A/B testing, the experiments randomly assigned users to algorithm variants that recommended different mixes of contacts (closer or more distant ties), then tracked the new jobs that followed from the roughly two billion new connections formed.
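A/B tests of this kind typically bucket each user deterministically, so the same person always sees the same variant for the duration of the experiment. Here is a minimal sketch of that idea; the variant names, weights and hashing scheme are illustrative assumptions, not LinkedIn’s actual system:

```python
import hashlib

# Hypothetical experiment arms -- illustrative names, not LinkedIn's.
VARIANTS = ["strong_tie_heavy", "weak_tie_heavy"]

def assign_variant(user_id: str) -> str:
    """Bucket a user by hashing their ID, so the same user is always
    assigned to the same recommendation variant."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

# Repeated calls for the same user land in the same arm.
print(assign_variant("user-12345") == assign_variant("user-12345"))
```

Researchers then compare outcomes (here, job changes) between the arms, attributing differences to the recommendation change rather than to user self-selection.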


The strength of weak ties

The researchers were testing a socio-scientific theory known as the “strength of weak ties,” which, according to Sinan Aral, award-winning professor of management and data science at MIT and lead author of the study, “is one of the most influential social science theories of the last century.”

The theory, developed by Stanford professor Mark Granovetter, distinguishes weak ties, such as friends of friends, from strong ties, such as immediate colleagues. Granovetter’s research posits that it is these weak ties that can lead you to better job opportunities outside your network of strong ties.

Strong ties can be “confined” to “small, well-defined groups,” like how you probably know your close friends’ close friends.

The LinkedIn study “surprisingly” confirmed this theory, Aral said.

“Acquaintances are more valuable sources of job opportunities,” Aral said. “We also found that it’s not the weakest ties, but the moderately weak ties, that are the best.”

The strength of these weak ties varied across industries.

“The results help us understand how the platform’s algorithms affect job opportunities and outcomes, and help LinkedIn design its platform to more effectively help its members find jobs and achieve social and economic mobility,” Aral said.

A question of ethics

Privacy advocates told The New York Times on Sunday that some of LinkedIn’s 20 million users may not be happy that their data has been used without their consent. This resistance is part of a long-standing pattern in which people’s data is tracked and used by tech companies without their knowledge.


LinkedIn told The New York Times it had acted “consistently” with its user agreement, privacy policy and member settings.

LinkedIn did not respond to an email sent by USA TODAY on Sunday.

The newspaper reports that LinkedIn’s privacy policy states that the company reserves the right to use its users’ personal data.

This access may be used “to conduct research and development for our Services to provide you and others with a better, more intuitive and personalized experience, drive membership growth and engagement on our Services, and help connect professionals to each other and to economic opportunity.”

It can also be deployed to look for trends.

The company also said it used “non-invasive” techniques in its research for the study.

Aral told USA TODAY that the researchers “did not receive any private or personally identifying data during the study, and only made aggregate data available for replication purposes to further protect privacy.”

“The study was reviewed and approved by the MIT Committee on the Use of Human Subjects in Research and these types of algorithm experiments, in addition to helping platforms improve, are also standard in the industry,” Aral said.

LinkedIn is far from the first technology company to analyze the data of its members without their knowledge.

In 2014, Facebook and researchers from the University of California and Cornell University shocked people when they published the results of a study that had silently manipulated people’s News Feeds for a week in 2012.

The company said it wanted to see how positive versus negative content affects people’s emotions and Facebook usage.

But privacy advocates immediately pushed back against the study’s methods. A professor called the study “psychological manipulation”. Eventually, even the Facebook scientists who worked on the study apologized for “any anxiety it caused.”
