
Retrospective: How Facebook Has Manipulated What You See on Its Service

News broke last week that Facebook had been censoring news stories in its "trending news" widget for political reasons.

While a single report is not enough to prove anything, the company’s past actions give the allegations a lot of credibility.

Here’s Gizmodo with the original story:

Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section, according to a former journalist who worked on the project. This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly-influential section, even though they were organically trending among the site’s users.

Several former Facebook “news curators,” as they were known internally, also told Gizmodo that they were instructed to artificially “inject” selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion—or in some cases weren’t trending at all. The former curators, all of whom worked as contractors, also said they were directed not to include news about Facebook itself in the trending module.

Basically, Facebook is accused of manipulating the "trending news" widget just as it has manipulated other aspects of what users see on the social network.

Here are three examples I recalled off the top of my head; I’m sure there are more.

Facebook turned its users into test subjects in an unethical experiment

Over 600,000 Facebook users have taken part in a psychological experiment organised by the social media company, without their knowledge. Facebook altered the tone of the users' news feed to highlight either positive or negative posts from their friends, which were seen on their news feed.

They then monitored the users' response, to see whether their friends' attitude had an impact on their own. "The results show emotional contagion," wrote a team of Facebook scientists, in a paper published in the PNAS journal – Proceedings of the National Academy of Sciences of the United States. (Telegraph)

Facebook secretly broke its own app to test users’ commitment to the service

Fascinating to learn, then, that the company has been selectively disconnecting the world for hours at a time. In a fascinating report in The Information, reporter Amir Efrati details the various steps Facebook is taking to prepare for the possibility that Google one day removes its apps from the Play Store for competitive reasons.

Facebook has tested the loyalty and patience of Android users by secretly introducing artificial errors that would automatically crash the app for hours at a time, says one person familiar with the one-time experiment. The purpose of the test, which happened several years ago, was to see at what threshold would a person ditch the Facebook app altogether. The company wasn’t able to reach the threshold. "People never stopped coming back," this person says.

(The Verge)

And of course we all know that Facebook limits the number of followers who see an update in order to extort fees from users who want to reach all of their followers.

On Facebook, It’s Pay to Play

It’s no conspiracy. Facebook acknowledged it as recently as last week: messages now reach, on average, just 15 percent of an account’s fans. In a wonderful coincidence, Facebook has rolled out a solution for this problem: Pay them for better access.

As their advertising head, Gokul Rajaram, explained, if you want to speak to the other 80 to 85 percent of people who signed up to hear from you, “sponsoring posts is important.”

In other words, through “Sponsored Stories,” brands, agencies and artists are now charged to reach their own fans—the whole reason for having a page—because those pages have suddenly stopped working. (Observer)

Facebook has been accused of breaking its own app, is known to have experimented on its users, and sells access to those users. The company has a history of getting caught in its manipulations, and has even built its business model on that activity, so is it any surprise that it has now been accused of manipulating the news you see on its site?

It’s no surprise, but that doesn’t mean the Gizmodo story is true. We need to be careful that Facebook’s past actions don’t trick us into accepting an allegation simply because it fits our preconceived expectations.

That is called confirmation bias.

Update: And it turns out I was right to be cautious. Glenn Beck is convinced it didn’t happen, and the NY Times concurs.

image by C_osett

Comments


IrishImbas May 18, 2016 at 3:48 pm

That’s a fair point but, based on their history of bad behaviour, even the unsubstantiated claim bears weight.

Nate Hoffelder May 18, 2016 at 5:38 pm

I agree. But the original piece is so juicy that it tempts us to go with our biases rather than the evidence.

