
From biased editors to biased algorithms? The case of Facebook trending stories

by Chantelle Brule | Sep 13, 2016 | Facebook, Media

Facebook has come under fire again for the nature of trending stories showing up in people’s news feeds. With Facebook’s algorithm now doing much of its content selection without human assistance, the expectation was that the platform’s automated choice of trending stories would be less biased.

But those who held out hope for this forgot an important fact: algorithms can sometimes be as biased as the people who write them.

Let’s rewind: in early May, Gizmodo published an article accusing Facebook of suppressing conservative content via its human editors. The story of Facebook’s supposed liberal bias was quickly picked up by major outlets, with headlines including The Guardian’s “People think there is more leftwing news on Facebook, says study” and The New York Times’s “Facebook ‘Trending’ List Skewed by Individual Judgement, Not Institutional Bias”.

Republican Senator John Thune also requested that Facebook investigate the claims and provide records of changes made to trending stories.

While Facebook denied culpability, it still felt the pressure of these accusations. In response, it instructed its curators to step back, leaving the curation of trending stories almost entirely up to the algorithms.

According to Facebook’s new rulebook, workers can only remove a trending story from the site if it either doesn’t correspond to an actual event or is a duplicate topic. So, with the “non-biased” algorithm in charge, Facebook’s trending stories should have run smoothly.
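To make the narrowness of that rulebook concrete, here is a minimal sketch (in Python, with invented function and field names, since Facebook’s actual tooling is not public) of the only two grounds a curator had for intervening:

```python
# A minimal sketch of the narrowed curation rules described in press
# reports. Function and field names are invented for illustration;
# this is not Facebook's actual tooling.

def curator_may_remove(topic: dict) -> bool:
    """Curators may pull a trending topic only if it doesn't correspond
    to a real event, or if it duplicates a topic already trending."""
    return (not topic["corresponds_to_real_event"]) or topic["is_duplicate"]

# Note what is absent: no checks for accuracy of framing, taste, or
# newsworthiness. Anything that clears these two bars stays up.
hoax = {"corresponds_to_real_event": False, "is_duplicate": False}
crude_but_real = {"corresponds_to_real_event": True, "is_duplicate": False}
print(curator_may_remove(hoax))            # True: removable
print(curator_may_remove(crude_but_real))  # False: stays in the module
```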

Except they didn’t. Just days after Facebook left the algorithm to its own devices, a fake story, a sexist slur, and an indecent video were among the items that made it into people’s news feeds.

Now news outlets are criticizing Facebook for doing away with its human curators. Kate Conger of TechCrunch wrote that “Trending’s reliance on clicks and re-shares in this case makes Facebook’s once-helpful news module far less useful for everyone.” Georgia Wells of The Wall Street Journal wrote, “Trending lists have appeared more flawed than when humans were in charge.”

So what went wrong? The mistake Facebook made – in fact the same mistake the company’s critics made – was assuming algorithms are free of bias.

The New York Times defined algorithms as “a series of instructions written by programmers,” but the paper went on to say that “Software is not free of human influence. Algorithms are written and maintained by people, and machine learning algorithms adjust what they do based on people’s behavior … algorithms can reinforce human prejudices.”
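Conger’s point about clicks and re-shares, and the Times’s point about software adjusting to people’s behavior, can both be made concrete with a toy simulation. The Python sketch below uses invented story names and click rates (it is not Facebook’s ranking code): a ranker that surfaces stories in proportion to past clicks will amplify whatever users already click on, accurate or not.

```python
# Toy simulation of an engagement-driven trending ranker. Story names
# and click probabilities are invented; this is not Facebook's code.
import random

click_rate = {
    "sober-policy-report": 0.02,  # probability a viewer clicks it
    "sensational-hoax": 0.10,     # hoaxes often out-click real news
}
clicks = {story: 1 for story in click_rate}  # smoothed click history

random.seed(42)
for _ in range(10_000):
    # The ranker surfaces stories in proportion to their past clicks...
    shown = random.choices(list(clicks), weights=list(clicks.values()))[0]
    # ...and viewers click according to appeal, not accuracy.
    if random.random() < click_rate[shown]:
        clicks[shown] += 1

print(clicks)  # the hoax accumulates far more clicks, so it keeps trending
```

The loop is self-reinforcing: every click earns the hoax more exposure, which earns it more clicks. No one programmed a preference for hoaxes; the bias came in through user behavior.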

MediaMiser reached out to Dr. Rena Bivens, a professor of communications at Carleton University who specializes in software design. According to Bivens, “[Facebook’s] role in distributing and curating news will always be biased since it always involves gatekeepers – whether they are human, human-programmed algorithms, or a mixture of both. Norms and values inevitably enter gatekeeping processes.”

Dr. Bivens adds: “It’s important to monitor how dominant social media companies like Facebook attempt to evade questions of bias and obfuscate their programming practices.”

And as The Guardian more comically put it, “Algorithms are a bit like the recipes we use in cooking, but they need to be much more precise because they have to be implemented by stupid, literal-thinking devices called computers.”

But Facebook isn’t the only company that’s had to deal with recent headaches over its algorithms. This past month, LinkedIn’s search engine was reported to have a gender bias: when users search for certain female names, the engine suggests similar male names, but it doesn’t suggest female alternatives when users search for male names.
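The reported behavior is consistent with a suggester mined from historical search logs: if past users more often clicked a similar male name after typing a female one than the reverse, a purely frequency-based suggester will reproduce that asymmetry. Below is a hypothetical Python sketch with an invented query log; it is not LinkedIn’s actual system.

```python
# Hypothetical sketch of a log-driven name suggester, showing how a
# system trained on skewed user behavior reproduces that skew. The
# query log is invented; this is not LinkedIn's code.
from collections import Counter
from typing import Optional

# (name typed, name actually clicked) pairs from a skewed search log
query_log = [
    ("andrea", "andrew"), ("andrea", "andrew"), ("andrea", "andrea"),
    ("andrew", "andrew"), ("andrew", "andrew"), ("andrew", "andrew"),
]

def suggest(query: str, log) -> Optional[str]:
    """Suggest the alternative name users most often clicked for this query."""
    alternatives = Counter(c for q, c in log if q == query and c != q)
    return alternatives.most_common(1)[0][0] if alternatives else None

print(suggest("andrea", query_log))  # "andrew": female query, male suggestion
print(suggest("andrew", query_log))  # None: no suggestion the other way
```

Again, no engineer wrote a rule preferring male names; the system simply learned from, and then amplified, the patterns in its users’ past behavior.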

Algorithms may be great time-savers, but sometimes putting them in charge without adult supervision just means replacing human bias with machine bias.

We at Agility PR Solutions know that algorithms aren’t perfect. That’s why we employ media analysts who not only work with algorithms and automated processes, but also employ manual QA processes to ensure nothing slips under the radar.

Chantelle Brule
After receiving her Master's in Communications from Carleton University, Chantelle brought her research experience to Agility's Media Insights team as a data analyst. She's particularly good at what she does.
