Facebook's agenda and worrying hold on the news

Facebook is the world's most influential source of news.

That's true according to every available measure of size -- the billion-plus people who devour its News Feed every day, the cargo ships of profit it keeps raking in, and the tsunami of online traffic it sends to other news sites.

But Facebook has also acquired a more subtle power to shape the wider news business. Across the industry, reporters, editors and media executives now look to Facebook as the source of all knowledge and nourishment, the model for how to behave in this scary new-media world. Case in point: The New York Times, among others, recently began an initiative to broadcast live video. Why do you suppose that might be? Yup, the F word. The deal includes payments from Facebook to news outlets, including The Times.

Yet few in the US think of Facebook as a powerful media organisation, one that can alter events in the real world. When blowhards rant about the mainstream media, they do not usually mean Facebook, the mainstreamiest of all social networks. That's because Facebook operates under a veneer of empiricism. Many people believe that what you see on Facebook represents some kind of data-mined objective truth unmolested by the subjective attitudes of fair-and-balanced human beings.

None of that is true. This week, Facebook rushed to deny a report in Gizmodo that said the team in charge of its "trending" news list routinely suppressed conservative points of view. Last month, Gizmodo also reported that Facebook employees asked Mark Zuckerberg, the social network's chief executive, if the company had a responsibility to "help prevent President Trump in 2017". Facebook denied it would ever try to manipulate elections.

Even if you believe that Facebook isn't monkeying with the trending list, the reports serve as timely reminders of the ever-increasing potential dangers of Facebook's hold on the news.

The question isn't whether Facebook has outsize power to shape the world -- of course it does. If it wanted to, Facebook could try to sway elections or favour certain policies, as it once proved it could do in an experiment devised to measure how emotions spread online.

There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn't seem to recognise its own power, and doesn't think of itself as a news organisation with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.

That myth should die. It's true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.

"Algorithms equal editors," said Robyn Caplan, a research analyst at Data & Society, a research group that studies digital communications systems. "With Facebook, humans are never not involved. Humans are in every step of the process -- in terms of what we're clicking on, who's shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans."

Everything you see on Facebook is therefore the product of these people's expertise and considered judgement, as well as their conscious and unconscious biases, quite apart from any possible malfeasance or corruption. It's often hard to know which, because Facebook's editorial sensibilities are secret. So are its personalities: most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.

Facebook also has an unmistakable corporate ethos and point of view. The company is staffed mostly by wealthy coastal Americans who tend to support Democrats, and it is wholly controlled by a young billionaire who has expressed policy preferences that many people find objectionable. Mr Zuckerberg favours free trade, more open immigration and a certain controversial brand of education reform. Instead of "building walls", he supports a "connected world and a global community".

You could argue that none of this is unusual. Many media outlets are powerful, somewhat opaque, operated for profit, and controlled by wealthy people who aren't shy about their policy agendas -- Bloomberg News, The Washington Post, Fox News and The New York Times, to name a few.

But there are some reasons to be even more wary of Facebook's bias. One is institutional. Many mainstream outlets have a rigorous set of rules and norms about what's acceptable and what's not.

"The New York Times contains within it a long history of ethics and the role that media is supposed to be playing in democracies and the public," Ms Caplan said. "These technology companies have not been engaged in that conversation."

According to Tom Stocky, who is in charge of the trending topics list, Facebook has policies "for the review team to ensure consistency and neutrality" of the items that appear in the trending list.

But Facebook declined to discuss whether any editorial guidelines governed its algorithms, including the system that determines what people see in News Feed. Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmically selected news is that it might reinforce people's previously held points of view. If News Feed shows news that we're each likely to Like, it could trap us in echo chambers and contribute to rising political polarisation. In a study last year, Facebook's scientists asserted the echo chamber effect was muted.

But when Facebook changes its algorithm, does it have guidelines to make sure the changes aren't furthering an echo chamber? Or that the changes aren't inadvertently favouring one candidate or ideology over another? In other words, are Facebook's engineering decisions subject to ethical review? Nobody knows.

The other reason to be wary of Facebook's bias has to do with sheer size. Ms Caplan notes that when studying bias in traditional media, scholars try to make comparisons across different news outlets. To determine whether The New York Times is unfairly ignoring a certain story, we can look at its competitors.

"Facebook has achieved saturation," Ms Caplan said. No other social network is as large, popular, or used in the same way, so there's really no good rival for comparing Facebook's algorithmic output in order to look for bias.

What we're left with is a very powerful black box. In a 2010 study, Facebook's data scientists proved that simply by showing some users that their friends had voted, Facebook could encourage people to go to the polls.

If Facebook tried to influence an election, you might never find out.


Farhad Manjoo is an American journalist and author.
