Facebook's personalised news feed isn't progress
Facebook has a lot to learn from John Stuart Mill, one of history's greatest thinkers about freedom and democracy. In 1848, Mill wrote, "It is hardly possible to overstate the value, in the present low state of human improvement, of placing human beings in contact with other persons dissimilar to themselves, and with modes of thought and action unlike those with which they are familiar ... such communication has always been, and is peculiarly in the present age, one of the primary sources of progress."
Last week Facebook announced a change in its News Feed service, designed to put human beings in contact with people similar to themselves, and with modes of thought and action like those with which they are familiar. That is not progress.
But it was probably bound to happen. In 1995, Nicholas Negroponte, a technology specialist at the Massachusetts Institute of Technology, prophesied the emergence of the Daily Me -- a personalised newspaper, designed by and for you. Facebook hasn't quite found a way to produce a Daily Me, but that's its aspiration.
"The goal of News Feed is to show people the stories that are most relevant to them," Facebook's vice president for product management, Adam Mosseri, wrote in a blog post. He said the News Feed is animated by "core values", one of which ranks above all others: "Getting people the stories that matter to them most." Your News Feed should be "subjective, personal, and unique".
"To help make sure you don't miss the friends and family posts you are likely to care about," Mr Mosseri continued. "We put those posts toward the top of your News Feed. We learn from you and adapt over time."
Facebook said it was fully committed to giving you exactly what you want: "Something that one person finds informative or interesting may be different from what another person finds informative or interesting," so the service aims "to give you the most personalised experience".
Decisions like this matter even to non-Facebook users: The site accounts for more than 40% of referral traffic to news sites.
We don't know for sure, but Facebook probably made this change for three reasons. First, it recently faced allegations of political bias, in the form of suppression of conservative news sources. An algorithm that emphasises family and friends, and seemingly puts users in control, can claim political neutrality. Second, Facebook has an obligation to its shareholders, and if its News Feed really can be turned into a Daily Me, it might well get more clicks, which means more revenue. Third, many users have lately been posting little besides news articles of various sorts, which has meant a decline in original posts.
It's reasonable for Facebook to take these points into account.
But we shouldn't aspire to a situation in which everyone's News Feed is perfectly personalised, so that supporters of Bernie Sanders, Hillary Clinton and Donald Trump see fundamentally different stories, focusing on different topics or covering the same topics in radically different ways.
The first problem is political fragmentation.
If a major source of the nation's news is personalising user experiences, people with different points of view will end up in echo chambers of their own design.
Facebook didn't create that problem, but it shouldn't aggravate it.
The second is group polarisation. Research shows that if people are talking and listening to like-minded others, they become more dogmatic, more unified and more extreme.
Personalised Facebook experiences are a breeding ground for misunderstanding and miscommunication across political lines, and ultimately for extremism.
The third problem involves the immense value of unchosen, unanticipated exposures. In a well-functioning democracy, people frequently encounter topics and points of view that they did not specifically select but from which they learn. Those encounters can change minds, and even the course of lives.
Facebook seems to think that it would be liberating if everyone's News Feed could be personalised so that people see only and exactly what they want. Don't believe it. That's a prison.
If Facebook wants to address these concerns, it has plenty of options.
It could choose to put serious news stories in prominent places, along with posts from family and friends. It could promote serendipitous encounters with topics and ideas. It might provide users with information about topics and perspectives they didn't specifically choose. It could include material about major news developments around the world, or about prominent opinions on those developments -- while allowing people to opt out. Alternatively, it could create a "serendipitous news and opinions button", allowing people to opt in.
Facebook could (and should) experiment with an "opposing viewpoints button", allowing users to choose to include in their News Feed points of view very different from their own.
Its immensely creative staff could undoubtedly think of other ideas.
True, innovations of this sort could be seen as an "eat your vegetables" approach, which would not be particularly attractive to Facebook or its users. But so long as people have ultimate control over what they see, any objections would be pretty weak. In any case, some vegetables turn out to be delicious, even if you were reluctant to try them.
The company might answer that it is a business, not a public utility, and that business is simple: Connect people with what interests them. But in view of its growing public influence, Facebook is hardly an ordinary business.
And while it is certainly entitled to care about profits, it also has a lot of freedom to be more reflective than it has been about what its business is, and about the "core values" that animate it. ©2016 Bloomberg View
Cass R Sunstein, the former administrator of the White House Office of Information and Regulatory Affairs, is the Robert Walmsley university professor at Harvard Law School and a Bloomberg View columnist.