How Facebook helps to muddy up elections

Alexander Nix, Cambridge Analytica CEO, was the featured speaker at AW Asia at the Centara Grand and Bangkok Convention Centre in Bangkok last December, where he explained how the firm obtained 'private' Facebook data. (Screen cap YouTube/Affiliate World Conferences)

MENLO PARK, California: Facebook has a problem it just can't kick: People keep exploiting it in ways that could sway elections, and in the worst cases even undermine democracy.

Facebook Inc now faces new calls for regulation from the US Congress and the UK Parliament. It was hit with questions about personal data safeguards on Saturday after reports that a political consultant gained inappropriate access to 50 million users' data starting in 2014.

The head of the British Parliament's media committee, Conservative legislator Damian Collins, said Facebook misled lawmakers by downplaying the risk of users' data being shared without their consent.

He said Sunday that "someone has to take responsibility for this. It's time for Mark Zuckerberg to stop hiding behind his Facebook page."

News reports that Facebook let the Trump-affiliated data mining firm Cambridge Analytica abscond with data from tens of millions of users mark the third time in roughly a year the company appears to have been outfoxed by crafty outsiders in this way.

Before the Cambridge imbroglio, there were Russian agents running election-related propaganda campaigns through targeted ads and fake political events. And before the Russians took centre stage, there were purveyors of fake news who spread false stories to rile up hyper-partisan audiences and profit from the resulting ad revenue.

In the previous cases, Facebook initially downplayed the risks posed by these activities. It only seriously grappled with fake news and Russian influence after sustained criticism from users, experts and politicians. In the case of Cambridge, Facebook says the main problem involved the transfer of data to a third party - not its collection in the first place.

"This was unequivocally not a data breach," longtime Facebook executive Andrew Bosworth said on Twitter about the Cambridge incident. "People chose to share their data with third-party apps."

That probably won't fly. On Sunday, the US congressional counter-attack began. "It's clear these platforms can't police themselves," Democratic Senator Amy Klobuchar tweeted.

Each new issue has also raised the same enduring questions about Facebook's conflicting priorities - to protect its users, but also to ensure that it can exploit their personal details to fuel its hugely lucrative, and precisely targeted, advertising business.

Facebook founder and chief executive Mark Zuckerberg will probably have to appear at congressional and parliamentary hearings to explain why the company failed to keep personal data secret. (File photo)

Facebook may say its business model is to connect the world, but it's really "to collect psychosocial data on users and sell that to advertisers," said Mike Caulfield, a faculty trainer at Washington State University who directs a multi-university effort focused on digital literacy.

Late Friday, Facebook announced it was banning Cambridge Analytica, an outfit that helped Donald Trump win the White House, saying the company improperly obtained information from 270,000 people who downloaded a purported research app described as a personality test. Facebook first learned of this breach of privacy more than two years ago, but hadn't mentioned it publicly until now.

And the company may still be playing down its scope. Christopher Wylie, a former Cambridge employee who served as a key source for detailed investigative reports published Saturday in The New York Times and The Guardian, said the firm was actually able to pull in data from roughly 50 million profiles by extending its tentacles to the unwitting friends of app users. (Facebook has since barred such second-hand data collection by apps.)

Wylie said he regrets the role he played in what he called "a full service propaganda machine." Cambridge's goal, he told the Guardian in a video interview, was to use the Facebook data to build detailed profiles that could be used to identify and then to target individual voters with personalised political messages calculated to sway their opinions.

"It was a grossly unethical experiment," Wylie said. "Because you are playing with an entire country. The psychology of an entire country without their consent or awareness."

Cambridge has denied wrongdoing and calls Wylie a disgruntled former employee. It acknowledged obtaining user data in violation of Facebook policies, but blamed a middleman contractor for the problem. The company said it never used the data and deleted it all once it learned of the infraction - an assertion contradicted by Wylie and now under investigation by Facebook.

Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University, said Facebook badly needs to embrace the transparency it has essentially forced on its users by sharing their habits, likes and dislikes with advertisers.

Albright has previously noted cases in which Facebook deleted thousands of posts detailing Russian influence on its service and under-reported the audience for Russian posts by failing to mention millions of followers on Instagram, which Facebook owns.

Facebook is "withholding information to the point of negligence," he said Saturday. "How many times can you keep doing that before it gets to the point where you're not going to be able to wrangle your way out?"

The Cambridge imbroglio also revealed what appear to be loopholes in Facebook's privacy assurances, particularly regarding third-party apps. Facebook appears to have no technical way to enforce privacy promises made by app developers, leaving users little choice but to simply trust them.

In fact, the enforcement actions outlined in Facebook's statement don't address prevention at all - just ways to respond to violations after they've occurred.

On Saturday, Facebook continued to insist that the Cambridge data collection was not a "data breach" because "everyone involved gave their consent" to share their data. The purported research app followed Facebook's existing privacy rules, no systems were surreptitiously infiltrated and no one stole passwords or sensitive information without permission. (To Facebook, the only real violation was the transfer of information collected for "research" to a third party such as Cambridge.)

Experts say that argument only makes sense if every user fully understands Facebook's obscure privacy settings, which often default to maximal data sharing.

"It's a disgusting abuse of privacy," said Larry Ponemon, founder of the privacy research firm Ponemon Institute.

"In general, most of these privacy settings are superficial," he said. "Companies need to do more to make sure commitments are actually met."
