Facebook has said it has already investigated claims of Russian activity discussed during today’s hearing and found no cause for concern.
Collins quoted an email seized from software company Six4Three alleging that a Facebook engineer had notified the company in October 2014 that Russian IP addresses were accessing “three billion data points a day” on the network.
“If Russian IP addresses were pulling down a huge amount of data from the platform, was that reported, or was that just kept, as so often seems to be the case, within the family and not talked about?” he asked.
Facebook’s Richard Allan said the claim was misleading and taken out of context.
Facebook has now issued a statement confirming the issue was looked into, telling the Guardian that “the engineers who had flagged these initial concerns subsequently looked into this further and found no evidence of specific Russian activity”.
And finally, Charlie Angus, Canada’s representative, brings up Facebook’s inflated video metrics, overstated for two years. “I would consider that corporate fraud, on a massive scale,” he says, “and the best fix is anti-trust. The simplest form of regulation would be to break Facebook up, or treat it as a utility, so that we can all be sure that we’re counting metrics that are accurate or true. To allow you to gobble up all the competition is not good.”
Allan says, “it depends on the problem we’re trying to solve,” and Angus counters that “the problem is Facebook, everything else is just a symptom.”
Allan: “Unless you’re going to turn off the internet, I’m not sure people would be better off in doing without Facebook offering the services it’s spent 15 years perfecting how to offer.”
Brazil’s Alessandro Molon makes a brief statement, asking internet and social media companies to work with governments to preserve democracy. He then asks what Facebook is doing to prevent improper manipulation of its algorithms and illegal manipulation of elections.
Allan cites a previous post by Zuckerberg, which says that Facebook is trying to stop its algorithm rewarding sensationalist content, and highlights Facebook’s partnership with third-party fact checkers, which sees content suppressed if it’s marked as false.
Molon asks further how anyone can be sure Facebook has a deeper commitment to democracy than to profit, and again brings up WhatsApp, “which was widely used to spread manipulated content”. The service, he says, banned more than 100,000 accounts in Brazil during the election.
Allan says “we are now building WhatsApp into our thinking around election integrity.” The company acquired WhatsApp four years ago.
“There are some novel challenges to look at,” Allan says, but “we don’t think that sort of manipulative behaviour is in anyone’s interest.”
Argentina’s Leopoldo Moreau asks his question in Spanish (Allan briefly responds fluently, before asking for the translator to continue for the benefit of the committee): why didn’t Facebook’s Argentinian office engage with the country’s parliament?
Allan apologises (in English), and says that the company has a large presence in Argentina and that they should be engaging better.
Moreau asks about WhatsApp campaigning: the company, wholly owned by Facebook, allows for encrypted communications that Facebook cannot oversee. Allan says that WhatsApp is “intended as a person to person messaging service; it should not be used for spamming people.”
The Argentinian delegation counters that WhatsApp does have business APIs that allow for bulk mailing. Allan says that “if shadowy companies are promising to circulate on WhatsApp information through lists of numbers, that should stop. We will be offering proper business communication, but that we can oversee.”
“Where we were made aware of it, we did take action. We’re building WhatsApp into those election task forces I mentioned.”
Julie Elliott asks how Facebook defines political advertising.
Allan: “This is one of the areas where we would really appreciate a discussion with policymakers. At the moment, in the UK, we say if you’re talking about a party or a candidate, or an issue in front of the legislature.”
Elliott asks how Facebook monitors that. Allan describes the current system, which requires people to register as political advertisers if they’re found to be running political adverts.
Elliott asks what percentage of Facebook’s budget is being spent on this effort; Allan says it’s a major effort, but that he can’t tell the committee the percentage.
Elliott asks “what other checks and balances” Facebook applies to the money funding the advertising. Facebook gets the money from the person paying for the advert, Allan says, but he thinks the best way to explore further up the chain is with regulators like the Electoral Commission.
Singapore’s Edwin Tong asks about Facebook’s policy on hate speech, and quotes from a Mark Zuckerberg statement saying that the company has always taken down such content.
Tong then brings up a post made in Sri Lanka, calling for the murder of Muslims. “It was put up at a time when there were significant tensions between Sri Lankan Muslims… that eventually resulted in a state of emergency.
“In that context, wouldn’t such a post inflame tensions?”
Allan agrees it would.
Tong asks why, then, the post has not been taken down. Allan says it should be, and that there must have been a mistake; Tong quotes from Facebook’s response, which says that no policy was broken, and Allan repeats that it’s a mistake.
“Would you agree that Facebook cannot be trusted to choose what goes on its platform,” Tong asks. Allan disagrees, and says “the best way to resolve this is a dictionary of hate speech terms in Sinhalese that gets surfaced to a Sinhalese reviewer.”
“We make mistakes; our job is to reduce the number of mistakes. We should be accountable for our mistakes to you and your colleagues, to every parliament that’s sat round the table today.”
Sun Xueling from Singapore asks how Facebook is policing the setting up and shutting down of fake accounts and their networks.
“The shutting down of fake accounts is an ongoing battle that we have,” Allan says. “Most fake accounts are created with commercial intent … but they’re taken down within minutes.
“Then there are people who are careful, create one or two accounts, and act as though they are a normal Facebook user. The issue in the US, with the Internet Research Agency, was that.”
Allan says that “low-quality information” has been reduced by over 50% on the site, according to a study from a French research institute. But, he says, the people who curate individual fake accounts are the hardest to catch.
Zimmer quotes again from the New York Times story two weeks ago: “Mr Zuckerberg and Ms Sandberg stumbled … and sought to conceal warning signs from public view.”
Allan says he doesn’t think that’s true. “Issues have come up, and been debated fully and thoroughly.”
Zimmer notes that Facebook’s quarterly profit is $13bn. “What do you say to the 400 million constituents we represent that shows you’re taking this seriously? There are other bigger issues involving election campaigns … but you’re still downplaying the role that Facebook has in this situation. That’s a huge player on the global scene, and you still don’t seem to get a grasp on how much influence you have on global election campaigns.”
Allan says: “We now have a world-leading security team, who are finding those people and taking them down. We tell you, and you ask how did they get on the site. There will be problems, but we will catch most of them, and our goal is that the Canadian elections should not be unduly influenced through online activity on our platform.”
Canada’s Bob Zimmer asks whether Allan thinks Canada’s democracy is at risk if the country doesn’t change its laws to deal with ‘surveillance capitalism’.
Allan says there are a number of vectors that are problematic: foreign interference, the ability for others to project their views into the country; but also domestic issues, allowing people inside the country to do dirty tricks campaigns.
After a brief interruption from Ireland’s Eamon Ryan, and a quip about missing his gavel from Zimmer, the Canadian asks about Zuckerberg’s dismissal of the idea that Facebook affected the US election as a “crazy idea”.
Allan concedes it was “not elegantly said”, but says that “in an election campaign there is a huge amount of legitimate activity carried out by all the parties … We did spot this activity that was wrong, shouldn’t have happened, but we think that if you look at what changed the outcome, it’s the main point.”
“They’re both problems, but if you ask my why that statement was made, I’m trying to describe to you the thinking behind it.”
The UK’s Brendan O’Hara reiterates the irritation with Facebook’s decision not to send Zuckerberg, and asks if Allan was sent to answer questions or defend the company.
“Were you sent because you, in the entire Facebook empire, are the best person to answer all these questions, or because you’re best placed to defend the company?”
Allan says he thinks it’s the former, and reminds O’Hara that Mike Schroepfer, the company’s chief technical officer, had previously come and not satisfied the committee. He says he volunteered to speak to the committee: “I said, ‘I believe that I have the knowledge that this group needs.’”
“To be precise, both for the issues that you want to raise as the UK committee, and, I now work on election issues globally… this is the stuff I work on. Our working assumption was that’s what you want to discuss.”
O’Hara complains about how many times Allan has promised to write to the committee with answers afterwards, and asks Allan what light he thinks he has shone on the issue that has provided greater clarity than Zuckerberg could have.
“I think I’ve given you insights around the way we think about regulation–” he is cut off by Collins, who hands over to the next questioner.