On Tuesday, news broke via TechCrunch that Facebook ran a sketchy “Research” program involving paid participants who downloaded an app onto their phones that was capable of monitoring virtually everything that they did—including in some cases teens as young as 13, who were recruited via social media ads that appeared designed to keep Facebook’s involvement low-profile. The app appeared to be a version of Facebook’s awful Onavo Protect VPN, the iOS version of which got yanked from Apple’s App Store last year for violating rules on data collection.
Apple responded soon after by revoking Facebook’s enterprise developer certificates, saying the Research app functioned via exclusive tools that are supposed to be used only for internal development purposes, not for distribution to consumers. That threw Facebook’s iOS development programs into chaos, infuriating Facebook staff. But beyond the inter-corporate drama, Facebook is also facing immense criticism for paying teenagers to install a program capable of monitoring everything from private messages and browsing histories to app activity on their devices.
Chief operating officer Sheryl Sandberg’s defense? The teens “consented.”
“So I want to be clear what this is,” Sandberg told CNBC’s Julia Boorstin on Wednesday. “This is a Facebook Research app. It’s very clear to the people who participated. It’s completely opt-in. There is a rigorous consent flow and people are compensated. It’s a market research program.”
“Now, that said, we know we have work to do to make sure people’s data is protected,” Sandberg added, repeating a thoroughly unconvincing line that has been rolled out so many times amid Facebook’s constant scandals that it has barreled into self-satire territory. “It’s your information. You put it on Facebook, you need to know what is happening. In this case the people who chose to participate in this program did.”
“But we definitely have work to do and we’ve done it,” Sandberg said, just to hammer home that line.
When Boorstin asked whether Facebook regretted not pulling the app before Apple had revoked its certificates, Sandberg replied by saying Facebook had done so as soon as it realized it was not “in compliance.”
“Well of course, as long, as soon as we realized we weren’t in compliance with the rules on their platform, we pulled it,” Sandberg said. “The important thing is that the people involved in that research project knew they were involved and consented.”
That’s quite an interesting response, because TechCrunch reported on Wednesday that Apple said it had blocked the Research app before Facebook could “voluntarily” pull it down. Facebook had also originally told TechCrunch that the Research app was not in violation of Apple’s enterprise developer certificate policies, and early Wednesday morning, a Facebook spokesperson declined to explain to Gizmodo why the company was pulling the app if that was the case.
Then there’s the fact that Facebook had users sideload the app and avoided submitting it through TestFlight, Apple’s beta testing system, which requires Apple review.
Facebook also seemed caught completely off guard by Apple’s decision to revoke their certificates, according to Business Insider, which obtained internal Facebook communications that appeared to show staff blindsided. That does not exactly back Sandberg’s narrative that this was an orderly process in which Facebook made a genuine mistake and quickly moved to resolve the matter with Apple.
As for whether the teenagers involved in the program “consented,” legal minors cannot sign contracts without parental or guardian consent. Facebook has insisted that all participants under the age of 18 submitted parental consent forms, but without extra verification steps, it would be impossible for Facebook to remotely confirm that the teens didn’t just fill out the forms themselves.
For example, researcher Amanda Lenhart described the intensive process required to ethically conduct research on minors’ online habits for Pew Research Center in 2013:
So, all this means we must obtain parental consent to interview minors younger than 16, and because of state differences, we typically seek parental consent for all youth under the age of 18. This means that we must speak or interact with two individuals in each household (in a specific order with the parent first) rather than one, as in traditional surveys of adults. Interviewing two people increases the complexity of the project, and requires more phone calls or messages to reach eligible respondents in the proper order.
And that’s just for polling, not invasive monitoring.
While speaking with TechCrunch, Facebook also compared the Research app to focus groups run by Nielsen and comScore. Unlike Facebook’s program, Nielsen surveys are invite-only and not advertised to the general public. Nielsen also states this in its privacy FAQ:
We take the privacy and safety of children very seriously. If we collect information about, or get opinions from minor children, we do so only with the consent of the child’s parent or guardian, which can be withdrawn at any time.
Meanwhile, BBC reporter Dave Lee tweeted that he was able to sign up for Facebook’s Research program using a birthday in 2005 and was told to download the app, all without being asked for any kind of parental consent form.
Very rigorous consent flow there.
A technical expert consulted by TechCrunch, Guardian Mobile Firewall security researcher Will Strafach, also told the site that few participants in the program would have the technical knowledge to “reasonably consent” to the scale of the surveillance.
“The fairly technical sounding ‘install our Root Certificate’ step is appalling,” Strafach told TechCrunch. “This hands Facebook continuous access to the most sensitive data about you, and most users are going to be unable to reasonably consent to this regardless of any agreement they sign, because there is no good way to articulate just how much power is handed to Facebook when you do this.”
Nothing about Facebook’s side of the story holds up here, but hey. It has some work to do.