One evening in April, a California law enforcement officer was browsing Facebook when she saw a post from the National Center for Missing and Exploited Children (NCMEC) with a picture of a missing child. The officer took a screenshot of the image, which she later fed into a tool created by nonprofit Thorn to help investigators find underage sex-trafficking victims. The tool, called Spotlight, uses text- and image-processing algorithms to match faces and other clues in online sex ads with other evidence.
Tom Simonite covers artificial intelligence for WIRED.
Using Amazon’s facial recognition technology, Spotlight quickly returned a list of online sex ads featuring the girl’s photo. She had been sold for weeks. The ads set in motion some more traditional police work. “Within weeks that child was recovered and removed from trauma,” Julie Cordua, CEO of Thorn, said, recounting the case at an Amazon conference in Las Vegas this month.
The rescue illustrates Thorn’s strategy of nurturing new technology to combat child sex-trafficking and exploitation online. The nonprofit was cofounded in 2009 by actors Demi Moore and Ashton Kutcher and has become influential with both law enforcement—who can use Spotlight and other tools for free—and the tech industry. Thorn’s partners include Facebook, Amazon, and Dropbox.
Thorn may soon expand its influence. In April, the nonprofit was named one of eight projects that will share in $280 million from TED’s philanthropic offshoot, the Audacious Project. Thorn’s exact share has not been disclosed, but it will likely provide a major boost: The nonprofit’s income totaled $3.2 million in 2017, filings show.
One potential use for the new funding: new technology that would dig deeper into the online supply chain of child pornography, attempting to control it closer to the source. Cordua imagines software crawling the dark web, where she says the material often first appears, to find new imagery. Digital fingerprints for the files could then be added to automated blacklists used by companies such as Facebook, preventing it from circulating more broadly. Facebook says it took action on 5.4 million pieces of child pornography in the first quarter of 2019.
Cordua describes Thorn’s mission as a kind of immune response to an undertreated disease of the internet. Social networks and smartphones have enabled new forms of commerce and fun—but also made it easier to traffic in children or pornographic material featuring them. Cops lack the tools and expertise needed to fight that, Cordua says. Tech companies lack the motivation to spend heavily on a problem where progress doesn’t offer profits.
“It was becoming more and more difficult to address this problem,” Cordua says. She previously led global marketing at Motorola’s cellphone division during the brand’s peak, and filled a similar role at RED, the brand Apple and others use to direct funds to AIDS programs in Africa.
Thorn initially worked to pressure technology companies to do more about online child exploitation. More recently, it shifted to what Cordua says is a more effective strategy of producing and operating new technology for use by law enforcement and the private sector.
Spotlight was Thorn’s first software project and got its first major test during the 2015 Super Bowl, in Arizona. The initial version used text processing technology to highlight posts likely to be written by, or about, an underage person, and to pull out phone numbers and other data. Investigators could use those details to connect different ads, or cross-reference with NCMEC’s list of missing children.
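Spotlight's actual parsing pipeline isn't public, but the kind of cross-referencing described above can be illustrated with a simple sketch: pull phone numbers out of ad text with a regular expression, then group ads that share a number. The ad snippets and the `link_ads_by_phone` helper here are hypothetical, for illustration only.

```python
import re
from collections import defaultdict

# Matches common US phone formats, e.g. 555-867-5309, (555) 867 5309, 5558675309.
PHONE_RE = re.compile(r"\(?\b(\d{3})\)?[\s.-]?(\d{3})[\s.-]?(\d{4})\b")

def extract_phones(text):
    """Return phone numbers found in ad text, normalized to digits only."""
    return ["".join(groups) for groups in PHONE_RE.findall(text)]

def link_ads_by_phone(ads):
    """Group ad IDs that share a phone number -- one way to connect listings."""
    index = defaultdict(set)
    for ad_id, text in ads.items():
        for phone in extract_phones(text):
            index[phone].add(ad_id)
    return index

# Hypothetical ad snippets.
ads = {
    "ad1": "New in town, call (555) 867-5309",
    "ad2": "Available tonight 555.867.5309",
    "ad3": "Text 555-111-2222 anytime",
}
print(dict(link_ads_by_phone(ads)))  # ad1 and ad2 share a number
```

A real system would also normalize obfuscated digits ("five five five"), handle international formats, and cross-reference the extracted numbers against case databases such as NCMEC's list.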
Spotlight had been built on Amazon’s cloud service from an early stage, but in 2018 the two organizations began to talk about new features that use the company’s image processing technology. Investigators can now use facial recognition algorithms marketed under Amazon’s Rekognition service to check images against faces on NCMEC’s list. Spotlight also uses Rekognition to extract text from photos, because some sex ads hide text in images to escape conventional search tools.
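In the AWS SDK, that text-extraction step corresponds to Rekognition's `DetectText` operation (via boto3's `rekognition` client). The sketch below uses a fake client standing in for the real one so it runs without AWS credentials; the canned response mirrors Rekognition's documented `TextDetections` shape, but how Spotlight actually wires this together is an assumption.

```python
def text_in_image(client, image_bytes):
    """Pull LINE-level text out of an image via Rekognition's DetectText API.

    In production, `client` would be boto3.client("rekognition"); here any
    object with a matching detect_text method will do.
    """
    response = client.detect_text(Image={"Bytes": image_bytes})
    return [
        d["DetectedText"]
        for d in response["TextDetections"]
        if d["Type"] == "LINE"  # skip the per-word duplicates
    ]

class FakeRekognition:
    """Stand-in client returning a canned response in Rekognition's shape."""
    def detect_text(self, Image):
        return {
            "TextDetections": [
                {"DetectedText": "call 555-867-5309", "Type": "LINE"},
                {"DetectedText": "call", "Type": "WORD"},
                {"DetectedText": "555-867-5309", "Type": "WORD"},
            ]
        }

print(text_in_image(FakeRekognition(), b"...image bytes..."))
```

With a real client, the extracted lines could then feed the same phone-number search used for plain-text ads, which is how text hidden in images escapes conventional search tools but not this pipeline.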
Thorn says Spotlight has been used by law enforcement on almost 40,000 cases in North America, in which investigators identified more than 9,000 children and over 10,000 traffickers. For Amazon, Thorn also offers a way to highlight the benefits of facial recognition, after accusations that its use by law enforcement endangers privacy, and that the company’s technology is inaccurate.
The WIRED Guide to Artificial Intelligence
Thorn’s second major software product, Safer, is built around different image processing technology. It helps tech companies detect images of child sexual abuse on their platforms using PhotoDNA, a system developed by Microsoft with Dartmouth College, and used by other companies including Facebook.
PhotoDNA works by checking images against a list of hashes—mathematical fingerprints—of known child-abuse images. Cordua says Thorn’s implementation makes deploying the system and processes needed to support it less costly, encouraging use by smaller firms. Photo sharing sites Imgur and SmugMug, which owns Flickr, are among a handful of companies testing Safer.
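PhotoDNA's actual algorithm is proprietary and far more robust than anything shown here, but the hash-and-compare workflow it embodies can be sketched with a toy perceptual hash: derive a compact fingerprint from pixel brightness, then flag images whose fingerprint is close to any entry on a blacklist. The images and threshold below are invented for illustration.

```python
def average_hash(pixels):
    """Toy perceptual hash: 1 bit per pixel, set if brighter than the mean.

    A stand-in for PhotoDNA's proprietary hash, used only to show the workflow.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_blacklist(pixels, blacklist, max_distance=2):
    """True if the image's hash is near any known-bad hash."""
    h = average_hash(pixels)
    return any(hamming(h, bad) <= max_distance for bad in blacklist)

# Hypothetical 4x4 grayscale images.
known_bad = [[0, 0, 255, 255]] * 4
blacklist = {average_hash(known_bad)}

slightly_altered = [[0, 10, 250, 255]] * 4   # near-duplicate survives small edits
unrelated = [[255, 0, 255, 0]] * 4

print(matches_blacklist(slightly_altered, blacklist))  # -> True
print(matches_blacklist(unrelated, blacklist))         # -> False
```

The distance threshold is what makes perceptual hashing different from exact cryptographic hashing: a re-compressed or lightly edited copy still lands near the original fingerprint, so it can still be caught.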
Cordua says the new investment from TED’s Audacious Project could help make hashing blacklists more proactive, so images can be added to the system before they have circulated widely. That requires digging into the dark web—sites protected by anonymity tools such as Tor.
“The dream scenario is that in real time you could get hashes from newly produced content from the dark web, before it goes viral,” Cordua says. The plan is still taking shape, but one option would be to train machine learning algorithms that could flag potential material for review by experts. Thorn has previously built language processing tools to help law enforcement officers find child abuse content on the dark web.
Thorn has been in talks with the Canadian Centre for Child Protection, a fellow nonprofit, about coordinating to curtail child pornography on the dark web. The Canadian organization operates software called Project Arachnid that crawls the dark web and conventional websites to spot known child abuse images and automatically notify site operators. It also helps investigators find new content. In two and a half years, Arachnid has crawled 76 billion images, and sent out 3.8 million notices.
Hany Farid, a UC Berkeley professor who codeveloped PhotoDNA with Microsoft while at Dartmouth, says Project Arachnid has demonstrated an important new model for tackling child pornography. Previous efforts have typically waited for companies or users to report material. “This is the first and only active approach, as far as I know,” Farid says. He also says Thorn has experience tracking content beyond the openly accessible web. “Thorn has been effective at diving into the dark web where a lot of child sexual abuse material has moved,” he says.
Lianna McDonald, executive director at the Canadian organization, says a new generation of more proactive technical tools can take the fight to suppress child exploitation online to a new level. “We’re at a point where we really feel that we’re going to get ahead of this victimization,” she says.