IBM secretly used New York’s CCTV cameras to train its surveillance software

By James Vincent

IBM’s software was reportedly developed and tested on surveillance cameras run through the Lower Manhattan Security Initiative (pictured).
Photo by John Moore / Getty Images

IBM secretly used footage from NYPD CCTV cameras to develop surveillance technology that could search for individuals based on bodily characteristics like age and skin tone. This is according to a report from The Intercept that cites confidential company documents, as well as interviews with former IBM researchers and NYPD officials.

Although IBM advertises its video analytics software’s ability to search for individuals using traits like age and ethnicity, the company has not previously disclosed its use of New York surveillance footage to develop this technology. The extent of IBM’s work with the NYPD, and the secrecy surrounding it, make this a worrying example of tech companies partnering with governments to build surveillance systems without public oversight.

The ability of new technology like AI to supercharge surveillance has been worrying privacy advocates for years. Not only do we know that such systems can be racially biased (meaning they’re prone to making more errors when identifying individuals with particular skin colors), but there’s no clear legislation in place to regulate their use.

According to The Intercept’s story, the NYPD first acquired IBM’s video analytics software through Microsoft subsidiary Vexcel in 2007. Testing of the technology began in March 2010 at New York City’s counterterrorism command center, which is part of the Lower Manhattan Security Initiative. NYPD spokesperson Peter Donald told The Intercept that the software was running on “fewer than fifty” of the center’s 512 cameras at that time.

By 2012, IBM was reportedly “testing out the video analytics software on the bodies and faces of New Yorkers, capturing and archiving their physical data as they walked in public.” The company then used this footage to improve its search functionality, letting operators build functions to find people based on age, gender, hair color, facial hair, and skin tone.

A screenshot from one of IBM’s internal company documents listing different search characteristics.
Image: The Intercept

The NYPD says it never actually used these features, hoping to avoid the “suggestion or appearance of any kind of technological racial profiling.” A spokesperson told The Intercept, “While tools that featured either racial or skin tone search capabilities were offered to the NYPD, they were explicitly declined by the NYPD.”

However, former IBM researcher Rick Kjeldsen, who said he worked on the NYPD project between 2009 and 2013, claims this is misleading. “We would have not explored [the search functionality] had the NYPD told us, ‘We don’t want to do that,’” Kjeldsen told The Intercept. “No company is going to spend money where there’s not customer interest.”

Kjeldsen says that a number of IBM researchers involved in the project were worried about the technology’s potential misuse. He adds that one of his main objections was the lack of public disclosure around how these potentially invasive tools might be deployed. “That’s where we need the conversation,” said Kjeldsen. “That’s exactly why knowledge of this should become more widely available — so that we can figure that out.”

The NYPD says it stopped using IBM’s technology in 2016. When contacted by The Verge, IBM would not answer specific questions, but it gave a statement saying it “remains absolutely committed to responsibly advancing new technologies.”