NYPD, civil rights advocate face off on facial recognition tech during panel
The NYPD's facial recognition unit received 7,000 requests last year. Privacy advocates have concerns about the technology.
A representative of the New York Police Department said that, despite criticism, cops will continue to use facial recognition and surveillance technology to the best of their abilities, while operating within the boundaries of the law, to solve crimes.
“We are focused on investigative-driven cases; we are not scanning the people walking down the street, trying to identify wanted people,” Jason Wilcox, assistant chief of the NYPD detective bureau, said Wednesday evening. “We live in a very 21st-century city, and I think people have an expectation that the NYPD uses cutting-edge technology, uses it well, to keep them safe — and we do that.”
The remarks were made during a panel discussion, “The Promise & Dangers of Facial Recognition,” at the NYU Center for Urban Science and Progress. The event was organized by the Downtown Brooklyn Partnership, which manages business improvement districts in the area.
Panelist Jonathan Stribling-Uss, a Media Democracy Fund technologist fellow at the New York Civil Liberties Union (NYCLU), said the NYPD’s use of the technology doesn’t always fit neatly into the public’s expectations.
“The chief is saying that the NYPD is using [facial recognition technology] in a limited way, but that’s not actually what we’ve seen from the NYPD’s own documents,” Stribling-Uss said, citing a recent Georgetown University Law Center report that slammed the NYPD, as well as other law enforcement agencies, for liberty-threatening tactics aided by facial recognition technology. “In those documents the NYPD used, for example, the face of Woody Harrelson to search for someone who they thought looked kind of like Woody Harrelson.”
The cops’ curious approach in that case eventually led to an arrest.
“The incident that happened where the Woody Harrelson image was used was one incident of 7,000 requests that the facial unit got last year to scan an image,” Wilcox said, citing the figure to underscore that the case was an outlier. “Using that photo, or that image of that actor, that’s not standard proto[col] — well, it’s not what we do in general, we don’t do that.”
Wilcox did not rule out the possibility that cops might use a similar approach in the future.
“I don’t want to say ‘never’ because it depends on the circumstances. But does it happen in practice? Absolutely not,” he told the Brooklyn Eagle after the event.
Wilcox also said he couldn’t comment on the legal boundaries of what he called “dragnets,” the broad use of facial recognition technology to scan crowds for suspected criminals.
“That’s a legal question; you’re coming to the wrong place. I’m just telling you how we take this science and how we apply it,” he said.
Alongside Wilcox and Stribling-Uss on the panel were Nasir Memon, professor of computer science at the NYU Tandon School of Engineering, and Noah Levenson, a technologist and artist at the Mozilla Foundation, “a global nonprofit dedicated to keeping the Internet a global public resource,” according to the organization’s website.
Memon said he was surprised at how quickly facial recognition technology has advanced over the past five years, and offered warnings about how people could “spoof” someone’s face by developing masks and other tricks to subvert the technology.
“Security is a cat and mouse game,” Memon said, emphasizing the importance of using the technology only in responsible ways, while preserving freedom.
“In any healthy society, citizens need the ability to remain anonymous if they choose to. Without anonymity, dissent starts becoming shaky,” he said. “There would be no Boston Tea Party without dissent. Dissent is essential to our democracy.”
Levenson offered a gloomy outlook on the possibilities of facial recognition technology. He discussed a government-approved patent the social media platform Snapchat obtained in 2015 for facial recognition technology that judges facial expressions and deduces the subject’s emotions.
“They’re going to correlate [those emotions] with your geolocation,” Levenson explained. “So now they know about how you feel wherever you’re at in the world, and then they’re going to sell that data to the organizers of public events, like concert promoters or the organizers of political rallies.”
The visions of the future only grew darker as the event went on. Stribling-Uss, of the NYCLU, warned that the United States may be moving closer to adopting a system similar to the soon-to-be-mandatory social credit system in China, which will employ facial recognition technology to score its citizens. Those rankings determine whether individuals can take part in a range of social functions. “People could be tracked into bands on the basis of characteristics they can’t change,” Stribling-Uss predicted.
It’s not all doom and gloom, Stribling-Uss said.
Prompted by the moderator, Tyler Woods of the Downtown Brooklyn Partnership, to “say one good thing about facial recognition technology,” Stribling-Uss offered: “Being able to pick out undercover police in a crowd.”