CPD Using Controversial Software That Scans Photos on Your Twitter, Facebook

Chicago police say the department’s use of the software follows the law, but the American Civil Liberties Union said the use of any kind of recognition technology should only happen after public debate.

In a city filled with surveillance cameras, Chicago police are using controversial new tools to better understand what they are seeing.

The Chicago Police Department has licensed photo-matching and facial recognition software that can search public sources like Twitter and Facebook to match photos and videos to individual identities.

The American Civil Liberties Union is calling for a moratorium on the use of that technology.

"It could really create a chilling of your civil liberties," said Rachel Murphy, ACLU Illinois staff attorney. "You’ll be afraid to go outside for fear that your face will be captured and that image will be used against you or stored in some kind of database."

Illinois law currently prohibits the use of biometric identifiers like facial recognition, but interim CPD Supt. Charlie Beck said the department follows the law and has never used the technology on its own to make an arrest.

"We comply with the Illinois state law which has addressed this," Beck said. "We do not use either technique without a predicate crime. We do not crowd search. We do not violate people’s First Amendment rights by doing First Amendment activities."

The recognition and matching software is used at the department’s new technology centers.

Beck said facial matching has been effective when it's been used. One example the department points to is a robbery and theft case involving a cell phone.

The suspect used the victim’s phone to take selfies of himself wearing her jewelry. He was later identified through photo matching as Lamont Hines, who was ultimately arrested and charged.

"We strictly use these processes to solve crimes that have been reported where we have evidence that gives us a person’s face but not identity," Beck said.

CPD said only 30 officers use the controversial Clearview AI software, and all of them have federal clearances and separate log-ins to prevent abuse.

But the ACLU said the use of any kind of recognition technology should only happen after public debate.

"This kind of technology is really invasive," Murphy said. "It's also very unregulated and prone to inaccuracy. Studies have shown it's more inaccurate with women and people of color."