Fabian Rogers was none too pleased when the landlord of his rent-stabilized Brooklyn high-rise announced plans to swap out key fobs for a facial recognition system.
He had so many questions: What happened if he didn’t comply? Would he be evicted? And as a young black man, he worried that his biometric data would end up in a police lineup without him ever being arrested. Most of the building’s tenants are people of color, he said, and they are already concerned about overpolicing in their New York neighborhood.
“There’s a lot of scariness that comes with this,” said Rogers, 24, who along with other tenants is trying to legally block his management company from installing the technology.
“You feel like a guinea pig,” Rogers said. “A test subject for this technology.”
Amid privacy concerns and recent research showing racial disparities in the accuracy of facial recognition technology, some city and state officials are proposing to limit its use.
Law enforcement officials say facial recognition software can be an effective crime-fighting tool, and some landlords say it could enhance security in their buildings. But civil liberties activists worry that vulnerable populations such as residents of public housing or rent-stabilized apartments are at risk of law enforcement overreach.
“This is a very dangerous technology,” said Reema Singh Guliani, senior legislative counsel for the American Civil Liberties Union. “Facial recognition is different from other technologies. You can identify someone from afar. They may never know. And you can do it on a massive scale.”
The earliest forms of facial recognition technology originated in the 1990s, and local law enforcement began using it in 2009. Today, its use has expanded to companies such as Facebook and Apple.
Such software uses biometrics to read the geometry of faces found in a photograph or video and compares the images to a database of other facial images to find a match. It’s used to verify personal identity; the FBI, for example, has access to 412 million facial images.
“Our industry certainly needs to do a better job of helping educate the public how the technology works and how it’s used,” said Jake Parker, senior director of government relations for the Security Industry Association, a trade association based in Silver Spring, Maryland.
“Any technology has the potential to be misused,” Parker said. “But in the United States, we have a number of constitutional protections that limit what the government can do.”
A 2018 study from the Massachusetts Institute of Technology found that the software more often misidentifies darker-skinned people, particularly women of color, raising concerns about bias built into the technology. The study found the software had an error rate of 34.7% for darker-skinned women, compared with 0.8% for lighter-skinned men.
This year several cities — San Francisco; Somerville, Massachusetts; and Oakland, California — became the first to ban municipal departments, including police and housing agencies, from using facial recognition technology. And this year, lawmakers in at least 10 states introduced bills to ban or delay the use of the technology by government agencies and businesses.
“We’re concerned about government overreach,” Michigan state Rep. Isaac Robinson, a Democrat who sponsored one of the bills, told Stateline. “And preserving our right to walk freely down the street without having our faces scanned.”
A handful of private apartment complexes in New York have started using the technology. But for now, few public housing complexes seem to be embracing facial recognition software, said Adrianne Todman, CEO of the National Association of Housing and Redevelopment Officials.
In Detroit, one public housing complex uses live cameras as part of the citywide surveillance system Project Green Light Detroit. Images from those cameras could be loaded into the Detroit Police Department’s facial recognition software.
Agencies rely more on cameras and security personnel to manage safety issues in their communities, Todman said. “They also rely on information they get from residents, who often are the most informed about what’s happening on their floors, in their buildings and in their neighborhoods.”
In May, U.S. Housing and Urban Development Secretary Ben Carson, a Detroit native, was asked about the use of the technology in public housing by U.S. Rep. Rashida Tlaib, a Democrat also from Detroit.
“I oppose the inappropriate use of it,” Carson said. He did not specify what use he considered inappropriate.
HUD spokesman Brian Sullivan said facial recognition technology in public housing was a local issue and that he wouldn’t comment beyond Carson’s testimony at the hearing.
Two weeks ago, Tlaib introduced a bill that would ban facial recognition software from public housing, along with a bill that would ban federal purchases of the technology. A third bill introduced in the House by U.S. Rep. Eliot Engel, a Democrat from New York, would prohibit federal agencies from using facial recognition technology without a court order.
In July, Michigan lawmakers introduced two bills: one would place a five-year moratorium on facial recognition technology, and the other would ban it outright.
Vermont and Washington state lawmakers introduced bills this year to curb police use of the technology. California lawmakers introduced a bill to require businesses using facial recognition software to alert their customers.
In New York, a package of bills focuses on the use of facial recognition in housing. One bill would ban biometric and facial recognition software from being used in federally funded public housing. Another bill would bar landlords from installing the technology on “any residential premises.”
Three years ago, Detroit launched Project Green Light Detroit, an $8 million surveillance system that uses live cameras in schools, gas stations, churches, medical centers and liquor stores to deter crime and improve police response times.
The city installed Project Green Light cameras in more than 500 locations with little fanfare. In May, though, a Georgetown University study found the city used facial recognition software, in conjunction with Project Green Light cameras, to make arrests.
“No longer is video surveillance limited to what happens,” the study found, “it may now identify who is going where, doing what, at any point in time.”
The study found live cameras were tracking the movements of tenants in apartment buildings and even patients coming and going from a medical center, which Detroit Police Chief James Craig denied in an interview with Stateline. Craig said his department does not use facial recognition software to track people.
The city started using the cameras in areas with high crime rates, such as gas stations and outside liquor stores. But earlier this year, public housing officials installed Project Green Light cameras in a senior citizens’ community, said Sandra Henriquez, executive director of the Detroit Housing Commission. She said the cameras themselves are not equipped with facial recognition software.
“People seem to conflate the issue,” Henriquez said. “I have video surveillance equipment. I do not have facial recognition software in any of our properties. I want to make that crystal clear.”
Asked whether she had concerns about the technology, Henriquez said, “I would not say there are concerns. It is a technology, as a landlord, I do not need. I understand, in certain circumstances and applications, there might be a need. But not what happens on my property at this point.”
Henriquez said she has no intention of installing facial recognition software in any of the public housing units and has no plans to install Project Green Light in other city housing complexes.
The cameras were installed at the behest of tenants, said Craig, the police chief. He said the city has used facial recognition 500 times in the past year to identify suspects. A positive identification was made in about a third of the cases.
“The thing that’s being lost in the conversations, whether it’s cameras or facial recognition, no one talks about the victims,” Craig said. “It’s almost as though the victims don’t count.”
The police take a snapshot from Project Green Light cameras and enter it into the software, which generates photos gleaned from mug books, ranks the photos, and identifies likely matches.
Craig said after the software identifies a possible match, two analysts trained in biometrics by the FBI study the photograph. If they think they’ve made a positive match, they then run it by a supervisor, who turns the photograph over to prosecutors.
A positive match from facial recognition software is not sufficient to charge a suspect with a crime, Craig said.
“Never in my wildest dreams would I have guessed that using facial recognition would have garnered such a vitriolic response,” he said.
Last fall, Nelson Management Group informed tenants of Atlantic Plaza, the rent-stabilized, middle-income complex in Brooklyn where Fabian Rogers lives, that it planned to replace key fobs with facial recognition software.
The system would be installed next to the doors, according to Colleen Dunlap, CEO of StoneLock, which manufactures the technology. Tenants could be scanned in through an automatic door without touching anything.
But tenants would not be tracked using the StoneLock system, Dunlap said in an emailed statement. “We work hard to protect user privacy.”
Rogers and other tenants objected because surveillance cameras were already on the property, along with security guards and a doorman. They filed a legal action with the state’s Homes and Community Renewal agency, which oversees rent-stabilized housing.
“The sole goal of implementing this technology is to advance that priority and support the safety and security of residents,” Chris Santarelli, a spokesman for the Nelson Management Group, said in an emailed statement.
Rogers, who’s lived in the building for over a decade, remains unconvinced.
“I have no control over where this information goes,” Rogers said.
“So we’re going to keep fighting.”