On May 10, 40 advocacy groups sent an open letter demanding a permanent ban on the use of Amazon’s facial recognition software, Rekognition, by US police. The letter was addressed to Jeff Bezos and Andy Jassy, the company’s current and incoming CEOs, and came just weeks before Amazon’s year-long moratorium on sales to law enforcement was set to expire.
The letter contrasted the vocal support Bezos and Jassy offered Black Lives Matter campaigners during last summer’s racial justice protests, which followed the murder of George Floyd, with reports that other Amazon products have been used by law enforcement to identify protesters.
On May 17, Amazon announced it would extend its moratorium indefinitely, joining competitors IBM and Microsoft in self-regulated purgatory. The move is a nod to the political power of the groups fighting to curb the technology, and a recognition that new legislative battlegrounds are starting to emerge. Many believe that substantial federal legislation is likely to come soon.
“People are exhausted”
The past year has been pivotal for face recognition, with revelations of the technology’s role in false arrests and with bans enacted by almost two dozen cities and seven states across the US. But the momentum has been shifting for some time.
In 2018, AI researchers published a study comparing the accuracy of commercial facial recognition software from IBM, Microsoft, and Face++. Their work found that the technology identified lighter-skinned men much more accurately than darker-skinned women; IBM’s system scored the worst, with a 34.4% difference in error rate between the two groups.
Also in 2018, the ACLU tested Amazon’s Rekognition and found that it falsely matched 28 members of Congress with mugshots, errors that disproportionately affected members of color. The organization wrote its own open letter to Amazon demanding that the company bar government use of the technology, as did the Congressional Black Caucus, but Amazon made no changes.
During the racial justice movements against police brutality last summer, however, Amazon surprised many by announcing that it was halting police use of Rekognition, with exceptions for federal law enforcement agencies such as ICE. The company’s announcement said it hoped the pause “might give Congress enough time to put in place appropriate rules.”
Evan Greer is the director of Fight for the Future, a technology advocacy group that calls for abolishing face recognition technology. She says there is growing public support for regulating the technology, and that this week’s extension of the moratorium shows that “Amazon is responding to this enormous pressure that they’re receiving, not just around facial recognition.” She adds, “I really give tremendous credit to the nationwide uprisings for racial justice that have happened over the last year and a half.”
“A political reality”
Although pressure is building on large technology providers, the reality is that most law enforcement and government users don’t buy facial recognition software from companies like Amazon. So while the moratoriums and bans are welcomed by advocacy groups, they don’t necessarily prevent the technology from being used. Congress, meanwhile, has yet to pass any federal legislation on facial recognition in law enforcement, government, or commercial settings that would rein in those smaller providers.
Some hope that federal legislation will come soon, however, whether through direct congressional action, a presidential executive order, or upcoming appropriations and police reform bills.
“I think best-case scenario is that Congress passes a moratorium on the use of it,” says Kate Ruane, senior legislative counsel at the ACLU. She thinks that new uses should only be permitted after more legislative work.
Several federal bills have already been proposed that would rein in access to facial recognition.
- The Facial Recognition and Biometric Technology Moratorium Act calls for banning use of the software by any federal entity and withholding federal grant money from state and local authorities that do not enact their own moratoriums. It was proposed by four Democratic members of Congress and introduced in the Senate last year.
- The George Floyd Justice in Policing Act would prevent the use of facial recognition in body cameras. The bill has already passed the House and is expected to reach the Senate this coming week. President Biden has asked that it be passed ahead of the anniversary of George Floyd’s death on May 25.
- The Fourth Amendment Is Not For Sale Act, a bipartisan bill introduced by 18 senators, would limit the government’s ability to work with technology providers that break their terms of service. In practice, it would largely block government access to systems built on web scraping, such as Clearview AI.
Mutale Nkonde, the founding CEO of AI for the People, a nonprofit that advocates for racial justice in technology, believes we are likely to see additional federal legislation by the midterm elections next year.
“I do think there is going to be federal legislation introduced that is going to govern all algorithmic systems, including facial recognition,” Nkonde says. “I think that that’s a political reality.”
Nkonde says the concept of impact assessments that evaluate technological systems on the basis of civil rights is gaining traction in policy circles on both sides of the aisle.
The ACLU is lobbying the Biden administration for an executive order, and it recently published a letter with 40 other groups asking for an immediate ban on government use of the technology.
“If we are going to commit to racial justice, if we’re going to commit to racial equity in the criminal justice system, we’re going to commit to those sorts of reforms, one of the simplest and clearest things you can do is end the use of facial recognition technology,” says Ruane.
“People are just more radical”
In the meantime, Ruane expects self-regulation to remain one of the most effective ways of preventing expanded use of facial recognition. It’s plausible that federal agencies such as the Departments of Housing and Urban Development, Homeland Security, and Education will consider imposing rules banning use of the technology.
Nkonde is optimistic that the moratoriums will expand into bans and more permanent legislation: “I think moratoriums seemed to be what was possible prior to George Floyd being killed. After that, people are just more radical.”
But Greer cautions that for all the momentum against face recognition, legislation that focuses heavily on the racial accuracy of the systems might not solve deeper problems. “I think it would be a mistake if policymakers see accuracy as the only problem with facial recognition that needs to be addressed,” she says. “Industry would actually be very happy with a bill that, for example, says something like ‘If you’re going to sell a facial recognition system, it has to be 99% accurate on people of all different races and skin tones.’”
“Even if the bias isn’t baked into the system, you still have a biased policing system that’s now being accelerated and sort of supercharged with this technology,” she adds.