San Francisco has passed a pre-emptive city ordinance, 8 votes to 1, that prevents the city’s police force and its municipal agencies from using facial recognition technology on citizens. This is the first such ban in the USA – and, we believe, the first globally – and it comes at a juncture where blood is running hot, as technology companies try to find ways to monetize cities.
Sidewalk Labs, the Alphabet offshoot, is perhaps the best example of this recent struggle, but any news story that gets to take shots at the current whipping boys, using science fiction metaphors and political buzzwords, is sure to please that outlet’s advertisers. This dynamic, wherein public opinion is swayed by gleeful news coverage, creates a problematic feedback loop for technology providers that don’t come out on the front foot.
These technologies could be immensely valuable, but this has to be proven to the public with absolute clarity, as any missteps will be seized on. Had the industry’s collective offerings been demonstrably valuable, as well as sufficiently accurate, the industry could have prevented a backlash that has now banned it from an entire city. Instead, San Francisco has taken a fairly radical step, and one that is likely to be repeated in the US and abroad.
Privacy, technology, and police brutality are hot topics, and so any news article that can manage to string these elements together is sure to generate a juicy number of clicks for the platform that publishes it. As such, companies need to be sure that they aren’t going to give these starved journalists something good to write about, as the pace of web-based news consumption means that they have very little control over the direction of the story.
The journalists are looking for any stick to bludgeon these companies with, and these sticks don’t need to be especially credible these days. Any angle that works will be pursued, and when the leading lights of a particular industry drop the ball, they are in for a pummeling. This process then reinforces the public perception that these companies are out to exploit people – that they don’t have the public’s best interests at heart.
And so, it’s not so surprising that San Francisco, a liberal city politically (by the USA’s standards), has chosen to pre-emptively move against a scary new sci-fi tool. What’s more, the city is at the heart of the US technology industry, and others are considering similar bans – Oakland and Berkeley in California, as well as Somerville, Massachusetts.
Broadly, the facial recognition technologies available are not up to scratch. They are not the only part of the machine-learning and AI umbrella to fall victim to this problem, but because every person has a face, these sorts of universal stories get a lot of traction in the press.
If the technology were able to show that it was up to scratch – that it could accurately find people when needed and, more importantly, not generate false positives – it could have had a far-reaching impact on public safety, emergency response, and digital government applications.
But when Amazon comes out singing the praises of its Rekognition system as an ideal tool for law enforcement, egg ends up on faces with enough velocity to cause concussions. When the American Civil Liberties Union found that the technology identified 28 members of the US Congress as criminals, based on the analysis of police mugshots, Amazon was quick to claim that the ACLU had not properly configured the test – that they had gone with the default 80% confidence threshold, and not the recommended 95%.
Yet “80% confidence” should never have generated those results – those are Russian Roulette odds, and certainly not the technological foundation upon which we should be building systems that might dispatch armed police whenever a street camera spots someone who trips the alarm. Even 95% seems too low, and any dreams of using a face as a form of identification for payments, access, or interactions with government systems would require five-nines accuracy to actually be useful.
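The effect of that default threshold can be sketched in a few lines. This is a hypothetical illustration only – the match names and similarity scores below are invented, and this is not the Rekognition API – but it shows why a seemingly small change in the cut-off makes the difference between one strong match and a string of false alarms:

```python
# Hypothetical sketch: filtering face-match candidates by a confidence
# threshold. All names and scores are invented for illustration.

def filter_matches(candidates, threshold):
    """Return only the candidate matches at or above the confidence threshold."""
    return [(name, conf) for name, conf in candidates if conf >= threshold]

# Invented similarity scores for one probe face against a mugshot database.
candidates = [
    ("mugshot_0412", 0.97),  # plausibly a true match
    ("mugshot_1188", 0.86),  # a look-alike: a likely false positive
    ("mugshot_0033", 0.81),  # another look-alike, just above the default
]

print(filter_matches(candidates, 0.80))  # all three pass: two false alarms
print(filter_matches(candidates, 0.95))  # only the strong match survives
```

At the 80% default, every look-alike above the line becomes an “identification”; at 95%, only the strongest match does – which is presumably why Amazon recommends the higher setting for law-enforcement use.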
And yet, Amazon has been selling Rekognition to police forces in Oregon and Florida since May 2018. Admittedly, Rekognition is just an extension of an existing AWS bill, with the ACLU finding that the Orlando police department was paying less than $12 per month for it during its evaluation. The ACLU was more concerned by the Washington County Sheriff’s Department’s use, comparing images against a database of 300,000 mugshots.
Compounding the problem, the ACLU’s test found evidence of racial bias, with the tool misidentifying people of color at a much higher rate. Given the heightened tensions in the US following a spate of high-profile civilian deaths at the hands of police officers, the idea that a tool as ropey as this could be used on the streets is extremely concerning.
Microsoft seems to have tried scoring points against Amazon, asking the US government to regulate facial recognition before it becomes too widespread, later clarifying that it had virtuously chosen not to sell such technologies to law enforcement.
Microsoft President, and Chief Legal Officer, Brad Smith, argued that “the future is not simple. A government agency that is doing something objectionable today may do something that is laudable tomorrow. We therefore need a principled approach for facial recognition technology, embodied in law, that outlasts a single administration or the important political issues of a moment.”
Smith added, in a long post that is well-worth reading, that “facial recognition technology raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression. These issues heighten responsibility for tech companies that create these products. In our view, they also call for thoughtful government regulation and for the development of norms around acceptable uses. In a democratic republic, there is no substitute for decision making by our elected representatives regarding the issues that require the balancing of public safety with the essence of our democratic freedoms. Facial recognition will require the public and private sectors alike to step up – and to act.”