Last week, the city of San Francisco became the first in the U.S. to ban the use of facial recognition software by law enforcement and other city agencies. The groundbreaking vote has started a nationwide discussion of civil liberties versus public safety. Facial recognition software has proven beneficial to many law enforcement agencies, particularly in cases of missing or trafficked children. Under the ban, San Francisco police will no longer be allowed to use facial recognition technology in their searches.
How Does It Work?
According to The National Center For Missing and Exploited Children, the FBI reported 424,066 missing children in 2018. Proponents of facial recognition software claim the technology significantly increases the number of children found. Elke Oberg, Marketing Manager for Cognitec Systems, a German-based industry leader in facial recognition technology, shared her concerns with Parentology: “Taking this tool away from investigative environments will result in much more time and money spent on using traditional methods. Some investigations will be terminated due to lack of a lead or time constraints.” Oberg explains that facial recognition technology is only one factor in a missing-person investigation.

A factor with quantifiable results. In April of this year, New Delhi, India, made international headlines when police there recovered over 3,000 missing children within just four days of launching facial recognition software. Oberg suggests that this technology greatly improves law enforcement’s success in finding lost children. “With local or national databases now containing millions of images and growing all the time, agencies are relying on automated recognition to start the investigation and narrow down the suspect trail,” she says. “Videos demand hours and weeks of people searching through materials, while face recognition can find all the faces within seconds, present them with time and location, and even cluster people seen in multiple videos to one identity.”
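To make the process Oberg describes more concrete, the sketch below shows in rough outline how automated matching typically works: each face detected in a video frame is reduced to a numerical encoding and compared against a reference photo. This is an illustrative example only, assuming the open-source Python library face_recognition; the file names and the 0.6 distance threshold are placeholders, not details of any agency's actual system.

```python
# Illustrative sketch: compare a missing child's reference photo against faces
# detected in a single video frame, using the open-source face_recognition library.
# File names and the 0.6 threshold are assumptions made for this example.
import face_recognition

# Encode the reference photo into a 128-number "faceprint".
reference_image = face_recognition.load_image_file("missing_child.jpg")
reference_encoding = face_recognition.face_encodings(reference_image)[0]  # assumes one face in the photo

# Detect and encode every face that appears in one frame of footage.
frame = face_recognition.load_image_file("cctv_frame.jpg")
locations = face_recognition.face_locations(frame)
encodings = face_recognition.face_encodings(frame, known_face_locations=locations)

# Compare each detected face to the reference; a smaller distance means a closer match.
for (top, right, bottom, left), encoding in zip(locations, encodings):
    distance = face_recognition.face_distance([reference_encoding], encoding)[0]
    if distance < 0.6:  # common default threshold in this library
        print(f"Possible match at box ({left}, {top}, {right}, {bottom}), distance {distance:.2f}")
```

In practice, systems like the ones Oberg describes run this kind of comparison across millions of database images and hours of video, then hand the candidate matches, with time and location, to human investigators for verification.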

A Violation Of Rights?
Opponents of the technology don’t dispute facial recognition’s usefulness, but argue it’s simply too invasive and unpredictable to be used by government agencies. Concerns range from cameras collecting images without a person’s knowledge to the technology’s inaccuracy.
In February, The New York Times reported that although the technology is evolving, it remains particularly biased against minorities. Civil rights groups aren’t willing to take that chance. A spokesperson for the ACLU of Northern California told Parentology the organization doesn’t believe it’s possible to pick and choose when it comes to this kind of technology.
“You can’t build a face recognition system for investigative purposes that can’t also be used for unprecedented mass surveillance. History shows that if we put this technology in government hands, agencies like police or ICE will inevitably use it to target communities of color, round up immigrants, and track people in their daily, private lives.”

Is There Middle Ground?
The question remains: is there a way this technology can be used to find missing children without violating people’s rights? Oberg believes there is, with regulation and oversight. “Governing bodies should thoroughly evaluate the benefits vs. the risks of using such technology,” she says. “Outlawing these tools isn’t the answer. Instead, countries and communities need detailed laws addressing permitted use cases, data storage and protection, and required transparency.”
Oberg continues, “People will have trust in the use of the technology if they’re thoroughly informed about the way it works, why it’s used, and how their personal data is stored, contained and protected. Once [such laws] are passed and well communicated, independent oversight is necessary to frequently check if companies comply with these laws and regulations.”
When Parentology asked the ACLU of Northern California if it would be possible to use this technology under specific regulatory legislation, the organization had a different opinion: “Face surveillance technology is too dangerous, and the potential for abuse too great, for government use. Once it’s unleashed, the damage can’t be undone.”
The impacts of San Francisco’s decision raise a litany of questions. Can this technology be used to track down missing children without infringing on citizens’ rights? The debate is just getting started and looks to be headed to the local, state and possibly national levels, with both sides invested in the outcome.
Sources:
The New York Times
National Center For Missing and Exploited Children
Independent
San Francisco Chronicle