On Monday, the Denver Post revealed that students enrolled in classes at the University of Colorado, Colorado Springs (UCCS) had unknowingly participated in a government-funded facial recognition study. Over the course of the study, approximately 16,000 photos were taken — including 1,700 student images — to test the standards of facial recognition software used by government agencies like the US Navy.
The study was conducted by UCCS Professor Terrance Boult, and took place during the 2012 and 2013 spring semesters. Boult captured images of students on campus by aiming a long-range camera towards the west lawn. The images were intended to be candid shots — with the subjects having no idea the photos were being captured, and no reason to expect privacy — in order to replicate the type of in-the-field data that’s normally collected for comparison.
The purpose of the surveillance was to determine whether existing facial recognition algorithms would meet the standards of various government agencies, including the Office of the Director of National Intelligence. After completing their research, Boult and his team concluded that the algorithms did not.
Boult told the Associated Press the algorithms worked in situations like comparing two passport photos, where subjects are generally facing forward and well lit. The algorithms were less adept at coming up with a match using a candid shot of someone from hundreds of feet away.
The data Boult and his team collected remained private for five years before being released to the public, a delay intended to offer the subjects of the photos as much privacy and anonymity as possible.
After the information went public, UCCS had its Institutional Review Board review Boult’s research protocol. Jared Verner, a review board spokesman, said the board concluded that Boult’s team had operated in a way that protected the rights and welfare of the students in the study, as the team did not collect or distribute any personal information.
The data set was ultimately taken offline after an article in the Financial Times released additional information, including the dates and times photos were captured, making students more easily identifiable. Removing the data was consistent with Boult’s stated goal of protecting the privacy of the students in the study.
Boult still believes the pictures captured in his study are being used for “the greater good,” but he understands there may be students who disagree. Anyone who visited the campus during the time of the study is welcome to contact Boult’s team to review the photos and request that any images including their likeness be removed from the data set.
While facial recognition technology is not new — it is used for everything from finding missing children to helping your Google Nest know when a stranger is in your home — it’s still controversial. Advocacy groups have raised concerns over issues like the potential for racial bias in algorithms and instances of misidentification.