On December 19, 2023, the Federal Trade Commission (FTC) settled a complaint regarding the use of facial recognition technology for retail theft deterrence. While FTC orders are only binding on the subject company, they can provide insight as to regulatory priorities and therefore guide industry best practices. This is particularly true in this case as, in a statement released alongside the order, Commissioner Alvaro Bedoya described the order as a "strong baseline for what an algorithmic fairness program should look like."
Companies should consider the practices mandated by the order when building out their internal governance and acceptable use policies, particularly companies that use consumer biometric information or are implementing new AI tools and systems.
The practices outlined in the order are:
- Provide consumers with notice when they are enrolled in face surveillance systems, and inform them how to contest their entry into the system;
- Provide consumers with notice when the company takes an action against them that could cause harm, and inform them how to contest the action;
- Respond quickly to consumer complaints;
- Implement robust testing, including testing for statistically significant bias on the basis of race, ethnicity, gender, sex, age, or disability, whether acting alone or in combination;
- Conduct a detailed assessment of how inaccuracies may arise from training data, hardware issues, software issues, probe photos, and differences between training and deployment environments;
- Conduct ongoing annual testing “under conditions that materially replicate” conditions in which the system is deployed; and
- Shut down systems if the company cannot address the risks identified through this assessment and testing.
This order was the first public enforcement action following the FTC's May 2023 policy statement on the misuse of consumer biometric information, and it is likely the first of many AI bias enforcement actions from the FTC.