This project is part of the World Economic Forum’s Shaping the Future of Technology Governance: Artificial Intelligence and Machine Learning platform.
How can we protect human rights as facial recognition technology becomes widespread?
Facial recognition offers a seamless identification experience while providing a high level of security. The technology is considered non-invasive: users are not interrupted, stopped, or required to touch any device. A seamless, contactless experience is the key promise.
For better or worse, the technology retains the personal data of many citizens, and individuals have a right to know by whom, for what purpose, and for how long their personal data is used and kept.
While algorithms are rapidly improving in accuracy and responsiveness, policy and regulation still have a long way to go.
In response to the 2020 growth in facial recognition use, accelerated by the public health crisis, world leaders in technology, digital identity and biometrics gathered with policy-makers, civil society representatives and academics to develop a governance framework delimiting the conditions for safe use of the technology.
Four steps to build a framework that ensures trustworthy use were piloted by companies including IDEMIA, Microsoft, AFNOR, SNCF and AWS, among others:
- Define what constitutes the responsible use of facial recognition through drafting a set of principles for action
- Design best practices to support product teams in the development of systems “responsible by design”
- Assess to what extent the system designed is responsible, through an assessment questionnaire that describes, for each use-case, which rules must be respected to comply with the principles for action
- Validate compliance with the principles for action through an audit framework designed by a trusted third party
The framework takes into account different use-cases and applications across jurisdictions, for instance: access control, safety in public spaces, marketing and KYC services, healthcare, etc.
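As a rough sketch, the assessment step described above could be modeled as a per-use-case checklist that maps each principle for action to a yes/no answer, with a third-party audit gated on every principle being met. The principle names, class, and logic below are hypothetical illustrations, not the framework's actual wording:

```python
from dataclasses import dataclass

# Hypothetical principles for action -- illustrative only, not the
# framework's actual text.
PRINCIPLES = (
    "proportional_use",
    "consent_or_legal_basis",
    "data_retention_limits",
    "bias_mitigation",
    "human_oversight",
)

@dataclass
class UseCaseAssessment:
    """One questionnaire response for a single use-case."""
    use_case: str
    answers: dict  # maps each principle to True (rule respected) or False

    def gaps(self):
        """Principles the system fails to satisfy for this use-case."""
        return [p for p in PRINCIPLES if not self.answers.get(p, False)]

    def is_compliant(self):
        """Ready for third-party audit only when every principle is met."""
        return not self.gaps()

# Example: an access-control deployment missing a retention policy.
assessment = UseCaseAssessment(
    use_case="access_control",
    answers={p: p != "data_retention_limits" for p in PRINCIPLES},
)
print(assessment.is_compliant())  # False
print(assessment.gaps())          # ['data_retention_limits']
```

The point of such a structure is that the same set of principles is applied uniformly, while the answers (and the rules behind them) vary per use-case, mirroring how the framework's questionnaire is described.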