
In November 2024, following the infamous Bunnings Determination, the Office of the Australian Information Commissioner (OAIC) published a guide for organisations on the use of facial recognition technology (FRT).
The guide walks organisations through how the Australian Privacy Principles (APPs) should be applied when FRT is used in a commercial setting. FRT refers to the collection of a digital image of an individual’s face and the creation of a biometric template from their distinct features. FRT is used either to verify an individual’s identity using one-to-one matching (facial verification) or to identify an individual from a database of other individuals using one-to-many matching (facial identification).
The key things organisations should take away from the OAIC guidance are:
- FRT is considered to be a highly intrusive practice and will almost always require the commissioning organisation to conduct a Privacy Impact Assessment (PIA) to assess the privacy risks associated with the particular use of FRT. PIAs should be conducted, and their recommendations implemented, before FRT is deployed.
- Organisations should consider whether the use of FRT is “reasonably necessary” for their functions and activities. To avoid falling foul of APP 3, organisations should explore alternatives to FRT, such as CCTV or deploying security guards for security purposes.
- As the biometric data collected through FRT is classified as ‘sensitive information’, organisations must obtain the consent of the individuals to whom the information relates (unless an exception applies). The OAIC has confirmed that merely displaying signage stating that FRT is in use will not constitute sufficient consent for the purposes of APPs 3 and 5. Consent must be informed, voluntary, current, specific and given by someone with capacity. In the context of FRT, individuals must be notified of the use of FRT, the fact that their biometric data will be compared against a database of other images to determine whether there is a match, and any adverse consequences of a match. Individuals must be given a genuine opportunity to withhold consent before their face is scanned.
- Organisations should understand the limits of the FRT they are using and take steps to ensure their database contains accurate, up-to-date data that is free from bias and discrimination, in order to meet their obligations under APP 10. For example, an organisation should conduct testing to identify whether the system is producing any false negative or false positive matches.
- Organisations should implement robust governance policies and procedures on their use of FRT to meet obligations under APP 1. This will often require organisations to go beyond simply having a Privacy Policy.
If you have any questions about the OAIC guidance or your organisation’s use of FRT, please get in touch with our experienced privacy team.