How to tackle privacy risks when using facial recognition and biometrics

Facial recognition

In November 2024, following the well-publicised Bunnings Determination, the Office of the Australian Information Commissioner (OAIC) published a guide for organisations on using facial recognition technology (FRT).

The guide walks organisations through how the Australian Privacy Principles (APPs) should be applied when FRT is used in a commercial setting. FRT refers to the collection of a digital image of an individual’s face and the creation of a biometric template from their distinct features. FRT is used either to verify an individual’s identity using one-to-one matching (facial verification) or to identify an individual from a group of other individuals in a database using one-to-many matching (facial identification).

The key things organisations should take away from the OAIC guidance are: 

  1. FRT is considered a highly intrusive practice and will almost always require the commissioning organisation to conduct a Privacy Impact Assessment (PIA) to assess the privacy risks associated with the particular use of FRT. The PIA should be conducted, and its recommendations implemented, before FRT is deployed.
  2. Organisations should consider whether the use of FRT is “reasonably necessary” for the organisation’s functions and activities. Organisations should explore alternatives to FRT, such as CCTV or deploying security guards for security purposes, so as not to fall foul of APP 3.
  3. As the biometric data collected through FRT is classified as ‘sensitive information’, organisations must obtain the consent of the individuals to whom the information relates (unless an exception applies). The OAIC has confirmed that merely displaying signage that FRT is in use will not constitute sufficient consent for the purposes of APPs 3 and 5. Consent must be informed, voluntary, current, specific and given by someone with capacity. In the context of FRT, individuals must be notified of the use of FRT, the fact that their biometric data will be compared against a database of other images to determine if there is a match, and any adverse consequences of a match. Individuals must be given a genuine opportunity to withhold consent before their face is scanned.
  4. Organisations should understand the limits of the FRT they are using and take steps to ensure their database contains accurate, up-to-date data that is free from bias and discrimination, in order to meet their obligations under APP 10. For example, an organisation should conduct testing to identify whether the system is producing false negative or false positive matches.
  5. Organisations should implement robust governance policies and procedures for their use of FRT to meet their obligations under APP 1. This will often require organisations to go beyond simply having a Privacy Policy.

If you have any questions about the OAIC guidance or your organisation’s use of FRT, please get in touch with our experienced privacy team.
