Download the Facial recognition technology and privacy factsheet
Facial recognition technology and privacy (PDF, 334 KB)
Last updated: 11 November 2024
Who is this guidance for?
This guidance sets out general considerations for private sector organisations that are considering using facial recognition technology (FRT) to undertake facial identification in a commercial or retail setting. It does not cover all privacy issues and obligations in relation to the use of FRT; rather, it provides information about key principles captured under the Australian Privacy Principles (APPs) that are particularly relevant when considering the use of FRT. Organisations should consider this guidance together with the Privacy Act 1988 (Cth) and the Australian Privacy Principles guidelines.
Key points
- FRT involves the collection of a digital image of an individual’s face and the extraction of their distinct features into a biometric template. The biometric template is then compared against one or more pre-extracted biometric templates for the purpose of facial verification or identification.
- facial verification refers to ‘one-to-one’ matching. It involves determining whether a face matches a single biometric template.
- facial identification refers to ‘one-to-many’ matching. It involves determining whether a face matches any biometric template in a database.
- Biometric templates and biometric information, including when used for automated verification or identification purposes, are considered sensitive information under the Privacy Act.[1] Sensitive information is generally afforded a higher level of privacy protection under the Privacy Act. Organisations must take reasonable steps to protect personal information they hold from misuse, interference and loss, as well as unauthorised access, modification and disclosure.[2]
- It is best practice for organisations considering using FRT to undertake a privacy impact assessment (PIA) to identify potential privacy impacts at the outset and implement recommendations to manage, minimise or eliminate them. This will assist to ensure that a privacy by design approach is embedded from the start in accordance with an organisation’s obligations under APP 1.
- As part of this privacy by design approach, it is expected that key principles will be explored to support the appropriate use of sensitive information when using FRT, including:
- Necessity and proportionality (APP 3) – personal information for use in FRT must only be collected when it is necessary and proportionate in the circumstances and where the purpose cannot be reasonably achieved by less privacy intrusive means.
- Consent and transparency (APP 3 and 5) – individuals need to be proactively provided with sufficient notice and information to allow them to provide meaningful consent to the collection of their information.
- Accuracy, bias and discrimination (APP 10) – organisations need to ensure that the biometric information used in FRT is accurate and steps need to be taken to address any risk of bias.
- Governance and ongoing assurance (APP 1) – organisations that decide to use FRT need to have clear governance arrangements in place, including privacy risk management practices and policies which are effectively implemented and regularly reviewed.
What is facial recognition technology?
FRT is the process by which an individual can be identified or verified from a digital image. FRT involves the collection and use of biometric information (i.e. face data).
Biometric templates and biometric information, including when used for biometric verification or identification, are considered sensitive information under the Privacy Act. Sensitive information is a subset of personal information that is generally afforded a higher level of privacy protection.
Where FRT is used, distinct features of an individual’s face are extracted into a biometric template and compared against one or multiple pre-extracted biometric templates.
Facial verification and identification
Generally, FRT can be used to accomplish two tasks: verification or identification.
- facial verification refers to ‘one-to-one’ matching. It involves determining whether a face matches a single biometric template. An example is iPhone Face ID.
- facial identification refers to ‘one-to-many’ matching. It involves determining whether a face matches any biometric template in a database. Facial identification is increasingly used by law enforcement to identify an unknown suspect by comparing their face against faces that appear in databases.
An individual does not need to be identified from the specific information being handled to be ‘identifiable’ in a facial identification system. An individual can be identified if their facial image is distinguishable from others in a database.
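The distinction between the two matching tasks can be illustrated with a minimal sketch. This is a simplified, hypothetical illustration only: it assumes biometric templates are numeric feature vectors compared by cosine similarity, and the function names and the 0.8 threshold are invented for the example, not drawn from any real FRT product.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors, in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify(probe, template, threshold=0.8):
    """One-to-one: does the probe face match this single template?"""
    return cosine_similarity(probe, template) >= threshold

def identify(probe, database, threshold=0.8):
    """One-to-many: return the best-matching identity in a database, or None."""
    best_id, best_score = None, threshold
    for person_id, template in database.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```

The key operational difference is visible in the signatures: verification compares against one known template, while identification searches an entire reference database, which is why identification raises broader privacy concerns about who is enrolled in that database.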
Privacy management – managing and mitigating risks
FRT significantly interferes with the privacy of individuals, and live FRT in particular is highly intrusive. Negative privacy impacts identified through the privacy impact analysis in a PIA, and through an organisation’s compliance checks, will need to be managed or mitigated if the organisation decides to proceed with the use of FRT.
Organisations should consider the principles below to determine whether the use of FRT is appropriate in the circumstances, including:
- Privacy by design
- Necessity and proportionality
- Consent and transparency
- Accuracy and bias, and
- Governance and ongoing assurance
This is not an exhaustive list; organisations should also consider any other matters relevant to their circumstances.
Adopting a privacy by design approach (APP 1)
Organisations are encouraged to adopt a ‘privacy by design’ approach to their use of FRT.[3] A privacy impact assessment (PIA) will support organisations to instil this approach and comply with their obligations.
Undertaking a PIA is considered a reasonable step to take under APP 1.2 to ensure an organisation is complying with their privacy obligations.[4]
A PIA is a systematic assessment of a project that identifies the impact that the project might have on the privacy of individuals, and sets out recommendations for managing, minimising or eliminating that impact. There is very real community concern about the privacy risks associated with FRT. A PIA demonstrates commitment to, and respect for, individuals’ privacy and other associated human rights.
This guidance highlights some key privacy considerations for organisations to consider before determining whether to use FRT and when completing a PIA.
Undertaking a PIA before using FRT
Organisations regulated by the Privacy Act should conduct a PIA for projects involving sensitive information such as FRT. The OAIC has identified 10 steps which should be considered when undertaking a PIA in relation to a new, or updated project. Further information on each step is available in the OAIC’s PIA Guide.
Necessity and proportionality (APP 3)
Personal information for use in FRT must only be collected when it is reasonably necessary for one or more of an organisation’s functions or activities.[5] What is reasonably necessary is an objective test based on whether a reasonable person who is properly informed would agree that collecting the personal information is necessary.[6]
In determining whether the collection of personal information is reasonably necessary for a function or activity, an organisation should consider:
- The primary purpose of collecting the personal information
- How the personal information will be used in undertaking a function or activity of the organisation, and
- Whether the organisation could undertake the function or activity without collecting that personal information, or by collecting a lesser amount of personal information.[7]
It is up to an organisation to be able to justify that collection of the information is reasonably necessary. The fact that FRT is available, convenient or desirable should not be relied on to establish that it is necessary to collect the information.
In determining whether the use of FRT is necessary, the following factors will be relevant:
- The suitability of the FRT system in addressing the relevant activity or conduct
- The alternatives available to address the relevant activity or conduct
- Whether the use of the FRT system is proportionate to the outcome achieved. An organisation will need to balance the privacy impacts of the collection of sensitive information, and holding this information, against the benefits of the use of the FRT system.
Alternatives to FRT
Alternatives to FRT to monitor for safety and security concerns may include:
- Quality CCTV coverage
- The deployment of security guards, including covert security guards
- Training employees in dealing with safety and security issues, and
- Close engagement with law enforcement.
When assessing whether the use of FRT is proportionate, organisations should carefully consider whether the benefits clearly outweigh the risks posed to individuals’ privacy and other human rights. For example, where an organisation is using FRT to lessen or prevent serious threats to the health, safety and security of customers in a commercial or retail setting, it must be able to demonstrate how its use is proportionate to the risks identified.
An organisation should regularly consider whether the benefits of using FRT have been realised, and the use of the technology is still needed, including whether any anticipated privacy risks have arisen. The use of the technology must be regularly reviewed, and any required steps taken to ensure practices are consistent with the assessment findings.
Organisations should consider
- Is the collection of biometric information reasonably necessary to perform a particular function or activity? ‘Reasonably necessary’ depends on whether the interference with privacy is proportionate to a legitimate aim sought to be achieved. Factors to consider include:
- What is the primary purpose of collecting the information?
- How will the biometric information be used, stored and secured in undertaking a function or activity?
- Can you undertake the function or activity without collecting the biometric information?
- Can the purpose be achieved by less intrusive means? Have you considered other alternative means?
- Have you identified and assessed the benefits and privacy risks? Do the benefits to be achieved clearly outweigh the privacy risks, and why?
- Is there a clear public interest in using FRT? Examples may include to lessen or prevent a serious threat to public health or safety.
- Would an individual reasonably expect FRT to be used in the circumstances? Will the use of FRT lead to unjustified adverse effects, such as unjust discrimination?
Consent and transparency (APP 3 and 5)
Consent
Consent is generally required to collect sensitive information such as biometric data used in FRT, subject to some limited exceptions.[8]
In order to provide meaningful consent, certain elements will need to be met. These include:
- The individual is adequately informed before giving consent
- The individual gives consent voluntarily
- The consent is current and specific, and
- The individual has the capacity to understand and communicate their consent.[9]
The nature of FRT means that it is often not practical to obtain true, express consent from individuals whose biometric information might be captured by FRT. Merely having signage or notice about the use of FRT will not, in and of itself, generally be sufficient to show that an individual has consented to the use of this technology. This is because the information is sensitive information, and all four elements of consent are unlikely to have been satisfied. Further notice will be required to ensure informed consent can be provided. Tips for meeting the four key elements of consent are detailed below.
Tips for adhering to the four key elements of consent when using FRT
A commercial organisation must consider the following matters relating to consent before using FRT.
Informed consent
Before an individual enters the premises, they must be informed that an image will be taken of their face. The individual must be advised that biometric data will be generated from that image which will be compared against a database of other images to determine whether there is a match. You must inform the individual about the actions that may be taken if a match is identified.
Voluntary consent
You must make sure the individual has a genuine opportunity to provide or withhold consent prior to an image being taken of their face.
Current and specific consent
You must have obtained consent from the individual to collect, use or disclose their personal information when it is collected.
Consent with capacity
You may presume that an individual has capacity to consent, unless there is a factor that casts doubt on their capacity. You need to have a system in place to take into account matters such as the age, disabilities and language skills of customers whose face you propose to scan.
Implied consent arises where consent may reasonably be inferred in the circumstances from the conduct of the individual and the organisation. Generally, implied consent should not be relied on when collecting sensitive information, including biometric information, from customers. The mishandling or inappropriate use of biometric information can have adverse consequences for an individual or those associated with the individual. It can also cause humiliation, embarrassment or undermine an individual’s dignity.[10]
Opt-out mechanisms are a type of implied consent. It is only appropriate to infer consent from an opt-out mechanism in very limited circumstances.[11]
There are some limited exceptions called permitted general situations contained in s 16A of the Privacy Act that would allow an organisation to collect this information without consent. However, these permitted general situations are highly specific and confined to particular circumstances. An organisation would need to make a careful assessment about whether they satisfy the criteria for these exceptions, prior to relying on them to utilise FRT, and as a matter of best practice, ensure that they document this consideration.[12]
Organisations should consider
- How will you be transparent and provide notice to individuals to ensure they are able to provide informed consent?
- Have you provided individuals with the information required under APP 5.2?
- How will you inform individuals that their facial image may or will be subject to, or included in, a reference database for a FRT system?
- Is there an accessible way for individuals to raise a complaint or question about the collection and use of their biometric information? Your Privacy Policy should explain what individuals need to do to make a complaint.
- Have you considered the four key elements of consent?
- How will consent be obtained from individuals who have particular needs, such as individuals from a non-English speaking background and children?
- If biometric information is sourced from a third party, has the third party collected it lawfully and do they have authority to disclose it to you?
- If an individual does not consent to the collection of their biometric information or withdraws their consent, is an alternative process available which will not result in detriment to the individual?
Transparency
As an important transparency step, APP 5 requires organisations to ensure that an individual is aware of certain matters when they collect their personal information. This is important because there are many complexities surrounding FRT that can impact an individual’s ability to understand how their personal information is collected and handled.
Organisations are required under the Privacy Act to manage personal information in an open and transparent way and to take reasonable steps to provide notice under APP 5.[13] This enhances the accountability of organisations for their personal information handling practices, as well as aids community trust and confidence in those practices.
Organisations that are collecting personal information must take reasonable steps to either notify the individual of certain matters,[14] or ensure the individual is aware of those matters.[15] The reasonable steps required will depend on the circumstances, but more rigorous steps may be needed when collecting sensitive information, such as biometric information in FRT, and where the collection can result in detriment to an individual.[16]
Organisations need to ensure individuals have knowledge, choice and control over how personal information, especially sensitive information relating to them, is handled. This ensures that they can make an informed decision about whether to provide their personal information to organisations.[17]
Accuracy, bias and discrimination (APP 10)
Accuracy
Organisations have an obligation to take reasonable steps to ensure the personal information collected, used and disclosed is accurate, up-to-date, complete and relevant.[18]
The reasonable steps that an organisation must take will depend on the circumstances including:
- The sensitivity of the personal information
- The nature of the organisation holding the personal information, and
- Possible adverse consequences for an individual if the quality of personal information is not ensured. More rigorous steps are required where the information collected, used or disclosed is ‘sensitive information’, such as biometric data used in FRT.
Reasonable steps to ensure accuracy
Organisations should consider the reasonable steps they will need to take to ensure accuracy. These will depend on the circumstances but may require the organisation to:
- Take steps to ensure the referenced database is made up of accurate and up-to-date information
- Run a trial and conduct regular testing of accuracy
- Undertake due diligence in relation to data quality practices, and
- Clearly communicate any limitations in relation to the accuracy of the FRT system.
FRT carries inherent accuracy risks. Organisations must develop processes to check the proportion of predictions the FRT system gets right. If a FRT system is not sufficiently accurate, it may lead to:
- False negatives – a failure to identify an individual whose face is part of the reference database, or
- False positives – the matching of faces that belong to two different individuals.
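The two error types above can be quantified during regular accuracy testing. The following is a hedged sketch, not an OAIC-prescribed method: it assumes an organisation has run the FRT system against a labelled evaluation set, and the `error_rates` helper and its input format are hypothetical names introduced for illustration.

```python
def error_rates(results):
    """Compute false-positive and false-negative rates from labelled trials.

    `results` is a list of (predicted_match, actual_match) boolean pairs,
    one per trial, from testing the FRT system against a labelled set.
    """
    false_pos = sum(1 for pred, actual in results if pred and not actual)
    false_neg = sum(1 for pred, actual in results if not pred and actual)
    negatives = sum(1 for _, actual in results if not actual)
    positives = sum(1 for _, actual in results if actual)
    # Rate = errors of each type divided by the relevant trial count.
    fpr = false_pos / negatives if negatives else 0.0
    fnr = false_neg / positives if positives else 0.0
    return fpr, fnr
```

Tracking both rates matters because they trade off against each other: lowering the match threshold reduces false negatives but increases false positives, so an organisation needs to decide which error carries the greater risk of detriment to individuals.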
Bias and discrimination
Another risk in using an FRT system is in-built bias and discrimination against certain demographic groups, which may lead to adverse impacts and unfair outcomes. Even if an FRT system is highly accurate, the training data may reflect past bias and discrimination depending on the data used. Organisations must ensure this is accounted for if they are using or designing an FRT system.
Organisations relying on a third party hosted FRT system must conduct their own due diligence to manage risks associated with inaccuracy, bias and discrimination. For example, organisations should ensure that a third party hosted FRT system has been subject to robust testing and monitored for evidence of inaccuracy, bias and discrimination.
Organisations should consider
- Do you have appropriate and robust steps in place to check the FRT system is producing accurate results?
- What strategies have been developed to manage and mitigate risks associated with false negatives and false positives?
- What due diligence have you undertaken to assess the accuracy of the FRT? For example, if you are relying on a third party FRT system, have you been informed about the technical effectiveness and statistical accuracy?
- Have you implemented measures to mitigate risks of bias, discrimination and unfair treatment of different demographic groups prior to using an FRT system?
Accountability and ongoing assurance (APP 1)
An organisation will need to take reasonable steps to implement practices, procedures and systems relating to its function or activities that will ensure compliance with the APPs and any binding registered APP code.[19] In addition to conducting a PIA, this includes:
- Procedures for identifying and managing privacy risks at each stage of the information lifecycle, including collection, use, disclosure, storage, destruction and de-identification
- Clear and robust governance mechanisms to ensure compliance with the APPs, such as designated privacy officers and regular reporting to the organisation’s governance body
- Regular staff training and information bulletins on how the APPs apply to the organisation, and its practices, procedures and systems developed under APP 1.2
- Appropriate supervision of staff regularly handling personal information, and reinforcement of the organisation’s APP 1.2 practices, procedures and systems, and
- A program of proactive review and audit of the adequacy and currency of the organisation’s Privacy Policy and of the practices, procedures and systems implemented under APP 1.2, including for dealing with inquiries and complaints.[20]
An organisation should be able to demonstrate that these steps have been taken and that practices, procedures and systems are regularly reviewed and updated.
FRT topics in policies and procedures
In the context of FRT, the topics addressed in policies and procedures should include, but are not necessarily limited to:
- How the FRT system collects, uses, holds and discloses personal information
- The circumstances in which the FRT system can be used
- Controls on staff access to the FRT system and the referenced database
- The process for enrolling and reviewing images in the referenced database
- The process for assessing positive matches and false positives
- A retention and destruction protocol for any biometric information collected
- The process for handling complaints
- Training requirements for relevant staff, and
- Systems to review the efficacy of the FRT system and implement the relevant policies.
Organisations will also need to have a clearly expressed and up to date Privacy Policy about how they manage personal information, such as biometric information collected using FRT.[21] This needs to be regularly reviewed and updated to ensure it accurately reflects the organisation’s information handling practices, as well as that it is easy for individuals to understand and navigate.
Organisations should consider
- What governance arrangements do you have in place? Some examples include designated privacy officers and regular reporting to the organisation’s governance body.
- How is the effectiveness of privacy risk management practices and policies being assessed?
- Do you have clear processes in place to ensure you are handling personal and sensitive information in accordance with your legislative obligations?
- Are you delivering training on privacy, risk management and other practices and policies to employees? Training should be documented and conducted periodically to refresh and update employees’ knowledge on emerging privacy issues.
- Have you clearly outlined how employees are expected to handle personal and sensitive information?
- Do reporting mechanisms exist to ensure that employees are routinely informed about changes to practices and policies?
- Are you regularly reviewing and updating your Privacy Policy to ensure that it reflects your information handling practices?
- Is there an adequate level of human control or oversight over the FRT?
- Are you undertaking periodic audits of the effectiveness and necessity of using the FRT?
- Given the rapid pace of FRT advancements, are privacy risk management practices and policies flexible and adaptable to changes in technology?
- Have you developed a data breach preparation and response plan that can be relied on in the event of a cyber security incident? If you are relying on a third party hosted FRT system, consider who will be allocated responsibility for meeting legislative requirements. For example, if the biometric information is jointly held, who will be responsible for complying with the Notifiable Data Breaches scheme in the event of a data breach and handling complaints?
- Are there processes that allow individuals to easily access and correct their personal information? You must respond to a request for correction within a reasonable period after the request is made. In most cases, a reasonable period will not exceed 30 calendar days.
- If you are relying on a third party hosted FRT system, have you obtained sufficient information to inform your privacy risk management practices and policies?
Additional resources
OAIC resources
- Guide to undertaking privacy impact assessments
- PIA e-Learning course
- Guide to securing personal information
- Data breach preparation and response guide
- Biometric scanning
[1] s 6 of the Privacy Act.
[2] APP 11.
[3] OAIC, Privacy by design.
[4] APP 1.2 requires organisations to take reasonable steps to implement practices, procedures and systems to ensure the organisation complies with the APPs and is able to deal with related enquiries and complaints. For more information, see Chapter 1: APP 1 Open and transparent management of personal information
[5] APP 3.2.
[6] Australian Privacy Principles guidelines (oaic.gov.au) [3.18].
[7] Australian Privacy Principles guidelines (oaic.gov.au) [3.19].
[8] APP 3.4 lists five exceptions to the requirements of APP 3.3(a).
[9] Australian Privacy Principles guidelines (oaic.gov.au) [6.17].
[10] Australian Privacy Principles guidelines (oaic.gov.au) [B.144].
[11] Australian Privacy Principles guidelines (oaic.gov.au) [B.41] – [B.43].
[12] For more information, see the APP Guidelines.
[13] APP 1.
[14] The matters are listed under APP 5.2.
[15] APP 5.1.
[16] Australian Privacy Principles guidelines (oaic.gov.au) [5.4].
[17] Australian Privacy Principles guidelines (oaic.gov.au) [5.35].
[18] APP 10.1.
[19] APP 1.2(b).
[20] Australian Privacy Principles guidelines (oaic.gov.au) [1.7].
[21] APP 1.3.