
16 December 2021

Australian Information Commissioner and Privacy Commissioner Angelene Falk has determined that the Australian Federal Police (AFP) failed to comply with its privacy obligations in using the Clearview AI facial recognition tool.

Commissioner Falk found the AFP failed to complete a privacy impact assessment (PIA) before using the tool, in breach of clause 12 of the Australian Government Agencies Privacy Code, which requires a PIA for all high privacy risk projects.

The AFP also breached Australian Privacy Principle (APP) 1.2 by failing to take reasonable steps to implement practices, procedures and systems in relation to its use of Clearview AI to ensure it complied with clause 12 of the Code.

In an earlier determination, the Commissioner found that Clearview AI interfered with Australians’ privacy by scraping biometric information from the web and disclosing it through a facial recognition tool.

Clearview AI’s facial recognition tool allows users to upload a photo of an individual’s face and match it to photos of that person’s face collected from the internet. It then links to where the photos appeared.

Between 2 November 2019 and 22 January 2020, Clearview AI provided free trials of the facial recognition tool to members of the AFP-led Australian Centre to Counter Child Exploitation (ACCCE).

ACCCE members uploaded facial images of Australians to test the functionality of the tool and, in some cases, to try to identify persons of interest and victims in active investigations. The AFP did not assess the risks of providing personal information to a third party located overseas, nor did it assess the tool's security practices, accuracy or safeguards.

“I recognise that facial recognition and other high privacy impact technologies may provide public benefit where they are accompanied by appropriate safeguards,” Commissioner Falk said.

“But there were a number of red flags about this third-party offering that should have prompted a careful privacy assessment.

“By uploading information about persons of interest and victims, the ACCCE were handling personal information in a way that could have serious consequences for individuals whose information was collected.”

The Commissioner also considered that the AFP did not have appropriate systems in place to identify, track and accurately record its trial of this new investigative technology, which involved the handling of personal information.

Gaps in the AFP’s internal systems for identifying novel personal information collection practices meant there was not a coordinated approach to identifying high privacy risk projects. There were also gaps in the AFP’s mandatory privacy training, including insufficient information about conducting PIAs.

These gaps were particularly relevant for teams like the ACCCE that are exploring new and innovative investigative solutions, including capabilities for identifying potential offenders and victims.

The Commissioner said she recognised the AFP’s commitment to reviewing and strengthening its privacy governance framework and embedding a culture of privacy compliance across the agency.

“This determination should provide additional assurance to Australians that deficiencies in the AFP’s privacy governance framework will be addressed, under the OAIC’s oversight,” Commissioner Falk said.

Commissioner Falk has directed the AFP to:

  • engage an independent assessor to review and report to the OAIC on residual deficiencies in its practices, procedures, systems and training in relation to privacy assessments, and make any necessary changes recommended in the report
  • ensure that relevant AFP personnel have completed an updated privacy training program.

The full determination can be found on the OAIC website.