


Read the keynote address prepared for delivery by Privacy Commissioner Carly Kind for the IAPP Sydney KnowledgeNet event on Monday 6 May 2024, 'How to power up a privacy program for emerging and evolving technologies'.

Introduction

It’s a pleasure to be here today, on the first day of Privacy Awareness Week 2024 – and at such a busy and interesting time in privacy.

I would like to begin by acknowledging the Traditional Custodians of the land on which we meet today, the Gadigal people of the Eora Nation. I pay my respects to Elders past and present and extend that respect to any First Nations peoples with us today.

I know I won’t be the only one in the room who reads the news every day and sees privacy as central to so many of the issues that challenge politicians, policymakers and ordinary people:

  • There are the repeated data breaches, of course – just last week there were two high-profile incidents involving Qantas and NSW clubs.
  • There is the continued concentration of immense power in the hands of a few large tech companies, which causes problematic dynamics in digital markets – evidenced most recently by the reluctance of one tech boss who shall not be named to comply with regulatory restrictions in the safety realm, and by the ongoing battle between tech companies and security services over encryption.
  • The seemingly unstoppable growth of artificial intelligence, and in turn the continued concentration of power in those same few companies who also control much of the infrastructure, compute, data and expertise likely to fuel the AI revolution.
  • There is even the continued scourge of misogyny online, as recently identified as one of the key components of persistent egregious levels of domestic violence in this country, which is in part fuelled by the incentives of the online attention economy.

All of these issues, and many more, relate to privacy, and in my view could be tempered or mitigated through stronger, better privacy protections. Which is not to say that privacy alone is the solution – I think the experience of Europe in recent years has been that privacy regulators have to work hand in hand with competition regulators and others to tackle the most complex dynamics of online markets – but that at the very least privacy may be the starting point.

Now this is unsurprisingly the view of someone who works in privacy and believes wholeheartedly in the right to privacy and in privacy regulation – when you’re a hammer, everything looks like a nail. But if the dinner tables I’m at are anything to judge by, it is also, instinctively, the view of many of our fellow citizens and consumers.

Privacy is on everyone’s lips these days.

In a way, privacy has gone mainstream. It’s interesting to reflect on where we’re at in this respect in contrast to 10, or even five years ago, when it was common to hear people lamenting that 'we don’t have any privacy anymore anyway'. It’s hard to find and compare good longitudinal data on this, but one example is a Pew Research study showing that the share of Americans who are worried about the government’s use of data increased from 64% in 2019 to 71% in 2023. This increase is reflected in our own research of Australian attitudes to privacy; for example, the proportion of Australian adults who care enough about protecting their personal information to do something about it increased from 75% in 2020 to 82% in 2023.

This trend towards valuing privacy more and more is reflected around the world. If we just look at trends in regulation, and drawing on the excellent research of Australia’s very own privacy guru Professor Graham Greenleaf, from February 2021 to March 2023, 17 new countries enacted data privacy laws, bringing the total to 162 globally. There is even now draft privacy legislation under contemplation in the US, a jurisdiction historically averse to federal privacy legislation, and it seems possible that the country will enact a privacy law before the end of the year.

Privacy Awareness Week

It is against this backdrop, then, that we commemorate Privacy Awareness Week. This year, awareness of privacy is higher than ever before, arguably. Expectations for better privacy practices are stronger than ever before. Recognition by business of the ethical imperative to be good privacy players is more widespread than ever before.

And yet – privacy harms are still widespread, data breaches occur weekly, data-driven business models are still pervasive and individuals still feel a lack of agency and control when it comes to their personal data.

For that reason, this year we’re calling on people, organisations and government to power up privacy – to take control and to step things up. We really want to see entities inject some power into their approach to privacy, rather than simply being in responsive mode or dealing with privacy issues late in the day. We would also like to see government power up privacy Australia-wide by introducing the reforms to the Privacy Act that are so overdue. In doing so, we believe that we can restore some power to individuals to feel in control of their personal information once more.

Privacy reform

It is no coincidence that I have taken up the role of Privacy Commissioner at a time in which Privacy Act reform is on the agenda. For me, the prospect of reform is hugely exciting for a number of reasons:

  • There are substantial gains to be made for the Australian community through the proposed reforms, not least those that relate to the protection of children and vulnerable groups.
  • There will be an exciting opportunity for the Office of the Australian Information Commissioner to become a more enforcement-focused regulator, with a range of new enforcement powers at our fingertips.
  • There is the potential for the Australian regime to leapfrog equivalent frameworks overseas and take some novel approaches, including the new fair and reasonable test, which should aid in dismantling the practice of organisations using consent as a gateway to problematic privacy practices.

I come into this role having spent the past five years working on AI and data governance and policy as Director of the London-based research institute, the Ada Lovelace Institute, which has a remit to ensure that data and AI work for people and society. In that role, I thought a lot about the role of data privacy regulation and regulators in grappling with new and emerging technologies, particularly AI.

This, I know, is probably the biggest issue on many of your minds at the moment. I recently had the good fortune to attend the IAPP Global Privacy Summit in Washington. It was an excellent opportunity to converse with many of my regulatory peers and to hear first-hand many of the views and concerns within the privacy and data protection world.

The clearest issue of interest and challenge for privacy professionals worldwide that came through the many events and panels at the conference was how privacy professionals should be thinking about AI, and what AI governance and regulation will ultimately look like. How should the privacy profession navigate its way through this new era?

This goes to the overarching theme of today’s convening and Privacy Awareness Week, which is 'Privacy and technology: improving transparency, accountability and security'. In thinking about what this means in the context of emerging technologies, I think privacy professionals should have a few things in mind:

The first is that you can’t go wrong with a precautionary approach. Undoubtedly many of you will be faced with colleagues who are ready and eager to start deploying experimental AI tools in every aspect of your product lines or services. Some of those deployments may not involve personal information, in which case the risks may still be substantial (particularly of error) but are less likely to affect individuals’ data. Where you’re considering using personal information in the context of AI technologies, such as generative models or LLMs, it is important to remember the immense privacy challenges associated with the use of such tools.

In this context I want to acknowledge that often privacy professionals are put in a very difficult position when it comes to the development of new technologies that promise efficiency savings. Privacy professionals are often put in the unenviable position of saying 'slow down' to their colleagues. Indeed, research into the experience of 'ethics' actors within organisations shows that often they have the same KPIs as colleagues – i.e. related to the speed and financial success of product or service delivery – even though their role is a substantively different one that may even thwart product or service delivery altogether.

I have been in this position many times, and I understand how difficult it can be. As you will all know, I think the key here is finding a way to convince colleagues that prioritising privacy is good for business.

Years ago I did some work with the UN Refugee Agency in West Africa, developing their first ever data protection policy. Although the organisation was full of well-meaning people who instinctively understood why the protection of personal information might be important to refugees, they felt passionately that it came second to their most important job, which was to protect refugees. The privacy practices were pretty woeful, and ranged from no process for obtaining consent of children or their parents to take photographs that may one day end up on billboards for the purpose of soliciting donations, to pretty liberal information sharing practices. I could talk until I was blue in the face about domestic or international regulation and standards, but ultimately the only thing that was able to shift practices in the end was explaining that data protection would be good for business – that it would help those committed to protecting refugees to protect them well. Helping them work through the risks of bad privacy practices was part of that – drawing out the implications of sharing biometric registries with countries who might be seeking to identify those fleeing political persecution, for example.

I would encourage you, then, to get into the habit of using privacy impact assessments to surface privacy challenges of new and emerging technologies, and to share them with your colleagues. Ensuring that these risks are elevated throughout the governance frameworks of your organisation is also key.

The second point I’d make around AI and new technologies is that transparency is the best policy when it comes to technologies which themselves can obfuscate or obscure. One of the key provisions in the final text of the EU AI Act is around disclosure to consumers about when they’re interacting with AI-generated content, and the US has recently issued an executive order stating that the Department of Commerce will develop standards on AI watermarking. Increasingly, I believe AI watermarking will become commonplace. Three provisions in the Privacy Act reforms will also speak to this:

  1. a right for individuals to request meaningful information about how substantially automated decisions with legal or similarly significant effect are made [agreed]
  2. a requirement for organisations to include information in privacy policies about the use of personal information to make substantially automated decisions with legal or similarly significant effect [agreed]
  3. the introduction of a positive obligation for personal information handling to be fair and reasonable in the circumstances, which we see as a new keystone of Australia’s privacy framework [agreed in principle]

Let me reassure you that you’re not the only ones who are thinking about how AI will change your work going forward. On our end, we’re also following closely the work that the government is doing on AI regulation and governance. We’re working hard with our colleagues in other regulators through the Digital Platform Regulators Forum, which is made up of the OAIC, the Australian Competition and Consumer Commission, the Australian Communications and Media Authority and the eSafety Commissioner. We’ve published working papers on algorithms and LLMs, and a literature summary on the harms and risks that some commonly used types of algorithms pose to end-users and society.

Online privacy and high privacy impact technologies, including practices involving the use of generative AI, facial recognition and the use of other biometric information, are also high on our regulatory priorities. The Australian Information Commissioner has made determinations concerning the collection of biometric information by Clearview AI and 7-Eleven to match facial biometric templates. The OAIC also has ongoing investigations into the use of facial recognition technology by Bunnings Group Limited and Kmart Australia Limited. These technologies typically rely on artificial intelligence through the use of machine learning algorithms to match biometric templates, and constitute some of the most concerning technological developments from the perspective of the Australian community.

Regulatory practice

Finally, I think it’s incumbent on regulators everywhere to think about how AI should inform our regulatory practice, either through necessitating new investigative techniques such as algorithmic audits, or through deployment in house of AI technologies to streamline complaints handling and provide more efficient access to information for citizens. We’ve begun looking at this at the OAIC.

I don’t need to tell you this as privacy professionals: strong privacy practices are good for everyone: for consumers, who feel more confident participating in the digital economy; for businesses, which can boldly innovate knowing that guardrails are in place to protect customers; and for government, which can realise the benefits of new technologies with the trust of its citizens.

At this time of immense change, and recognising the building community support for stronger privacy protections, we are urging Australian businesses, agencies and other organisations to step up and ‘power up’ privacy and make a real difference for the community.

With privacy reform on the way, and developments in technology continuing to evolve and challenge privacy practices, there has never been a more pressing moment for individuals, businesses and governments to pay attention to privacy.

I’m so grateful that we have this robust community of privacy professionals to help us spread the message. As I said, I acknowledge you don’t always have the easiest role, but it is a vital one. I look forward to working alongside you all to ensure that privacy across Australia is powered up in the months and years to come.