Retailers face increasing security challenges, but privacy risks need to be carefully managed.
As the title of my session today suggests, security and privacy aren’t an either/or equation.
Today, I’m going to cover:
- why we can and must have both – narrow thinking is the enemy
- key privacy considerations for retail security tech – yes, I will be referring to the Bunnings case
- work we are doing in this space.
I challenge you to consider this topic today and what it means for your business, but also to consider this from another perspective – you as an individual – what are you comfortable with? Where’s your line?
Many of the technologies increasingly used in retail settings give rise to ethical questions, like whether we’re comfortable with indiscriminate surveillance.
It is a pivotal moment for tech and privacy, and the decisions we make now will shape our society into the future.
Retail security and privacy – we can have both
Security and privacy are often framed as competing interests, but responsible use of technology allows us to achieve both.
It’s imperative that we have both.
Security, of course, is vital to ensure the safety of retail staff, contractors and customers. And it’s important to prevent stock loss, damage and in-store incidents – protecting the business and contributing to growth, including of the Australian economy. Security initiatives, generally speaking, are pursued with good intentions.
Privacy underpins freedom of association, thought and expression, as well as freedom from discrimination. Information privacy is about promoting the protection of information that says who we are, what we do and what we believe.
While our right to privacy isn’t absolute, it is a fundamental human right, and as Edward Snowden once said: ‘Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.’
Privacy is not just a legal and an ethical obligation. Privacy makes good business sense.
Done well it can support positive business outcomes such as:
- transformation and growth, through underpinning responsible innovation
- improved customer trust
- competitive advantage in an increasingly privacy-conscious market.
Our research shows Australians place a high value on privacy when choosing a product or service, with it ranking only after quality and price. They are even prepared to experience some inconvenience if their privacy is guaranteed.
Just as privacy can support positive outcomes, privacy done poorly has business impacts like:
- legal risks, including scrutiny from the regulator
- loss of customer trust – we’ve seen this time and time again with data breaches
- ultimately, lower revenue. It can hit the bottom line.
So, it’s important to get the balance right.
Key privacy considerations
Striking that balance is a matter of looking at what the law says (the ‘can we’), but also asking ‘should we’.
The Privacy Act is a principles-based and technology-neutral law – this was by design to ensure the law was flexible to the circumstances of specific entities and relevant in the case of technological change.
At the centre of the framework are the Australian Privacy Principles – there are 13 of them; they set out standards, rights and obligations around the handling of personal information.
When I made my Bunnings decision, we also released guidance on the use of facial recognition technology, focusing on 4 key privacy concepts. I’d like to spend some time on them, as they are relevant to many of the security initiatives your businesses might be considering.
When exploring new initiatives that involve collecting personal information, you need to consider whether the collection is necessary and whether the reason for it is proportionate to any impacts on privacy. There might be less privacy-intrusive options available that achieve the goal at hand.
Depending on what information you are collecting, you may need consent – you generally need consent to collect sensitive information, a category of personal information that includes biometric information. You also need to be transparent around how you collect, use and share individuals’ personal information.
You need to ensure accuracy – that any personal information is accurate, up to date, complete and relevant – and minimise any risks of bias and discrimination.
And you need to have governance arrangements in place to minimise privacy risks. These should be documented, but more importantly, actually put into action.
Another key privacy principle is the need to take reasonable steps to secure any personal information you hold, including minimising data breach risks. Of course, the best way to do that is to not collect it in the first place.
Now, I’d like to look at some of those principles in the context of the Bunnings decision – and I note that my decision is subject to an appeal in the Administrative Review Tribunal.
Consent wasn’t an issue I considered in this case as Bunnings relied on an exemption – had they obtained consent, their use of facial recognition technology may have been defensible.
Under the exemption, where an organisation is using facial recognition technology to lessen or prevent serious threats to the health, safety and security of customers in a commercial or retail setting, they must be able to demonstrate it is proportionate to the risks identified.
I’d like to acknowledge upfront that Bunnings had good intentions in trying to protect their staff and customers and prevent other unlawful activity.
I had to watch confronting footage in reviewing the evidence – which very likely mirrors the experiences of many in this room, and I sympathise with you.
However, in many of the examples, facial recognition technology wouldn’t have enhanced Bunnings’ ability to address the issue, for example, where the perpetrator was wearing a balaclava, or had a knife, or was physically aggressive on entry.
The system could only be relied upon to address a relatively small number of incidents, and yet interfered with the privacy of everyone, including children and other vulnerable people, who entered the store by enabling covert and indiscriminate surveillance.
On transparency: Bunnings failed to take reasonable steps to notify individuals that their personal information was being collected, and did not include required information in their privacy policy. This meant that individuals who entered the relevant stores would’ve had no idea that facial recognition technology was in use and that their sensitive information – their faces, which we can’t change – was being collected, even if momentarily.
I would argue that beyond being a legal requirement, being transparent about the use of facial recognition technology is a smart thing to do. It may have the effect of influencing customer behaviour and, of course, it empowers customers to make informed choices about whether they want to frequent that retail setting.
Our investigation also found various governance shortcomings, such as a failure to conduct a thorough assessment of privacy implications of the system and develop policies and procedures governing its use.
Then there’s the other question I mentioned earlier: Should we? This requires a consideration of ethics, community expectations and corporate social responsibility.
The kind of surveillance enabled by facial recognition technology undermines our ability to control our personal information, and can have a bigger societal impact. It also carries with it risks such as the potential for bias and inaccuracy.
Our research told us that more than a quarter of Australians feel that facial recognition technology is one of the biggest privacy risks faced today, and only 3% of Australians think it’s fair and reasonable for retailers to require their biometric information when accessing their services.
This level of community concern is not just the case for facial recognition, but many other technologies growing in use – AI being one that comes to mind.
Thinking about what the law permits, but also what the community would expect, and what would and would not pass the pub test is critical.
There are mixed views on my Bunnings decision. I am pleased to see the high level of interest in this privacy issue, as it’s important as a community that we have conversations like these about what we want our society to look like into the future.
Bunnings’ appeal of my determination will be heard by the Administrative Review Tribunal in October. I welcome the opportunity to test our interpretation of the Privacy Act in the courts. And we are also working to finalise our investigation into Kmart about similar issues.
What’s next
So, what’s next – well, facial recognition technology is the issue of the moment.
We are planning targeted engagement with stakeholders in the retail industry to develop our understanding of key issues around its use. Issues like:
What are the practical challenges of establishing the 4 elements of consent required under the law: informed, voluntary, current and specific, and given with capacity?
Where and when should consent take place in retail and commercial premises?
Can retailers use signage, together with other means of notice (such as privacy policies or terms of entry), to meet the requirements of consent?
What is the relationship between consent processes and factors like:
- spatial context, that is, whether the space is publicly accessible, restricted or semi-public
- how essential the retail or commercial space or service is – is there a difference for luxury goods stores versus supermarkets?
- whether there are other options nearby – for example, in rural versus urban areas
- where in a retail or commercial space facial recognition technology is used?
We are also thinking about how consent could be obtained from individuals who have particular needs, such as individuals from a non-English speaking background and individuals with limited legal capacity. How could children, who lack capacity to provide consent, be excluded from facial recognition technology systems? Where it is not practicable or reasonable to assess the capacity of an individual on a case-by-case basis, how could this element be satisfied?
These engagements with industry – which we will be reaching out to stakeholders about soon – will develop our understanding of these issues so we can provide clarity to those considering the technology.
This work is aligned with one of my key priorities as Privacy Commissioner, which is to make complying with the law easier.
To that end, we’ve recently released guidance for organisations on a range of issues, from using commercially available AI products, to privacy obligations around third-party tracking pixels that you might use on your website.
Another priority area I’d like to touch on briefly is privacy reform. Many of you will be aware that the Privacy Act was strengthened last year to include new measures, including expanded enforcement and investigation powers for the OAIC.
We’ve recently stood up a privacy reform implementation taskforce that is coordinating the OAIC’s and the regulated community’s preparedness for the implementation of those reforms.
Our job as the regulator is to use the right tool in our toolbox, in a proportionate and responsive way, to ensure the best privacy outcomes for Australia. I want to put you in the best possible position to comply with the law, so you can innovate with confidence.
The importance of pausing
But innovating doesn’t mean rushing in. New tech offers convenience and security, but the risks must be carefully weighed.
So, I’d like to conclude by reiterating the need for thoughtful adoption:
- Is what you’re seeking to do truly necessary, and is it proportionate to any impacts on privacy?
- Have you considered key privacy principles: consent, transparency, accuracy, governance, security?
- A good methodical way to step through these considerations is to do a privacy impact assessment. We’d like to see privacy impact assessments become standard practice before implementing new technologies.
Retailers do not have to choose between security and privacy – both are possible with the right approach.
Let’s take the opportunity to pause, get the balance right, and not sleepwalk into using technology that may have ramifications greater than we intend.