
Published date: 24 February 2023

Introduction

1 The Office of the Australian Information Commissioner (OAIC) welcomes the opportunity to make a submission to the Committee’s inquiry into online gambling and its impacts on those experiencing gambling harm (the Inquiry).

2 The OAIC is an independent Commonwealth regulator, established to bring together three functions: privacy functions (protecting the privacy of individuals under the Privacy Act 1988 (Cth) and other legislation), freedom of information functions (access to information held by the Commonwealth Government in accordance with the Freedom of Information Act 1982 (Cth)), and information management functions (as set out in the Australian Information Commissioner Act 2010 (Cth)).

3 We understand that the dynamic digital market and constantly evolving technologies and products are challenging the parameters of existing legislative frameworks that apply to online gambling and related content.[1] We welcome the Committee’s consideration of the effectiveness of the existing regulatory framework and consumer protections in addressing the risks and harms associated with gambling in the online environment.

4 A similar need has arisen in the privacy jurisdiction to consider whether Australia’s privacy framework is proportionate, sustainable and responsive to a rapidly evolving online environment. The Australian Government has recently released the report of the Attorney-General’s Department’s review of the Privacy Act (Privacy Act Review report), which contains various proposals that aim to ensure the Act and its enforcement mechanisms are fit for purpose in the digital economy.[2]

5 While gambling-related regulations are outside the OAIC’s regulatory remit, this submission highlights how the handling of personal information can facilitate certain ancillary practices (such as profiling and targeted advertising) that may amplify online gambling harms. We consider that an enhanced privacy framework would function as an important safety net to supplement any proposed reforms to existing sector-specific regulatory frameworks.

Privacy risks and harms in the online environment

6 The Privacy Act protects the privacy of individuals by regulating the collection, use and disclosure of personal information by Australian Privacy Principle (APP) entities. The APPs are technology-neutral and apply to the handling of personal information in both online and offline environments.

7 Many of the privacy risks and harms in the online environment have emerged due to the increase in the amount of data and personal information collected, used, and shared, both in Australia and globally, to support the ads-based revenue model of the internet.

8 Personal information allows social media and other platforms to create detailed profiles about their users, which enables them to sell highly targeted advertising units. These platforms also employ sophisticated and privacy-invasive methods such as profiling and cross-device tracking to more accurately personalise and target individuals with marketing material.

9 Most platforms still generally allow gambling advertising subject to some restrictions. Twitter appears to be the only platform that prohibits, rather than restricts, gambling-related advertising. Facebook, Google and Snapchat impose some additional safeguards by requiring advertisers to gain pre-approval for gambling-related ads.[3]

10 Targeted advertising for gambling products has the potential to exacerbate the harms faced by those at heightened risk, including problem gamblers and children.

11 In its Digital Platforms Inquiry Final Report, the ACCC recognised that the risks associated with data collection and use could be particularly acute for children.[4] In the online environment, entities may share children’s data for advertising purposes, or engage in harmful tracking, profiling of, or targeted marketing to children.[5]

12 In a report titled Profiling Children for Advertising: Facebook’s Monetisation of Young People’s Personal Data, Reset Australia found that Facebook used personal data it collected about underage users to create profiles of young people with harmful or risky interests such as smoking, gambling, alcohol or weight loss.[6]

13 Facebook restricts advertising content about alcohol products, gambling, and other age-inappropriate categories. However, at the time of the report, it allowed advertisers to target underage users interested in gambling and other inappropriate categories with advertising content that indirectly promoted these products or services.

14 To demonstrate this, Reset Australia developed an ad campaign and obtained approval from Facebook to deliver ads through Instagram Stories to 13–17-year-olds profiled as interested in gambling, including ads that encouraged users to ‘win prizes’ in mobile-game-style gambling. While Reset Australia did not run the ad campaign for ethical reasons, its report noted that most of the ads it submitted were approved by Facebook.[7]

15 Personal information also fuels the curation of highly personalised streams of content calculated to maximise a user’s engagement on the platform. The longer a social media platform can keep someone engaged, the more advertising it can sell in that curated stream (and the more data it can collect about the user).[8]

16 Social media platforms use detailed information about a user, such as their behaviour on the platform (including how long they have watched a certain video or what content they have interacted with), their geographical location, their interests, and what other sites they have visited (including gambling sites), to serve them with content that they are interested in and want to engage with.

17 The more a user seeks out and interacts with certain content, the more the platform learns about their interests and serves content that aligns with its assessment of those interests. This can result in increased exposure to particular content, such as gambling-related content, which may be harmful for some users.

18 We understand that personal information may also be used to personalise loot-boxes, enticing consumers to spend more time and money in games. Loot-boxes are packages of digital content in online games where consumers spend money to receive random in-game content. Studies have found that young people who spend money on loot-boxes are more than 10 times as likely to be problem gamblers as those who do not.[9] The mixed commercial messaging between the advertisement (e.g. a loot-box) and the game itself can make it difficult for users to understand how their personal information is being used to shape the gaming environment.

19 A lack of transparency by online platforms around complex data practices also presents significant challenges for individuals in making informed decisions about how their personal information is handled online. Many online platforms present take-it-or-leave-it terms that do not provide individuals with meaningful choice or control over how their personal information will be handled.

Reforming Australia’s Privacy Act

20 The OAIC considers that enhancing Australia’s existing privacy framework as outlined in the Attorney-General’s Department’s Privacy Act Review report would supplement the broader public policy and regulatory response to online gambling issues.

21 We take this opportunity to highlight some of the Department’s key proposals in the report that we consider would help to mitigate the risks associated with those practices described above that may contribute to online gambling harms.

22 We support the proposal to introduce a positive obligation on APP entities to handle personal information fairly and reasonably, which we consider would operate as the new keystone for the Privacy Act.

23 Personal information handling obligations under the APPs are largely framed through the lens of what is reasonably necessary for a business’s functions and activities. The APPs do not currently require APP entities to consider the privacy impacts on individuals at the outset.

24 The fair and reasonable test would address this gap by requiring entities to proactively consider the foreseeable risks to individuals and take reasonable steps to mitigate those potential impacts. This would set a baseline standard of information handling that is flexible and able to adapt as circumstances and technology change.

25 The requirement that the collection, use and disclosure of personal information must be fair and reasonable would also apply irrespective of whether consent has been obtained. This would prevent consent from being used to legitimise activities that are inherently unfair and unreasonable.

26 We support the introduction of legislated factors to help entities determine whether the collection, use and disclosure is fair and reasonable in the circumstances. Those factors include:

  • whether an individual would reasonably expect the personal information to be collected, used or disclosed
  • the kind, sensitivity and amount of personal information being collected, used or disclosed
  • whether the impact on privacy is proportionate to the benefit
  • the risk of unjustified adverse impact or harm, and
  • where the collection, use or disclosure of the personal information relates to a child, whether that collection, use or disclosure is in the best interests of the child.

27 An example of a practice that may be unfair or unreasonable under the new test may include where social media companies infer highly personal or sensitive information about individuals, particularly children, including about their moods or socio-economic status, and target them with inappropriate content, such as gambling advertising or gambling-related content.

28 We also support the proposal in the report that APP entities must conduct a Privacy Impact Assessment (PIA) for activities with high privacy risks and should be required to produce a PIA to the OAIC on request. In our submissions to the Privacy Act Review, we recommended that activities with high privacy risks (‘restricted practices’) should be specifically set out in the Act including but not limited to:

  • direct marketing, including online targeted advertising
  • the handling of sensitive information on a large scale
  • the handling of children’s personal information on a large scale, and
  • the handling of personal information for the purposes of online personalisation and delivering targeted advertising.

29 The Department has also proposed express prohibitions on certain activities including:

  • direct marketing to a child unless the personal information used for direct marketing was collected directly from the child and the direct marketing is in the child’s best interests
  • targeting to a child unless the targeting is in the child’s best interests
  • trading the personal information of children, and
  • targeting individuals based on sensitive information with an exception for socially beneficial content.

Digital Platform Regulators Forum (DP-Reg)

30 The OAIC has observed growing intersections between domestic frameworks including privacy, competition and consumer law, online safety and online content regulation.

31 While there are synergies between these frameworks, it is important to note that there are also variances given that each regulatory framework is designed to address different economic, societal and policy issues. In this way, each regime is an essential and complementary component in the ring of defence that is being built to address the risks and harms faced by Australians in the online environment. The harms that may arise from online gambling raise several intersecting issues and require a multi-faceted regulatory response.

32 To support a streamlined and cohesive approach to the regulation of digital platforms, the Australian Communications and Media Authority (ACMA), the Australian Competition and Consumer Commission (ACCC), the OAIC, and the Office of the eSafety Commissioner together formed the Digital Platform Regulators Forum (DP-Reg).

33 DP-Reg is an initiative of Australian independent regulators to share information about, and collaborate on, cross-cutting issues and activities on the regulation of digital platforms. This includes consideration of how competition, consumer protection, privacy, online safety and data issues intersect.

34 Each regulator brings an important and distinct lens, based on its remit, to intersecting issues on digital platforms, including how to address online gambling harms. One of DP-Reg’s current priorities is improving transparency about what digital platforms are doing to protect Australians from potential harm, including how data is being handled. Collaboration and coordination on these topics ensure a proportionate response and a shared focus on improving safety and trust in our digital economy.

Conclusion

35 The OAIC recognises the complex and multifaceted nature of online gambling and its impacts on those experiencing gambling harm. We remain committed to collaborating and working cooperatively with policy agencies and other regulators that have primary carriage of addressing, and responding to, the risks and harms associated with online gambling.

36 The reforms proposed as part of the Privacy Act Review provide an important opportunity to enhance existing privacy protections, which would support the broader public policy and regulatory response to the online gambling harms that the Committee is considering as part of this Inquiry.

Footnotes

[1] Australian Communications and Media Authority (ACMA), Submission to the Inquiry into online gambling and its impacts on those experiencing gambling harm, ACMA, November 2022, accessed 16 February 2023, p 1.

[2] Attorney-General’s Department (AGD), Privacy Act Review – Report, AGD, February 2023, accessed 23 February 2023.

[3] N Witzleb, M Paterson, J Wilson-Otto, G Tolkin-Rosen and M Marks, Privacy risks and harms for children and other vulnerable groups in the online environment, report to the OAIC, Monash University and elevenM Consulting, 2020, pp 154-155.

[4] ACCC, Digital Platforms Inquiry – Final Report, ACCC, 2019, accessed 16 February 2023, pp 447-448.

[5] Attorney-General’s Department (AGD), Privacy Act Review – Discussion Paper, AGD, October 2021, accessed 16 February 2023, p 100.

[6] D Williams, A McIntosh and R Farthing, Profiling Children for Advertising: Facebook’s Monetisation of Young People’s Personal Data, Reset Australia, April 2021, accessed 16 February 2023.

[7] It should be noted that Facebook has since changed its advertising policies so that advertisers can only target ads to people under 18 based on their age and location (not gender or interest categories). See https://about.fb.com/news/2023/01/age-appropriate-ads-for-teens/

[8] D Williams, A McIntosh and R Farthing, Profiling Children for Advertising: Facebook’s Monetisation of Young People’s Personal Data, Reset Australia, April 2021, accessed 16 February 2023, p 6.

[9] D Zendle, R Meyer and H Over, Adolescents and Loot Boxes: Links with Problem Gambling and Motivations for Purchase, Royal Society Open Science, 6(6), 2019.