11 December 2020
20. Does notice help people to understand and manage their personal information?
21. What matters should be considered to balance providing adequate information to individuals and minimising any regulatory burden?
26. Is consent an effective way for people to manage their personal information?
5.1 Notice and consent provide foundational protections in privacy law across the world, including in the Privacy Act. Their purpose is to ensure that individuals have knowledge of, and choice and control over, how information about them is handled by organisations. Transparency obligations, through privacy policies (APP 1.3) and collection notices (APP 5), and obligations to obtain consent when collecting sensitive information or handling personal information beyond the primary purpose of collection (APPs 3.3 and 6.1), are aimed at privacy self-management.
5.2 Privacy self-management empowers individuals to make choices and exercise control over their personal information. This can be a way of addressing power imbalances and information asymmetries between individuals and APP entities. This is particularly the case where a person is able to make choices within the service on offer, or to choose between services and decide whether to engage with a business, based on its information handling practices. Where alternative choices, products or services exist, privacy self-management mechanisms can influence the market to increase privacy protection in accordance with consumer demand.
5.3 While the discrete transparency and notice requirements in APPs 1 and 5 underpin the exercise of individual choice and control, they also support the accountability of APP entities. Through transparency, an individual may decide to exercise control in how they deal with a service (such as adjusting privacy settings) or decide not to engage with the business. The transparency obligations also assist regulators to hold entities to account.
5.4 Privacy self-management relies on entities making information about their personal information handling practices accessible and understandable. Privacy policies and notices need to communicate information handling practices clearly and simply, but also comprehensively and with enough specificity to be meaningful. However, the complexity of today’s information ecosystem, where unprecedented amounts of personal information are collected and shared for a range of different purposes, makes it challenging to give individuals clear information about how their personal information will be handled.
5.5 APP 1 privacy policies and APP 5 notices were not intended to be consent mechanisms that amount to contractual terms and conditions for consumers. There has, however, been a shift towards bundling privacy policies and notices into one document, sometimes called ‘terms and conditions’, and purporting to use them to seek ‘agreement’ to broad data handling practices. This has likely been driven by global, USA-based corporations operating in Australia, which have imported and spread American norms where privacy is a matter of contractual negotiation.
5.6 Rather, the objective of APP 1 is to ensure APP entities manage personal information in an open and transparent way. An APP 1 privacy policy provides higher-level information to the world at large about how an organisation generally handles personal information, how individuals can access their personal information, and how to make a complaint. APP 5 notices play a complementary role as a transparency measure and in promoting individuals’ participation in decision making about their personal information, by explaining in clear terms how their personal information is collected, used and disclosed in the particular circumstances. In contrast to the more general privacy policy under APP 1, an APP 5 notice is designed to provide specific information relevant to a particular collection of personal information.[93]
5.7 Consent is also an important part of privacy self-management, supporting transparency, choice and control for individuals. As observed in the Issues Paper, there are specific circumstances in the APPs where an APP entity must seek consent to collect, use or disclose information. In particular, the Privacy Act seeks to establish consent as a basis for collection, use or disclosure in circumstances that are higher risk, for example, where an APP entity collects ‘sensitive information’ or uses or discloses personal information for a purpose that is different to the primary purpose for which it was collected.
5.8 However, APP entities are currently permitted to collect, use or disclose personal information without the consent of individuals in a range of circumstances. Collection of personal information is permitted where it is reasonably necessary for, or, for agencies, directly related to, the entity’s functions or activities. Use or disclosure is permitted without consent where, for example, it is for the primary purpose for which the information was collected, or for a purpose related to the primary purpose where the individual would reasonably expect the entity to use or disclose their information in this way.
5.9 This recognition that consent is not necessary or appropriate in all circumstances reflects the fact that many instances of personal information handling in the economy are reasonably expected by individuals. Requiring consent in these expected circumstances risks turning the mechanism into a tick-box exercise, which would detract from the value of consent in the higher-risk situations where it is actually valuable.
5.10 The OAIC supports the need to strengthen notice and consent requirements in the Privacy Act. However, this should occur by introducing measures that address the limitations of notice and consent and ensure they are likely to be understood and valid, rather than by expanding the use of these privacy self-management tools. These limitations, and recommendations to address them, are discussed below.
5.11 The complexities of data practices are now such that reforms to notice and consent should be complemented by the introduction of an overarching fair and reasonable requirement and additional organisational accountability obligations that will redress the imbalance in knowledge and power between individuals and organisations. Discussion and recommendations about these measures are set out in Parts 6 and 7 below.
Recommendation 31 Strengthen notice and consent requirements in the Privacy Act to address the limitations in these mechanisms, but preserve the use of consent for high privacy risk situations, rather than routine personal information handling.
Limitations of notice and consent
5.12 Notice and consent only achieve their goal of giving individuals choice and control if they are used effectively and in appropriate circumstances. In the OAIC’s view, the overuse of these mechanisms places an unrealistic burden on individuals to understand the risks of complicated information handling practices. This will not address the privacy risks and harms facing individuals in the digital age.
5.13 There are several important practical limitations on the use of notice within the privacy framework. A fundamental dilemma of notice is that it usually comes down to a choice between making a notice simple and easy to understand and fully informing an individual of the consequences of handing over their data, which can be quite complex if explained in meaningful detail.
5.14 APP 5 notices are increasingly being embedded within long and complex terms of service, while other privacy information is ‘incorporated’ by reference to a general privacy policy. Individuals may want to know how their personal information will be handled, but including these descriptions in long, complex notices, often drafted with legal obligations in mind, fails to deliver on this objective.
The OAIC’s 2020 ACAPS survey found that:
- 69% of individuals do not read privacy policies attached to any internet site.
- The key reasons Australians do not read privacy policies attached to internet sites are their length (77%) and their complexity (52%).
- Even among those who normally read the privacy policy attached to a site, 41% sometimes don’t because it is too long and 26% sometimes don’t because it is too hard to read.
- When Australians do read privacy policies, comprehension difficulties are widespread. Fewer than 2 in 5 Australians (37%) are confident they have understood them when they read them, and 53% are not confident. The remaining 10% never read privacy policies.[94]
5.15 Emerging, innovative technologies, such as artificial intelligence tools, may use personal information as the basis for a decision that could have significant effects for the individual, but do so in a way that is often invisible or difficult to comprehend, and may be challenging for APP entities to clearly explain.
5.16 The practical consequence of these issues is that descriptions of more impactful or unusual privacy-affecting acts or practices are often ‘lost in the noise’ of descriptions of more generic or expected information handling activities.
5.17 Expanding transparency requirements could have the effect of further increasing the length of notices, which may make them more difficult to interpret and less useful in informing individuals, particularly about unusual or unexpected information handling practices. It is also important to note that the notice and consent model does not scale: while transparency reforms may make it easier for individuals to read and understand a few privacy policies or notices, it is unreasonable to expect individuals to engage meaningfully with notices from the large (and likely increasing) number of APP entities seeking to handle their personal information.
5.18 Consent is also a privacy self-management tool that has its limitations, particularly in light of the various challenges and complexities created by digital technologies. These limitations constrain the usefulness of consent as a privacy protection:
- Consent is only a meaningful and effective privacy self-management tool where the individual actually has a choice and can exercise control over their personal information. In many cases, consumers may feel resigned to consenting to the use of their information to access online services, as they do not consider there is any alternative.
- The challenge faced by APP entities in seeking consent will vary depending on how necessary the individual considers the relevant product or service. Anecdotal evidence suggests that individuals already feel dependent on the services offered by global social media platforms, search engines and e-commerce sites, which typically offer all-or-nothing terms and conditions. As these products or services become more entrenched in individuals’ lives, declining to engage with a product or service because of privacy concerns may not be a realistic option without significantly affecting an individual’s ability to engage online.
- Consent must be freely given, specific, unambiguous, and informed, which can be particularly difficult to achieve in the digital environment where data flows and data practices are increasingly complex and difficult to understand. It is becoming increasingly apparent that individuals are not always well placed to assess the risks and benefits of allowing their personal information to be shared. For example, at the moment of sharing, individuals cannot always know what other personal information can be derived about them, and what other information it may be combined with in the future to develop additional insights about them. This is supported by studies that have suggested that there are cognitive limitations that impact the ability of individuals to accurately assess risks when deciding whether to consent to privacy terms. For example, individuals have been shown to overvalue immediate benefits and costs (for example, the benefits of immediate access to a desired online service), while struggling to accurately assess more delayed benefits and costs (for example, privacy risks).[95]
- The notice and consent model is predicated on consumers making individual privacy decisions about their own personal information. However, recent privacy issues in relation to COVID-19, and attempts to manipulate political processes, illustrate that privacy decisions by individuals increasingly affect the community or the wider Australian public.[96]
- Research suggests that some APP entities operating online use so-called ‘dark patterns’ designed to nudge individuals into consenting to more collections and broader uses of personal information.[97]
5.19 As noted, consent is only required under the Privacy Act for higher-risk information handling activities. This is why there is a high threshold for valid consent. If consent became the primary basis for personal information handling, this high threshold would place an unnecessary compliance burden on entities for much of their information handling across the online and offline environment. For example, APP entities would be required to seek informed, voluntary, current and specific consent for standard business activities, such as providing records that contain personal information to an accountant or reviewing records for auditing purposes. It would also require individuals to ‘consent’ to a myriad of information handling practices that they do not currently need to consent to and which they would reasonably expect.
5.20 As noted above, the OAIC is supportive of reforms to the notice and consent framework in the Privacy Act, particularly to prevent overly broad, unfair or unreasonable information handling practices. This submission recommends several reforms to further strengthen and clarify notice and consent requirements.
5.21 However, the OAIC does not consider that the privacy risks and harms facing individuals in the digital age will be addressed by expanding APP entities’ notice obligations, or the circumstances where consent is required.
5.22 The burden of understanding and consenting to complicated practices should not fall on individuals alone; it must be supplemented by enhanced obligations on APP entities that promote fair and reasonable personal information handling and organisational accountability. The OAIC sees one of the key goals of the law reform process as being to ensure that the APPs provide a framework for the handling of personal information that is fair and reasonable, with consent only required in the limited circumstances where it will be of most benefit to individuals. Recommendations 37, 38, 39 and 40 in this submission are aimed at addressing this key issue.
Recommendations to strengthen notice requirements
22. What sort of requirements should be put in place to ensure that notification is accessible; can be easily understood; and informs an individual of all relevant uses and disclosures?
24. What measures could be used to ensure individuals receive adequate notice without being subject to information overload?
25. Would a standardised framework of notice, such as standard words or icons, be effective in assisting consumers to understand how entities are using their personal information?
Countering information overload
5.23 The OAIC considers that an appropriate balance must be struck between strengthened notice requirements and the practical consequences of increased provision of notices to consumers, which could include increased notification fatigue.
5.24 Developments in technology present the opportunity for more dynamic, multi-layered and user-centric privacy policies and notices in the online environment. The OAIC supports innovative approaches to privacy notices, for example, ‘just-in-time’ notices,[98] video notices, privacy dashboards and multi-layered privacy policies[99] to assist with readability and navigability. The OAIC considers that the proposed Online Platforms code provides an appropriate instrument to include such measures.
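By way of illustration only, the following is a minimal sketch (in TypeScript) of how a ‘just-in-time’ notice could be surfaced immediately before a specific collection of personal information occurs. It is not drawn from the Privacy Act or OAIC guidance, and all type, field and function names are hypothetical.

```typescript
// Illustrative sketch only: a 'just-in-time' notice shown at the moment a
// particular collection is about to occur, before the collection proceeds.

interface CollectionNotice {
  purpose: string;        // the specific purpose of this collection
  dataItems: string[];    // the personal information items to be collected
  recipients: string[];   // who the information may be disclosed to
  fullPolicyUrl: string;  // link to the layered, more detailed privacy policy
}

async function collectWithJustInTimeNotice<T>(
  notice: CollectionNotice,
  showNotice: (n: CollectionNotice) => Promise<void>,
  collect: () => Promise<T>,
): Promise<T> {
  // Surface a short, context-specific notice at, or before, the point of collection.
  await showNotice(notice);
  return collect();
}
```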
5.25 The OAIC considers that the following measures would assist to address the limitations of notice outlined above and strengthen the utility of notice under the Privacy Act. These measures could be introduced in the Privacy Act, in industry-specific codes, or in legally-binding rules, supported by Commissioner-issued guidelines.[100]
Language and accessibility
5.26 The OAIC recommends that requirements should be introduced for APP 5 notices to be concise, transparent, intelligible and written in clear and plain language.[101]
5.27 The OAIC also recommends that the practical application of the enhanced obligations in APP 5 could be supported through the use of codes, legally-binding rules or Commissioner-issued guidelines addressing the following requirements:
- Formatting notices in a way that will draw the consumer’s attention to them at, or before, the point of collection
- Ensuring readability across multiple digital devices, including smaller screens
- Requiring notices to be written at a level that can be readily understood by the minimum age of the reasonably likely audience of affected individuals[102]
- Ensuring notices are reasonably accessible, particularly for people with disability.[103]
5.28 These requirements would ensure that APP entities create succinct and transparent notices that narrow the focus of what must be addressed, and ensure the notice is manageable for individuals.
Recommendation 32 Introduce requirements that APP 5 notices should be concise, transparent, intelligible and written in clear and plain language.
Standardised icons or lexicon
5.29 The OAIC supports measures to create a common language to assist individuals to make informed decisions about their personal information, for example, through the use of standardised icons or phrases, as recommended in the ACCC’s DPI report. This will allow individuals to readily identify the information handling practices of most relevance to them, and to compare products and services in order to make consumer choices based on privacy credentials. It may also allow the development of a ‘traffic light’ system to compare privacy settings across products and services.
5.30 The use of standardised language and icons will also streamline compliance by all entities when developing privacy policies, notices and consent mechanisms. This will also support compliance by small business if the exemption is removed from the Privacy Act (see OAIC Recommendation 27).
5.31 The OAIC considers that, initially, sector-specific standard icons or lexicons, developed in collaboration between the OAIC, industry and consumer groups, will be most effective. This process was used in the CDR regime, where Data61 developed the consumer experience standards and the mandatory data language standards in collaboration with the OAIC, ACCC and industry. These standards were subject to extensive user testing.[104]
5.32 Recommendations 14, 15 and 16 will provide the Commissioner with greater regulatory options to operationalise such measures, for example through a code, legally-binding rules and Commissioner-issued guidelines. This could facilitate an industry-led, collaborative process to develop standard icons or language that are flexible and tailored to the specific needs of the sector. These standardised icons and lexicons can be refined and iterated based on consumer experience and as the needs of the sector evolve. This would complement similar measures that will be included in the upcoming code aimed at digital platforms.
5.33 However, in order to assist individuals to understand the specific information handling practices of the entity they are dealing with, standardised icons and phrases may need to be supported by additional information where required. This is primarily because it is important that APP 5 notices remain context-specific, with a clear goal of explaining the particular purpose for which a specific entity proposes to collect an individual’s personal information.
Recommendation 33 The OAIC supports the development of standardised icons or lexicons through an industry-led process to assist individuals to make informed decisions about their personal information.
23. Where an entity collects an individual’s personal information and is unable to notify the individual of the collection, should additional requirements or limitations be placed on the use or disclosure of that information?
5.34 APP 5 requires an APP entity to take reasonable steps to notify an individual about the collection of their personal information, regardless of whether the APP entity has collected the personal information directly from the individual or from a third party.
5.35 The OAIC’s APP guidelines outline a limited number of scenarios in which not providing notice under APP 5 may be reasonable, such as where an individual is aware that the personal information is being collected, the purpose of the collection and other APP 5 matters relating to collection; when notification may jeopardise the purpose of collection or the integrity of the personal information; when notification may pose a serious threat to life or safety; or if notification would be inconsistent with other legal obligations. It is the responsibility of the collecting APP entity to be able to justify not taking any steps.
5.36 The OAIC considers that Recommendation 37 to introduce a fairness and reasonableness requirement in relation to collections, uses or disclosures of personal information will serve to address the privacy risks that may arise if an APP entity does not notify an individual about the collection of their personal information under APP 5. Additionally, the OAIC considers that strengthening the obligations on APP entities collecting from third parties (as outlined in paragraphs 3.26-3.33) will further serve to reduce privacy risks in these circumstances.
Recommendations to enhance the use of consent
27. What approaches should be considered to ensure that consent to the collection, use and disclosure of information is freely given and informed?
28. Should individuals be required to separately consent to each purpose for which an entity collects, uses and discloses information? What would be the benefits or disadvantages of requiring individual consents for each primary purpose?
29. Are the existing protections effective to stop the unnecessary collection of personal information?
a. If an individual refuses to consent to their personal information being collected, used or disclosed for a purpose that is not necessary for providing the relevant product or service, should that be grounds to deny them access to that product or service?
30. What requirements should be considered to manage ‘consent fatigue’ of individuals?
32. Should entities collecting, using and disclosing personal information be required to implement pro-privacy defaults for certain uses and disclosures of personal information?
33. Should specific requirements be introduced in relation to how entities seek consent from children?
5.37 The OAIC considers that it is important to preserve the use of consent for situations in which the impact on an individual’s privacy is greatest, and not require consent for uses of personal information for purposes that individuals would expect or consider reasonable.[105] Seeking consent for routine purposes may undermine the quality of consents obtained from consumers, and result in consent fatigue. It is also essential that consent be relied on only where an individual is actually being given meaningful control over their personal information.
5.38 Rather than expanding the use of consent broadly, the OAIC recommends a number of measures that will ensure that consent is meaningful and relied on by entities in appropriate circumstances.
5.39 The OAIC supports the ACCC’s recommendation in the DPI report that consent should be defined to require a clear affirmative act that is freely given, specific, unambiguous and informed. This reform would align the definition of consent more closely with the GDPR.[106]
5.40 As noted in paragraphs 5.48-5.51 below, consent must also be current. This means that an individual’s consent will only last as long as is reasonable, having regard to the particular circumstances. The OAIC recommends elevating this requirement for consent from the APP guidelines into law.
Recommendation 34 Amend the definition of ‘consent’ to require a clear affirmative act that is freely given, specific, current, unambiguous and informed.
Specific and purpose-based consent
5.41 Consent will only be valid if an individual understands what they are consenting to and is given the opportunity to consent to specific personal information handling practices. As noted in the OAIC’s APP guidelines, consent given at a particular time in particular circumstances cannot be assumed to endure indefinitely. An APP entity should not seek a broader consent than is necessary for its purposes, for example, consent for undefined future uses, or consent to ‘all legitimate uses or disclosures’. Requesting broad or ‘bundled’ consents has the potential to undermine the voluntary nature of consent.[107]
5.42 An amended definition of consent, as per Recommendation 34 above, could be supported by Commissioner-issued guidance that sets out expectations for ensuring specific and purpose-based consent,[108] including that (a brief illustrative sketch follows this list):
- Consent is not freely given when the provision of service is conditional on consent to personal information handling that is not necessary for the provision of the service, as per Article 7(4) of the GDPR.
- APP entities must clearly and narrowly define the purposes for which the personal information will be handled and consent is being sought.
- Consent must be specific and granular.[109]
- APP entities should consider the use of graduated consent and tiered consent, similar to the approach to consent currently being proposed under the CDR regime.[110]
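For illustration only, the minimal TypeScript sketch below shows one way purpose-specific (unbundled) consent could be recorded, with each purpose consented to separately through a clear affirmative act. The purpose names and data structure are hypothetical and are not drawn from the Privacy Act, the GDPR or the CDR Rules.

```typescript
// Illustrative sketch only: purpose-specific (unbundled) consent records.
// Purposes not necessary for the requested service remain optional, and no
// consent is assumed or pre-ticked.

type Purpose =
  | "provide_requested_service"   // necessary to deliver the service itself
  | "product_improvement"         // optional secondary purpose
  | "personalised_advertising";   // optional secondary purpose

interface ConsentRecord {
  purpose: Purpose;
  given: boolean;   // true only after an affirmative act by the individual
  givenAt?: Date;   // when consent was given
}

// A use or disclosure for an optional purpose should proceed only if a
// specific consent exists for that exact purpose; consents are never bundled.
function hasConsentFor(records: ConsentRecord[], purpose: Purpose): boolean {
  return records.some(r => r.purpose === purpose && r.given);
}
```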
Pro-privacy default settings
5.43 Default settings can nudge users towards privacy-intrusive options, as research shows most users do not review or change their default settings.[111] Design practices that exploit this tendency are known as ‘dark patterns’.[112]
5.44 Research suggests some entities may use design choices and language to manipulate users into choosing less privacy-friendly options, and ultimately discourage them from making an active choice. For example, by:
- Using salient colours, buttons or options, playing to the consumer’s tendency to choose the easier path
- Requiring significantly more clicks to adjust away from default settings
- Focusing on the positive aspects of one choice while glossing over potentially negative aspects, steering consumers towards the service provider’s preferred option
- Giving consumers the illusion of control, making them more susceptible to taking risks with sharing information.[113]
5.45 The OAIC considers that default settings that aim for data maximisation run counter to the policy intentions of the Privacy Act and increase the risk of harm to individuals. This is particularly the case where the information is being used to facilitate direct marketing through online advertising as part of an entity’s business model, and is not necessary to reasonably enable the provision of the particular product or service in a manner reasonably contemplated by the user. Such defaults are also counter to community expectations, as evidenced in the OAIC’s 2020 ACAPS results, which found that 85% of Australians considered that digital platforms should only collect information needed to provide their product and/or service.
5.46 Pro-privacy default settings require a higher level of user engagement before APP entities can collect and use personal information for a secondary purpose: to do so, the entity will need an explicit opt-in from the consumer.
5.47 While this will place some regulatory burden on entities such as digital platforms whose business models rely on data collection, it is an essential protection for individuals. It will also incentivise entities to design consumer-friendly, easy-to-use privacy controls, and place the responsibility on these entities to provide clear notices that explain why positively electing to change these default settings would be in an individual’s best interests.
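As a minimal illustration of the approach described in paragraphs 5.46-5.47, the TypeScript sketch below shows privacy-protective defaults with an explicit opt-in gate for secondary purposes. The setting names are hypothetical and are not drawn from the Privacy Act or any existing platform.

```typescript
// Illustrative sketch only: privacy-protective defaults with explicit opt-in
// for secondary purposes.

interface PrivacySettings {
  shareUsageDataForAnalytics: boolean;
  personalisedAdvertising: boolean;
  preciseLocationTracking: boolean;
}

// All optional settings start in their most privacy-protective state; only
// collection reasonably needed to provide the requested service occurs
// without an opt-in.
const defaultSettings: PrivacySettings = {
  shareUsageDataForAnalytics: false,
  personalisedAdvertising: false,
  preciseLocationTracking: false,
};

function canUseForSecondaryPurpose(
  settings: PrivacySettings,
  purpose: keyof PrivacySettings,
): boolean {
  // A secondary-purpose collection or use proceeds only after an explicit opt-in.
  return settings[purpose] === true;
}
```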
Recommendation 35 Amend the Privacy Act to require all settings to be privacy protective by default, except for collections of personal information that reasonably enable provision of the particular product or service.
Refreshing and withdrawing consent
38. Should entities be required to refresh an individual’s consent on a regular basis? If so, how would this best be achieved?
39. Should entities be required to expressly provide individuals with the option of withdrawing consent?
5.48 The OAIC’s APP guidelines indicate that consent must be current and specific. This includes enabling an individual to withdraw their consent at any time, which should be an easy and accessible process. Once an individual has withdrawn consent, an APP entity can no longer rely on that past consent for any future use or disclosure of the individual’s personal information. Individuals should be made aware of the potential implications of withdrawing consent, such as no longer being able to access a service.
5.49 The OAIC recommends that this guidance be elevated into law, including a requirement that an individual be notified of their right to withdraw consent, where consent has been required for the personal information handling. This could be modelled on current requirements in the CDR.[114] It would complement the OAIC’s recommendations to introduce a right to erasure and a right to object, as outlined in Part 3.
5.50 The OAIC’s APP guidelines also note that consent given at a particular time in particular circumstances cannot be assumed to endure indefinitely. It is good practice to inform the individual of the period for which the consent will be relied on in the absence of a material change of circumstances.
5.51 The OAIC is supportive of APP entities having processes in place to check whether the consent that an individual has provided remains current. This must be balanced with issues around information overload and consent fatigue, as discussed above.
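For illustration only, the short TypeScript sketch below shows one way an entity could check that a consent remains current and has not been withdrawn before relying on it. The field and function names are hypothetical and are not drawn from the APP guidelines or the CDR Rules.

```typescript
// Illustrative sketch only: checking that a consent is still current and has
// not been withdrawn before relying on it for a further use or disclosure.

interface StoredConsent {
  purpose: string;
  givenAt: Date;
  expiresAt?: Date;    // the period the individual was told the consent covers
  withdrawnAt?: Date;  // set when the individual withdraws their consent
}

function consentIsCurrent(consent: StoredConsent, now: Date = new Date()): boolean {
  // A withdrawn consent cannot be relied on for any future use or disclosure.
  if (consent.withdrawnAt !== undefined) return false;
  // After the stated period, consent should be refreshed rather than assumed.
  if (consent.expiresAt !== undefined && now > consent.expiresAt) return false;
  return true;
}
```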
Recommendation 36 Elevate OAIC guidance on withdrawing consent into the Privacy Act, including a requirement that APP entities must notify an individual of their right to withdraw consent, where consent has been required for the personal information handling.
Emerging technologies and privacy self-management
34. How can the personal information of individuals be protected where IoT devices collect personal information from multiple individuals?
5.52 IoT devices and services offer great benefits and opportunities to individuals and the Australian economy. However, as these devices become more widespread and interconnected, they are becoming increasingly capable of collecting more significant volumes of personal, and often sensitive, information. This can create significant security and privacy risks.[115]
5.53 IoT devices may appear in various contexts, ranging from consumer items such as smart speakers and smart appliances, to devices used as part of smart city initiatives, to industrial applications of the technology. IoT devices used in toys, or otherwise used by children, pose particular risks.
In 2016, the OAIC and other members of the Global Privacy Enforcement Network undertook a global sweep of IoT products and services, which identified several problems with their privacy practices. The sweep found that:
- 71% of devices and services considered failed to properly explain how information was stored
- 69% did not adequately explain how customers could delete their information off the device
- 38% failed to include easily identifiable contact details if customers had privacy concerns
- 91% did not advise customers to customise their privacy settings.[116]
5.54 Transparency obligations are particularly important for IoT devices and can present compliance challenges. Under the Privacy Act, APP entities are required to ensure that individuals have access to information about the types of personal information that will be collected, and the ways it will be used and disclosed. Where IoT devices do not have screens or other interfaces, APP entities will have to take other steps to ensure compliance with their transparency obligations.
5.55 IoT devices also pose challenges for seeking valid consent where it is required, particularly in ensuring that consent is voluntary and informed.[117] Devices that automatically collect personal information in public spaces may rely on individuals opting out of collection. Even where individuals are aware that these devices are in use, their non-interactive nature means that opting out may be challenging, and may leave an individual with no option but to move to a different area.[118] Obtaining informed consent will require APP entities to place notices prominently so that individuals are aware of how their personal information will be handled.[119]
5.56 The OAIC’s recommendations in the section above about notice and consent, and about ensuring that all settings are privacy protective by default, will assist in addressing the challenges that IoT devices pose to these privacy self-management tools.[120] The OAIC also recommends amending APP 1 to expressly require entities to adopt a ‘privacy by design’ and ‘privacy by default’ approach, which will require APP entities to consider privacy compliance while developing and designing IoT devices (see Recommendation 42).[121]
5.57 Given the challenges that IoT devices pose to privacy self-management requirements, ensuring that APP entities deploying this technology act fairly and reasonably, and comply with other appropriate organisational accountability requirements, is particularly important. This submission recommends several requirements that will be relevant to protecting individuals in the context of IoT devices:
- Introducing requirements for APP entities to collect, use and disclose personal information fairly and reasonably to ensure that APP entities providing IoT devices handle information in a manner that meets Australian community expectations (see Recommendation 37)
- Implementing full or partial prohibitions for certain acts or practices in relation to IoT, particularly in relation to children, as well as the surveillance of individuals through their personal devices (see Recommendation 40)[122]
- Introducing a right to erasure which, subject to exceptions, would allow individuals to request the deletion of their personal information, particularly where there is a transfer of ownership of an IoT device (see Recommendation 23)[123]
5.58 IoT devices may often collect technical data that can be used for purposes such as profiling individuals. The granularity of information collected by IoT devices may also allow increasingly accurate inferences to be drawn about individuals. The OAIC’s recommendations to clarify that the definition of personal information captures technical data and inferred information will address this issue (see Recommendations 4, 5 and 6).
5.59 Finally, information security obligations under APP 11 are particularly important in relation to IoT devices, given the volume of personal information that may be collected by this technology.
Footnotes
[93] OAIC (May 2017) ‘The definition of personal information’ [online document], OAIC, accessed 18 November 2020, [1.10].
[94] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 70.
[95] See the discussion of bounded rationality in Taylor M & Paterson J (in press) ‘Protecting privacy in India: The role of consent and fairness in data protection’, Indian Journal of Law and Technology, p. 18.
[96] See the discussion of these limitations of the notice and consent model in Susser D (2019) ‘Notice After Notice-and-Consent: Why Privacy Disclosures Are Valuable Even if Consent Frameworks Aren’t’, Journal of Information Policy, 9, pp. 37-62.
[97] Oyvind H. Kaldestad (2018) Report: Deceived by design, Forbruker Rådet website, accessed 26 November 2020
[98] Just-in-time notices can be used across digital devices; for example, when the consumer is using an application, and the entity managing the application is collecting information via the application’s settings, a pop-up window can alert the consumer with a summary of their data being collected. Particularly in relation to information handling that an individual would not reasonably collect, the OAIC supports point in time notifications during specific interactions with consumers.
[99] A notice can be presented in a layered format, which can link to other documents and may assist in reducing information overload for consumers.
[100] See OAIC Recommendation 16, which recommends including a new provision in the Privacy Act that would require entities to have regard to any guidelines issued by the Commissioner when carrying out their functions and activities under the Privacy Act.
[101] In its submission to the Australian Government in response to the ACCC’s DPI, the OAIC argued for appropriate, strengthened notice requirements, while also recognising that excessive use of these would create ‘notification fatigue’ (see OAIC (2019) Digital Platforms Inquiry final report — submission to the Australian Government [online document], OAIC website, accessed 4 November 2020). The ACCC recommended that one way to counteract this would be to not require consent when personal information is being processed in accordance with a contract to which the consumer is a party.
[102] See discussion of transparency at Chapter 4 of UK ICO (2020) Age appropriate design: a code of practice for online services, ICO Website, accessed 25 November 2020.
[103] These could be modelled on § 999.305 of the California Consumer Privacy Act Regulations which came into force on 14 August 2020.
[104] Data61 was required to make data standards in s 56FA of the Competition and Consumer Act 2010 (Cth) and Rule 8.11 of the Competition and Consumer (Consumer Data Right) Rules 2020 (Cth).
[105] This aligns with the position of the Canadian Government, as set out in Innovation, Science and Economic Development Canada (2019) Strengthening Privacy for the Digital Age, Government of Canada website, accessed 20 November 2020.
[106] Article 4(11) of the General Data Protection Regulation defines ‘consent’ of the data subject as any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.
[107] Bundled consent refers to the practice of an APP entity ‘bundling’ together multiple requests for an individual’s consent to a wide range of collections, uses and disclosures of personal information, without giving the individual the opportunity to choose which collections, uses and disclosures they agree to and which they do not.
[108] As per the OAIC’s Recommendation 16, entities would be required to take account of Commissioner-issued guidance when carrying out their functions and activities under the Privacy Act.
[109] For example, see UK ICO, Guide to the General Data Protection Regulation (GDPR): Lawful Basis of Processing: Consent, ICO website, accessed 20 November 2020.
[110] Graduated consent: where a consumer can give consent to different uses of their data throughout the relationship with the entity. Tiered consent: where the consumer agrees to disclose increasing amounts of personal information in exchange for different products and levels of services, which can occur ‘just-in-time’.
Under the Consumer Data Right (CDR) system, an entity must ask a consumer to consent to specific uses of their CDR data: Rule 4.11(1)(a)(ii) of the CDR Rules. The ACCC is proposing to amend the CDR Rules so that a consumer may ‘amend’ this consent at a later point in time, for example where they wish to consent to additional/fewer/different uses of their data and/or consent to the collection of additional/fewer/different types of data (see, for example, subdivision 4.3.2A of the exposure draft for 3rd amendment of the CDR Rules, available on the consultation page on proposed changes to the CDR Rules, which closed on 29 October 2020)
[111] It was found that Facebook and Google default settings pre-selected the use of personal data for ads based on third-party data/personalisation, and users were required to actively disable these settings. See Oyvind H. Kaldestad (2018) Report: Deceived by design, Forbruker Rådet website, accessed 26 November 2020.
[112] Forbruker Rådet (2018) ‘Every Step You Take: How deceptive design lets Google track users 24/7’, Forbruker Rådet website, accessed 26 November 2020, p 12 [3.8].
[113] Oyvind H. Kaldestad (2018) Report: Deceived by design, Forbruker Rådet website, accessed 26 November 2020.
[114] CDR Rules, Rule 4.11(3)(g)
[115] See the discussion of the privacy risks associated with IoT devices in Office of the Victorian Information Commissioner (2020), The Internet of Things and Privacy, OVIC, Victorian Government.
[116] For more information, see the Office of the Australian Information Commissioner (23 September 2016) Privacy Commissioners reveal the hidden risks of the Internet of Things [media release], Australian Government, accessed 21 November 2020.
[117] For a detailed consideration of the challenges in seeking valid consent in relation to IoT devices, see Office of the Victorian Information Commissioner (2020), The Internet of Things and Privacy, OVIC, Victorian Government, pp. 6-8.
[118] Office of the Victorian Information Commissioner (2020), The Internet of Things and Privacy, OVIC, Victorian Government, p. 10.
[119] These issues will arise where IoT devices collect personal information passively, including for example, vehicles that are connected to the internet. The European Data Protection Board has provided guidance on privacy issues in the context of connected vehicles and other mobility related applications.
[120] For example:
- Recommendation 32: Requiring notices to be concise, transparent, intelligible and written in clear and plain language.
- Recommendation 34: Strengthen requirements for valid consent to ensure that it is informed, freely given, voluntary, current and specific, and that individuals have capacity to give consent.
- Recommendation 35: Settings for IoT devices should be set at the most privacy protective by default.
[121] See OAIC (March 2020), Voluntary Code of Practice Securing the Internet of Things for Consumers — submission to the Department of Home Affairs, OAIC website, accessed on 24 November 2020, [15]-[17].
[122] See also OAIC (March 2020), Voluntary Code of Practice Securing the Internet of Things for Consumers — submission to the Department of Home Affairs, [49]-[59].
[123] See also OAIC (March 2020), Voluntary Code of Practice Securing the Internet of Things for Consumers — submission to the Department of Home Affairs, [40]-[42].