Introduction
1 The Office of the Australian Information Commissioner (OAIC) welcomes the opportunity to provide comments to the Department of Home Affairs regarding the Draft Code of Practice – Securing the Internet of Things (the draft code).
2 The draft code forms part of Australia’s 2020 Cyber Security Strategy.[1] The OAIC made a submission regarding this strategy in November 2019.[2]
3 Internet of Things (IoT) devices and services offer important benefits and opportunities to individuals and the Australian economy, but they can also carry significant security and privacy risks. As IoT becomes more widespread and integrated into our daily lives, it is imperative that these products are designed with strong security and privacy features to protect against threats, including cyber threats, that may put individuals’ personal information at risk.
4 IoT devices are becoming increasingly interconnected and are capable of collecting significant volumes of personal information, including sensitive information such as health and biometric information. IoT technology is built into services such as smart home assistants, wearable health trackers, connected home automation and alarm systems, home appliances and children’s toys, with many of these products now ‘always on’ and connected to the internet by default. It has been estimated that the average Australian household will have 37 devices by 2023, half of which will be IoT devices.[3] Because these devices collect, store and share personal information so pervasively, consumers are not always fully aware of the privacy risks.
5 The draft code is an important suite of measures to create a nationally consistent framework that strengthens cyber security safeguards for IoT. It encourages best practice security standards and promotes more transparent business practices and organisational accountability across all levels of the IoT supply chain (device manufacturers, IoT service providers, mobile application developers and retailers) to ensure adequate security for IoT devices. We support initiatives in this sector to address security risks to individuals and to meet community expectations in relation to this technology.
6 The protection of personal information is critical to community trust and confidence in IoT. Cyber security and privacy are inextricably linked, and this is reflected in both Principles 5 and 11 of the draft code and Australian Privacy Principles (APPs) 1 and 11. Manufacturers of IoT devices and IoT service providers should demonstrate that they are designing security, and with it the protection of personal information, directly into their devices and services from the outset.
7 In this submission, the OAIC recognises the strong connection between cyber security and privacy protection, and we propose privacy-enhancing amendments to build on and support the work that has been done to develop the draft code, including in relation to:
- Privacy by design
- Privacy protective default settings
- Notice and consent
- Transfer and deletion of data
- Data minimisation
- ‘No-go zones’, and
- Children.
Relationship between IoT security and privacy
8 IoT devices and services inherently involve the collection, use and disclosure of personal information. The relationship between security and privacy is codified in the Privacy Act 1988 (Privacy Act), particularly through APPs 1 and 11 and the Notifiable Data Breaches (NDB) scheme.
9 APP 11 requires all entities covered by the Privacy Act to take reasonable steps to:
- protect personal information that they hold from misuse, interference and loss, and from unauthorised access, modification or disclosure, and
- destroy or de-identify the personal information they hold once it is no longer needed for any purpose for which it may be used or disclosed under the APPs.
10 Robust information security practices and procedures, including cyber security practices, are recognised as a necessary privacy protection and a key consideration for entities taking ‘reasonable steps’ under APP 1 and APP 11.[4] In complying with APP 11, businesses are expected to actively monitor their cyber risk environment for emerging threats and take reasonable steps to protect personal information by mitigating those risks. This is a dynamic responsibility that scales with the volume and sensitivity of the personal information an entity holds: the greater the volume, or the more sensitive the information, the higher the expectations placed on the entity to protect it.
11 Under APP 1, entities must also take steps beyond technical security measures to protect and ensure the integrity of personal information throughout the information lifecycle, including implementing strategies in relation to governance, internal practices, processes and systems, and dealing with third party providers.[5]
12 The NDB scheme requires entities covered by the Privacy Act to carry out an assessment whenever they suspect that there may have been a loss of, unauthorised access to, or unauthorised disclosure of personal information that they hold. If serious harm is likely to result to an individual, entities must notify the OAIC and also affected individuals so they can take actions to address the possible consequences. Since the inception of this mandatory reporting regime, approximately 60% of notified data breaches of personal information have been attributed to cyber intrusion.[6]
13 The NDB scheme incentivises entities to improve security standards in relation to the protection of personal information. The scheme has led to a growing awareness of privacy rights and the risks inherent in the collection of personal information by entities, as well as proactive measures that every person can take to protect themselves.
14 We note that Principle 5 in the draft code (‘Ensure that personal data is protected’) and Principle 11 (‘Make it easy for consumers to delete personal data’) are the primary principles that deal specifically with privacy. Other principles in the draft code deal with matters that are directly relevant to the requirement to take reasonable steps to protect personal information under APP 11. The OAIC welcomes the inclusion of these principles, as they align with the obligations that entities already have under the APPs to secure personal information.
Privacy by Design
15 Privacy should be built into the design specification and architecture of IoT devices and services. The draft code sets out a security by design approach to securing IoT. The introduction to the draft code notes that IoT devices and services are often developed with functionality as the priority, with security absent or treated as an afterthought.[7] In our view, it is also important that privacy is factored into the design and development of IoT. Adopting a privacy by design approach to IoT embeds good privacy practices into the design specification of these devices and services and ensures that privacy impacts are considered at the start of the lifecycle of a product or service.[8] An entity can conduct a privacy impact assessment (PIA) to understand the privacy impacts of an IoT product or service.[9]
16 Implementing privacy by design complements the security by design approach of the draft code and promotes the widest possible protection for consumers of IoT, addressing the interconnected security and privacy risks. Embedding privacy into the design of IoT from the start is fundamental to enabling individuals to self-manage their privacy and to making entities more accountable for their use of personal information. This goes beyond security of personal information, to cover the handling of personal information throughout its lifecycle.
17 We recommend that Principle 5 in the draft code be amended to make reference to privacy by design.
Recommendation 1 Principle 5 in the draft code is amended to make reference to privacy by design.
Privacy protective default settings
18 Settings for IoT devices and services should be set at the most privacy protective setting by default, with the option for individuals to relax those settings once they have been given adequate notice and guidance.
19 IoT amplifies concerns around secondary uses of personal information, particularly where devices are always on or have surveillance and tracking capabilities.
20 IoT devices and services have the potential to impact the lives of individuals and should operate within community expectations and have sufficient community support (‘social licence’). A social licence for IoT should be built on transparency around the collection, use and disclosure of personal information.
21 Configuring privacy settings to be privacy protective by default is an essential step in securing personal information and protecting individuals from harm. Privacy settings should align with the APPs and control how an entity collects, uses and discloses personal information. They should also provide corresponding choices for individuals to control the collection, use and disclosure of their personal information by the entity.
22 Under APP 3, an entity must not collect personal information unless it is reasonably necessary for (or, for an agency, directly related to) one or more of its functions or activities. In addition, unless an exception applies, express consent must be sought to collect sensitive information.
23 In relation to use or disclosure of personal information, APP 6 requires that if personal information is collected for a particular purpose, unless an exception applies, it must not be used for another purpose (secondary purpose) unless:
- consent is obtained, or
- the individual would reasonably expect the entity to use or disclose the personal information for that secondary purpose and the secondary purpose is related to the primary purpose (and for sensitive information directly related to the primary purpose).
24 Individuals may have limited understanding of, and insight into, the ways that their personal information may be used and disclosed (including for secondary purposes) by a particular IoT product or service. IoT devices and services may be technical or complex in the way they operate, making it difficult for meaningful consent to be provided. For the purposes of APP 6, individuals may not reasonably expect their personal information to be used or disclosed for a secondary purpose. Privacy notices should therefore clearly specify the ways in which personal and sensitive information may be used for secondary purposes, to assist individuals to understand these matters.
25 Having settings privacy protective by default will reduce the collection, use and storage of personal information. Reducing the scale and scope of personal information that an entity holds should lessen the risks of loss, unauthorised access and unauthorised disclosure, which in turn may reduce the impact on individuals in the event of a data breach. Where an individual is required to opt in to secondary use and disclosure of their personal information (that is not within their reasonable expectation or related to the primary purpose), this should be supported with appropriate notice and consents. We consider notice and consent later in this submission.
26 High privacy protections and settings assist individuals to reduce their exposure to malicious attacks perpetrated through IoT as well as unwanted secondary use of their personal information. These measures also give individuals the tools to self-manage their privacy and respond to incidents.
27 Privacy settings should be privacy protective by default in IoT, for example (a simple sketch follows this list):
- Settings for the use and disclosure of personal information for secondary purposes that are not related to the primary purpose of collection (or, for sensitive information, not directly related to it) are set to ‘off’ by default and require individuals to opt in.
- Tracking and targeting technologies, such as geolocation and profiling options, are set to ‘off’ by default.
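To make the intent concrete, the following is a minimal sketch (in Python) of how a hypothetical IoT device might implement privacy protective defaults. The setting names and the opt-in mechanism are illustrative assumptions, not requirements drawn from the draft code or the APPs.

```python
# A sketch only: hypothetical privacy settings for an IoT device where every
# optional data practice starts 'off' and is enabled only by a recorded,
# explicit opt-in. Names are illustrative, not drawn from the draft code.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class PrivacySettings:
    # Privacy protective defaults: all optional practices are disabled.
    secondary_use: bool = False          # use/disclosure beyond the primary purpose
    geolocation_tracking: bool = False   # location tracking off by default
    profiling: bool = False              # profiling/targeting off by default
    opt_in_log: list = field(default_factory=list)

    def opt_in(self, setting: str) -> None:
        """Enable a setting only on explicit user action, recording when."""
        if setting == "opt_in_log" or not isinstance(getattr(self, setting, None), bool):
            raise ValueError(f"Unknown privacy setting: {setting}")
        setattr(self, setting, True)
        self.opt_in_log.append((setting, datetime.now(timezone.utc)))


# Out of the box, nothing optional is collected or shared.
settings = PrivacySettings()
assert not (settings.secondary_use or settings.geolocation_tracking or settings.profiling)

# A practice is switched on only after informed, explicit consent.
settings.opt_in("geolocation_tracking")
```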
28 We recommend that Principle 5 in the draft code be amended to require that privacy settings are privacy protective by default.
Recommendation 2 Principle 5 in the draft code is amended to require privacy settings are set to privacy protective by default.
Notice and consent
29 The Australian Government has confirmed its commitment to strengthen notice and consent requirements in the Privacy Act in its response to the Digital Platforms Inquiry.[10]
30 The issue of effective notice and consent is being considered in other jurisdictions. For example, the Information Commissioner’s Office (ICO) of the United Kingdom (UK) has released ‘Age appropriate design: a code of practice for online services’ (UK Code of Practice),[11] which applies to connected toys and devices that collect personal data and transmit it via a network. The UK Code of Practice includes specific guidance on notice and consent for connected toys and devices, including (among other things) that providers of these products and services:
- be clear about who is processing the personal data and what their responsibilities are
- provide clear information about their use of personal data at point of purchase and on set-up, and
- find ways to communicate ‘just in time’ information (a simple sketch follows this list).[12]
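By way of illustration only, a ‘just in time’ notice could work along the following lines; the feature name, notice text and ‘ExampleToyCo’ are hypothetical. The point is that a short, plain-language explanation appears at the moment a new use of personal data is about to be activated, and the default (off) stands unless the user explicitly proceeds.

```python
# A sketch of a 'just in time' notice for a hypothetical connected toy.
# The notice text and the confirm() handler are assumptions for illustration.

JUST_IN_TIME_NOTICES = {
    "voice_recording": (
        "Turning this on lets the toy record your voice and send it to "
        "ExampleToyCo so it can answer your questions. Ask a parent or "
        "guardian before continuing."
    ),
}


def activate_feature(feature: str, confirm) -> bool:
    """Show the notice for `feature` and enable it only if the user confirms."""
    notice = JUST_IN_TIME_NOTICES.get(feature)
    if notice is None:
        raise ValueError(f"No just-in-time notice defined for: {feature}")
    return bool(confirm(notice))  # False: the privacy protective default stands


# Example: a console prompt stands in for the app's confirmation dialog.
enabled = activate_feature(
    "voice_recording",
    lambda text: input(f"{text}\nProceed? [y/N] ") == "y",
)
```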
Notice
31 APPs 1 and 5 require entities providing IoT devices and services to be transparent about how personal information is being collected, used and disclosed.
32 Privacy notices should be provided at or before the time of collection and be concise, transparent, intelligible, written in clear and plain language and provided free of charge. These measures engender trust and provide important privacy protections that assist individuals to exercise choice and control over how their personal information is collected, used and disclosed. Increased transparency and accountability can also provide a competitive advantage for providers of IoT devices and services. Robust notices also demonstrate an entity’s accountability to individuals and alignment with community expectations.
33 We recommend that the Department of Home Affairs consider how effective notification can be achieved in relation to IoT, including through multi-layered notification or simple, standardised icons or phrases. The OAIC would welcome collaboration with the Department on these matters.
Consent
34 APPs 3 and 6 require consent to the collection, use and disclosure of personal information in specified circumstances. There are additional requirements regarding consent to the collection, use and disclosure of sensitive information, particularly in relation to secondary uses, and regarding children’s consent.
35 Consent to the collection, use and disclosure of personal information should be informed, voluntary, current and specific, and individuals must have the capacity to give it. Special considerations may apply in certain circumstances, for example:
- in relation to obtaining consent from children: where an entity is collecting personal information from an individual over the age of 15, the entity may assume that the individual has capacity unless there is something to suggest otherwise,[13] and
- the collection of sensitive information, including health and biometric information, generally requires express consent.[14]
36 Consent will only be meaningful where an individual is offered a real choice and the ability to control how their personal information is collected, used and disclosed. Privacy protective default settings, together with the ability for individuals to control those settings, enable this.
37 Consent becomes paramount when dealing with vulnerable individuals such as children, with the collection, use and disclosure of sensitive information, or with invasive data practices such as surveillance.
38 Principle 5 in the draft code deals with notice and consent. We recommend that the language in this principle be enhanced to include additional requirements to protect vulnerable individuals such as children and to address sensitive information, for example (a sketch of a consent record follows this list):
- where personal information of children is being collected, the notice is written at a level that can be readily understood by users of the minimum permitted age for the device or service[15]
- consent to the collection of a child’s personal information must be obtained from the child’s guardian,[16] and
- express and up-to-date consent should always apply to the collection of sensitive information, including health information and biometric data.
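As an illustration of these requirements, the following sketch shows a consent record a hypothetical IoT service might keep. The fields and the one-year currency window are assumptions for illustration, not prescriptions from the Privacy Act or the draft code.

```python
# A sketch of a consent record: consent is specific to a purpose, time-stamped
# so its currency can be checked, express rather than implied for sensitive
# information, and given by a guardian where the user is a child.

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass(frozen=True)
class ConsentRecord:
    purpose: str                 # the specific purpose consented to
    express: bool                # True if explicitly given, not inferred
    given_at: datetime
    given_by_guardian: bool      # required where the user is a child

    def is_current(self, max_age_days: int = 365) -> bool:
        """Treat consent as stale after a fixed period (an assumption)."""
        return datetime.now(timezone.utc) - self.given_at < timedelta(days=max_age_days)


def may_collect_sensitive(record: ConsentRecord, user_is_child: bool) -> bool:
    # Sensitive information requires express, current consent; for a child,
    # it must have been given by a guardian.
    if not (record.express and record.is_current()):
        return False
    return record.given_by_guardian or not user_is_child
```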
Recommendation 3 Draft Principle 5 is amended to include that notices directed at children are written at a level that can be readily understood by users of the minimum permitted age for the device or service, that children’s consent is obtained from the child’s guardian, and that express and up-to-date consent applies to the collection of sensitive information including health information and biometric data.
Transfer and deletion of data
40 Draft Principle 11 supports the deletion of personal information where there is a transfer of ownership of an IoT device or service, when the consumer wishes to delete their data, or when they wish to dispose of the device. The OAIC supports this principle, which reflects the right to erasure in the GDPR.[17] In our submission to the Government in relation to the Digital Platforms Inquiry Final Report (DPI Final Report), we stated our support for the recommendations to enable individuals to request erasure of their personal information, subject to exceptions, and we recommended that individuals be notified of their ability to request erasure of their personal information.[18]
41 We suggest that the Department consider extending this principle to include data portability, allowing individuals to easily and securely transfer their data to another device or service upon request, at the end of ownership, or when they wish to dispose of the device. This would require devices and services to be configured to permit the transfer of personal information, as sketched below.
42 The data portability right is central to the Consumer Data Right, which allows consumers to access particular data in a usable form and to direct a business to securely transfer that data to an accredited data recipient.[19] Data portability is an important privacy enhancing measure as it provides individuals with control over which entities use, hold and disclose their personal information.
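A minimal sketch of how a device might support both Draft Principle 11 and the suggested portability extension is set out below; the interface is hypothetical. On a transfer of ownership, the individual’s data is first exported in a structured, machine-readable form and then erased from the device.

```python
# A sketch of a hypothetical device data store supporting easy deletion
# (Draft Principle 11) and the suggested extension to data portability.

import json
from pathlib import Path


class DeviceDataStore:
    """Illustrative only; a real device would also wipe backups and caches."""

    def __init__(self) -> None:
        self._records: dict[str, object] = {"owner": "alice", "readings": [72, 68, 75]}

    def export_portable(self, destination: Path) -> None:
        # Portability: write all held personal data in a structured,
        # machine-readable format the individual can take elsewhere.
        destination.write_text(json.dumps(self._records, indent=2))

    def erase_all(self) -> None:
        # Deletion: remove all personal data, e.g. on transfer of ownership.
        self._records.clear()


# On a change of ownership: export for the individual, then erase.
store = DeviceDataStore()
store.export_portable(Path("my_data_export.json"))
store.erase_all()
```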
Recommendation 4 Consideration be given to amending Draft Principle 11 to require devices and services to be configured to enable the easy and secure transfer of personal information to another device or service when there is a transfer of ownership or as directed by the individual.
Data minimisation
44 We note that generally, APP entities are required to collect the minimal amount of personal information that is reasonably necessary to carry out their functions or activities (APP 3) and take reasonable steps to destroy or de-identify personal information that the entity no longer needs for any purpose for which the information may be used or disclosed under the APPs (APP 11).
45 APP 11.2 provides that if an APP entity holds personal information about an individual and the entity no longer needs the information for any purpose for which the information may be used or disclosed under the APPs, then unless the information is contained in a Commonwealth record or the entity is required to keep the information under an Australian law or a court/tribunal order, the entity must take reasonable steps to destroy the information or ensure that it is de-identified.
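To illustrate how APP 11.2 might translate into practice, the following sketch applies a retention check to stored records: anything still needed for a permitted purpose is kept, and anything else is de-identified or destroyed. The record structure and the still_needed test are assumptions for illustration.

```python
# A sketch of an APP 11.2-style retention pass for a hypothetical IoT service.
# Records still needed for a permitted purpose are kept; the rest are
# de-identified (the link to the individual removed) or destroyed.

from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Record:
    user_id: Optional[str]   # None once de-identified
    reading: float           # e.g. a sensor measurement
    purpose: str             # the purpose for which it was collected


def apply_retention(records: list[Record],
                    still_needed: Callable[[Record], bool],
                    keep_deidentified: bool = True) -> list[Record]:
    retained = []
    for record in records:
        if still_needed(record):
            retained.append(record)      # still needed: keep as-is
        elif keep_deidentified:
            record.user_id = None        # de-identify rather than retain as-is
            retained.append(record)
        # else: destroy, by not retaining the record at all
    return retained


# Example: keep only records still serving their original purpose.
data = [Record("alice", 72.0, "heart-rate alerts"), Record("bob", 68.0, "expired trial")]
data = apply_retention(data, lambda r: r.purpose == "heart-rate alerts")
```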
46 Data minimisation principles are particularly important in relation to IoT devices or services, which may be always on, and may collect large amounts of personal information passively. Wearable IoT devices such as fitness trackers are designed to collect sensitive information such as health information and biometric data. These devices may also engage in a form of constant surveillance of individuals by recording location information.
47 We recommend that Principle 5 be amended to refer to the obligation on IoT service providers, mobile application developers and retailers to apply data minimisation principles and to ensure that they take reasonable steps to destroy or de-identify personal information that they no longer need for any purpose for which the information may be used or disclosed under the APPs.
Recommendation 5 Draft Principle 5 is amended to refer to data minimisation principles applying to IoT service providers, mobile application developers and retailers: that they collect only the minimum amount of personal information required to carry out their functions or activities, and take reasonable steps to destroy or de-identify personal information that they no longer need for any purpose for which the information may be used or disclosed under the APPs.
Children
49 IoT may be specifically designed for children, including internet connected toys, game consoles and wearables. Children also frequently have access to IoT devices in other ways, for example when they use devices that belong to their parents or interact with smart devices in the home. There have been significant privacy and security incidents internationally involving children’s IoT devices and services. Regulators in other jurisdictions have taken action under surveillance laws to prohibit smart toys,[20] and have issued fines under privacy laws in relation to data breach incidents[21] and failures of notice and consent involving children’s IoT devices.[22]
50 IoT poses particular risks for children because of the significant capability of these devices to collect personal information, for example through sensors, cameras or microphones. Always-on capability means that IoT may act as a form of surveillance, collecting significant volumes of personal information, which may include sensitive information such as health and biometric information. IoT may also enable children to knowingly or unknowingly disclose their personal information, including potentially sensitive information such as photographs, videos and location information.
51 Children may be particularly vulnerable in the context of IoT and surveillance, targeting and profiling. For example, children may be targeted with inappropriate products or scams, or may have a diminished understanding of the risks of harm associated with the use of IoT, such as the collection of sensitive information and the in-built surveillance and monitoring capabilities of wearable IoT such as fitness trackers.
52 The ICO, in the UK Code of Practice, has referred to children being ‘datafied’, with ‘companies and organisations recording many thousands of data points about them as they grow up’.[23]
53 Other jurisdictions protect children in the context of IoT in a range of ways.
54 The Office of the Privacy Commissioner of Canada (OPC) has recognised inappropriate data practices, otherwise known as ‘no-go zones’. Examples of ‘no-go zones’ include: profiling or categorisation that leads to unfair, unethical or discriminatory treatment contrary to human rights law; collection, use or disclosure for purposes that are known or likely to cause significant harm to the individual; and surveillance by an organisation through the audio or video functionality of the individual’s own device. Even with consent, these practices are considered to be ‘offside’ for the purposes of Canadian privacy law.[24] A no-go zone can be designated according to specified criteria, for example the sensitivity of the personal information, the nature of the proposed use or disclosure of that information, or the vulnerability of the individuals whose personal information is being processed.[25] In its policy position on online behavioural advertising, the OPC has imposed a no-go zone under which organisations ‘should avoid knowingly tracking children and tracking on websites aimed at children’.[26]
55 Privacy protective settings aim to set certain practices to ‘off’ by default unless consent is obtained from the individual (including consent from a guardian in the case of a child). In contrast, ‘no-go zones’ are matters that are prescribed as inappropriate purposes for collection, use and disclosure of personal information for which consent cannot be obtained in any circumstance.
56 The UK Code of Practice aims to ensure children have high privacy protections by default, are provided with proper notice and guidance before changing settings, and are given adequate protections when their data is used. There are sections dealing with data minimisation, parental controls, and high privacy settings by default, such as switching off geolocation and profiling.
57 The UK Code of Practice also prohibits the use of children’s data in ‘ways detrimental to their wellbeing’ and, in the context of IoT and children’s toys, the passive collection of personal data without effective protections. It suggests features that allow passive collection or listening modes to be switched off easily, so that children’s devices can be used as non-connected devices.
58 The draft code does not specifically deal with matters relating to children. In the context of the handling of personal information of children, additional safeguards are necessary to minimise the collection, use and disclosure of children’s personal information and also to protect children from any form of targeting, profiling and surveillance.
59 The OAIC supports the implementation of a ‘no-go zone’ in relation to children and IoT. We recommend that Principle 5 in the draft code be amended to restrict the use of targeting, profiling and surveillance with IoT devices that are marketed at children.
Recommendation 6 The Department of Home Affairs should amend Principle 5 in the draft code to restrict the use of targeting, profiling and surveillance with IoT devices that are marketed at children.
61 For further information please contact Kellie Fonseca, Director, Regulation & Strategy, via [contact details removed].
Footnotes
[1] Department of Home Affairs, Australia’s 2020 Cyber Security Strategy
[2] OAIC, Australia’s 2020 Cyber Security Strategy: A call for views — submission to the Department of Home Affairs
[3] Telsyte, Australian IoT@Home Market Cracks $1bn, Paving The Way For IoT-Commerce Services. [link not available]
[4] Compliance with APP 11 is context dependent; the OAIC has published the Guide to securing personal information, which provides guidance on the reasonable steps that entities are required to take under the Privacy Act to protect the personal information they hold. The Guide is intended for use by entities covered by the Privacy Act, but may also be relevant to other organisations as a model for better personal information handling practices. See Office of the Australian Information Commissioner 2018, Guide to securing personal information, OAIC, Sydney.
[5] APP 1 also requires entities to take reasonable steps to implement practices, procedures and systems that will ensure compliance with the APPs.
[6] OAIC, Notifiable Data Breaches Statistics Report: 1 April to 30 June 2019.
[7] Draft code, 2.
[8] Data protection by design and by default is a key aspect of the European Union General Data Protection Regulation (GDPR), and this aims to ensure that appropriate data processing principles, such as data minimisation, are embedded in the practices of entities regulated by the GDPR (Article 25, European Union General Data Protection Regulation (EU) 2016/679).
[9] The OAIC has published a Guide to undertaking privacy impact assessments, which may be helpful in this regard, as well as a Privacy Impact Assessment e-learning tool.
[10] The Treasury, Government Response and Implementation Roadmap for the Digital Platforms Inquiry, 12 December 2019.
[11] Information Commissioner’s Office (UK), ‘Age appropriate design: a code of practice for online services’.
[12] A just in time notice provides clear information about what is done with children’s personal data in more specific, ‘bite-size’ explanations, at the point at which the use of the personal data is activated. It prompts the child to speak to an adult before they activate any new use of their data and not to proceed if they are uncertain. Information Commissioner’s Office (UK) Age appropriate design: a code of practice for online services, 4 Transparency.
[13] Paragraph B.58, APP Guidelines, OAIC, Sydney.
[14] APP 3.3, Schedule 1 of the Privacy Act.
[15] See Australian Competition and Consumer Commission, Digital Platforms Inquiry, Final Report, recommendation 16(b) (strengthen notice requirements).
[16] See Australian Competition and Consumer Commission, Digital Platforms Inquiry, Final Report, recommendation 16(c) (strengthen consent requirements and pro-consumer defaults).
[17] Article 17, European Union General Data Protection Regulation (EU) 2016/679.
[18] OAIC, Digital Platforms Inquiry final report — submission to the Australian Government, 23 September 2019, [35].
[19] OAIC Consumer Data Right. The OAIC has published the CDR Privacy Safeguard Guidelines.
[20] For example, the My Friend Cayla doll was banned in Germany: The Guardian, German parents told to destroy doll that can spy on children, 18 February 2017.
[21] CloudPets, manufactured by Spiral Toys, was subject to a data breach in 2017 affecting more than 800,000 users: Smart toy flaws make hacking kids’ info child’s play, CNET, 28 February 2017.
[22] For example, the Federal Trade Commission (US) settled a complaint against VTech Electronics following a significant data breach in November 2015, in which malicious actors gained access to children’s and parents’ personal information. An estimated 5 million parent records and 227,000 child records were compromised, and VTech was fined US$650,000. VTech failed both to inform parents how their children’s data would be collected and used, and to take reasonable steps to secure that data.
[23] Information Commissioner’s Office (UK) Age appropriate design: a code of practice for online services, Executive Summary.
[24] Office of the Privacy Commissioner of Canada, Guidance on inappropriate data practices: Interpretation and application of subsection 5(3), May 2018.
[25] Office of the Privacy Commissioner of Canada, A discussion paper exploring potential enhancements to consent under the Personal Information Protection and Electronic Documents Act, May 2016.
[26] Office of the Privacy Commissioner of Canada, Policy position on online behavioural advertising, December 2015.