
Carly Kind
Privacy Commissioner


The idea of banning kids from social media is a controversial one. While undoubtedly well-intentioned, it gives rise to risks, including some for children themselves. Young people like going online, and many use social media to build community, exercise autonomy and find information. According to UWS academic Amanda Third, ‘[a] social media ban will close down this avenue and force children into lower-quality online environments’. Social media platforms will no longer need to be made safe for children, so the many who are able to circumvent the ban will potentially find themselves in even more depraved places. The risks of a ban extend to adults too; after all, in order to ensure children aren’t on the platforms, the age of every internet user will need to be checked and verified. As the country’s Privacy Commissioner, I am concerned about the widespread privacy implications of a social media ban.

At the same time, I can’t help but ask why we have accepted that social media platforms are – by their very nature – places of darkness, violence and outrage. These are, after all, technological tools, designed by humans, for humans. It is within our power to change that design. One need only look at how the user experience on Twitter/X has fundamentally altered since the company’s acquisition in 2022 by Elon Musk, and his decision to reduce the number of safety engineers at the company by 80%, to understand that technical and regulatory investments in improving safety on online platforms pay off.

There is a great deal more we could be doing to make social media a better place for kids, and for everyone. Key to this equation is strengthening privacy protections online.

Social media platforms are freely available because they leverage data-driven business models, in which platforms collect individuals’ personal information in order to sell companies targeted advertising opportunities. The more time users spend on a platform, the more opportunities there are to serve them targeted ads, and the more data the platform has about them to sell to advertisers. The algorithms used by the platforms are tweaked to optimise for eyeballs on the screen. This may be an acceptable bargain in theory, but in practice what the algorithms have learned is that the content that keeps users’ attention is often the worst kind of content – outrageous, graphic, hateful, misogynistic or eye-popping content. Content that plays into our addictions, insecurities, anxieties or fears.

This may be the current state of the social media data economy, but it need not be the ultimate one. I firmly believe a number of reforms contemplated for the Privacy Act could help us shape the online environment into a better place for children. For example, legislation currently before parliament would enable my office to develop a Children’s Online Privacy Code, which would particularise the requirements of the Privacy Act for social media platforms and other tools likely to be used by children. More broadly, the proposal to introduce a fair and reasonable test for the collection, use and disclosure of personal information would dramatically increase the ability of the Office of the Australian Information Commissioner to put unfair algorithmic practices under the microscope.

In the meantime, we are already turning our focus to using our existing regulatory tools to shape the online world for the better. Half of Australians surveyed in our latest Australian Community Attitudes to Privacy Survey said that social media platforms collecting information about them was one of the biggest privacy risks they face, and similar numbers expressed concern about websites, apps and devices collecting their location and other data. To meet this concern, we’ve begun scrutinising the use of tracking pixels. These are one of the technologies websites and social media companies use to share personal information about users, enabling tracking and surveillance across platforms. We’ve recently issued guidance on how websites can comply with the Privacy Act when using tracking pixels, particularly on websites that might involve sensitive information. We will consider taking enforcement action if we uncover particularly concerning non-compliance with the Privacy Act in this space – especially where children are concerned.

We should not be too quick to accept that social media is so bad that it needs to be banned for the most vulnerable. Changes, even small and incremental ones, could positively shape that environment. Strong privacy law, and the application of that law in the online domain, is part of that puzzle.