YouTube has warned that Australia’s new social media restrictions for users under 16 could make the platform less safe for children by removing key parental control tools that families depend on.
Beginning 10 December, teenagers under the age of 16 will be automatically signed out of their YouTube accounts and barred from uploading videos or posting comments, as the Social Media Minimum Age Act comes into effect. While minors will still be able to view videos, doing so without an account will disable features such as content filters, blocked channels, and wellbeing reminders. The YouTube Kids app will remain unaffected.
The company criticised the government’s policy as a form of “rushed regulation,” arguing that it undermines years of progress in building online safety features for families.
“Parents will lose their ability to supervise their teen or tween’s account,” said Rachel Lord, senior manager of public policy for Google and YouTube Australia. She added that the new law threatens to undo more than a decade of work developing safety tools.
“Most importantly, this law will not fulfil its promise to make kids safer online, and will, in fact, make Australian kids less safe,” Lord said, noting that many parents and educators share these concerns.
In response, Communications Minister Anika Wells dismissed YouTube’s criticism, describing it as “outright weird” for the company to warn about safety issues on its own platform. “If YouTube is reminding us all that it is not safe, that is a problem YouTube needs to fix,” she said.
YouTube was initially excluded from the ban, but the government reversed its position in July after the eSafety Commissioner determined that the platform was most frequently cited by children aged 10 to 15 who encountered harmful content. Reports indicate that Google has explored a potential legal challenge to the law, though the company has not confirmed this publicly.
As the ban nears implementation, regulators have turned their attention to rapidly growing platforms Lemon8 and Yope, both popular among teenagers. eSafety Commissioner Julie Inman Grant has requested that the two platforms self-assess whether they fall within the scope of the new restrictions.
Wells acknowledged that the rollout could face early challenges, saying the first few weeks may bring “teething problems.” Nonetheless, she insisted that the legislation is vital to protect Generation Alpha from what she described as “predatory algorithms” designed to deliver a “dopamine drip” of addictive content.
Under the new law, platforms including YouTube, Facebook, Instagram, TikTok, Snapchat, X, Twitch, Threads, Reddit, and Kick must deactivate existing under-16 accounts, prevent the creation of new ones, and report their compliance every six months. Companies that fail to comply face fines of up to A$49.5 million (US$33 million, £25 million).