Google Criticizes Australia’s New Social Media Law for Teens, Calls It Hard to Enforce
Teen safety on social media has become a global concern, and governments are starting to take bold action. In late 2024, Australia passed a new social media law that requires platforms such as Instagram, TikTok, and YouTube to keep users under 16 off their services; there is no parental-consent exemption. The government says the move will protect teens from cyberbullying, harmful content, and data misuse. However, not everyone agrees with the plan. Tech giant Google has publicly criticized the law, arguing that it is extremely difficult to enforce in the real world.
How do you truly verify someone’s age online? What if teens lie or use fake accounts? What happens to privacy when companies collect sensitive data to confirm identity? These questions have sparked a heated debate between lawmakers and tech companies. Australia believes strict rules are the answer, while Google warns the law may create more problems than it solves.
This issue is not just about one country. It reflects a global struggle to balance teen safety, privacy rights, and technology freedom in a rapidly changing digital world.
Background and Goals of the Law
Australia passed amendments to the Online Safety Act in late 2024 to set a new minimum age for social media. The law aims to keep children under 16 off age-restricted platforms. Canberra says the rule will reduce exposure to harmful content, cyberbullying, and addictive features.
The government also plans to hold platforms accountable if they fail to take “reasonable steps” to prevent under-16s from having accounts. The Social Media Minimum Age framework is due to apply from 10 December 2025.
What Does the Law Require?
Platforms designated as “age-restricted social media” must implement age assurance measures. They do not have to use a single method. Instead, regulators say platforms can use a mix of tools. These include AI-driven age inference, behavioural signals, and third-party checks.
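To make the "mix of tools" idea concrete, the sketch below shows one way a platform might combine several age-assurance signals into a single decision. It is a minimal illustration only: the signal names, thresholds, and escalation logic are hypothetical, and nothing here reflects the Australian framework or any real platform's system.

```python
from dataclasses import dataclass

@dataclass
class AgeSignals:
    """Hypothetical signals a platform might gather for one account."""
    ai_inference_age: float | None   # estimated age from an AI model, if available
    ai_confidence: float             # model confidence in [0, 1]
    behavioural_score: float         # 0 = adult-like activity, 1 = child-like activity
    third_party_verified_over_16: bool | None  # result of an external check, if run

def likely_under_16(s: AgeSignals) -> bool:
    """Combine signals, preferring stronger evidence when available.

    Thresholds here are illustrative guesses, not regulatory guidance.
    """
    # A third-party check, when present, overrides weaker inference.
    if s.third_party_verified_over_16 is not None:
        return not s.third_party_verified_over_16

    # A high-confidence AI estimate well below 16 is treated as a strong signal.
    if s.ai_inference_age is not None and s.ai_confidence >= 0.8:
        return s.ai_inference_age < 16

    # Otherwise fall back on behavioural signals alone.
    return s.behavioural_score > 0.7

# Example: ambiguous AI estimate, child-like behaviour, no external check yet.
print(likely_under_16(AgeSignals(17.0, 0.5, 0.85, None)))  # True -> escalate to a stronger check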
Platforms may face fines or enforcement action if they do not reasonably prevent children under 16 from holding accounts. The government has said guidance and trials will continue throughout 2025 to shape how rules are applied.
Google’s Criticism and Main Objections
Google has publicly warned that the law will be “extremely difficult” to enforce in practice. Company representatives told Australian parliamentary hearings that rules based on age inference and account deactivation would create technical and social problems.
Google also stressed that including services like YouTube, which is often used for education, could harm users who rely on the platform for learning. The company said stricter safety tools and parental controls are better levers for protecting young people.
Why Do Tech Firms Call Enforcement Hard?
Verifying age online is hard. Teens can present fake IDs, borrow adult accounts, or hide behind VPNs. Behavioural signals and AI models are imperfect: they can misclassify people or generate false positives. And global platforms must obey many different laws at once.
A solution designed for Australia could clash with privacy rules elsewhere. Collecting extra data to confirm age may itself raise privacy and safety concerns. These technical limits are at the heart of industry objections.
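A short worked example shows why false positives matter at scale. Assume, purely for illustration, that 15% of a platform's users are under 16 and that an age classifier is 95% accurate for both age groups. Bayes' rule then says that roughly one in four accounts the model flags as under-16 actually belongs to an adult:

```python
# Illustrative base-rate arithmetic for age-classifier false positives.
# All numbers are assumptions chosen for the example, not measured rates.

base_rate_under_16 = 0.15   # assumed share of users who are actually under 16
sensitivity = 0.95          # assumed P(flagged | under 16)
specificity = 0.95          # assumed P(not flagged | 16 or over)

p_flagged = (sensitivity * base_rate_under_16
             + (1 - specificity) * (1 - base_rate_under_16))

# Bayes' rule: share of flagged accounts that really belong to under-16s.
p_under_16_given_flagged = sensitivity * base_rate_under_16 / p_flagged

print(f"Flagged accounts truly under 16: {p_under_16_given_flagged:.1%}")   # ~77.0%
print(f"Flagged accounts that are adults: {1 - p_under_16_given_flagged:.1%}")  # ~23.0%
```

On a platform with tens of millions of users, even this hypothetical 95%-accurate model would wrongly flag a very large number of adults.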
Social Media Law: Privacy and Legal Tension
The law pushes platforms to collect and analyse more data about users. That can conflict with data minimisation principles in privacy law. Experts warn that storing more identity information creates new risks. If verification systems are breached, sensitive information about young people could leak. Legal scholars also note that platforms face unclear liability if their age checks fail. Policymakers must balance protection against the risk of creating new harms.
Wider Industry and Civil Responses
Reactions are mixed. Some child-safety advocates applaud a tougher stance. Others worry the law will push teens to unregulated corners of the internet. Major platforms such as TikTok and Meta have questioned parts of the plan in public commentary and through industry channels. Tech associations urge clearer guidance and better tools rather than blunt access restrictions. At the same time, parent groups ask for stronger safety education and easier parental controls.
Possible Unintended Consequences
A strict age rule could fragment online youth culture. Teens barred from mainstream platforms may migrate to smaller services with weaker moderation. That could increase exposure to harmful content rather than reduce it. For educators and researchers, removing accounts also limits the ability to detect and respond to online harm. Some experts warn about social isolation and reduced digital literacy if children have less supervised exposure to mainstream spaces.
Enforcement Mechanics and State Readiness
Australia’s eSafety regulator is running trials and consultations through 2025 to test age-assurance approaches. Officials have said they will publish technical guidance and hold industry consultations. Still, regulators face a steep learning curve. Effective enforcement will need clear standards, reliable technical audits, and cross-border cooperation with global firms. The timeline to 10 December 2025 leaves a limited runway for full implementation.
Innovation Opportunities for Platforms
The law could spur new verification tools that are privacy-preserving. Techniques such as zero-knowledge proofs, device-based attestations, and improved parental control interfaces are possible paths. Startups may build niche services that help platforms comply. Any technical fix must be safe, affordable, and respectful of user rights.
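To illustrate the data-minimisation principle behind device-based attestations, here is a deliberately simplified sketch in which a trusted verifier signs only a boolean "over 16" claim, so the platform never sees a birthdate or ID document. The verifier, key, and message format are all hypothetical, and a real deployment would use asymmetric signatures or zero-knowledge proofs rather than the shared HMAC key used here for brevity.

```python
import hmac, hashlib, json

# Sketch of a data-minimising age attestation, assuming a trusted verifier
# (e.g. an OS vendor or ID service) that shares a secret key with the platform.
# This only illustrates the "share a boolean, not a birthdate" idea.

VERIFIER_KEY = b"demo-key-not-for-production"

def issue_attestation(user_id: str, over_16: bool) -> dict:
    """Verifier checks the user's age privately, then signs only the result."""
    claim = json.dumps({"user": user_id, "over_16": over_16}, sort_keys=True)
    tag = hmac.new(VERIFIER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def platform_accepts(att: dict) -> bool:
    """Platform verifies the signature; it never learns the birthdate."""
    expected = hmac.new(VERIFIER_KEY, att["claim"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, att["tag"]) and json.loads(att["claim"])["over_16"]

att = issue_attestation("user-123", over_16=True)
print(platform_accepts(att))  # True: account allowed without exposing identity data
```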
What Should Parents and Schools Do Now?
Parents should review privacy and safety settings on apps today. They should talk with children about safe habits. Schools should teach media literacy and coping skills. Those measures do not replace regulation. But they do reduce risk while governments, platforms, and civil society test technical approaches.
Conclusion and the Road Ahead
Australia’s law is ambitious and controversial. It aims to protect children, but it raises hard technical and legal questions. Google’s critique highlights real enforcement limits. Policymakers must refine rules to avoid new harms. Industry and civil society will need to work together to create practical, privacy-respecting solutions before 10 December 2025. The global debate on youth safety online will almost certainly continue.