Australia is set to become the first country to implement a nationwide ban on social media accounts for children under 16, starting December 10. In a move to enhance child protection online, Meta has begun the process a week early by deactivating accounts of children aged 13 to 15 across its platforms, including Facebook, Instagram, and Threads.
The ban arrives amid growing concern about the psychological and social impact of social media on children. Recent studies show that a significant share of Australian youth are exposed to harmful content online, underscoring the urgency of legislative action to safeguard younger users.
Key Developments
- Meta is deactivating accounts for users aged 13 to 15 starting December 4, affecting around 500,000 accounts.
- The new social media ban, which imposes heavy fines for non-compliance, officially begins on December 10.
- Meta suggests a regulated age verification process across app stores to streamline compliance.
- Concerns have been raised that the ban may isolate children and push them to less-regulated platforms.
- YouTube, initially exempt from the ban, is now included and criticized the legislation as “rushed.”
Full Report
Implementation Ahead of Schedule
Meta began notifying users aged 13 to 15 last month that their accounts would be deactivated ahead of the official ban. An estimated 150,000 Facebook accounts and 350,000 Instagram accounts will be affected. Threads, which requires an Instagram account for access, is also included in the shutdown.
Government Support and Enforcement
The Australian government is pursuing the ban primarily to shield children from the negative effects of social media, such as cyberbullying and exposure to inappropriate content. Communications Minister Anika Wells emphasized the need to protect Generation Alpha, pointing to the influence of “predatory algorithms” on young users. The government has set fines of up to A$49.5 million (approximately US$33 million) for companies that fail to comply with the age restrictions.
Public Concerns
Despite the intent behind the legislation, critics argue that the ban could alienate children who rely on these platforms for social interaction. Wells said the government would closely monitor developments to ensure children do not migrate to less regulated apps such as Lemon8 and Yope; Australia’s eSafety Commissioner has contacted both for compliance assessments. Yope’s CEO, however, claimed that the platform is designed as a private messaging service rather than a social media site and is therefore not subject to the ban.
Meta’s Position
A spokesperson for Meta reaffirmed the company’s commitment to compliance but argued for a standardized age-verification approach applied across all apps, ideally at the app-store level. Users identified as under 16 will have the opportunity to download their data before their accounts are deactivated. They can also request a review if they believe they have been misidentified, a process that involves submitting personal identification or a “video selfie.”
YouTube’s Critique
YouTube, which initially fell outside the ban’s scope, has since been included. The company argues that restricting minors undermines its built-in parental controls and may leave young users less safe. Noting that 96% of Australian children aged 10-15 use social media, its criticism underscores the tension between user safety and open access to communication tools.
Context & Previous Events
Earlier this year, a study commissioned by the Australian government revealed alarming statistics regarding harmful content exposure among young social media users. The findings indicated that over half of Australian youth aged 10-15 have faced cyberbullying, while many reported encounters with inappropriate or violent material as well as manipulative content targeting mental health issues.