With Australia's landmark social media ban for under-16s set to begin on 10 December, a significant shift is already underway. Children and teenagers are migrating en masse to lesser-known platforms not covered by the initial legislation, raising serious questions about the policy's effectiveness and potential unintended consequences.
The Great Migration to Unregulated Platforms
The ban will notionally block access to ten major platforms, including TikTok, Instagram, Snapchat, YouTube, Reddit, Twitch, Kick, and X. However, as the deadline approaches, apps like Lemon8, Coverstar, and Yope have seen a dramatic surge in downloads from Australia. Lemon8, a photo and video app owned by TikTok's parent company ByteDance, and Yope, a private photo messaging app, currently hold the top two spots in Apple's Australian lifestyle download charts.
This exodus has been actively encouraged. One teen TikTok influencer promoted the US-based app Coverstar in a paid collaboration video, stating: "The social media ban is fast approaching, but I found the new cool app we can all move to." The video was later deleted after The Guardian made enquiries.
Safety Concerns and Regulatory Whack-a-Mole
Experts are deeply concerned that the legislation may simply push young users into less regulated digital spaces, potentially increasing their risk. Dr Catherine Page Jeffery, a digital media expert at the University of Sydney, warns that the ban could make children more secretive about their online activity. "There is a very real possibility that if young people do migrate to less regulated platforms... they become more secretive about their social media use because they’re not supposed to be on there," she says. This secrecy could prevent them from seeking help if they encounter harmful material.
The government has stated its ban list is "dynamic," but critics argue this sets up a game of "whack-a-mole" with tech companies. Australia's eSafety commissioner, Julie Inman Grant, has already written to Lemon8 and Yope, recommending they self-assess whether the new laws apply to them.
Examining the Alternative Platforms
Coverstar markets itself as a safer, AI-powered alternative for "Gen Alpha" that lacks direct messaging. However, it allows children as young as four to livestream and post content, requiring only a video of verbal parental permission for users under 13. Dr Jennifer Beckett from the University of Melbourne questions its heavy reliance on AI moderation, noting that automated systems often miss nuance and context.
Yope is a photo-messaging app pitched as a Snapchat alternative. Its CEO, Bahram Ismailau, claims it is exempt as a messaging service, akin to WhatsApp. Despite its terms stating it is for over-13s, The Guardian successfully created an account for a fictional four-year-old without any age verification or parental consent.
The Chinese app Rednote (Xiaohongshu) is another potential destination. Dr Beckett suggests its Chinese origins might mean stronger content moderation, but cybersecurity experts flag extensive data collection practices.
An Inevitable Circumvention
The consensus among observers is that determined young people will find a way. "They’re going to get around it," asserts Dr Beckett. "Kids are geniuses when it comes to pushing boundaries." Anecdotal reports suggest some are discussing creating their own forums via website builders or using shared Google Docs to communicate.
This digital cat-and-mouse game highlights the fundamental challenge of legislating adolescent online behaviour: children may simply trade the perceived dangers of mainstream platforms for the unknown risks of emerging, less-scrutinised apps.