WhatsApp to Allow Under-13s with Parental Consent in Major Policy Shift

Meta, the parent company of WhatsApp, has unveiled a significant policy change that will allow children under the age of 13 to create accounts on the popular messaging platform, provided they have explicit parental consent. This move comes in response to growing demand from parents who seek a safer digital environment for their younger children to communicate.

New Account Structure and Parental Controls

Under the new system, which is set to roll out over the coming months, parents will be responsible for signing up their children for WhatsApp accounts. These accounts will be directly linked to the parent's account, with robust parental controls enabled to manage the child's online experience. This linkage must remain intact until the child reaches their 13th birthday, though parents have the option to extend it for up to an additional 12 months if desired.

Restricted Features for Young Users

The under-13 accounts will operate with limitations designed to enhance safety. Children will only have access to basic calling and messaging functions. They will not be able to use Meta AI, the artificial intelligence chatbot integrated across Meta's platforms, nor will they have access to status updates or channels. Additionally, disappearing messages will be disabled for one-on-one conversations on these accounts.

Parents will wield significant control over their child's account, including the ability to manage who can contact the child, which groups they can join, and the account's privacy settings. However, in a nod to privacy, parents will not be able to view the contents of messages, as communications will remain end-to-end encrypted. Parents will instead receive notifications if a group their child belongs to grows in size or if someone disables disappearing messages within a group.

Age Verification and Enforcement

When questioned about how WhatsApp plans to enforce the age restrictions and whether new age verification tools will be introduced, Meta said the responsibility lies with parents, arguing that they are best positioned to judge when their child is ready for WhatsApp. If Meta discovers an account being used by someone under 13 without a parental link, the account will be blocked until it is properly connected to a parent or guardian.

Meta acknowledged that it does not currently have data on how many under-13s are already using WhatsApp, but emphasized that this new feature was developed in response to parental requests for a safer platform for children. The standard minimum age for WhatsApp accounts remains 13 globally, aligning with UK law, which prohibits companies from processing data of children under 13 without parental consent.

Regulatory Context and Precedents

This policy shift occurs against a backdrop of increased regulatory scrutiny. Last month, the Information Commissioner's Office fined Reddit over £14 million for failing to implement adequate age checks to prevent under-13s from using its platform, a fine that Reddit is currently appealing. This highlights the growing pressure on tech companies to address child safety and data protection concerns effectively.

Meta's decision to introduce controlled access for younger users reflects a broader shift in the tech industry towards balancing product reach with responsible oversight, as debates over social media use by minors continue to evolve.