WhatsApp has introduced parent-managed accounts, allowing parents or guardians to set up and oversee accounts used by pre-teens (under 13). The company says the feature gives parents control over key safety and privacy settings while allowing children to use WhatsApp primarily for messaging and calls.
Key controls include:
- Contact controls: Parents can decide who can contact the account.
- Group controls: Parents can approve or manage which groups the child can join.
- Message request review: Parents can review requests from unknown contacts.
- Privacy settings: Parents can manage the account’s privacy settings.
- Parent PIN: Required to access or change these controls.
WhatsApp said personal conversations remain protected by end-to-end encryption, meaning messages and calls cannot be accessed by anyone outside the chat, including the company. However, WhatsApp has not specified a timeline for the rollout or which regions will get the feature first. The FAQ page reads, “Parent-managed accounts are rolling out gradually and may not be available in your region.”
Parental Supervision On Social Media
Other Meta platforms already have parental supervision tools. For example, Instagram’s “Teen Accounts” default to private profiles, restrict messaging, and allow parents to view interaction history and set time limits.
Meta has also introduced safeguards across Messenger and Instagram that block direct messages from unknown users unless a teen is connected to them. Other platforms have launched similar tools: Google, Snapchat, and TikTok offer family supervision features such as screen-time limits, contact restrictions, and content filters.
Malcolm Gomes, Chief Operating Officer at Privy by IDfy, said WhatsApp’s parent-managed accounts are a positive step but warned that compliance with India’s Digital Personal Data Protection Rules 2025 (DPDP) depends on how platforms authenticate parents, collect data, and record consent.
Does WhatsApp meet DPDP’s “verifiable consent” test?
Gomes said WhatsApp’s design is “a strong step in the right direction,” but warned parental linking may not meet the legal threshold.
He explained, “public reports describe parental linking and controls, which is not the same as ‘verifiable consent’ under Indian law”. Meeting the DPDP standard, he said, depends on “the exact age verification flow, the evidence collected, the parent or guardian authentication, and proof of the adult’s authority to act on behalf of the child”.
If the setup only links accounts without verifying the adult’s identity or authority, Gomes said platforms may need extra verification. He emphasized, “It’s not about the parental design; consent must be verifiable, auditable, and tied to lawful child data processing”.
Balancing parental oversight with privacy
Gomes said parental-control systems must comply with the DPDP law’s data-minimisation principles. He said, “If child safety is the goal, platforms should let parents control settings, contact approvals, group permissions, and suspicious requests without routine content surveillance”.
He added that this distinction matters particularly for encrypted messaging services. “A parent may need supervisory controls over the account environment, but that does not automatically justify broad collection or retention of additional personal data about the child”. He said platforms should collect only what is necessary to establish age, verify the consenting adult, keep consent records, and operate parental settings.
More broadly, Gomes said the debate should not be framed as a choice between privacy and control. Instead, “the real challenge is designing systems that keep children safer online while still protecting their data and dignity”. In the long run, he said the ecosystem may need “privacy-preserving trust infrastructure for age assurance and parental consent” rather than fragmented verification systems created by individual platforms.