New Delhi: The Digital Personal Data Protection Act (DPDP Act) of 2023 aims to safeguard individuals’ personal data, particularly that of minors. The DPDP Act, for which the rules were released for public consultation earlier this month, mandates that platforms cannot process the data of individuals under 18 without verifiable parental consent.
Despite this, the lack of robust mechanisms to verify a user’s age has raised questions about the effectiveness of the DPDP rules in protecting children’s data. Social media platforms often rely on self-declared dates of birth, which can be easily falsified. This discrepancy has created a loophole that undermines the very intent of the law.
Dr Pavan Duggal, Advocate, Supreme Court of India and cyberlaw expert, shared his insights on the challenges and potential solutions regarding age verification under the DPDP Act in an exclusive conversation with APAC News Network and CXO News. The discussion highlighted concerns about existing practices, the role of technology, and recommendations to strengthen the DPDP Rules.
How does the DPDP Act address the issue of falsified age information, and what are the potential legal consequences for platforms that fail to adequately verify the age of their users?
The DPDP Act does not really address the issue of falsified age information. However, it does require data fiduciaries to ensure that appropriate age verification mechanisms are put in place. If these service providers or data fiduciaries fail to comply, and it is later found that their age verification mechanisms are not up to the mark, the potential legal consequences for platforms that fail to adequately verify the age of their users are defined in crystal-clear terms under the DPDP Act.
They will be liable to pay fines of up to Rs 250 crore per contravention. This is going to put a lot of undue stress and strain on intermediaries, platforms and data fiduciaries. Hence, there is a need for far more clarity under the draft DPDP rules on dealing with the issue of falsified age information.
Unfortunately, the current draft DPDP rules and the DPDP Act do not discuss the legal consequences of falsified age information, barring a Rs 10,000 fine on data principals. This assumes more significance in the Indian context.
Since the Indian ‘Jugaad School of Management’ looms large, it is often found that children will go ahead and furnish falsified age information in order to use the services provided by data fiduciaries. The legal implications of this discrepancy need to be dealt with specifically and in greater detail under the draft DPDP Rules 2025.
What are the limitations of current age verification methods used by social media platforms, and what alternative or supplementary measures could be implemented to enhance accuracy?
Right now, social media platforms only ask you to provide your date of birth at the time of registration in order to establish your age. There is no way to find out whether the age stipulated by the user on the social media platform is the correct age or not, and no supporting documentation is sought either.
Consequently, whatever the user declares on social media is taken at face value. This in itself encourages the generation of more falsified age information. Under the draft DPDP Rules 2025, social media platforms will be made liable if it is later found that falsified age information was furnished on the platform, as a result of which the said service providers and data fiduciaries failed to ensure compliance with the provisions of the DPDP Act 2023.
Several other alternative or supplementary measures will also have to be implemented to enhance accuracy. This can be done in the form of appropriate stipulations under the draft DPDP rules. These could include, inter alia, the following:
- Social media platforms must require users to upload valid government-issued identification (e.g., Aadhaar card, passport, or driving license) at the time of registration to verify their age. This documentation should be processed in a secure and privacy-compliant manner, with provisions for periodic audits by regulatory bodies.
- Platforms can implement AI-powered facial recognition systems to estimate the user’s age during registration. This technology must adhere to strict data privacy standards and should not store biometric data without user consent.
- Platforms should introduce a two-step age verification process, which includes initial self-declaration followed by an OTP (One-Time Password) sent to a government-registered mobile number or email ID linked to a verified identity.
- Social media platforms should conduct periodic, randomized checks to verify users’ ages after registration. Users flagged for suspicious activity or non-compliance should be required to revalidate their age through additional documentation.
- For users under 18 years of age, platforms must mandate parental or guardian consent. A valid consent mechanism, such as digitally signed forms or authenticated government ID of the guardian, could be implemented.
- To ensure compliance with data protection principles, platforms must adopt data minimization practices, where only the minimum required data for age verification is collected, stored, and retained for a predefined period.
- Social media platforms must proactively educate users about the importance of providing accurate age information and the legal consequences of falsifying data. This can be achieved through pop-up messages, onboarding tutorials, and community guidelines.
- Strict penalties must be outlined in the draft DPDP rules for social media platforms that fail to implement robust age verification measures. These penalties could range from monetary fines to suspension of services.
- Platforms could collaborate with certified third-party age verification agencies to conduct unbiased and secure verification processes, reducing the burden on individual platforms and ensuring uniformity.
- Social media platforms must submit an annual compliance report detailing their age verification mechanisms, challenges, and resolutions to the Data Protection Board of India.
- For enhanced transparency and security, platforms could explore the use of blockchain to create tamper-proof age verification records, ensuring that the data is immutable and securely stored.
- Platforms should provide a separate, highly regulated version of their services for users identified as minors. This version should include stricter content controls, advertising restrictions, and monitoring features.
- Independent regulatory bodies or auditors must be tasked with monitoring and evaluating the age verification practices of social media platforms periodically.
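The two-step verification described above (self-declaration followed by an OTP sent to a verified channel) can be sketched in miniature. This is an illustrative sketch only, not a design prescribed by the DPDP rules: the function names, the six-digit OTP format, and the five-minute validity window are all assumptions, and a real deployment would deliver the OTP to a government-registered mobile number or email ID rather than returning it.

```python
import hmac
import secrets
import time
from hashlib import sha256

OTP_TTL_SECONDS = 300  # assumed five-minute validity window

def start_verification(declared_dob: str) -> tuple[str, dict]:
    """Step 1: record the self-declared date of birth and issue an OTP."""
    otp = f"{secrets.randbelow(10**6):06d}"  # six-digit one-time password
    pending = {
        "declared_dob": declared_dob,
        # Store only a hash of the OTP, never the OTP itself
        "otp_hash": sha256(otp.encode()).hexdigest(),
        "expires_at": time.time() + OTP_TTL_SECONDS,
    }
    # Returned here for demonstration; in practice the OTP is sent
    # out-of-band to the verified contact channel.
    return otp, pending

def confirm_verification(pending: dict, submitted_otp: str) -> bool:
    """Step 2: the user relays the OTP received on the verified channel."""
    if time.time() > pending["expires_at"]:
        return False  # expired challenge
    # Constant-time comparison to avoid timing side channels
    return hmac.compare_digest(
        sha256(submitted_otp.encode()).hexdigest(), pending["otp_hash"]
    )
```

Hashing the stored OTP and comparing in constant time keeps the pending record useless to an attacker who reads it, which aligns with the data minimization principle noted above.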
By implementing these measures under the draft DPDP Rules 2025, the government could aim to strengthen the accountability of social media platforms, minimize the risk of falsified age data, and ensure compliance with the provisions of the DPDP Act 2023. This approach will also foster a safer and more regulated digital ecosystem for users of all age groups.
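The blockchain-style tamper-proof records suggested among the measures above can be approximated, in miniature, by a hash chain in which each age-verification entry commits to its predecessor, so any retroactive edit is detectable. This is a minimal sketch under stated assumptions, not the rules' prescribed design; the field names (`user_id`, `method`, `prev_hash`) are illustrative.

```python
import json
import time
from hashlib import sha256

GENESIS = "0" * 64  # placeholder hash for the first entry

def _hash(body: dict) -> str:
    # Canonical JSON so the hash is independent of key order
    return sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, user_id: str, method: str) -> None:
    """Append an age-verification event linked to the previous entry."""
    entry = {
        "user_id": user_id,
        "method": method,  # e.g. "government-id", "otp"
        "timestamp": time.time(),
        "prev_hash": chain[-1]["hash"] if chain else GENESIS,
    }
    entry["hash"] = _hash({k: v for k, v in entry.items() if k != "hash"})
    chain.append(entry)

def verify_chain(chain: list) -> bool:
    """Recompute every link; any retroactive edit breaks the chain."""
    prev = GENESIS
    for entry in chain:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if entry["prev_hash"] != prev or _hash(body) != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

A production system would distribute such records across independent auditors or a permissioned ledger; the point of the sketch is only that immutability comes from each record committing to the hash of the one before it.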
Can technology such as facial recognition or voice analysis play a role in improving age verification processes while respecting privacy concerns?
Technology such as facial recognition or voice analysis could play a role in improving age verification processes while respecting privacy concerns. However, with the growing adoption of deepfake mechanisms, and with falsified KYC information now being generated at the drop of a hat by various AI applications, there will be intrinsic limitations on the extent to which facial recognition or voice analysis can be relied upon to improve age verification processes.
This becomes even more important as the data fiduciary will not have the ability to adjudicate on the falsity or otherwise of facial recognition or voice analysis inputs. Further, the ramifications of facial recognition or voice analysis infringing the fundamental right to privacy, as part of the fundamental right to life under Article 21 of the Constitution of India, are also going to present an immense practical problem.
What specific amendments or clarifications to the DPDP Rules would you recommend to better address the challenges of age verification and ensure the effective protection of children’s data?
Specific amendments or clarifications to the DPDP Act and the DPDP rules are required to better address the challenges of age verification and ensure the effective protection of children’s data.
- Specific parameters of age verification mechanisms and processes, procedures and practices have to be detailed and defined.
- Aspects pertaining to cyber security in the context of securing the data of children, and of data principals generally, have not been adequately incorporated under the draft DPDP Rules 2025. These need to be appropriately defined.
- Further, there is also a need for providing sustainable age verification methods which are not capable of being bypassed easily.
- In addition, the issues pertaining to the reasonable security practices, procedures and processes that have to be followed by companies in the area of age verification have to be incorporated as part of the proposed draft DPDP rules 2025. This is all the more important as once guidance regarding age verification is given under the draft rules, that is going to help in adopting a more uniform and harmonized approach in the entire issue of dealing with the challenges of age verification.