The recently proposed Rule 10 of the Digital Personal Data Protection (DPDP) Rules, 2025, while aiming to safeguard children's privacy, raises significant concerns about unintended consequences for marginalised communities. The rule mandates that verifiable parental consent be obtained before processing a child's personal data. On paper, this seems like a protective measure; on closer examination, however, it might inadvertently exclude children of illiterate parents from accessing social media and digital platforms. An indirect effect resembling Australia's recent blanket social media restrictions for minors also becomes noticeable.
The Core of Rule 10: Safeguarding or Gatekeeping?
Rule 10 requires data fiduciaries to adopt technical and organisational measures to verify the identity and age of parents or guardians. On the surface, this sounds perfectly reasonable. Who wouldn't want to protect children online? But as we peel back the layers, we find a troubling paradox.
This Rule, designed to shield children, might actually shut the digital door in the faces of the most vulnerable among them.
The core problem lies in the reliance on digital verification. It assumes a level playing field of digital literacy and access that simply does not exist in India or, indeed, in many parts of the world.
Imagine a parent who has never used a computer, let alone DigiLocker. How can they possibly provide the ‘verifiable consent’ required by Rule 10? The answer, tragically, is that they cannot, and their children pay the price. Under the Rule, this verification can be done using:
• Reliable identity details already available with the fiduciary.
• Voluntarily provided identity details or tokens issued by government entities (e.g., DigiLocker services).
The Problem: Accessibility and Awareness
1. Exclusion of Illiterate Parents: Many illiterate parents in India lack digital literacy or access to services like DigiLocker. Without the ability to verify their identity, they cannot provide the required consent, effectively barring their children from social media and other online platforms. In a country as diverse as India, digital accessibility is far from universal.
While Rule 10 emphasises the importance of protecting children in the digital realm, it inadvertently creates barriers for illiterate or digitally illiterate parents.
A significant share of India's population lacks the digital literacy required to navigate platforms like DigiLocker or other identity-verification services. Parents who cannot provide the mandated digital consent are excluded from enabling their children to participate in the digital world. As a result, their children are effectively barred from social media, online learning platforms, and any other digital service, a restriction that could limit their educational and social development.
Imagine a farmer in rural India who wants his child to access an educational platform. Without access to, or an understanding of, digital verification tools, his intent to support his child's learning is frustrated. This problem is not merely about digital access; it reflects how technological progress can unintentionally marginalise those on the periphery of the digital economy.
2. Marginalisation of Underprivileged Communities: A significant portion of India's population, especially in rural areas, remains outside the digital economy. Requiring digital identity verification could disproportionately impact children from these communities, creating a digital divide that exacerbates social inequities.
India’s digital divide is stark, and Rule 10 risks exacerbating it. Many children from rural or economically underprivileged backgrounds rely on digital platforms for education, information and entertainment.
However, requiring digital identity verification as a prerequisite to access these platforms disproportionately impacts children from these communities. They often lack access to the necessary infrastructure, such as stable internet connections, smartphones, or government-issued digital IDs. This creates a scenario where children from well-off, digitally connected families can thrive online, while others are left behind, perpetuating a cycle of inequity.
For example, while urban parents might seamlessly use DigiLocker to verify their identities, rural parents may struggle due to poor internet connectivity, lack of necessary documentation, or limited understanding of digital tools. This divide has broader implications for social mobility, as children from underprivileged communities miss out on opportunities available to their urban counterparts.
3. Over-reliance on Government Infrastructure: The Rule heavily depends on government-backed digital verification systems like DigiLocker. While these systems are robust, their reach is limited in rural and underserved areas where many parents may not have the required documents or familiarity with the system.
Rule 10 leans heavily on government-backed digital verification systems like DigiLocker, which are undoubtedly secure and efficient. However, their reach and adoption remain limited in rural and underserved areas. Many parents in these regions do not possess the required documents, such as Aadhaar-linked credentials or digitally verifiable age proof. Others may not even be aware that such systems exist, let alone understand how to use them. This over-reliance raises questions about the feasibility of implementing Rule 10 in its current form.
While DigiLocker and similar platforms are robust solutions, they are not yet inclusive enough to cater to India’s vast and diverse population. The challenge is not only infrastructural but also educational—parents need to be made aware of these tools and trained to use them effectively. Without significant investment in outreach and digital literacy programmes, the promise of Rule 10 may remain out of reach for millions.
Illustrative Cases
Let us analyse scenarios covered by Rule 10 to highlight its practical implications:
• Case 1: Parent is digitally literate and registered
A child (C) informs the data fiduciary (DF) that she is a minor. The parent (P), already registered with DF, provides her identity details, which DF verifies. The child is granted access. This is the ideal scenario: a smooth transaction in the digital world. But it represents a privileged minority.
Outcome: Seamless process for families with digital literacy and access.
• Case 2: Parent is unregistered but digitally literate
P identifies herself as C’s parent but is not registered. DF verifies P’s identity using DigiLocker or similar tools. This scenario represents the struggling adapters: those who are trying to navigate the digital world but might face hurdles along the way. It highlights the digital divide, where access and technical know-how are not evenly distributed.
Outcome: Requires technical know-how and access, potentially excluding illiterate parents.
• Case 3: Parent lacks digital literacy or documentation
P cannot use DigiLocker or provide verifiable identity details, so DF cannot confirm her identity. This is the heartbreaking reality for many. It is not just about illiteracy; it is about a lack of access to devices, internet connectivity and even basic digital awareness. This is not merely a technical problem; it is a social justice issue.
Outcome: Child’s access to social media is denied or effectively banned by law.
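The three cases above amount to a simple decision flow. The following is a minimal, purely illustrative sketch of that flow; the function and field names are my own shorthand, not terminology drawn from the Rules, and real verification would involve far more than two boolean checks:

```python
from dataclasses import dataclass

@dataclass
class Parent:
    registered_with_fiduciary: bool  # Case 1: identity details already held by the DF
    digilocker_verifiable: bool      # Case 2: can present a government-issued token

def verify_parental_consent(parent: Parent) -> str:
    """Hypothetical sketch of the Rule 10 decision flow described above."""
    if parent.registered_with_fiduciary:
        # Case 1: the fiduciary checks identity details it already holds
        return "access granted"
    if parent.digilocker_verifiable:
        # Case 2: the fiduciary verifies via a DigiLocker-style token
        return "access granted"
    # Case 3: no verifiable identity, so the child is effectively locked out
    return "access denied"

# The exclusionary outcome for Case 3 families:
print(verify_parental_consent(Parent(False, False)))  # -> access denied
```

The sketch makes the structural point plain: every path to "access granted" runs through a digital credential, so families with neither registration nor documentation have no route through the flow at all.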
Right to Access vs Right to Privacy
While Rule 10 seeks to protect children's privacy, it inadvertently clashes with their right to access the internet, which has been recognised as a critical enabler of education and participation in today’s world. This exclusionary effect raises fundamental questions:
• Are we unintentionally penalising children for their parents' circumstances?
• Is the Rule, designed to protect privacy, inadvertently deepening social inequalities?
As a thought leader, I believe we need to shift our perspective. We need to move beyond a simplistic ‘either/or’ approach and embrace a more nuanced understanding of the digital landscape.
The internet is no longer a luxury; it's a necessity for education, communication and participation in modern society. Denying access is denying opportunity, and that has profound social and economic consequences.
Potential Solutions
• Alternative Verification Mechanisms: Move beyond a one-size-fits-all approach by introducing community-level verification through schools, local authorities, or non-governmental organisations (NGOs) for parents who lack digital access. Think of it as a digital bridge, connecting those who are currently excluded.
• Awareness and Capacity Building: Conduct digital literacy programmes in rural and underprivileged areas to familiarise parents with digital verification tools like DigiLocker. This is not just about teaching people how to use computers; it's about empowering them with the skills and knowledge to navigate the digital world safely and effectively. This requires sustained investment in education and infrastructure, particularly in underserved communities.
• Special Provisions for Marginalised Groups: Allow exceptions for families without access to digital infrastructure. Develop simplified offline mechanisms to verify identity. Technology should serve humanity, not the other way around. We need systems that incorporate human judgment and flexibility, allowing for exceptions and alternative approaches when necessary.
• Partnerships with Schools and Local Governments: Schools and local bodies can play an intermediary role in authenticating parental consent, ensuring inclusivity. This is not a problem that can be solved in isolation. We need a collaborative effort involving governments, tech companies, civil society organisations and, most importantly, the communities affected by these rules.
Legal and Ethical Dimensions
• Proportionality and Necessity: While safeguarding children's data is necessary, the measure must be proportionate. Excluding vulnerable children fails this test.
From a legal standpoint, the principle of proportionality is key: any restriction on a fundamental right must be proportionate to the aim it seeks to achieve. Here the aim (protecting children's privacy) is laudable, but the means (mandatory digital verification) may be disproportionate, as it creates a significant barrier for a large segment of the population.
Furthermore, the Rule raises serious concerns about equality and non-discrimination. By disproportionately impacting children from marginalised communities, it risks violating their right to equal protection under the law.
• Right to Education and Digital Access: Denying access to digital platforms may hinder a child’s education and development, especially in a digital-first era.
Conclusion
The intent behind Rule 10 is commendable—it seeks to protect children's privacy in an increasingly digital world. However, the Rule’s implementation raises serious concerns about accessibility and inclusivity. By mandating digital identity verification, it risks excluding the very children who most need access to online resources.
Rule 10 is not just about social media; it's a microcosm of a much larger challenge: How do we build a digital future that is inclusive and equitable for all? We must remember that technology is a tool and, like any tool, it can be used for good or for ill. It is up to us to ensure that it serves the interests of all of humanity, not just a privileged few.
We must strive to create a digital world where no child is left behind, where access to information and opportunity is not determined by their parents' circumstances. This requires not just technical solutions, but a fundamental shift in our thinking, a commitment to building a truly just and equitable digital society. This is not just a legal or technical challenge; it's a moral imperative.
The government must revisit this Rule to ensure it serves as a protective measure without becoming a barrier. As a society, we must strive to create digital policies that are not only robust but also equitable.
Let us ensure that no child is left behind in the digital age.
(This article first appeared on dpdpa.com, a site maintained by Adv Dr Mali)
(Advocate (Dr) Prashant Mali is an internationally renowned Cyber & Privacy Lawyer with a Master's in Computer Science and Law, and holds a Ph.D. in Cyberwarfare & International Cyberlaw. He is a sought-after expert who has represented Fortune 500 companies, celebrities, and governmental agencies. An author of six books and numerous research papers, one of his books serves as an official textbook in prestigious academic institutions. Beyond law, he is actively involved in charitable activities and cyber education initiatives to support underprivileged communities.)