India’s Data Protection Law Must Hold Platforms Accountable for Child Safety

In the ongoing debate about regulating age-restricted content online, universal truths often clash with regional challenges. This was evident in a recent webinar hosted by the Citizen Digital Foundation (CDF) titled “With Alice, Down the Rabbit Hole,” where digital rights advocates, policy experts, and parents discussed online child safety and age assurance in the context of India’s Digital Personal Data Protection Act (DPDPA).

CDF’s research indicates that “subversive and harmful content and threats from bad actors online pose a greater threat to children and vulnerable groups in India.” Panel moderator Nidhi Sudhan criticized the contradictory stance of tech CEOs on child safety, highlighting instances such as Meta’s research on teenage brain function to boost youth engagement on its platforms.

YouTube emerged as a significant concern, with CDF researcher Aditi Pillai identifying its engagement-driven algorithms as a key issue. However, journalist and parent Dhanya Krishnakumar pointed out the complexities of imposing age verification without causing additional harm to children. She underscored the need to balance safety with the risk that excessive protection leads to peer pressure and cyberbullying, advocating for open discussions across all age groups to enhance digital literacy.

Aparajita Bharti, co-founder of the Quantum Hub and Young Leaders for Active Citizenship (YLAC), stressed that “the Indian ecosystem needs to be evaluated differently from the West. Placing the entire responsibility on parents is ineffective, as many lack the knowledge and resources to ensure online safety.” She highlighted the importance of balancing Internet access, which provides economic opportunities, with ensuring children’s safety online.

India’s diverse landscape, with its 28 states, 8 union territories, and more than 1.4 billion people, makes consistent digital regulation challenging. Arnika Singh, co-founder of Social & Media Matters, noted that “India changes every 2 kilometers,” rendering one-size-fits-all policies inadequate for addressing the country’s rich socio-cultural and geographical diversity.

A significant point raised was the need for “techno-legal solutions that allow parental control over algorithmic behaviors on platforms,” alongside a realistic allocation of accountability for children’s online activities. Nivedita Krishnan, director of the law firm Pacta, cautioned that the DPDPA’s provision requiring verifiable parental consent for processing a child’s personal data could inadvertently make parents liable for any illegal activities their children engage in online. She argued that expecting parents to monitor all of their children’s online activities is unrealistic and burdensome.

Chitra Iyer, co-founder and CEO of consultancy Space2Grow, argued that parents are being stretched across an accountability gap that should be shared with the platforms delivering harmful content. She criticized the low level of accountability currently held by these platforms and called for better government regulation.

Arnika Singh emphasized the need for tech platforms to prioritize user safety over profit and for context-specific content moderation that reflects India’s cultural diversity. Ultimately, the panel concluded that the DPDPA needs stronger enforcement mechanisms, including specific audit and impact measurement processes, and should look to international models as benchmarks for improving India’s laws.

The full webinar is available on YouTube for those interested in a deeper dive into the discussion.
