A groundbreaking study by Childlight – Global Child Safety Institute at the University of Edinburgh has uncovered a disturbing truth: one in eight children across South Asia has endured rape or sexual assault during their formative years. The research, titled “Index on Global Child Sexual Exploitation and Abuse 2025,” was presented at c0c0n 2025, an annual cybersecurity conference hosted by the Kerala Police.
The study’s findings are stark: 14.5% of women and 11.5% of men in the region reported experiencing such abuse in their childhood. Childlight clarified that this data specifically represents India, Nepal, and Sri Lanka. “Even when considering population size, this translates to a staggering 54 million children across these three nations,” stated Deborah Fry, Childlight’s Global Director of Data and Professor of International Child Protection Research, emphasizing the profound scale of the issue.
Compounding the crisis, India, Pakistan, and Bangladesh together recorded nearly 4.5 million reports of Child Sexual Abuse Material (CSAM) in 2024. CSAM encompasses any visual content depicting the sexual abuse or exploitation of a minor, critically including computer-generated material. The report also highlighted a frightening global trend: a 1,325% surge in malicious AI-generated online abuse content between 2023 and 2024.
Maldives Grapples with Highest CSAM Rate
Within South Asia, the Maldives has the highest per capita rate of CSAM, with 94 reports for every 10,000 people. India records the largest overall volume of CSAM, but its vast population means it has the lowest reported rate in the region, at 15.5 reports per 10,000 people.
Childlight issued a grave warning about a potential explosion of AI-generated CSAM in the near future. Prof. Fry noted, “India is poised to see a significant rise in AI-generated CSAM, making it imperative to enact future-proof legislation capable of addressing these novel and evolving forms of harm. This stands as a crucial recommendation within the report.”
The study urgently calls for sustained efforts to detect and remove CSAM, with a paramount focus on identifying victims and providing essential safeguarding support.