Combating online child sexual violence with the help of AI 

Tech Ireland

In September 2021, the End Violence Partnership and the Technology Coalition awarded grants to five organisations to tackle online child sexual exploitation and abuse (CSEA). Through these grants, organisations across the world will undertake groundbreaking research to inform both policy and practice. One of those organisations is Technological University Dublin, which is implementing N-Light, an innovative application to uncover patterns of online child sexual exploitation through analysis of national helpline and hotline caseloads, from November 2021 to May 2023. 

Through this project, Technological University Dublin will develop a deployable tool that reveals the patterns of adults perpetrating online child sexual abuse and of the children who are affected by such violence. By using advanced machine learning techniques for text, the study will advance global understanding of trends in perpetrator behaviour (conduct, contact, content) – including grooming – and expose the strategies and tactics used to lure and coerce children into sexually exploitative acts.  

N-Light will be created in collaboration with two essential partner organisations: the Irish Society for the Prevention of Cruelty to Children (ISPCC) and Hotline.ie, the Irish national centre combating illegal content online, specifically child sexual abuse material (CSAM) and activities relating to online child sexual exploitation (OCSE). Once finalised, N-Light will be tested by both partner organisations, with the intention of making it available to other hotlines in the INHOPE network and to child agencies, which would in turn lead to an enriched, more robust and representative data sample and greater analysis capacity. In addition, the data and insights will serve to better understand and conceptualise victim and perpetrator behaviour, patterns and activity, ultimately informing the further development of evidence-based solutions with the potential for transformative impact in tackling this heinous crime against children. 

End Violence spoke with Dr. Susan McKeever and Dr. Christina Thorpe about the N-Light project and the impact it will have on children across the world. 

Your project seeks to develop a tool to identify patterns of behaviour among perpetrators of child sexual exploitation and those they are targeting. Why is identifying these patterns important to preventing – and ending – child sexual abuse online?  

Online child sexual abuse involves real children, real abuse, real suffering, and real repeat-victimisation. For child sexual abuse material (CSAM) to exist, a crime must have been committed in real life; a child has been sexually exploited and often, actually raped. When imagery is uploaded to the internet it becomes documented evidence of the crime and a permanent record of the child’s abuse. The child is re-victimised and exploited ad infinitum while images remain online.  

For child sexual abuse material (CSAM) to exist, a crime must have been committed in real life. 

Dr. Susan McKeever and Dr. Christina Thorpe, The N-Light Project

There is now an increasing sense of urgency for a paradigm shift, as effective prevention and protection require a more holistic approach. Hotline.ie has witnessed perpetrators becoming increasingly tech-savvy, going to new lengths year on year to evade detection. The distribution of online child sexual abuse has expanded across all available digital media: perpetrators do not abandon traditional means, but they also exploit the latest technologies and the latest fora for communication.  

Some online services are misused as meeting environments, where individuals converse casually about the sexual abuse and exploitation of defenceless children as if they were a commodity; others serve as end destinations, facilitating access to repositories of child sexual abuse material. 

The depth of abuse depicted in these repositories is immense. Images and videos depict children of all ages, nationalities and genders being subjected to sexual abuse by men and women, or being forced to act out sexual activity while being filmed and photographed. Given the complexities and scale of the abuse, only by identifying the patterns of human behaviour involved can the roots of the issue be targeted. The ultimate goal of this project is to challenge the elements in society that are generating and enabling the abuse, rather than dealing primarily with its aftermath. N-Light will contribute actionable intelligence that could be used to reframe and amplify our collective response to tackling child sexual abuse and exploitation. 

How will your artificial intelligence tool work? 

We will utilise AI to examine text-based material and communications identified during the assessment of suspected child sexual abuse and exploitation reports – exchanges that may occur in forums or via web chats – aiming to uncover patterns of how children are being exploited or groomed through online engagement. Machine learning techniques for text are now highly sophisticated at discovering patterns in a general sense. However, using real or representative data is a key part of any AI project that exploits machine learning. 

Our partners on N-Light, Hotline.ie and ISPCC, will provide anonymised data to the project. This will give us the ability to apply machine learning and text-mining techniques to real examples to generate insights. The anonymised data will include, for example, text data captured from forums and anonymised web chats/texts from our partner child agency. 

First, we will undertake a data exploration exercise to finalise the types of insights we will aim to determine using our AI tool. Subsequently, we will determine the underlying AI tasks required to deliver these insights. The labelling of the data will be conducted in cooperation with our project partners, drawing on their domain expertise. 

Working backwards from each task, we will undertake an experimental phase, applying candidate techniques, machine learning algorithms and language models to fulfil each task. This experimental phase will result in models, which we will wrap with a user interface layer to make them accessible and deployable to non-expert users. This will bring our research beyond the experimental results phase to a working pilot tool that can be tested. 
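As an illustration only, the sketch below shows the general shape such an experiment could take: a simple baseline text classifier built with scikit-learn. The example texts, labels and task definition are hypothetical placeholders, not the project's data, categories or code.

```python
# Illustrative sketch only: a baseline text-classification experiment of the kind
# described above, built with scikit-learn. All texts and labels are hypothetical
# placeholders, not project data or the project's actual task labels.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Hypothetical anonymised, expert-labelled examples.
texts = [
    "placeholder message 1", "placeholder message 2", "placeholder message 3",
    "placeholder message 4", "placeholder message 5", "placeholder message 6",
]
labels = ["benign", "flagged", "benign", "flagged", "benign", "flagged"]

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.33, random_state=42
)

# A TF-IDF + logistic regression baseline; stronger language models can be
# benchmarked against this in later experiments.
baseline = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
baseline.fit(X_train, y_train)
print(classification_report(y_test, baseline.predict(X_test), zero_division=0))
```

A model of this kind could later sit behind a simple interface so that non-expert analysts interact with predictions rather than code, which is the spirit of the pilot tool described above.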

Another key part of your project is producing actionable insights that lead to policy change. Can you explain this objective further, and how you hope it will change children’s safety online? 

Transformative impact can be achieved with approaches that are adequate in scope, targeting and evaluation. Technology-facilitated abuse manifests differently across the spectrum and is ever-evolving, as are online attitudes and behaviours. Child sexual abuse and exploitation online needs to be understood and addressed by looking at the different parts of the whole, and at the whole itself. 

Prior to N-Light, there was no mechanism to perform joint pattern analysis of online child abuse behaviour and perpetrator activities. By bridging this gap, our findings will inform future policy developments, hopefully across different targets: known abusers and those with predatory or risky behavioural tendencies; situations in which sexual abuse and exploitation can occur; children and young people; and communities. We are hopeful that the data and insights available through N-Light may also serve to identify and debunk common misconceptions about child sexual exploitation and abuse online, while better informing the development of deterrence, educational and awareness campaigns and programmes; training programmes across multi-stakeholder groups; communication strategies, inclusive of appropriate messaging for vulnerable or at-risk groups; and more accurate assessments of the level of risk and harm, allowing for signposting to adequate supports. 

There is great potential to identify gaps in policy, emerging or evolving threats, technologies being misused for nefarious purposes, and so on, which could better equip front-line responders, such as hotlines, helplines and children's rights agencies, in their work.  

While our objectives are ambitious, we remain hopeful that during the lifespan of the project, N-Light will provide valuable insights and shed light on a very complex issue. This would enable us to contribute to the further development of policy and child protection practices. We have well-established public relations, media and social media channels, and we intend to use these to communicate N-Light's findings, developments, goals and impact to stakeholders and the wider public. 

How do you hope this tool will strengthen hotline and front-line child services in Ireland and across the world? What existing work are you building on with this research and what impact will it have on other sectors? 

N-Light, our proof-of-concept tool, will employ state-of-the-art machine learning analysis of textual data and will be deployed for use by our national children's helpline (ISPCC Childline) and the national centre combating illegal content online (Hotline.ie) in Ireland. It will strengthen our front-line child services by enabling the discovery of trends and patterns in important detection and prevention information, better equipping organisations to enhance their bespoke service delivery and to inform the development of awareness and prevention solutions, campaigns and programmes. Prevention work, while difficult, remains one area where more can be done. In an ideal world, preventing child sexual abuse is always preferable to responding to it. 

In an ideal world, preventing child sexual abuse is always preferable to responding to it. 

Dr. Susan McKeever and Dr. Christina Thorpe, The N-Light Project

In our current TU Dublin-based research projects on automated content analysis, we apply state-of-the-art text mining, language modelling and machine learning techniques to automatically detect abusive content. With N-Light, we will apply these techniques and fine-tune them to the partner data (report logs and perpetrator discussions/forum entries, twinned with child victim reports). 
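As a rough, hedged sketch of what fine-tuning a language model to labelled text could look like in practice – assuming the open-source Hugging Face transformers and datasets libraries, a public bert-base-uncased checkpoint, and placeholder labelled examples rather than any real partner data – it might resemble the following. This is not the N-Light implementation.

```python
# Illustrative sketch only (not N-Light code): fine-tuning a pretrained transformer
# text classifier with Hugging Face `transformers` and `datasets`. The checkpoint,
# labels and example texts are assumptions for illustration.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "bert-base-uncased"  # assumed publicly available checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical anonymised, labelled examples; label 1 = flagged, 0 = benign.
data = Dataset.from_dict({
    "text": ["placeholder message a", "placeholder message b"],
    "label": [0, 1],
})
data = data.map(
    lambda ex: tokenizer(ex["text"], truncation=True, padding="max_length", max_length=64)
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="out",
        num_train_epochs=1,
        per_device_train_batch_size=2,
        report_to="none",
    ),
    train_dataset=data,
)
trainer.train()
```

In a real setting, the fine-tuning data, label scheme and evaluation protocol would of course be defined with the partner organisations and their domain experts.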

Importantly, N-Light will be designed to be agile and scalable, so that it can potentially be implemented by similar associated services across borders (hotlines, helplines), providing an even greater opportunity to harness a wealth of data, which can in turn yield robust and representative patterns, trends and findings. The project will employ forward-engineering design principles to ensure its suitability for other agencies. The initial findings from using N-Light on Irish data will inform education and outreach strategies – including more informed early intervention and prevention strategies and resources. To maximise this, the project will work towards the development of key-findings briefings for education, policy and law enforcement stakeholders. 

How will this research translate to impacts for the tech industry? What are the impacts for other sectors? 

Technology-facilitated child sexual abuse and exploitation is a matter of public interest and policy where cybercrime, legislation, corporate social responsibility, education and child protection intersect. For the tech sector, we will publish our findings on the use of AI techniques as they specifically relate to child sexual abuse and exploitation. We believe that our findings will enable other computer science researchers working to end online CSEA, and those in related spaces, to build on them. 

When we think of the tech industry and the digital world, we immediately think of household names. But there are myriad online service providers and tech companies, from start-ups to small and medium-sized companies, that do not have the resources to invest in research and develop tools to tackle these crimes in the same way the household names do. Our project and findings could help them develop or improve their online child protection policies and/or build on our findings. Everyone in society has a duty of care towards child protection and a role to play. 

Everyone in society has a duty of care towards child protection and a role to play. 

Dr. Susan McKeever and Dr. Christina Thorpe, The N-Light Project

Large technology corporations contribute significantly to the AI research space, publishing their own state-of-the-art AI findings for use by other researchers. One example is BERT (Bidirectional Encoder Representations from Transformers), the language model published by Google. Our work will contribute in this space through the application of state-of-the-art techniques to the specifics of online child sexual abuse and exploitation. This use case is particularly pertinent to any of the large tech companies hosting user-generated content on their platforms. 
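To illustrate how such published models are reused by other researchers, a checkpoint like Google's bert-base-uncased can be loaded in a few lines with the open-source transformers library and used to turn text into vector representations (the sentence below is a placeholder; this is not project code).

```python
# Brief illustration of reusing a publicly released language model (BERT) to
# produce text embeddings that downstream classifiers or clustering could build on.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("a placeholder sentence", return_tensors="pt")
with torch.no_grad():
    outputs = encoder(**inputs)

# Mean-pool the token embeddings into a single sentence vector.
sentence_vector = outputs.last_hidden_state.mean(dim=1)
print(sentence_vector.shape)  # torch.Size([1, 768])
```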

We also aim to make N-Light, our proof-of-concept tool, available to other researchers and interested eligible parties. This could enable future development of the tool, either commercially or on a not-for-profit basis. There is also scope for N-Light to provide intelligence to law enforcement and the education sector. 

What do you think needs to happen to end online child sexual exploitation and abuse for good? 

Transformative impact can be achieved with approaches that are adequate in scope, targeting and evaluation. There is no one-size-fits-all solution, and whilst technological responses to online child sexual abuse and exploitation continue to be important, technology is not a silver bullet. Furthermore, there is a gap between those who are able to implement technological solutions and those who are responsible for the fora (i.e. websites) misused by child sexual abuse perpetrators. 

Effective online safety and child protection strategies require a careful balance of public and private, legal and voluntary measures at various levels, grounded in conceptual and operational clarity and in shared responsibility between relevant stakeholder groups. Developing and building knowledge through research is a key component. Efforts to disrupt the production, distribution and "consumption" of online CSEA should be framed with due account of the intricacies of the internet tapestry and of the fundamental differences that exist between the various players, such as divergent practices, functions, and layering of services and products. 

Tech companies must continue to challenge, improve and innovate with a children's rights and safety-centred approach. However, designing new technologies, improving existing ones and adopting new tools to combat online CSEA require a regulatory framework that facilitates sharing and cooperation. 

We believe that a greater focus on prevention research, initiatives, campaigns and the development of early intervention programmes can make a tangible long-term difference. 

We believe that a greater focus on prevention research, initiatives, campaigns and the development of early intervention programmes can make a tangible long-term difference. Experts across the board, and across the globe, have been highlighting the need for a different approach to prevention. Additionally, with the year-on-year increase in self-generated sexually explicit content involving children, greater education, awareness and understanding are imperative to reduce the level of risk and to prevent and protect children from becoming victims of sexual crime and exploitation. Last but not least, investment in child victim support services is needed to provide survivors of online CSEA with the appropriate supports on their healing journey.