2021 Tech Coalition Safe Online Research Fund
The Safe Online Initiative of End Violence and the Tech Coalition launched the Tech Coalition Safe Online Research Fund to tackle online child sexual exploitation and abuse (CSEA). In 2021, we selected five winners, who were awarded grants of between $120,000 and $200,000 for research that will expand knowledge of online CSEA and explore the most effective measures for preventing it.
This Open Call was focused on innovative research that can impact relevant policy and product development, with a priority given to research that can help inform the technology industry's approach to combating online CSEA.
We are currently accepting Expressions of Interest for the 2022 Tech Coalition Safe Online Research funding round. Please see the details to apply!
The Tech Coalition Safe Online Research Fund aims to advance the world’s understanding of online child sexual exploitation and abuse (CSEA). US$1 million was invested in this research fund, with $120,000 to $200,000 allocated to each of the five awarded grantees. The research projects aim to deliver actionable insights that contribute to practitioner understanding of online CSEA, with a priority on technology industry practice across four specific areas. The objective of the 2021 Open Call was to invest in research that increases our understanding, and supports practical application, in the following areas:
- Efficacy and impact of online CSEA deterrence and prevention interventions, including an understanding of offender pathways and children’s digital experiences. This includes increasing the evidence base for effective, actionable industry reporting to and investigation by the appropriate authorities; impactful education and outreach strategies; and strengthening the capacity of all stakeholders in preventing, disrupting and responding to online CSEA.
- Evolving technology may present new threats or exacerbate existing ones by changing offender or children’s behaviour, or by creating new challenges and risks. Research could offer solutions that respond to the implications of evolving technology, for example abuse detection in encrypted environments or real-time reporting in live streams. This area of focus also includes the potential solutions and opportunities offered by technology developments for risk mitigation, and better tools and strategies around behavioural interventions.
- Well-being, collaboration and support for survivors, improving well-being and resilience for staff such as content moderators, and enhancing collaboration among all sectors working to end online CSEA.
- Policy and legal considerations are critical to shaping online CSEA deterrence and response. In this area, research could include assessment of policy implementation over time; upstream interventions such as public health or safety-by-design approaches; age-appropriate design and experiences; and understanding how privacy and children’s rights intersect in the fight against CSEA.
End Violence Safe Online and the Tech Coalition invited not-for-profit organisations (i.e. research institutes and academic institutions, civil society organisations (CSOs), non-governmental organisations (NGOs), and international organisations) to respond to the first Tech Coalition Safe Online Research Fund Request for Proposals (RfP).
Consortia were also encouraged to apply; however, the organisation submitting the application was considered the main grantee and had to be a not-for-profit organisation (i.e. research institutes and academic institutions, CSOs, NGOs, and international organisations), bearing all contractual responsibilities vis-à-vis End Violence.
Only entities that fulfil these mandatory requirements were considered eligible:
- the applicant is a legally registered not-for-profit organisation
- the research addresses one or more of the four areas of the 2021 Open Call
- all funding will be directed to an institution, not to individuals
- the research project lead may submit one proposal
- at minimum, the applicant organisation should have demonstrated relevant prior expertise and/or research experience
- the organisation has a safeguarding policy in place or is willing to develop a policy *
Understanding Traffickers and Pathways to Offending: Analysis and recommendations to better detect, deter, and prevent Online Sexual Exploitation of Children in the Philippines
Justice and Care is a United Kingdom-based non-governmental organisation that works with law enforcement officers to rescue victims of human trafficking, protect at-risk communities, and dismantle criminal networks. With support from the Tech Coalition Safe Online Research Fund and in partnership with International Justice Mission (IJM), Dublin City University and De La Salle University in the Philippines, Justice and Care is launching a study to help the world better understand online sexual exploitation of children in the Philippines – and as a result, provide analysis and recommendations to better detect, deter and prevent this type of violence.
To do so, Justice and Care will explore the profiles of those who perpetrate and facilitate online sexual exploitation of children, interviewing convicted offenders, key informants, and others. This analysis will fill a gap in global research into online sexual exploitation of children and shed light on the “supply side” of such violence in a country known to be an epicenter of live-streamed child sexual abuse. Ultimately, this research will seek to inform practical strategies and enhance industry, prevention and law enforcement responses to the issue.
Learn more about the project in this interview: protecting children in the Philippines from online sexual abuse and exploitation
Invisible risks: Combating secondary trauma to safeguard children
We know that online child sexual abuse material is highly damaging to children. But today, little primary research exists about the impact such material has on content moderators – individuals who are charged with constantly surveilling and removing traumatic images and videos of child sexual abuse.
Through this project, researchers at Middlesex University, in collaboration with INHOPE and other sector-specific organisations, will explore and quantify the issues facing content moderators, specifically as they relate to moderators’ exposure to traumatic child sexual abuse material. They will also identify coping strategies currently used by content moderators, and highlight what works – and what does not – for the individuals and organisations that do this work. Results of this study will be used to develop a pilot intervention to support and protect the mental health of content moderators.
Learn more about the project in this interview: Invisible Risks: content moderators and the trauma of child sexual abuse materials
N-Light: An innovative application to uncover patterns of online child sexual exploitation through national helpline and hotline analysis of caseloads
Through this project, Technological University Dublin will develop a deployable tool that reveals the patterns of adults perpetrating online child sexual abuse and the children who are affected by such violence. By applying machine learning to text, the study will advance global understanding of trends in perpetrator behaviour (conduct, contact, content) – including grooming – and expose the strategies and tactics used to lure and coerce children into sexually exploitative acts.
N-Light will be created in collaboration with two essential partner organisations: the Irish Society for the Prevention of Cruelty to Children (ISPCC) and Hotline.ie, the Irish national centre combating illegal content online, specifically child sexual abuse material (CSAM) and activities relating to online child sexual exploitation (OCSE). Once finalized, N-Light will be tested by both partner organisations, with the intention of making it available to other hotlines in the INHOPE network and to child agencies, which would in turn lead to an enriched, more robust and representative data sample and greater analysis capacity. In addition, the data and insights will serve to better understand and conceptualise victim and perpetrator behaviour, patterns and activity, ultimately informing the further development of evidence-based solutions with the potential for transformative impact in tackling this heinous crime against children.
Learn more about the project in this interview: Combating online child sexual violence with the help of AI
Prevention of online child sexual exploitation and abuse in Latin America and evaluation of mitigation strategies with Artificial Intelligence
Through support from the Tech Coalition Safe Online Research Fund, Universidad de los Andes in partnership with Programa Aulas en Paz will use artificial intelligence strategies and tools to study the language and patterns of interactions between potential and current offenders of online child sexual exploitation and abuse and current or potential victims of such violence. At the same time, researchers will explore strategies designed to mitigate such abuse, such as parental mediation and industry-created and deployed protection tools.
As a result, the study’s findings will be used to develop artificial intelligence tools to analyse interaction patterns between aggressors and victims of online sexual exploitation and abuse. These tools, which will be designed for adaptability across contexts, will then be channeled to law enforcement agencies in the region. The project will ensure its outputs have applications for families, industry, and governments seeking to better protect children online, including and especially organisations that process information related to online child sexual abuse.
Learn more about the project in this interview: Using AI tools to end online child sexual abuse and exploitation in Latin America
Understanding and improving help-seeking by people at risk of perpetrating online child sexual exploitation and abuse
The Centre of Research and Education in Forensic Psychology at the University of Kent is a group of leading psychologists working on offending behaviour, including sexual exploitation and abuse. In partnership with the Lucy Faithfull Foundation and researchers based in South Africa, Mexico, the US and UK, this project will shed light on the psychological processes through which people at risk of perpetrating online sexual exploitation and abuse may instead seek professional support. In addition, the group will explore the efficacy and impact of prevention interventions targeting people engaging in online abuse. Overall, the project will ask a fundamental – and often overlooked – question: who seeks help for child sexual exploitation and abuse, and can we get more people to do so before committing a crime?
This project will expand the group’s existing model of psychological predictors of help-seeking for people at risk of offending, and examine how to amplify the psychological factors that support such help-seeking behaviours. At the same time, the project will look into the psychological barriers that prevent help-seeking, and explore ways to weaken those barriers in the digital sphere.
Learn more about the project in this interview: Changing minds: interventions to prevent online child sexual exploitation and abuse
The Tech Coalition and End Violence Safe Online set up an Advisory Group made up of representatives from key organisations and alliances focused on online CSEA, independent experts, and tech industry representatives. The Advisory Group played a central role in shaping the RfP’s selection criteria, dissemination strategies and knowledge-exchange activities to amplify the impact of the research.
Key roles of the Advisory Group included:
- Designing and determining the scope of the RfPs
- Establishing adequate, transparent scoring and assessment frameworks to evaluate funding proposals
- Supporting with RfP outreach and promotion
- Reviewing the most viable proposals and rating them through the agreed-upon framework
- Deciding on finalists and grantees of the Tech Coalition Safe Online Research Fund
Members of the 2021 Tech Coalition Safe Online Research Fund Advisory Group included: