The world is grappling with simultaneous crises – COVID-19, climate change, multiple conflicts, food shortages and the highest rates of inflation in decades. Many of these crises are interrelated, and all of them put children at greater risk of violence. Yet despite these global challenges, change is happening. All around us – across countries and sectors – individuals and organisations are fighting to ensure safe, secure and nurturing childhoods for children. End Violence is catalysing change with and through its diverse and powerful community and partners – from grassroots to global – whose dedicated efforts are driving progress and delivering impact. Explore the #ActionToEndViolence from around the world.
Every half a second, a child goes online for the first time. Since the COVID-19 pandemic in particular, all forms of online activity have increased sharply. As children’s digital experiences grow, so do the threats and dangers they face online – including online child sexual exploitation and abuse (CSEA), the fastest-growing form of violence against children.
Investing for change
In 2021, End Violence teamed up with the Tech Coalition to expand knowledge of online child sexual exploitation and abuse and deliver actionable insights to prevent it. The Tech Coalition is a group of 18 tech companies – including Google, Microsoft, Facebook, Apple and Twitter – and its collaboration with the End Violence Partnership is part of the work to unite the global tech industry to protect children from online CSEA. Through the Safe Online initiative, the Tech Coalition Safe Online Research Fund awarded five grants of between $120,000 and $200,000 in 2021 for research that will expand knowledge of online CSEA and explore the most effective measures for preventing it. A new round of investments is already underway, with finalist applicants under review for the next window of ~US$ 800,000 in grants.
Supporting those working behind the scenes
Last year’s grantees are making strides towards deepening our understanding of the issue and strengthening our response. End Violence grantee Middlesex University is doing this through research to better support those working behind the scenes to make the internet safer. Witnessing violence against children is psychologically harmful, yet individuals such as content moderators are charged with constantly reviewing and removing traumatic images and videos of child sexual abuse – known as child sexual abuse material (CSAM) – to support the reporting, monitoring and elimination of this grave problem.
Today, little primary research exists on the impact such material has on content moderators. Middlesex University, in collaboration with INHOPE, is working on the "Invisible risks: Combating secondary trauma to safeguard children" project to explore and quantify the issues facing content moderators, identify coping strategies and provide insights into what works – and what does not – for the individuals and organisations doing this work. The results of this study will be used to develop a pilot intervention to support and protect the mental health of content moderators.
We asked those involved in the project about insights from their crucial research. Here’s what they are learning:
- Content moderators are impacted in many ways. Content moderators experience a range of psychological effects from the job, although how distressing they find these effects varies. They include insomnia, intrusive thoughts, enduring anger, increased distrust and panic attacks; the work can even induce anxiety around children and difficulties with relationships more broadly.
- Given the impacts of the job, content moderators have developed strategies to cope. Moderators tend to be coping reasonably well and have developed a range of strategies including: talking to colleagues, creating boundaries to separate work/home life, consciously developing a good work-life balance, engaging in physical exercise, and viewing the material more analytically.
- Moderating content for CSAM is a critical job – and moderators must be supported in the workplace. It is crucial that managers – including professionals employed by wellbeing services – have experience in assessing and dealing with CSAM and understand the role of content moderators within the organisation. Content moderator job satisfaction is higher when there is clear internal communication, regular feedback from management and opportunities for employee development.
Middlesex University co-created a Theory of Change model for content moderators, identifying the resources and activities needed to effectively support content moderators, along with the corresponding outputs, outcomes and impacts. This model has been shared with experts from eight companies across regions, including Europe, India and South America. With support from End Violence, this and other crucial projects are creating impact to protect children from harm online.
These investments are part of Safe Online, the End Violence Partnership’s global effort to invest financial resources in creating a safer and more secure digital world for children. Since our inception in 2016, the Safe Online portfolio has grown to US$ 68 million invested in 80 projects working to end and prevent online CSEA in over 75 countries.
Action to #ENDviolence
Despite the global challenges society faces today, positive change for children is taking place across countries and sectors – driven by governments, individuals and organisations fighting to ensure safe, secure and nurturing childhoods for girls and boys. As part of the Together to #ENDviolence campaign, we are placing a spotlight on these dedicated efforts that are delivering impact.
Image: © UNICEF/UN0597175/Palomino