Invisible risks: Combating secondary trauma to safeguard children

We know that online child sexual abuse material is highly damaging to children. But today, little primary research exists on the impact such material has on content moderators – individuals tasked with constantly reviewing and removing traumatic images and videos of child sexual abuse.

Through this project, researchers at Middlesex University, in collaboration with INHOPE and other sector-specific organisations, will explore and quantify the issues facing content moderators, specifically as they relate to exposure to traumatic child sexual abuse material. They will also identify coping strategies currently used by content moderators, and highlight what works – and what does not – for the individuals and organisations doing this work. Results of this study will be used to develop a pilot intervention to support and protect the mental health of content moderators.

Learn more about the Tech Coalition Safe Online Research Fund and the Advisory Group of experts who have collaborated to guide this work.