TITLE: Moderating illegal content online and staff welfare – Where do the responsibilities lie?
The price others pay for digital dumping – Aspects of online child protection. Huge, ultimately unknowable volumes of unambiguously illegal or profoundly harmful material are circulating on the Internet. Child sexual abuse material (CSAM) and terrorist propaganda are the types of content most frequently mentioned in this context, but there are several others.
There has been an entirely proper focus on the supply chains companies use to manufacture or deliver their products or services. Typically, these initiatives have been designed to eliminate child labour, slavery or environmental harms. Isn't it time internet businesses and institutions were pressed to do something for those who daily have to face the unfaceable on our behalf?
Already we are aware of at least one case being brought in a US court by ex-moderators who claim their former employer did not do enough to shield them from Post-Traumatic Stress Disorder. Whatever the eventual outcome of that case, it seems a portent of other actions that could follow. However, our aim should be to avert such suits altogether by insisting on high standards.
INHOPE co-ordinates a global network of hotlines that receive reports of CSAM. To be a member of INHOPE, a hotline has to sign up to several commitments on staff welfare, e.g. access to counsellors and to safe and secure working spaces.
Any company, institution or platform employing moderators, whether in-house or via third parties, should sign up to standards similar to those that organisations like INHOPE and the IWF apply to their own analysts, who review CSAM all day long. There should also be a mechanism to reassure the public that those terms are being honoured in practice, not just in theory.
Speakers: