The issue of content moderation has been simmering beneath the surface of bodies politic throughout the Covid-19 pandemic and appears no closer to resolution. This is relevant for operators as well as platform providers: because so much of the content to be moderated is consumed on mobile devices, they will be involved at least indirectly as implementers of impending legislation and technology. The debate has been driven by concerns over content of various types, especially harassment, bullying, child abuse, pornography, misinformation, self-harm and hate speech, and also touches on the combating of crime and terrorism. The perceived urgency led to a great deal of legislation being cobbled together hurriedly, only to be followed by a pushback as impact on…