WITNESS to attend content moderation conference at EU Parliament
Like many governing bodies, the European Parliament is looking closely at how tech companies and platforms develop the rules and policies that govern content moderation: what content might violate their terms of service, what content should be removed, which users should be blocked, and so on.
Content moderation is a major focus area of our Tech Advocacy program. These policies often adversely affect human rights content, as our Program Manager for Tech Advocacy, Dia Kayyali, has written about.
On Tuesday, February 5, 2019, Dia will speak at the “Content Moderation & Removal at Scale” conference being held at the European Parliament in Brussels. The conference will explore how Internet companies develop and implement internal rules and policies for content moderation. What challenges do they currently face in moderating or removing illegal and controversial content, including hate speech, terrorist content, disinformation, and copyright-infringing material? And how could or should future European regulations affect these practices?
Dia will be sharing a response to the panel “Illegal content: terrorist content and hate speech.” Dia recently spearheaded an effort by WITNESS to bring together 26 human rights defenders, journalists, archivists, digital rights organizations, and alternative media outlets to tell members of the European Parliament that a proposed regulation to erase extremist content online would erase human rights content too. Read the open letter here.