A group of Oxford and Stanford researchers (Timothy Garton Ash, Robert Gorwa, and Danaë Metaxa) released a report Thursday, via Oxford and Stanford, outlining nine ways Facebook could make itself a "better forum for free speech and democracy."
Part of the report covers the amends Facebook has already attempted, such as broader transparency with academics and policymakers and the introduction of content appeal processes, but it also points to the impact (and problems) that can arise from self-regulatory actions taken in place of external policies. The recommendations aren't limited to Facebook itself; they also cover Instagram and WhatsApp, two growing platforms owned by the mothership.
The biggest takeaway, the authors argue, is that Facebook could make immense progress by being less power hungry: "Ideally, the user interface and experience on Facebook should be designed to promote active, informed citizenship, and not merely clickbait addiction for the commercial benefit of Facebook, the corporation. That would be change indeed."
Facebook therefore should:
- establish clearer definitions of hate speech,
- hire "more and culturally expert content reviewers,"
- share more details — maybe even case studies — on its decision-making process for accounts like Infowars,
- expand and improve the appeals process,
- go further with sliders users can set for different types of political content, news, and the level of News Feed curation overall,
- invest in fact-checking partnerships in some of the 76 countries where the platform operates but has no fact-checking partner,
- implement regular, meaningful auditing mechanisms,
- hold more productive meetings with an external content policy group made up of individuals from "affected communities," and
- improve its engagement with an external appeals group, since so far "all the major questions remain unanswered."