Content Moderation

Democratizing content moderation decisions among social media users

Client: Meta

My Role: Evaluation Lead

Employer: BIT Americas

Challenge | How might users feed into content moderation policies about misleading information?

Social media platforms struggle to regulate content containing mis- and disinformation. This becomes even more challenging for content that is problematic but not verifiably false, e.g. 'if the earth is warming up, why is it -2° today?'. Working with Meta's governance team, my team at BIT designed a pilot series of forums ('deliberative assemblies') to investigate the effectiveness and feasibility of involving social media users in making decisions about content moderation practices.

Approach | We designed deliberative assemblies of users to democratize decision making, but we needed to know how well they worked

While my team designed and ran three forums (two in the US and one global), I owned the nebulous task of evaluation. What could success look like for something that had never been done before? And how would we know if we had accomplished it?

To illustrate the challenge of designing an evaluation for this topic, consider: in a deliberative forum, the hope is that people engage in good faith, with open minds. The forum should therefore produce some shifts in opinion over multiple rounds of facilitated discussion. However, the forums were designed to generate debate and build consensus, not to persuade in any particular direction, so it could not be the goal for everyone to change their mind. On the other hand, if no one changed their mind at all, the forums would not be viable as a democratic decision-making tool.
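To make that success criterion concrete, the sketch below (Python, with hypothetical data and column names; not the project's actual analysis code) shows one way to summarize opinion movement so that 'some change, but no net push in either direction' becomes checkable:

```python
# Illustrative sketch only: hypothetical pre/post Likert responses per
# participant, summarized so movement and net direction are kept separate.
import pandas as pd

def summarize_opinion_shift(df: pd.DataFrame) -> dict:
    """df: one row per participant, with hypothetical columns
    'stance_pre' and 'stance_post' on a 1-5 Likert scale."""
    shift = df["stance_post"] - df["stance_pre"]
    return {
        "pct_changed": float((shift != 0).mean()),    # any movement at all
        "mean_shift": float(shift.mean()),            # net direction (ideally near 0)
        "mean_abs_shift": float(shift.abs().mean()),  # magnitude of movement
    }

# Example: some minds changed, in both directions, with little net push.
demo = pd.DataFrame({"stance_pre": [2, 4, 3, 5, 1],
                     "stance_post": [3, 3, 3, 5, 2]})
print(summarize_opinion_shift(demo))
# {'pct_changed': 0.6, 'mean_shift': 0.2, 'mean_abs_shift': 0.6}
```

Under this framing, a healthy forum shows a clearly nonzero 'pct_changed' alongside a 'mean_shift' close to zero.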

Design | I designed an evaluation framework based on validated discourse quality indices, customized for our context

The evaluation was an ambitious mixed-methods endeavor, including survey responses, observations, interviews, and text analysis. In addition to mapping out the methodology, I designed the guides for our interviews and observations (and conducted several of them), and led the qualitative thematic analyses across data sources.

Why so many data sources?

Anticipating inconsistencies between what participants said about their experience and how they actually behaved in the deliberative forums, I insisted on an approach that captured both how participants felt (reflected in interviews and survey responses) and what they had said and done (reflected in observations and the voting record at the end of each forum).

For example, in interviews participants tended to say they had enjoyed the forums but that their views on moderating problematic content had not changed. However, in several cases I saw these same participants demonstrating discourse quality indicators such as 'force of better arguments': openness to being persuaded when presented with stronger evidence or reasoning than one's own.
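That triangulation logic can be illustrated with a minimal sketch. Everything below is hypothetical (the participant IDs, column names, and code labels are stand-ins for the real coding scheme); the point is joining self-reports to coded observations and flagging the mismatches:

```python
# Hedged illustration of triangulating self-report against observed behavior.
import pandas as pd

surveys = pd.DataFrame({
    "participant": ["p1", "p2", "p3"],
    "self_reported_change": [False, False, True],
})
observations = pd.DataFrame({
    "participant": ["p1", "p2", "p3"],
    "codes": [["force_of_better_arguments"], [], ["respectful_listening"]],
})

merged = surveys.merge(observations, on="participant")
merged["inconsistent"] = (
    ~merged["self_reported_change"]
    & merged["codes"].apply(lambda c: "force_of_better_arguments" in c)
)
# p1: reported no opinion change, yet was coded as yielding to a better argument
print(merged[merged["inconsistent"]])
```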

Results | We found evidence of discourse quality and set up a framework for future deliberative forums

The evaluation was a major undertaking, and ultimately a success: qualitative evidence of indicators such as 'force of better arguments,' combined with quantitative pre-post data from opinion surveys, gave our client the proof of concept they needed to continue advocating for democratized decision making. It also set the direction for further research, which is currently ongoing.
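As one illustration of how pre-post survey data can support that proof of concept (an assumed approach with made-up numbers, not the statistics from the actual report), a simple binomial test asks whether more participants shifted their stance than a nominal baseline would predict, without requiring the shift to favor any particular direction:

```python
# Hypothetical counts and an assumed 10% 'no real deliberation' baseline.
from scipy.stats import binomtest

changed, total = 34, 60
result = binomtest(k=changed, n=total, p=0.10, alternative="greater")
print(f"{changed}/{total} participants shifted stance; "
      f"p={result.pvalue:.2e} against a 10% baseline")
```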