Why Content Moderators Need Compassionate Leaders More than Ever

Increasing UGC and U2U Interactions 

More than ever, Content Moderators face rising volumes of user-generated content and user-to-user interactions that threaten the safety of platforms’ communities. The advent of generative AI, highly politicised social issues, and more than 40 national elections this year, combined with users who are increasingly adept at evading policy restrictions, means that moderators face a heavier workload than ever before.

Increasing regulation and compliance obligations, such as the Digital Services Act and the Online Safety Act, place additional pressure on platforms’ moderation teams to remove harmful content more swiftly and accurately. For example, Ofcom’s recent proposals for how online services should approach their duties related to illegal content – such as child sexual abuse material (CSAM), harassment, and intimate image abuse – recommend that “performance targets are set for content moderation functions and services [which] measure whether they are achieving them.” While this recommendation works to safeguard the community of users, it simultaneously increases the strain on moderator resources.

Moderators’ employers would be remiss to ignore the Herculean task ahead of their teams and the potential impact it may have on their psychological health and safety.

Leaders’ Impact on Moderator Retention and Commitment 

Leaders, in particular direct line managers, are the frontline defence for moderation teams. Research demonstrates that line managers strongly influence employee retention and turnover, as well as their level of engagement – “the rational or emotional commitment to something or someone in the organization, how hard they work as a result of this commitment and how long they intend to stay.” Moderators who are engaged and invested in their responsibilities, and who remain in their roles for an extended period, benefit not only the company’s financial success but also the community of platform users they protect.

Imagine for a moment that a team of 30 moderators removing CSAM experiences 50% turnover in the span of six months. At full strength, the team moderates over 3,000 pieces of content per week. Specialised training to moderate CSAM can take a new hire upwards of five weeks; the platform therefore risks increased user exposure to CSAM throughout those six months while replacements are recruited and trained. The rough arithmetic sketched below illustrates the scale of that gap.
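As a back-of-the-envelope illustration, the short Python sketch below models the review capacity lost to turnover. The team size, weekly throughput, turnover rate, and training period are taken from the scenario above; the four-week hiring lag is an illustrative assumption, not a benchmark.

```python
# Back-of-the-envelope model of the turnover scenario above.
# Figures marked "assumption" are illustrative, not benchmarks.

TEAM_SIZE = 30                 # moderators at full strength
ITEMS_PER_WEEK = 3000          # team throughput at full strength
TURNOVER_RATE = 0.5            # 50% turnover over the six-month window
TRAINING_WEEKS = 5             # specialised CSAM training per new hire
HIRING_LAG_WEEKS = 4           # assumption: time to recruit a replacement

items_per_moderator = ITEMS_PER_WEEK / TEAM_SIZE          # 100 items/week
leavers = round(TEAM_SIZE * TURNOVER_RATE)                # 15 seats to refill

# Each replaced seat is unproductive for the hiring lag plus training.
unproductive_weeks = HIRING_LAG_WEEKS + TRAINING_WEEKS    # 9 weeks per seat
lost_capacity = leavers * unproductive_weeks * items_per_moderator

print(f"Weekly capacity per moderator: {items_per_moderator:.0f} items")
print(f"Unproductive weeks per replaced seat: {unproductive_weeks}")
print(f"Review capacity lost to turnover: {lost_capacity:,.0f} items")
# -> 15 seats x 9 weeks x 100 items/week = 13,500 items that the
#    remaining moderators must absorb, or that go unreviewed.
```

Even under these simplified assumptions, the remaining moderators would need to absorb roughly 13,500 additional reviews, a workload increase that compounds the very stressors driving turnover in the first place.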

The Science Behind Compassion 

Compassion is defined as “a sensitivity to suffering in self and others with a commitment to try to alleviate and prevent it”. It plays a key role in our physiological state. When we feel threatened, our sympathetic nervous system is activated, commonly known as the fight, flight or freeze response. Individuals may be familiar with this experience if they struggle with anxiety or have previously experienced a traumatic event such as a car accident. When we are giving or receiving compassion, however, the parasympathetic nervous system is activated via the vagus nerve, which results in a state of soothing and feelings of connection.

For Content Moderators, the threat to their psychological state is inherent. Prolonged and routine exposure to egregious content has the potential to activate the sympathetic nervous system more often than it is activated in the general working population. Other work-related stressors, such as performance targets, may also trigger this response. Moderators who experience continuous activation of the sympathetic nervous system without the counterbalance of the parasympathetic nervous system are more likely to develop anxiety disorders, post-traumatic or acute stress disorders, and even depressive disorders.

The Moral Imperative for Compassionate Leadership 

Considering the potential risks to moderators’ psychological wellbeing and the increasing challenges that moderators are facing, compassionate leadership is a moral imperative. As a leadership style, compassionate leadership can be defined as a motivational system which is rooted in the human instinct of compassion. Leaders who exhibit this style make their teams feel valued and supported to meet collective objectives. 

In content moderation, the collective objective is clear: remove harmful content and minimise harmful interactions to preserve the psychological health of the community of platform users. Whilst this objective is clearly outlined to moderators and their leaders, how to employ a compassionate leadership style that enhances moderators’ effectiveness in meeting it can be murkier.

Compassionate leaders not only role-model self-compassion; they also encourage their teams to practise self-compassion and provide an environment that is supportive, caring, friendly, and helpful. For Content Moderators, mastery of and continuous improvement at the task at hand are necessary to ensure accurate decision-making based on platform policies for the removal of harmful content and the moderation of user-to-user interactions.

Furthermore, if we reconsider the impact of the managerial relationship on employee retention and engagement, we can reasonably expect that increased care and compassion will positively affect these outcomes.

This is exactly why Content Moderators need compassionate leaders. Without them, Content Moderators are at higher risk of developing mental health difficulties, which will not only diminish their productivity and quality of work but also have a significant knock-on effect on users’ safety online. With compassionate leaders, however, Content Moderators and users alike will reap the benefits.

Trust and Safety teams require leaders who are equipped to handle the concerns and challenges faced by their moderators – and the first step is investing in leaders who know how to lead with compassion. 
