In recent years, TikTok has skyrocketed in popularity, attracting more than a billion users each month and surpassing Google as the world's most visited web domain, according to Cloudflare. With a platform of such massive reach, TikTok must enforce strict content moderation to keep harmful material from reaching its users. However, the psychological toll of that moderation has led to a new lawsuit against TikTok and its parent company, ByteDance, highlighting the significant impact the work has on the mental health of content moderators.
Candie Frazier, a former moderator who worked for Telus International, a third-party contractor for TikTok, has filed a lawsuit alleging that her role exposed her to graphic, traumatic content daily, resulting in severe psychological harm. She claims that her exposure to disturbing videos—ranging from violent acts to graphic depictions of abuse—has led to symptoms of anxiety, depression, and post-traumatic stress disorder (PTSD). Frazier’s case sheds light on the unseen challenges facing those responsible for keeping the digital space safe.
Content Moderation: The Dark Side of Social Media
Social media platforms like TikTok rely on thousands of moderators to filter content and ensure compliance with community guidelines. This work includes reviewing videos and accounts that could expose users to graphic, violent, or otherwise harmful material. According to Frazier's lawsuit, her position required her to review hundreds of videos daily, including content depicting sexual assault, cannibalism, and other deeply disturbing acts. She also alleges that, to keep up with the volume, she was often expected to watch up to ten videos simultaneously.
In response, TikTok maintains that it strives to create a "caring working environment" for both employees and contractors. The company partners with third-party firms to support moderators' mental health and emotional well-being. However, the lawsuit raises questions about the adequacy of these measures and whether they align with the needs of moderators like Frazier, who are on the front lines of content moderation.
Allegations of Insufficient Support for Moderators
Frazier's lawsuit, filed in California, claims that the conditions under which moderators work violate labor laws and fall short of industry standards. She describes 12-hour shifts with limited breaks: 15 minutes after the first four hours, a further 15-minute break every two hours, and one hour for lunch. These conditions, she argues, contribute to the severity of her psychological trauma, leaving her feeling unsupported and at risk of long-term harm.
TikTok counters these claims, stating that its safety team is committed to creating a supportive environment and offering a range of wellness services for moderators. The company is also a member of a coalition of social media companies that have developed best-practice guidelines to protect moderators from the adverse effects of viewing graphic content, such as child exploitation and abuse.
The Growing Mental Health Crisis Among Moderators
The emotional toll of content moderation has garnered increasing attention in recent years. In 2020, Facebook faced similar scrutiny and settled a lawsuit with moderators, agreeing to pay out $52 million in compensation to those suffering from PTSD due to their job requirements. Frazier’s lawsuit builds upon this momentum, pushing for industry-wide changes that prioritize the mental health of those tasked with curating content for billions of users worldwide.
Frazier's experience isn't unique: other content moderators have reported similar symptoms, including heightened anxiety, depression, and PTSD. Studies have shown that constant exposure to traumatic material, especially over long hours without adequate breaks, can have lasting effects on mental health. As social media continues to expand, the demands on moderators are likely to grow, making it critical for platforms to evaluate their practices.
TikTok’s Response and the Need for Industry Standards
In its response, TikTok stated that its current practices meet industry standards and that moderators are provided with mental health resources and wellness services. Telus International, Frazier's employer, asserts that its employees have access to robust mental health programs and multiple internal channels to voice concerns. The lawsuit contends, however, that these measures do not fully address the unique stress and emotional strain that content moderation entails.
As the demand for content moderation grows, more companies are likely to face similar legal challenges. The tech industry may need to implement more stringent guidelines, similar to those established in traditional fields with exposure to traumatic material, such as emergency responders and mental health professionals.
Frequently Asked Questions (FAQs)
1. What does a content moderator do, and why is it stressful?
Content moderators review online posts, videos, and other materials to ensure they comply with community standards. The role often involves viewing graphic or disturbing content, which can lead to mental health issues like PTSD, anxiety, and depression.
2. How does TikTok support its content moderators?
TikTok claims to offer wellness services and mental health resources for moderators through partnerships with third-party firms. However, the effectiveness and availability of these resources have been questioned in light of recent lawsuits.
3. What legal protections exist for content moderators?
In the U.S., labor laws require safe work environments, but specific protections for psychological well-being are still evolving. Some industry guidelines have been created, but standards vary between companies, leaving gaps in protections for moderators.
4. Has any other social media company faced similar lawsuits?
Yes, in 2020, Facebook settled a lawsuit and agreed to pay $52 million in compensation to moderators who developed PTSD and other mental health issues due to their job responsibilities.
5. What can be done to reduce the psychological toll on content moderators?
Social media platforms can adopt stricter industry guidelines, provide mandatory counseling, and ensure sufficient break times. Additionally, developing automated content filtering systems can reduce the amount of traumatic content human moderators must handle.
Conclusion
The lawsuit against TikTok underscores the urgent need for social media companies to reassess how they manage content moderation and support those responsible for enforcing their community standards. Candie Frazier’s case brings to light the severe psychological impact of this work and highlights the importance of developing better systems and safeguards. Ultimately, as social media platforms continue to grow, establishing robust, sustainable practices for protecting moderators’ mental health will become not only an ethical obligation but also essential for long-term industry viability.