
Content Moderator Sues TikTok for Exposing Her to Graphic Videos, Is Placed on Leave on Christmas Eve

 

In this photo illustration, the logo of the social media application TikTok is displayed on the screen of an iPhone on April 13, 2020, in Arlington, Virginia.

In a move her lawyer called “Dickensian,” a TikTok content moderator was placed on leave on Christmas Eve after raising workplace safety concerns in a new lawsuit. Candie Frazier sued the social media giant, alleging that it created a dangerous work environment by negligently exposing her and her coworkers to thousands of violent videos.

Frazier filed a federal lawsuit against TikTok and its parent company, ByteDance, on December 23 in U.S. District Court in California, alleging that she and other content moderators are regularly exposed to unsafe working conditions as a result of TikTok’s policies and practices. Frazier asks the court to certify her lawsuit as a class action and to order the defendants to establish a fund to compensate employees for health problems caused by watching thousands of disturbing and graphic videos uploaded by TikTok users.

In an email to Law&Crime on Tuesday, Frazier’s attorney, Steve Williams of the Joseph Saveri Law Firm, LLP, said:

In retaliation for bringing these important issues to the public, on Christmas Eve she was advised that she was being placed on some form of leave and had to give up her work equipment so that she can no longer do the job she relies on to support her family.  TikTok’s Dickensian conduct violates the law, and we call upon TikTok to change course and restore our client to her former position and responsibilities without delay.

Frazier is an employee of Telus International, a private company that contracts with TikTok and its parent company, ByteDance, Inc., to provide content moderation. Moderators screen videos uploaded by users to ensure they do not violate the platform’s terms of service.

In her lawsuit, Frazier explains her job as follows:

Every day, TikTok users upload millions of videos to its platform. Millions of these uploads include graphic and objectionable content such as child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder. To maintain a sanitized platform, maximize its already vast profits, and cultivate its public image, TikTok relies on people like Plaintiff Frazier—known as “Content Moderators”—to view those videos and remove any that violate the corporation’s terms of use.

Frazier continues, detailing some of the many ways uploaded content is harmful. In addition to graphically violent videos, she asserts, users also spread disinformation that “has destabilized society.” From the complaint:

Content Moderators also face repeated exposure to conspiracy theories (including suggestions that the COVID-19 pandemic is a fraud), distortions of historical facts (like denials that the Holocaust occurred), fringe beliefs, and political disinformation (like false information about participating in the census, lies about a political candidate’s citizenship status or eligibility for public office, and manipulated or doctored videos of elected officials). This type of content has destabilized society and often features objectionable content.

According to Frazier’s complaint, she and her fellow content moderators are regularly subjected to harmful and illegal working conditions. Moderators are required to work 12-hour shifts with just two 15-minute breaks and one hour-long break, and to watch thousands of videos each day. Because so many of the videos are disturbing, violent, and graphic, moderators have suffered serious psychological harm as a direct result of their work. Frazier alleges that she has trouble sleeping and has “horrific nightmares,” and that she is at increased risk for post-traumatic stress disorder (PTSD) as a direct result of doing her job.

Frazier alleges that “ByteDance and TikTok fail to implement workplace safety measures that meet industry standards that other companies and non-profits have implemented,” failing even “to implement the standards suggested by the Technology Coalition, despite being a member.”

Further, she claims, ByteDance and TikTok did not check whether new hires had any prior experience with graphic videos and did not show sample videos to new moderators as part of the hiring process. The companies also provided no psychological support for moderators and instead required them to work an excessive number of hours.

“Content Moderators were further punished for any time away from [watching videos], making it almost impossible for them to utilize the meager wellness protections available unless they did so before or after their 12-hour workday,” Frazier claims. “Without any meaningful counseling or similar mental support,” she continues, “Plaintiff and other Content Moderators were ill equipped to handle the mentally devastating imagery their work required them to view.”

Frazier’s lawsuit brings multiple claims against ByteDance and TikTok, including negligence and strict liability claims as well as violation of California’s unfair competition law. She asks for unspecified compensatory damages and requests that the court require the defendants to establish a fund “to pay for a medical monitoring program to facilitate the ongoing screening, diagnosis, and adequate treatment of Plaintiff and the class for psychological trauma—including to prevent or mitigate conditions such as PTSD, anxiety and depression—until it can be determined that psychological trauma is no longer a threat to their health.”

Attorneys for TikTok and ByteDance could not immediately be reached for comment.

[Photo illustration by OLIVIER DOULIERY/AFP via Getty Images]


Elura is a columnist and trial analyst for Law & Crime. She is also a former civil prosecutor for NYC’s Administration for Children’s Services, the CEO of Lawyer Up, and the author of How To Talk To Your Lawyer and the Legalese-to-English series. Follow Elura on Twitter @elurananos.