WhatsApp, the messenger owned by Facebook, uses at least 1,000 moderators, contracted through Accenture, to review content that users complain about, Report informs, citing an investigation by the non-profit organization ProPublica.
To do this, the company effectively bypasses end-to-end encryption for reported content: when a user complains about a message that violates the messenger's rules, the flagged message and the messages preceding it are forwarded to WhatsApp in decrypted form for verification. The moderators work from offices in Austin, Texas, Dublin, and Singapore. Each complaint is first screened by artificial intelligence, and only then does the correspondence reach a human moderator. During one shift, a moderator is expected to review at least 600 user complaints.
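For illustration only, the flow described above might be sketched roughly as follows in Python; the names (Ticket, build_report, ai_triage), the number of preceding messages included in a report, and the placeholder classifier are assumptions, since neither WhatsApp nor ProPublica has published the actual pipeline.

```python
from dataclasses import dataclass
from typing import List

TICKETS_PER_SHIFT = 600      # reviewers reportedly handle at least 600 complaints per shift
CONTEXT_MESSAGES = 4         # assumption: how many preceding messages are bundled with a report

@dataclass
class Ticket:
    reported_text: str       # the flagged message, already decrypted on the reporting user's device
    context: List[str]       # preceding messages forwarded for verification
    needs_human_review: bool = False

def build_report(thread: List[str], reported_index: int) -> Ticket:
    """Bundle the flagged message with its preceding context, as the reporting client would."""
    start = max(0, reported_index - CONTEXT_MESSAGES)
    return Ticket(reported_text=thread[reported_index],
                  context=thread[start:reported_index])

def looks_abusive(text: str) -> bool:
    """Placeholder heuristic standing in for WhatsApp's undisclosed automated screening."""
    return any(word in text.lower() for word in ("spam", "scam", "abuse"))

def ai_triage(ticket: Ticket) -> Ticket:
    """First-pass automated screening; anything it flags is escalated to a human moderator."""
    ticket.needs_human_review = looks_abusive(ticket.reported_text)
    return ticket

def human_review_batches(tickets: List[Ticket]) -> List[List[Ticket]]:
    """Split escalated tickets into per-shift batches of roughly 600 for moderators."""
    escalated = [t for t in tickets if t.needs_human_review]
    return [escalated[i:i + TICKETS_PER_SHIFT]
            for i in range(0, len(escalated), TICKETS_PER_SHIFT)]
```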
WhatsApp’s director of communications, Carl Woog, acknowledged that teams of contractors in Austin and elsewhere review WhatsApp messages to identify and remove “the worst” abusers. But Woog told ProPublica that the company does not consider this work to be content moderation, saying: “We actually don’t typically use the term for WhatsApp.”
In addition, moderators cannot delete individual messages that contain insults; they can only block the offending user’s account.