Facebook employees hired to go through millions of private WhatsApp messages: Report

Using Facebook software, WhatsApp has more than 1,000 employees sift through millions of private messages, including images and videos, reviewing content flagged as risky, such as fraud, spam, and illegal plots, according to a report.

ProPublica, a New York City-based nonprofit newsroom, published a report earlier this week highlighting privacy concerns at Facebook.

However, Facebook CEO Mark Zuckerberg has said that WhatsApp messages protected by end-to-end encryption cannot be read by others. The encryption scrambles messages into an unreadable format that only the sender and the recipient can decrypt.

Facebook’s founder and CEO Mark Zuckerberg. (File photo: Reuters)

Employees tasked with reviewing messages only have access to those that users have flagged as abusive or as spam, and those flagged messages reach reviewers in unencrypted form.

According to the report, WhatsApp’s director of communications acknowledged to ProPublica that teams of contractors in Austin, Texas, and other cities review messages to identify and remove abusers, while maintaining that the company does not consider this content moderation.

Facebook Inc. has content moderators for its Facebook and Instagram platforms, which don’t offer the end-to-end encryption that WhatsApp has. Facebook moderators focus on abusive content and misinformation.

ProPublica’s report says that WhatsApp uses “outside contractors, artificial intelligence systems and account information to examine user messages, images and videos.”

Transparency

Last month, Facebook published its Q1 transparency report, which showed that the platform’s most-viewed link was a news article suggesting that a “healthy” doctor had died after receiving the COVID-19 vaccine.

Media outlets reported that Facebook had initially withheld the Q1 report; the company released it a day after those reports circulated.

The transparency report examined US content views between January 1, 2021, and March 31, 2021.

“News outlets wrote about the south Florida doctor that died. When the coroner released a cause of death, the Chicago Tribune appended an update to its original story; NYTimes did not. Would it have been right to remove the Times story because it was COVID misinfo?” Facebook’s Policy Communications Manager, Andy Stone, said in response to critics.

‘Fake news’

According to a study, from August 2020 to January 2021, publishers of “fake news,” or misinformation, received six times more clicks and likes on Facebook than trustworthy news sources.

The peer-reviewed study looked at user behavior on Facebook around the 2020 US election campaign, according to a report by The Washington Post.

The study offers evidence that misinformation is widespread and widely shared on social media platforms such as Facebook.

Read more:

‘Fake news’ on Facebook got six times more clicks than trustworthy sources: Study

Most popular Facebook post of 2021 cast doubt on COVID-19 vaccines

Facebook apologizes for putting racist ‘primates’ label on video of Black men