Facebook employing 3,000 people to filter out violent content


SAN FRANCISCO: Facebook Inc will hire 3,000 more people over the next year to speed up the removal of videos showing murder, suicide and other violent acts, in its most dramatic move yet to combat the biggest threat to its public image.

The hiring spree, announced by Chief Executive Mark Zuckerberg on Wednesday, comes after users were shocked by two video posts last month showing killings in Thailand and the US.

He said in a Facebook post that the new workers will be in addition to the 4,500 who already review posts that may violate Facebook's terms of service. The move is an acknowledgement by Facebook that it needs more than its recent focus on automated software to identify and remove such material.

Artificial intelligence techniques would take "a period of years... to really reach the quality level that we want," Mr Zuckerberg told the company's investors.

"Given the importance of this, how quickly live video is growing, we wanted to make sure that we double down on this and make sure that we provide as safe an experience for the community as we can," he said.

The problem has become more pressing since the launch last year of Facebook Live, a service that lets any of Facebook's 1.9 billion monthly users broadcast video and that has been marred by violent scenes.

Some violence on Facebook is inevitable given its size, researchers say, but the company has been attacked for its slow response. - REUTERS
