Facebook to tackle streaming violence with 3,000 new workers

Facebook Inc (FB.O) will hire 3,000 more people over the next year to speed up the removal of videos showing murder, suicide and other violent acts, in its most dramatic move yet to combat the biggest threat to its valuable public image.

The hiring spree, announced by Chief Executive Mark Zuckerberg on Wednesday, is an acknowledgement by Facebook that automated software alone cannot identify and remove the offensive posts that have proliferated online and made headlines in the traditional news media.

The problem has become more pressing since the introduction last year of Facebook Live, a service that allows any of Facebook’s 1.9 billion monthly users to broadcast video, which has been marred by some violent scenes.

Some violence on Facebook is inevitable given its size, researchers say, but the company has been attacked for its slow response.

UK lawmakers this week accused social media companies including Facebook of doing a “shameful” job removing child abuse and other potentially illegal material.

In Germany, the company has been under pressure to be quicker and more accurate in removing illegal hate speech and to clamp down on so-called fake news.

German lawmakers have threatened fines if the company cannot remove at least 70 percent of offending posts within 24 hours.

So far, Facebook has avoided political fallout from U.S. lawmakers or any significant loss of the advertisers it depends on for revenue. Some in the ad industry have defended Facebook, citing the difficulty of policing material from its many users. Police agencies have said Facebook works well with them.

Facebook shares were down slightly on Wednesday, ahead of quarterly earnings after the bell.

ARTIFICIAL INTELLIGENCE

Zuckerberg, the company’s co-founder, said in a Facebook post the workers will be in addition to the 4,500 people who already review posts that may violate its terms of service. Facebook has 17,000 employees overall, not including contractors.

Zuckerberg said the company would do better: “We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”

The 3,000 workers will fill new positions and will monitor all Facebook content, not just live videos, the company said. It did not say where the jobs would be located, although Zuckerberg said the team operates around the world.

The world’s largest social network has been turning to artificial intelligence to try to automate the process of finding pornography, violence and other potentially offensive material.

In March, the company said it planned to use such technology to help spot users with suicidal tendencies and get them assistance.

However, Facebook still relies largely on its users to report problematic material. It receives millions of reports from users each week, and like other large Silicon Valley companies, it relies on thousands of human monitors to review the reports.
