SUMMARY: Facebook CEO Mark Zuckerberg announced Wednesday that the company will add 3,000 more workers to monitor live video after problems with hate speech and violence, including murder and suicide. Hari Sreenivasan talks to Farhad Manjoo of The New York Times about the ever-growing scope of the social media network and the company's responsibility.
JUDY WOODRUFF (NewsHour): Now questions about the ever-growing scope of Facebook's empire and social network, and whether the company is embracing enough responsibility for its reach.
Today, Facebook CEO Mark Zuckerberg announced that the company will add 3,000 more people to monitor live video, after problems with violence and hate speech.
Hari Sreenivasan takes it from there.
HARI SREENIVASAN (NewsHour): The decision comes after a series of cases where people shared live video of murder and suicide. Recent examples include a murder in Cleveland last month that was posted live on Facebook, and a man in Thailand who posted video of himself murdering his 11-month-old daughter. That video wasn't removed for 24 hours.
Once Facebook makes these announced hires, it will have 7,500 employees monitoring the thousands of hours of video uploaded constantly.
Farhad Manjoo is a tech columnist for The New York Times who has been closely covering Facebook. He joins me now to talk about this issue and other questions facing the company.
Farhad, so let's first — today's news, how significant is this?
FARHAD MANJOO, The New York Times: I think it's significant.
I mean, it's a significant sort of step up in their ability to monitor these videos, and it should help. The way it works is, there's lots of videos going on, on Facebook all the time. If somebody sees something that looks bad, that looks like it may be criminal or some other, you know, terrible thing, they flag it, and the flagged videos go to these reviewers.
And just having more of these reviewers should make the whole process faster. So, it should help. I mean, I think the question is why it took them a year to do this.