Saturday, December 21, 2024

Facebook Is Hiring 3,000 Workers to Remove Streamed Violence

According to recent news, Facebook plans to hire about 3,000 more people in the coming year to respond to reports of inappropriate material on the social network and to speed up the removal of videos that show murder, suicide and other violent acts.

Violence on Facebook Live

Automated software alone is not enough to monitor posts, so Facebook is hiring more people. The Facebook Live service, launched last year, allows any user to broadcast live video, and since its launch there have been many instances of people streaming violence.

Zuckerberg said in a post that the new hires will join the current team of 4,500 people who already review posts that may violate the company's terms of service.

Father broadcast himself killing his daughter in Thailand

Last week, police confirmed that a father in Thailand had broadcast himself killing his daughter on Facebook Live. The video was removed a day later, but it had already received about 370,000 views. Other violent videos have surfaced in places such as Cleveland and Chicago.

Commenting on this, Zuckerberg said: “We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”

The company confirmed that the 3,000 new positions will involve monitoring all content, not just live videos. The locations of the jobs have not been confirmed yet.

Machines are being used to detect violent and offensive material

Facebook, the world’s largest social network with 1.9 billion monthly users, has been turning to artificial intelligence to automate the detection of pornography, violence and other potentially offensive material. In March 2017, the company announced plans to use such technology to help spot users with suicidal tendencies.

Still, Facebook relies on its users to report problematic material. It receives millions of reports from users every week and, like other Silicon Valley companies, depends on thousands of human monitors to review them.

No automated mechanism can do this job with 100% accuracy

Sarah Roberts, a professor of information studies at UCLA who studies content monitoring, said: “Despite industry claims to the contrary, I don’t know of any computational mechanism that can adequately, accurately, 100 percent do this work in lieu of humans. We’re just not there yet technologically.”

The workers who currently monitor this material are employed on contract in places such as India and the Philippines, and they face difficult working conditions because of the hours they spend making quick decisions while sifting through traumatic material, Roberts said in an interview.

Conclusion

This is a welcome step, and hopefully it will help keep offensive material off the social network. Users should also stay alert and report any such activity on Facebook immediately.


Saying Truth (https://www.sayingtruth.com/)
