It is no longer news that unlawful acts around the globe, such as the maiming of a living person proudly posted as a video update across social platforms, now call for serious attention. The benefit of social media is to bring people's needs within easy reach, such as a user searching for and finding a long-lost friend or family member. It is also used for information and news broadcasting, among many other things. These benefits, and the many more yet to be listed, should remain the priority; the platforms should not be surrendered to violent, heartbreaking posts such as one man slaughtering another, but should instead serve the enjoyment they were meant for.
Facebook is a great platform for sharing information, images, text, and even video. Its presence has made it far easier to reach new heights of entertainment, to share information, and to keep the public informed.
Amid a spate of live broadcasts of grisly incidents, Facebook CEO Mark Zuckerberg announced he will be hiring 3,000 additional employees over the next year to monitor content and remove violent videos.
“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” Zuckerberg said in a Facebook post. “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
What the new hires will be doing:
The new hires are expected to join Facebook’s community operations team and begin work immediately in their designated roles. They will review the “millions of reports” Facebook receives each week regarding posts that may violate its terms of service, working alongside the 4,500 employees who already review posts. The move will hopefully “improve the process for doing it quickly,” Zuckerberg said.
“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else,” he said in the Facebook post.
In addition to hiring more reviewers, Zuckerberg said Facebook will also be “building better tools to keep our community safe.”
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” he said. “No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.”
Facebook has been criticized recently for not doing enough to prevent videos showing violent incidents, including a murder in Cleveland and the killing of a baby in Thailand, from spreading on the social network. The company does not allow videos and posts that glorify violence, but such content is often reviewed, and possibly removed, only if users report it.
It’s a high-profile announcement of what has become a high-profile problem for the tech giant. In recent months, an elderly man was murdered in Cleveland in a video later uploaded to Facebook; a teenage girl was sexually assaulted in a live-streamed video; and four people were charged with hate crimes for the assault of a man who authorities said had “mental health challenges” in a video that was also streamed on the site.
And then, there have been issues in the other direction — where the problem wasn’t that violence went un-flagged, but that an overactive flagging process removed less-than-offensive content. Perhaps the most notable of these incidents came last year when the Pulitzer Prize-winning “Napalm Girl” photograph was removed from the site for violating Facebook’s Community Standards before the company finally relented and allowed it after an international outcry.