Monday 22 May 2017

Facebook Will Not Delete Videos Of Violent Death, Self-harm Or Abortion


With the social media giant under pressure from world leaders to remove controversial content, the extensive guidelines used by Facebook employees have been revealed to the Guardian.

Facebook will allow users to livestream self-harm attempts because it “doesn’t want to censor or punish people in distress who are attempting suicide”, according to the leaked documents.

The footage will be removed “once there’s no longer an opportunity to help the person” – unless the incident is particularly newsworthy.

One document says: “Removing self-harm content from the site may hinder users’ ability to get real-world help from their real-life communities.

“Users post self-destructive content as a cry for help, and removing it may prevent that cry for help from getting through. This is the principle we applied to suicidal posts over a year ago at the advice of Lifeline and Samaritans, and we now want to extend it to other content types on the platform.”

Facebook’s head of global policy management, Monika Bickert, defended leaving some suicide footage on the site.

She said: “We occasionally see particular moments or public events that are part of a broader public conversation that warrant leaving this content on our platform.

“We work with publishers and other experts to help us understand what are those moments. For example, on September 11 2001, bystanders shared videos of the people who jumped from the twin towers. Had those been livestreamed on Facebook that might have been a moment in which we would not have removed the content both during and after the broadcast.

“In instances where someone posts about self-injury or suicide, we want to be sure that friends and family members can provide support and help.”

With Facebook’s global reach still growing at a rapid rate, the social media giant faces constant difficulties in determining how to deal with new challenges, such as ‘revenge porn’.

Facebook is set to hire an additional 3,000 content moderators to police its content, although the company has acknowledged that they have a “challenging and difficult job”.

Moderators receive a wealth of advice on how to deal with certain social media posts, with some of the controversial guidelines expected to court serious criticism considering Facebook’s enormous global influence.

Here are just a few of the guidelines Facebook moderators receive:

On threats of violence

Statements of intent to commit violence against heads of state, such as “someone shoot Trump”, should be deleted.

However, statements such as “to snap a b*’s neck, make sure to apply all your pressure to the middle of her throat”, or “f*** off and die” can be permissible because they are not regarded as credible threats.

On videos of violent deaths

Videos of violent deaths are not always deleted, although they are marked as disturbing. They are sometimes left up because, in some instances, they can create awareness of certain issues, such as mental illness.

The Facebook guidelines state: “We do not allow people to share photos or videos where people or animals are dying or injured if they also express sadism.”

On animal abuse

Photos of animal mutilation, including images showing a human kicking or beating an animal, can be marked as disturbing rather than deleted.

One slide explains: “We allow photos and videos documenting animal abuse for awareness, but may add viewer protections to some content that is perceived as extremely disturbing by the audience.

“Generally, imagery of animal abuse can be shared on the site. Some extremely disturbing imagery may be marked as disturbing.

“We allow people to share images of animal abuse to raise awareness and condemn the abuse but remove content that celebrates cruelty against animals.”

On nudity

Facebook allows users to post videos of abortions, as long as they do not contain any nudity.

Meanwhile, ‘handmade’ art showing nudity and sexual activity is permissible, but digitally created art showing sexual activity is not.

