At Facebook we get things wrong but we take our safety role seriously | Monika Bickert

Our reviewing of difficult posts and images is complex and challenging. We welcome the Guardian showing how hard it can be to get the balance right

Last month, people shared several horrific videos on Facebook of Syrian children in the aftermath of a chemical weapons attack. The videos, which were also shared elsewhere on the internet, showed the children shaking, struggling to breathe and eventually dying.

The images were deeply shocking, so much so that we placed a warning screen in front of them and made sure they were only visible to adults. But the images also prompted international outrage and renewed attention on the plight of Syrians.

The Guardian's reporting on how Facebook deals with difficult issues and images such as this gets a lot of things right. Reviewing online material on a global scale is complex, challenging and essential. The articles and the training materials published alongside them show just how hard it can be to identify what is harmful and what is necessary to allow people to share freely. As the person in charge of doing this work for Facebook, I want to explain how and where we draw the line.

On an average day, more than a billion people will use Facebook. They will share posts in dozens of languages: everything from photos and status updates to live videos. A very small percentage of those will be reported to us and investigated by our moderation teams. The range of issues is broad, from bullying and hate speech to terrorism and war crimes, and complex. Designing policies that both keep people safe and enable them to share freely means understanding emerging social issues and the way they manifest themselves online, and being able to respond quickly to millions of reports a week from people all over the world.

For our reviewers, there is another obstacle: understanding context.

It's hard to judge the intent behind one post, or the risk of harm in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it?

Someone with a dark sense of humour posts a joke about suicide. Are they just being themselves, or is it a cry for help?

Cultural context is part of it too. In the UK, being critical of the monarchy might be acceptable. In some parts of the world it will get you a jail sentence. It's easy enough to follow a clear-cut law, but much of the time what's acceptable is more about norms and expectations. Social attitudes are constantly evolving, and every society has its flashpoints. New ways to tell stories and share images can bring these tensions to the surface faster than ever.

Our approach is to try to set policies that keep people safe and enable them to share freely. We aim to remove any credible threat of violence, and we respect local laws. We don't always share the details of our policies, because we don't want to encourage people to find workarounds, but we do publish our Community Standards, which set out what is and isn't allowed on Facebook, and why.

Our standards change over time as our community grows and social issues around the world evolve. We are in constant dialogue with experts and local organisations, on everything from child safety to terrorism to human rights.

Sometimes this means our policies can seem counter-intuitive. As the Guardian reported, experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats.

Sometimes this is not enough to prevent tragedy, but sometimes it is.

When a girl in Georgia, USA, attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time. We are aware of at least another half-dozen cases like this from the past few months.

We also try hard to stay objective. The cases we review aren't the easy ones: by definition, something is reviewed when it falls within a grey area. Art and pornography aren't always easily distinguished, but we've found that digitally generated images of nudity are more likely to be pornographic than hand-made ones, so our policy reflects that.

There's a big difference between general expressions of anger and specific calls for a named individual to be harmed, which is why we allow the former but don't tolerate the latter.

Facebook's Mark Zuckerberg on video killings: "We have a lot of work to do" (video)

These tensions, between raising awareness of violence and promoting it, between freedom of expression and freedom from fear, between bearing witness to something and gawking at it, are complicated, philosophical questions. Many organisations grapple with them, and there are rarely universal legal standards to provide clarity. Being as objective as we can is the only way we can be consistent across the world and in different contexts. But we still sometimes end up making the wrong call.

We welcome the Guardian showing how hard it can be to draw the lines. The hypothetical situations we use to train reviewers are intentionally extreme. They're designed to help the people who do this work deal with the most difficult cases, and they reflect the real problems they deal with every day. When we first created our content standards nearly a decade ago, much was left to the discretion of individual employees. But because no two people will have identical views of what constitutes hate speech or bullying, or any number of other issues, we soon evolved our standards to include clear definitions.

These definitions are deliberately precise, designed to achieve consistent decisions across the globe, whether the person reviewing the posts grew up in India or Brazil or the UK.

As the article notes, we face criticism from people who want more censorship and people who want less. We see that as a useful signal that we are not leaning too far in any one direction. The alternative, doing nothing and allowing anything to be posted, is not what our community wants.

We get things wrong, and we're constantly working to make sure that happens less often. We put a lot of detailed thought into trying to find the right answers, even when there aren't any.

I hope that readers will understand that we take our role extremely seriously. For many of us on the team at Facebook, safety is a passion that predates our work at the company: I spent more than a decade as a criminal prosecutor in the United States and Thailand, working on everything from child sexual exploitation to violent gangs to terrorism. Our team also includes a counter-extremism expert from the UK's Institute for Strategic Dialogue, the former research director of West Point's Combating Terrorism Center, a rape crisis centre worker, and a teacher.

All of us know there is more we can do. Last month, we announced that we were hiring an extra 3,000 reviewers around the world to review what's posted on Facebook. This is demanding work, and we will continue to do more to ensure we are giving them the right support, both by making it easier to escalate hard decisions quickly and by providing the psychological support and care they need.

We are making these investments because it's the right thing to do and because we believe the ability to share is worth protecting.

Technology has given more people more power to communicate more widely than ever before. We believe the benefits of sharing far outweigh the risks. But we also recognise that society is still figuring out what is acceptable and what is harmful, and that we, at Facebook, can play an important part in that conversation.

Read more: https://www.theguardian.com/commentisfree/2017/may/22/facebook-get-things-wrong-but-safety-role-seriously