Facebook to allow appeals over removed content
Facebook will allow users to appeal decisions to remove their content from the social network, the company has announced.
Beginning later this year, Facebook said, users will be able to lodge an appeal if a photo, video or post is removed for being judged to violate the site’s community standards.
The social network said it would initially allow appeals for content that was removed for nudity or sexual activity, hate speech or graphic violence.
Facebook said a new option would appear on removal notifications enabling users to request a review, which would be carried out within 24 hours - by a person, not an algorithm - and the content would be restored if a mistake was found to have been made.
Monika Bickert, the social network’s vice president of global product management, said: “We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down but also for content that was reported and left up.
“We believe giving people a voice in the process is another essential component of building a fair system.”
Earlier this year, the platform removed the official pages of far-right group Britain First, as well as those of its leaders Paul Golding and Jayda Fransen, for breaching the site’s rules on hate speech.
Alongside the appeals announcement, the company also published its internal guidelines for enforcing its rules.
It includes guidance on when nudity is acceptable on the platform - such as in paintings or images of breastfeeding - as well as guidance on when threats of violence can be deemed credible.
“We decided to publish these internal guidelines for two reasons. First, the guidelines will help people understand where we draw the line on nuanced issues,” Ms Bickert said.
“Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines - and the decisions we make - over time.”
The firm’s chief technical officer Mike Schroepfer is due to face questions from MPs from the Digital, Culture, Media and Sport select committee on Thursday over the site’s business and policy practices, as it continues to face scrutiny in the wake of the Cambridge Analytica scandal.