Facebook plans external board to decide on questionable posts on its site
The Executive Council of Australian Jewry’s co-CEO Peter Wertheim travelled to Singapore to participate in a two-day consultation on the establishment of an independent External Oversight Board that would be empowered to make final decisions about contested content appearing on Facebook.
The planned Board will also put forward recommendations about Facebook’s community standards, policies and internal processes. Thirty-eight participants from across Asia, Australia and New Zealand attended, including academics, media workers and civil society representatives. This was the first of a small number of consultations that Facebook has scheduled with participants from various regions of the world.
Facebook currently has 23 categories of community standards under which it may be required to remove a range of content, including hate speech, credible threats of violence, terrorist activity, incitement or advocacy of physical harm, and bullying and harassment. Each day Facebook receives an average of one million notifications from its 2.3 billion users around the world reporting content said to breach its community standards.
Facebook currently employs 30,000 people to formulate and enforce its community standards, half of whom are tasked with reviewing specific complaints about content. The company has also developed, and is continually improving, artificial intelligence to detect and remove prohibited content automatically, regardless of whether a complaint has been made. Owing to variations in language and culture, and the difficulty of assessing precise nuances of meaning, the automated detection of hate speech is especially challenging. Nevertheless, Facebook’s success rate in dealing with this problem has been improving.
The meeting considered several real-life case studies. Participants then debated and put forward recommendations about the precise functions of the External Oversight Board (EOB), the number of EOB members and their terms of office, the process and criteria for selecting members, the EOB’s caseload, the criteria for selecting cases to go before the EOB, the processes for assessing and determining cases, the transparency of the EOB’s decisions, maintaining consistency of decision-making, protections against infringements on the EOB’s independence, indemnity against loss, EOB recommendations about changes to Facebook’s policies and processes, and numerous other questions.
A final decision about all of these matters is expected to be announced by the end of 2019.
Commenting on the experience, Peter Wertheim said: “This was one of the most challenging meetings I have attended. The ECAJ has had its differences with Facebook in the past, so full credit to them for inviting a representative from the ECAJ to participate. The creation of an External Oversight Board is a hugely complex and ambitious undertaking and I commend Facebook for the initiative. A final judgment about the proposal will have to await the decisions which are ultimately made about the range of matters we discussed. However, I would like to think Facebook is doing this for the right reasons and that it has matured in its outlook, has a clear view of its moral and legal responsibilities about problematical content and takes those responsibilities very seriously. The ECAJ looks forward to our expectations being confirmed and exceeded”.
Peter Wertheim travelled to Singapore as a guest of Facebook.