Mark Zuckerberg, chief executive officer and founder of Facebook Inc., arrives for a House Financial Services Committee hearing in Washington, D.C., U.S., on Wednesday, Oct. 23, 2019.
LONDON — Facebook’s much-anticipated Oversight Board has confirmed that it is planning to launch ahead of the U.S. election on Nov. 3 after being criticized for a perceived lack of action.
The board will rule on appeals from Facebook and Instagram users and on questions referred by Facebook itself, although it will have to pick and choose which content moderation cases to take given the sheer volume of them.
Following a report from The Financial Times, a spokesperson for the independent Oversight Board told CNBC that it expects to start in mid to late October.
“We are currently testing the newly deployed technical systems that will allow users to appeal and the Board to review cases. Assuming those tests go to plan, we expect to open user appeals in mid to late October.”
They added: “Building a process that is thorough, principled and globally effective takes time and our members have been working aggressively to launch as soon as possible.”
The Oversight Board said it expects to decide on a case, and for Facebook to have acted on this decision, within a maximum of 90 days.
A spokesperson for the Oversight Board said: “In terms of passing rulings around the time of the election, the Board will be prepared to consider cases on any matters that come before it and are in scope for us, and it’s premature to guess what the Board may or may not consider until we launch. Whether Facebook will send the Board expedited cases around this time is a question for Facebook.”
A Facebook spokesperson told CNBC that it has been “helping” the Oversight Board members to get up and running as quickly as possible since they were announced in May.
“That has included finalizing a new software tool that allows members to securely access and review case information from anywhere in the world; and training them on our community standards and policy development processes.”
The social media company has been under pressure to demonstrate that it is ready to deal with what stands to be one of the most polarizing U.S. elections in recent history, with experts concerned that some of the platform’s users may try to incite violence.
The board will receive cases through a content management system linked to Facebook’s own platforms. Board members will then discuss each case as a group before issuing a final decision on whether the content should be allowed to stay up.
Facebook announced it was creating the independent board in November 2018. The move came shortly after The New York Times published a report detailing how the company avoided and deflected blame in the public conversation around its handling of Russian interference in U.S. politics and other misuses of the social network.
At the time, Facebook said the board’s members are a globally diverse group that includes lawyers, journalists, human rights advocates and academics. Between them, they are said to have expertise in areas such as digital rights, religious freedom, conflicts between rights, content moderation, internet censorship and civil rights.
Notable members include Alan Rusbridger, former editor in chief of The Guardian newspaper, and Andras Sajo, a former judge and vice president of the European Court of Human Rights.
The Oversight Board could help Facebook avoid accusations of bias if it removes content deemed problematic. Some lawmakers and conservative speakers have said that Facebook censors politically conservative points of view, a claim the company rejects.
Facebook pledged $130 million in funding to the board last December, with the money expected to cover operational costs for at least six years. Board members will be compensated for their time, although the amount they will be paid has not been made public.
Facebook in January outlined the board’s bylaws, making it clear that the social media giant was still in control. The board’s decisions do not necessarily set any precedents that Facebook has to follow in the future, and the board is limited when it comes to content it can address.
The board has said it will publish transparency reports each year and monitor what Facebook has done with its recommendations.