A Critique of Facebook’s Dispute Systems Design: Procedural Fairness and the Problem of Power

Background

On 9th March 2016, New Matilda published a full transcript of the Queen Victoria Women’s Centre’s annual International Women’s Day address, titled “Looking Past White Australia and White Feminism.” The author of the keynote was Celeste Liddle, who posted a link to the New Matilda article through her public Facebook page, Black Feminist Ranter. On 10th March Celeste tweeted “been locked out of @facebook because someone reported my @newmatilda transcript for nudity as it contained a pic of desert women painted up”. The photograph, published by New Matilda, showed two Aboriginal women engaging in ceremony, wearing only paint over their chests, and this nudity was deemed to breach Facebook’s Community Standards. The Black Feminist Ranter timeline demonstrates that outrage ensued: Facebook users reposted the apparently offending article, some were blocked themselves, Black Feminist Ranter was repeatedly blocked for posts about the blocking, and eventually the matter was featured in the mainstream media (including the ABC, the Sydney Morning Herald, Public Radio International and Daily Life). A change.org petition received considerable support.

Flaws in Facebook’s dispute management system have been exposed by the way the complaints about the image were handled. Key failings include a lack of procedural fairness or due process, and inconsistency. There is a clear need for Facebook to dramatically overhaul the way its complaints handling system operates, as the current system facilitates malicious targeting by “trolls” and allows legitimate and valuable voices (in this case feminist and Indigenous) to be silenced. This is not consistent with the standards expected by the global community, and therefore Facebook’s decision making according to its global “Community Standards” is failing. There is a history of Facebook responding constructively to community objection to its censorship policies, which suggests that a user-led campaign may succeed in pressuring Facebook to change its processes, despite the enormous power imbalance between it and its users.

Community Standards

Facebook is populated by a diverse global community and acknowledges the challenge of establishing rules about the nature of content that may be posted on its platform. Users are offered advice about how to avoid content that they don’t like. However, Facebook (along with other social networking sites) is prepared to, and frequently does, censor content posted to its platform. Facebook relies upon peer-to-peer reporting of inappropriate content and assesses that content against its Community Standards, with which all users agreed to comply when they accepted the Terms of Use. Safety, respect, cultural diversity and self-control are the principles that Facebook claims guide it in striking the right balance between self-expression and promoting a welcoming and safe environment for all users. The standards and processes of censorship adopted by Facebook have been described as “opaque”, and it has been suggested that this may be deliberate obfuscation to avoid accountability. This is despite Facebook’s claimed commitment to transparency.

It has been observed that Facebook’s approach to nudity reflects “odd prejudices about sex.” Nudity is a separate category of banned content in addition to pornographic imagery (Term 3(7)). The focus of the nudity prohibition is not upon the sexualised nature of an image, but upon the body part exposed. Prohibited body parts include genitals, fully exposed buttocks and female nipples; male nipples are not prohibited from display. There are precedents of images of women with painted nipples being determined not to breach the Community Standards (including recent posts by Kim Kardashian and of a naked woman riding a bicycle with a dildo strapped to the handlebars).

Facebook’s revision of its Community Standards

Facebook recognises that its processes for applying the Community Standards about nudity “can sometimes be more blunt than we would like and restrict content shared for legitimate purposes” and claims to be “always working to get better at evaluating this content and enforcing our standards.” In March 2015 three kinds of image were added to a list of exceptions to the prohibition of nudity: breastfeeding women, post-mastectomy scarring, and photographs of paintings, sculptures and other art depicting nude figures. The revision of the Community Standards was a response to widespread campaigns protesting against, and petitioning Facebook to change, its banning of these three categories of image. The banning of the female nipple has been the subject of much consternation, with acceptable male nipple pasties circulated to facilitate covering the apparently offensive female nipple on users’ images prior to posting.

Another example of Facebook’s willingness to change its systems to better cater to users is the introduction in 2012 of an opportunity to resolve content objections without the need for intervention by Facebook’s moderators. One of the options offered when a person objects to a post, in addition to making a report, hiding it from their own view or “unfriending” the person who made it, is to select a box and send an anonymous message explaining “Hey, I don’t like this photo. Please remove it.” The complainer can indicate that they object to the photograph for reasons such as “it makes me sad”, “it’s embarrassing” or “it’s a bad photo of me”. More than 8 million people per week use these social resolution tools to resolve their differences about content posted on Facebook. More than half of those asked by another user to remove a photograph do so, and at least 75% reply. Often the acknowledgement is enough for the complainer to take no further action in relation to the post, and often the person who posted the content did not intend to cause offence. This social resolution tool could be mirrored in the process that is applied when a moderator becomes involved in decision making. The advantages of giving users a voice in the decision making process are obvious.
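
To make the shape of this peer-to-peer tool concrete, the following is a minimal sketch of how such a social resolution request might be modelled. It is illustrative only: the class and field names are hypothetical and do not reflect Facebook’s actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Canned reasons a complainer can attach to an anonymous removal request.
CANNED_REASONS = [
    "it makes me sad",
    "it's embarrassing",
    "it's a bad photo of me",
]

@dataclass
class SocialResolutionRequest:
    """An anonymous peer-to-peer request to remove a post, sent before any
    moderator becomes involved (hypothetical model)."""
    post_id: str
    reason: str  # one of CANNED_REASONS
    message: str = "Hey, I don't like this photo. Please remove it."
    created_at: datetime = field(default_factory=datetime.utcnow)
    response: Optional[str] = None  # e.g. "removed" or "declined"

    def record_response(self, response: str) -> None:
        """The person who posted the content acknowledges the request.
        Often an acknowledgement alone is enough to resolve the dispute."""
        self.response = response

# Example: a complainer sends a request and the poster voluntarily removes the photo.
request = SocialResolutionRequest(post_id="post-123", reason="it's embarrassing")
request.record_response("removed")
```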

Facebook’s Dispute Handling System

As noted above, peer-to-peer reporting is the means by which Facebook identifies content that ought to be censored and other breaches of the Community Standards. According to Monika Bickert, Facebook’s head of policy management, more than 1 million reports of violations of the Community Standards are filed every day. Facebook claims, under Term 5(2) of the Terms of Use, absolute discretion to decide whether or not content should be removed.

Facebook’s dispute system design is explained in its “Reporting Guide.” Although Facebook uses automated learning systems to identify content that has previously been removed, the main engine of the reporting system is a workforce of outsourced human moderators who review reported content against a manual. They determine whether the content should be deleted or allowed, or whether the report should be “escalated” to Facebook employees for determination.
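
The triage logic described in the Reporting Guide could be summarised roughly as follows. This is a speculative sketch, not Facebook’s code: the function and variable names are invented, and the automated matching of previously removed content is reduced to a simple hash lookup for illustration.

```python
from enum import Enum

class Decision(Enum):
    DELETE = "delete"      # remove the reported content
    ALLOW = "allow"        # leave the content in place
    ESCALATE = "escalate"  # refer the report to Facebook employees

# Hypothetical store of fingerprints of content that has already been removed,
# standing in for the automated matching systems mentioned above.
previously_removed: set[str] = set()

def triage_report(content_hash: str, reviewer_decision: Decision) -> Decision:
    """Automated re-removal first; otherwise defer to the outsourced human
    reviewer working from the moderation manual."""
    if content_hash in previously_removed:
        return Decision.DELETE
    if reviewer_decision is Decision.DELETE:
        previously_removed.add(content_hash)
    return reviewer_decision
```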

Where content is deemed not to breach the Community Standards, the reporter will receive a message through Facebook notifying them that it “doesn’t violate the Facebook Terms” and referring them to the guide to avoiding content that they don’t like.

Where content is deemed to breach the Community Standards, the reporter will receive a notification of the decision and the person who posted the content will receive a notice that the content has been removed. Where a person has previously had a report upheld and content removed, Facebook may initiate a suspension or termination of that person’s account (Term 14). For Celeste Liddle the sequence of punishments was: removal of content, a 24 hour ban, a 3 day ban, and a 7 day ban. Although the Daily Mail and ABC Facebook pages were not blocked by Facebook for posting two of the original articles, many other users had the same content removed (including New Matilda) or were blocked from access to Facebook.

The Reporting Guide reveals that the person against whom a report is made has no voice in the decision making process about whether or not their content breaches the Community Standards. There is only one reference to “reportee can appeal a decision in some cases”, and that option arises only after the person has been blocked (as opposed to having their content removed). There is no transparency about which cases qualify or upon what basis appeals are decided. Furthermore, as Celeste Liddle found, it is incredibly difficult to communicate with Facebook once a user has been blocked from access to the platform.

There is no dispute resolution clause in the Terms of Use apart from a reference to choice of law and jurisdiction in California (Term 15(1)). Unlike most internet organisations, Facebook has no online dispute resolution process to facilitate the resolution of disputes between its users, or between itself and its users, other than the opaque and one-sided reporting system.

Malicious reporting happens, and the failure of Facebook’s dispute handling system to account for such behaviour is a flaw that perpetuates unfair treatment of some users. Malicious reporting occurs where reports are made against a person repetitively, as a way of triggering removal of content and then a ban of the person reported against. Celeste Liddle maintains that this is what occurred to her, with four separate posts being the subject of reports over only one week. Once an initial report was upheld, she became a “repeat offender” in Facebook’s moderating system, and greater penalties were therefore imposed for subsequent reports.

Power analysis

One problem with Facebook’s apparently unfettered power over censorship of content is the danger (illustrated by the Celeste Liddle example) that provocative voices that challenge the status quo will be silenced. Gil’ad Idisis has observed that:

“There is no reason to trust that commercial entities will want to, or even know how to, make a balanced, good faith determination of whether content is “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable”…Commercial companies are inherently biased, primarily focusing on maximising profits, which is generally achieved by reaching the broadest client base and by keeping content as consensual and non-provocative as possible” (at 162).

Changing the way that Facebook deals with reports will only occur if the organisation is motivated to make that change. This must be achieved within an environment of considerable power imbalance between Facebook and other parties, given its extremely strong market position.

There is a huge power imbalance between Facebook and its individual users, who need Facebook much more than Facebook needs them. Facebook suffers no financial detriment by blocking a user. Users can choose not to use Facebook, but it has become a core part of many personal, social, commercial and political activities. Unlike other social network platforms, Facebook is not threatened in its existence by poor dispute handling practices, as its customers are effectively “locked in” to using the platform to maintain connection with their network. The benefit of Facebook for users is access to the network of other users with whom content is shared. The Facebook platform enables people to bring content to the people who are likely to be interested in it, and for independent content producers such as Celeste Liddle this is a means of maintaining their livelihood.

Facebook’s power has led some to argue that it wields power and influence comparable to a nation state, without being bound by the same international laws that seek to moderate state behaviour. Some have argued for governmental regulation or the creation of an international body capable of regulating the behaviour of organisations such as Facebook. Both of these proposals are cumbersome, expensive, and arguably unrealistic.

There are significant barriers to pursuing a claim against Facebook through judicial processes. Although Facebook is based in the USA, and users agree to the choice of Californian law when they sign up, at least 80% of users are located outside the USA. There are complex practical difficulties in pursuing formal legal action against, or imposing governmental regulation of, Facebook across jurisdictional borders and between different systems of law. Demonstrating a financial loss may also be difficult, which restricts the remedies available even if a wrong can be demonstrated. Time is always a significant barrier to the efficacy of formal justice processes, with access to faster decision making a key attraction of other dispute resolution processes.

Proposals

One of the most powerful ways of moderating the behaviour of social networking providers is public opinion (as demonstrated by previous alterations to the Community Standards regarding nudity and the reversal of decisions not to remove pro-rape content). Users (en masse) and shareholders are possibly best placed to influence the revision of processes and the Community Standards. There ought to be a clear, transparent and constructive process that gives voice to all users, not just those who make a report. A dialogue with the community of users is absent from Facebook’s content moderation process; to date it has occurred primarily through user petitions. Users could engage with Facebook to demand that its processes exhibit better transparency and fairness, and that a review of the nudity standards be conducted so that outcomes are more consistent with the views of the community. Cultural sensitivity should be supported through Facebook’s process of content moderation. These substantive issues are not considered further here.

The principles of dispute system design maintain that disputes can be handled systemically rather than on an ad hoc basis, and that a systemic approach leads to more effective dispute handling and prevention of disputes arising (because the rules are made clear). There are systemic failures in Facebook’s current system.

It is a basic tenet of procedural fairness that a person affected by a decision ought to be given an opportunity to respond to the case against them. This could be achieved by a transparent facility for users against whom a report is made to put their case to Facebook or to lodge a complaint about a decision to remove their content. Because Facebook is a private organisation, the question of whether or not it owes its users procedural fairness would be determined according to its private rules of corporate structure and/or contract. Again, public demand that users be accorded procedural fairness is not unreasonable.

Key changes that could be made to Facebook’s dispute handling system to reflect good dispute system design include the following (a rough sketch of how the first two changes might be modelled in software follows the list):

  1. Giving users against whom a report has been made an opportunity to respond to the report before a decision is made. This could be modelled on the peer-to-peer response that is already in place – the person whose content has been objected to could be sent a message informing them that the content has been reported and the reason selected, and a checklist of responses could be made available to assist the decision maker. The list might include options such as “I do not object to the content being removed” and “I disagree that the content breaches the Community Standards.”
  2. Creating a transparent online internal review process after decisions have been made. This could occur after the person is notified that their content has been removed. Again, a checklist approach could be adopted, with options such as “I understand now that the content breached the Community Standards”, “I disagree that the content breaches the Community Standards”, and “I want to appeal the decision to remove my content.” Checking the last option could trigger an online conversation between a Facebook moderator and the user about the content, with a view to resolving whether the Community Standards had been misapplied. Where, on review, it is determined that the censorship was mistaken or unfair, the record of the report against the user should be deleted from their account, to prevent aggravated consequences of malicious reporting in the future.
  3. Referring appeals from censorship decisions to an external online dispute resolution provider. Any external appeals process needs to be quick and cost effective (particularly as most users do not have a financial relationship with Facebook). Online dispute resolution is an established process with many providers available in the market.
  4. Before Facebook triggers a suspension of a user’s account, the user should be given an opportunity to engage online with a Facebook representative to clarify the reasons for the blocking and make arguments against the decision (such as alerting Facebook to a pattern of malicious reporting). This is important because currently when a person is blocked, it is extremely difficult for them to communicate with Facebook outside the platform.
  5. Malicious reporting could be dealt with by Facebook registering complaints of malicious reporting (in the same way that it registers users whose content has been reported and removed) and applying sanctions against such reporters. Again, before imposing a sanction, the reporter ought to be given an opportunity to make arguments against the decision to sanction them. Arguments might be made on the basis that their motivations for reporting were genuine because they believed that the content breached the Community Standards. Such claims could be reviewed against the records of outcomes of reports.
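
As flagged above, the first two proposals can be sketched in software to show how little machinery they would require. The sketch below is hypothetical: the class names, checklist wording and review logic simply model the proposals in this post, and are not an existing Facebook feature.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class RespondentReply(Enum):
    # Checklist offered to the user whose content was reported (proposal 1).
    NO_OBJECTION = "I do not object to the content being removed"
    DISAGREE = "I disagree that the content breaches the Community Standards"

class ReviewReply(Enum):
    # Checklist offered after removal (proposal 2).
    ACCEPT = "I understand now that the content breached the Community Standards"
    DISAGREE = "I disagree that the content breaches the Community Standards"
    APPEAL = "I want to appeal the decision to remove my content"

@dataclass
class UserRecord:
    user_id: str
    upheld_reports: int = 0  # drives escalating suspensions under Term 14

@dataclass
class Report:
    report_id: str
    respondent: UserRecord
    respondent_reply: Optional[RespondentReply] = None
    removed: bool = False

def decide(report: Report, moderator_finds_breach: bool) -> None:
    """Proposal 1: the respondent's checklist reply is gathered before the
    moderator decides, so the decision maker hears both sides."""
    assert report.respondent_reply is not None, "respondent must be heard first"
    if moderator_finds_breach:
        report.removed = True
        report.respondent.upheld_reports += 1

def review(report: Report, reply: ReviewReply, removal_upheld: bool) -> None:
    """Proposal 2: an internal review after removal. If the removal is found
    to be mistaken, the report is expunged so that malicious reports do not
    accumulate against the user and trigger harsher bans."""
    if reply is ReviewReply.APPEAL and not removal_upheld:
        report.removed = False
        report.respondent.upheld_reports = max(0, report.respondent.upheld_reports - 1)
```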

In conclusion, there are substantial problems with Facebook’s dispute handling system. The most effective way to bring about a change to the system would be to convince Facebook that its own interests would be served by the change. Public outcry has been effective in the past and may be effective in this case. The lack of transparency, arbitrariness and inconsistency of outcomes in the current approach all contradict Facebook’s own statements of principle, mission and standards.


About Dr Olivia Rundle

Dr Rundle is a senior lecturer at the Faculty of Law, University of Tasmania. She has worked as a nationally accredited mediator and a Family Dispute Resolution Practitioner. Dr Rundle is especially interested in the role of lawyers in dispute resolution processes and the policy environment that positively encourages lawyers to engage with dispute resolution. She teaches and researches in broad areas of Dispute Resolution, Civil Procedure and Family Law.

3 thoughts on “A Critique of Facebook’s Dispute Systems Design: Procedural Fairness and the Problem of Power”

  1. I just found another article posted earlier this month making a similar observation about the use of Facebook’s reporting mechanism to, effectively, harass. http://scienceblogs.com/insolence/2016/03/10/once-again-facebook-reporting-algorithms-facilitate-harassment-of-pro-science-advocates-by-antivaccine-cranks/
    “What is disappointing is that one of the richest companies in the world, a company that revolutionized social media, is apparently incapable of preventing itself from being played for a sucker by antivaccine activists turning its tools intended to prevent and stop harassment into tools to harass.
    Either that, or the people running Facebook just don’t care.”
