Online abuse is driving women out of public life. It’s time to act
Public scrutiny is one thing, but the degree of online abuse now suffered by female political candidates is another. The Committee on Standards in Public Life has recommended that platforms be held liable for hate speech and incitement to violence – but, writes María Rún Bjarnadóttir (University of Sussex), domestic UK law currently does not include gender in its definition of hate crime. She points to efforts by Nottinghamshire Police to tackle misogynist speech and suggests how platforms, the justice system, Parliament and researchers could address the problem.
Conservative MP Anna Soubry has received death threats on Twitter. Photo: UK Parliament via a CC BY 3.0 licence
People standing for office and participating in politics have to anticipate a higher level of public scrutiny than the general population. But a recent review from the Committee on Standards in Public Life indicates that during the last General Election in the UK, political candidates had to endure threats and intimidation far beyond the scope of “scrutiny”. It also indicates that female[1] politicians are disproportionately the targets of intimidation, as are candidates from ethnic and religious minorities and LGBT candidates.
The review relies on a range of evidence, including recent findings by Amnesty International UK, showing that black and Asian female politicians face a disproportionate amount of online abuse. In a recent survey by BBC Radio 5 live, 64% of the 113 participating MPs believed female politicians were subject to more abuse than their male counterparts.
Britain is far from unique in this respect. A 2016 study surveyed over 500 female lawmakers from 107 countries; nearly 50% of the respondents had received “insulting or threatening comments about women’s ability and/or role” in the context of their profession.
Research from both sides of the Atlantic indicates that women face online abuse and harassment outside the political sphere as well. The Danish Institute for Human Rights found that, due to the brutality of social media, half of Danes refrain from participating in public debate online. A public debate that loses half of its participants quickly loses its claim to the adjective “democratic”.
Concern for the democratic interest is at the core of the Committee on Standards in Public Life’s review. It recommends that social media platforms should be held liable for user-generated abuse directed at candidates and politicians. Similar calls have been directed at the social media platforms regarding extremism and hate speech.
Although it has not yet dealt with the issue in the context of social media platforms, the European Court of Human Rights (ECtHR) has found that holding internet media platforms liable for user-generated hate speech and incitement to violence is not a breach of freedom of expression under Article 10. It has further stated that the scope of distribution and the commercial incentive of the platform could play a role in assessing its possible responsibility.
The ECtHR has, in a number of judgments, sought to define the boundary between freedom of expression and hate speech. In doing so it has effectively established that not all speech is equal under the Convention system. Simply put, speech that undermines the democratic values the Convention was built to protect does not enjoy its protection.
In light of the expanding role of social media, the call for increased platform responsibility is understandable. However, democratically elected institutions remain the appropriate bodies to set the law of the land, and it is they that carry out the obligations set out in human rights conventions. Demanding that social media platforms regulate their content through terms and conditions raises the question: what about the law?
Under current UK legislation, hate crimes are crimes in which a victim has been targeted because of their race, religion, sexual orientation, disability or transgender identity – but not because of their gender. Twitter recently updated its hate speech policy to largely mirror UK hate crime legislation, building on a legal framework that does not address sexism as hate speech. Why should the platforms regulate in order to protect female politicians when the legislation does not? As a recent study from Gothenburg University shows, the UK is not the only country that offers no legal response to the gendered nature of online abuse.
The University of Sussex recently organised a seminar on gender and hate in the online sphere, bringing together researchers and domestic and international stakeholders engaged with the topic. The success of Nottinghamshire Police’s policy of tackling misogynistic speech within the existing hate crime framework was discussed as a possible best practice. There are indications that other police forces in the UK might adopt the same approach.
The seminar concluded that both policy responses and further research are needed to address the reality of online abuse towards women. In terms of policy, it was suggested that:
- Legislators redefine the scope of hate speech legislation to include gender as a protected characteristic.
- Police and the judicial system build on the experience of Nottinghamshire Police by treating misogyny as a form of hate speech, and introduce training for staff and police officers.
- Social media platforms ensure their terms and conditions address the gendered reality of abuse, and have clear reporting mechanisms in place for those who suffer sexism and gendered hate online.
In terms of research, it was suggested that further evidence is needed to examine:
- The impact of online misogyny, both on those directly affected and on bystanders.
- The background of perpetrators, in order to understand where the behaviour stems from.
The democratic process is important and fragile. Social media has had an extraordinary impact on the way politicians engage with voters, as it has in many other fields of communication. It is essential to ensure that political participation does not diminish, and the review by the Committee on Standards in Public Life is an important contribution that demands a response. That response should not overlook the gendered aspect of online abuse against female politicians.
In December 2018, the UK will mark the centenary of the first general election in which women could vote and stand for Parliament. How timely it would be if, by then, women could stand for election without facing abuse because of their gender.
[1] Although the research referred to does not explicitly do so, the term “female” is used in this piece in an inclusive manner, covering both cis and transgender women. This should not be read as diminishing the experience of either.
This post represents the views of the author and not those of Democratic Audit.
María Rún Bjarnadóttir is a doctoral researcher in Law at the University of Sussex.