Weekly Notes: legal news from ICLR — 25 February 2019

This week’s roundup of legal news and commentary includes robotic policing, knife crime, judicial bullying, and fake news.


A report from Liberty reveals that 14 police forces are using, have previously used or are planning to use algorithms which ‘map’ future crime or predict who will commit or be a victim of crime, using existing police data. Policing by Machine — Predictive Policing and the Threat to Our Rights collates the results of 90 Freedom of Information requests sent to every force in the UK, laying bare the full extent of ‘predictive policing’ for the first time. The report highlights what it calls

“a severe lack of transparency with the public given very little information as to how predictive algorithms reach their decisions — and even the police do not understand how the machines come to their conclusions”.

It explains how “opaque” predictive policing programs use “hordes of biased police data” to “assess a person’s chances of victimisation, vulnerability, being reported missing or being the victim of domestic violence or a sexual offence, based on offensive profiling”. By “entrenching pre-existing discrimination” in the data, they create a significant risk of “automation bias”, with a “decision-maker simply deferring to the machine and accepting its indecipherable recommendation as correct”.

Policing by Machine focuses on two specific types of predictive policing program:

(1) predictive mapping programs — which evaluate police data about past crimes and identify ‘hot spots’ or ‘boxes’ of high risk on a map. Police officers are more likely to patrol these areas, in turn collecting more data that appears to justify their hot-spot status. If the area has a significant racial minority population, data about those residents will also be more concentrated, adding racial bias to the mix.

(2) individual risk assessment programs — which predict how people will behave, including whether they are likely to commit, or be victims of, certain crimes. Such programs base their predictions on 34 pieces of data, 29 of which relate to a person’s past criminal history, alongside things like their postcode (see above), which can act as proxies for ethnicity, adding to the risk of bias. Simply eliminating explicit references to race is therefore insufficient, because proxy data of this kind can reproduce the effect of a discriminatory profile.
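The feedback loop described in (1) can be illustrated with a deliberately simplified toy simulation (all numbers and behaviours here are invented for illustration; this is not a model of any real police system). Two areas have identical underlying offending rates, but one starts with slightly more recorded crime, attracts the patrols, and so accumulates recorded crime faster — “justifying” its hot-spot status:

```python
import random

def simulate_feedback(rounds=50, seed=0):
    """Toy model of the predictive-mapping feedback loop.

    Areas A and B have identical true offending rates, but A starts
    with slightly more recorded crime (e.g. from historically heavier
    policing). Each round, patrols go to the area with more recorded
    crime, and patrolled areas generate more *recorded* crime simply
    because officers are present to observe offences.
    """
    rng = random.Random(seed)
    recorded = {"A": 12, "B": 10}   # A begins with a small historical excess
    true_rate = 0.5                 # identical underlying offending rate
    for _ in range(rounds):
        hot_spot = max(recorded, key=recorded.get)
        for area in recorded:
            # Patrol presence doubles the chance that an offence is recorded.
            detection = 0.8 if area == hot_spot else 0.4
            if rng.random() < true_rate * detection:
                recorded[area] += 1
    return recorded

counts = simulate_feedback()
```

Although both areas offend at the same true rate, the recorded gap between them tends to widen round after round — the data end up reflecting where the police looked, not where the crime was.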

The report raises timely questions about the sufficiency of human (preferably independent) oversight, as well as transparency, accountability and public scrutiny. The report suggests that section 49 of the Data Protection Act 2018 (Right not to be subject to automated decision-making) does not go far enough. It also draws attention to a wider issue with the transparency of profile datasets, and the use of big data to manipulate political messaging and influence consumer behaviour. It links in with issues about fake news (see below) as well as, more directly, the management of policing initiatives to deal with gang-related violence and knife crime (see below). It cites the report last year in which the Information Commissioner’s Office (ICO) found that the Metropolitan Police Service’s (MPS) use of the Gangs Matrix database led to multiple and serious breaches of data protection laws, and calls for review of the use of all such data.

Meanwhile, there is some rather more benign news of legal mechanisation. First, Legal Futures reports that

“A ‘robot mediator’ has been used to settle a dispute in the court system, for what is believed to be the first time. The online tool, which uses artificial intelligence (AI) algorithms in place of a human mediator, settled the three-month dispute in less than an hour.”

The parties were directed to the Canadian dispute resolution tool Smartsettle ONE, which was developed in British Columbia by ICan Systems.

Secondly, in India, police in the state of Kerala have introduced what is believed to be the first Indian use of a ‘RoboCop’ — a mechanical officer, known as KP-Bot, and given the rank of Sub Inspector (SI). According to India Today,

“The KP-Bot will be deployed to perform duties of the front office of the police headquarters which means that it will receive visitors and direct them to different places as and when required. … The robot is equipped with facilities to fix appointment with officers, provide them with identity cards and also open new files based on their grievances.”

The robot was given a female appearance, on the somewhat dubious basis, explained by Director General of Police (DGP) Loknath Behera, that

“Women empowerment and gender equality were kept in mind while deciding on the gender of the first robot. Also, the fact that most front office jobs are managed by women was considered…”

Toasters, kettles and computers are not gendered; it seems rather absurd to impose such a distinction on a robot. Perhaps Siri or Alexa could provide an answer? Or even Judge Droid.


The Ministry of Housing, Communities and Local Government has pledged to provide over £9.5m towards community-based projects across England aimed at enabling earlier intervention and support for children and families vulnerable to knife crime and gang culture, with a further £0.3m to train frontline staff.

The new funding will be channelled through the Troubled Families programme, according to an announcement by Communities Secretary James Brokenshire MP. The Supporting Families Against Youth Crime fund will enable keyworkers, community groups, teachers and other professionals working with children and young people at risk to intervene early and help stop them from being drawn into gang crime, serious violence and the youth justice system.

The new funding forms part of the Home Office’s Serious Violence Strategy. It appears to be different from, and in addition to, the £22m Early Intervention Youth Fund which will be spent over 2 years and a £1.5m Anti-Knife Crime Community Fund.

As part of these further measures to intervene early, this spring will also see the launch of the next phase of the #knifefree campaign, which is intended to inspire young people at risk of being drawn into gang culture to pursue more positive alternatives.

The Counter-Terrorism and Border Security Act 2019, which received the Royal Assent on 12 February, gives the UK greater powers to crack down on hostile state activity, according to an announcement by the Home Office. It also ensures sentencing for certain terrorism offences can properly reflect the severity of the crimes, as well as preventing re-offending and disrupting terrorist activity more rapidly. In addition, the Act updates existing counter-terrorism legislation to reflect the digital age, including the way in which people view content online (eg by viewing or streaming rather than permanently downloading).

Among other things it will introduce an independent review of Prevent, the government’s strategy for supporting those vulnerable to radicalisation. The legislation is said to be the result of a review of the government’s counter-terrorism strategy, CONTEST, in June last year.

‘Upskirting’, which typically involves someone taking a picture under another person’s clothing without their knowledge, with the intention of viewing their genitals or buttocks, is now a specific criminal offence under a new statute, the Voyeurism (Offences) Act 2019, which received the Royal Assent on 12 February.

Hitherto the behaviour was successfully prosecuted under the common law offence of outraging public decency. However, following concerns raised by victims that not all instances of ‘upskirting’ were covered by existing law, the government acted to create a new, specific offence. The Act inserts two new offences into the Sexual Offences Act 2003 to capture this behaviour. The changes cover England and Wales; ‘upskirting’ is already a specific offence in Scotland.

The Ministry of Justice has issued guidance: Upskirting: know your rights.

Legal Professions

Almost a year ago, on 20 March 2018, the then Chair of the Bar, Andrew Walker QC, commented:

“we strongly condemn any bullying or inappropriate treatment of our members by judges, or by any other legal professionals they encounter. We know it can happen, and we have resources available to support barristers on our Wellbeing at the Bar website, and we can give advice and guidance via our confidential helplines.

“Our advice is always to be civil but firm with any bullying judge, opponent or clerk; to seek advice about it; and to report it. Both the Bar Council and Bar leaders are committed to making sure that bullying is addressed and not tolerated.”

Now the Bar’s Equality, Diversity and Social Mobility Committee has published Advice to the Bar about bullying by judges as a downloadable PDF on its Ethics & Practice Hub. The advice recognises that only a small minority of judges are bullies, and that judges work under significant pressures and stress. However, “there is no excuse for bullying, or for tolerating it. It has no place in the rule of law, in the legal profession or in our courts. It must be addressed if it happens.”

A major problem for victims is the risk that any attempt to complain or object to the judge in the face of bullying behaviour will adversely affect their client’s case or their own career advancement. This is where the senior members of the profession can help, the advice suggests:

“If you are a senior practitioner, or in a leadership position, and are in court at the time of the bullying then please consider taking issue with it at the time, especially if the target is a more junior practitioner. Your intervention may help to reduce the impact on the target and show them that the senior members of the profession are prepared to stand up and be counted in support of our more junior members.”

Bullying can come from other professionals as well, as can sexual harassment, and there is advice about this on the Ethics & Practice Hub as well.

See also, in Counsel Magazine:

Media and Communications

On 18 February the House of Commons Digital, Culture, Media and Sport Committee published its Disinformation and ‘fake news’: Final Report (HC 1791). It follows an 18-month inquiry covering

“individuals’ rights over their privacy, how their political choices might be affected and influenced by online information, and interference in political elections both in this country and across the world — carried out by malign forces intent on causing disruption and confusion.”

At times the committee has had to use its powers to order people to give evidence and to obtain documents sealed in another country’s legal system. It has worked with politicians and parliaments from other countries, sharing the “worldwide appetite for action to address issues” relating to malign disinformation, propaganda and political interference. One of the problems has been the willingness of many (at all levels of intelligence and social status) to

“accept and give credence to information that reinforces their views, no matter how distorted or inaccurate, while dismissing content with which they do not agree as ‘fake news’,”

with a consequent “coarsening” of public debate. What needs to change, the report’s summary says, is

“the enforcement of greater transparency in the digital sphere, to ensure that we know the source of what we are reading, who has paid for it and why the information has been sent to us.”

The big tech companies, it says, “must not be allowed to expand exponentially, without constraint or proper regulatory oversight.” There is particular criticism of Facebook, whose founder Mark Zuckerberg showed “contempt towards the UK Parliament” in refusing to appear before the committee’s hearing:

“Facebook’s handling of personal data, and its use for political campaigns, are prime and legitimate areas for inspection by regulators, and it should not be able to evade all editorial responsibility for the content shared by its users across its platforms.”

The report (which we have not read in full) concludes with a number of recommendations, including:

  • a compulsory Code of Ethics, similar to Ofcom’s broadcasting code, overseen by an independent regulator with statutory powers
  • protection of “inferred data” (derived from monitoring a person’s online activity and preferences) as personal information under the law
  • a levy on tech companies operating in the UK to support the enhanced work of the ICO
  • a comprehensive audit by the Competition and Markets Authority (CMA) of the operation of the advertising market on social media
  • updating electoral law to reflect changes in campaigning techniques, and the move from physical leaflets and billboards to online, micro-targeted political campaigning
  • cross-party work by all political parties with the ICO, the Cabinet Office and the Electoral Commission to improve transparency over the use of commonly held data
  • welcome and support for the Cairncross Review report on safeguarding the future of journalism, and the establishment of a code of conduct to rebalance the relationship between news providers and social media platforms
  • digital literacy as a fourth pillar of education, alongside reading, writing and maths.

For commentary on the report, see Inforrm’s blog: DCMS final report: Disinformation and “fake news”, Some initial thoughts, by Zoe McCallum.

Dates and Deadlines

Institute of Advanced Legal Studies, Thursday 28 February, 5 to 6.45 pm

‘Mind the Gap: a blueprint for a new regulatory framework that effectively captures citizen journalists’ is the full title of a seminar in which Peter Coe (ILPC Research Associate, Anthony Collins Solicitors LLP) will argue that citizen journalism, facilitated by the Internet and social media, is no longer an outlier of free speech, but is now a central component of the media, and public discourse, for which there is currently no effective means of regulation.

Other speakers include Dr Paul Wragg, Associate Professor of Law, University of Leeds School of Law; Dr Laura Scaife; and Dr Richard Danbury, ILPC Associate Research Fellow and Associate Professor of Journalism, De Montfort University. The chair will be Dr Nóra Ni Loideain, Director of the Information Law and Policy Centre, Institute of Advanced Legal Studies. Advance booking required: here.

Middle Temple Library, Monday 4 March, 6 to 8 pm.

This event looks at retained EU legislation — what it is, where you can find it and why it is so significant. Speakers include Chair of the Bar’s Brexit Working Group & Leader of the European Circuit, Hugh Mercer QC, representatives from the National Archives who are tasked with archiving EU law, and representatives from Thomson Reuters.

If you would like to attend, please contact the Library at library@middletemple.org.uk. Places are limited.

Armada House, Telephone Avenue, Bristol, Friday 8 March, 2 to 6 pm.

“Women in Law — Support, Retention, Progression.” An afternoon of inspirational speakers to celebrate International Women’s Day. Further details here. (NB seems to be fully booked now.)

Deadline for applications is 20 March 2019.

The Supreme Court of the United Kingdom invites applications for up to 11 Judicial Assistants to support the work of the Justices. Fixed term contracts will start on Monday 9 September 2019 and finish on 31 July 2020. Applicants must be solicitors, barristers or advocates qualified in one of the UK jurisdictions, having completed a training contract or pupillage by the start of the appointment. Candidates can apply with CV and a supporting statement demonstrating how the key skills and behaviours are met. Full details here.

And finally…

an embedded picture shows a proper ‘baby barrister’: Congratulations!

That’s it for this week! Thanks for reading. Watch this space for updates.

This post was written by Paul Magrath, Head of Product Development and Online Content. It does not necessarily represent the opinions of ICLR as an organisation.

The ICLR, set up by members of the judiciary and legal profession in 1865, publishes The Law Reports, The Weekly Law Reports and other specialist titles.