Uncertainty over the deployment of facial recognition tech for police use continues

Face recognition software has previously raised concerns over gender and racial bias | Alamy

A House of Lords committee has heard conflicting views over the use of facial recognition technology by police authorities.    

During a recent meeting, members of the Justice and Home Affairs Committee heard from stakeholders on the potential uses of the technology.

However, the panel, made up of representatives from two police services as well as from the public and private sectors, disagreed over levels of transparency and the technology’s potential to breach an individual’s privacy – both focal points of concern among committee members.

Evidence was put forward by spokespeople from four different organisations, with South Wales Police and the Metropolitan Police Service (MET) among those invited to take part in the discussion panel. Both forces began trialling the systems over six years ago.

Karen Yeung, interdisciplinary professorial fellow in law, ethics and informatics at Birmingham Law School & School of Computer Science, pointed out that too much “transparency” may hinder the potential benefits of the technology, as the system might be “gamed” – in other words, those who are of interest to police might avoid the areas using the tech.

However, Mark Travis, senior responsible officer for facial recognition technology at South Wales Police, said that such transparency might act as a “preventative measure”. He explained that it may stop serious harm, as awareness of its use might deter individuals from carrying out dangerous acts, knowing they could be easily identified among crowds.

Yeung also called for a “clear” legislative framework going forward. She pointed out that although “advances are quite fast”, there are still “massive operational challenges” in progressing from a match alert to the arrest of a person “for genuine reasons that match the test of legality”.

These surveillance systems are formed of three elements: live footage from CCTV cameras, software able to scan faces and convert them into numbers, and a watchlist of facial images which have likewise been converted into numbers. Most systems generate a similarity score between the two faces – the one detected by the software and the one on the watchlist. It is this score which has caused controversy recently, as concerns have been raised that the systems might exhibit gender and racial bias.
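
For illustration, the matching step these systems rely on can be sketched in a few lines of Python. This is a minimal, hypothetical example of scoring one detected face against a watchlist of numeric face representations; the embedding size, the cosine-similarity measure and the alert threshold below are assumptions made for the sketch, not details of any system discussed at the committee.

```python
import numpy as np

# Minimal sketch of watchlist matching. All names, the 128-dimension
# embedding size, and the 0.6 threshold are illustrative assumptions,
# not details of any deployed police system.

EMBEDDING_DIM = 128      # hypothetical length of a face's numeric representation
MATCH_THRESHOLD = 0.6    # hypothetical cut-off for raising a match alert


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score between two face embeddings (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(face: np.ndarray, watchlist: dict):
    """Return the best-scoring watchlist entry if it clears the threshold."""
    best_id, best_score = None, -1.0
    for person_id, listed_face in watchlist.items():
        score = cosine_similarity(face, listed_face)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= MATCH_THRESHOLD:
        return best_id, best_score   # a match alert would go to a human reviewer
    return None, best_score          # no match: the detected face would be discarded


# Random vectors stand in for embeddings produced by face-scanning software.
rng = np.random.default_rng(0)
watchlist = {"subject-A": rng.normal(size=EMBEDDING_DIM)}
detected = rng.normal(size=EMBEDDING_DIM)
print(match_against_watchlist(detected, watchlist))
```

Even in this toy form, the threshold choice trades off false matches against missed matches; the bias concerns raised to the committee arise when such scores are systematically less reliable for some demographic groups than others.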

The EU recently banned police and national security bodies from using live biometric data, with some narrow exceptions. It was an announcement Baroness Sharmishta Chakrabarti referred to when raising her concerns over the democratic, ethical and legal justification for trialling the technology, especially in relation to its compliance with Article 8 (the right to privacy).

Lindsey Chiswick, director of intelligence at the MET, said that although there is no specific legislative authority for the deployment of the tech, the MET's legal mandate takes common law as "the primary law", which she said had been found sufficient to deploy the technology in accordance with the ruling in the Bridges case.

The Bridges case dates to 2019, when civil rights campaigner Edward Bridges filed a claim against South Wales Police for its use of the technology. He argued it breached his right to privacy, data protection laws and equality laws.

Although the claim was initially rejected by the High Court, the Court of Appeal ruled in his favour in 2020, finding that Bridges’ right to privacy, under Article 8 of the European Convention on Human Rights, had been breached due to “too broad a discretion” being left to police authorities in applying the technology.

Addressing concerns over data and privacy, Chiswick also stated that images of faces with no match are automatically deleted, and that those reviewing the data see only a pixelated version of the faces.

The meeting follows a recent rise in the use of facial recognition by police forces in Scotland, after an investigation by The Ferret revealed that searches of the facial matching function on the police database increased from fewer than 1,300 in 2018 to more than 2,000 during the first four months of 2023.
