The UK public is becoming fairly used to facial recognition being used to police the streets of London. However, as the technology shifts from the public sector into the private sector, questions are being raised about who is now policing Britain.
This weekend, a new controversy emerged. The Observer reported that Home Office officials had drawn up secret plans to lobby the Information Commissioner’s Office (ICO) and speak in favour of facial recognition cameras as a solution to rising retail crime. As a result, the Home Office is effectively being accused of privately backing facial recognition technology for retail premises as a means of tackling shoplifting. The Home Office has yet to make any official statement on these allegations.
All of this followed an alleged meeting in March between UK policing minister Chris Philp, Home Office officials and the private firm Facewatch.
Facewatch was founded in 2010 by Simon Gordon, owner of the well-known and characterful London wine bar Gordon’s. Annoyed by the presence of pickpockets and the absence of any police, Gordon built his own security system: he installed a camera at the entrance to the bar, created a private “watch-list” of people he believed to be thieves and, using a database of biometric information, began to identify when a visitor was a suspected match against that list.
Today, that security system, or surveillance system depending on how one perceives it, has been scaled up and sold to several large retailers, which can upload photographic images of people suspected of shoplifting. Frasers Group, the holding company of Sports Direct, is on record as a user of the Facewatch camera system, despite more than 50 MPs putting their names to a letter opposing its use when the deployment was announced.
The UK’s Office of the Biometrics and Surveillance Camera Commissioner (OBSCC) awarded Facewatch certification of compliance with the Surveillance Camera Code of Practice for its use of live facial recognition, and it was ruled in April of this year that Facewatch was fully compliant with UK data protection law. Be that as it may, there is growing concern that the public has not been asked to engage in the creation of such codes, and that there is still no sufficient regulatory framework or code of practice that takes the public’s views into account.
Recognising the importance of the evolving use of biometrics, and these growing concerns, the Ada Lovelace Institute undertook a three-year programme of public engagement, legal analysis and policy research to explore the complex ethical challenges raised by biometric technologies, and to consider what governance around biometrics could shore up public legitimacy. It published its report in June 2022.
It showed that context matters greatly to people’s comfort with biometrics and that, across various use cases, people had concerns about individual and societal harms. One recommendation was the need for a stronger “legal framework with independent oversight and minimum standards to prevent harm, create accountability and transparency, and ensure proportionate use”. The independent legal review the Ada Lovelace Institute commissioned, led by Matthew Ryder KC, also found that “Current governance structures and accountability mechanisms are fragmented, unclear and not wholly effectual. The regulatory body governing police use of biometrics is not adequately empowered”.
Furthermore, the Review found that human rights law does not adequately cover non-public-sector uses of biometric technologies.
This month, Facewatch’s company website features a statistic claiming a 17% reduction in violence in stores using Facewatch, and states that overall crime has increased by 1.4% in stores carrying the system, a very modest figure compared with the 44.1% rise across retail stores that do not. The company is also keen to reassure visitors to the site that Facewatch complies with the Data Protection Act. But this does not address the findings of the Ada Lovelace report, which recommended that a regulator should assess any biometric technologies planned for use in publicly accessible spaces, in their proposed contexts and prior to deployment.
Moreover, any assessment should be carried out based on clearly established standards of accuracy, reliability and validity.
In the modern age, it is very easy for people to agree to all kinds of invasive technologies because they are sold to the public as convenient or cost-effective. Once they are installed, however, their removal is almost impossible. These commercially owned facial recognition systems in retail premises are yet another example of the outsourcing of local policing to technology, purely because it is more convenient and effective than the human police officers currently in short supply on the streets and in the shops. But can societies keep relying on the latest technologies to do the job that our public sector services and socio-political institutions should be doing?
Many technology watchers are only too aware of the fiasco around Clearview AI, a facial recognition company that collected 20 billion images of people’s faces, along with associated data, from publicly available information on the internet to create an online database without those people’s consent. The company then offered police the ability to upload an image of a person to its app and check for a match against the database. The ICO fined the company £7.5m for breaching data protection rules, and Clearview no longer operates in the UK.
It is, however, a stark reminder that private companies want to enter this security, or surveillance, space for a variety of reasons. The broader societal question for all of us, “Who polices Britain?”, is getting harder to answer by the day.
Facewatch has been contacted for comment. At the time of publication, none had been received.