What just happened?
A ground-breaking case has just begun in the Court of Appeal. The case concerns automated facial recognition technology in public places and whether its use breaches privacy rights. The High Court ruled that South Wales Police did not breach the rights of a man whose face had been scanned by a camera,[1] but that judgement is now being appealed. The case was the first legal challenge to police use of automated facial recognition.
What does it mean?
This case could set a precedent on whether facial recognition cameras in public places breach an individual’s privacy. There is also concern about the extent to which these cameras will be used. Dan Squires QC (acting for Liberty and a Cardiff resident), in his submission to the court, stated: “it is not difficult to imagine that police forces nationally could soon – if they cannot already – have access to photographs of the vast majority of the population.”[2] This isn’t just an issue for South Wales: the Metropolitan Police announced in January 2020 that live facial recognition cameras would be used operationally on London streets for the first time.
In the 2018 case of Big Brother Watch v UK[3], the European Court of Human Rights ruled that the UK’s intelligence-sharing arrangements did not violate the right to privacy, although aspects of its bulk surveillance regime did. Big Brother Watch have openly opposed the expansion of national surveillance and stated that the deployment of live facial recognition represented "an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK".[4] Big Brother Watch’s director stated, "It flies in the face of the independent review showing the Met's use of facial recognition was likely unlawful, risked harming public rights and was 81% inaccurate."[5]
How does it affect the legal industry?
The judgement in Edward Bridges v South Wales Police will have a significant impact on the legal industry. If the appeal succeeds, law firms can expect a large influx of individuals pursuing similar claims that their privacy has been breached. Here at LawMiracle, we believe it is unlikely that Squires and his client will be successful. A successful appeal would open the floodgates, allowing anyone to bring a claim for a violation of privacy; this, in turn, would force a reduction in surveillance.
In line with technological advances, we as a society seem to be updating our views on privacy: with social media and mobile phones, far more information about individuals exists than in previous generations. We must also consider the impact this will have on policing. The police are bound to utilise technology, and facial recognition cameras allow for the effective identification of individuals. The ability to locate a subject without deploying officers to search would be a great relief to the police and their resources; however, if this case is decided in favour of Bridges, that innovation will not be in use.
On the other hand, if the appeal is unsuccessful, we will have to see the extent to which an individual’s biometrics can be accessed legally. This is a very controversial area, and many groups, along with Big Brother Watch, have spoken out in opposition to the use of this technology. It is scary to think about the amount of information the police and government already hold on individuals, and with the increase of biometric use in day-to-day life, how far could this potentially go?
This issue doesn’t just concern the police; recent years have also seen a vast increase in the use of this technology by private companies, both in CCTV facial recognition and in consumer devices such as phones that unlock with a facial scan. This could disadvantage the legal sector, as privacy rules could become much more difficult to navigate. Furthermore, the success of the legal sector is directly tied to that of the business world: if this technology is restricted by the current case or later ones, will that stunt the success of businesses that lose access to it, and in turn hinder the legal communities that serve them?
Lightfoots LLP has become the first UK firm to offer a digital facial recognition identification service for clients, suggesting that facial recognition technology could permeate the legal industry. The technology is regulated by the Data Protection Act 2018, which gives anyone who is scanned the right to be told how their image has been used.[6] It has only very recently been used on a national scale, so its implications for privacy law are still unknown; but if the appeal against South Wales Police’s usage fails, the Court will essentially be ruling that facial recognition technology in public places does not breach privacy rights, entitling the police to hold potentially millions of pictures of private citizens.
On 15 August 2019, Elizabeth Denham, the Information Commissioner, released a statement in which she said: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding.”[7]
She went on to add that she is “deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector.”[8]
No legal framework has been established specifically for this technology, which is a daunting thought. Currently, there are few regulations governing live facial recognition, and this is potentially a significant threat to the privacy of UK citizens.
Written by Lucy Stone
Assessing firms:
Bird & Bird LLP, Bristows LLP, Fieldfisher, Hogan Lovells International LLP, Linklaters LLP, Allen & Overy LLP, Baker McKenzie, Dentons, DLA Piper, Eversheds Sutherland LLP, Latham & Watkins, Taylor Wessing LLP.
References:
[1] Edward Bridges v Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)
[2] Owen Bowcott, ‘UK’s facial recognition technology “breaches privacy rights”’ (The Guardian, 23 June 2020)
[3] Big Brother Watch and Others v United Kingdom App no 58170/13 (ECtHR, 13 September 2018)
[4] ‘Met Police to deploy facial recognition cameras’ (BBC News, 30 January 2020)
[5] Ibid
[6] Data Protection Act 2018 (supplementing the GDPR)
[7] Information Commissioner’s Office, ‘Statement: live facial recognition technology in King’s Cross’ <https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/08/statement-live-facial-recognition-technology-in-kings-cross/>
[8] Ibid
Disclaimer: This article (and any information accessed through links in this article) is provided for information purposes only and does not constitute legal advice.