THE EXPANSION OF FACIAL RECOGNITION technology (FRT) in the UK is happening at a rapid rate, with very little public debate, scant parliamentary discussion and, critically, without any clear laws to govern it. It’s long past time for the public to be put in the picture about what technology without proper regulation could lead to.
Chris Philp, the policing minister, wants police forces to ramp up their use of FRT. Up to now, some forces have been trialling live FRT only at specific events, such as music festivals, demonstrations and the Coronation. However, the police have been using FRT retrospectively for some time: since 2014, they have been applying it to CCTV images of criminal acts, drawing on the 12 million custody images held in the Police National Database.
Live facial recognition (LFR) takes the technology a step further, using it in real time to identify suspects. Many will see this as efficient, while others will consider the loss of individual anonymity the worst consequence of facial tracking in public spaces. But the potential harm goes much deeper. To understand the implications properly, the technology must be placed in the context of the move towards predictive policing and the fallibility of machine programs.
SEEING THEM SEEING YOU
BOTH THE PRIVATE AND PUBLIC SECTORS PLACE FAITH in the future of behavioural