FACIAL RECOGNITION TECHNOLOGY COULD SOON BE EVERYWHERE – HERE’S HOW TO MAKE IT SAFER

Mon 31 July 2023:

The recent coronation of King Charles III was a high-profile example of facial recognition technology being used to monitor a crowd, but there are plenty of others. The technology is used by law enforcement across the UK and in other countries.

It’s now common in US airports. It’s being used to monitor refugees and identify dead bodies in Ukraine. Even Beyoncé fans have been subjected to it.

And there’s more to come. The UK government is reportedly planning to add facial recognition to the police’s body-worn devices, drones and numberplate cameras. It may soon be very difficult to leave your house without having your face scanned.

There are serious questions about whether the benefits of this technology outweigh the risks it poses to privacy and civil liberties. But steps could be taken to address the issues people are worried about.

Uses and limits

Facial recognition can be used by police to scan many faces in a crowd and compare them with a “watch list” of known criminals. This “live facial recognition” is used with the aim of reducing crime. It can also be used retroactively on recorded CCTV footage.

In the UK, the Protection of Freedoms Act 2012 provides a legal basis for the use of surveillance camera systems in a public place.

And according to the government’s surveillance camera code of practice, it’s justifiable to use facial recognition systems in decisions that could negatively affect people, such as whether to arrest them, so long as there is a human in the loop to supervise and make decisions.

So facial recognition systems, and those handling other types of biometric information, cannot be used for autonomous decision making, such as automatically tracking a suspect across multiple camera feeds.

Problems with facial recognition

But why should this be of concern to law-abiding citizens? Civil liberties groups argue facial recognition use in public places affects our privacy and freedom, particularly in terms of its ability to track individuals at mass gatherings and to potentially engage in racial profiling.

Security cameras have long captured us as we go about our daily lives. However, authorities being easily able to put a name to a face in the video footage is something we’re not so used to.

The technology creates a situation where many more people could get caught in the sights of the authorities than before. A person’s casual indiscretions or errors of judgement can now be easily tracked and linked to a name and address.

Those with a criminal record could be targeted in public based on their past, regardless of whether they intend to carry out any illegal activity. The technology could provide new opportunities for racial profiling, where authorities track or suspect people based on their background, rather than because of specific information about them.

Facial recognition could also be used against people with no criminal past or plans to commit a crime but who the police simply want to stop, such as protesters. The Metropolitan Police may have announced that facial recognition would not be used to target activists at the coronation, but they also provoked outrage for arresting anti-monarchy demonstrators who were later released without charge.

It’s also important to recognise that facial recognition technology still suffers from inaccuracies, which can result in false positive matches where an innocent person is mistaken for a known criminal.
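The scale of the false positive problem comes from base rates: when genuine matches are rare, even a seemingly accurate system can generate mostly false alarms. A short calculation with hypothetical numbers (the figures below are illustrative, not drawn from any real deployment) makes the point:

```python
# Illustration with hypothetical numbers: when true matches are rare,
# most alerts from an accurate-sounding matcher are still false alarms.
def alert_precision(crowd_size, watchlist_hits, tpr, fpr):
    """Fraction of alerts that are genuine matches."""
    true_alerts = watchlist_hits * tpr                   # wanted people correctly flagged
    false_alerts = (crowd_size - watchlist_hits) * fpr   # innocent people wrongly flagged
    return true_alerts / (true_alerts + false_alerts)

# Suppose 10 wanted people pass among 100,000 faces scanned, with a
# 90% true-positive rate and a 0.1% false-positive rate.
p = alert_precision(crowd_size=100_000, watchlist_hits=10, tpr=0.9, fpr=0.001)
print(f"{p:.1%} of alerts point at a wanted person")
```

With those assumed figures, only around 8% of alerts would be correct; the remaining alerts would all concern innocent passers-by.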

With facial recognition posing such perceived threats, it could have a chilling effect on free speech and demonstrations.

What can be done?

However, there are ways the technology could be used more safely. Law enforcement teams could perform a preliminary step – activity recognition or event detection – before they resort to facial recognition. This approach can help minimise the potential for privacy violations and false positive matches.

Activity recognition refers to the process of identifying and categorising human activities or actions based on CCTV or other sensors. It aims to understand and recognise the activities of individuals or groups, which can include standard activities such as running, sitting or eating.

On the other hand, event detection focuses on identifying specific events or occurrences of interest within a given context. Events can range from simple events like a car passing by or a person entering a room to more complex events like accidents, fights, or more unusual behaviour. Event detection algorithms typically analyse CCTV and other sensors to detect and locate events.

Hence, activity recognition or event detection should be the first step before applying facial recognition to a surveillance camera feed.
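The gating logic described above can be sketched in a few lines. This is a minimal illustration, not a real system: the `Frame` class, the event labels and the `process` function are all hypothetical placeholders standing in for the output of actual detection models.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """Stand-in for one camera frame plus event-detector output."""
    events: list[str]  # labels a hypothetical event-detection model produced

# Illustrative list of events that would justify escalating to identification.
EVENTS_OF_INTEREST = {"fight", "accident", "abandoned_object"}

def process(frame: Frame) -> str:
    # Step 1: run only anonymous event detection on every frame.
    if not EVENTS_OF_INTEREST.intersection(frame.events):
        return "no action - faces never scanned"
    # Step 2: only now is facial recognition invoked, on this frame alone.
    return "event detected - facial recognition authorised for review"

print(process(Frame(events=["walking", "sitting"])))   # ordinary activity: no scan
print(process(Frame(events=["fight"])))                # event of interest: escalate
```

The design point is that the privacy-invasive step never runs by default; it is reached only after a less invasive, anonymous check has found something worth investigating.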

Ensuring the data from cameras remains anonymous can also enable police to study the activities of people in the crowd while preserving their privacy. Conducting regular audits and reviews can ensure that the collected data is handled responsibly and in compliance with UK data privacy regulations.
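One common way to keep such analysis anonymous is pseudonymisation: replacing each tracked person’s identifier with a salted hash, so analysts can count and follow anonymous tracks without ever learning who they are. The sketch below assumes hypothetical camera track IDs; the approach itself is standard salted hashing from the Python standard library.

```python
import hashlib
import secrets

# A fresh random salt per session means pseudonyms cannot be reversed,
# or linked across days, once the salt is discarded.
SESSION_SALT = secrets.token_bytes(16)

def pseudonym(track_id: str) -> str:
    """Stable within a session, unlinkable across sessions."""
    digest = hashlib.sha256(SESSION_SALT + track_id.encode()).hexdigest()
    return digest[:12]

# The same track always maps to the same pseudonym within a session,
# so crowd movement can still be analysed.
a = pseudonym("camera3-track-0042")
b = pseudonym("camera3-track-0042")
assert a == b
print("pseudonymous ID:", a)
```

Because the salt is random and never stored, the mapping back to a real track – and so to a face or name – cannot be reconstructed later, which supports the kind of auditability the paragraph above describes.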

This can also help to address some of the concerns related to transparency and accuracy. By using activity recognition or event detection as a first step, it may be possible to give people more clarity – through signage, for example – about what exactly is going on during police surveillance in a public place.

It is the responsibility of the state to ensure the privacy and security of its citizens in order to foster a healthy society. But if facial recognition is implemented in a way that a significant proportion of citizens feel infringes their rights, it could create a culture of suspicion and a society where few people feel safe expressing themselves publicly.

Nadia Kanwal

Senior Lecturer, Computer Science, Keele University

Dr. Kanwal completed her master’s and Ph.D. degrees in Computer Sciences at the University of Essex, UK in 2013. As a principal investigator, she was honored with a prestigious Marie Skłodowska-Curie Research fellowship, which spanned three years and focused on an industry-led project involving secure and privacy-protected CCTV video storage and retrieval. Her research interests primarily lie in Computer Vision and Machine Learning, and she actively explores the application of machine learning techniques to advance solutions in healthcare, security, and vision-related domains.
Dr. Kanwal’s contributions to the field have been reflected in her publications, which have garnered significant support from the research community. Her work encompasses areas such as multimedia data security, visual privacy, low-level image features, virtual reality, EEG/ECG signal analysis, and pupillometry. Furthermore, she serves as a diligent reviewer for esteemed journals and conferences, maintaining an active role in the academic community.
Recognizing her accomplishments and expertise, Dr. Kanwal became a Senior Member of IEEE in 2019. Additionally, in 2022, she was honored with a senior fellow status by Advance HE, UK, further acknowledging her professional achievements.
