Article | FINEX Observer

Facial recognition: Balancing security and privacy in the retail sector

By Conal McCurry and Lauren Baldwin | April 2, 2024

Facial recognition helps to reduce the risk of fraudulent activities, but it also poses significant security and privacy concerns.

Facial recognition software is one of the latest biometric identifiers that companies are relying upon to streamline business operations. While facial recognition is being used to combat shoplifting, it also poses significant security and privacy risks.

Loss of product in the retail industry, other than through sales, is known as shrinkage. Shrinkage can comprise theft, accounting errors or broken items. While the average 2023 shrink rate was 1.6%, the most commonly reported band was 2% to 2.99%, with 22.6% of polled retailers reporting levels in that range. Overall, theft makes up 65% of a retailer’s shrink rate, but can be as high as 70% for some retail subsectors. Also alarming is the statistic that retailers saw the number of shoplifting incidents involving violence increase by 35%.

Given that retailers strive to provide a safe environment for workers and customers, and that employees are often not trained to intervene in violent thefts, biometric identifiers can be a key way to mitigate theft without endangering employees.

How does facial recognition software work?

Security cameras in retail stores capture a customer’s face, and tracking technology compares the individual’s biometrics against a database, which can include information on known criminals or previous shoplifters. The software can also be used for more benign purposes, such as tracking foot traffic and user behavior or identifying VIP or loyalty program customers.

The software’s matching stems from two different approaches. The first is 1:1 matching, which takes information collected from a specified individual and compares it to other information from the same individual. We see 1:1 matching commonly: it is the verification process in place when our personal identification is compared to our actual appearance as we attempt to enter a venue or location.
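To make the mechanics concrete, the sketch below shows what 1:1 verification can look like once a face image has been reduced to a numeric embedding by an upstream recognition model. The embedding size, similarity threshold and vectors are illustrative assumptions, not any vendor’s implementation.

# A minimal sketch of 1:1 (verification) matching, assuming an upstream
# face-recognition model has already converted each face image into a
# fixed-length embedding vector. Embeddings and threshold are illustrative.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """1:1 match: does the live capture match the claimed identity's stored template?"""
    return cosine_similarity(probe, enrolled) >= threshold

# Example: the embedding on file for a claimed identity vs. a live camera capture.
rng = np.random.default_rng(0)
enrolled_template = rng.normal(size=128)                        # template stored at enrollment
live_capture = enrolled_template + 0.05 * rng.normal(size=128)  # simulated same-person capture
print(verify(live_capture, enrolled_template))                  # True when similarity clears the threshold

Verification succeeds only when the similarity between the live capture and the stored template clears a preset threshold; raising that threshold trades fewer false matches for more false rejections.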

Alternatively, 1:many matching can be used to compare information collected from a specific individual against a larger database. A common use of 1:many matching is what we have seen within casinos in recent years, where it is used to check a potential cheater against a library of known cheaters. While both approaches have distinct advantages, the technology also creates significant areas of concern.
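The 1:many case differs only in that the captured face is scored against an entire gallery, such as a watchlist, rather than a single stored template. The sketch below uses the same cosine-similarity scoring as the 1:1 example; the identities, embeddings and threshold are again assumptions for illustration.

# A minimal sketch of 1:many (identification) matching against a small gallery.
# Identities, embeddings and the threshold are assumptions for illustration only.
import numpy as np

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """Return the best-matching gallery identity above the threshold, or None."""
    best_id, best_score = None, threshold
    for identity, template in gallery.items():
        score = float(np.dot(probe, template) /
                      (np.linalg.norm(probe) * np.linalg.norm(template)))
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Example: a small watchlist of previously flagged individuals.
rng = np.random.default_rng(1)
watchlist = {f"subject_{i}": rng.normal(size=128) for i in range(5)}
probe = watchlist["subject_3"] + 0.05 * rng.normal(size=128)  # simulated repeat visit
print(identify(probe, watchlist))  # "subject_3"; an unknown face would return None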

What are the concerns?

The use of facial recognition software to prevent theft can pose several different concerns. One area of concern is customer consent to the collection and retention of customer biometrics; another is the accuracy of the facial recognition software being used[1].

When it comes to consent, state statutes vary widely on what exactly is required, and the GDPR can also come into play for foreign nationals. Companies utilizing facial recognition software also need to be cognizant of preset functions governing whom the software is intended to collect information from. If the software is programmed to monitor a select group based on specific parameters, the organization could leave itself open to allegations of discrimination or, more specifically, of wrongful detention.

Organizations that use generative artificial intelligence (AI) need to be aware of how data is collected and leveraged.

Furthermore, we urge all organizations to stay up to date on the current regulatory environment. The ever-changing regulatory landscape leaves the door open to potential litigation stemming from the use of facial recognition software and the collection of biometric data. While posted notices and disclosures to all individuals where this software is used are a great place to start, such blanket notice might not be sufficient for every state in which an organization does business.

Additionally, if information is collected from foreign nationals, companies could be subject to fines and penalties for violations of the GDPR and other similar statutes.

When it comes to the use of AI, there seems to be a common misconception that because the technology teaches and learns on its own, frequent performance testing is not needed. Organizations utilizing AI should establish a regular cadence for assessing the performance of the AI and the algorithms being used. The National Institute of Standards and Technology (NIST) continues to publish its Face Analysis Technology Evaluation (FATE) quality assessment reports, which provide a scoring grid of all current algorithms submitted to NIST for review.
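One practical way to operationalize that cadence is to periodically re-score the system on a labeled set of genuine (same-person) and impostor (different-person) comparisons and track the resulting error rates over time. The sketch below computes the false non-match and false match rates that NIST-style evaluations report; the scores, volumes and threshold are simulated for illustration only.

# A minimal sketch of a recurring accuracy check, assuming the organization keeps
# labeled comparison scores from a test batch. The metrics mirror the false match /
# false non-match rates reported in NIST-style evaluations; scores are simulated.
import numpy as np

def match_error_rates(genuine_scores, impostor_scores, threshold):
    """FNMR: genuine pairs wrongly rejected; FMR: impostor pairs wrongly accepted."""
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    fnmr = float(np.mean(genuine < threshold))
    fmr = float(np.mean(impostor >= threshold))
    return fnmr, fmr

# Example run on a simulated quarterly test batch.
rng = np.random.default_rng(2)
genuine = rng.normal(0.75, 0.10, 1000)    # same-person comparison scores
impostor = rng.normal(0.10, 0.10, 1000)   # different-person comparison scores
fnmr, fmr = match_error_rates(genuine, impostor, threshold=0.6)
print(f"FNMR: {fnmr:.2%}  FMR: {fmr:.2%}")

Watching these two rates drift between test batches gives an early signal that the algorithm, the camera environment or the enrolled population has changed enough to warrant retuning.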

What are the implications for cyber coverage?

As there has been an uptick in the collection and retention of personal information, whether currently classified as biometric or not, many carriers have begun addressing this exposure. It is becoming more common for markets to broadly exclude the wrongful collection and retention of personal information. In certain instances, exclusionary language may be broader than what current legislation requires and may restrict coverage if facial recognition software is being used without proper consent.

It is imperative for organizations to keep their brokering partners aware of their use of any new biometric collection hardware or software so that coverage can be assessed for potential claims or losses resulting from use of this technology.

Sources

  1. “Facial Recognition Technology and Privacy Concerns,” ISACA, December 21, 2022.

Disclaimer

Willis Towers Watson hopes you found the general information provided in this publication informative and helpful. The information contained herein is not intended to constitute legal or other professional advice and should not be relied upon in lieu of consultation with your own legal advisors. In the event you would like more information regarding your insurance coverage, please do not hesitate to reach out to us. In North America, Willis Towers Watson offers insurance products through licensed entities, including Willis Towers Watson Northeast, Inc. (in the United States) and Willis Canada Inc. (in Canada).

Authors

Senior Cyber Broker, FINEX

Senior Vice President, Southeast Region Co-Leader
FINEX Cyber
