It was an entirely valid question: “hey so why do the stupid m&m machines have facial recognition?” Below it, two photos posted on Reddit this past February showed an error message on a vending machine screen at the University of Waterloo. It looked like the machine’s facial recognition software wasn’t working. The Reddit thread helped spark a campus newspaper article that then set off a flurry of national and international headlines: “Ontario university students freaked out about facial recognition vending machines”; “Vending machines got caught peeping on snackers.”
There’s something disconcerting about a sophisticated piece of surveillance technology deployed for something as banal as selling candy. Invenda, the Swiss company behind the machines, issued a statement to the CBC saying that no cameras were inside the machines and that the software was designed for facial analysis, not recognition—it was there to “determine if an anonymous individual faces the device, for what duration, and approximates basic demographic attributes unidentifiably.” (The CBC report pointed out that Invenda’s CEO had used the term “facial recognition” in a previous promo video.) The Waterloo Reddit thread opened up a can of other questions. Why would a company collect this information to begin with? How many of the mundane transactions that make up our daily lives are being mined for intimate biometric data without our knowledge or approval? And how far will this technology go?
Police forces and border security commonly use facial recognition software to identify people deemed security threats. New surveillance tech is coming online amid increasing concerns about retail theft. Critics argue stats about retail crime are murky—some companies amp up anti-theft measures without releasing info on how much they’re actually losing. And sometimes the data is flawed: last December, the National Retail Federation, a US lobby group, retracted its claim that organized retail crime was responsible for half of the $94.5 billion (US) in inventory losses retailers reported in 2021.
Facial recognition as an anti-theft measure is surreptitiously being used in malls and big-box stores. Last year, a report from British Columbia’s privacy commissioner found that four Canadian Tire locations used highly sensitive facial recognition technology to capture biometric data from customers between 2018 and 2021 without adequately notifying them or asking for their consent. In all, twelve stores in the province had used the technology, justifying its deployment as a measure for theft monitoring and staff safety. In 2020, an investigation by multiple privacy watchdogs flagged that Cadillac Fairview had used facial recognition software in twelve of its malls to convert 5 million images into numerical representations of faces, without consent, from “inconspicuous” cameras placed inside digital information kiosks. Cadillac Fairview has described its use of the software as a safety and security measure. But the placement of the cameras feels at odds with this assertion: Do thieves and other rabble-rousers regularly queue up in front of those finicky mall maps?
The UK is so taken with facial recognition that, in April, the government announced plans to fund vans that could scan people walking along high streets—in other words, public spaces—in search of shoplifters. Last March, CBS reported that Fairway supermarket locations in New York City had started using biometric surveillance to deter theft, alerting shoppers with signs at store entrances that the chain was harvesting data such as eye scans and voice prints. Some customers hadn’t realized they were being watched. One told the outlet: “I noticed the cheese sign and the grapes, but not the surveillance, yeah.” Last year, New York Times reporter Kashmir Hill found facial recognition software in use at a Manhattan Macy’s department store as well as at Madison Square Garden. She tried to accompany a lawyer involved in litigation against the arena’s parent company to a Rangers-versus-Canucks game there. The lawyer was promptly recognized by the tech; security was alerted, and a guard kicked her out.
It’s somewhat ironic that large corporations, seemingly concerned with theft, are harvesting individuals’ biometric data without their knowledge. But facial recognition and analysis provide another perk to companies: opportunities to amass more data about us and sell us more stuff. For example, FaceMe Security, a software package patented by CyberLink, promises retailers intel through a face-scanning security and payment system. It can provide data on who is buying what and when, insights into a customer’s mood during the purchasing process, and age and gender information—all to better time potential sales pitches and other demographic-specific marketing possibilities.
Those who champion the use of biometric data in retail spaces frequently speak of personalization, efficiency, and “frictionless” transactions for consumers, a euphemistic script selling these changes as inevitable and good. The description for one episode of Stories from the Frictionless Future of Payments, a Mastercard podcast, includes the line “Would you prefer to pay using touch, feelings, or your smile today?” In a 2020 op-ed for the magazine Retail Insider, the CEO of a facial recognition software company wrote that the use of facial recognition for point-of-sale systems will become the new norm in Canada. Customers presumably won’t have to engage in the laborious practice of taking out their credit card or phone to pay for something; facial recognition could simply link their face to a store account.
Anyone who has struggled with a self-checkout kiosk while another customer whizzes through a register operated by an actual human being may have doubts about technology that’s meant to be friction free. I do not want to pay for things with my face. I don’t want surveillance technology to pick up on my depressed mood over the increasingly worrying state of the world and offer me a discount on cookies. I like cookies. A lot. But not enough to trade in a dollar-off promo for, say, a retina scan.
Companies already have plenty of intel on us. Loyalty programs such as PC Optimum track shopping behaviour. Store websites can track us online, while search engines monitor our queries, then adjust ads based on what we look up. Credit card companies nudge us into swiping plastic more frequently by offering rewards, travel points, and other perks; some have been accused of selling that transaction data to third-party companies, again without consent, since information about race, geography, age, and education can help marketers push out highly specific, targeted ads.
Drunk on data, businesses now want more access to our bodies in the physical realm. And while they’ll use carrots, like promotions, special offers, and the promise of breezier shopping experiences to get us on board with facial recognition, I’m more worried about the sticks.
One clear concern is how this technology will impact marginalized and racialized people, who are often harassed and racially profiled while trying to perform basic transactions, like going to the bank or shopping for food. Facial recognition has been shown to worsen inequities in policing. Several Black people have filed lawsuits against police forces across the US after being misidentified by facial recognition technology. In one case, a woman who was eight months pregnant was suspected of a carjacking and wrongfully arrested. In another, a man was arrested in Georgia after a detective took surveillance video from a Louisiana consignment store where a stolen credit card had been used and ran it through facial recognition technology. The software generated a match, the lawsuit alleges, and an arrest warrant was issued; the plaintiff spent several days in jail trying to sort out what crime he was suspected of having committed in a state he says he hadn’t even visited. Yet now there’s a push to put this technology in the hands of retailers and mall cops.
With wider adoption of facial analysis, it would become increasingly difficult for Canadians to opt out of this invasive technology. About 18 percent of Canadians live rurally—which often means they’re patronizing a handful of local stores. What happens when you combine a lack of choice with software errors? Let’s say you live in a small town with one grocery store. Facial recognition software marks you as a threat, and security kicks you out of the store. Will you be banned from that store forever? Will you be banned across the entire chain and any other space the parent company owns? Maybe one of the bots that are increasingly operating customer service lines will hear you out.
A lack of options is also an issue for people outside of rural areas—consider Canada’s grocery industry, which is dominated by a small group of players. If a few big retailers opt in, facial recognition could influence a lot of our shopping choices.
In 2023, Canada’s privacy commissioner released draft guidance on biometrics for public institutions, noting that while they promise to “deliver faster services,” there are serious concerns, particularly when it comes to leaving individuals vulnerable to fraud and identity theft.
In the US, the Federal Trade Commission has also warned of the technology’s potential for discrimination, threats to both privacy and civil rights, as well as concerns that malicious actors could hack their way into sensitive information, including whether a person has “accessed particular types of healthcare, attended religious services, or attended political or union meetings.”
In 2019, the Guardian reported that the fingerprints, facial recognition data, and other personal information collected by the security platform BioStar 2, which is used by banks, UK police, and defence firms, were hacked and put onto a publicly accessible database. In 2021, Forbes reported on a “new wave of biometric crimes” that included hacking facial recognition data to commit tax fraud and faking fingerprints using 3D printers. And a data breach in Indonesia in 2023 led to the biometric data of nearly 35 million citizens going up for sale on the dark web.
Absent any robust regulations, there’s little individuals can do to avoid the creep of facial recognition. Rage against the vending machines prompted the University of Waterloo to remove all twenty-nine Invenda devices from campus. The Record reported that, following the complaints, the office of the privacy commissioner of Ontario has “open files” on the issue. A spokesperson for the office of the privacy commissioner of Canada also told the newspaper it was looking into the matter.
Many of us more or less get that we’re giving up our privacy when we go online. We have willingly uploaded our faces to social media networks and shared our thoughts and feelings in online spaces in the spirit of connecting with others. We’ve built our own avatars, and we’ve barely glanced at the “terms and conditions” before accepting them. But the surreptitious use of facial analysis in physical spaces feels like a line is being crossed: as though existing offline, whether buying books or socks or grapes in a brick-and-mortar store or just considering a transaction near a vending machine, were an affront to the data-mongers who always need more intel to commodify. In the new norm they’re pushing for, there will be fewer and fewer spaces where we aren’t being watched, analyzed, and monetized. Even if we try to outsmart the marketing schemes, our faces might give us away.