In the hours after Michael Brown’s parents learned that a St. Louis County grand jury had voted not to indict Ferguson police officer Darren Wilson, who shot and killed their son in August, the couple released a statement calling for law enforcement agencies across the US to adopt the use of body-worn cameras.
Already in place in many California municipalities and currently being field-tested in New York City, these battery-operated digital devices, advocates say, will not only calm potentially lethal encounters between police and civilians but also provide an irrefutable record of what transpired, thereby eliminating—or at least significantly reducing—the ambiguity arising from seemingly contradictory eyewitness accounts that did so much to fuel months of upheaval in Ferguson.
As I reported in last Saturday’s Globe and Mail, body-worn cameras are probably coming soon to a Canadian city near you. The Calgary Police Service is leading the way, with a rollout to 800 front-line officers set for later this year. Other cities, including Vancouver and Edmonton, have run pilot projects with these cameras; the Toronto Police Service will launch its own test early next year.
What makes the Calgary deployment noteworthy is that the CPS plans to use the body-worn cameras in conjunction with biometric facial recognition software—in effect, using the videos as a means of identifying suspects in criminal investigations. The coupling of those two technologies attracted the attention of Alberta privacy commissioner Jill Clayton, and with good reason: if body-worn cameras were originally intended to improve transparency and accountability, then using the technology as a way of augmenting police surveillance surely represents an instance of the sort of mission creep described in my current Walrus feature about the militarization of domestic policing.
Unlike many contemporary law enforcement technologies, facial recognition software—which identifies individuals by matching facial photos and can then link them to other databases of records—did not emerge from the military-industrial complex. According to Christopher Parsons, a post-doctoral fellow and the managing director of the telecommunications transparency project at the University of Toronto’s Citizen Lab, the broadest applications to date involve tranches of official photos maintained by government agencies that issue identification documents, such as passports and driver’s licenses.
In recent years, he adds, facial recognition software has become substantially more sophisticated. The advent of so-called 3-D recognition techniques allows the software to make matches between official posed photos and informal, un-posed ones—e.g., images posted on social media sites. What’s more, these biometric algorithms, which can “learn” to recognize faces based on composites developed from multiple images, are no longer restricted to government security applications. Facebook has a facial recognition app, and at least two developers have built apps for Google Glass that purport to be able to run facial images through picture databases from dating sites or sex offender registries, Forbes reported earlier this year.
To date, this kind of cross-referencing hasn’t produced great results, says Parsons, although he adds that the latest generation “is better than it used to be.”
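For readers curious about the mechanics, the kind of matching Parsons describes can be sketched in a few lines of Python. The example below is purely illustrative: the embed_face stand-in, the 128-number embedding size, and the 0.6 similarity threshold are assumptions of mine, not the workings of any particular police or commercial system. It shows only the general recipe: turn each photo into a vector of numbers, average several official photos into a single “composite” template, then score an informal, un-posed probe photo against that template.

```python
import numpy as np

EMBEDDING_DIM = 128  # a common embedding size; chosen here purely for illustration


def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained face-embedding model.

    A real system would detect and align the face, then pass the crop
    through a model that outputs a fixed-length vector. Here we derive
    a placeholder vector from the pixel values instead.
    """
    seed = int(image.sum()) % (2 ** 32)
    rng = np.random.default_rng(seed)
    vec = rng.normal(size=EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)


def composite_template(official_photos: list) -> np.ndarray:
    """Average the embeddings of several official photos (say, a passport
    photo and a driver's-license photo) into one composite template."""
    embeddings = np.stack([embed_face(img) for img in official_photos])
    mean = embeddings.mean(axis=0)
    return mean / np.linalg.norm(mean)


def is_match(template: np.ndarray, probe_photo: np.ndarray,
             threshold: float = 0.6) -> bool:
    """Score an un-posed probe photo (a social-media image, a video frame)
    against the stored template using cosine similarity."""
    similarity = float(np.dot(template, embed_face(probe_photo)))
    return similarity >= threshold
```

Real systems differ in the model that produces the embeddings and in how that threshold is tuned, which is exactly where the accuracy questions Parsons raises come in.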
There’s also evidence to suggest that body-worn video cameras may soon have streaming capacity. At the moment, most devices on the market store several hours of video, which must be uploaded either to police computers or secure cloud-based servers at the end of a shift. But one of the main suppliers, Vievu, this summer unveiled a new “military grade” body cam that allows users to stream footage “to a smartphone using the VIEVU² mobile app,” according to a press release. “The video can then be reviewed, stored, and shared without requiring a computer.”
During last summer’s FIFA World Cup in Brazil, military police tested similar devices, described by the Telegraph as “Robocop-style glasses,” that would allow wearers to scan faces in the crowds. The software, which tracks 46,000 points on a human face and then compares them to images in police databases, is meant to provide security crews with the capacity to spot and positively identify “troublemakers” and known criminals in the audience—even, apparently, if they aren’t doing anything wrong. As one Brazilian police official told the newspaper at the time, “It’s something discreet because you do not question the person or ask for documents. The computer does it.”
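The decision logic the Telegraph describes, tracking points on a face and comparing them to a database, amounts to a nearest-neighbor search with a cut-off that decides what counts as a “positive” identification. The Python sketch below is a toy version built on assumptions of my own (the normalization step, the distance measure, and the threshold value); only the 46,000-point figure comes from the Telegraph’s report.

```python
import numpy as np

NUM_POINTS = 46_000      # figure reported by the Telegraph; used here only as an array shape
MATCH_THRESHOLD = 0.15   # illustrative cut-off, not a real operational setting


def normalize_landmarks(points: np.ndarray) -> np.ndarray:
    """Center and scale a (NUM_POINTS, 2) array of facial landmark
    coordinates so faces captured at different distances and angles
    become roughly comparable."""
    centered = points - points.mean(axis=0)
    scale = np.linalg.norm(centered)
    return centered / scale if scale > 0 else centered


def best_match(probe: np.ndarray, watchlist: dict):
    """Return the watchlist identity whose landmarks are closest to the
    probe face, or None when nothing falls under the threshold (no
    'positive identification')."""
    probe_n = normalize_landmarks(probe)
    best_id, best_dist = None, float("inf")
    for identity, landmarks in watchlist.items():
        dist = np.linalg.norm(probe_n - normalize_landmarks(landmarks))
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id if best_dist <= MATCH_THRESHOLD else None
```

Everything hinges on that threshold: set it loosely and faces in the crowd that merely resemble someone in the database get flagged; set it tightly and known offenders slip through. That trade-off is why the absence of published accuracy figures for systems like Brazil’s matters.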
Argentina has gone even further, moving in recent years to use facial recognition technology to connect large databases of official photo IDs with images drawn from surveillance equipment, such as closed-circuit TV cameras. “This raises the specter of mass surveillance, as Argentinean law enforcement will have access to mass repositories of citizen information and be able to leverage existing facial recognition and fingerprint matching technologies in order to identify any citizen anywhere,” according to the Electronic Frontier Foundation, which noted that the UK abandoned a national identification “scheme” after public outcry over privacy.
Despite that, one UK police force is testing a facial recognition system designed to match CCTV images against a database of 92,000 facial images. The software makes a successful match 45 percent of the time, according to a report in The Week, which noted that if the trial is deemed successful, the technology will be rolled out to police forces across the UK.
And in Canada? Police in Vancouver successfully used facial recognition technology to identify looters during the Stanley Cup riot in 2011, drawing on videos submitted by bystanders as well as CCTV footage. The technology was also deployed during the G8/G20 summit in Toronto. But Parsons points out that, to date, there’s not enough data on general law enforcement applications to determine whether this sort of facial recognition is effective.
Last year, the Office of the Privacy Commissioner published a sixteen-page overview of the issues associated with facial recognition technology. The report noted that the FBI is in the process of rolling out a vast biometric facial recognition system—known as the Next Generation Identification program—that will be coupled to a database containing biographical and biometric information on 100 million Americans. The Privacy Commissioner’s analysis also pointed out that such technologies will be deployed in military contexts, for example on surveillance drones or with specially designed eyeglass-mounted cameras.
“Faces,” the study observes, “have been transformed into electronic information that can be aggregated, analyzed and categorized in unprecedented ways [italics added].” Body-worn cameras, it seems, will become part of that picture.