Your mall is watching you

Your mall map sees the expression on your face. It knows how you feel. Creeped out yet?
A shopper uses the interactive information map at the downtown shopping mall. Eaton Centre parent company Cadillac Fairview embeds facial-recognition software in such maps. The company has not confirmed the technology is in use at the Eaton Centre, but it has said the software is used nationally; shoppers may not know it is there. July 26, 2018. (Rene Johnston/Toronto Star/Getty Images)

Before Sharon Polsky sets foot inside the Eaton Centre mall in downtown Toronto on an autumn afternoon, she snaps a photo of the entrance. There’s a sign with a drawing of a CCTV security camera and text that reads: “For your safety and security, these premises are video recorded.” Polsky opens the door and walks a few steps to the mall directory, the interactive display where shoppers locate their desired stores on a digital map. But she doesn’t look at the screen, pointing instead to a tiny black circle about the size of a quarter imprinted on the black glass bezel surrounding it.

“If there’s a white frame, there’s a white circle. If it’s a black frame, it’s a black circle,” says Polsky, president of the Privacy and Access Council of Canada, an independent advocacy group. “It’s very subtle. Your attention is focused on the map, not the frame.” It looks as though a sticker has been applied and peeled off, leaving a faint ring. It’s a camera, Polsky says, noting that nowhere in the mall does it say there are all-but-hidden cameras that are able to use facial-recognition technology on unsuspecting shoppers.

As shoppers go about their day, buying everything from shoes to books, most are unaware of the latest technology their mall may be using to gather data on them—be it their age, their gender or even their mood.

Cadillac Fairview, the company that owns the Eaton Centre, deployed facial-analysis technology inside interactive directories at two of its Calgary malls earlier this year. At least, it did, until a software glitch revealed to passersby folders bearing names like “FaceAnalyzer,” “gender” and “age.” A visitor snapped a picture of the text and posted it on the website Reddit, where it was quickly picked up by the media. That led to the federal privacy commissioner opening an investigation into the practice, and Cadillac Fairview saying it would stop using the cameras at those two malls, pending the outcome of the investigation.

What the company won’t say, though, is whether it suspended the practice at any of the other 21 Cadillac Fairview-owned malls—a portfolio that spans six provinces and includes major shopping venues like Rideau Centre in Ottawa, Winnipeg’s Polo Park and the Carrefour Laval, outside Montreal. A Cadillac Fairview spokesperson declined to comment to Maclean’s, citing the ongoing privacy investigation.

The reticence has made the commercial property giant a target for critics. But as Polsky notes, Cadillac Fairview malls “are not the only ones doing it.” The revelation of the cameras in Calgary counts among a series of recent unsettling cases of data-gathering technology creeping up on the public and governments, raising concerns few seem to have considered. Do we, as individuals, have any right to the information that machines are now able to glean from our appearance, movement or habits? Should we have the right to opt out? Is it too late to demand it?

It’s not that the companies using it are breaking laws, say experts. It’s that they’ve taken advantage of a legal void, insisting they’ll act responsibly with the information available to them.

One of Tim Bratton’s jobs involves supporting someone with an intellectual disability, and the Centre Mall in Saskatoon is one of the best places for them to go for a stroll. It was here that, over the summer, Bratton, who also works as an actor, noticed a Cineplex-branded display screen with a camera on the top. “There’s no explanation anywhere of what the cameras are, why they’re there, what they are or aren’t collecting,” Bratton says. When he contacted the mall’s owners, Morguard Retail Leasing, Bratton got an answer he found unsatisfying. “A PR person said: ‘Don’t worry, we don’t keep information,’ ” he recalls. “When big corporations say, ‘Just trust us,’ I hope we’ve learned not to trust them carte blanche.”

The displays are run by Cineplex Digital Media (a branch of Cineplex unrelated to showing movies), which uses software from the France-based tech firm Quividi that deploys so-called “anonymous video analytics” to track the demographics of passersby.

Quividi’s chief marketing officer, Denis Gaumondie, draws a distinction between that and facial recognition, where data is stored so a person can be identified in the future. “We can’t do that because we delete all the pictures after a 10th of a second, so we can’t say if you passed by the same screen twice,” says Gaumondie. “That’s why it’s called anonymous video analytics.” By compiling the metadata, he says, a device can track how many people walk by the screen and how long people stand there, as well as their age range and gender. It can also gauge whether a person looks happy or unhappy. “We don’t track race. We refuse to do ethnic recognition,” he adds. “We never felt at ease with that because it could be used for discrimination.” The general manager of Saskatoon’s Centre Mall, David Bubnick, says via email that the aggregated information helps “identify visitor trends, which helps us determine our retail mix and offer new retailers and services to suit the community.” A Cineplex Digital Media spokesperson echoed the reassurance, adding “there is no personal identification, no pictures stored and no tracking or profiling of individuals.”
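A minimal sketch of what such an “anonymous video analytics” pipeline might look like: each frame is held only long enough to estimate attributes, and only aggregate counters survive. The function names, attribute categories and `Detection` shape here are hypothetical illustrations, not Quividi’s actual software.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Detection:
    """Per-face attributes a vision model might estimate (hypothetical fields)."""
    age_range: str       # e.g. "25-34"
    gender: str          # e.g. "female"
    mood: str            # e.g. "happy" or "neutral"
    dwell_seconds: float # how long the face stayed in front of the screen

def analyze_frame(frame):
    # Stand-in for a real face detector and attribute model; a deployed
    # system would run computer-vision inference here. For illustration,
    # we simply pass through pre-built detections.
    return frame

class AnonymousAnalytics:
    """Keeps only aggregate counts; the frame itself is discarded
    immediately after attribute extraction, as Quividi describes."""

    def __init__(self):
        self.impressions = 0
        self.by_age = Counter()
        self.by_gender = Counter()
        self.by_mood = Counter()
        self.total_dwell = 0.0

    def ingest(self, frame):
        detections = analyze_frame(frame)
        for d in detections:
            self.impressions += 1
            self.by_age[d.age_range] += 1
            self.by_gender[d.gender] += 1
            self.by_mood[d.mood] += 1
            self.total_dwell += d.dwell_seconds
        del frame  # the image is never stored; only the tallies remain

analytics = AnonymousAnalytics()
analytics.ingest([Detection("25-34", "female", "happy", 4.2),
                  Detection("55-64", "male", "neutral", 1.1)])
```

Note that even in this sketch, the frame must exist in memory while `analyze_frame` runs — the nuance Polsky raises about whether such images are, however briefly, “collected.”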

Bratton, for one, is skeptical. “Often this stuff gets marketed back to us users as ‘We want to better know who the people are so we can better serve them,’ ” he says, adding that “there’s a little dishonesty in that,” because that information is then shared with advertisers for better marketing: “I’ve become the product,” he says. And Polsky disputes the claim that the companies are not storing pictures, citing what she says is an important technical nuance: “If you’re capturing enough of my face to run analytical software, you are capturing it, collecting it and storing it—if only for a nanosecond,” she says. “You are storing it for that moment until the algorithm can make a mathematical formula that represents my unique face.”

It’s against the Personal Information Protection and Electronic Documents Act (PIPEDA) to collect, use or disclose personal information without first getting the individual’s consent. But the law doesn’t clarify the grey zone where pictures are constantly captured, analyzed and deleted, all within a snap of the fingers. “It’s pretty clear that a picture or video of you is personal information,” says privacy lawyer David Fraser, a partner with the Halifax law firm McInnes Cooper. “You’ll end up with an interesting discussion about whether an image that only exists in memory for a millisecond is, in fact, collected or used—because it’s immediately discarded.”

There’s no obligation for companies to register such devices with the privacy commissioner, Fraser adds. “If they take the view that they’re not subject to the legislation, they can do whatever they want with that information—subject to the privacy commissioner, or court, telling them they’re wrong.”

Trouble is, the office of the privacy commissioner doesn’t typically investigate until it’s received a complaint. And to file one, people would have to know their image is being analyzed in the first place. According to Polsky, without the glitch in the Cadillac Fairview mall display, the privacy commissioner would never have known.

What few dispute is the paucity of notification that facial recognition or visual analysis is taking place in malls. When asked about signage in Centre Mall, Bubnick, the general manager, pointed only to signs at the mall’s entrances stating that CCTV cameras were in use. He did not respond to follow-up queries asking if there was any signage disclosing the use of anonymous visual analytics, or how the information is shared.

“The aggregate audience data that is collected would be used in the same manner as any other form of demographic data used in the advertising industry,” says Sarah Van Lange, a Cineplex spokesperson, via email. “Again, all such data collected is used in compliance with applicable laws and in accordance with industry best practices.”

Quividi’s Gaumondie notes that many countries have no requirement to disclose that the systems are being used. But he adds: “We think it’s good practice to inform [shoppers] that anonymous visual analytics is in operation. The more you can explain the use of these cameras, the better. We would advocate for that.”

Tiny frame-mounted cameras analyze your face, discerning your age, gender and mood. (Rene Johnston/Toronto Star/Getty Images)

Mappedin is one of Canada’s newest and most successful start-ups. Based in Waterloo, Ont., the company is dedicated to remapping the indoor experience, making it easier to navigate places like malls, airports and hospitals without having to stand for minutes at a stationary map scanning for a “you are here” dot.

Instead, mallgoers type in the store or product they’re looking for and receive easy-to-follow directions. It’s simple and intuitive—which might explain why Mappedin boasts on its website that it works with nine of the 10 largest shopping centres in the country, including those owned by Cadillac Fairview.

Mappedin doesn’t talk much about its software having facial-recognition capabilities. After “mappedin” appeared in one of the files seen on that malfunctioning display in Calgary, the company acknowledged to CBC that its software indeed has recognition capability. But the specifics of what the private company is tracking—and how prevalent its technology is across Canada—remain a mystery, because they won’t say.

Neither Mappedin’s director of marketing nor its CEO responded to multiple requests for comment on a range of questions, including whom the facial-recognition data is shared with and whether the company tracks race or ethnicity. A former employee reached by Maclean’s also declined to comment, saying the company had him sign a non-disclosure agreement covering Mappedin’s facial-recognition technology.

For privacy experts, that’s worrying. “This type of facial-recognition technology should never be used without the consent of the individuals,” says Ann Cavoukian, a former information and privacy commissioner for Ontario. “It can raise all sorts of consequences in terms of having your image on file to use for other purposes.” Cavoukian’s biggest fear is the prospect of images and personal information being used for identity theft, a crime she dealt with during her tenure as privacy commissioner. “It’s a nightmare trying to establish your identity. They steal your identity and rack up charges against you, and you have to prove it wasn’t you who bought all this stuff.”

She warns people to be aware of unintended consequences, drawing a key distinction between privacy and secrecy. “Privacy is all about personal control of the use of your personal information,” she says. “Your facial image—your biometrics—are the most sensitive personal information. You should be in control of who gains access to that and how it’s used.”

Polsky worries that facial-recognition technology could be used in conjunction with cell phone data that can track an individual throughout the mall. Every cell phone already has a unique digital identifier that constantly chirps out signals as a phone tries to connect to, say, a mall’s WiFi network. That information, paired with facial recognition from a mall directory, for example, could theoretically allow a mall operator to figure out a customer’s age range, gender, mood and desired store from the directory display, and whether the customer in fact visited that store or stood in line at the cash register. Says Polsky: “This isn’t some tinfoil-hat theory.”
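The pairing Polsky describes can be sketched in a few lines: a phone broadcasting Wi-Fi probe requests announces a hardware identifier, and timestamps alone are enough to link a device heard near a directory to whatever the directory’s camera inferred. Everything here is hypothetical — the event shapes, the `ap-directory` name, the five-second window — and modern phones randomize their MAC addresses, which complicates such linkage in practice.

```python
from dataclasses import dataclass

@dataclass
class ProbeEvent:
    mac: str           # device identifier heard in a Wi-Fi probe request
    access_point: str  # which in-mall access point heard it
    t: float           # seconds since the mall opened

@dataclass
class DirectoryEvent:
    t: float           # when the directory's camera logged a viewer
    age_range: str     # attribute inferred by the directory's analytics
    searched_store: str

def correlate(directory_events, probe_events, window=5.0):
    """Pair each directory lookup with any device heard near the
    directory's access point within `window` seconds. A hypothetical
    linkage for illustration, not a documented vendor feature."""
    linked = []
    for d in directory_events:
        for p in probe_events:
            if p.access_point == "ap-directory" and abs(p.t - d.t) <= window:
                linked.append((p.mac, d.searched_store, d.age_range))
    return linked

probes = [ProbeEvent("aa:bb:cc:dd:ee:ff", "ap-directory", 100.0),
          ProbeEvent("11:22:33:44:55:66", "ap-food-court", 101.0)]
lookups = [DirectoryEvent(102.0, "25-34", "Bookstore")]
links = correlate(lookups, probes)
```

Once linked, the same identifier reappearing at an access point near the searched store would tell the operator whether the shopper followed through — the scenario Polsky outlines.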

But even with an investigation involving Cadillac Fairview under way in Alberta, the federal privacy commissioner doesn’t have much enforcement power. “Currently, the [federal] privacy commissioner is an ombudsman and may make recommendations to an organization he has investigated,” says Tobi Cohen, a spokesperson for the office of Privacy Commissioner Daniel Therrien, in an email. “The privacy commissioner can also enter into compliance agreements with organizations.” What the post of commissioner lacks, though, are key powers to issue orders and impose fines. As Polsky puts it: “The commissioner can say, ‘This isn’t right. Please stop.’ And Cadillac Fairview can say, ‘Thank you very much, and we’ll take your recommendations under advisement.’ ”

It’s called “privacy by design.” It’s a framework Cavoukian pioneered, whereby privacy measures are proactively embedded into every step in the development of a new product or service. The concept was used in the newly implemented General Data Protection Regulation (GDPR) in the European Union. The EU’s new privacy legislation also incorporates the concept of “privacy by default,” where the strictest privacy settings must be the default for any new service.

In Canada, privacy regulations aren’t nearly as up-to-date. Back when PIPEDA was adopted, consumer interactions with businesses were “generally predictable, transparent and bidirectional,” Therrien said in a 2017 appearance before the Standing Committee on Access to Information, Privacy and Ethics. “Consumers understood why the company that they were dealing with needed certain personal information.” Now, said Therrien, “it’s no longer entirely clear who’s processing our data and for what purposes.” Which is why he’s urging Ottawa to adopt something similar to the GDPR. “Canada needs powers comparable to those in other jurisdictions in terms of order-making powers and fines in order to have meaningful impact on privacy protection,” says Cohen.

A House of Commons report released this year recommended multiple improvements to Canada’s privacy law, including making “privacy by design” a key principle. But Cavoukian has since criticized the government’s lack of urgency, and stresses that technologies like facial recognition or anonymous visual analytics should be something customers opt in to use. “At the very least, you have to give notice,” she says. “Have a sign that says: ‘We use facial recognition and extract data related to age and gender.’ ”

Cadillac Fairview, in exchanges with media at the time of the directory glitch, contended that it didn’t require consent from customers because it neither records nor stores video. But when companies don’t disclose what they are gathering, “many people infer that they’re hiding something,” warns Fraser, the privacy lawyer. “My advice to clients is to err on the side of more transparency. That builds trust.” Canadian Tire, another company that says it uses Mappedin’s wayfinding software at one of its locations, has cameras located along the frame of multiple in-store interactive displays. A company spokesperson says such displays “come standard with a camera,” and adds, “We have never enabled that functionality at any of our in-store displays, and to report otherwise would be grossly inaccurate.”

A spokesperson for Oxford Properties, a major real estate company with several retail developments in Canada, confirmed to Maclean’s in an email that its mall directories have cameras that use anonymous video analytics to measure mall traffic. He added: “There is no personal identification, tracking or profiling of individuals, and the information is not recorded, stored or used by Oxford.”

Still, Fraser has a standard piece of advice for companies deploying facial recognition or any similar technology: “If you can’t look your customer in the eye and explain this to them, maybe you shouldn’t be doing it.”