Getting targeted ads on social media has been common practice for a while now, but what about seeing an ad created for you while making your way to catch a bus at Union Station?
Recently on Reddit, a user noticed billboards around the Union Station Bus Terminal using a form of facial recognition software to display ads in line with a person’s presumed preferences. Metrolinx confirmed to CTV News Toronto that these billboards are found on the way to the bus terminal, not inside it as the poster suggested.
“This media unit runs anonymous software, used to generate statistics about audience counts, gender and approximate age only,” the sign reads. “To ensure privacy, no images and no data unique to an individual person is recorded by the camera on this unit.”
These images are then immediately and permanently deleted after being processed in mere milliseconds, according to the sign.
Cineplex Digital Media (CDM), the company that owns the billboards, said online that the camera sensor gauges how many people are looking at any given piece of advertising or content at any given time.
While these billboards have caught the attention of Torontonians recently, a spokesperson for CDM confirmed to CTV News Toronto that they were installed about three years ago.
“We strictly adhere to the guidance provided by the (Office of the Privacy Commissioner of Canada), ensuring that our technology is used ethically, responsibly and in compliance with all relevant privacy laws and regulations,” CDM’s Jeevan Vivegananthan said in an email.
Vivegananthan says the purpose of this Anonymous Video Analytics data is primarily for audience counting, adding that “there is no personal identification, no pictures stored, and no tracking or profiling of individuals.”
Wendy H. Wong, professor and Principal’s Research Chair at the University of British Columbia, noted that this technology—while apparently being used to glean some demographics from passersby—is still collecting information, even if the image will be deleted.
“I think the issue is not even about whether you know. They kind of say it’s anonymous, I don’t know what it means to be anonymous because faces are pretty non-anonymous, as far as identifying features are, so even if they’re not taking a photo, which is technically correct, they are taking facial data,” Wong said.
“It’s a rhetorical difference that I think has, in terms of implications, not very much difference and, in fact, perhaps greater implications because facial data (is) more easily transferrable and useful for more than one purpose.”
She said these billboards are also asking passersby to trust the company to uphold its word that their images will be immediately and permanently deleted.
“But you’re also not being given the option to not trust the company, right? You can’t opt out, so you’re forced to trust the company and I think if you were just a busy commuter … you’re not looking at the warning,” Wong said.
There is also a power differential in obtaining informed consent in this setting, according to associate professor of media economics Brett Caraway, since a passerby cannot negotiate the terms of the agreement in real time.
“The same sorts of power differentials exist in the online world. If I don’t like one particular part of, say, the Facebook license agreement, do you think I can just call up Mark Zuckerberg and say, ‘Hey, I’m not really cool with section 3.1,’” Caraway said. “There’s no good faith bargaining going on. With a video wall you walk up to, have you already been recorded before you see (the sign)?”
He also noted that, even though the images are being immediately deleted, for this technology to work, this data has to be stored somewhere—even if it’s just for a millisecond.
“There is always a digital record. There is an artifact that is left over from the process of being observed because that’s the only way in which an analysis can happen,” Caraway said.
“Because the analog world is translated into the digital world, even if it’s only for a split second, it is somewhere and there is risk at that moment that the data can be compromised.”
This same technology was used at some Cadillac Fairview malls, but has been removed
About seven years ago, Cadillac Fairview installed the same facial recognition software inside “wayfinding” directories at a handful of malls across Canada, unbeknownst to shoppers.
Using the same software as the billboards spotted near the Union Station Bus Terminal, the corporation said it would immediately delete the images of people after viewing them, but a joint investigation by federal, Alberta and B.C. privacy commissioners found the information generated from these pictures was being stored by a third-party contractor, Mappedin. Cadillac Fairview said then that it was unaware of this.
“When asked the purpose for such collection, Mappedin was unable to provide a response, indicating that the person responsible for programming the code no longer worked for the company,” the report reads.
Then-Privacy Commissioner of Canada Daniel Therrien noted that Cadillac Fairview’s unawareness of this storage of information “compounded the risk of potential use by unauthorized parties or, in the case of a data breach, by malicious actors.”
About five million images of Canadian shoppers were collected through Cadillac Fairview’s kiosks in malls across the country, including Sherway Gardens and the Toronto Eaton Centre.
“The five million representations referenced in the OPC report are not faces. These are sequences of numbers the software uses to anonymously categorize the age range and gender of shoppers in the camera’s view,” Cadillac Fairview told CTV News then.
The facial recognition software is no longer in use at Cadillac Fairview malls.
What could this mean for the future of advertising?
With many media platforms offering their goods and services for free, Caraway says there’s an economic incentive to go deeper with advertising—especially when it can be combined with AI.
“In this video wall example, maybe you walk up and you’re carrying a particular purse or you’re wearing a particular hat or you have a certain logo on your clothing. You walk up to this video wall, AI can analyze that and not just reference some prefabricated advertisement, but could make up a brand new advertisement tailored specifically to you,” Caraway said.
While advertising is not quite there yet, Caraway said it is getting closer to that point, meaning there will be an increased economic incentive to “quantify all of us.”
“This is already a marketplace that exists, but with AI and the capacity to deliver tailored advertisements to individual consumers, I fully expect that we will see firms continue to push the boundaries of what constitutes an invasion of people’s privacy, and so I think this is just an early case, and I expect many more of these to follow,” Caraway said.
With files from CTV News’ Rachel Aiello