Rehtaeh Parsons and the horrors of automation

Jesse Brown on the ubiquity of auto content, and the problem of responsibility

The "accidental" ad, now blocked by Facebook.

Months after her death, the online abuse of Rehtaeh Parsons continues. An admin for Ionechat, the spammy dating site that ghoulishly illustrated a Facebook ad with Parsons’ face, says that he’s sorry.

“I sincerely apologize,” Anh Dung told CTV News. “I simply used a tool to scrape images randomly on Google Images and inserted it into the ad campaign.”

It might seem simple enough within technology circles, where the use of algorithms to automatically pull and repurpose Internet content is the norm. The code for Ionechat’s ad campaign probably worked something like this: use Facebook’s advertising platform to segment straight male users, determine where they live from their profile data, use that data to generate Google Image search terms, such as “Canadian girl face.” Scrape the images that result and randomly plop them in, and you’ve got a user-targeted ad that maybe one in ten thousand viewers will click on. There are thousands of little businesses that use these methods, or variations of them. Like Dung said, it’s simple.
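Ionechat’s actual code isn’t public, but the pipeline described above can be sketched in a few lines. This is purely illustrative: the function names, profile fields, and the stand-in scraper are assumptions, not the real campaign code.

```python
import random

def build_search_term(profile):
    # Hypothetical: turn Facebook profile data into a Google Images query,
    # e.g. {"country": "Canadian"} -> "Canadian girl face"
    return f"{profile['country']} girl face"

def pick_ad_image(profile, scraped_images):
    # Stand-in for the scraping step: given images pulled for the user's
    # search term, pair one at random with the targeted user.
    # No human ever looks at what comes back -- that is the whole problem.
    return random.choice(scraped_images)

# Example: target a straight male user in Canada with a random face.
profile = {"country": "Canadian"}
term = build_search_term(profile)          # "Canadian girl face"
images = ["face1.jpg", "face2.jpg"]        # whatever Google returns for `term`
ad_image = pick_ad_image(profile, images)  # nobody checks whose face it is
```

The key point the sketch makes concrete: at no step does any person, or even the code itself, know anything about the people in the images it serves.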

To the rest of us, it’s sci-fi level horror, a gross future where invisible and unfeeling machines mush together random artifacts from our lives, even after we’re dead, in order to sell cheap goods and services.

Facebook has also apologized, and blacklisted Ionechat from ever advertising with them again. Ionechat is offline, at least for now.

You can expect the fallout to end there. To dig any further would be to question automated content itself, which social media relies on to function. No matter how many people are hired to monitor sites like Facebook for bullying or abuse, they will only ever be able to scratch the surface of the billions of communications that occur on that platform. Similarly, YouTube employees couldn’t possibly pre-screen the one hundred hours of video that are uploaded to it every minute, but engineers are working hard to write code that analyzes video for a certain percentage of pink, porny pixels or for copyrighted content. The whole point is to replace costly humans with bots that can make decisions almost as well. But there will always be a margin of error. No human would mistake a video of squirming piglets for porn, or block a home video on copyright grounds because a radio is playing a Justin Bieber song in the background.
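To see why piglets get flagged, consider what a “pink pixel” filter actually does. The real classifiers are far more sophisticated and proprietary; this toy version, with made-up thresholds, only shows the shape of the heuristic, and why anything pink enough trips it.

```python
def looks_pink(r, g, b):
    # Crude, made-up "skin tone" test: red dominates, green and blue trail it.
    # Piglet skin passes this test just as human skin does.
    return r > 150 and g < r * 0.8 and b < r * 0.9 and g > 40

def pink_ratio(pixels):
    # Fraction of pixels in a frame that the heuristic calls "pink".
    flagged = sum(looks_pink(r, g, b) for r, g, b in pixels)
    return flagged / len(pixels)

def is_flagged(pixels, threshold=0.3):
    # Flag the frame if too much of it is "pink" -- the machine has no idea
    # whether it is looking at people or piglets.
    return pink_ratio(pixels) >= threshold
```

A frame of pinkish pixels like (200, 100, 120) scores 1.0 and gets flagged; a green field scores 0.0 and passes. The bot is cheap and fast, and that margin of error is the price of it.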

And no human actively chose to use the image of a tragically dead child as an enticement for online dating. It just sort of happened. Anh Dung calls it an “accident.” Facebook says it’s “extremely unfortunate.” Just like the abuse Rehtaeh Parsons suffered when alive, everyone feels awful, and no one feels responsible.

Follow Jesse on Twitter @JesseBrown
