Vol. 79, No. 3
New technology


Visionary approach

RCMP seeks software to identify child exploitation

The RCMP-led National Child Exploitation Coordination Centre received 27,000 cases in 2016. That's up from 14,000 cases in 2015 and 10,000 cases in 2014. Credit: Serge Gouin, RCMP


When a new sexually explicit photo of a child is uploaded to the Internet, it's added to a vast collection of images that can be nearly impossible for police to find.

Until now.

The RCMP is turning to artificial intelligence technology to help investigators identify new online child exploitation images, and rescue at-risk children more quickly. Partnering with researchers from the University of Manitoba and the software firm Two Hat Security Ltd., the RCMP is hoping to use the new technology to triage cases.

According to Cpl. Dawn Morris-Little, an investigator at the RCMP-led National Child Exploitation Coordination Centre (NCECC) in Ottawa, prioritizing cases is a key part of her job.

"For every single one of our files, there's a child at the end of it," she says. "Images that look homemade or images that are unknown — those take priority because you don't know when it was created, and those children could still be at risk."

Since 2011, the RCMP has used similar software called PhotoDNA to help identify known and documented explicit photos. PhotoDNA works by converting photos into a hash code, which is like a unique fingerprint for each image. That hash code is added to a database, and if it's ever found again anywhere in the world — online or on a hard drive — PhotoDNA will flag it.
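PhotoDNA's exact algorithm is proprietary, but the fingerprint-and-match idea it describes can be sketched with a simple perceptual hash. In this illustrative example (not PhotoDNA's actual method), an "image" is an 8x8 grid of grayscale values; real systems first resize and normalize the photo before hashing:

```python
def average_hash(pixels):
    """Turn a grid of grayscale values into a 64-bit fingerprint:
    each bit records whether that pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance means near-identical images."""
    return bin(h1 ^ h2).count("1")

# A known image's hash goes into the database; any file found later
# is hashed the same way and compared against the stored fingerprints.
known = [[10 * (r + c) % 256 for c in range(8)] for r in range(8)]
copy = [row[:] for row in known]
copy[0][0] += 3  # slight alteration, e.g. recompression noise

db_hash = average_hash(known)
found_hash = average_hash(copy)
print(hamming_distance(db_hash, found_hash) <= 2)  # prints True: flagged
```

Because the fingerprint tolerates small changes, a re-saved or lightly edited copy of a documented image still matches its database entry.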

But with the rise of smartphones and tablets, creating new child exploitation content has never been easier. In 2016, the NCECC received 27,000 cases, almost double the number reported in 2015. This new content can't be identified by PhotoDNA, since it hasn't been added to its database yet.

"The numbers are only going up, so we need to be handling these cases in a much smarter way," says Sgt. Arnold Guerin, who works in the technology section of the Canadian Police Centre for Missing and Exploited Children (CPCMEC), which includes the NCECC. "New technology can provide us with tools to review cases in an automated way, and bubble up to the top the ones that need to be dealt with right away."

Computer vision

The artificial intelligence technology — called computer vision — is meant to mimic human vision. It uses algorithms to scan unknown photos and pick out the ones that have a high probability of being child exploitation.

"What would take weeks for an investigator would take the algorithm minutes or hours to scan," says Brad Leitch, head of product development at Two Hat Security. "The algorithm can eliminate the photos of trees and doughnuts and Eiffel Towers pretty successfully and put those high-probability, exploitative images at the top of the list so we can identify victims and make prosecutions more quickly."

Often, minutes matter in child exploitation investigations. Certain laws govern how long police can retain data, so the sooner an investigator can find evidence, the sooner they can lay charges.

"If we seize a hard drive that has 28 million photos, investigators need to go through all of them," says Guerin. "But how many are related to children? Can we narrow it down? That's where this project comes in: we can train the algorithm to recognize child exploitation."

Achieving 100 per cent accuracy with the algorithm isn't the goal — investigators will still have to go through all the material to make sure nothing is missed. This algorithm is meant to prioritize what police look at first, to make sure they're using their time and resources efficiently.
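The triage described above can be sketched in a few lines. This is an illustrative stand-in, not the RCMP's actual system: a classifier (here, a hypothetical lookup of per-image scores; in practice a trained computer-vision model) assigns each unknown file a probability, and the review queue is sorted so the highest-probability material is examined first. Nothing is discarded; low-scoring files remain in the queue for full review.

```python
def triage(files, classify):
    """Return all files ordered for human review, highest risk score first.
    Every file stays in the queue; only the order changes."""
    return sorted(files, key=classify, reverse=True)

# Stand-in scores for illustration; a real deployment would call a
# trained image-classification model to get a probability per file.
scores = {"a.jpg": 0.02, "b.jpg": 0.91, "c.jpg": 0.40}
queue = triage(list(scores), scores.get)
print(queue)  # prints ['b.jpg', 'c.jpg', 'a.jpg']
```

Sorting rather than filtering matches the stated goal: the algorithm decides what investigators look at first, not what they look at.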

Protecting police

Along with reducing workload, technologies like PhotoDNA and computer vision can also help protect the health and wellness of investigators.

"We see images that no one wants to see, so maintaining our mental health is a priority," says Morris-Little. "Anything that takes the human element out of cases is going to reduce the risk of mental health injury to an investigator."

Guerin says technology like computer vision can act as a shield, sifting through material before it gets to an investigator.

The computer vision product is still in development, but Guerin hopes the RCMP will be able to use it later this year.

"If I could reduce the amount of toxicity officers have to endure every day, then I'm keeping them as healthy as possible, while also keeping more kids safe," he says.
