It requires a focus that many people couldn't sustain. The job of some RCMP officers and employees requires them to regularly review images, videos and written materials that portray the sexual abuse of children.
"Our team works to catch the baddest of the bad and there's nobody more vulnerable in our society than children," says Sgt. Dawn Morris-Little, manager at the RCMP National Child Exploitation Crime Centre's (NCECC)'s Victim Identification and Covert Investigations Unit. "Because of that, people have a real passion for it."
Beyond that motivation, members of this unit sustain the work by drawing on mental-health supports and by using the latest software to focus their attention on the children most at risk.
A focused search
To find those cases, Sgt. Arnold Guerin, who works in the technology section of the NCECC, and his team have for years turned to artificial intelligence.
The software uses machine learning: rather than following precise, hand-coded instructions, it learns to perform specific tasks from examples.
Where child sexual exploitation images are concerned, a single case, known as a data set, is run through a machine-learning program that can identify and categorize images. An image can be deemed legal, and therefore not in need of an investigator's attention, or flagged as child sexual abuse material that must be reviewed or investigated immediately.
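That triage step can be sketched in a few lines. This is an illustrative outline only: the RCMP's actual tooling is not public, and the `score_image` function below is a hypothetical stand-in for a trained classifier.

```python
# Illustrative triage sketch. score_image is a hypothetical stand-in:
# a real system would use a trained machine-learning model on pixel data.

def score_image(image_name: str) -> float:
    """Hypothetical model: probability (0-1) that an image is abusive material."""
    demo_scores = {"holiday.jpg": 0.02, "case_0413.jpg": 0.97}  # demo values only
    return demo_scores.get(image_name, 0.5)

def triage(images: list[str], threshold: float = 0.9) -> dict[str, list[str]]:
    """Route each image: high scores go straight to investigators,
    low scores need no review, and the rest are queued for a human look."""
    queues: dict[str, list[str]] = {"review_now": [], "no_action": [], "uncertain": []}
    for name in images:
        p = score_image(name)
        if p >= threshold:
            queues["review_now"].append(name)
        elif p <= 0.1:
            queues["no_action"].append(name)
        else:
            queues["uncertain"].append(name)
    return queues

print(triage(["holiday.jpg", "case_0413.jpg", "unknown.jpg"]))
```

The middle "uncertain" queue is an assumption added for the sketch; the article describes only the two outcomes, but real classifiers rarely give a clean yes/no, so borderline scores are typically sent to a person.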
The data sets can also include images that NCECC investigators, or those from the unit's national and international policing partners, have already declared to be child sexual exploitation. As a result, they don't have to be looked at again.
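One common way such systems recognize previously classified files, offered here as an assumption since the NCECC's tools are not public, is to compare a hash of each new file against a database of hashes that investigators or partner agencies have already declared to be child sexual exploitation material. A minimal sketch using Python's standard library:

```python
import hashlib

# Hashes of files already classified by investigators (hypothetical values).
# In practice, such hash databases are shared among policing partners.
known_hashes = {
    hashlib.sha256(b"previously-classified-file").hexdigest(),
}

def needs_review(file_bytes: bytes) -> bool:
    """Return False if this exact file was already classified, sparing a re-review."""
    return hashlib.sha256(file_bytes).hexdigest() not in known_hashes

print(needs_review(b"previously-classified-file"))  # prints False: already seen
print(needs_review(b"never-seen-file"))             # prints True: must be reviewed
```

An exact cryptographic hash only catches byte-identical copies; deployed systems typically also use perceptual hashes (such as Microsoft's PhotoDNA) so that resized or re-encoded versions of a known image are recognized too.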
Knowing that they're not repeating work that's already been done, and that their current efforts could lead to a quick arrest of an abuser or the identification of a child, gives employees a sense of purpose and well-being.
"You don't want to be going through images you don't have to," says Morris-Little.
The NCECC has received more than 35,600 cases so far in 2019. From 2016 to 2018, the NCECC had 27,300, 35,712 and 61,220 online child sexual exploitation cases respectively. Some contained one image while others contained thousands.
The centre reviews all the cases it receives, but only those with confirmed child sexual exploitation photos or videos are investigated.
"If we know a child has already been identified, we can prioritize the newer cases and focus on the ongoing abuse," says Morris-Little.
Roberta Sinclair manages the Strategic and Operational Services Unit of the Sensitive and Specialized Investigative Services — formerly the Canadian Police Centre for Missing and Exploited Children/Behavioural Science Branch.
She agrees technological improvements help focus investigators' work and can enhance their sense of purpose. But simply getting them to take a break is important, too.
"Getting them to do that (take a break) can be tough," says Sinclair. "Some feel like they're giving up on a case, or the child, but of course they're not."
Other initiatives that support the mental health of online child sexual exploitation employees include quiet rooms, opportunities to play video and board games that can offer a much-needed distraction from their work, and a variety of RCMP-offered mental health programs.
Moving forward, both the technology and the mental-health and wellness supports will continue to expand for people in online child sexual exploitation units across the country.
The number of cases referred to the NCECC will continue to climb as high-speed Internet access continues to improve, the storage capacity for child exploitation images rises, and storage costs remain affordable.
"It's our job to protect the child and prevent the sexual abuse of more children," says Guerin. "If I can help reduce the amount of toxicity employees have to endure every day, then I'm keeping them as healthy as possible, while also keeping more kids safe."