Reports of online child exploitation have risen since the start of the coronavirus pandemic
Sarah E. Needleman
Jennifer Gross had warned her 12-year-old daughter about the potential dangers of socializing with strangers online, but said the message had often fallen on deaf ears.
Now the preteen understands the concern. Since mid-March, her daughter has received several flirtatious messages from accounts that appeared to belong to adult men on Instagram, Mrs. Gross said.
“I was horrified,” said Mrs. Gross, a substitute teacher and church administrator in Allentown, Pa. “It woke her up to the point where she could realize this can happen and now she’s more careful online.”
Reports of online child exploitation have risen since the start of the coronavirus pandemic. In March, the National Center for Missing and Exploited Children received two million reports of online child exploitation, up from 983,000 in March of the previous year. In April, the nonprofit received 4.1 million such reports, up from around 1 million in the same month a year earlier. Most of these reports concern child sexual abuse material and are filed, as federal law requires, by companies that operate online services.
A spokeswoman for Facebook, which owns Instagram, said, “Under all circumstances, including Covid-19, keeping young people safe and removing child exploitative content is our top priority across our services. During the pandemic sharing on our platforms has increased overall, and we have detected and removed more child exploitative content as a result.”
Mrs. Gross said she reported the incidents involving her daughter to Instagram via a reporting feature in the app. Facebook said it had no record of a report, but that it removed an account for violating policies against inappropriate interactions with children after an inquiry by The Wall Street Journal.
Law-enforcement officials say the rise in abuse is likely happening because both children and adults are spending more time online these days, as schools are closed and many working parents don’t have access to child-care services.
“You have the perfect storm where millions of kids are home across the country and the world, and they are probably more unsupervised than they have been before the pandemic,” said Steven J. Grocki, chief of the U.S. Justice Department’s child exploitation and obscenity section.
On the so-called Dark Web, a network of websites for sharing information anonymously, child predators have said that the pandemic is providing them with greater access to potential victims, said John Shehan, vice president of the National Center for Missing and Exploited Children’s exploited children division. They “are discussing the stay-at-home orders and their desire to use this opportunity to entice children to produce sexually explicit material,” he said.
Bark Technologies, a service for monitoring children’s internet activity, says the number of predators it has reported to law enforcement increased 23% between early March and early May. Normally in a short time frame “you might see growth of a couple percentage points,” said Titania Jordan, Bark’s marketing chief. “It’s a very big increase.”
Other forms of bad behavior common online—such as hate speech, bullying and harassment—have also been on the rise lately, according to L1ght Inc., a technology firm based in San Francisco that uses artificial intelligence and machine learning to identify online toxicity. Between October and April, it found substantial increases in toxic words and phrases in text-based conversations within several groups on Discord, a chat platform popular among videogame players, with the greatest spike in toxic chatter occurring in the final two months of the study. L1ght says that’s likely due to an overall increase in time spent online during the pandemic.
A spokesman for Discord said it hasn’t seen an increase in reports of toxic language in text conversations. “While we don’t police foul language, we do not tolerate harassment or hate speech on our platform,” he said.
Experts recommend that parents check which social platforms their children use and discuss risks of communicating with strangers online. Ask how they know all the people on their contact lists and what they talk about. And let children know they can take steps to stop interactions that make them feel uncomfortable.
“Giving them a voice in the solution is super important,” said Julie Hertzog, director of the National Bullying Prevention Center at the Pacer Center, a nonprofit dedicated to improving the quality of life for children and young adults with disabilities. “Kids who suffer most are the ones who stay silent about it.”
If exploitation does happen, parents should contact law enforcement, said Michael DuBois, a unit chief in the criminal investigative division at the Federal Bureau of Investigation. “Sometimes parents don’t think there’s anything that can be done but that’s not the case,” he said.
The companies behind popular social platforms, message apps and videogames say they continuously work to combat inappropriate behavior, and encourage people to report problems or block offenders.
Videogame giant Ubisoft Entertainment last fall added a chat toxicity filter in its hit series “Tom Clancy’s Rainbow Six Siege,” one of several recent steps it has taken to make its online multiplayer games more welcoming. The filter alerts a team of employees whenever it detects threats of violence or personal attacks in in-game text conversations. Offenders may be given a warning, suspended or banned.
“One of the challenges is that toxic behavior has become normalized,” said Chris Mancil, senior director of community experience at Ubisoft. “But it’s not OK and we have to push back.”