Experts have warned that artificial intelligence is opening the door to a disturbing trend of people creating realistic images of children in sexual settings, which could lead to an increase in real-life child sex crimes.
AI platforms that can mimic human conversation or create realistic images exploded in popularity after the release of the chatbot ChatGPT in late 2022, which served as a watershed moment for the use of artificial intelligence. While people around the world became curious about the technology for work or school, others have adopted the platforms for more nefarious purposes.
The National Crime Agency, the UK’s lead agency for combating organized crime, warned this week that the spread of explicit AI-generated images of children is having a “radicalising” effect and is “normalising” disturbing behavior towards children.
“We estimate that viewing these images – whether real or AI-generated – materially increases the risk of offenders sexually abusing children,” NCA Director General Graeme Biggar said in a recent report.
National Crime Agency (NCA) Director General Graeme Biggar during the Northern Ireland Policing Board meeting at James House, Belfast, on Thursday, June 1, 2023. (Photo by Liam McBurney/PA Images via Getty Images)
The agency estimates that there are 830,000 adults in the UK, or 1.6% of the adult population, who pose some form of sexual risk to children. According to Biggar, the estimated number is ten times the UK prison population.
According to Biggar, most cases of child sexual abuse involve viewing explicit images, and with the help of AI, creating and viewing such images could “normalize” the abuse of children in the real world.
“[The estimated figure] partly reflects a better understanding of a risk that has historically been underestimated, and partly a real increase caused by the radicalising effect of the internet, where the widespread availability of videos and images of child abuse and rape, and groups sharing and discussing the images, has normalised such behaviour,” Biggar said.

July 18, 2023: An illustration showing artificial intelligence alongside books on a laptop. (Getty Images)
Stateside, there has been a similar explosion of AI being used to create sexualized images of children.
“Photos of children, including material from known victims, are being repurposed for this really bad result,” Rebecca Portnoff, director of data science at a nonprofit that works to protect children, told The Washington Post last month.
“Victim identification is already a needle-in-a-haystack problem, where law enforcement is trying to find a child in harm’s way,” she said. “The ease of use of these tools is a significant change, as is the realism. It makes everything more of a challenge.”
Popular AI sites that can generate images based on simple prompts often have community guidelines that prevent the creation of disturbing images.

A teenage girl in a dark room. (Getty Images)
Such platforms are trained on millions of images from across the Internet that act as building blocks for AI to create convincing images of people or places that don’t actually exist.
Midjourney, for example, calls for PG-13 content that avoids “nudity, genitalia, fixation on bare breasts, people in the shower or on the toilet, sexual imagery, fetishes.” DALL-E, OpenAI’s image creation platform, allows only G-rated content, prohibiting images that show “nudity, sexual acts, sexual services, or content otherwise intended to induce sexual arousal.” According to various reports on AI and sex crimes, however, ill-intentioned users on dark web forums discuss workarounds for creating disturbing images.

Police car with 911 sign. (Getty Images)
Biggar noted that AI-generated images of children also put police and law enforcement agencies at a disadvantage when it comes to distinguishing fake images from those of real victims who need help.
“Using AI for child sexual exploitation will make it harder for us to identify real children who need protection, and further normalize abuse,” said the NCA Director General.
AI-generated images could also be used in sexual exploitation scams, the FBI warned last month.
Deepfakes often involve editing videos or photos of people to make them look like someone else using deep learning AI, and have been used to harass or extort money from victims, including children.
“Malicious actors use content manipulation technologies and services to exploit images and videos—typically obtained from an individual’s social media account, the open internet, or requested from a victim—to create sexually themed images that appear true to life in a victim’s likeness, then circulate them on social media, public forums, or pornographic websites,” the FBI said in June.
“Many victims, including minors, are unaware that their images have been copied, manipulated and disseminated until it is brought to their attention by someone else.”