Editor’s Note: On Tuesday I logged into ChatGPT, a chatbot driven by artificial intelligence, for the first time. With search engines like Google showing mostly affiliate marketing links, it seems as if there is less and less non-commercial information to draw on for research. I asked ChatGPT and, voilà, I got a valuable answer, which of course I still had to verify, as any editor must with information found on the internet. And it, the AI machine, was even friendly to me. It is spooky how delighted I was, as if I had been chatting with a real person.

When it comes to AI, though, it is not all friendly chat. There are many dangers with AI, which we will deal with in a series of articles starting today.

The one that concerns us most is its relationship to deepfake intimate images, particularly how AI is affecting child pornography (CP). Here we bring you a few articles on the topic, along with our commentary. We have published only small excerpts of each article, in keeping with fair use; you can click the link to read the original article in full.


Child Sexual Abuse Images Found in AI Training Material

Despite its confusing name, Artificial Intelligence (AI) is not, in fact, intelligence. It is a highly complex amalgamation of data, in which individual data points are intricately related to one another through an equally complex set of algorithms. In other words, whatever output AI produces is based on those complex links, stored in a large body of data that the system can access quickly and easily.

Therefore, the larger the dataset an AI program can access, the more “intelligent” it appears. In other words, for AI to appear intelligent, its dataset must expand constantly. (Un)fortunately, there are several projects committed to just that.

LAION is one such project: it builds datasets of images, linking each image to the title and alternative text supplied by its uploader. About two months ago, the Stanford Internet Observatory discovered more than a thousand child sexual abuse images in LAION’s dataset. This means that AI image generators are being trained on child pornography, increasing the likelihood that they will produce similar images in the future.

Stanford Report Uncovers Child Pornography in AI Training Data


By Editah Patrick/MSN

Stanford Internet Observatory has made a distressing discovery: over 1,000 fake child sexual abuse images in LAION-5B, a dataset used for training AI image generators. This finding, made public in April, has raised serious concerns about the sources and methods used for compiling AI training materials.

The Stanford researchers, in their quest to identify these images, did not view the abusive content directly. Instead, they utilized Microsoft’s PhotoDNA technology, a tool designed to detect child abuse imagery by matching hashed images with known abusive content from various databases. Read more…
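
To make the researchers’ approach concrete, here is a minimal sketch of the general hash-matching idea. This is not PhotoDNA, which is proprietary and uses a perceptual hash robust to resizing and re-encoding; the sketch substitutes Python’s standard-library SHA-256, and its function names and hash entries are invented for illustration. The point is the workflow: a file’s fingerprint is compared against a database of fingerprints of known abusive images, so no one ever has to view the content itself.

```python
import hashlib
from pathlib import Path

# Sketch only: PhotoDNA uses a proprietary *perceptual* hash that
# survives resizing and re-encoding; SHA-256 is a stand-in here and
# matches only byte-identical files. Hash entries are placeholders.
KNOWN_ABUSE_HASHES = {
    "placeholder_hash_1",  # real systems query curated hash databases
    "placeholder_hash_2",  # maintained by child-safety organizations
}

def fingerprint(path: Path) -> str:
    """Return a hex digest fingerprinting the file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known_abuse_image(path: Path) -> bool:
    """True if the file's fingerprint matches a known-bad entry.

    Only fingerprints are compared, so an investigator never has
    to open or view the image itself.
    """
    return fingerprint(path) in KNOWN_ABUSE_HASHES
```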


Dangers of Sharing Pictures Online

Two recent cases of AI being used to create child pornography have prompted AI and child-safety experts to warn parents about the harms of AI-generated child pornography. They caution parents against posting children’s images online, as those images can be used to create new child pornography.

Expert warns parents of dangers with AI child pornography


By Naomi Kowles/WBTV

New AI technology is being used to turn normal pictures into pornography, including child pornography. It’s a phenomenon that has touched at least two recent criminal cases in Charlotte. The FBI said agents found hundreds of AI-generated child pornography images on the digital devices of a former American Airlines flight attendant, arrested in Charlotte last month for secretly recording young girls on planes. Read more…

A Flood of AI-Generated Child Pornography Is Confusing Police

AI opens a Pandora’s box when it comes to child pornography. While it can help detect CSAM (child sexual abuse material) through tools such as Microsoft’s PhotoDNA, it also hands criminals an easy tool with which to flood the web with fake, AI-generated child pornography images. This makes prosecuting real crimes more complex and diverts already understaffed police forces away from genuine cases.

Surge in AI-Generated Child Exploitation Images


By Ashley Belanger/Ars Technica

Law enforcement is continuing to warn that a “flood” of AI-generated fake child sex images is making it harder to investigate real crimes against abused children, The New York Times reported.

“Creating sexually explicit images of children through the use of artificial intelligence is a particularly heinous form of online exploitation,” Steve Grocki, the chief of the Justice Department’s child exploitation and obscenity section, told The Times. Experts told The Washington Post in 2023 that risks of realistic but fake images spreading included normalizing child sexual exploitation, luring more children into harm’s way and making it harder for law enforcement to find actual children being harmed.

Currently, there aren’t many cases involving AI-generated child sex abuse materials (CSAM), The NYT reported, but experts expect that number will “grow exponentially,” raising “novel and complex questions of whether existing federal and state laws are adequate to prosecute these crimes.” Read more…


Lack of Legislation Around AI-Generated Child Pornography

Child pornography is an abhorrent crime, and its prosecution is being sidelined by the flood of AI-generated CP images; yet even the AI-generated material itself should be a crime. Child safety experts are increasingly worried about the “explosion” of “AI-generated child sex images” that pedophiles share easily on their dark-web forums. After all, it would be naive to assume that a person who creates and distributes AI-generated CP would not engage in actual CP if given the chance.

However, creating and sharing these abusive images is not yet unambiguously a crime. It takes time for the legal system to recognize any new harm as a crime, and the same is true for AI-generated CP.

In addition, ordinary people cannot control what the future of technology holds: AI is under constant development by self-declared specialists whose knowledge is kept under wraps. There will always be technological conditions under which perpetrators can hide anonymously and keep harming more children.

Senate sends two AI child porn bills to House

By David Beard/The Dominion Post

The Senate advanced two bills to the House on Tuesday, both aimed at combating AI-era child pornography.

SB 740 criminalizes altering a photograph, image, video clip, movie, or recording containing sexually explicit conduct by inserting the image of an actual minor so it appears that the minor is engaged in the sexually explicit conduct.

The vote was 34-0. SB 741 also passed unanimously.

Where SB 740 involves using real victims in artificially generated porn, this bill concerns entirely digitally or AI-generated porn where the image appears to be a minor. Read more…


Featured image by Geralt/Pixabay

Thumbnails, from top to bottom: Kelle Pics/Pixabay, Gerd Altmann/Pixabay, Lisa Fotios/Canva and Kuloser/Pixabay