Child safety org flags new CSAM with AI trained on real child sex abuse images
AI will make it harder to spread CSAM online, child safety org says.
Ars Technica (arstechnica.com)
-
Today, the child safety organization Thorn, in partnership with the cloud-based AI company Hive, announced the release of an AI model designed to flag unknown CSAM at upload. It is the first AI technology aimed at exposing unreported CSAM at scale.