It seems that instead of updating Grok to prevent it from generating sexualized images of minors, X is planning to purge users ...
Government officials have been quoted as saying that X's 'safe harbor' status could be revoked because of Grok's CSAM content ...
LAION, the German research org that created the data used to train Stable Diffusion, among other generative AI models, has released a new dataset that it claims has been “thoroughly cleaned of known ...
On X, sexual harassment and perhaps even child abuse are the latest memes.
For years, hashing technology has made it possible for platforms to automatically detect known child sexual abuse materials (CSAM) to stop kids from being retraumatized online. However, rapidly ...
Two major developments reignited regulatory and technological discourse around Child Sexual Abuse Material (CSAM) this year: the first being Visa & MasterCard cracking down on adult sites that contained ...
Thousands of CSAM victims are suing Apple for dropping plans to scan devices for the presence of child sexual abuse materials. In addition to facing more than $1.2B in penalties, the company could be ...
On Friday, Sens. Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) sent jointly written letters to Amazon, Google, Integral Ad Science, DoubleVerify, the MRC and TAG notifying the companies that ...
Thousands of victims of CSAM (child sexual abuse materials) are now taking the fight to Apple after the company ultimately decided against adding tools that would help detect such material on its ...