#csam

#deepfakes
from WIRED
4 days ago
Artificial intelligence

Deepfake 'Nudify' Technology Is Getting Darker – and More Dangerous

from TechCrunch
1 week ago
US news

California AG sends Musk's xAI a cease-and-desist order over sexual deepfakes | TechCrunch

from Engadget
2 weeks ago
Apple

28 advocacy groups call on Apple and Google to ban Grok, X over nonconsensual deepfakes

#grok
from Jezebel
2 weeks ago
US politics

Everyone is Distancing Themselves from Grok. Pete Hegseth Just Let It Into the Military.

World news
from Jezebel
2 weeks ago

Malaysia and Indonesia Apparently Care More About Banning Grok's CSAM Than the United States

Grok generated sexualized deepfakes, including images of underage girls and nonconsensual images, prompting international blocks and legal enforcement actions.
Artificial intelligence
from TechCrunch
3 weeks ago

xAI says it raised $20B in Series E funding | TechCrunch

xAI raised $20 billion in Series E funding and plans further expansion, claiming 600 million monthly users, even as Grok generated sexualized deepfakes, including CSAM, and remains under international investigation.
from www.theguardian.com
3 weeks ago

'I felt violated': Elon Musk's AI chatbot crosses a line

Late last week, Elon Musk's Grok chatbot unleashed a flood of images of women, nude or in very little clothing, both real and imagined, in response to users' public requests on X, formerly Twitter. Mixed in with the generated images of adults were ones of young girls likewise wearing minimal clothing, according to Grok itself. In an unprecedented move, the chatbot itself apologized while its maker, xAI, remained silent.
Miscellaneous
EU data protection
from www.dw.com
3 weeks ago

Grok under fire for sexualizing women's and children's images | DW, 01/03/2026

Grok's image-edit feature enabled creation of sexualized images of women and children, prompting urgent safeguard fixes and international regulatory scrutiny.
from Defector
4 weeks ago

Who's Responsible For Elon Musk's Idiot Chatbot Producing On-Demand Child Sexual Abuse Material? | Defector

Twitter, also called X, the social media network owned and constantly used by the world's richest man as well as virtually every powerful person in the American tech industry, and on which the vast preponderance of national political figures also maintain active accounts, has a sexual harassment and child sexual abuse material (CSAM) problem. This has been true more or less since Elon Musk took it over, but this problem's latest and most repellent efflorescence is the result of one of Musk's signature additions as owner.
World news
Privacy professionals
from The Verge
2 months ago

Meta had a 17-strike policy for sex trafficking, former safety leader claims

Meta allegedly prioritized user engagement over safety, allowing repeat sexual exploitation violations and lacking clear CSAM reporting on Instagram.
from The Hacker News
4 months ago

DOJ Resentences BreachForums Founder to 3 Years for Cybercrime and Possession of CSAM

The U.S. Department of Justice (DOJ) on Tuesday resentenced the former administrator of BreachForums to three years in prison in connection with his role in running the cybercrime forum and possessing child sexual abuse material (CSAM). Conor Brian Fitzpatrick (aka Pompompurin), 22, of Peekskill, New York, pleaded guilty to one count of access device conspiracy, one count of access device solicitation, and one count of possession of child sexual abuse material. Fitzpatrick was initially arrested in March 2023 and pleaded guilty that July.
US news
#child-pornography
#nonconsensual-content
from TechCrunch
4 months ago
Information security

Pornhub owner pays $5M settlement to FTC over historic failure to block abusive content | TechCrunch

Privacy technologies
from www.theguardian.com
5 months ago

Privacy at a cost: the dark web's main browser helps pedophile networks flourish, experts say

Tor's anonymity enables sprawling dark-web communities of child predators who share CSAM and grooming strategies and normalize exploitation, while the network resists content removal.
Law
from Boston.com
5 months ago

Mass. man gets 46 years after chronicling his sexual abuse of kids

Justin Benoit received a 46-year federal prison sentence, plus supervised release, for sexually abusing children, recording those offenses, uploading CSAM, and possessing hundreds of child sexual abuse files.
World news
from Search Engine Roundtable
5 months ago

Google Ads Child Sexual Abuse Imagery Policy Update

Google will update Google Ads Child Sexual Abuse Imagery policy on October 22, 2025, expanding prohibited content and treating violations as egregious with immediate suspension.
Digital life
from AdExchanger
5 months ago

Mediaocean Partners With The Internet Watch Foundation To Report CSAM Content | AdExchanger

A partnership between Mediaocean and IWF aims to enhance digital media safeguards against child sexual abuse material.
from Ars Technica
7 months ago

Worst hiding spot ever: /NSFW/Nope/Don't open/You were Warned/

Captain Samuel White approved a search of Bartels' gear, leading to the discovery of illegal activity, including possessing and purchasing CSAM while stationed at Guantanamo.
Privacy technologies
DC food
from The Register
8 months ago

US Navy petty officer charged in horrific CSAM case

A US Navy petty officer has been charged with distributing child sex abuse material via Discord after a detailed FBI investigation.