Elon Musk’s AI chatbot Grok generated an estimated three million sexualised images of women and children in a matter of days, researchers said Jan 22, revealing the scale of the explicit content that sparked a global outcry.
The recent rollout of an editing feature on Grok, developed by Musk’s startup xAI and integrated into X, allowed users to alter online images of real people with simple text prompts such as “put her in a bikini” or “remove her clothes”.
A flood of lewd deepfakes exploded online, prompting several countries to ban Grok and drawing outrage from regulators and victims.
“The AI tool Grok is estimated to have generated approximately three million sexualized images, including 23,000 that appear to depict children, after the launch of a new image editing feature powered by the tool on X,” said the Center for Countering Digital Hate (CCDH), a nonprofit watchdog that researches the harmful effects of online disinformation.
CCDH’s report estimated that Grok generated this volume of photorealistic images over an 11-day period – an average rate of 190 per minute.
The report did not say how many images were created without the consent of the people pictured.
It said public figures identified in Grok’s sexualized images included American actress Selena Gomez, singers Taylor Swift and Nicki Minaj as well as politicians such as Swedish Deputy Prime Minister Ebba Busch and former US vice president Kamala Harris.
“The data is clear: Elon Musk’s Grok is a factory for the production of sexual abuse material,” said Imran Ahmed, the chief executive of CCDH.
“By deploying AI without safeguards, Musk enabled the creation of an estimated 23,000 sexualized images of children in two weeks, and millions more images of adult women.”
There was no immediate comment about the findings from X. When reached by AFP by email, xAI replied with a terse automated response: “Legacy Media Lies.”
Last week, following the global outrage, X announced that it would “geoblock the ability” of all Grok and X users to create images of people in “bikinis, underwear, and similar attire” in jurisdictions where such actions are illegal.
It was not immediately clear where the tool would be restricted.
The announcement came after California’s attorney general launched an investigation into xAI over the sexually explicit material and several countries opened their own probes.
“Belated fixes cannot undo this harm. We must hold Big Tech accountable for giving abusers the power to victimize women and girls at the click of a button,” Ahmed said.
Grok’s digital undressing spree comes amid growing concerns among tech campaigners over proliferating AI nudification apps.
Last week, the Philippines became the third country to ban Grok, following South-East Asian neighbours Malaysia and Indonesia, while Britain and France said they would maintain pressure on the company.
On Wednesday, the Philippines’s Cybercrime Investigation and Coordinating Center said it was ending the short-lived ban after xAI agreed to modify the tool for the local market and eliminate its ability to create “pornographic content”. – AFP/TS