Opinion
Musk’s ‘Spicy Mode’ AI porn generator is not just dumb, it’s dangerous
Tim Biggs
Consumer Technology Writer

Things are happening fast in the world of generative AI, and in the world of online safety. But where the two collide, particularly in the increasingly concerning regulatory environment of the United States, things seem to have shifted practically overnight.
We’re approaching a place where major Western governments are blocking access to educational material for young girls to learn about their own bodies, while AI scans images of those bodies without consent to make models for generating salacious material on demand. A place where mentions of homosexuality and transgenderism are scrubbed from scientific documents and pressure campaigns stop artists making money off their original erotic content, but where the world’s richest man can sell you deepfakes of Taylor Swift’s bare breasts for $45 a month.
Elon Musk has said Grok Imagine and its Spicy Mode will get better every day. Credit: AP
It’s easy to imagine a horror-scenario future for our culture, which is already dominated and regulated by an internet that’s increasingly privatised and vulnerable to the whims of governments and tech oligarchs. A scenario where certain kinds of sex – anything that might empower women or queer people – are deemed woke and get buried, while other kinds – like the ability of men to see any woman they want naked, with her powerless to do anything about it – are deemed fine.
And it’s all the easier to imagine as the constant procession of AI grossness and conservative government mandates make us so apathetic that barely anybody seems to point it out any more.
Last week, Elon Musk’s xAI rolled out a public version of its image-generation platform Grok Imagine, with a marquee feature being its ability to turn any generated image into a short video.
But true to form for Musk’s “free speech absolutist” X platform, which has a history of allowing, gestating and amplifying harm towards anyone of a different gender, sexuality, skin colour or belief system to its owner, this is mainstream AI video with a recklessly immature difference: an optional “Spicy Mode” ensures the video output is suggestive or explicitly sexual in nature.
The idea is that the user (and they must be a paying user, with access to Grok Imagine starting at $45 a month) can ask for an image of any person they want in any situation they want, then hit the Spicy button to turn it into soft porn. X is filled with videos of users celebrating and sharing their creations.
As you might expect, almost all the generated videos are of women, with videos commonly showing them removing their tops and shorts to expose their naked bodies, rolling around while ripping at skintight suits, or messily eating ice cream that Grok seems to think should become liquid the second it’s touched by anything. The videos are uncanny, with moments of photorealism but also strange physics, impossible behaviour, weird shiny textures and other common AI video oddness, plus awful audio. They’re also extremely unimaginative, despite the name of the product.
But while it scores no points for nuance, there is something Grok Imagine is very good at: convincingly depicting real and recognisable women.
If, for example, you ask for Taylor Swift (which evidently many have), the system will do a convincing job because there are many images of the celebrity in its training data. For the same reason, Grok Imagine is good at evoking just about any woman well-known enough to have lots of images online, and notably poor at making images of men do anything sexy.
Multiple outlets have reported that Grok Imagine will quickly generate nude videos of celebrities including Taylor Swift. Credit: AP
It’s unclear what guardrails Spicy Mode has since xAI has not published them in detail. Some users have complained that they can’t get spicy results from photos they’ve uploaded to Grok, or that videos appear with blurring or censor bars. Yet others have shown videos of recognisable people fully nude. The Verge reported that, during its testing, the tool returned topless and nude videos of celebrities without the user even needing to specify that they should take their clothes off.
Until recently, Musk’s nude hyperreality might have provoked a real outcry. But we’ve become so used to this behaviour from Musk and X (a platform many people who might have been bothered by this have already left), and so used to concerning AI applications, that it barely registers. I imagine some nations’ regulators may be looking at whether it breaks any rules, but that historically hasn’t bothered billionaires, and in America most companies seem to be falling in line with the conservative government’s priorities. For example, I don’t think there’s any real danger that Visa or Mastercard would consider refusing to process xAI subscriptions.
Some users have uploaded Spicy Mode AI-generated videos to X for public viewing.
To be clear, the problem I’m highlighting isn’t that there’s porn on the internet. That’s been the case for a long time and plenty of avenues are normal, healthy and ethical. The fact that Musk is selling nudes also isn’t the issue, although I’m sure the millions of creators on OnlyFans who’ve had their videos studied to create Grok Imagine would rather the cash went to them than to a multibillionaire.
The risk here is an uncritical adoption of a technology that could soon begin to erode the importance of consent. A major platform is selling the ability to choose any prominent woman, or create one to your own specifications, and have them take their clothes off to become one-dimensional sexual objects in a single click. The platform might not be able to consistently fulfil that promise yet as it’s very janky, but occasional results are already spot-on, and it will only improve.
It’s not an exaggeration to say that the consumption of this content could impact a person’s concept of healthy boundaries and their respect for the personhood of others. It could distort their attitudes about how women do or should behave, which can be dangerous when applied to real life, especially in a situation where we can’t depend on government sex education to remain robust around the world. These kinds of consequences are why every other major AI company (for now) has guardrails on the kind of content it will generate.
While Musk obviously isn’t coding these features or training the AI himself, he’s clearly behind the direction of the product, and the fact that he doesn’t understand any of this or apply it to “Spicy Mode” and its generative output should surprise nobody.
This is the same man who, when Taylor Swift endorsed Kamala Harris for US president, somehow took it as an act of playing hard to get directed at him personally, and publicly “offered” to impregnate the singer. More recently, he made a bizarre claim that women are trained by the media to hate white people, and followed it up by boosting an incredibly hateful reply saying women automatically reinforce the dominant culture because they’re “built to be traded” among tribes.
The idea that the head of a major company could say things like this in public with little scrutiny and no repercussions should be shocking, let alone when he’s one of the most prominent businesspeople on the planet. Businesses have been boycotted for less. But that’s the reality we’re in now.
And so we should likely expect that the world’s richest man’s $45-a-month non-consensual deepfake porn generator will continue to operate. Even while the conservative governments of the world continue to dismantle the majority of sex-related rights and education we’ve built over the past century.