What does the law have to say about Grok, digital undressing and X?

22 Jan 2026
8 minute read

Lawmakers were embarrassed recently after users of the artificial intelligence model Grok created thousands of images of ‘digitally undressed’ women and children and shared them on X. The background to the story is set out below.

What’s the story? 

  1. In 2023, Elon Musk said in an interview with Tucker Carlson that he intended to develop an AI chatbot called TruthGPT, expressing concern that ChatGPT was being “trained to be politically correct.”
  2. Musk did create the chatbot, but ended up calling it Grok. 
  3. Grok is owned by xAI, an American company founded by Musk, and is integrated into X, the social media platform formerly known as Twitter. 
  4. In the past fortnight, it has been widely reported that Grok was fulfilling user requests to digitally undress images of real people. 
  5. Grok users were then publishing these images on X with no apparent consequences. 
  6. Political and public outrage ensued. Surely the law had a mechanism for holding someone to account for this behaviour? 

What does the law have to say? 

Reading the press coverage, you might have got the impression that lawmakers had failed to anticipate this problem, but that is not the case. A number of statutory mechanisms exist to prevent the creation and publication of images of this kind. Two which were widely discussed as UK politicians scrambled to respond to the story can be found in the Data (Use and Access) Act 2025 and the Online Safety Act 2023 (OSA). This article looks at the relevant provisions in those Acts. It doesn't aim to cover the wide range of other existing laws relating to intellectual property, data or defamation (among others) that might potentially be relevant to dealing with this sort of behaviour.

Data (Use and Access) Act 2025 (DUAA)

Section 138 of DUAA updated the Sexual Offences Act 2003, creating two new offences: 

66E Creating purported intimate image of adult 

(1) A person (A) commits an offence if:

  a. A intentionally creates a purported intimate image of another person (B),

  b. B does not consent to the creation of the purported intimate image, and

  c. A does not reasonably believe that B consents.

(2) ‘Purported intimate image’ of a person means an image which:

  a. appears to be, or to include, a photograph or film of the person (but is not, or is not only, a photograph or film of the person),

  b. appears to be of an adult, and

  c. appears to show the person in an intimate state.

66F Requesting the creation of purported intimate image of adult

(1) A person (A) commits an offence if:

  a. A intentionally requests the creation of a purported intimate image of another person (B) (either in general or specific terms),

  b. B does not consent to A requesting the creation of the purported intimate image, and

  c. A does not reasonably believe that B consents.

The key elements of these offences are the intentional creation of (or request for) the purported intimate image, B's lack of consent, and the absence of any reasonable belief by A that B consents. 

Comment

  1. Many of the troubling images would fall within the definition of ‘purported intimate image’ as they appear to show persons in an intimate state, as that term is defined in the Sexual Offences Act 2003.  
  2. Because they are AI generated, the images ‘appear to be’ photographs or films of the people depicted without actually being photographs or films of them, bringing them squarely within the definition of ‘purported intimate image’.
  3. You might ask whether Grok users ‘created’ the purported intimate images themselves, or whether Grok created them. Put another way, is it the company that controls Grok, or the human being prompting it, that would be guilty of the offence in section 66E (creating a purported intimate image)? 
  4. Even if section 66E (creating a purported intimate image of an adult) doesn't apply to the human being prompting Grok, section 66F (requesting the creation of a purported intimate image of an adult) would. 
  5. The new offences only apply to adults, but a similar offence that would apply to children arguably already exists in section 1 of the Protection of Children Act 1978 as amended by section 84 of the Criminal Justice and Public Order Act 1994. It’s uncertain whether this is clear enough to be enforceable, however. 
  6. The main problem here was that sections 66E and 66F of the Sexual Offences Act had not yet come into force. As is often the case, an Act of Parliament doesn't take effect all at once on the day it is enacted; instead, its provisions are brought into force in phases, via secondary legislation known as statutory instruments. That was due to happen in winter 2025/26, but hadn't happened at the time of the relevant events. 

Online Safety Act 2023 (OSA)

The OSA also updated the Sexual Offences Act 2003, creating a new offence:

66B Sharing or threatening to share intimate photograph or film

(1) A person (A) commits an offence if—

  a. A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,

  b. B does not consent to the sharing of the photograph or film, and

  c. A does not reasonably believe that B consents....

This section also says that “references to a photograph or film also include—

an image, whether made or altered by computer graphics or in any other way, which appears to be a photograph or film.”

Again, the key elements of the offence are the intentional sharing of the image, B's lack of consent, and the absence of any reasonable belief by A that B consents. Photographs shared in these circumstances are ‘illegal content’ for the purposes of the OSA. This means that user-to-user services such as X are obliged under section 10 of the OSA to remove this type of content from their platforms when they become aware of it.

Comment

  1. The AI generated images appear to fall within this extended definition of ‘photograph or film’.  
  2. Unlike the offences created by DUAA, the offence created by the OSA had begun to apply. 
  3. The problem here is that it is difficult to enforce the law against the many human beings using Grok to generate unlawful images. 
  4. The more practical avenue for tackling the intimate image abuse was to take action against the platform where the images were shared (X). 
  5. The OSA contains powers to:  
    a. issue large fines to platforms 
    b. block access to the offending platform in the United Kingdom (see section 144, Service Restriction Orders). 
  6. Several politicians suggested that blocking access to X might be the right thing to do, given the company’s initial apparent refusal to comply with its legal obligations.  

Does the law offer adequate protection against online harms of this kind? 

The Government thinks not. Science and technology secretary Liz Kendall has sent a letter to Dame Chi Onwurah MP, chair of the science, innovation and technology committee, pledging new legislation:

“to ban AI ‘nudification’ tools…The new legislation will allow the police to target the firms and individuals who design and supply these disgusting tools. We will bring forward this legislation as a priority, making amendments to the Crime and Policing Bill going through Parliament now... 

Alongside these amendments, Government will bring into force as a matter of urgency powers to criminalise the creation of intimate images without consent [ie the offences in DUAA, discussed in this article], building on existing legislation which bans sharing, or threatening to share non-consensual intimate images [ie the offence in the OSA, discussed in this article].”

It'll be interesting to see what form the new legislation takes and the specific gaps in the existing law that it seeks to plug. 

Conclusion

After a week of intense public scrutiny, the company responsible for Grok tweaked the chatbot’s settings to prevent it from digitally undressing photographs of human beings. An Ofcom investigation is ongoing, and it wouldn't be a surprise to see a fine imposed on X in due course. 
