
The Murky Line Between Indecent and Prohibited Images – and How AI Is Exploiting It

07.08.25 | Blog

By Kaelin Fletcher-Taylor

It would be a cliché to say that the line between reality and fiction is blurring. So, I have no desire to lean into Orwellian stereotypes about a ‘post-truth era’ that would do little but encourage conspiracy. For the purposes of these ramblings, it suffices to say that in an online climate where artificial intelligence (AI) can create images we don’t look twice at, the Online Safety Act is welcome.

Of course, it has not been without controversy, and various right-wing politicians have spoken out with all the vitriol of people whose own presumably self-loathing online activity has been interrupted by age verification checks. But as with all qualified human rights, censorship and freedom of speech are a balancing act. Protecting children by giving Ofcom the power to monitor and enforce the removal of illegal content is a justifiable compromise, even at the expense of the age-appropriate material now being censored in an initial over-calibration.

However, I am disappointed that the Act did not take the opportunity to address other glaring pitfalls in the legislation covering digital abuse, particularly the offences set out against child sexual abuse images (CSA images).

Currently, there are three main offences in this area: the making/taking of indecent photographs, the possession of those photographs, and, separately, the possession of prohibited images. It is forgivable to find this distinction murky. In short:

  • An indecent photograph is a photo, film or piece of data depicting a child that would offend the modesty and decency of the average person. It can also be a tracing or edit (think Photoshop or AI edits) that originates from a real-life photo.
  • A prohibited image, on the other hand, is an image of a child that is not necessarily ‘real’ (cartoons, drawings, animations and the like) and that is sexual, offensive or obscene.

In other words, the question prosecutors are told to ask themselves is: would the image, if printed, look like a photograph? If yes, it is probably an indecent photograph; if no, it is probably a prohibited image.

However, doesn’t it seem inevitable that AI will throw a spanner in the works?

It is worth noting that steps have been taken to tackle AI CSA images: police forces are grappling with existing legislation to pursue just charges, and the Crime and Policing Bill has proposed a new offence of possessing, creating or distributing AI models designed to generate child sexual abuse material.

It is a welcome change to crack down on the models that create this sickening content, but the law in this area remains too complex.

AI-generated prohibited images (i.e. images that are not ‘photo-realistic’ and have no traceable source material) are not victimless crimes.

Just because they have no identifiable source material does not mean that the model is not ‘learning’ from pools of these images, each of which contributes to making this offending so profitable. Yet they can fall within pockets of the law that avoid penalising broadly similar behaviour.

So, my argument is a simple one: the law on prohibited images must be brought in line with the law on indecent images, to stop offenders exploiting what was once a sensible distinction between ‘real-life’ photographs and digitally created or drawn ones.

This must start by establishing sentencing guidelines for prohibited images. There are currently no guidelines for sentencing those found in possession of prohibited images, leaving judges to interpret severity without clear benchmarks beyond the statutory maximum of three years. This will lead to inconsistency or leniency, especially where the image is stylised, animated or, importantly, AI-generated.

Secondly, we must make it an offence to distribute these images. Distribution of prohibited images is not separately criminalised, while distribution of indecent images is, highlighting the inconsistency that is so blindingly obvious in this area.

Of course, judges consider distribution at sentencing, but this is a gap in the legislation that AI looks set to exploit. Judges cannot comment on offences that do not exist, and it leaves the public guessing as to why distributing prohibited images may attract less severe charges than possessing indecent images, despite potentially perpetuating the cycle of this behaviour.

These are crimes that are a stain on the public conscience, and they cause understandable outrage.

As such, the public need to understand these offences in simple terms and to understand why sentences are given. One cannot help but admire prosecutors stretching or reinterpreting existing laws to pursue those creating or distributing AI CSA images. But inconsistent sentencing, and the continued need to read between the lines of our statutes, do little to provide the clear legal education that the UK is sorely missing.
