dfreinc says
>The IWF report reiterates the real-world harm of AI images. Although children are not harmed directly in the making of the content, the images normalise predatory behaviour and can waste police resources as they investigate children who do not exist.
>In some scenarios new forms of offence are being explored too, throwing up new complexities for law enforcement agencies.
>For example, the IWF found hundreds of images of two girls whose pictures from a photoshoot at a non-nude modelling agency had been manipulated to put them in Category A sexual abuse scenes.
>The reality is that they are now victims of Category A offences that never happened.