New Laws To Prevent Using AI To Generate Sexual Images
Britain will be the first country to make it illegal to possess, create or distribute Artificial Intelligence (AI) tools designed to create child sexual abuse material (CSAM). Offenders will face a punishment of up to five years in prison.
The legislation is intended to tackle the use of AI to create abusive child imagery, introducing four new laws to counter the growing threat of child sexual abuse that AI presents.
Artificially generated CSAM involves images that are either partly or completely computer generated. Software can "nudify" real images and replace the face of one child with another, creating a realistic image. In some cases, the real-life voices of children are also used, meaning innocent survivors of abuse are being re-victimised. Fake images are also being used to blackmail children and force victims into further abuse.
The British National Crime Agency reports around 800 arrests each month relating to threats posed to children online. It estimates that 840,000 adults nationwide pose a threat to children, both online and offline, which amounts to 1.6% of the adult population.
The legislation is intended to address the following criminal acts:
- “Nudifying” real-life images of children, or stitching their faces on to existing images of abuse, are among the ways AI is being used by abusers, with fake images being used to blackmail children and force them to livestream further abuse.
- Possession of AI paedophile manuals, which teach people how to use AI for sexual abuse, will also be made illegal, and offenders will get prison sentences of up to three years.
- Other laws set to be introduced include making it an offence to run websites where paedophiles can share child sexual abuse content or provide advice on how to groom children.
The UK Border Force will be given powers to instruct individuals who they suspect of posing a sexual risk to children to unlock their digital devices for inspection when they attempt to enter the UK, as CSAM is often filmed abroad. Depending on the severity of the images, this will be punishable by up to three years in prison.
The Internet Watch Foundation (IWF) has warned that AI-generated sexual abuse images of children are being produced in growing numbers and becoming more prevalent.
AI-generated images have been used to blackmail children and force them into more abusive situations, including the live streaming of abuse. AI tools are also helping perpetrators disguise their identity to help them groom and abuse their victims.
Senior police officers say that there is now well-established evidence that those who view such images are likely to go on to abuse children in person, and they are concerned that the use of AI imagery could normalise the sexual abuse of children.
British Home Secretary, Yvette Cooper, said “We know that sick predators' activities online often lead to them carrying out the most horrific abuse in person. This government will not hesitate to act to ensure the safety of children online by ensuring our laws keep pace with the latest threats.
“These four new laws are bold measures designed to keep our children safe online as technologies evolve. It is vital that we tackle child sexual abuse online as well as offline so we can better protect the public from new and emerging crimes as part of our Plan for Change.”
The new laws will be brought in as part of the Crime and Policing Bill, which has yet to come before Parliament.
BBC | Guardian | Times of India | Sky | IWF | Yahoo | Independent
Image: Leire Cavia