Artificial Intelligence And The Menace Of Child Sexual Abuse

Schools are now experiencing incidents of students possessing, sharing or requesting nude images every month - and the children involved are, on average, 11-13 years old

Artificial Intelligence (AI) has brought many benefits. It has improved productivity, triggered breakthroughs in healthcare, enabled instant translation between different languages and transformed our daily lives.

However, there is a dark side to this technology: the exploitation of AI for malicious purposes, particularly the creation and transmission of Child Sexual Abuse Material (CSAM).

Alarming evidence from Qoria

Qoria's safeguarding software protects 24 million children globally. Its new report discloses the extent to which AI is affecting the safety of children in UK schools, highlighting the urgent need for a coordinated approach from government, digital industries, schools and families to deal with these threats.

The survey gathered data from 447 school communities across the UK, out of 603 schools surveyed globally. Responses came from education staff with responsibility for safeguarding children, including school leaders, IT directors, designated safeguarding leads (DSLs), governors and pastoral staff.

The report presents evidence that children as young as eight are involved in these incidents, with 11-13-year-olds the most affected age group. Snapchat is the primary platform for sharing this content (49%), followed by text messages (16%).

AI creates nude images
