Student safety

Deepfake sexual abuse is an urgent student safety risk—does your school have a plan?

What you'll get if you choose to engage with us:

  • A trained risk management team – Your leadership will understand AI-driven sexual abuse, legal risks, and best practices.

  • Policies and incident response plans – You'll have policies and plans developed through tabletop exercises and case studies.

  • Resources for staff and parents – Your community will be prepared to respond to an incident and equipped to prevent one.

  • Expertise – The guidance and resources we share with you are informed by our ongoing work with the NAIS general counsel, the Office of the First Lady, and families of victims.

Schedule a call

Let’s discuss your school’s needs and develop the right approach.

Email evan@pathosgroup.ai to set up a Zoom call. After our conversation, you’ll receive a customized scope of work proposal for your review.

Fast facts

  • Sextortion cases are surging – Reports of sextortion doubled from 2022 to 2023, with more than 26,000 cases last year and 20 minors who died by suicide.

  • Deepfake nudes are hitting schools – 11% of students (ages 9–17) say they know classmates who have used AI to create fake nude images.

  • Schools are facing lawsuits – A Pennsylvania school was sued after AI-generated nude photos of 50 students circulated, with parents alleging the administration failed to act.

  • Students are getting arrested – In Florida, two middle schoolers were charged with felonies for making and sharing explicit deepfakes of classmates.

  • Mishandling evidence is a legal risk – A Colorado school administrator was charged with child exploitation for improperly storing explicit student images during an investigation.

  • Independent schools are unprepared – Most school policies were written before AI deepfakes existed, leaving leaders without clear protocols to respond.