Deepfake sexual abuse is an urgent student safety risk—does your school have a plan?
What you'll get when you work with us:
A well-trained leadership team – Your leadership will understand AI-driven sexual abuse, legal risks, and best practices.
Policies and incident response plans – You'll have policies and plans developed through simulations and case studies.
Resources for staff, parents, & students – You'll be prepared to respond to an incident and equipped to prevent one.
Expertise – The guidance and resources we share with you are informed by our ongoing work with the NAIS general counsel, the Office of the First Lady, and victim families.
In just 10 minutes, find out how prepared your school is to handle deepfake threats. Get a personalized risk score, tailored recommendations, and clear next steps.
Sextortion cases are surging – Reports of sextortion doubled from 2022 to 2023, with more than 26,000 cases last year; 20 minors died by suicide as a result.
Deepfake nudes are hitting schools – 11% of students (ages 9–17) say they know classmates who have used AI to create fake nude images.
Schools are facing lawsuits – A Pennsylvania school was sued after AI-generated nude photos of 50 students circulated, with parents alleging the administration failed to act.
Students are getting arrested – In Florida, two middle schoolers were charged with felonies for making and sharing explicit deepfakes of classmates.
Mishandling evidence is a legal risk – A Colorado school administrator was charged with child exploitation for improperly storing explicit student images during an investigation.
Independent schools are unprepared – Most school policies predate AI deepfakes, leaving leaders without clear response protocols.