Nonprofit Offers Resources to Help Schools ‘End Deepfakes’

A website from the nonprofit Opportunity Labs went live this week with a K-12 deepfake policy framework, incident response guide and the start of a platform for state education leaders to collaborate on guidance.

As education leaders scramble to catch up with artificial intelligence, the nonprofit Opportunity Labs hopes to help them get a handle on one of its more dangerous downsides: the rise of sexually explicit deepfake images that depict students. Deepfakes are AI-generated photos, videos or audio recordings that appear authentic and often include a real person’s image or voice without their consent. When they depict a person younger than 18 in a sexually explicit manner, they are considered child sexual abuse material, the term now used for what was once called child pornography.

Opportunity Labs launched a website this week called End Deepfakes that includes a K-12 policy framework, incident response guide and other resources for schools to address this emerging issue. The site will also host a network for state education leaders to collaborate on district guidance, according to Andrew Buher, founder and managing director of Opportunity Labs. Buher, the former chief operating officer for the New York City Department of Education and a lecturer in the Princeton School of Public and International Affairs, said End Deepfakes aims to round out the law enforcement approach to sexual deepfakes with proactive, student-supportive measures for schools.



In a survey conducted by the nonprofit Center for Democracy and Technology last summer, 15 percent of high school students said they had heard about sexual deepfakes that depict someone from their school, while 50 percent of teachers said they had not received any state or district guidance on how to handle student reports of such incidents. Meanwhile, 23 states have laws that make it a crime to create or share sexual deepfakes that depict minors, according to the latest data from the government relations firm MultiState Associates.

A similar federal law, the Take It Down Act, passed the U.S. House this week and is headed to President Donald Trump’s desk for his signature. The bipartisan bill, championed by Melania Trump, makes it illegal to post or threaten to post authentic or AI-based intimate images of another person without their consent.

It also requires online platforms to remove such images within 48 hours of receiving notice that they exist.

“The Take It Down Act, I would say, is necessary but insufficient,” Buher said. “Our instinct is this needs to be an everything-above-the-line effort. There need to be federal laws, there need to be state laws, and there needs to be really, really great guidance from state education agencies that districts and schools can rely on to adopt evidence-based policies.”

To bring leaders from state education agencies together to work on strategies to reduce and respond to deepfakes that involve students, Opportunity Labs is setting up what it calls the End Deepfakes in K-12 Education Network. “This is underpinning the idea that states should be issuing guidance that districts can adopt,” Buher said.

“This is a platform to help them come together to build that guidance, utilizing each other’s knowledge and also subject matter experts that we will bring in for webinars to help answer questions.”

Staff from state education agencies can use a contact form on the End Deepfakes website to join the network, and Buher said Opportunity Labs is reaching out to states about the network as well, through groups such as the Council of Chief State School Officers and the National Governors Association. “If they don’t get to us, we will get to them,” he said. “We’re hoping to kick it off as soon as possible because we would love if states had created guidance by the start of next school year.”

To give state education agencies, as well as individual school districts, a head start on deepfake policies and procedures, Buher said Opportunity Labs built resources for its End Deepfakes website by working with K-12 teachers and administrators, parents whose children had been deepfake victims, AI literacy developers, attorneys, and think tanks such as the Center for Democracy and Technology. One such resource is a K-12 deepfake policy framework, which offers details on how schools can amend existing policies, such as codes of conduct, acceptable-use policies and student disciplinary policies, to address deepfake incidents.

It also includes an anti-deepfake student pledge. “There is a whole important set of work that is focused on social norming and school culture that has to go along with any policies that are adopted at the district or school level. We thought a student pledge was something really critical to begin that social-norming process,” Buher said.

“You can have all the policies in place that you want, the laws criminalizing this behavior, but the real change will only happen through kids holding their peers accountable and creating really strong social norms and a culture of not accepting this behavior.”

Another resource on the site is an incident response guide for school leaders, which Buher said is geared toward principals and includes steps for assessing, reporting and containing deepfake incidents; providing support and safety for targeted students; interacting with law enforcement; applying student discipline; and setting up communication protocols. He added that separate response guides for teachers and superintendents will be on the site within the next month, as will a deepfake teaching and learning library with detailed evaluations of school AI literacy programs that address deepfakes.

However, Buher said more research is needed to understand what works when it comes to stopping the spread of sexual deepfakes in schools. In a call to action to academics and researchers, the End Deepfakes site includes a research agenda that lays out a road map for studying “image-based sexual abuse in K-12, including prevalence, risks and prevention strategies.”