AI Governance Frameworks and Field Tools for K-12 Education
-
When emerging technologies such as deepfakes began to surface in K-12 contexts, schools and districts faced real uncertainty about the implications for student safety, instructional integrity, and institutional responsibility. Guidance was fragmented across legal, technical, and instructional domains, and there were few shared reference points for decision-making.
At the same time, school-based professionals were increasingly fielding questions and concerns about technology-driven incidents without clear system-level guidance. Absent that clarity, responsibility often fell unevenly on individuals to interpret risks, respond to incidents, and assess new tools on their own.
-
Opportunity Labs convened experts across K-12 education, law, and technology to create public, actionable resources. Key actions included:
Synthesizing instructional, legal, and technical perspectives on emerging technologies, including deepfakes, into clear, plain-language briefing materials for school contexts
Designing and analyzing a survey of school-based social workers to understand how emerging technology issues were appearing in practice, where uncertainty was highest, and what additional support might be needed
Convening and facilitating cross-functional roundtables with educators, school-based staff, policy experts, and technologists to surface concerns, compare perspectives, and build shared understanding
Developing procurement benchmarks and surfacing practical questions institutions could use to assess vendor claims, data practices, implementation burden, and long-term viability
Across this work, the emphasis was on slowing the conversation down enough to support clearer thinking, rather than rushing toward premature adoption.
-
This work produced a policy framework, an incident response guide, a research agenda, and procurement benchmarks — concrete resources for school and district leaders navigating deepfake-related decisions without clear system-level guidance. A survey of school-based social workers surfaced where uncertainty was highest and what additional support practitioners most needed. The goal throughout was to give institutions something they could actually use: clear reference points in a space where guidance was otherwise fragmented.
-
In periods of rapid technological change, clarity and shared language are often more valuable than speed
When systems lack clarity, the burden of interpretation falls unevenly on individuals
Early questions about investment and procurement can shape how institutions approach new tools, even before formal decisions are made
-
Director, Opportunity Labs (2023–2025)
I contributed to Opportunity Labs' AI governance and field-building portfolio, with primary responsibility for deepfake prevention and response work and national AI research. This included:
Leading strategy, design, and development of a national resource and policy hub for deepfake deterrence, prevention, and incident response in K-12 education, producing a policy toolkit, research agenda, and incident response guide for state education leaders
Leading national research on AI adoption in schools by designing and administering surveys of district superintendents and school social workers, synthesizing findings into policy guidance for state and city decision-makers and philanthropic investors
Helping develop Procurement Benchmarks for AI in K-12 Education and Strategic Investment Principles for Generative AI, and adapting the framework for school-based mental health contexts
Supporting national roundtables convening 50+ educators, policymakers, researchers, and technology developers, contributing to synthesis, interview design, and programming