
APS Artificial Intelligence (AI) Frequently Asked Questions (FAQ)

Last updated 10/24/25


What is Artificial Intelligence (AI)? 

AI refers to technologies that simulate human intelligence using algorithms and data. Generative AI tools can create text, images, audio, and video by learning patterns from existing data.

How is APS using AI in schools? 

APS is integrating AI to:

  • Enhance teaching and learning.
  • Support personalized instruction.
  • Improve operational efficiency.
  • Promote equity and inclusion.

 

What generative AI tools will APS staff have access to during the 2025–2026 school year? 

For the 2025–2026 school year, APS staff will have access to thoroughly vetted tools, including the following platforms:

  • SchoolAI (primary instructional tool)
  • Microsoft Copilot, Google Gemini, and NotebookLM (for planning and productivity)

 

What generative AI tools are approved for student use at this time? 

Students in grades 6–12 may begin using generative AI tools in guided environments once specific requirements have been met by both staff and students, including:

Teacher Criteria

  • Complete Level 1 and Level 2 of the SchoolAI Certified Educator training on the platform.
  • Review SchoolAI’s responsible use expectations before using it with students.
  • Use approved tools only when students are being actively monitored in class.

Student Criteria

  • Complete the “Student Technology Acceptable Use Agreement” with the Instructional Technology Coordinator (ITC).
  • Participate in a teacher-led review of safe and ethical AI use and acknowledge participation by completing a “SchoolAI Responsible AI Use” form.

These safeguards help students engage with and explore AI in safe, responsible, and age-appropriate ways.

Can students use public AI chatbots like ChatGPT, Copilot, or Gemini? 

Students cannot use these tools on APS devices. Currently, student access to public AI tools is restricted due to approval requirements, including—but not limited to—age requirements and privacy regulations such as the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA). APS piloted secure, vetted tools with more than 400 volunteer staff members during the “2024 Year of Exploration.” Students in grades 6–12 will be able to use SchoolAI within structured, teacher-led, and monitored classrooms.

How does APS ensure student safety and privacy with AI? 

APS evaluates all digital resources, including AI tools, through a rigorous yearlong approval process involving Academics; Diversity, Equity, and Inclusion (DEI); Information Services; School Support; Student Services; and APS Legal. We follow strict privacy laws (including FERPA and COPPA) and prioritize transparency, ethical use, and data protection in every decision.

Will AI be used to replace teachers or traditional instruction? 

No. AI is designed to support—not replace—educators. It can help streamline tasks, generate instructional materials, personalize learning experiences, and boost student engagement. However, there is no substitute for a teacher’s ability to understand the nuances of student needs, build relationships, and guide learning with empathy and expertise.

How is APS preparing teachers to use AI in the classroom?

APS is preparing educators for AI integration through a multiphase professional learning plan. All instructional staff are expected to complete at least one course from the “APS GenAI for Educators” self-paced Canvas series, which builds foundational knowledge and practical skills. Teachers who complete all three courses, along with an additional assignment providing evidence of their learning, earn a certificate, a digital badge, and the “APS Certified GenAI Educator” designation.

In addition, each school’s ITC is leading a structured, school-based session aligned to the APS AI Framework. This session provides an overview of the content in the course series and introduces SchoolAI as part of our “Year of Empowered AI Learning” initiative. It helps all teachers explore both the challenges and opportunities AI brings to teaching and learning while building confidence around ethical use, instructional value, and student safety.

Quick one- to two-minute “AI Byte” videos will be shared with staff throughout the year to offer practical guidance and highlight classroom-ready strategies.

How can parents stay informed about AI use in APS? 

APS maintains a dedicated webpage with guidance documents, updates, and resources: Year of Empowered AI Learning. APS also held a virtual community engagement session on October 8, 2025, where parents learned more about our use of SchoolAI; the recording is available on the APS Generative AI Guidance page. More information about future engagement opportunities will be shared by the district as details are finalized.

What are the ethical concerns around AI in education? 

APS is committed to helping students understand:

  • The risks of misinformation and bias
  • The importance of academic integrity
  • How to use AI tools responsibly and critically

We want students to be critical thinkers and ethical digital citizens.

What are APS’ expectations for ethical and transparent use of AI in classrooms? 

APS has outlined expectations for both staff and students to promote responsible and transparent use of AI tools like SchoolAI:

Teacher Responsibilities

  • Review the safe and ethical use of AI with students before introducing any tools, including examples of appropriate and inappropriate use.
  • Clearly communicate the level of acceptable AI use for assignments or projects to both students and families.
  • Model proper citation of AI use as appropriate.

Student Responsibilities

  • Communicate openly and honestly with teachers about how AI is being used.
  • Follow the acceptable use guidelines for each assignment or project, as defined by the teacher.
  • Use proper citation of AI use as needed.

These expectations help build a culture of trust, integrity, and responsible digital citizenship in all AI-supported learning environments.

What is the guidance for student misuse of AI in schoolwork? 

This guidance is offered in the APS Handbook & Student Code of Conduct: “Deliberately copying, creating, or submitting work that is not your own—including through the use of technology or AI tools—is considered cheating, plagiarism, or forgery. Students are prohibited from sharing work or discussing assessments with others unless explicitly permitted by the teacher. Students may be subject to consequences.”

Will AI be included in student handbooks or curriculum? 

Yes. The district has begun to include AI-related policies in the 2025–2026 Student Rights & Responsibilities (previously the Handbook & Code of Conduct), the Acceptable Use Student Technology Acknowledgment, and the Acceptable Use Policy (AUP). Curriculum updates are underway to reflect AI literacy and responsible use. A living guidance document can be found on the APS website: APS Generative AI Guidance.

What factors did APS consider when selecting SchoolAI for the Generative AI pilot? 

APS evaluated both SchoolAI and MagicSchool during the pilot and selected SchoolAI based on, but not limited to, the following criteria:

  • Ease of use for educators
  • Alignment with instructional goals
  • Support for differentiated instruction
  • Ethical design and transparency
  • Quality of training and onboarding
  • Feedback from pilot participants
  • Technical support and integration
  • Student safety and data privacy
  • Vendor responsiveness and participation
  • Environmental impact and sustainability considerations
  • Bias review and equity alignment by the DEI staff

 

Does SchoolAI have third-party verification to ensure it complies with student data privacy laws like FERPA and COPPA? 

Yes. SchoolAI has multiple layers of third-party verification and transparency measures in place to protect student data:

  • FERPA and COPPA Compliance: SchoolAI meets federal requirements for student data privacy.
  • System and Organization Controls (SOC) 2 Type I Certification (2024): An independent audit confirmed that SchoolAI has implemented a robust system of security controls aligned with national trust service criteria.
  • 1EdTech TrustEd Apps Certification: SchoolAI is recognized for secure and interoperable educational technology.
  • Student Data Privacy Consortium (SDPC): SchoolAI has 393 active entries, representing approved data privacy agreements across many schools and districts—demonstrating widespread adoption and trust across the United States.

 

How will APS prevent students from using AI tools inappropriately or accessing harmful content?

APS uses K12-specific platforms designed for safe student use. Currently, SchoolAI is the only generative AI platform approved for secondary students, and it includes teacher-monitored sessions, backend filters, and session timeouts to prevent misuse. All student activity is visible to educators, and any concerning content is flagged and escalated according to district protocols. APS’s enterprise filtering system, Lightspeed, blocks access to unapproved websites and tools, allowing only those explicitly permitted. Teachers are trained to supervise student interactions and guide appropriate use, while expectations for safe and responsible behavior are outlined in course syllabi, student handbooks, and the Acceptable Use Policy. Together, these safeguards help ensure technology is used constructively while minimizing risks and protecting students.

How will APS address bias in AI?

Bias mitigation was a key criterion in selecting an AI platform for student use. APS is providing professional learning to staff on how to critically evaluate AI-generated information and write prompts that lead to more inclusive results. Inclusive outputs avoid stereotypes, generalizations, tropes, and assumptions, and they reflect diverse representation rather than dominant cultural norms. Our professional development prepares teachers to structure AI prompts transparently—encouraging evidence-based responses, acknowledgment of limitations, and multiple perspectives. AI is positioned as a co-planner, not an authority, with educators applying their professional judgment to actively mitigate bias.

Will elementary students use SchoolAI at all?

Elementary students do not have access to SchoolAI.

Is SchoolAI used by every teacher? Will teachers be required to use this tool, or can they teach their subject “old school” if they prefer?

The use of generative AI, including SchoolAI, is not required in APS. Teachers can choose to use the vetted and approved generative AI tools: Google Gemini, SchoolAI, NotebookLM, and Microsoft Copilot. All instructional staff are expected to take advantage of professional development opportunities – both in person and self-paced – to learn about generative AI and its impact on education.

How can parents actively provide input into APS’s evolving policies in AI?

APS has not developed a policy specific to Artificial Intelligence. Parents will have an opportunity to provide feedback on the Acceptable Use Policy during a 30-day public comment period.

Will AI dictate the curriculum?

No. APS curriculum is guided by the Virginia Standards of Learning—defining what we teach. The how of teaching is shaped by APS’s instructional values, not by AI. Generative AI tools do not determine curriculum. Instead, they support educator expertise by enhancing planning, differentiation, and student engagement. APS uses a districtwide platform that allows content offices to customize GenAI Spaces aligned with APS curriculum and instructional goals. Behind the scenes, best practices like Universal Design for Learning (UDL) and differentiated instruction are built into every response. AI is one tool among many. It supports—not replaces—educators, and it is meant to enhance learning without dictating it.

How is digital citizenship being taught across grade levels, and how does AI fit into that framework?

There are five qualities of digital citizenship taught at every grade level. School staff, including classroom teachers, Instructional Technology Coordinators, and Librarians, are teaching our secondary students how to apply these qualities to SchoolAI. In addition, before using SchoolAI with their class, the teacher shows a video focused on using AI responsibly: Using AI Responsibly for Students v1.mp4.

Will SchoolAI be on students’ iPads (that is, available for students to use at any time)?

SchoolAI Spaces can only be shared by teachers with secondary students through a join code. The tool is not an app, and it is not something students can log into on their own in a browser.

In grades 6–12, if a teacher has completed the necessary training requirements, they can create Spaces in SchoolAI to use with students. These Spaces are created with intentional instructional delivery in mind. SchoolAI is not installed on student devices. The teacher provides a code so students can join a specific Space, and the teacher is able to monitor the interaction.

When the lesson is over, the Space can easily be closed so that it is not available outside of class time with the teacher.

Can the school walk parents through SchoolAI?

SchoolAI partnered with APS to provide a walkthrough for parents on October 8, 2025. This recording is available on the APS Generative AI Guidance page.

Will there be any student group chats that will use GenAI?

There is no student group chat in SchoolAI.

How does APS help students think critically when using generative AI?

APS teaches students to use generative AI as a thinking partner, not a source of truth. Educators design prompts that require students to question, revise, and evaluate AI-generated responses. Students learn to identify bias, check facts, and reflect on accuracy. This builds inquiry, critical thinking, and self-directed learning, aligned with digital citizenship and Virginia’s learning standards.

How do you minimize bias in students’ use of generative AI?

APS works with students to engineer prompts that both match their learning needs and account for implicit and explicit bias. The minimization of bias in all aspects of schooling is a long-term goal of APS and applies to both in-person and digital interactions.

How does the APS Guidance for the Use of Generative AI in Schools map to the 4 C’s framework that SchoolAI uses to help APS benefit from AI?

SchoolAI’s 4 C’s (Conscientious, Collaborative, Critical, Creative) align with the Virginia Department of Education’s Profile of a Graduate, which emphasizes the development of Critical Thinking, Creative Thinking, Communication, Collaboration, and Citizenship. APS’ guidance reflects our commitment to helping students build these competencies through responsible, teacher-guided use of AI.

When will this application be available to our students? Or is it already live?

SchoolAI can be used with students in grades 6–12 once teachers have met certain criteria. Teachers who have chosen to use SchoolAI and completed the additional requirements can share monitored SchoolAI Spaces with students as part of their classroom instruction.

Can you clarify, is SchoolAI going to be made available to all students, and if so, when will that occur? Or will it only be available selectively in individual classes? If the latter, how will we know when students are being given access?

Once teachers and students in grades 6–12 fulfill specific APS requirements, SchoolAI can be used as a supplemental resource to support learning. Teachers can elect to use this tool – it is not a requirement. An introduction letter has been provided to instructional staff to share with parents before Spaces are shared with students.

What does SOC2 mean?

SOC 2 stands for System and Organization Controls 2.

It’s a framework used to evaluate how well a company protects customer data, especially in cloud-based services. It focuses on five areas: security, availability, processing integrity, confidentiality, and privacy. SOC 2 compliance demonstrates that an organization has strong information security practices to protect sensitive data.

When we say SchoolAI is SOC 2 compliant, it means the company has strong systems in place to keep student and school data secure, private, and reliable, meeting high standards for handling sensitive information.

What does PLC resource sharing mean?

PLC stands for Professional Learning Community. A PLC is a collaborative team of educators who engage in regular cycles of shared inquiry and reflective practice. The goal is to enhance student outcomes through collective action and evidence-based strategies. PLCs are grounded in the belief that improving student learning depends on educators participating in ongoing, embedded professional learning. Resources, whether generated through AI or not, are often shared within the team to benefit collaborative conversations and the development of student-facing resources.

At the SchoolAI Community Event, PLC resource sharing referred to educators exchanging AI-related tools, lesson ideas, prompt libraries, and feedback to support safe, effective, and student-centered use of SchoolAI.

APS seems to be operating under the assumption that students who are exposed to A.I. the most are the ones who will be at the greatest advantage in the future. What evidence is there that this is the case? (Has APS consulted evidence that suggests the opposite: that the more students use A.I., the worse their critical thinking gets?)

We seek to prepare students for future careers. According to the World Economic Forum Future of Jobs Report 2025, AI and big data skills are the fastest-growing globally, outpacing all other skill categories. Employers across industries overwhelmingly expect AI and information processing technologies to transform their businesses by 2030. The top three fastest-growing skills are AI and big data; networks and cybersecurity; and technological literacy.

How can students use and benefit from SchoolAI?

While much of the early conversation around SchoolAI has focused on supporting teachers, students also benefit directly from its features – especially through Spaces, which are educator-designed AI chatbots built for student use.

Students use SchoolAI to:

  • Explore ideas and organize thinking in guided GenAI Spaces
  • Study interactively using tools like flashcards and conversational review modes
  • Build AI literacy through scaffolded activities that teach responsible use, bias awareness, and citation practices

These experiences are always teacher-controlled, meaning educators design the prompts, monitor student interactions, and adjust instruction in real time.

What public engagement was done before APS started on the Year of Staff Exploration?

APS began with internal listening sessions, ACTL committee and subcommittee discussions, and cross-departmental planning meetings. Staff input was gathered through surveys and pilot reflections. Public engagement started in spring 2025 with ACTL outreach and PTA survey coordination. Broader engagement expanded through the October 8 Community Event, updated website resources, and FAQ development. APS also joined a state partnership with VASS to collaborate with other districts.

How will student use of AI be monitored by teachers/parents?

During class time, teachers may provide secondary students with access to generative AI tools within monitored environments called Spaces. These are currently only open during instructional hours, allowing teachers to guide and observe student use in real time.

How does APS handle plagiarism when students use AI tools? Are teachers using AI trackers like Turnitin?

APS takes academic integrity seriously. APS has integrated AI-related language into the Acceptable Use Policy and student conduct materials, and any concerns are addressed through established escalation procedures. AI detectors such as Turnitin have been disabled in APS due to their inconsistent accuracy, lack of transparency in how they evaluate AI-generated content, and potential bias against multilingual learners and neurodiverse students. Instead, APS emphasizes teacher guidance, student reflection, and responsible use practices to help students understand how to use AI ethically and cite it appropriately when used to support their work.

How can parents opt their children out of all assignments and assessments using A.I. and do paper and pencil alternatives?

APS supports teacher-led student use of generative AI tools at the secondary level and encourages open communication with families. If a parent prefers their child not use AI for a specific assignment, they’re encouraged to work with the teacher directly. This opens a conversation about the learning goals, such as building digital literacy and responsible tool use, and allows teachers to offer paper-pencil alternatives when necessary. Teachers have access to customizable letters in five languages to support the notification of families when introducing AI tools, recognizing that AI features are built into many platforms in different ways.

There is research that shows that the more people rely on A.I., the worse their critical thinking becomes. Did APS consult this research before making this decision? Why did APS adopt this?

Yes. APS reviewed current research and consulted with both higher education experts and consultants recommended by the VA Department of Education before adopting AI tools. Studies show that while over-reliance on AI can reduce motivation for self-reflection and critical evaluation, strategic use – guided by teachers – can enhance critical thinking by exposing students to diverse perspectives, supporting argument construction, and encouraging information verification.

APS’s approach emphasizes teacher-led use, transparency, and student responsibility. AI is treated as a support tool, not a substitute for thinking, and is integrated in ways that promote ethical, developmentally appropriate learning.

How does APS’s AI use guidance align with academic integrity? Isn’t it cheating if AI helps write or design a student’s work?

APS treats AI like any other learning tool – its use must be transparent, purposeful, and guided by the teacher. Just like peer feedback or tutoring, AI support is acceptable when students are actively involved, understand the process, and cite their use. Teachers set clear expectations for what’s allowed, and students are still responsible for their own thinking and learning.

About the "acceptable use of A.I." document: Level 1 provides these examples: "Student asks AI to suggest synonyms for a poem. Teacher uses AI to rephrase a rubric description." I'm trying to think of a way to ask this politely, but, that's a thesaurus. Why not just teach them to use a thesaurus? Does APS not possess enough thesauri? Because if APS needs to purchase more thesauri, just tell us! We'll do a fundraiser and buy some, and I bet it'll be a fraction of the cost of this platform!

It is absolutely right that asking AI for synonyms is similar to using a thesaurus—and in fact, that’s part of the point. The Level 1 examples are meant to show how AI can support learning in simple, familiar ways. In this case, AI acts like a digital thesaurus, but with added flexibility: it can suggest words in context, explain tone differences, or even help students refine their writing voice. It is the earliest and lowest level of adoption of a new technology. APS values traditional tools like thesauri, and teachers still teach and encourage their use. The AI example is just one way to show how technology can complement existing strategies. The template is designed to help teachers communicate how they might use AI in their own classrooms, and they can swap in examples that better fit their style or subject area.

APS is committed to providing a range of tools—digital and analog—that support student learning. The AI platform is part of a broader effort to prepare students for the future while keeping instruction grounded in strong pedagogy.

It seems like already, the extensive use of game-based apps to "learn" content and "study" for tests is diminishing real understanding of complicated concepts. The use of AI is documented to further reduce critical thinking skills. How will APS ward against this?

APS uses AI and digital tools to support, not replace, student thinking. Teachers guide how and when these tools are used, with a focus on instructional goals like critical thinking, analysis, and reflection. APS also prioritizes digital literacy, helping students learn to use AI responsibly and evaluate its output. Game-based and AI-supported learning are used intentionally, not excessively, and always with teacher oversight.

What considerations does APS make when partnering with tech firms to introduce their technology into schools?

APS follows a rigorous review process before adopting any new technology. Vendors must complete a Technical Quality Assessment (TQA) that evaluates compatibility with APS infrastructure, cybersecurity risks, and long-term support needs. APS also requires documentation on student data privacy, insurance, and compliance with district procurement policies. Partnerships are guided by instructional goals, equity, accessibility, sustainability, and alignment with APS values – not vendor features alone.

What are the main goals of introducing AI as a tool to support learning in the Secondary Schools?

At Arlington Public Schools, our goal in introducing generative AI tools at the secondary level is to create a more equitable and future-ready learning environment. We recognize that students are already exploring this type of technology independently – our role is to guide them in using AI responsibly and effectively. By integrating generative AI into instruction:

  • We aim to level the playing field for all learners, including those in advanced programs like AP and IB.
  • We provide students with scaffolded, teacher-monitored spaces where they can safely explore, question, and create using AI.
  • We prepare students for college and career readiness by helping them become both critical consumers and thoughtful producers of information.

This work is grounded in our commitment to student safety, digital citizenship, and equitable access to innovation.


This FAQ page will continue to be updated regularly.