Administrative Procedure 0123 - Artificial Intelligence Use and Governance

I. PURPOSE:

To establish procedures for the ethical, responsible, and effective approval and use of Artificial Intelligence (AI) systems by Prince George's County Public Schools (PGCPS) users, while ensuring alignment with the district's strategic goals and legal obligations.

II. POLICY:

The Prince George’s County Board of Education (Board) recognizes that foundational to PGCPS’ outcome goal of educational excellence is the strategic imperative to re-imagine teaching and learning in non-traditional ways to meet the needs of, and inspire, PGCPS learners. The Board acknowledges that by harnessing the power of organizational learning for improved creativity, enriched collaboration, system knowledge sharing, and operational efficiency, workforce and operational excellence can be achieved. The Board recognizes that artificial intelligence, when used appropriately, can enhance student learning by improving the efficiency of education, providing new and creative ways to support learning, and encouraging independent research, curiosity, critical thinking, and problem-solving. (Board Policy 0123)

III. BACKGROUND:    

AI technologies are rapidly evolving and offer significant opportunities alongside potential challenges for K-12 education. This administrative procedure provides necessary operational guidance to PGCPS users, namely students and staff, to navigate the use of AI tools responsibly and effectively, maximizing benefits while mitigating risks related to privacy, bias, academic integrity, and safety.

IV. DEFINITIONS:

  1. AI literacy – The ability to understand, use, and critically evaluate artificial intelligence (AI) responsibly, ethically, and safely. It includes recognizing AI's capabilities and limitations, employing it to enhance learning and productivity, and acknowledging the importance of addressing bias, privacy, and security concerns.
  2. AI Evaluation Process – The formal process established by PGCPS for evaluating and approving specific AI systems for use within the school district. The AI Evaluation Process is managed by a designated cross-functional working group or agreed-upon lead office.
  3. Algorithmic Bias or bias – Systematic and repeatable errors in an AI system that create unfair outcomes, such as privileging one arbitrary group of users over others based on demographics, learning differences, or other characteristics.
  4. Artificial intelligence (AI) – A system of machine learning that is capable of performing complex and original tasks such as problem-solving, learning, reasoning, understanding natural language, and recognizing patterns in data. AI systems use algorithms, data, and computational power to simulate cognitive functions and make autonomous decisions, enabling them to perform a wide range of tasks and improve their performance over time through learning and adaptation. This includes, but is not limited to, generative AI, adaptive learning platforms, predictive analytics tools, AI-powered assessment tools, AI tutors, and operational AI systems.
  5. Deep fake – Video, image, or audio that has been digitally altered so that a person appears to be someone else or appears to say or do something they did not, often used maliciously to spread false information.
  6. EdTech vendor or vendor – A vendor that offers EdTech (a combination of "education" and "technology") hardware and software designed to enhance teacher-led learning in classrooms and improve students' education outcomes. Some of these products may incorporate AI components.
  7. Generative AI – AI systems capable of generating novel content, such as text, images, audio, code, or data, in response to prompts (e.g., ChatGPT, Gemini, Copilot, DALL-E, Midjourney).
  8. Hallucinations – Outputs produced by an AI tool that are inaccurate, inappropriate, and/or misleading yet appear to be factually correct.
  9. Personally Identifiable Information (PII) – Any data that can be used to identify a specific individual, such as a name, address, student or employee number, Social Security number, or other unique identifier. PII may also include information about a person's family, such as a name, phone number, or address, that alone or in combination is linked or linkable to a specific individual and allows that person to be identified with reasonable certainty.
  10. PGCPS Approved AI System – Any Artificial Intelligence (AI) system, digital tool, or EdTech application containing an AI component that has been officially authorized for use within Prince George’s County Public Schools (PGCPS) by students and/or staff.
  11. School official – A contractor, consultant, or other party to whom PGCPS has outsourced institutional services or functions may be considered a school official, provided that the party performs an institutional service or function for which PGCPS would otherwise use employees and is under the direct control of PGCPS with respect to the use and maintenance of education records and/or PII.

V. PROCEDURES:

  1. AI Governance and Oversight
    1. The Superintendent shall maintain the cross-functional joint working group as described in Board Policy 0123 that includes the Divisions of Information Technology and Academics as well as internal and external stakeholders with expertise and insight into artificial intelligence and its impact on education. The Division of Information Technology will lead the working group.
    2. This working group is responsible for:
      1. Developing, maintaining, and implementing the AI Evaluation Process;
      2. Creating and updating specific criteria for the evaluation process, addressing all characteristics outlined in Board Policy 0123;
      3. Reviewing feedback and concerns regarding AI use; and
      4. Recommending updates to this administrative procedure based on technological changes and practical experience.
  2. AI Evaluation and Approval Process
    1. The Division of Information Technology will lead and implement the AI Evaluation Process.
    2. Requirement – All AI systems designated for any of the following purposes must pass the formal AI Evaluation Process prior to being procured or implemented:
      1. Staff use for instruction or productivity that involves access to, or processing of, PII, other sensitive data, or PGCPS proprietary data (e.g., data that is not publicly available);
      2. Direct use by students; or
      3. Significant operational decisions, such as those that require resource allocation.
    3. Proposal Submission – A requesting school principal, office director, or program manager must submit a Digital Tools Review form (see Attachment A). The form requires details on the AI system, the intended use of the digital tool, alignment with PGCPS goal(s), target user group(s), data requirements, and available vendor documentation, including privacy policies and terms of service.
    4. Evaluation Steps – Utilizing information gathered from vendor responses to the Artificial Intelligence Declaration Addendum (Attachment B), the working group will coordinate reviews by relevant departments (Information Technology for security/privacy, Curriculum for instructional tools, Special Education for accessibility, Office of General Counsel for legal compliance, etc.).
      The review must address:
      1. Data Privacy and Security;
      2. Algorithmic Bias and Equity Assessment;
      3. Transparency and Explainability;
      4. Instructional Alignment / Operational Value;
      5.  Accuracy and Reliability;
      6. Vendor Disclosure and Accountability;
      7. Sources of Training Data; and
      8. Accessibility Compliance.
    5. Decision – The working group will recommend approval, approval with specific conditions (e.g., limited pilot, restricted use cases), or denial to the requesting party.
    6. Approved List – Approved AI systems will be added to an official list maintained by the Division of Information Technology and accessible to staff. The list encompasses PGCPS-approved digital resources and EdTech vendors, some of which may have generative AI components, and will be available on the PGCPS website.
  3. Responsible Use Guidelines for Staff
    1. Approved Tools Only: Staff shall use only PGCPS-approved AI systems.
    2. Critical Evaluation: Staff remain professionally responsible for content and for critically evaluating all AI-generated content (lesson plans, emails, assessments, analysis, code, etc.) for accuracy, relevance, potential bias, and appropriateness before use. AI systems are tools and can produce inaccurate or inappropriate outputs ("hallucinations").
    3. Data Protection: Staff shall use approved tools and appropriate safeguards for sensitive data and shall never input student PII or confidential PGCPS information (e.g., employee records, non-public financial data, security details) into AI tools, including free versions of popular chatbots or image generators.
    4. Instructional Use: AI use shall support learning goals and serve as a tool to enhance student engagement. It should not be used to replace direct, differentiated instruction and professional judgment by the teacher.
    5. Professional Development: PGCPS staff will be offered professional development modules covering AI literacy, ethical considerations, bias awareness, data privacy, academic integrity implications, and PGCPS policies and procedures related to AI. Information about training can be found on the Artificial Intelligence PGCPS website.
    6. Transparency: Staff should be transparent with students (age-appropriately) and colleagues about when and how they are using AI tools to support their work or instruction.
  4. Responsible Use Guidelines for Students
    1.  Approved Tools and Teacher Guidance: Students may only use PGCPS-approved AI systems for educational purposes as directed and monitored by instructional staff. Teachers are responsible for providing clear, age-appropriate instructions on when, how, and why specific AI tools are permitted for assignments or learning activities. Selected tools should be aligned to the AI Academic Integrity Guidelines. Information about the Guidelines can be found on the Artificial Intelligence PGCPS website.
    2.  Academic Integrity
      1. Submitting work generated partially or wholly by AI as one's own original work, without specific teacher authorization and proper citation according to provided guidelines, constitutes plagiarism and is a violation of the Student Rights and Responsibilities Handbook (AP 10101).
      2. Teachers shall provide explicit guidelines for each assignment where AI use is permitted, detailing the extent of permissible use (e.g., brainstorming only, drafting assistance, research tool) and the required citation format (see Attachment C for standard guidelines).
      3. Teachers are responsible for designing assessments that measure student learning effectively in the age of AI and for utilizing PGCPS-supported strategies and tools to uphold academic integrity standards. This includes discussing ethical AI use and potential consequences with students.
    3. Acceptable Use
      Student use of AI must adhere to PGCPS Acceptable Usage Guidelines (AP 0700) and standards for digital citizenship, including respectful communication and avoiding the creation or dissemination of harmful or inappropriate content, including deep fakes.
  5. Parental Consent and Notification
    1. Consent
      1. To protect student privacy, PGCPS complies with two key federal laws:
        1. Under the Family Educational Rights and Privacy Act (FERPA), EdTech vendors of approved AI tools for PK-12 students are designated as school officials through an official agreement with PGCPS. Based on this designation, a vendor’s use of student data is limited to specific educational purposes. In such cases, for students under 13, the district can consent on behalf of parents, as outlined in the federal Children’s Online Privacy Protection Act (COPPA).
    2. Notification
      The Division of Information Technology will provide annual notification to parents regarding the district-approved educational technology tools used for instruction, including approved AI systems that students may interact with, consistent with data privacy regulations. Specific high-impact AI uses may warrant direct notification.

VI. MONITORING AND COMPLIANCE:

  1. The Divisions of Information Technology and Academics share responsibility for monitoring the implementation of, and compliance with, this administrative procedure.
  2. Ongoing training on AI topics (e.g., ethics, privacy, bias, academic integrity, PGCPS-approved tools) will be developed and made available to all staff by the Division of Information Technology.
  3. Compliance monitoring activities will include:
    1.  Verification of staff completion records for AI professional development; and
    2. Systematic collection and analysis of feedback received through the established mechanisms regarding AI tools and processes.
  4. Failure to comply with this administrative procedure may result in disciplinary action for staff or students (consistent with the Student Rights and Responsibilities Handbook), revocation of access to AI tools, and potential legal liability for PGCPS.

VII. RELATED ADMINISTRATIVE PROCEDURES:

VIII. LEGAL REFERENCE:

  • Family Educational Rights and Privacy Act (FERPA), 20 U.S.C. § 1232g; 34 CFR Part 99
  • Children's Online Privacy Protection Act (COPPA), 15 U.S.C. §§ 6501-6506; 16 CFR Part 312
  • Protection of Pupil Rights Amendment (PPRA), 20 U.S.C. § 1232h; 34 CFR Part 98
  • Americans with Disabilities Act (ADA), 42 U.S.C. § 12101, et seq.
  • Section 504 of the Rehabilitation Act of 1973, 29 U.S.C. § 794
  • Maryland Student Data Privacy Act of 2015, Md. Code Ann., Educ. § 4-131

IX. MAINTENANCE AND UPDATE OF THIS ADMINISTRATIVE PROCEDURE:

  1. The Division of Information Technology, in collaboration with the Office of the Chief Academic Officer, is responsible for maintaining and updating this administrative procedure.
  2. This administrative procedure will be reviewed at least annually, concurrent with the review of Board Policy 0123, and updated as needed based on technological advancements, evolving ethical considerations, legal requirements, stakeholder feedback, and monitoring results.

X. CANCELLATIONS AND SUPERSEDURES:

None; this is a new administrative procedure.

XI. EFFECTIVE DATE: 

December 2, 2025

Attachments

  1. Digital Tools Review form
  2. Artificial Intelligence Declaration Addendum (Evaluation Criteria)
  3. AI Academic Integrity Guidelines