Control 2.14: Declarative and SharePoint Agents Governance — Verification & Testing

Test cases and evidence collection for validating agent access, sharing, Registry review, and SharePoint-backed knowledge governance.

Test Cases

Test 1: Agent Access and Creation Restriction

  • Objective: Confirm agent access and agent creation are restricted to approved users
  • Steps:
  • As a standard Copilot user outside the approved scope, attempt to access and create agents.
  • Verify the experience is blocked or not available.
  • As an approved user, verify the intended access path is available.
  • Expected Result: Only approved users can access or create governed agents.
  • Evidence: Screenshots from both user types.
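The UI walkthrough above can be supplemented with an automated membership check. The sketch below, under assumed inputs, compares the members of the security group that enables agent creation (e.g. exported from Entra ID) against the approved-user list from the governance register; the group export format and the sample addresses are illustrative, not from the source.

```python
# Hypothetical pre-check: flag members of the agent-enabling security group
# who are not on the approved list. Inputs are assumed exports, not real APIs.

APPROVED_USERS = {"alice@contoso.com", "bob@contoso.com"}  # assumed register


def find_unapproved(group_members: set, approved: set) -> set:
    """Return group members who are not on the approved-user list."""
    return group_members - approved


# Example: a group export containing one unapproved member.
members = {"alice@contoso.com", "eve@contoso.com"}
print(sorted(find_unapproved(members, APPROVED_USERS)))
```

A non-empty result indicates the access scope has drifted from the approved list and the manual test should be repeated for the flagged accounts.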

Test 2: Agent Data Source Security

  • Objective: Verify all active agent data sources meet security requirements
  • Steps:
  • Run Script 2 to check source site security posture.
  • Verify all sources have sensitivity labels applied.
  • Verify sharing is appropriately restricted on all source sites.
  • Confirm no oversharing exists on agent source sites.
  • Expected Result: All agent data sources meet minimum security requirements.
  • Evidence: Source site security report.
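The per-site checks above can be sketched as a posture audit over an exported site list. Field names (`SensitivityLabel`, `SharingCapability`, the "everyone except external users" flag) mirror common SharePoint admin export columns, and the label taxonomy and allowed sharing values are assumptions for illustration; substitute your tenant's actual policy values.

```python
# Illustrative posture audit over exported agent source-site records.
# Thresholds and field names are assumptions, not Script 2's actual logic.

REQUIRED_LABELS = {"Confidential", "Highly Confidential"}  # assumed taxonomy
ALLOWED_SHARING = {"Disabled", "ExistingExternalUserSharingOnly"}  # assumed policy


def audit_site(site: dict) -> list:
    """Return a list of findings for one source site (empty means compliant)."""
    findings = []
    if site.get("SensitivityLabel") not in REQUIRED_LABELS:
        findings.append("missing or non-compliant sensitivity label")
    if site.get("SharingCapability") not in ALLOWED_SHARING:
        findings.append("sharing not restricted")
    if site.get("EveryoneExceptExternalUsersGranted"):
        findings.append("broad 'Everyone except external users' grant (oversharing)")
    return findings


sites = [
    {"Url": "https://contoso.sharepoint.com/sites/finance",
     "SensitivityLabel": "Confidential", "SharingCapability": "Disabled",
     "EveryoneExceptExternalUsersGranted": False},
    {"Url": "https://contoso.sharepoint.com/sites/open",
     "SensitivityLabel": None, "SharingCapability": "ExternalUserSharingOnly",
     "EveryoneExceptExternalUsersGranted": True},
]
for s in sites:
    for finding in audit_site(s):
        print(f"{s['Url']}: {finding}")
```

The printed findings correspond to the rows of the source site security report collected as evidence.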

Test 3: Agent Scope Limitation

  • Objective: Confirm agents only access content within their defined scope
  • Steps:
  • Select an active declarative agent with a specific site scope.
  • Ask the agent a question whose answer would require content outside that scope.
  • Verify the agent responds only with content from its defined data source.
  • Verify no content leaks in from other sites.
  • Expected Result: Agent responses are limited to the defined content scope.
  • Evidence: Agent interaction transcript showing scope enforcement.
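The leakage check in the steps above can be partially automated: given the citation URLs from an agent's responses, flag any that fall outside the agent's declared site scope. The scope URL and sample citations below are hypothetical.

```python
# Minimal sketch: flag cited URLs outside the agent's declared scope.
from urllib.parse import urlparse

AGENT_SCOPE = "https://contoso.sharepoint.com/sites/hr"  # assumed agent scope


def out_of_scope(cited_urls: list, scope: str = AGENT_SCOPE) -> list:
    """Return cited URLs whose host or path falls outside the scope site."""
    scope_parts = urlparse(scope)
    leaks = []
    for url in cited_urls:
        parts = urlparse(url)
        if parts.netloc != scope_parts.netloc or not parts.path.startswith(scope_parts.path):
            leaks.append(url)
    return leaks


citations = [
    "https://contoso.sharepoint.com/sites/hr/policies.docx",
    "https://contoso.sharepoint.com/sites/finance/budget.xlsx",
]
print(out_of_scope(citations))
```

Any URL in the output is evidence of scope leakage and should be attached to the interaction transcript. Note this checks citations only; a human reviewer still needs to confirm the response text itself contains no out-of-scope content.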

Test 4: Registry and Governance Documentation

  • Objective: Verify all active agents have Registry visibility, ownership, and governance approval documentation
  • Steps:
  • Compile the inventory of active agents from the Registry or equivalent export.
  • Cross-reference each agent against governance approval records.
  • Verify each agent has a documented purpose, owner, data source review, and approval.
  • Flag any agents lacking governance documentation or an assigned owner.
  • Expected Result: All active agents have complete governance documentation and ownership.
  • Evidence: Agent inventory with governance approval cross-reference.
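The cross-reference step above can be sketched as a join between the agent inventory and the approval records, flagging any agent missing a required field. The record shapes, field names, and sample data are assumptions for illustration.

```python
# Sketch: join agent inventory against governance approvals and flag gaps.
# Field names below are assumed, not a documented Registry export schema.

REQUIRED_FIELDS = ("purpose", "owner", "data_source_review", "approval_date")


def flag_gaps(inventory: list, approvals: dict) -> dict:
    """Map each agent ID to its missing governance fields (complete agents omitted)."""
    gaps = {}
    for agent in inventory:
        record = approvals.get(agent["id"], {})
        missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
        if missing:
            gaps[agent["id"]] = missing
    return gaps


inventory = [{"id": "agent-001"}, {"id": "agent-002"}]
approvals = {
    "agent-001": {"purpose": "HR FAQ", "owner": "alice@contoso.com",
                  "data_source_review": "2024-11-02", "approval_date": "2024-11-05"},
    # agent-002 has no approval record at all
}
print(flag_gaps(inventory, approvals))
```

An empty result means every active agent has complete governance documentation; a non-empty result lists exactly which fields to chase for each flagged agent.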

Evidence Collection

| Evidence Item | Format | Storage Location | Retention |
| --- | --- | --- | --- |
| Agent inventory | CSV | Compliance evidence repository | 7 years |
| Source site security report | CSV | Compliance evidence repository | 7 years |
| Agent scope test results | PDF | Compliance evidence repository | 7 years |
| Governance approval records | PDF | Governance document repository | 7 years |

Compliance Mapping

| Regulation | Requirement | How This Control Supports It |
| --- | --- | --- |
| OCC Model Risk Management | Model governance | Agent governance supports compliance with AI model governance requirements |
| FINRA Rule 3110 | Technology supervision | Agent oversight supports compliance with supervisory technology requirements |
| NIST AI RMF | GOVERN 1.1 (AI system governance) | Formal agent governance supports compliance with AI governance requirements |