
Control 3.10: SEC Reg S-P -- Privacy of Consumer Financial Information

Control ID: 3.10
Pillar: Compliance & Audit
Regulatory Reference: SEC Regulation S-P (17 CFR 248.30), GLBA Title V (Privacy), SEC Regulation S-ID (Identity Theft Red Flags)
Last Verified: 2026-02-17
Governance Levels: Baseline / Recommended / Regulated


Objective

Establish controls governing Microsoft 365 Copilot's access to consumer financial information, helping meet SEC Regulation S-P privacy requirements for safeguarding nonpublic personal information (NPI), maintaining privacy notice accuracy, supporting opt-out provisions, and implementing the safeguard requirements specified under the Safeguards Rule (Rule 248.30) -- including the 2024 amendments, whose staggered compliance schedule began in December 2025.

Why This Matters for FSI

SEC Regulation S-P implements the privacy provisions of the Gramm-Leach-Bliley Act (GLBA) Title V for SEC-registered broker-dealers, investment advisers, and investment companies. The regulation requires financial institutions to:

  1. Provide privacy notices to customers describing information-sharing practices
  2. Offer opt-out rights before sharing NPI with nonaffiliated third parties
  3. Implement safeguards to protect the security and confidentiality of customer records and information

Reg S-P amendments (staggered compliance dates): In May 2024, the SEC adopted amendments to Regulation S-P (proposed in March 2023) that significantly strengthened the Safeguards Rule. The amendments introduced a mandatory incident response program requirement and a requirement that covered institutions ensure their service providers notify them within 72 hours of the service provider becoming aware of a breach affecting customer information. Compliance dates were staggered by entity size:

  • Larger entities (investment companies with $1 billion or more in net assets, registered investment advisers with $1.5 billion or more in assets under management, and broker-dealers and transfer agents that are not small entities under the Exchange Act): compliance required by December 3, 2025
  • Smaller entities (all other covered institutions): compliance required by June 3, 2026

As of the date of this control, the larger-entity compliance deadline has passed and the smaller-entity deadline is approaching. Larger institutions that have not yet implemented the amended Safeguards Rule requirements are out of compliance.

M365 Copilot presents specific Reg S-P challenges because it accesses and processes NPI stored across Microsoft 365 -- customer account data in emails, financial information in SharePoint documents, client records in Teams conversations, and personal details in OneDrive files. Copilot's ability to surface, summarize, and generate content from NPI creates new vectors for potential unauthorized disclosure or inadequate safeguarding that must be addressed within the Reg S-P compliance framework.

Control Description

This control addresses the intersection of Copilot functionality with Reg S-P requirements across all three pillars: privacy notices, opt-out provisions, and safeguards. The amended Reg S-P requirements (incident response program and 72-hour notification) are integrated throughout.

Reg S-P Amendment Details: 72-Hour Service Provider Notification and Incident Response Programs

The 2024 Reg S-P amendments introduced two specific requirements directly relevant to M365 Copilot deployments:

SEC Rule 248.30(a)(3) — 72-hour service provider notification: Covered institutions must oversee their service providers and ensure, through contract or policy, that a service provider notifies them as soon as possible, but no later than 72 hours, after becoming aware of a breach resulting in unauthorized access to customer information the service provider maintains. For M365 Copilot, this means Microsoft's agreements (including the Data Processing Agreement) must provide for such notice, and the firm must be prepared to receive it and act on it immediately. Incidents originating inside the firm's own tenant -- permission misconfiguration, DLP policy gaps, or oversharing that causes Copilot to improperly surface customer NPI -- fall under the firm's own incident response program rather than the service provider clock.

SEC Rule 248.30(a)(4) — Mandatory incident response program: Covered institutions must have written policies and procedures for an incident response program that addresses unauthorized access to or use of customer information. The program must include procedures for: assessing the nature and scope of incidents; containing and remediating incidents; notifying affected individuals; and, under the amended rule, overseeing service providers, including the 72-hour service provider breach notice. Copilot-related incidents -- including Copilot surfacing NPI to unauthorized users due to oversharing, or Copilot-generated content containing customer NPI sent to the wrong recipients -- constitute incidents under this requirement.
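The two clocks introduced by the amendments can be tracked mechanically. A minimal sketch, assuming detection timestamps are available (function and key names are illustrative, not part of any SEC or Microsoft tooling):

```python
from datetime import datetime, timedelta, timezone

# Notification windows under the amended Safeguards Rule (illustrative helper).
SERVICE_PROVIDER_WINDOW = timedelta(hours=72)   # 72-hour service provider notice
CUSTOMER_NOTICE_WINDOW = timedelta(days=30)     # 30-day customer notification

def notification_deadlines(detected_at: datetime) -> dict:
    """Return the latest permissible time for each Reg S-P notification,
    measured from the moment unauthorized access is detected."""
    return {
        "service_provider_72h": detected_at + SERVICE_PROVIDER_WINDOW,
        "customer_30d": detected_at + CUSTOMER_NOTICE_WINDOW,
    }

detected = datetime(2026, 2, 17, 9, 0, tzinfo=timezone.utc)
deadlines = notification_deadlines(detected)
```

In practice these deadlines would be stamped onto the incident record at intake so every downstream step sees the same clock.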

NPI Categories Accessible to Copilot

| NPI Category | M365 Location | Copilot Access Path | Risk Level |
|---|---|---|---|
| Account information | Exchange (emails), SharePoint (documents) | Copilot can surface account numbers, balances, transaction history from grounding data | High |
| Financial status | SharePoint, OneDrive (financial documents) | Copilot can summarize income, assets, net worth from client files | High |
| Transaction records | Exchange (confirmations), SharePoint (reports) | Copilot can reference trade details, transaction amounts, counterparties | High |
| Social Security numbers | Exchange, SharePoint (applications, forms) | Copilot may surface SSNs from stored documents; DLP should block | Critical |
| Contact information | Exchange (contacts), Teams (chat) | Copilot can access addresses, phone numbers, email addresses | Moderate |
| Employment information | SharePoint (KYC documents), Exchange | Copilot can reference employer, income, occupation from client files | Moderate |
| Investment preferences | Exchange, SharePoint, Teams | Copilot can surface risk tolerance, investment objectives, time horizons | Moderate |
| Health information | Exchange, SharePoint (insurance applications) | Copilot may access health data in insurance-related workflows | High |
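Purview sensitive information types (SITs) are the authoritative detection mechanism for these categories; for local test fixtures, the two highest-risk patterns can be approximated in a few lines. A sketch, not a replacement for DLP:

```python
import re

# Supplementary NPI pattern checks (illustrative only).
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
ROUTING_RE = re.compile(r"\b\d{9}\b")

def valid_aba_routing(number: str) -> bool:
    """ABA routing number checksum: 3-7-1 weighted digit sum mod 10 == 0."""
    digits = [int(c) for c in number]
    return sum(w * d for w, d in zip([3, 7, 1] * 3, digits)) % 10 == 0

def find_npi(text: str) -> dict:
    """Return candidate SSNs and checksum-valid routing numbers in text."""
    return {
        "ssn": SSN_RE.findall(text),
        "routing": [n for n in ROUTING_RE.findall(text) if valid_aba_routing(n)],
    }

sample = "Client SSN 123-45-6789, wire via routing 021000021."
hits = find_npi(sample)
```

The checksum step matters: any nine-digit string matches the regex, but only checksum-valid candidates are worth flagging as routing numbers.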

Reg S-P Safeguards Rule Requirements and Copilot Controls

| Safeguards Rule Requirement | Copilot Control |
|---|---|
| Designate a qualified individual to oversee the safeguards program | Include Copilot in the designated individual's scope of responsibility |
| Conduct risk assessments of customer information | Assess Copilot as a system that accesses and processes customer information |
| Design and implement safeguards to control identified risks | Implement DLP, sensitivity labels, information barriers, and access controls for Copilot |
| Regularly test safeguard effectiveness | Include Copilot-specific test scenarios in safeguard testing program |
| Train personnel on safeguard policies | Include Copilot NPI access risks in security awareness training |
| Oversee service provider arrangements | Review Microsoft's data processing practices for Copilot NPI access |
| Establish an incident response program (Rule 248.30(a)(4)) | Incident response program must cover Copilot-related NPI incidents with written policies and procedures |
| Ensure service providers give 72-hour breach notice (Rule 248.30(a)(3)) | Confirm Microsoft's agreements require notice to the firm within 72 hours of Microsoft becoming aware of a breach affecting customer information |
| Notify affected individuals within 30 days | Copilot-related NPI incidents trigger customer notification obligations within 30 days |

Privacy Notice Considerations

When Copilot accesses NPI, the firm's privacy notice must accurately describe:

  • Categories of NPI collected and used: Privacy notices should reflect that AI tools access NPI for operational purposes
  • Parties with whom NPI is shared: If Copilot processing involves Microsoft as a service provider, this should be reflected in service provider disclosures
  • Security practices: The notice should reference AI-related safeguards as part of the firm's security program
  • Consumer rights: Opt-out rights and procedures should be clearly stated

Copilot Surface Coverage

| Copilot Surface | NPI Exposure Risk | Safeguard Mechanism |
|---|---|---|
| Outlook Copilot | High -- emails contain client NPI (account info, SSNs, financial data) | DLP policies on Exchange; sensitivity labels on NPI-containing emails |
| Microsoft 365 Copilot Chat | High -- Copilot Chat searches across all workloads and can surface NPI from any location | DLP for Copilot location; permission-based access controls; sensitivity labels |
| SharePoint/OneDrive Copilot | High -- financial documents, client files, KYC records | DLP policies; sensitivity labels; site-level permissions; information barriers |
| Teams Copilot | Moderate -- meeting transcripts may contain NPI discussed verbally | Teams DLP; meeting transcript controls; channel permissions |
| Word Copilot | Moderate -- Copilot may reference NPI when drafting client documents | DLP policies; sensitivity label inheritance; output review |
| Excel Copilot | High -- financial data, account information, portfolio data | DLP policies; sensitivity labels; file-level permissions |
| Copilot Pages | Moderate -- pages may contain NPI copied from other sources | OneDrive DLP; sharing controls; sensitivity labels |

Governance Levels

Baseline

  • Note the amended Reg S-P timeline: proposed March 2023, final rule adopted May 2024, compliance deadlines December 3, 2025 for larger firms and June 3, 2026 for smaller firms; key changes include expanded scope (investment advisers AND broker-dealers), mandatory written policies, an expanded "customer information" definition, mandatory customer breach notification (≤30 days), and a required 72-hour breach notice from service providers to the institution
  • Assess Copilot as a system that accesses NPI in the firm's Reg S-P risk assessment
  • Verify DLP policies cover NPI data types (SSN, account numbers, financial data) across Copilot-accessible locations
  • Review privacy notices for accuracy regarding AI tool usage and NPI access
  • Include Copilot NPI access in the firm's information security program documentation
  • Establish a written incident response program covering Copilot-related NPI incidents (SEC Rule 248.30(a)(4) requirement; must address service provider oversight, including the 72-hour breach notice from service providers per Rule 248.30(a)(3)); confirm Microsoft's Data Processing Agreement (DPA) provides for that 72-hour notice
  • Enable DSPM for AI (Purview portal > DSPM for AI) to detect SITs in Copilot prompts and responses as the primary internal NPI detection mechanism
  • Train Copilot users on NPI handling obligations and Copilot-specific NPI risks

Recommended

  • Implement sensitivity labels for NPI-containing content with DLP enforcement at the Copilot interaction layer
  • Configure information barriers to prevent Copilot from surfacing NPI across organizational boundaries (e.g., advisory vs. brokerage)
  • Deploy Copilot-specific DLP policies using -CopilotLocation All with FSI-relevant SITs: U.S. Social Security Number (SSN), ABA Routing Number, U.S. Bank Account Number, Credit Card Number, and U.S. Individual Taxpayer Identification Number (ITIN); configure incident reports to route to the privacy/compliance team
  • Configure DSPM for AI to identify oversharing of labeled content (e.g., Confidential — Customer Data) through Copilot interactions, in addition to SIT detection in prompts and responses
  • Conduct annual Reg S-P risk assessment with explicit Copilot NPI analysis
  • Implement automated NPI detection and classification for content accessed by Copilot
  • Operationalize the incident response program with Microsoft Purview alerting: document the workflow for receiving and acting on a 72-hour breach notice from Microsoft as service provider; establish escalation paths and notification templates for Copilot NPI incidents
  • Test incident response procedures with Copilot-related NPI breach scenarios
  • Update privacy notices to explicitly address AI tool usage in NPI processing
  • Monitor Copilot access patterns for anomalous NPI access
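The last bullet (monitoring for anomalous NPI access) reduces to a per-user baseline comparison. A minimal sketch over exported audit counts; the input shape is an assumption, not a Purview or Sentinel schema:

```python
from statistics import mean, stdev

def anomalous_users(history: dict, today: dict, z: float = 3.0) -> list:
    """Flag users whose NPI-access count today exceeds mean + z * stdev
    of their own trailing daily history (hypothetical audit export)."""
    flagged = []
    for user, counts in history.items():
        if len(counts) < 2:
            continue  # not enough history to establish a baseline
        threshold = mean(counts) + z * stdev(counts)
        if today.get(user, 0) > threshold:
            flagged.append(user)
    return flagged

history = {"advisor1": [2, 3, 2, 4, 3], "advisor2": [1, 1, 2, 1, 1]}
flagged = anomalous_users(history, {"advisor1": 3, "advisor2": 25})
```

A production deployment would feed this from unified audit log exports and raise the result into the alerting pipeline rather than returning a list.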

Regulated

  • Implement real-time NPI access monitoring for Copilot interactions using Microsoft Sentinel
  • Configure automated alerting for Copilot access to high-sensitivity NPI (SSNs, financial account numbers)
  • Conduct quarterly NPI access reviews for Copilot-licensed users
  • Implement data minimization controls to limit NPI exposure through Copilot
  • Implement the full amended Reg S-P incident response workflow: (1) Detection via DSPM for AI / DLP alert → routed to privacy/compliance team; (2) Assessment within 72 hours — determine whether Microsoft has issued a 72-hour service provider breach notice, cross-checking with Azure Service Health; (3) Scope determination via UAL audit logs + DSPM prompt/response data to identify which customer NPI was exposed; (4) Customer notification ≤30 days after determining that customer information was, or is reasonably likely to have been, misused; note: Reg S-P imposes no SEC notification obligation unless a separately reportable event (e.g., a material cybersecurity incident under Form 8-K Item 1.05 or SAR requirements) triggers independently
  • Prepare Reg S-P examination-ready documentation including Copilot NPI safeguard controls, incident response program documentation, and evidence that service provider agreements require 72-hour breach notice per Rule 248.30(a)(3)
  • Conduct tabletop exercises for Copilot-related NPI breach scenarios, including receipt of a 72-hour service provider breach notice and the 30-day customer notification timeline
  • Commission independent assessment of Copilot NPI safeguard effectiveness
  • Implement continuous compliance monitoring for Reg S-P safeguard requirements across Copilot surfaces
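The four-step workflow above can also be verified after the fact by checking an incident record against the regulatory clocks. A sketch, with field names as assumptions and the 30-day clock simplified to run from detection:

```python
from datetime import datetime, timedelta

def timeline_findings(incident: dict) -> list:
    """Return timeline violations for a Copilot NPI incident record with
    'detected', 'assessed', and 'customer_notified' datetimes."""
    findings = []
    if incident["assessed"] - incident["detected"] > timedelta(hours=72):
        findings.append("assessment exceeded 72 hours from detection")
    if incident["customer_notified"] - incident["detected"] > timedelta(days=30):
        findings.append("customer notification exceeded 30 days")
    return findings

record = {
    "detected": datetime(2026, 1, 5, 8, 0),
    "assessed": datetime(2026, 1, 9, 8, 0),        # 96 hours after detection
    "customer_notified": datetime(2026, 1, 30, 8, 0),  # 25 days after detection
}
issues = timeline_findings(record)
```

Running a check like this over closed incidents is one way to generate the "documented outcomes" the tabletop and examination-readiness items call for.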

Setup & Configuration

Step 1: Assess Copilot NPI Access Scope

  1. Inventory NPI data types stored across Microsoft 365 workloads:
    • Exchange: Client emails, account statements, application forms
    • SharePoint: Client files, KYC documents, financial records
    • OneDrive: Individual files containing NPI
    • Teams: Client conversations, meeting transcripts
  2. Map Copilot access to each NPI location based on user permissions
  3. Identify high-risk NPI access scenarios:
    • Copilot Chat queries that could surface NPI from multiple sources
    • Copilot summaries that aggregate NPI from multiple documents
    • Copilot-drafted communications that reference NPI
  4. Document findings in the firm's Reg S-P risk assessment
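The Step 1 inventory and access mapping can be maintained as data rather than prose, so the risk-assessment rows regenerate whenever a workload or content type is added. A sketch using the workload/content pairs listed above (the structure and field names are illustrative):

```python
# NPI inventory mirroring the Step 1 list above.
NPI_INVENTORY = {
    "Exchange": ["client emails", "account statements", "application forms"],
    "SharePoint": ["client files", "KYC documents", "financial records"],
    "OneDrive": ["individual files containing NPI"],
    "Teams": ["client conversations", "meeting transcripts"],
}

def risk_rows(inventory: dict) -> list:
    """One risk-register row per workload/content pair; Copilot access
    follows the user's permissions in every M365 workload."""
    return [
        {"workload": wl, "content": item, "copilot_access": "per user permissions"}
        for wl, items in inventory.items()
        for item in items
    ]

rows = risk_rows(NPI_INVENTORY)
```

The generated rows can be pasted directly into the Reg S-P risk assessment and diffed release-to-release.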

Step 2: Configure DLP Policies for NPI Protection

  1. Navigate to Microsoft Purview portal
  2. Go to Data Loss Prevention > Policies
  3. Create or update DLP policies:
    • Name: FSI-RegSP-NPI-Protection-Copilot
    • Description: Protects NPI from unauthorized disclosure through Copilot interactions
    • Locations: Exchange, SharePoint, OneDrive, Teams, Microsoft 365 Copilot (preview)
    • Sensitive information types:
      • US Social Security Number (SSN)
      • US Bank Account Number
      • Credit Card Number
      • US Individual Taxpayer Identification Number (ITIN)
      • US/UK Passport Number
      • Custom SITs for firm-specific account numbers
    • Policy actions:
      • Block with override for NPI detected in Copilot responses (allows authorized users to proceed with justification)
      • Block for SSN in Copilot responses (no override for SSN surfacing)
    • User notifications: Enable policy tips explaining why content was blocked

Step 3: Implement Sensitivity Labels for NPI Content

  1. Create or verify sensitivity labels for NPI-containing content:
    • Label: Confidential - Client NPI
    • Protection: Encryption, prevent forwarding, watermark
    • Auto-labeling: Configure auto-labeling rules for documents containing NPI patterns
    • Copilot behavior: Content with this label remains accessible to Copilot for authorized users, while DLP policies prevent the NPI it contains from surfacing inappropriately in Copilot responses
  2. Deploy auto-labeling policies for SharePoint sites and OneDrive locations containing client records

Step 4: Update Privacy Notices

  1. Review the firm's current privacy notice under Reg S-P
  2. Assess whether AI tool usage is adequately disclosed:
    • Does the notice describe the use of AI tools in processing customer information?
    • Does the notice accurately describe information sharing with technology service providers (Microsoft)?
    • Do opt-out procedures account for AI processing of NPI?
  3. Update privacy notices if needed:
    • Add disclosure about AI-assisted processing of customer information
    • Confirm Microsoft is disclosed as a service provider processing NPI
    • Verify opt-out procedures are clear and actionable
  4. Distribute updated notices per Reg S-P requirements

Step 5: Configure Incident Response Program for Copilot NPI Events

The amended Reg S-P requires a written incident response program (SEC Rule 248.30(a)(4)) with documented service provider oversight, including the 72-hour breach notice from service providers (SEC Rule 248.30(a)(3)). For Copilot deployments:

  1. Document Copilot-related NPI incident scenarios in the written incident response program:
    • Scenario 1: Copilot surfaces client NPI to unauthorized user due to permission misconfiguration
    • Scenario 2: Copilot-drafted communication includes client NPI that should not have been disclosed
    • Scenario 3: User exports NPI from Copilot interaction and shares externally
  2. Define incident severity levels:
    • Critical: SSN, account credentials, or bulk NPI exposure through Copilot
    • High: Individual client financial data exposure through Copilot
    • Medium: Client contact information exposure through Copilot
    • Low: Near-miss or policy tip triggered without actual NPI disclosure
  3. Establish the notification timeline in writing:
    • Internal escalation: 4 hours from detection
    • Executive notification: 24 hours from detection
    • Service provider notice intake: verify that Microsoft's agreements require notice to the firm within 72 hours of Microsoft becoming aware of a breach (SEC Rule 248.30(a)(3)) -- document the channels through which such notices arrive (Microsoft 365 admin center, Azure Service Health, DPA-designated contacts) and who monitors them
    • Customer notification: Within 30 days of becoming aware (per amended Reg S-P)
    • Regulatory notification: Per applicable requirements
  4. Configure automated alerting in Microsoft Purview for Copilot NPI events to support the 72-hour detection-to-notification window
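The written timeline in step 3 can be expressed as an escalation schedule computed from the detection timestamp, so each deadline is unambiguous during a live incident. A sketch whose labels mirror the program above; none of this is a Microsoft feature:

```python
from datetime import datetime, timedelta

# Severity labels and escalation offsets from the written program above.
SEVERITY = {
    "ssn_or_bulk_exposure": "Critical",
    "individual_financial_data": "High",
    "contact_info": "Medium",
    "near_miss": "Low",
}

ESCALATION = {
    "internal_escalation": timedelta(hours=4),
    "executive_notification": timedelta(hours=24),
    "service_provider_72h_checkpoint": timedelta(hours=72),
    "customer_notification": timedelta(days=30),
}

def escalation_schedule(detected_at: datetime) -> dict:
    """Deadline for each escalation step, measured from detection."""
    return {step: detected_at + delta for step, delta in ESCALATION.items()}

sched = escalation_schedule(datetime(2026, 2, 1, 12, 0))
```

Stamping this schedule onto the incident ticket at creation keeps every responder working from the same deadlines.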

Financial Sector Considerations

Reg S-P 2024 Amendments: What Changed for Copilot

The key changes relevant to Copilot deployments are the mandatory incident response program and the 72-hour service provider breach notice, both in effect for larger entities (and required of all covered institutions by June 3, 2026):

  • Written incident response program (Rule 248.30(a)(4)): Required as written policies and procedures -- not an informal process. Copilot access to NPI must be addressed in these written policies, including Copilot-specific incident scenarios.
  • 72-hour service provider notification (Rule 248.30(a)(3)): Covered institutions must ensure that service providers (including Microsoft, for Copilot) notify them within 72 hours of becoming aware of unauthorized access to customer information the provider maintains. This is a far shorter window than the 30-day customer notification and requires near-immediate escalation once a notice is received.
  • Customer notification within 30 days: Applies to Copilot-related NPI incidents where customer information was accessed without authorization.
  • Oversight of service providers: Strengthened under the amendments -- Microsoft's Copilot data processing practices must be assessed, and service provider agreements should address incident notification procedures.

NPI in Copilot Context Windows

Copilot's context window includes content from the user's Microsoft Graph -- emails, files, chats, and calendar. This means NPI from client communications, financial documents, and internal discussions is all accessible within a single Copilot interaction. The aggregation of NPI from multiple sources in a single Copilot response creates a higher-risk NPI exposure scenario than any individual workload alone.

Opt-Out Considerations

Reg S-P requires that customers be given the opportunity to opt out of NPI sharing with nonaffiliated third parties. While Microsoft is typically a service provider (not a nonaffiliated third party) under GLBA, firms should assess whether:

  • Copilot's processing of NPI constitutes "sharing" under the opt-out provisions
  • Service provider exception requirements are met (contractual provisions, appropriate oversight)
  • Any Copilot extensibility features (plugins, connectors) involve NPI sharing with additional parties

Intersection with State Privacy Laws

In addition to Reg S-P, state privacy laws may impose additional requirements on NPI processed by Copilot:

  • California CCPA/CPRA: Right to know, delete, and opt out of sale/sharing of personal information
  • New York DFS Cybersecurity Regulation (23 NYCRR 500): Cybersecurity program requirements including AI system governance
  • Massachusetts 201 CMR 17.00: Data security requirements for personal information
  • Firms operating in multiple states must maintain a jurisdiction-specific compliance matrix for NPI and AI

Verification Criteria

| # | Verification Step | Expected Outcome | Governance Level |
|---|---|---|---|
| 1 | Verify Copilot is included in Reg S-P risk assessment | Risk assessment documents Copilot NPI access scope and controls | Baseline |
| 2 | Test DLP policy for NPI in Copilot responses | Copilot-surfaced SSN is blocked; policy tip is displayed to user | Baseline |
| 3 | Verify privacy notice addresses AI tool usage | Privacy notice discloses use of AI tools in processing customer information | Baseline |
| 4 | Verify written incident response program exists and covers Copilot | Program includes Copilot-specific scenarios with written notification procedures (Rule 248.30(a)(4)) | Baseline |
| 5 | Verify 72-hour service provider notice procedure is documented | Written procedure identifies how Microsoft's 72-hour breach notices are received and escalated per Rule 248.30(a)(3) | Baseline |
| 6 | Test sensitivity label auto-labeling for NPI content | Documents containing NPI patterns are automatically labeled | Recommended |
| 7 | Test incident response plan covers Copilot NPI scenarios | Plan includes Copilot-specific scenarios with all notification timelines (72-hour service provider notice, 30-day customer) | Recommended |
| 8 | Test information barrier enforcement in Copilot | Copilot does not surface NPI across barrier boundaries | Recommended |
| 9 | Run tabletop exercise for Copilot NPI breach | Exercise tests receipt of a 72-hour service provider breach notice and the 30-day customer notification timeline with documented outcomes | Regulated |
| 10 | Verify real-time NPI access monitoring | Sentinel alerts trigger for anomalous Copilot NPI access patterns | Regulated |
| 11 | Review independent safeguard effectiveness assessment | Assessment validates Copilot NPI controls and identifies gaps | Regulated |
| 12 | Verify examination-ready documentation | Complete Reg S-P compliance package for Copilot, including incident response program documentation and compliance date evidence | Regulated |

Additional Resources


FSI Copilot Governance Framework v1.2.1 - March 2026