Privacy and Security Guidance for using Microsoft 365 Copilot


Responsible Use of Microsoft 365 Copilot at SFU

Microsoft 365 Copilot is a powerful AI assistant, but it is not responsible for the content it generates. During sessions with your AI companion, you are responsible for how you generate and use the results produced, so it is important to acknowledge the following:

 

Fairness

AI companions are human-like but not human, and may not weigh or account for biases when responding to your questions.

  • Ensure the content you curate with your AI companion does not amplify biases or violate human rights, accessibility, or fairness obligations you have at the university.

 

Accountability

AI companions enhance and augment productivity. They are not responsible for the results that are produced.

  • You remain accountable for content used and generated by your AI companion, including the impacts of its use elsewhere.
  • You remain accountable for how decisions are reached, executed, and communicated. AI companions are not responsible for any decision making.

 

Authenticity

Responses from generative AI may be convincing but can be inaccurate or misleading (often referred to as "hallucinations").

  • AI-generated content must not be treated as a source of authority on any subject.
  • You are responsible for verifying responses provided by your AI companion. If you are unable to verify results, consider not using them.
  • Ensure AI-generated content doesn't falsely impersonate or misrepresent you or others, or violate other commitments you have (such as those involving copyrighted content).
  • Be transparent with others about whether content you share was generated by AI or created by you.

 

Data hygiene while using Microsoft 365 Copilot

Microsoft 365 Copilot lives within the overall SFU Microsoft 365 platform, which has a rich set of features to help you create, collaborate, and share content with others. During your interactions with Microsoft 365 Copilot, it may draw on files and content your account has permission to view (such as a Word document or an email) to help generate a response, so you should always maintain good data hygiene through the following practices:

 

Guarding Against Oversharing

Microsoft 365 is a great platform for collaboration and productivity; however, content owners may inadvertently share more content than necessary (often called "oversharing"). Consider the following when sharing content on the Microsoft 365 platform:

  • Pick the right level of access for each link you make when you share a document. For example, you can share your work with anyone, only with people in your organization, or only the people you choose.
  • If you create "Anyone" links that can be shared publicly, set an expiry date on the link. This means your links will stop working after a time you define, which helps ensure that content isn't shared forever and reduces the overall burden of managing links.
  • Regularly review your links using the "My Content" feature in the M365 portal. Under the "Shared" tab, you can filter through all the links you've made and who can use them. You can also change or delete any links that you don't need or want anymore.

For an in-depth overview of how each link works while sharing information see: How shareable links work in OneDrive and SharePoint in Microsoft 365.
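The link-review habit described above can also be thought of as a simple audit: flag any "Anyone" link that has no expiry date or has already expired. The sketch below illustrates that rule on made-up link records; the field names and values are purely hypothetical and do not reflect the actual Microsoft 365 or Graph API schema:

```python
from datetime import datetime, timezone

# Hypothetical link records, loosely shaped like what the "Shared" tab
# surfaces. Field names here are illustrative, not a real API schema.
links = [
    {"file": "budget.xlsx", "scope": "anyone", "expires": None},
    {"file": "notes.docx", "scope": "organization", "expires": None},
    {"file": "draft.pptx", "scope": "anyone", "expires": "2023-01-01T00:00:00+00:00"},
]

def needs_review(link, now=None):
    """Flag 'Anyone' links that never expire or have already expired."""
    if link["scope"] != "anyone":
        return False  # links limited to the organization or named people
    if link["expires"] is None:
        return True   # public link with no expiry date set
    expiry = datetime.fromisoformat(link["expires"])
    return expiry <= (now or datetime.now(timezone.utc))

flagged = [link["file"] for link in links if needs_review(link)]
print(flagged)  # budget.xlsx (no expiry) and draft.pptx (already expired)
```

In practice you would perform this review manually through the "My Content" feature rather than in code; the sketch simply makes the decision rule explicit.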

 

Classifying Sensitive Data

As an AI companion, Microsoft 365 Copilot can interact with documents and content that you have permission to view in order to generate a response. For example, if you can view or edit a Word document, your AI companion may also access it when generating a response. As a result, it may review documents that contain sensitive information such as:

  • Personal Information: Such as names, addresses, phone numbers, personal email addresses, social insurance numbers, etc.
  • Confidential or Sensitive Content: Including proprietary business information, financial details, legal documents, or any other information intended to be kept private

For documents that contain sensitive information, follow SFU's data guidelines. These include:

  • Data minimization: Where documents and content only contain the amount of data needed for a specific task.
  • Principle of least privilege: Where data is only shared with other SFU employees in alignment with their role and duties.

 

Practicing Data Lifecycle Management

Regularly review and remove documents or content that are no longer needed in Microsoft 365. For documents that need to be archived and retained in the long term for legal reasons, store them in the appropriate location identified by your department. For example, store the final version of a Word document in a network file share or other system that has been identified as the archive space for your area.

For more guidance about file lifecycle management in Microsoft 365, see the following recommendations from the SFU Archives and Records Management department:

 

How your data is protected while using Microsoft 365 Copilot

Unlike public AI companions (such as ChatGPT or Bard), Microsoft 365 Copilot leverages the same data, access, and security boundaries as the SFU Microsoft 365 platform. This allows it to personalize and tailor responses to your questions without exposing requests outside the SFU organization.

These security and access boundaries include, but are not limited to:

  • Data isolation: Data is kept dedicated, separated, and isolated from other Microsoft 365 instances or AI environments. Microsoft 365 Copilot will not store responses or prompts outside the service boundary of the SFU environment.
  • Access Controls: Microsoft 365 Copilot only presents data that each individual can access using the same underlying controls as the overall SFU Microsoft 365 platform. This ensures the AI companion can only ground responses in content that the current user is authorized to access and that data isn't unintentionally leaked between users, groups, or other environments.
  • Encryption: Data is encrypted at every step, in alignment with the commitments made for the SFU Microsoft 365 platform.

For more information about the security and privacy controls in place for Microsoft 365 Copilot, see: Data, Privacy, and Security for Microsoft Copilot for Microsoft 365 | Microsoft Learn

 

How the underlying AI model is trained

Microsoft 365 Copilot is designed so your interactions are not used to train an underlying AI model (often referred to as a "Large Language Model" or LLM). Instead, your interactions are heavily augmented in the background with information from the Microsoft 365 platform (such as the context of where your request came from, what data you have access to, and other information related to your content) before they are processed by your AI companion. Since your requests are highly contextualized, the underlying model doesn't need to be trained on your data to understand you.

This ensures responses can be robust, tailored, and applicable to you while your inquiries stay within the boundaries of SFU. This approach is different from public AI companions which may train themselves on a variety of submitted and public content.

For an illustrated example of this in action, see the image below.

 

Advisory Notice from the SFU Privacy Office on the use of Microsoft 365 Copilot

In using this service, you agree to limit the upload, prompting, submission, or provision of access to documents or information containing personal (non-business contact) or confidential information to this Microsoft 365 Copilot AI companion to what is strictly necessary. We urge all users to exercise caution and ensure that any personal information disclosed or used within Microsoft 365 Copilot is handled in strict accordance with the Collection Notice under which the information was originally collected.

  • For greater clarity, Personal Information may include student names, private non-business addresses, non-work-related phone numbers, non-work-related email addresses, social insurance numbers, employment or education history, etc. For a more expansive list of personal information data elements, please see: https://www.sfu.ca/content/dam/sfu/policies/files/information_policies/I10-11/I10.11 Schedule 1 - May 29 2021.pdf
  • For greater clarity, Confidential or Sensitive Content may include regulated data, proprietary business information, financial details, legal documents, or any other information that is confidential in nature regarding the university.
     

Please avoid relying on any responses from this Microsoft 365 Copilot AI companion to make decisions concerning individuals unless you have verified the accuracy of the information provided to, and statements provided by, this Microsoft 365 Copilot AI companion.

Should you have any inquiries or concerns regarding the appropriate use of this platform, please contact the SFU Microsoft 365 team via the ITS Service Hub at https://servicehub.sfu.ca/.


We would like to caution users about the responsible use of Microsoft 365 Copilot AI. While AI enhances search, productivity, and the user experience, it is essential to be mindful of its potential consequences. Please consider the following when providing prompts to the AI:

  1. Ensure that personal information entered into prompts is used in accordance with the Collection Notice under which that information was collected. If any personal information is entered, please ensure that it is handled responsibly, in accordance with any obligations you may have under the Freedom of Information and Protection of Privacy Act (RSBC 1996, c. 165) and related university policies.
  2. AI can inadvertently amplify misinformation. Critically evaluate generated information and cross-reference it with reliable sources to verify accuracy.
  3. AI algorithms may unintentionally reflect biases. Be aware that results can carry inherent biases, and critically evaluate information generated by AI.
  4. Please be advised that all prompts entered into Microsoft 365 Copilot AI products, as well as the responses generated, are stored within the system regardless of how these responses are subsequently used or handled. 


It is your responsibility to adhere to these guidelines to maintain the privacy and security of personal information you have access to.
