Approved AI tools at SFU

OVERVIEW

The SFU Privacy Management Program has established a process to assess the risks of Personal or Confidential information being improperly disclosed and used when working with software applications that incorporate AI tools.

 

Assessed AI Applications

The applications listed below have been assessed and found safe to use, provided that users comply with the specific privacy, information security, and intellectual property guidance provided by SFU.

Copilot Chat
  Type: General purpose AI assistant
  SFU Authorized Uses: Productivity assistant for work, study, and research at SFU; learning and teaching tool. No personal information should be entered into the AI assistant without the consent of the person to whom the information pertains.
  Who can use it: SFU wide

Microsoft 365 Copilot
  Type: Personal AI assistant
  SFU Authorized Uses: Advanced faculty/staff productivity assistant and pedagogical enhancement (not for student assessment)
  Who can use it: SFU faculty and staff – available upon request

ChatGPT
  Type: AI assistant, AI text generator (LLM)
  SFU Authorized Uses: Pedagogical purposes in SFU courses
  Who can use it: SFU wide

Ednius
  Type: AI text generator (LLM)
  SFU Authorized Uses: Pilot AI grading program
  Who can use it: Unit specific – Beedie School of Business

Run Diffusion
  Type: AI art generator
  SFU Authorized Uses: Creative/media course use only; not for ID photos or official branding
  Who can use it: Unit specific – School of Interactive Arts and Technology (SIAT)

To request a Privacy Impact Assessment (PIA) for an AI-enabled tool, please see the overview and forms here: Privacy Impact Assessments 

Concerns and considerations when using AI applications that do not have an SFU PIA 

When using an AI application that does not have an SFU PIA (i.e., one that is not listed above), always keep these six considerations in mind:

Unauthorized data collection

  • BC’s Freedom of Information and Protection of Privacy Act (FIPPA) strictly prohibits unauthorized collection, use, and disclosure of Personal information. The scope of “Personal information” is fairly broad and includes information such as name, image, marital or family status, unique identifying numbers, demographic information (gender, race, ethnicity, religion), educational history, employment history and job performance, personal contact information, financial history, medical history, disability status, an individual's recorded personal views or opinions, and anyone else's recorded views and opinions about an individual. The ease with which personal data can be accidentally used or disclosed is a significant concern, especially if the data is used for purposes beyond what the user agreed to at the original point of collection, such as unwanted targeted advertising or profile generation.

Surveillance and tracking

  • AI applications can be used for monitoring individuals online and in the physical world, raising significant concerns about mass surveillance and loss of privacy. Data that you enter into GenAI-enabled software applications can be used to deliver highly targeted advertisements, influence decision-making (e.g., what content you see), or even predict future behaviour. In some cases, this tracking extends beyond the application itself, following users across websites and devices using tracking pixels, cookies, and device fingerprints.

Data breaches and leaks

  • AI systems often store and analyze large datasets, making them attractive targets for hackers and cybercriminals. If an AI system containing medical, educational, or financial information is breached, Personal or Confidential information could be exposed. This threatens individual privacy and can lead to identity theft and financial fraud. Report such breaches to Information Security Services via SFU ServiceHub and the Privacy Management Program.

Automated decision-making and loss of human oversight

  • AI can improve efficiency and decision-making, but excessive reliance on AI without human oversight can have serious consequences. AI can be used to support business activities such as hiring employees or assessing student admissions by making decisions based on data patterns. However, these decisions may lack transparency, making it difficult for individuals to understand the outcome, question the results of the analyses, or recognize any biases that may have influenced the decision. This can lead to unfair or discriminatory outcomes, particularly if the AI is trained on biased data. If you are going to use AI to make a decision, you need to be prepared to defend both the decision and the decision-making process.

Threats to Information Security

Copyright infringement

  • GenAI applications may use content provided to them as training data to further refine their performance. You should be aware that uploading or pasting part or all of a document into this type of GenAI application may violate copyright. Also remember that even if you or SFU owns the copyright to a document, always check that it does not contain Personal or Confidential information as outlined above. Additionally, many third-party copyrighted works are made available to SFU users under license, and in many cases these licenses prohibit use of the content with GenAI applications, so always check the license terms before uploading such content.

 

Subject-specific contacts

Privacy Queries can be sent to privacy@sfu.ca.

Information Security Queries can be sent to information-security@sfu.ca.

Risk Queries can be sent to risk_srs@sfu.ca.

Copyright Queries can be sent to copy@sfu.ca.

 

Definitions

Personal Information: recorded data about an identifiable individual other than business contact information.  Examples include name, image, marital or family status, unique identifying numbers, demographic information (gender, race, ethnicity, religion), educational history, employment history and job performance, personal contact information, financial history, medical history, disability status, an individual's recorded personal views or opinions, and anyone else's recorded views and opinions about an individual.

Confidential information: includes trade secrets and proprietary data.


Related Articles

Learn more about Microsoft 365 Copilot (a personalized productivity AI assistant) at SFU.
Learn about privacy and security guidance for using Copilot at SFU.