Key to the Icons for Quick Information

Icons (with letters, colors, and alternative text descriptions) are provided to offer quick information. 

  • A green square with the letter G indicates protections and useful options for privacy, security, and accessibility
  • A yellow square with the letter C indicates caution is needed regarding privacy, security, and/or accessibility
  • A red square with the letter R indicates risk in the areas of privacy, security, and/or accessibility
  • A purple square with the letter A indicates a platform or tool that Amherst College has reviewed for privacy, security, and accessibility, and has a contract or purchase agreement that may have applied additional protections or requirements
Image: OpenAI logo in green

ChatGPT & DALL-E

Two tools from OpenAI: a generative AI chatbot and a text-to-image tool. 
Caution is needed for privacy and security. 
Useful options for accessibility. 
Computer code generated by these tools will require accessibility checks.

Image: yellow square with the letter C

Image: Zoom logo, a video camera on a blue button

Zoom Phone & Video

Amherst College's phone, audio, and video meeting platform. 
Additional privacy and security protections through a Campus license agreement. 
Many accessibility features.

Image: purple square with the letter A
Image: green square with the letter G

Image: Google Bard logo, two four-pointed colorful stars

Google Bard

Generative AI chatbot developed by Google.
Caution is needed for privacy and security because a large amount of data may be collected. 
Useful options for accessibility.

Image: red square with the letter R

Image: Microsoft Copilot logo, a rainbow-colored curving abstract shape

Microsoft Copilot

AI chatbot developed by Microsoft that creates text and images.
Caution is needed for privacy and security because a large amount of data may be collected. 
Useful options for accessibility. 
Computer code generated by the tool will require accessibility checks.

Image: red square with the letter R

College President’s and Provost’s Generative AI Recommendations

Proceed with caution when using generative AI tools (including within platforms such as Slack, Tableau, Google, and Office365 that are rapidly adopting them).

While we recognize the potential for innovation and creativity that generative AI tools offer, a subject that will be explored in a variety of ways on campus, the guidance below focuses largely on mitigating risks that we wish to bring to your attention, some of which carry serious consequences.

Many of these tools do not offer robust privacy controls. Unless the College's licensing agreement ensures that content is not used externally or to train algorithms without anonymization, please follow the guidelines below: 

  • In interactions with these tools, do not use information that you, the College, or our faculty, staff, or students would not want shared publicly. Examples include financial statements, payroll or salary details, disciplinary records, College processes, and system architecture and/or vulnerabilities.
  • Do not use personally identifiable information (PII) or data files that are not anonymized, such as donor history, student transcripts, tenure review details, resumes or application letters, payment history, etc. (One way to anonymize a data file is sketched after this list.)
  • If you are considering licensing a new tool or utilizing a new AI feature in a tool you already use, please work with AskIT to ensure that the tools and services you procure or are using on behalf of the College have appropriate privacy and security protections and are assessed for risk prior to use.
  • If you have already begun using new tools or features, configure the platform not to use or retain chat/prompt history, if possible. Once data are placed into these platforms, there is no guarantee that they can later be removed.
  • Artificial intelligence tools are advancing rapidly, but they can still be inaccurate and occasionally produce hallucinations. Moreover, they may exhibit inherent biases. Always verify the suggestions or information they provide, particularly in technical disciplines and matters related to student work and the honor code.
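
To make the anonymization guidance above more concrete, the following is a minimal sketch, in Python, of one way to drop or hash identifying columns in a CSV file before any portion of it is shared with an AI tool. The file name and column names are hypothetical, and the sketch is an illustration only, not a College-approved procedure; it does not replace the risk review described above.

    # Minimal sketch: strip or hash hypothetical PII columns before any data
    # are shared with a generative AI tool. File and column names are
    # illustrative assumptions, not actual College data fields.
    import hashlib

    import pandas as pd

    PII_COLUMNS_TO_DROP = ["name", "email", "home_address"]  # direct identifiers
    PII_COLUMNS_TO_HASH = ["student_id"]  # kept only as opaque, joinable keys

    def anonymize(path_in: str, path_out: str) -> None:
        df = pd.read_csv(path_in)

        # Remove columns that directly identify a person.
        df = df.drop(columns=[c for c in PII_COLUMNS_TO_DROP if c in df.columns])

        # Replace identifiers that must remain joinable with a one-way hash.
        for col in PII_COLUMNS_TO_HASH:
            if col in df.columns:
                df[col] = df[col].astype(str).map(
                    lambda v: hashlib.sha256(v.encode()).hexdigest()[:12]
                )

        df.to_csv(path_out, index=False)

    if __name__ == "__main__":
        anonymize("enrollment_sample.csv", "enrollment_sample_anonymized.csv")

Even after a step like this, review the output before sharing it; combinations of seemingly harmless fields can still identify individuals.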

If you have questions that focus on the technical aspects of AI, please reach out to David Hamilton, chief information officer. For all other questions, please contact the co-chairs of the task force, Chris Kingston and Jaya Kannan.

We thank you in advance for your attention to these matters and look forward to navigating this new frontier with the Amherst community.