As generative AI tools rapidly evolve, faculty and staff at The University of Alabama face new challenges and opportunities. While many AI tools are making their way into higher education, not all offer the level of data protection required to safeguard university information. To better understand the potential benefits and risks of these technologies, the Office of Information Technology (OIT) recently piloted two prominent AI tools, Microsoft 365 Copilot and Otter.AI, to evaluate their functionality, security, and feasibility for campus-wide use.
Microsoft 365 Copilot Pilot
The first tool under review was Microsoft 365 Copilot, which integrates generative AI into the Microsoft 365 suite. Copilot draws on Microsoft Graph to tailor its outputs to the data each user is already permitted to access, adding a layer of security. All prompts are processed by a private instance of Microsoft’s large language model (LLM), offering further assurance that university data stays protected.
For this pilot, OIT engaged 35 staff members in a range of roles, including content creators, designers, and managers, who used the tool for two months. Throughout the first month, short training sessions were offered several times a week to familiarize users with Copilot’s features. Surveys were also conducted at several stages of the pilot to capture feedback on the user experience.
The results were mixed. Copilot offered some helpful features, such as email coaching for sentence restructuring and image discovery, but the tool still had clear limitations. For instance, it often altered the tone of emails in unhelpful ways, and written communications it generated required significant editing. Many users indicated that more training on how to prompt the AI would be needed to get the most out of the tool.
From a security perspective, Copilot was approved for use without restrictions, making it a safe option for the university. The cost, however, is $30 per employee per month and would need to be borne by individual departments; a department licensing ten employees, for example, would pay $3,600 per year. While OIT found some useful features, the overall return on investment (ROI) was hard to quantify, leaving each department to weigh the cost against the benefits.
Otter.AI Pilot
Next, OIT assessed Otter.AI, a generative AI tool for virtual meetings that offers transcription, note-taking, and summarization capabilities. Like Read.AI, a similar tool previously prohibited on campus due to security concerns, Otter.AI aims to make meeting management more efficient by generating notes and action items. It even includes an “OtterPilot” feature that can attend meetings in place of an employee.
In this one-month pilot, 20 employees used Otter.AI after receiving training from both OIT and an Otter.AI representative. A survey at the end of the pilot rated Otter.AI’s transcription quality as fair to adequate, but it also surfaced drawbacks. Users frequently had to identify speakers manually, and transcription quality dropped when multiple people spoke at once. Additionally, action items were sometimes incorrect, and the tool struggled with accented speech.
Most users found Otter.AI’s transcription less reliable than Microsoft 365 Copilot’s. The pilot concluded with a decision to allow the professional and business versions of Otter.AI on campus, but only with restrictions: sensitive or restricted data should not be used with the tool. The free version of Otter.AI was prohibited outright due to security concerns.
Moving Forward with AI at UA
The AI pilot programs provided valuable insight into how tools like Microsoft 365 Copilot and Otter.AI might fit into The University of Alabama’s technology ecosystem. While both tools showed potential, they also underscored the need for thorough training, a clear understanding of each tool’s limitations, and a careful assessment of security risks before any AI solution is rolled out more broadly. OIT remains committed to exploring innovative technologies while ensuring the protection of UA data.
As generative AI continues to advance, these pilots mark the beginning of a thoughtful and informed approach to integrating AI tools into the university’s operations.