Introduction
The exponential growth and adoption of Generative Artificial Intelligence (Generative AI or GenAI) tools and services such as ChatGPT and Google Gemini cannot be ignored. As major technology organizations add Generative AI capabilities to existing services, such as Microsoft’s Copilot and Azure OpenAI, the technology is becoming more ubiquitous than ever. The increase in API-accessible GenAI solutions extends the potential reach of these tools even further. The implications of this technology are far-reaching, not just within higher education, but in nearly every industry across the globe.
Generative AI represents a significant opportunity for individuals and institutions that can leverage it in a creative, responsible and ethical manner, but it also poses serious risks for those who adopt GenAI without fully understanding those risks. The University of Wisconsin-Milwaukee is committed to providing its students and employees with the skills and experience necessary to sustain success in this new and developing landscape.
What is GenAI?
Generative artificial intelligence is artificial intelligence capable of generating text, images, or other media using generative models such as large language models (LLMs). Generative AI models learn the patterns and structure of their input training data and then generate new data that has similar characteristics. These tools leverage deep learning technologies and neural networks to achieve their goals.
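The core idea, learning the patterns of training data and then generating new data with similar characteristics, can be illustrated with a deliberately simple sketch. The character-level Markov chain below is far simpler than the deep learning models and neural networks that power real GenAI tools, but it shows the same learn-then-generate loop:

```python
import random
from collections import defaultdict

def train(text, order=2):
    """Learn which character tends to follow each length-`order` context."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        context = text[i:i + order]
        model[context].append(text[i + order])
    return model

def generate(model, seed, order=2, length=40):
    """Produce new text whose local patterns resemble the training data."""
    out = seed
    for _ in range(length):
        choices = model.get(out[-order:])
        if not choices:  # context never seen in training; stop early
            break
        out += random.choice(choices)
    return out

corpus = "generative ai generates new data that resembles its training data. "
model = train(corpus * 5, order=2)
print(generate(model, "ge"))
```

The output recombines fragments of the training text rather than copying it, which is the toy-scale analogue of a model producing novel content with familiar characteristics.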
Types of Generative AI: Public Cloud, Private Cloud, and Local (On-Premises)
At a high level, GenAI tools can be implemented and accessed in three distinct ways, with each implementation type significantly affecting data privacy concerns and security requirements. It should also be noted that a particular use case can be realized using any one, or even a combination, of the implementation types described below.
- The most well-known implementations are in a public cloud. Large Language Model-powered chatbots like ChatGPT and Google Gemini fall into this category. These tools are available to the public via a web browser, and the training data used by these tools includes publicly accessible information on the internet, along with – in some, but not all, cases – prompt input from users of the platform. As a result, users should avoid inputting any sensitive information into these tools. Certain cloud software vendors may also offer microservices within their platforms that are powered by AI. Please consult with the Information Security Office before engaging with these services for university business.
- The next type of Generative AI implementation is a private cloud, wherein an entity with significant computing resources hosts the model and training data provided by a client. The hosting entity allows the client to access the model via a private, authenticated interface. Third-party vendors, such as Microsoft with Azure OpenAI, offer private cloud implementations to individuals and organizations. The private nature of these implementations allows more sensitive information to be sent to the platform and could potentially increase the variety of approved use cases when compared to public cloud implementations.
- Finally, there is the local (on-premises) implementation, wherein the model and training data are hosted on a desktop, laptop, or server that is accessible only to individuals who have been granted access to the host device by the device administrators. As Generative AI tools become increasingly accessible and lightweight (e.g., Microsoft’s Windows AI Studio), local GenAI implementations will become more popular. In local implementations, the model and the training data stored on these assets need to be tightly controlled. This increases the responsibilities of the individuals managing these implementations to ensure compliance with relevant policies and regulations, but it also opens the door for additional use cases when compared to public cloud implementations.
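As an illustration of the caution above about entering sensitive information into public cloud tools, the sketch below shows one hypothetical way to scrub obviously sensitive strings from a prompt before it leaves a local machine. The pattern list (an email address, an SSN-style number, and an assumed nine-digit student-ID format) is illustrative only and is not an approved or complete safeguard; consult the Information Security Office for actual requirements:

```python
import re

# Illustrative sketch only: redact a few obviously sensitive patterns before a
# prompt is sent to a public cloud GenAI tool. The patterns below are
# assumptions for demonstration, NOT a complete or approved safeguard.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "STUDENT_ID": re.compile(r"\b99\d{7}\b"),  # hypothetical local ID format
}

def redact(prompt: str) -> str:
    """Replace each match of a sensitive pattern with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label} REDACTED]", prompt)
    return prompt

print(redact("Email jdoe@uwm.edu about student 991234567, SSN 123-45-6789."))
```

Even with a filter like this in place, the safest default remains the one stated above: do not put sensitive information into public cloud tools at all.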
Evolving Technological and Regulatory Landscape
The Generative AI environment is constantly evolving. The number of GenAI tools and varying uses for Generative AI are growing at a blistering pace. On top of this, numerous legal matters surrounding Generative AI, including intellectual property, copyright and data ownership, are still undecided. Additionally, new federal regulations, state legislation, or Universities of Wisconsin policies could affect how these tools are used. As the legal and political fields work through these matters, significant changes could come to the Generative AI landscape. It is important that individuals keep up with the ebbs and flows of the overall Generative AI environment.
Use Cases for Generative AI
As part of the university’s desire to enable Generative AI on campus, below is a list of use cases that can be explored by campus stakeholders interested in leveraging GenAI. This list is not exhaustive, and more use cases can be added as needed. If you are a faculty member or student who is interested in establishing a different use case, please reach out to the Center for Excellence in Teaching & Learning (cetl@uwm.edu) for consultation on how to proceed. University staff should reach out to the Information Security Office (infosec@uwm.edu) for further consultation about how to proceed.
- Instructional support: assistance in developing teaching resources and assignment rubrics.
- Language learning support: AI tools offer translation features that can be leveraged to increase the accessibility of teaching resources.
- Tutoring and mentoring during lecture: This use case would allow for more active engagement in the classroom, augmenting learning by providing individualized information and assignments to best support student success. Please note that the AI environment in use must meet the requirements for FERPA protected data.
- Adaptive Learning Systems: This allows for personalized tutoring for individual students. Please note that the AI environment in use must meet the requirements for FERPA protected data.
- Predictive analytics for student success: early warning systems that allow for early intervention and support to improve student success rates. Please note that the AI environment in use must meet the requirements for FERPA protected data.
- Supporting administrative efficiency: augmenting staff work by enabling AI-content generation that can be leveraged for administrative purposes. Please note that depending on the data involved, a review of the use case by Information Security and Research Computing may be required. Also, please see MarComm’s guidelines for using Generative AI when creating content.
- Brainstorming: Staff needing to address problems in novel ways may choose to utilize Generative AI to enable brainstorming of new ideas. Please note that if you are planning to enter university data onto a platform to conduct a brainstorming exercise, a review of the use case by Information Security or Research Computing may be required.
- Content generation or revision: GenAI tools can be used to revise or create content that is meant for public consumption. Any GenAI content that is going to be published should be reviewed by a human before being published, and the use of the GenAI tool involved should be cited. Please note that depending on the data involved, a review of the use case by Information Security or Research Computing may be required. Please see MarComm’s guidelines for using Generative AI when creating content.
- Translation: AI tools offer great translation features that can be leveraged to increase accessibility of resources by various campus stakeholders.
- Idea generation for areas of potential research: Leveraging AI tools to identify new and emerging areas of research.
- Accelerating research: Researchers can leverage Generative AI tools to accelerate their research. Faster analysis of research outcomes can reveal patterns and findings that humans may have difficulty identifying. Please note that depending on the data involved, a review of the use case by Information Security, Research Computing, or the Institutional Review Board may be required.
- Quicker code generation: Researchers who require code generation to perform their research can utilize coding-specific Generative AI tools to improve and accelerate their projects. Please note that depending on the data involved, a review of the use case by Information Security, Research Computing, or the Institutional Review Board may be required.