Artificial Intelligence at Duquesne

Since the release of generative artificial intelligence (AI) tools such as ChatGPT in 2022, the use of AI has grown considerably, and its impact is being compared to some of the most significant technological advancements of our time. However, AI is not without limitations, and institutions across the globe are grappling with how to integrate these tools ethically, safely, and effectively.

In Duquesne University’s latest strategic plan, the University pledged to pursue the following imperative: “Duquesne will identify bold pathways to offer students the knowledge and skills they will need for productive and fulfilling lives and careers in the decades ahead.” To prepare students for the world beyond graduation, it is increasingly clear that Duquesne should explore ways to incorporate artificial intelligence in the teaching and learning process.

Over the past academic year, the Office of the Provost worked with Duquesne's faculty-based Artificial Intelligence Committee and other campus groups to develop the framework for the University’s response to artificial intelligence. This page highlights those efforts and offers guidance for faculty and students on various topics related to AI's applications in teaching and learning.

Integrating artificial intelligence tools at Duquesne will be a gradual and collective effort. Faculty, staff and students are encouraged to share feedback, questions and comments on the guidance below by emailing the Provost's Office. This page will be updated regularly to incorporate campus feedback and respond to new advancements in AI.

Duquesne's AI-Related Policies

Below are several existing AI-related policies that faculty and students should consider before using AI tools. This section is not comprehensive – schools and units across the University may have specialized policies not listed below. If you are unsure whether your use of artificial intelligence violates existing policy, contact your instructor or department chair for guidance.

TAP 26

Users of generative AI are responsible for the accuracy, privacy, regulatory compliance, and ethical use of content developed using generative AI tools or services and used in campus communications and documentation including email.

Computing and Technology Policy and Service Requirements

Developed by CTS, these policies and guidelines govern the use of technology at Duquesne University.

Data Governance Service Requirements

Restricted Data may not be used in conjunction with any generative AI tool that the University has not licensed and/or contracted for use.

Academic Integrity

All members of the Duquesne University community – including faculty, students, administration, and staff – are responsible for upholding academic integrity and maintaining a culture in which academic integrity can flourish.

Guidance for Faculty

The Provost's Office, in collaboration with the faculty-based Artificial Intelligence Committee, created preliminary AI guidance for faculty on various topics related to their teaching duties. The guidance is intentionally broad, as AI may be used in vastly different ways across Duquesne's 10 schools and more than 250 programs. Faculty are encouraged to follow the guidelines below and to ensure their AI use is consistent with any specialized AI policies from their school or department.

While many AI tools are available, it is essential that faculty use Duquesne-approved tools when interacting with confidential or restricted data. Unlike many publicly available AI tools, Duquesne-approved tools such as Microsoft Copilot include security protections for University data and intellectual property. For a list of AI tools approved for use at Duquesne, visit CTS's Generative Artificial Intelligence (AI) Guide.
 

Faculty Guidance

The sections below offer detailed, task-specific guidance.

Given the rapid advancement of AI, Duquesne’s faculty likely have varying levels of exposure to and proficiency with AI tools. Before considering how to integrate AI into their courses, faculty are encouraged to explore the various AI tools and their capabilities. Faculty should also track the latest news and updates on AI and evaluate how these tools are used in their respective fields. Faculty and academic leadership are encouraged to discuss these developments in faculty and leadership meetings to raise awareness and promote conversation about AI.

Faculty are also encouraged to consider attending professional development opportunities on AI. Schools, centers, and departments from across the University — including the Center for Teaching Excellence and the University Writing Center — host workshops, guest speakers, symposia, and other initiatives centered on the use of AI. These opportunities cover various topics related to AI’s use in higher education and will continue throughout the 2024-2025 academic year. To learn more, view the "AI Initiatives and Resources" section below.

Faculty are strongly recommended to develop an AI policy that stipulates acceptable uses of AI within the context of their course, both for themselves and their students. When developing an AI course policy, faculty should consider the following:
  • Ensure consistency with school/department policies: Some schools/departments have created specialized AI policies. Course AI policies should be consistent with any existing school and departmental policies. Faculty should confer with their department and abide by their respective school’s policy approval guidelines before implementing a specialized AI policy. Additionally, departments are encouraged to discuss implementing the same (or similar) AI policies across multiple sections of the same course.
  • Communicate your plan for using AI: If faculty intend to use AI to assist in developing course material or as a tool in course assignments or email correspondence, they should disclose this to students in their AI syllabus statement and review the statement at the beginning of the course.
  • Consider how AI interacts with your course's learning objectives: Faculty and student AI use should align with the course’s stated learning objectives.
  • Evaluate how AI is used in your discipline: If students in a specific discipline are expected to use AI in their future careers, faculty should consider integrating AI into the course in ways that help students become proficient in these tools.
  • Clarify acceptable types of AI tools: AI tools can perform a wide range of tasks, including writing, proofreading, and brainstorming. AI course policies should stipulate which types of tools are allowed in the course and in what capacity. For example, a course might allow editing and proofreading tools such as Grammarly but disallow ChatGPT and other text-generating tools.
  • Outline unacceptable uses of AI: Explain how AI use must be cited in the course and provide clear examples of policy violations.

Faculty should consider the following recommendations when using AI in course material and assignments:
  • Evaluate the accuracy of AI-generated content: AI tools can generate inaccurate information, pull information from unreliable sources, or fail to cite sources properly. Faculty should evaluate AI-generated content to verify it is factual, cited properly, and free of bias. In accordance with TAP 26, faculty maintain responsibility for ensuring the content they generate using AI tools is accurate.
  • Ensure required AI tools are accessible: If faculty create assignments that promote or require students to use AI, they should ensure that the required tools are accessible to all students and do not have undue barriers to access.
  • Use AI in ways that promote human skills: AI should support student learning, not supplant it. If faculty determine that AI is applicable in their course, they should strive to employ it in ways that develop students’ uniquely human skills, such as critical thinking and problem-solving, that AI cannot replicate.
  • Review resources on AI in course activities: The "AI Resources" section compiles several internal and external resources that provide guidance on how to adjust course assignments to account for AI. These include resources from Duquesne’s Center for Teaching Excellence (CTE), which developed a variety of workshops on teaching with AI tools. 

AI tools should not be used as the only means of assessing students. If AI is used in student assessment, faculty remain fully responsible for holistically evaluating student performance. Given the sensitivity and confidentiality of student assessment, faculty are not permitted to use AI programs in student assessment that have not been approved for use by the University.

Faculty should exercise caution when using AI detection tools to review student assignments, as it is extremely difficult to identify AI-generated work. There may be instances where detection tools incorrectly attribute work to AI or fail to detect AI-generated work.

Before using AI to assist in student assessment, faculty should consult with their department chair and abide by any existing school/department/program AI regulations and policies. If AI is used to support student assessment in any capacity, faculty are strongly recommended to disclose its use to students verbally and include a statement in the course syllabus.

This section provides guidance on the use of AI for administrative tasks such as responding to emails, note-taking, and drafting agendas or summaries. In all cases, faculty are responsible for ensuring their use of AI does not violate existing disclosure laws and policies for confidential information, including the Health Insurance Portability and Accountability Act (HIPAA), the Family Educational Rights and Privacy Act (FERPA), Duquesne University’s TAPs, and CTS’s Data Governance Service Requirements.

Faculty may use Duquesne-approved AI programs such as Microsoft Copilot to assist in email correspondence. While AI can assist in drafting emails, TAP 26 stipulates that faculty maintain total responsibility for all content generated using AI tools.

Because faculty have access to sensitive or confidential student information, any AI tool used to assist with email correspondence must have protections for institutional data and be approved for use by the University. Unprotected third-party AI software may not be used to read, respond to, or track emails on a Duquesne email account. Applications such as Mailbutler – which use AI to track and respond to emails – lack protections for institutional data and may threaten University security.

Before using AI for tasks such as taking meeting notes or generating summaries, faculty should ensure that all meeting participants consent to the use of AI in that meeting.

With AI tools becoming widely available, it is essential to ensure there are clear AI policies in place for students. Existing course policies may not sufficiently address each area where AI intersects with coursework. In the near term (fall 2024 and spring 2025), faculty still evaluating AI should consider adopting the following approach:

  • Adopt existing school/department AI policy: If your respective school or department has already developed an AI policy, refer to it in your individual course policy and syllabus statement.
  • Include an AI statement on your syllabus: If your school/department does not have an AI policy, consider adopting one of the sample syllabus statements created by the Provost's Office. You can find them in the "Syllabus Statements" section below.
  • Engage in transparent conversation with students: At the beginning of your course, have a transparent discussion with students about your stance on AI, your level of experience with AI tools, and your AI syllabus statement.

Syllabus Statements

Faculty are strongly encouraged to include an AI statement on their course syllabi. This section contains sample AI syllabus statements intended to assist faculty in creating their own syllabus language. The statements are segmented into three categories of use: permissive, limited and restrictive/no use. Also included is a sample prompt illustrating how students should cite the use of AI for course assignments. These statements were drafted by the AI Committee and updated by the Provost’s Office. Faculty are encouraged to use or modify the language in these examples to meet their course's needs.

Syllabus Statement Templates

Faculty are encouraged to use or modify these templates to create AI syllabus statements for their courses.

Permissive Use

“The use of generative artificial intelligence tools is permitted in this class. AI use is becoming more widespread across various industries and fields, and it is an important part of your education to build competency in using this technology. Some assignments or in-class learning activities may even encourage or require the use of AI tools. However, you must cite all instances of AI use in coursework. Failure to do so will be treated as a violation of academic integrity. Students are responsible for ensuring the accuracy of all work generated with the assistance of AI.”

Limited Use

“Utilization of generative artificial intelligence tools in this course will only be allowed for specific assignments with my explicit approval. In all cases, students must cite the use of AI in coursework. Failure to do so will be treated as a violation of academic integrity. Students are responsible for ensuring the accuracy of all work generated with the assistance of AI.”

Restrictive/No Use

“All work submitted by students in this course must be original and generated by the students themselves. Assignments must be completed without substantive assistance from others, including automated tools. Using generative artificial intelligence tools in classwork is prohibited and will be treated as a violation of academic integrity.”

If faculty allow the use of AI in their course, it is strongly suggested they adopt a course policy requiring students to cite all uses of AI in coursework. The AI Committee provided the following example of a prompt specifying how students should cite AI: 

When citing your use of AI in this course, you must specify:

  • What AI tool was used
  • What specific content was generated by AI 
  • What prompt was used to generate the AI content 
    • Both the prompt and original response from the AI tool must be provided (as a screenshot or PDF) 
  • What edits were made to the content generated by AI, if any
  • Why the content was generated with AI

 

Guidance for Students

Preparing students for the world beyond graduation increasingly requires developing their proficiency with AI tools. The Provost's Office created the following broad guidelines for students to follow when using AI:

Abide by Existing Policies

Student AI use is governed by individual course syllabi. In the absence of an AI course policy, students should abide by school/department policies related to the use of AI. If you are unsure if your use of AI is consistent with existing guidelines and policies, consult with your course instructor for additional guidance.

Use AI to Assist Learning, Not Replace It

Use AI in ways that support learning, not replace it. AI is a tool and should be used in ways that develop essential skills and abilities – such as critical thinking and problem-solving – that will prepare you for the world beyond graduation.

Cite All Uses of AI

It is strongly recommended that you cite all uses of AI in coursework. Citing AI use promotes transparency between faculty and students and ensures your work is properly attributed.

Ensure Accuracy of AI-Generated Content

AI tools frequently draw on sources that may not be factual or accurate. You should critically evaluate any AI-generated work for biased, misleading, or improperly cited content. Students maintain full responsibility for all content generated by AI.

AI Initiatives and Resources

Schools, departments, centers and institutes across campus have also created workshops, symposia and informational resources related to how AI intersects with their respective disciplines. During its benchmarking process, the AI Committee also compiled a series of external articles and resources related to AI's use in higher education. 
 

Initiatives and Informational Resources

The entries below cover Duquesne's AI events and initiatives, along with informational articles and resources for learning more about AI.

Artificial Intelligence Speaker Series: CTE, in collaboration with the Office of the Provost, is establishing an inaugural speaker series on artificial intelligence. This series will feature prominent experts from across the globe who will offer varied perspectives on using artificial intelligence. Speakers will represent a broad range of disciplines from across the higher education sector. This series is expected to begin during the 2024-2025 academic year. More information will be released as it becomes available.

Center for Teaching Excellence AI Workshops: CTE has expanded its support for faculty by creating a series of generative AI workshops and plans to host several more during the 2024-2025 academic year. CTE has also made available recorded lectures and slides from its AY2023-2024 AI workshops. These resources are located on CTE’s intranet page and require you to log in with your Duquesne MultiPass credentials.

Grefenstette Center for Ethics in Science, Technology, and Law: In spring 2024, the Grefenstette Center was named to the National AI Safety Institute Consortium at NIST. Artificial intelligence has also become a focus of the Center’s annual Tech Ethics Symposium. On October 17-18, 2024, the Center hosted the 5th Annual Tech Ethics Symposium: “How is AI Transforming Our Communities?” This two-day symposium focused on how generative AI is transforming our daily lives and our communities.

University Writing Center: The Writing Center is a vital resource for students as they develop their writing skills. With the rise of generative artificial intelligence, the Writing Center has taken several steps to ensure it remains an effective resource, including hosting AI workshops and trainings and developing a framework for using AI at the Center. The Writing Center’s intranet site provides more information about its approach to AI.
 
  • Artificial Intelligence LibGuide: Created by Gumberg Library, this guide is a valuable resource for learning the basics of artificial intelligence. It provides an overview of what AI is, how some commonly used AI tools work, and the applications and ethics of AI. The guide aims to give readers enough background to understand the implications of news and advancements in the field, and it points readers interested in creating their own AI tools toward resources for learning more.
  • Generative Artificial Intelligence (AI) Guide: This CTS guide provides information on the legal, ethical, and privacy considerations of using AI at Duquesne. It also contains a broad overview of Microsoft Copilot, a Duquesne-approved AI tool. 
  • CTE's Teaching With Generative AI page: This page provides guidance on using generative AI tools in the teaching process. It offers examples of best practices and unethical uses, as well as sample syllabus statements and assistance with prompt writing.

The external resources below are intended to guide Duquesne’s faculty and students as they develop skills in AI and integrate these tools into their teaching and learning. They include AI trainings, guidance on modifying syllabus language, and ways to adjust course activities to account for AI. Please note that the resources listed below do not represent Duquesne's official position or policy and are provided for informational purposes only.

Learning About AI:

  • List of Generative Artificial Intelligence Programs: AI is rapidly evolving and new tools are released frequently. This GitHub page compiles a list of generative AI tools categorized by task, including editors, text generators, video/photo generators, and more.
  • Educause AI Guide: This broad overview of AI in higher education compiles Educause’s AI guidance as well as policy examples from universities across the country.
  • Google Generative AI Training: Google offers free generative AI training and certification. Designed for any role or skill set, this page compiles learning opportunities to help you understand generative AI, how it can be used, and how it differs from other types of machine learning.
  • Microsoft Copilot Training: Microsoft’s official Copilot website compiles training, videos, and other resources for learning how to use the tool effectively. You can access Copilot by logging in with your Duquesne MultiPass credentials.

 

AI in Pedagogy, Course Design:

  • Approaches to Assessment in the Age of AI: King’s College London created a guidance page for adapting student assessment to account for the presence of generative AI.
  • Bloom’s Taxonomy of Learning Revisited: Oregon State University created this table “as a reference for evaluating and making changes to aligned course activities and assessments (or, where possible, learning outcomes) that account for generative Artificial Intelligence tool capabilities and distinctive human skills.”
  • Sentient Syllabus Project: Sentient Syllabus is a collaborative project created by prominent scholars from colleges and universities worldwide. It was designed to assist faculty in responding to artificial intelligence’s growing use in higher education. This project covers a broad range of topics, including adjusting syllabus language, course objectives, and course activities to account for AI’s presence.
  • Should You Add an AI Policy to Your Syllabus? This article from the Chronicle of Higher Education offers a step-by-step guide for faculty considering whether and how to add an AI statement to their course syllabus.
  • Will ChatGPT Change the Way You Teach? The Chronicle of Higher Education’s “Teaching” newsletter compiles advice and resources related to teaching with generative AI.