Teaching Resources


As you teach and think about ways to innovate in your courses, or when you run into obstacles, consider looking over the resources we provide on this page as well as in our archive (Box link).

If you're interested in Syllabus Resources (crafting learning goals, guidance on policies and support statements, tips on tone), want to consider new ways of approaching Lectures in your class, are thinking about utilizing Rubrics, have questions about Feedback on Teaching (both in peer-to-peer settings and with students), want to utilize Early Course Evaluations to generate more feedback from students, are looking for new Teaching Tools that can help foster active learning in your classes, or want to start Teaching with Generative AI, there are resources to support you.

If there is a particular topic you want to learn more about, one of the best ways for you to receive assistance is through a consultation. You can follow the link below for more information. NOTE: We at CTE do not participate in the formal evaluation of faculty or TAs (e.g., for hiring, promotion, tenure, merit pay, recommendation letters).

You might also find valuable support via some of the recorded workshops (requires multipass login).

Cultivating and Navigating Feedback on Teaching

Feedback on your teaching comes in many different forms, and each requires a distinct attitude and approach to get the most out of it. These resources provide what you need to facilitate and manage feedback on your teaching so you can succeed and grow at Duquesne.

Required student evaluation of teaching is a tool for developing your pedagogical skills from semester to semester. It is integral to promotion, tenure, and annual review, as well as some awards.

Teaching and learning are at the heart of Duquesne. In order to assure quality and provide regular feedback to instructors on their teaching, Duquesne uses two kinds of teaching evaluation: student and peer. Both student and faculty peer perspectives on teaching and course design are helpful, each in its own way. Findings from the evaluation of teaching are useful both for improving one's teaching (formative evaluation) and for hiring, promotion, and tenure decisions (summative evaluation).

Students complete the Student Perception of Teaching (SPOT) to provide feedback about their instructor's teaching and the course's design. This survey is used in face-to-face, hybrid and online courses. Clinical courses use a different evaluation of teaching.

If you'd like to review the questions that are utilized in the SPOT form, please review our SPOT handout (Box Link).

Increasing response rates is one important action to take in order to alleviate concerns about the generalizability of the student perceptions of teaching (Goodman, Anson, & Belcheir, 2015). In a study at Brigham Young University, Johnson (2003) found that the way faculty communicate with students about the online survey influences the response rate:

Average response rates by type of faculty communication:

  • Assigned students to complete the online rating forms but did not give them points: 77%
  • Encouraged students to complete the online forms but did not make it a formal assignment: 32%
  • Did not mention the online student-rating forms to students: 20%

Effective strategies include:

  1. Working towards creating a climate of mutual respect, one where student opinions are respected and addressed and instructor needs are taken into consideration (Chapman & Joines, 2017).
  2. Discussing the importance of student ratings to the faculty member and their efforts to improve the course (Ballantyne, 2003; Linse, 2016).
  3. Noting that their feedback will likely benefit future students (Linse, 2016).
  4. Starting your next semester by discussing what you learned from the surveys and how you are adjusting your teaching or the course as a result.
  5. Providing non-point incentives (e.g., a class treat for reaching a certain completion percentage) to complete the evaluation (Goodman et al., 2015).
  6. Giving time in class to complete the evaluation (Goodman et al., 2015). For example, allow time during class for students to complete the SPOT, either using their own devices or holding the class in a computer lab. The course instructor needs to be absent while students complete the SPOT. Alternatively, make completion of the online ratings a course assignment (e.g., "Tonight, as part of your homework, please complete the online course evaluation on Canvas."). This would be part of routine homework, not for points toward the final grade.
  7. Sending multiple reminders to students (Linse, 2016).

In the spirit of supporting faculty, CTE has worked with Duquesne faculty to create resources that explain the University's procedures and goals for peer evaluation and that faculty may find useful to consult with their colleagues when being evaluated or when carrying out an evaluation.

While CTE does not have the capacity to set university policy, our goals are to:

  • Explain the guidelines found in the Faculty Handbook and the policies and procedures put forth by Academic Affairs. 
  • Offer recommendations and/or food for thought to discuss with your colleagues.
  • Foster a culture where teaching and learning are valued as skills that grow over time.

CTE is incredibly thankful to our Faculty Focus Group on Peer Evaluation (Spring 2022) for all the feedback and input they provided in shaping the support and guidance this page and its resources provide. Thank you.

Below you'll find links to two guides related to Peer Evaluation. The first is a pre-observation guide that begins by covering policies from Duquesne University's Academic Affairs relevant to the evaluation process, then moves on to important concepts and discussions for evaluators and evaluatees to engage in together. The second guide examines how learning is defined and evaluated at Duquesne University, with in-depth analyses of 'rigor' and 'engagement', the two main evaluated qualities of learning at Duquesne. This second guide also provides multiple discussion questions to help evaluators and evaluatees consider together how 'rigor' and 'engagement' function and appear in their specific discipline and courses.

Pre-Observation Guide and Recommendations (Box Link)

Evaluating Learning, Rigor, and Engagement at Duquesne (Box Link)


There is no official standardized form of peer evaluation at Duquesne University. However, it is always useful to have some samples to review and consult for ideas. The following are evaluation tools from scholars and other institutions that are excellent in their own right but also have areas that could be strengthened. Along with links to these documents, we provide commentary on the standout strengths of each resource and the areas to be aware of. These assessments are based on our own views and the views of faculty who participated in our focus group on peer evaluation in the Spring of 2022.

Note: To access the documents below, you'll need to sign in to Box with your DORI credentials.

University of Toronto Peer Evaluation Rubric (Box Link)

Standout strengths: Thorough, multiple evaluation rubrics, tools for self-evaluation, helpful set of "questions to avoid" for evaluators. Great questions to consider using for an observation starting on page 16 of the document.

Aspects to be aware of: Toronto's guide is understandably dense and takes some time to get through (but the content is worthwhile!).   

University of Kansas Peer Evaluation Rubric (Box Link)

Standouts: Straightforward, strong rubric categories, good language for categories ('developing', 'proficient', and 'excellent'), concise (the rubric itself is only one page), checklist-style evaluation, which is helpful for taking notes to jog your memory later.

Aspects to be aware of: The rubric categories are strong, but there is limited explanation of them. If you are unfamiliar with the teaching approaches, language, or rubric categories, this rubric will require further external reading to be fruitfully usable. Given this, it's important to review the rubric and its content with your peer to establish a shared understanding if you plan on using it or borrowing elements from it in your evaluation. It is also important to keep in mind that a filled-out checklist does not, by itself, constitute a peer evaluation of teaching at Duquesne.

UCLA Peer Evaluation Rubric (Box Link)

Standouts: A procedural evaluation approach that requires reflective writing for each evaluated category, offers strong framing questions to keep in mind during the evaluation, and segments the class observation into 5-10 minute sections.

Aspects to be aware of: Similar to the rubric from University of Kansas, UCLA's document expects users to already possess fairly well-developed teaching knowledge. If considering using this resource or borrowing elements from it for your evaluation, make sure to establish shared understanding of those elements with your peer prior to the observation.

Mansfield Library Pre-Observation Questions (Box Link)

Standouts: Focused on the specific, and important, pre-observation part of the process. University of Toronto's guide also offers some insight on this, but Mansfield Library's approach is easy to adapt to your specific needs.

Aspects to be aware of: While this document can help frame the actual observation, it does not provide insight into how to proceed with the actual evaluation. Additionally, make sure you take the time to reflect on whether the questions work as they are or if they would benefit from being slightly modified/adapted to your specific concerns relative to your discipline and/or teaching and learning context.

Receiving student feedback early in the semester helps you examine your teaching, implement strategies for better student learning and potentially improve scores on Student Evaluation Surveys. And you get to ask for the particular feedback that you're seeking.

Impact of Early Course Evaluation on End of Semester Evaluations:

Peter Cohen's meta-analysis of studies on the impact of early-course evaluations on end-of-term evaluations concludes, "Instructors receiving mid-semester feedback averaged .16 of a rating point higher on end-of-semester overall ratings than did instructors receiving no mid-semester feedback" (Cohen, 1980). In a more recent study at Brigham Young University, the authors show that the impact of midcourse feedback on end-of-term feedback depends on what instructors do with the early course evaluation: "Student ratings showed improvement in proportion to the extent to which the faculty member engaged with the mid-course evaluation. Faculty who read the student feedback and did not discuss it with their students saw a 2 percent improvement in their online student rating scores. Faculty who read the feedback, discussed it with students, and did not make changes saw a 5 percent improvement. Finally, faculty who conducted the mid-course evaluation, read the feedback, discussed it with their students, and made changes saw a 9 percent improvement" (McGowan & Osguthorpe, 2011).

Processing the Feedback from Your Evaluations:

The following advice is adapted from an article by Buskist and Hogan (2010).

  • Throw out the off-the-wall comments that do not provide you with useful information, and forget about them. Example: "She needs a haircut and a new pair of shoes."
  • Set aside the positive comments that don't tell you anything specific. Example: "Best class ever."
  • Divide the negative comments into two groups: those you can change and those you cannot. Can change: "...redistributing the points for different assignments because of the amount of work that they perceived was required for each assignment." Cannot change: "...let students out of class early rather than keeping them the entire class period."
  • Work on perceptions and learn to be explicit. As we look at our evaluations, we often think, "But I do that!" If we feel we are doing the things that students say we are not doing, then we may need to address students' perceptions.
  • Savor the comments that are meant to be negative but let you know you are doing your job. Examples: "She made us think." "Dr. S. is a very influential teacher, but I didn't come to college to be influenced."

Discussing the Feedback with your Students:

Karen Lewis (2001) says, "Perhaps the most important part of conducting a mid-semester feedback session is your response to the students. In your response, you need to let them know what you learned from their information and what differences it will make."

For more information and sample Informal Evaluations/Surveys, follow the link below.

 

Learn More

Cultivate Feedback from Students through Early Course Evaluations

Early course evaluations are a way to gauge how the semester is going for your students while there is time for you to make adjustments to the course and your teaching.  Most professors prefer to administer the early course evaluations during the third through fifth weeks of a semester. However, you may decide to seek feedback after all major assignments once you get comfortable with using early course evaluations.  

The example models provided are helpful but don't feel like you have to stick to exactly these approaches. Feel free to customize your evaluation so that you get the most helpful feedback for you! 

This questionnaire is intended to provide information on how to enhance the quality of instruction during the remaining weeks of this semester.

These surveys ask students to indicate whether or not they agree with each statement by circling the appropriate number on a Likert scale. Students are also encouraged to write in comments or suggestions for the instructor under each rated statement.

For example:
"The instructor presents information clearly."

Strongly Agree 1 2 3 4 5 Strongly Disagree

Comments:

Here's a list of sample statements (the instructor should choose a few of the following items that relate to their own questions about their teaching):

  • The instructor presents information clearly.
  • The instructor's presentations are easy to hear and understand.
  • The instructor's use of audiovisual aids and the blackboard supplements the presentation.
  • The instructor manages class time well.
  • The instructor is well-prepared for class sessions.
  • The instructor relates course content to students' lives.
  • The instructor conveys enthusiasm for the course.
  • The instructor helps create a non-threatening climate for participation.
  • The instructor welcomes questions and answers them clearly.
  • The instructor provides helpful feedback on student assignments.
  • The instructor is fair in grading and carrying out the course policies.

A KQS (Keep/Quit/Start) survey can be done without any advance preparation. Simply write these three items on the board.

  • Keep Doing
  • Quit Doing
  • Start Doing

What could I as the instructor keep doing, quit doing, and start doing to help you learn?

What can you as a student keep doing, quit doing, and start doing to promote your learning?

If you have more time to prepare in advance, and feel that students would be more responsive if they didn't have to publicly share their responses, then you can use these same questions for brief printed or digital in-class surveys.

As opposed to the other models where students rate statements, an open-ended questionnaire focuses on students' individual feedback in response to more general questions. To begin constructing an open-ended evaluation, an instructor can choose relevant items from the following list for students to complete.

  • The most important thing I have learned in this course so far is. . .
  • The most interesting new idea I have learned in this course is. . .
  • The aspect of this class that is the most helpful for my learning is. . .
  • The most challenging part of this course is. . .
  • The most difficult aspect of this course for me is. . .
  • The one thing I can do to improve my learning in this course is. . .
  • I would have an easier time learning in this class if I could. . .
  • The one thing I expect out of this course that is not being provided is…
  • I would improve this course by. . .
  • The great weakness of this course that I would correct is. . .
  • The part of the course that I have liked the most so far is. . .
  • I know the following techniques are used in other classes and I believe they could be used effectively in this class. . .
  • I have the following questions or suggestions for you [class instructor]. . .
  • I suggest that you [class instructor] make the following changes in the course between now and the end of the semester to help me learn better….
  • My recommendation for improving class participation and discussion is…
  • To help me learn, you [class instructor] should start doing the following things. . .
  • To help me learn, you [class instructor] should stop doing the following things. . .
  • I have the following suggestions for how you [class instructor] can use other media or other technologies in this course. . .
  • Please comment on how well the required readings tie in with class presentations, discussions, and small-group projects.
  • Please comment on how well each class session helps you synthesize the major ideas for the lesson.

Source of open-ended items: United States Military Academy, Center for Teaching Excellence.

"As this course progressed, I was able to get it back on track by using a mid-semester evaluation process called "pluses and wishes." Students divided the evaluation sheet in half and placed all the positives about the course on one side and suggestions for improvement on the other. For the most part, the students were satisfied with the course, but the one "wish" that was prevalent was to increase student interaction" (Ladson-Billings, 1996).

This simple but effective evaluation method encourages students to reflect both on the positives they've already experienced and on the changes they hope to see in the rest of the course.

Utilizing Lecture

Lecturing is an important part of teaching, but it's also important to make sure that the style and approach you take to lecturing aligns with what you want students to get out of each class meeting. This section describes a few different styles of lecture and explains the goal of each approach.

This style of lecturing may not facilitate the most active learning environments, but there are absolutely times when it is necessary. It is also great professional development for your students to learn how to engage with this mode of information intake. It can help if you tell them about this reasoning (and your own unique justifications for this pedagogical decision)! In this style of lecture, students generate comprehensive notes based on what the instructor presents so that they can consult these notes when studying. There is little to no discussion.

The goal is engaged listening and resource production.

Three Different Ways to Run a "Pure" Lecture

In this style of lecturing, the instructor designs lectures to create opportunities for students to operationalize the knowledge they've learned either in preparation for the class (e.g. through the reading or the preceding lab), are learning throughout the class from you, or a combination of both.  This approach utilizes frequent, low stakes, testing or activities that have students put the learned concepts to work.

The goal is engagement with concepts and processes.

Three Different Ways to Run an Operational Lecture

In this style of lecture, you present brief sections that are either broken up with planned discussion periods or interspersed with spontaneous discussion you foster throughout. Specific questions work better than open-ended prompts like "what do you think about this?" or "any questions?"

The goal is participation and group engagement.

Three Different Ways to Run a Lecture with Intermittent Discussion

Helpful Teaching Tools

There are a lot of digital tools available that can aid class participation. In large classes, participation can be difficult to manage; in small classes, it can be difficult because students feel exposed. These digital tools allow for parallel digital participation in class that can be made anonymous.

Some of these tools, such as Nearpod, even help you create more engaging asynchronous lessons. All full and part-time faculty at Duquesne University have access to the paid version of Nearpod. You just need to put in a request to educational technology.

Other tools, such as Mentimeter and Padlet, are great ways to engage students in "exit tickets" (small reflections turned in at the end of class to indicate which concept(s) covered that day are most clear for them and which are still unclear). Both Mentimeter and Padlet also support anonymous polling, which can be a great way of running class check-ins or even early course evaluations. While each service has a paid version available with a few more features and the removal of usage caps, the free versions of both products are incredibly useful.
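For instance, an exit ticket might ask just two short questions (an illustrative example only; adjust the wording to fit your class):

  • What concept from today's class is clearest for you?
  • What concept from today's class is still unclear or confusing for you?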

There are multiple tutorials available online for all three products, but if you'd like to schedule a consultation on how best to integrate one (or more) of these tools into your class, contact Jeff Lambert.

Syllabus Resources

Various resources and food for thought as you develop and/or revise your syllabi!

The syllabus is a great place for faculty members to begin helping students appreciate the nature of a given course. As educators, we must not assume that our students explicitly understand why they are taking a given class, how it relates to the college curriculum, or what is meant by the requirements that we carefully outline in our syllabus. The stereotypical response that students give about the course being required, while humorous, reveals the need for faculty to become more explicit in their syllabus construction as a teaching tool. Sharon Rubin outlines several questions that many syllabi fail to address. Carefully crafting our syllabi to address some of these questions would help our students as they learn in our courses.
  1. Why should a student want to take this course? How does it make a difference as part of the discipline? How does it fit into the general-education program?
  2. What are the objectives of the course? Where does it lead, intellectually and practically?
  3. Why do the parts of the course come in the order that they do? Most syllabi note the order in which topics will be discussed, but make no attempt to explain the way the professor has chosen to organize the course.
  4. What is the purpose of the assignments? Students are frequently told how much an assignment will "count" and how many pages long it must be, but they are rarely given any idea about what it will demand of them or what the goal is. Will students be required to describe, discuss, analyze, provide evidence, criticize, defend, compare, apply? To what end? If students are expected to present a project before the class, are the criteria for an excellent presentation made clear?
  5. What will the tests test? Memory? Understanding? The ability to synthesize? To present evidence logically? To apply knowledge in a new context?
  6. Why have the books been chosen? What is their relative importance in the course and in the discipline? Is the emphasis in the course on primary or secondary materials and why?

When we explicitly provide this information to our students, they can become better learners.

Many students are used to syllabi that simply list deadlines, provide general information about required books and other materials, or give vague descriptions of assignments. If you're putting in the work to make your syllabus a truly helpful guide for students, then you need to ensure they interact with this helpful document you've made. Consider the following:

  1. A syllabus quiz during one of the first classes of the semester. While it is helpful to read through your syllabus with students on the first day, it is also helpful to have them study it so that they really become familiar with the information it provides. Quiz questions can cover things like your office hours and email, the learning goals for the class, deadlines for assignments, etc. It can also be useful to let them 'grade' each other's quizzes with the answer key during the following class meeting. This way you can treat the quiz as a 'graded for completion' part of the class, which cuts down on your workload and gets your students to revisit the syllabus once again.
  2. Provide them a way to leave anonymous feedback on the syllabus. Ask them to bring to your attention any areas that are unclear or about which they still have questions. This gets them reading your syllabus and it also shows that you value their input.

Check with your dean and chair to see if there are any specific required policies to include in your syllabus. Even if you aren't required to have any, check out this Syllabus Checklist. You might find some valuable templates for policies you hadn't considered.

It is highly recommended to include the following policies: Disability Accommodations, Title IX/Mandated Reporter Disclosure, Generative AI Policy.

Research has shown that a friendly tone in a syllabus leads students to perceive an instructor as kinder, more supportive, and motivated to teach (Harnish & Bridges, 2011; Ishiyama & Hartlaub, 2002).

For example, instead of:

NO CELL PHONES!! Using a phone during class is disrespectful and those who do so will be marked absent and/or asked to leave.

Consider something like:

Life is full of distractions, some we can choose and some we can’t. Having your phone out to complete non-essential tasks distracts not just you, but everyone else in class too. I ask you to be mindful of this and limit your phone use to what is strictly essential. If I notice that your device use is distracting others in class, I will reach out to you outside of class to address this.

Being mindful of tone can often increase the length of your policies. Given this, you might want to make sure you're only including a policy that you intend to consistently enforce.

Teaching with Generative AI

Since OpenAI launched ChatGPT in November 2022, every industry has had to adapt and evolve to integrate Generative AI into its daily operations, and this includes higher education. If you still have questions about what Generative AI is, consider reading this LibGuide. Additionally, Duquesne University's service page for Generative AI also offers helpful guidance and includes a link to the updated TAP 26 (Acceptable Uses of Computing Resources).

Duquesne University also has an enterprise Generative AI tool, Microsoft Copilot, which you can access with your multipass. Access to this enterprise solution means that data shared with Copilot through your Duquesne account is protected and won't be used to train the AI.

With all that out of the way, you're likely interested in ways to incorporate Generative AI into your teaching and your students' learning experiences. For ease, these tips have been organized into the categories of "Recommended Practices", "Unethical Use", "Sample Policy Statements", and "Tips for Prompt Writing".

If you have more questions, you can always schedule a consultation.

Teaching and Learning with Generative AI

These are some general ideas to get you started on thinking about both how you can use AI in your teaching, and how students can use AI in their learning.
  • Creating study guides/reading guides for students.
  • Generating fictional case studies.
  • Proofreading.
  • Assisting with generating rubrics and/or practice exams.
  • Outlining a project/paper/presentation.
  • Easily creating images for presentations.
  • Creating a research schedule/timeline.
  • Functioning as an all-hours tutor.
  • Role-playing interviews.
  • Requiring students to "teach" the AI about a subject.

Unethical Use

Note: these all address professional ethics expectations. If you want to use Generative AI to write posts for your non-academic blog, plan a birthday party, meal plan, or whatever else, that's completely up to you.
  • Do not submit personal identifying information (your own or your colleagues/students).
  • Do not use non-enterprise Generative AI tools that can view/respond to your emails.
  • Do not use AI for comprehensive grading of student work. It is only acceptable to use AI to assist with grading, and if you do so you must be completely transparent with students about the level of AI assistance in your grading.
  • Do not hide the use of GenAI: if you are using a GenAI tool to take notes during a meeting, plan an event, write a memo, come up with case studies, develop learning activities, translate an article, assist in grading, etc., then it is unethical to not disclose this.

Sample Policy Statements

The following are only recommendations; you are welcome to use them as written, but you are also encouraged to tailor them to your specific needs:

Example Permissive Statement: 

The use of generative AI tools is accepted in this class. Some assignments or in-class learning activities may even encourage or require it. AI use is becoming more widespread across various industries and fields. Given this, it is an important part of your education to build competency in the use of this technology. That said, utilization of an AI service to produce work without citing the use of that service may be treated as a violation of academic integrity. Additionally, if you use these tools, keep in mind that they may produce inaccurate and/or biased content. 

If you utilize AI for any part of your assignments and/or in-class activities, you must specify: 
    • What specific content was generated by AI 
    • What AI tool was used 
    • What prompt was used to generate the AI content 
      • Both the prompt and original response from the AI tool must be provided (as a screenshot or PDF) in order for your use to be considered “cited” properly. 
    • Why the content was generated with AI 


Example Limited Use/Moderate Statement: 

While I acknowledge that AI use is an important part of your education, the goals of this course emphasize other aspects of development, and these AI tools create more barriers to those goals than they remove. That said, utilization of an AI service will be allowed in certain circumstances as designated per assignment or with my approval. Note that my approval still requires that you cite the use of AI tools; failure to do so will be treated as a violation of academic integrity. Additionally, if you use these tools, keep in mind that they may produce inaccurate and/or biased content.

If you utilize AI for one of the allowed assignments and/or activities, you must specify:
  • What specific content was generated by AI 
  • What AI tool was used 
  • What prompt was used to generate the AI content 
    • Both the prompt and original response from the AI tool must be provided (as a screenshot or PDF) in order for your use to be considered “cited” properly. 
  • Why the content was generated with AI 

Example Restrictive/No Statement: 

This course assumes that all work submitted by students will be generated by the students themselves, working individually or in groups. Students should not have another person or entity do the writing of any substantive portion of an assignment for them. This includes hiring a person or a company to write assignments as well as using artificial intelligence tools like Microsoft Copilot, ChatGPT, etc.

Tips for Prompt Writing

GenAI processes prompts through the following hierarchy (in order from highest priority to lowest):

  • Task (what you are asking it to do)
  • Context (specifics to keep in mind while completing the task)
  • Exemplar (what the completed task should be modeled after)
  • Persona (what identity the AI should assume)
  • Format (mode of delivery)
  • Tone (e.g., casual or formal)

This hierarchy can help you clarify your prompts but it's worth noting that you don't need every component from this hierarchy in order to have a decent prompt. This sample prompt creates a situation where the AI acts like a student seeking tutoring services and it only uses four of the hierarchy components. Note that the words in parentheses at the start of each sentence are there to signal what role the sentence plays, but these words are not a part of the prompt itself:

(Persona) You are a first-year college undergraduate seeking my services as a tutor. (Task) Ask me questions about [insert topic] one at a time. (Task) After I answer your question, follow up with a clarifying statement so I can ensure your understanding. (Task) Throughout our conversation, get one or two of these clarifying statements wrong so I can correct you. (Task) If I miss a mistake you've made, let me know. (Format) This should be written like a conversational exchange between student and mentor; short questions and responses are best. (Tone) Be friendly and collaborative but also respectful and deferential in your tone.
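If you'd like to see all six components working together, here is one more illustrative sketch. The bracketed topic and course details are placeholders, and the assignment described is purely hypothetical; adapt it freely:

(Task) Create five multiple-choice practice questions about [insert topic]. (Context) The questions are for first-year undergraduates in [insert course] who have just completed the assigned reading. (Exemplar) Model the difficulty and style on the sample question I paste below. (Persona) You are an experienced teaching assistant for this course. (Format) Present each question with four answer options, then list the correct answers with one-sentence explanations at the end. (Tone) Keep the wording clear and encouraging.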

Rubrics

Grading Smarter, Not Harder

A rubric is a grading guide that makes explicit the criteria for judging students' work on a discussion, paper, performance, product, show-the-work problem, portfolio, presentation, essay question, or any other student work you seek to evaluate. Rubrics inform students of expectations while they are learning. These tools also enable teachers to grade efficiently, judge student work against a standard, and communicate readily with each student.

Rubrics should not function as 'checklists' that students work through. Nor should they use vague qualitative descriptors like "uses strong arguments" because this doesn't clarify for your students how you are differentiating a 'strong' argument from a 'weak' argument. Be descriptive but also strive to be brief.
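For example, a single criterion for an argumentative essay might look something like the sketch below (an illustrative example only, using the 'developing'/'proficient'/'excellent' language mentioned elsewhere on this page; adapt the descriptors and point values to your own assignment):

Criterion: Argumentation
  • Developing: States a position but offers little supporting evidence, and counterarguments are not addressed.
  • Proficient: States a clear position supported by relevant evidence and acknowledges at least one counterargument.
  • Excellent: States a clear, focused position supported by well-chosen evidence and anticipates and responds to counterarguments.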

 

The Benefits of a Rubric

  • Grade consistently and efficiently against a standard.
    Faculty report that they grade more fairly and efficiently when using a rubric. They don’t have to keep repeating the same comment. They are more apt to be consistent when grading many papers (minimizing the “fatigue” factor) or when responding to students whose performance differs across assignments (minimizing the “halo” effect). The use of rubrics implies that you’re rating students’ work against a standard rather than against one another. Rubrics help you do a quick analysis of student work to see patterns of strength and weakness.
  • Provide rich feedback to students on their performance.
    Compare the information conveyed by a score (e.g., 85%) on an objective test of problem solving to that provided by a rubric which identifies areas of misunderstanding or omissions in the problem-solving process. Or, compare the information conveyed by a grade on an essay (e.g., “C”) to that provided by a rubric that rates performance in areas such as content, organization, style, and grammatical correctness. The scores and grades tell the learners the instructor’s overall rating of their performance and, perhaps, how they performed relative to other students in the class, but don’t provide guidance in how to improve.
  • Guide faculty in planning instruction.
    Once you have developed well-specified criteria and expected levels of achievement for a task, it may become evident that students need practice in various subtasks. For example, critical thinking in sociology may require understanding a political or economic context; and developing a persuasive argument about a historical issue may require understanding how primary sources can be used as evidence. Some tasks we set for students are complex, and we need to assure that they know how to carry out elements of the task before orchestrating these elements into a finished product.

Are we spoon-feeding students? When students encounter new kinds of complex tasks, scoring guides are useful in clearly setting forth the expectations. As students become familiar with the conventions of writing or performance in your field, the instructions and feedback should be less structured and detailed.

Consider involving students in the development of scoring guides, especially when this simulates a task they will face in their career (e.g., work performance reviews, analysis of teamwork, revision of their own or others' writing, evaluation of a product, blind reviews of articles, search committees). You might give them a skeletal draft to fill in based on sample work they review. Students can use rubrics to review their peers' or sample work so that they learn what the expectations are and see examples of stronger and weaker performance.

Example Rubric Categories Sorted by Discipline

The following is not an exhaustive list and you may also find that some categories would work well in multiple disciplines. This is just to help you brainstorm!

Healthcare Rubric Categories

  • Critical Thinking
  • Communication
  • Consequences
  • Leadership/Teamwork
  • Classification
  • Dosage/Use of Medication
  • Patient & Family Teaching


Math & Computer Science Rubric Categories

  • Notations
  • Explanation/Demonstration of work
  • Neatness
  • Errors
  • Strategy/Procedure
  • Completion

Science & Engineering Rubric Categories

  • Error Analysis
  • Notes
  • Summary
  • Safety
  • Hypothesis Statement
  • Experimental Design
  • Time Management
  • Design Optimization
  • Analysis/Interpretation of Data
  • Computational Thinking/Process
  • Development and/or Use of Models

Business Rubric Categories

  • Service/Product Knowledge
  • Situational Analysis
  • Implementation Plan
  • Nonverbal Presentation Skills
  • Creativity
  • Objectives

Liberal Arts Rubric Categories

  • Clarity
  • Creativity
  • Thesis
  • Arguments
  • Word Choice/Diction
  • Grammar/Punctuation
  • Reflection
  • Accuracy

Links to External Rubric Resources

Rubistar
A helpful brainstorming tool that provides some template rubric categories for different disciplines and assessments. 

Syracuse University List of Sample Rubrics:
Repository of multiple sample rubrics from multiple different universities.

Cornell University’s Center for Teaching Innovation’s Guide to Rubrics:
Another perspective on rubrics with additional tips and food for thought.

Guide for Embedding Rubrics into Canvas Assignments:
Official guide from Canvas on how to utilize the embedded rubric feature.