JUMP TO:
Assessment Fundamentals | Assessment in Specific Learning Contexts | Faculty and Student Engagement in Assessment | Ethical Conduct of Assessment
What Do We Mean by "Levels" of Assessment?
Assessments of learning occur throughout the institution.
Students, faculty, and staff are most familiar with assessment in courses. Grounded in specific learning outcomes, this work asks what students have learned and what they can do at the conclusion of a course, a course module, or even an individual task. Individual instructors are best suited to design assessments in their courses, though help is available through Duke’s Learning Innovation and Lifetime Education, or you can contact a member of the Assessment team for a consultation.
Up a level from individual courses is learning within a program of study, for example an academic major. In such cases, we collect evidence of scaffolded learning across the program’s constituent courses. Artifacts from individual courses contribute to this body of evidence, but the course is not the focus of this level of assessment. Rather, we seek to understand the connections between courses and how they work together to achieve the learning outcomes of the program.
Although assessment in a degree program (e.g., major) is a traditional focus, some departments wish to study learning across a series of foundational courses that may be a subset of the major or in service of other disciplines. These questions can utilize measures at the level of the course (e.g., knowledge tests) but seek alignment of measures and evidence across the course series. Helping departments and programs understand student learning across courses is a major part of the Office of University Assessment’s work portfolio.
With respect to compliance with the SACSCOC Principles of Accreditation, assessment in the program is documented in Section 8.2.A. (pg. 21).
Assessment also occurs across the School or College. At this level, we typically focus on general education in the undergraduate schools. The general education curriculum is the set of experiences and competencies we expect students to have and develop regardless of major. Such courses and course types are required for graduation, again, regardless of the major. Throughout the 2024-25 academic year, the Office of University Assessment is developing the assessment plan for the emerging curriculum in Trinity College.
With respect to compliance with the SACSCOC Principles of Accreditation, assessment of general education is documented in Section 8.2.B. (pg. 21) and is indirectly represented in Section 7 (pg. 19).
The co-curriculum refers to experiences that occur outside formal classes but nonetheless extend and enrich students’ development of competencies, skills, and habits of mind, such as teamwork, problem solving, and oral communication. Because engagement with the co-curriculum is not required of students, and because of the wide diversity of events and programs, it is difficult to assess the co-curriculum as a whole. Instead, we tend to focus on the impacts of individual organizations.
Students interested in assessing learning in their organizations should consult their faculty or staff advisor for assistance. They also are welcome to reach out to the Office of University Assessment.
Assessment Fundamentals
What is assessment and what is the assessment cycle?
A quick Google search will uncover dozens if not hundreds of common-sense definitions of “assessment.” In this office, we think about assessment as the ongoing process of rigorous self-study that:
- documents good educational practice,
- helps faculty and staff create, revise, or enhance learning opportunities for students,
- informs students’ own understandings of their development,
- enables rich discussions of our mission and values as a learning community, and
- provides evidentiary support for external reports including the requirements of accreditation and funding proposals.
The assessment cycle can be illustrated in a variety of ways (see these Google search results), but these visualizations all are based on the idea that assessment starts with the articulation of objectives and moves through the collection, interpretation, and discussion of evidence before using findings to make informed plans for future teaching and learning. The process is iterative and introspective.
- George Washington University
- James Madison University (LINK)
- Penn State University
Writing and Organizing Student Learning Outcomes
Well-conceived and well-worded outcomes are the foundation of an effective assessment plan. They should represent and operationalize the program’s mission in clear, measurable statements of students’ attainment and learning progress.
- Duke University (LINK)
- University of Wisconsin-Madison (LINK)
- James Madison University (LINK, LINK)
- IUPUI (LINK)
- Cal Poly (LINK)
- Academic Learning Compacts (Florida State System) (LINK)
- Checklist for good learning objectives (JMU) (LINK)
- Adelman, C. (2015, February). To imagine a verb: The language and syntax of learning outcomes statements (Occasional Paper No. 24). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
A curriculum map or matrix is an illustration of student learning outcomes across a learning experience (whether a topic, a course, or a program of study). The curriculum map can be used to:
- Understand the learning journey across experiences
- Facilitate discussion among stakeholders
- Select and schedule appropriate assessment tasks
- Summary (LINK)
- National Institute for Learning Outcomes Assessment: (LINK, LINK)
- University of Illinois (LINK)
- University of Cincinnati (LINK)
- University of Massachusetts (LINK)
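To make the matrix idea concrete, here is a minimal sketch of a curriculum map in code. Everything in it is hypothetical (the course codes, the three outcomes, and the common "I/R/M" convention for Introduced/Reinforced/Mastered are illustrative, not a prescribed format); the point is that an explicit matrix makes gaps in the learning journey easy to spot.

```python
# Hypothetical curriculum map: rows are courses, columns are program outcomes.
# "I" = Introduced, "R" = Reinforced, "M" = Mastered (a common convention;
# adapt the markers and outcomes to your own program).
outcomes = ["Disciplinary knowledge", "Written communication", "Research methods"]

curriculum_map = {
    #           outcome 1  outcome 2  outcome 3
    "BIO 101": ["I",       "I",       ""],
    "BIO 205": ["R",       "",        "I"],
    "BIO 310": ["R",       "M",       "R"],
    "BIO 495": ["M",       "M",       "R"],   # capstone
}

# A simple gap check: flag any outcome that no course brings to mastery.
for col, outcome in enumerate(outcomes):
    levels = [row[col] for row in curriculum_map.values()]
    if "M" not in levels:
        print(f"Gap: no course brings '{outcome}' to mastery.")
```

Real maps are usually maintained in a shared spreadsheet rather than code, but the same row-by-column reading supports the discussion and scheduling uses listed above.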
Designing or Choosing Measures or Instruments
It also is helpful to understand some of the terms assessment experts use to characterize and evaluate the suitability of an assessment measure.
- Understanding terms: Qualitative and Quantitative approaches, a literature review
- Understanding terms: Formative and Summative (Yale)
- Understanding terms: Direct and Indirect measurement (SMU)
- Understanding terms: High- and low-stakes assessment (JHU)
- Understanding terms: Authentic and performance-based assessment (LINK, LINK)
- Penn State University (LINK)
- University of Illinois (LINK)
- Buffalo State College (LINK)
- Clauser, J. C., & Hambleton, R. K. (2017). Item analysis for classroom assessments in higher education. In Handbook on measurement, assessment, and evaluation in higher education (pp. 355-369). Routledge. (LINK)
- D'Sa, J. L., & Visbal-Dionaldo, M. L. (2017). Analysis of multiple choice questions: Item difficulty, discrimination index and distractor efficiency. International Journal of Nursing Education, 9(3). DOI: 10.5958/0974-9357.2017.00079.4 (LINK)
- DeMars, C. (2010). Item Response Theory. New York, NY: Oxford University Press. (LINK)
- Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309-334. (LINK)
- Quaigrain, K., & Arhin, A. K. (2017). Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation. Cogent Education, 4(1), 1301013. DOI: 10.1080/2331186X.2017.1301013
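To make the item-analysis concepts in the readings above concrete, here is a minimal sketch of the two classical statistics those sources discuss: item difficulty (the proportion answering correctly) and item discrimination (a corrected item-total correlation). The scored responses are hypothetical, the snippet assumes Python 3.10+ for statistics.correlation, and it is illustrative rather than a substitute for a psychometrics package.

```python
import statistics  # statistics.correlation requires Python 3.10+

# Hypothetical scored test data: rows are students, columns are items,
# 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
    [1, 0, 1, 1],
]

totals = [sum(row) for row in responses]

for item in range(len(responses[0])):
    scores = [row[item] for row in responses]
    # Difficulty: proportion correct (higher values mean an easier item).
    difficulty = statistics.mean(scores)
    # Discrimination: correlation between the item and the rest of the test.
    # The item itself is removed from each total so it cannot inflate the value.
    rest = [t - s for t, s in zip(totals, scores)]
    discrimination = statistics.correlation(scores, rest)
    print(f"Item {item + 1}: difficulty = {difficulty:.2f}, "
          f"discrimination = {discrimination:.2f}")
```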
Standardized Tests Used in Trinity College to Study Curriculum 2000
- California Critical Thinking Skills Test (at Duke, external links)
- Defining Issues Test (at Duke, external links)
- Global Perspectives Inventory (at Duke, external links)
- Quantitative Literacy and Reasoning Assessment (at Duke, external links)
- Carleton College (LINK)
- University of Calgary (LINK)
- Yale University (LINK)
- University of Illinois (LINK)
- University of New South Wales (LINK)
- Duke Learning Innovation regularly works with Duke faculty to design and plan for learning assessments in large courses.
Measure and Instrument Types
- Asking good test questions (Cornell LINK; University of Illinois LINK)
- Selecting and designing instruments (James Madison University) (LINK)
- Where to look for pre-existing instruments (James Madison University) (LINK)
- Multiple-choice exams (LINK)
- An annotated bibliography of test development (LINK)
- Benjamin, R., Miller, M. A., Rhodes, T. L., Banta, T. W., Pike, G. R., & Davies, G. (2012, September). The seven red herrings about standardized assessments in higher education. (Occasional Paper No. 15). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Vanderbilt (LINK)
- University of Nebraska (LINK)
- University of Toronto (LINK)
- University of Warwick (LINK)
- Carleton College (LINK)
- See the peer-reviewed journal Assessing Writing (LINK)
- See also the Duke University Writing Studio and the Writing in the Disciplines program.
- For questions about scaling essays in larger courses, contact Duke Learning Innovation & Lifetime Education for a consultation.
There are many peer-reviewed journal pieces and other publications supporting and critiquing course evaluation processes. The following selections focus on the use of evaluation results to inform teaching practice and program-level assessment.
- Understanding advantages and disadvantages of surveys (LINK, LINK, LINK)
- Swarthmore College (LINK)
- Cornell University (LINK)
- Carleton College and knowledge surveys (LINK)
- NASPA Foundation (LINK)
- Duke University survey policies (LINK, LINK)
- Duke Initiative on Survey Methodology (LINK)
- Qualtrics at Duke (LINK)
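As a complement to the survey resources above, the following sketch shows one common way to tabulate Likert-style results before sharing them: a frequency distribution plus a "top-two-box" percent favorable. The items, responses, and 1-5 scale are hypothetical.

```python
from collections import Counter
from statistics import mean, median

# Hypothetical Likert responses on a 1-5 scale
# (1 = strongly disagree, 5 = strongly agree).
responses = {
    "The course improved my analytical skills": [4, 5, 3, 4, 5, 2, 4],
    "Assignments aligned with stated outcomes": [5, 4, 4, 3, 5, 4, 3],
}

for item, scores in responses.items():
    dist = Counter(scores)
    favorable = sum(s >= 4 for s in scores) / len(scores)  # "top-two-box"
    print(item)
    print(f"  n={len(scores)}  mean={mean(scores):.2f}  median={median(scores)}")
    print(f"  % favorable (4 or 5): {favorable:.0%}")
    print(f"  distribution: {dict(sorted(dist.items()))}")
```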
Analysis of Evidence
Methods of analysis vary by academic discipline. Both quantitative and qualitative approaches are appropriate for analyzing assessment evidence. Before you analyze your data, consider the benchmarks or targets that would represent success or a positive outcome of your study.
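As a simple illustration of benchmarking, the sketch below compares hypothetical rubric scores against a target specified in advance; the 70% target and the proficiency cutoff of 3 on a 4-point scale are placeholders, not recommended values.

```python
# Targets are set before the data are examined.
TARGET_PROPORTION = 0.70   # success = at least 70% of students at/above cutoff
PROFICIENCY_CUTOFF = 3     # on a hypothetical 4-point rubric scale

rubric_scores = [4, 3, 2, 3, 4, 1, 3, 3, 4, 2, 3, 4]  # hypothetical data

proficient = sum(s >= PROFICIENCY_CUTOFF for s in rubric_scores)
proportion = proficient / len(rubric_scores)

print(f"{proficient}/{len(rubric_scores)} students proficient "
      f"({proportion:.0%}); target is {TARGET_PROPORTION:.0%}")
print("Target met" if proportion >= TARGET_PROPORTION else "Target not met")
```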
- Illinois State University (LINK)
- Washington State University (LINK)
- University of Virginia (LINK)
- James Madison University (LINK)
- Data visualization (LINK, LINK)
- See also Duke Center for Data and Visualization Sciences (LINK)
- The National Council on Measurement in Education (NCME) also developed videos on the “anatomy of measurement” (LINK).
- Murphy, S. A. (2015). How data visualization supports academic library assessment: Three examples from The Ohio State University Libraries using Tableau. College & Research Libraries News, 76(9), 482-486. (LINK)
- Zilvinskis, J., & Michalski, G. V. (2016). Mining text data: Making sense of what students tell us. Professional File. Article 139, Fall 2016. Association for Institutional Research. (LINK)
Why and when to use a rubric?
- Cornell University (LINK)
- Arizona State University (LINK)
- UC Berkeley (LINK)
- University of Oklahoma (LINK)
- Andrade, H. (2005). Teaching with rubrics: The good, the bad, and the ugly. College Teaching, 53(1), 27-30. (LINK)
- Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435-448. (LINK)
- Stevens, D. D., & Levi, A. (2005). Introduction to Rubrics: An Assessment Tool to Save Grading Time, Convey Effective Feedback, and Promote Student Learning. Sterling, VA: Stylus Publishing.
- Nilson, L. B. (2014). Specifications Grading: Restoring Rigor, Motivating Students, and Saving Faculty Time. Sterling, VA: Stylus Publishing.
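For readers who find a concrete artifact helpful, an analytic rubric is essentially a table of criteria by performance levels. The sketch below encodes a small, hypothetical two-criterion rubric (loosely patterned on the four-level VALUE format; the criteria and descriptors are made up) and averages per-criterion ratings into a single score.

```python
# Hypothetical analytic rubric: each criterion has descriptors at four
# performance levels (4 = highest). Criteria and wording are illustrative.
rubric = {
    "Thesis": {
        4: "Compelling, precisely stated, and consistently maintained",
        3: "Clear and maintained through most of the essay",
        2: "Present but vague or inconsistently maintained",
        1: "Absent or unclear",
    },
    "Evidence": {
        4: "Credible, relevant sources integrated and analyzed",
        3: "Relevant sources used with some analysis",
        2: "Sources present but loosely connected to claims",
        1: "Little or no supporting evidence",
    },
}

def score_artifact(ratings: dict[str, int]) -> float:
    """Average the per-criterion ratings into a single analytic score."""
    assert set(ratings) == set(rubric), "rate every criterion exactly once"
    return sum(ratings.values()) / len(ratings)

print(score_artifact({"Thesis": 3, "Evidence": 4}))  # -> 3.5
```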
Types of rubrics explained
Examples of rubrics
The Duke University Writing Studio and Duke Learning Innovation can assist Duke faculty, staff, and students with the development of effective rubrics.
The AAC&U VALUE project (LINK) makes 16 detailed rubrics (LINK) available to the general public to guide teaching and evaluation. Users are encouraged to extend, adapt, or blend the rubrics as relevant to their assessment goals. The following examples follow the VALUE project schema.
Intellectual and practical skills
- Inquiry and analysis (LINK)
- Critical thinking (LINK, LINK, LINK)
- Creative thinking (LINK, LINK, LINK)
- Written communication (LINK, LINK, LINK)
- Oral communication (LINK, LINK, LINK)
- Reading (LINK, LINK)
- Quantitative literacy (LINK, LINK, LINK)
- Information literacy (LINK, LINK, LINK)
- Teamwork (LINK, LINK, LINK)
- Problem solving (LINK, LINK, LINK)
Personal and social responsibility
- Civic engagement—local and global (LINK, LINK, LINK, LINK)
- Intercultural knowledge and competence (LINK, LINK)
- Ethical reasoning (LINK, LINK, LINK)
- Foundations and skills for lifelong learning (LINK)
- Global learning (LINK, LINK)
Integrative and applied learning
Communication and Use of Findings
Many assessment practitioners use the term “Closing the loop” to describe the resolution of an assessment process. The interpretation of evidence leads to well-informed updates to the curriculum or educational practice. This stage usually involves the sharing of written reports and/or presentations with specific recommendations for action.
There are a variety of ways assessment findings can be shared with others. The suitability of the medium or venue depends on the confidentiality of the findings and the audience’s need-to-know.
Options for sharing findings include:
- Discussions within department or program meetings
- Essays published in peer-reviewed journals, trade publications, or popular media
- Newsletters within the department or program
- Summaries on the department’s or program’s website
- Student gatherings like the Majors Fair
Discussions among faculty and staff may be the most critical venue, as they invite real-time dialogue and deliberation. Moreover, this is the context in which decisions will be made about the curriculum, courses, pedagogy, and student support services. Acting upon information is an essential stage of good assessment.
Examples of decisions include:
- Changing program requirements
- Updating content of one course to prepare students for a subsequent course
- Offering additional training and support to TAs
- Seeking summer funding for updates to course pedagogy
- Adding an assessment measure to fill gaps in information
The term “Closing the Loop” is a bit of a misnomer because the assessment cycle never really closes. As we make evidence-guided adjustments to our work, we restart the process with new or revised learning outcomes and updated targets for student learning.
- Research Utilization: An annotated bibliography (LINK)
- Cummings, G. G., Estabrooks, C. A., Midodzi, W. K., Wallin, L., & Hayduk, L. (2007). Influence of organizational characteristics and context on research utilization. Nursing Research, 56(4), S24-S39. (LINK)
- Estabrooks, C. A., Floyd, J. A., Scott‐Findlay, S., O'Leary, K. A., & Gushta, M. (2003). Individual determinants of research utilization: A systematic review. Journal of Advanced Nursing, 43(5), 506-520. (LINK)
- Weiss, C. H. (1979). The many meanings of research utilization. Public Administration Review, 39(5), 426-431. (LINK)
- Weiss, C. H. (1993). Where politics and evaluation research meet. Evaluation Practice, 14(1), 93-106. (LINK)
- Examples of effective use of assessment results (LINK)
- Diery, A., Vogel, F., Knogler, M., & Seidel, T. (2020, June). Evidence-Based Practice in Higher Education: Teacher Educators' Attitudes, Challenges, and Uses. In Frontiers in Education (Vol. 5, p. 62). Frontiers. (LINK)
- Fulcher, K. H., Smith, K. L., Sanchez, E. R., & Sanders, C. B. (2017). Needle in a Haystack: Finding Learning Improvement in Assessment Reports. Professional File. Article 141, Summer 2017. Association for Institutional Research. (LINK)
- Huberman, M. (1994). Research utilization: The state of the art. Knowledge and Policy, 7(4), 13-33. (LINK)
- Jankowski, N. (2021, January). Evidence-based storytelling in assessment. (Occasional Paper No. 50). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment. (LINK)
Assessment in Specific Learning Contexts
- Civic engagement (LINK, LINK)
- Student leadership (LINK, LINK)
- Living-learning communities (LINK, LINK, LINK)
- Undergraduate research (LINK, LINK, LINK)
- Study abroad or away (LINK, LINK)
- Finley, A. (2019, November). A comprehensive approach to assessment of high-impact practices (Occasional Paper No. 41). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
Diversity, Equity, and Inclusion
- Anti-racism at Duke (LINK)
- Update on AIR Activities to Advance Diversity, Equity, and Inclusion (LINK)
- Be Part of the Solution: Antiracism in Institutional Research (LINK)
- Centering Racial Equity Throughout Data Integration (LINK)
- 5 Steps to Take as an Antiracist Data Scientist (LINK)
- Truth, Racial Healing & Transformation (TRHT) AAC&U Campus Centers (LINK)
- MacKinnon, D., & Manathunga, C. (2003). Going Global with Assessment: What to do when the dominant culture's literacy drives assessment. Higher Education Research & Development, 22(2), 131-144. (LINK)
- McNair, T. B. (2020). We Hold These Truths: Dismantling Racial Hierarchies, Building Equitable Communities. Washington, DC: Association of American Colleges and Universities. (LINK)
- Montenegro, E., & Jankowski, N. A. (2020, January). A new decade for assessment: Embedding equity into assessment praxis (Occasional Paper No. 42). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Aguillon, S. M., Siegmund, G. F., Petipas, R. H., Drake, A. G., Cotner, S., & Ballen, C. J. (2020). Gender differences in student participation in an active-learning classroom. CBE—Life Sciences Education, 19(2), ar12. (LINK)
- Elwood, J. (2006). Gender issues in testing and assessment. The Sage handbook of gender and education, 262-278.
- Garvey, J. C., Hart, J., Metcalfe, A. S., & Fellabaum-Toston, J. (2019). Methodological troubles with gender and sex in higher education survey research. The Review of Higher Education, 43(1), 1-24. (LINK)
- Johnson, E. A., Subasic, A., Beemyn, G., Martin, C., Rankin, S., & Tubbs, N. J. (2011). Promising practices for inclusion of gender identity/gender expression in higher education. The Pennsylvania State University LGBTA. (LINK)
- MacNell, L., Driscoll, A., & Hunt, A. N. (2015). What’s in a name: Exposing gender bias in student ratings of teaching. Innovative Higher Education, 40(4), 291-303. (LINK)
- Seifert, T. A., Wells, R. S., Saunders, D. B., & Gopaul, B. (2013). Unrealized educational expectations a growing or diminishing gender gap? It Depends on Your Definition. Professional File. Article 134, Fall 2013. Association for Institutional Research. (LINK)
- Vantieghem, W., Vermeersch, H., & Van Houtte, M. (2014). Transcending the gender dichotomy in educational gender gap research: The association between gender identity and academic self-efficacy. Contemporary Educational Psychology, 39(4), 369-378. (LINK)
- Montenegro, E., & Jankowski, N. A. (2017, January). Equity and assessment: Moving towards culturally responsive assessment. (Occasional Paper No. 29). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Manathunga, C. (2009). Research as an intercultural ‘contact zone’. Discourse: Studies in the Cultural Politics of Education, 30(2), 165-177. (LINK)
- Students’ perceptions about assessment in higher education (LINK)
- Involving students in the assessment process (LINK, LINK)
- Finney, S. J., Sundre, D. L., Swain, M. S., & Williams, L. M. (2016). The validity of value-added estimates from low-stakes testing contexts: The impact of change in test-taking motivation and test consequences. Educational Assessment. (LINK)
- Turos, J. M. (2020, March). Actively engaging undergraduate students in the assessment process. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Truncale, N. P., Chalk, E. D., Pellegrino, C., & Kemmerling, J. (2018, March). Implementing a student assessment scholars program: Students engaging in continuous improvement. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Wise, S. L., & DeMars, C. E. (2005). Low examinee effort in low-stakes assessment: Problems and potential solutions. Educational Assessment, 10(1), 1-17. (LINK)
- Planning learning outcomes through student employment opportunities (LINK, LINK)
- What is the Scholarship of Teaching & Learning (SoTL)? (LINK, LINK)
- Duke Learning Networks (LINK)
- Duke Learning Innovation event calendar (LINK)
- Duke Office of Faculty Advancement (LINK)
- Duke Faculty Affairs (LINK)
- Duke Graduate School Preparing Future Faculty program (LINK)
- Duke Graduate School Certificate in College Teaching (LINK)
- Duke Graduate School Emerging Leaders Institute (LINK)
- Banta, T. W. (Ed.). (2002). Building a scholarship of assessment. John Wiley & Sons.
- Cain, T. R. (2014, November). Assessment and academic freedom: In concert, not conflict (Occasional Paper No. 22). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Hutchings, P. (2010, April). Opening doors to faculty involvement in assessment (Occasional Paper No. 4). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Gold, L., Rhoades, G., Smith, M., & Kuh, G. (2011, May). What faculty unions say about student learning outcomes assessment (Occasional Paper No. 9). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Polychronopoulos, G. B., & Leaderman, E. C. (2019, July). Strengths-based assessment practice: Constructing our professional identities through reflection. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
- Stanny, C. J. (2019, July). Promoting an improvement culture. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)