Indirect measures include perspectives, input, and other indicators (from students or others) that provide evidence related to student learning, such as perceived gains or confidence in specific skills or knowledge, motivation, satisfaction, the availability or quality of learning opportunities, or student progress. Indirect measures gather information related to program-level SLOs; for example, senior exit surveys should include questions about learning in the major, not just satisfaction with the college experience in general. Indirect measures can provide valuable information about why students learned or did not learn, but they do not provide direct evidence of what students have learned.
Types of Indirect Measures
Indirect measures come in many forms and may vary to best meet the needs of the program. WSU encourages programs to choose measures that provide useful information to their faculty and fit with disciplinary expectations. Data from indirect measures may be quantitative (numeric) or qualitative (textual). See Assessment Data Analysis for more information about quantitative and qualitative data.
Student Perspectives and Experience:
| Measure | Description |
| --- | --- |
| Focus Groups | Students discuss their experiences, motivation, and perspectives on aspects of their educational experience, skills, or knowledge. See our Focus Groups webpage for more information. |
| Interviews | One-on-one dialogue with a student to determine their perceptions of learning outcome achievement and related academic experiences. |
| Surveys: Student or Alumni | Students or alumni report their confidence or perceived gains in knowledge, skills, and/or abilities related to program-level SLOs, as well as their perspectives on aspects of their educational experience, motivations, and rationales. May be locally developed (e.g., a departmental senior exit survey) or a standardized instrument (e.g., the National Survey of Student Engagement). See our Student Surveys (locally developed) and National Survey of Student Engagement webpages for more information. |
| Course Evaluations | Responses about learning outcomes, academic experiences, perceptions, and motivation can provide useful data about curricular effectiveness. See our Course Evaluations webpage for more information. |
| Student Self-assessments or Reflections | Students' self-reflection on their performance, experience, or processes, or their evaluation of peer performance on a work product. |
Professional Perspectives and Input:
| Measure | Description |
| --- | --- |
| Advisory Boards | Program consultations with an advisory group providing professional input on the program and/or skills perceived as important for graduates. |
| Faculty Reviews of Curriculum, SLOs, Syllabi, or Assignment Prompts | Faculty review, evaluate, and document where program-level SLOs are taught and developed in assignments, courses, and curricula. |
| Feedback from External Accreditors | Feedback from external accreditors providing professional input on the program, including its curriculum or SLOs. |
| Internship Supervisor, Preceptor, or Employer Feedback on Student Activities, Motivation, or Behavior | Typically a written evaluation of student experiences or general performance in a work setting. |
| Employer Surveys | Potential employers indicate the job skills they perceive as important for graduates, or provide other professional input on the program. |
Indicators of Progress, Success, Retention, etc.:
| Measure | Description |
| --- | --- |
| Course Grades | Course grades can give information about cohort progress through the curriculum, which can complement direct measures. |
| Internal Data | Centrally collected data (e.g., registration or course enrollment data, class sizes, graduation rates, retention rates, grade point averages) can give information about cohort success or progress through the curriculum, which can complement direct measures. |
| Participation Rates | Student participation rates in research, internships, service learning, study abroad, and other activities connected to student learning, especially high-impact practices. |
Indirect Measures Resources and Toolkits
The following resources are provided to assist programs and faculty as they choose, develop, implement, and refine indirect measures for assessing student learning as part of program-level assessment. ACE is available to collaborate with undergraduate degree programs to design indirect measures for program-level assessment; contact us for additional information.
ACE provides resources for programs on particular indirect measures:
- Student Surveys (Locally developed)
- National Survey of Student Engagement (NSSE)
- Focus Groups
- Course Evaluations
Additional Resources and Scholarship
Montenegro, E. & Jankowski, N. (2020). A New Decade for Assessment: Embedding Equity into Assessment Praxis (Occasional Paper No. 42). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).
Note: Please contact ACE to borrow a book.
Suskie, L. (2018). Part 4: The assessment toolbox. In Assessing Student Learning: A Common Sense Guide. San Francisco, CA: Jossey-Bass.
Suskie, L. Blog posts categorized "How to Assess." A Common Sense Approach to Assessment in Higher Education (blog). Available at: https://www.lindasuskie.com/apps/blog
University of Hawaii at Manoa Assessment and Curriculum Support Center. "How To: Choose a Method to Collect Data or Evidence" (website).