Teaching, Learning & Assessment

Preparing annual assessment reports and plans

Below are guidelines for what to include in each section of the assessment report and plan. Keep in mind that the more people involved in all stages of assessment -- from planning to implementing to discussing and responding to results -- the better. Involving students in as many parts of the process as possible is strongly encouraged.

Templates for assessment plans and reports can be downloaded from Academic Affairs.

Programs can request assessment support from Teaching, Learning, and Assessment.

Assessment reports

1. College and/or program mission statement

  • The mission statement guides the identification of which learning outcomes are most important.

2. Program and institutional learning outcomes

  • See Creating SMARRT Learning Outcomes
  • List all Major (Program) Learning Outcomes (MLOs).
  • Indicate which MLOs are aligned to which Undergraduate Learning Outcomes (ULOs) or Graduate Learning Outcomes (GLOs).
    • Alignment indicates that when students demonstrate achievement of the MLO they simultaneously demonstrate achievement of one or more ULOs/GLOs. For example, successful completion of an assignment designed to assess student achievement of an MLO may require competency in written communication (ULO 1 - WC), information literacy (ULO 1 - IL), critical thinking (ULO 1 - CT), and integrative knowledge (ULO 3).

3. Curriculum map

  • The curriculum map should identify in which courses each MLO (and any ULOs/GLOs not aligned with an MLO) is introduced (I), developed (D), mastered (M), and assessed for degree-level proficiency (A). A brief example follows this list.
  • Curriculum maps often overuse the D designation. D should only be used to indicate courses for which students do work explicitly aligned to the MLO, are provided with instructor feedback specifically related to that MLO, and are given the opportunity to revise in response to that feedback. Courses where development of the MLO is implicit (i.e. likely to happen as a result of doing work focused on other outcomes) should not be labeled D.
  • For more guidance, see this module on Creating and Analyzing Curriculum Maps.
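
  For example, a single outcome in a hypothetical three-course sequence might be mapped like this (the course numbers are illustrative placeholders, not actual CSUMB courses):

    Course     MLO 1
    XXX 100    I
    XXX 300    D
    XXX 400    M, A (capstone)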

4. Five-year assessment plan

  • A timeline for implementing future activities will help you carry out your assessment plan, as described in the Program Review Manual.
  • Every MLO and ULO or GLO must be assessed once during every 5-year learning outcomes assessment cycle (a sample schedule follows this list). A second assessment is not required but can be especially valuable, since it allows programs to gauge the impact of interventions.
  • Ideally all ULOs/GLOs are aligned with the MLOs and do not require a separate assessment.
  • For undergraduate programs, the CSUMB institutional-level ULO assignment guides and rubrics can be used to help identify which MLOs align with which ULOs. These tools were developed to assist programs in developing assignments that better help students demonstrate achievement of MLOs (see recommendations under "How to use" in the ULO support section).
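
  For example, a hypothetical four-MLO program might schedule its cycle as follows (the outcomes, alignments, and ordering are placeholders, not a prescribed schedule):

    Year 1: MLO 1 (aligned with a written communication ULO)
    Year 2: MLO 2 (aligned with a critical thinking ULO)
    Year 3: MLO 3 (no aligned ULO/GLO)
    Year 4: MLO 4 (aligned with an information literacy ULO); optional follow-up on MLO 1
    Year 5: any remaining ULOs/GLOs not aligned with an MLO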

5. Recent closing the loop activities

  • See Using Results to Support Student Learning
  • Explain what other MLOs and ULOs or GLOs have been assessed recently and what changes have been made in response to those assessments.
  • Explain whether the current assessment directly connects to those other improvement efforts or whether it is a new or different effort.
  • If a new or different effort, explain whether the other improvement efforts were successful and/or are ongoing.

6. Assessment plan rationale & critical concern/question

  • Identify which MLO and/or ULO/GLO is being assessed and what the critical question/concern is.
  • If the MLO is aligned with an undergraduate learning outcome (ULO) or a graduate learning outcome (GLO), those outcomes should also be identified.
  • Explain where the critical question/concern came from (e.g., from existing assessment data, program/college discussions, student discussions, national standards for the discipline, etc.).
  • Explain how the critical question/concern was selected (e.g. in a department meeting, a recommendation from a committee, an external program review, etc.).
  • Present the critical question/concern in a clear manner that is understandable to different stakeholders (e.g., students, faculty/administrators, community partners, and community members).
  • Review the learning outcome statement being assessed and make sure it is SMARRT (specific, measurable, action-oriented, reasonable, relevant, and time-bound). If it is not, consider identifying that as a limitation of the assessment and explaining how and when that outcome will be improved.
  • If there are prior assessments relevant to the question/concern, explain how the current assessment builds on previous assessment results, including prior closing the loop activities that led to improvements.
  • Explain how the assessment plan rationale and critical question relates to program strengths, key areas needing improvement, and prior closing the loop activities.

7. Assessment methodology

  • Explain what student work will be (or was) collected and the relevance of that work to the learning outcome being assessed. Include the assignment guidelines in an appendix.
  • In addition to doing direct assessment of student achievement using student work, also consider gathering results from relevant indirect assessments (e.g. instructor or program surveys; student course evaluations; student self-assessments; institutional surveys, such as NSSE and other CSUMB-administered surveys; etc.).
  • Identify how work samples will be (or were) selected. Ideally work should be randomly selected; if not, explain why not. (A brief random-selection sketch follows this list.)
  • Identify how many student work samples will be (or were) evaluated. The Assessment Committee recommends at least 30 samples; if fewer, explain why.
  • Explain what instruments/measures/rubrics will be (or were) developed (or used) to evaluate student work. If developed, include them in an appendix.
  • Explain how student work will be (or was) evaluated. Ideally rubrics should be calibrated and normed and each student work sample should be evaluated by more than one person. If less than ideal methods were used, explain why.
  • Identify the range of courses and the level of faculty and staff participation in the assessment process. If less than ideal, explain why.
  • Describe how assessment results will be (or were) analyzed.
  • NOTES
    • Direct & indirect measures: Ideally assessments include both direct measures of student learning (e.g. course-embedded assignments assessed with rubrics aligned to the MLO, exit exams, capstone presentations, etc.) and indirect measures (e.g. senior exit surveys, focus groups, written reflections by students, course evaluations, alumni surveys, CLC data, etc.). Direct measures are required; indirect measures are strongly encouraged. Institutional-level data is available on the IAR website and in most cases can be disaggregated by college and program.
    • Assessment vs. Research: Keep in mind that this is an assessment for generating ideas on how to improve the program and student learning, not a research project for creating generalizable results to be published in a disciplinary, peer-reviewed journal. The goal is improving, not proving, and you just need enough data to start a conversation.
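
  Random selection of work samples can be scripted so the draw is documented and reproducible. The Python sketch below is only an illustration, not an official CSUMB tool; the file name, column name, and seed are hypothetical placeholders.

    import csv
    import random

    SAMPLE_SIZE = 30  # the Assessment Committee's recommended minimum

    # Hypothetical input: one row per piece of student work, with an ID column.
    # The file name and column name are placeholders, not a required format.
    with open("student_work_samples.csv", newline="") as f:
        work = list(csv.DictReader(f))

    random.seed(42)  # a fixed seed makes the draw reproducible for the report
    sample = random.sample(work, min(SAMPLE_SIZE, len(work)))

    for row in sample:
        print(row["sample_id"])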

8. Results and interpretation of results

  • Make sure results can be easily understood by key stakeholders (e.g. students, faculty/administrators, community partners, and community members).
  • Make sure you have interpreted results accurately. If there are multiple interpretations, identify them.
  • “Triangulate” assessment results from multiple sources (e.g. direct assessment of student work; course, department, and/or institutional survey results; program-specific data on student engagement with Cooperative Learning Center programs and resources; course evaluations; etc.).
  • Identify and discuss limitations, particularly those related to validity (i.e. the degree of alignment of the assignment or any other measure to the learning outcome being assessed: to what extent did you measure what you wanted?) and reliability (the extent to which the ratings of student achievement by those scoring the work were accurate and consistent: did different scorers consistently score the same work at the same level?). A brief scorer-agreement sketch follows this list.
  • Make sure the conclusions presented are reasonable and stem directly from the assessment results.
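
  One simple way to examine scorer reliability is to compare two raters' scores on the same set of work samples. The Python sketch below is a minimal illustration with made-up scores (a 1-4 rubric scale is assumed); it is not a substitute for formal rubric calibration and norming.

    # Scores from two raters on the same ten work samples (made-up data,
    # assuming a 1-4 rubric scale).
    rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
    rater_b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 2]

    pairs = list(zip(rater_a, rater_b))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)

    print(f"Exact agreement:    {exact:.0%}")     # identical scores
    print(f"Adjacent agreement: {adjacent:.0%}")  # within one rubric level

  Low exact agreement suggests the rubric should be recalibrated and the work rescored before interpreting results.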

9. Dissemination

  • Describe how the assessment results will be (or were) disseminated and how you will (or did) tailor presentation of results for the intended audience.
  • Make sure dissemination was done in a timely manner so that the results are still relevant and can be used to make improvements. If not disseminated in a timely manner, explain why not.

Assessment plans

1. Assessment plan rationale & critical concern/question

See above

2. Assessment methodology

See above

3. Dissemination

See above

4. Planned closing the loop activities

See Using Results to Support Student Learning

Over the coming year, which previously assessed MLO(s) and/or ULOs/GLOs are you focusing on for closing the loop activities? The strategies listed in the next paragraph apply here as well.

If the assessment results suggest opportunities for increasing student achievement, identify specific "closing the loop" strategies. Those can include, but are not limited to: creating new assignments or revising existing ones; better scaffolding of assignments; implementing new pedagogical approaches; working with the learning center; and revising the program's curriculum map. Additional closing the loop activities can include improving the assessment process itself, such as revising learning outcomes, improving rubrics, better rubric calibration and norming, or implementing signature assignments across sections or courses.