Assessment: HOME

Are our students learning? What are they learning? And how do we know?

Through educational assessment, Otis is committed to student learning, success, and the continuous improvement of curricular and co-curricular programs. Resources on best assessment practices, as well as links to the College Assessment Plan (CAP) and redesigned Program Review process are shared here.

“Assessment is the systematic collection, review and use of information about educational programs to improve student learning. Assessment focuses on what students know, what they are able to do, and what values they have when they graduate. Assessment is concerned with the collective impact of a program on student learning.” - Marchese, 1987

Other Otis College sites about Accreditation & Assessment

This GUIDE contains practical information for Faculty. See also:

Accreditation (reports, accreditation letters, etc.)

Student Outcomes (student learning and engagement evidence)

Best Practices (part of the Teaching/Learning Center site)

Assessment Methods and Types of Data

Specific evidence might include both quantitative and qualitative measures, as well as direct and indirect evidence. Quantitative measures are assessment findings summarized with a number that indicates the extent of learning. They could include: student retention and graduation rate trends (disaggregated by various demographic categories); number of non-majors taking courses; number of students in a minor; placement of graduates into jobs, graduate schools, or other post-graduate opportunities; graduating student and/or alumnx satisfaction surveys; and disciplinary ratings of the program.

Qualitative measures are assessment findings that are verbal descriptions of what was discovered or learned. They could include the department’s ongoing efforts to respond to previous assessment results; performance evaluations of students’ senior show, capstone, and/or thesis projects; and student/alumnx achievements.

Direct assessments “require that students display the extent of their learning by doing something” (Allen, 103). Examples of direct evidence include published tests, locally developed tests, work accompanied by a rubric, senior shows, juried shows, embedded assignments in course activities, capstones, senior theses, critiques, portfolios, exhibitions (if scored using a rubric), student reflections (if developing them was an intended PLO/CLO), score gains (value added) between entry and exit, observations of student behaviors, ratings of students’ skills by qualified raters/jurors, and competence interviews.

Indirect assessments “involve a report about learning rather than a direct demonstration of learning” (Allen, 103). They include surveys, interviews, focus groups, grade distributions, admission rates into graduate programs, the quality and reputation of graduate programs into which alumnx are accepted, placement rates of graduates into appropriate career positions and their starting salaries, alumnx perceptions of career responsibilities and satisfaction, retention and graduation rates, student reflections on what they have learned over the course of the program, questions on course evaluations that ask about the course rather than the faculty member, exit interviews, student participation in faculty work, publications and conference presentations, and honors, awards, and scholarships. (From Assessing Student Learning by Linda Suskie)

Assessment Committee

The Assessment Committee has overall responsibility for developing the College assessment plan. It is charged with planning, developing, and disseminating procedures for the assessment of institutional effectiveness, and with assuring continuous improvement in educational programs and related services.

Working Structure and Responsibilities
The Assessment Committee works with programs to develop and maintain a framework for ongoing assessment and to promote a “culture of evidence.” The Assessment Committee supports the institution in collecting, organizing, evaluating, and validating existing and new evidence-gathering and assessment methodologies in programs at Otis. The Committee supports a flexible assessment framework that allows for a diversity of evidence across programs in support of the improvement of student learning. The Committee oversees the development of vehicles to archive student learning outcomes at both the program and college-wide level; coordinates and reviews the criteria for Program Review; reviews outcomes from college-wide annual assessment and Program Review; and supports the WASC Accreditation Liaison Officer (ALO) in preparing and writing accreditation assessments.

The membership of the Assessment Committee reflects the institution-wide scope of assessment at Otis and consists of representatives from all programs.

The Provost’s Office provides the leadership for the Assessment Committee. Joanne Mitchell is Chair.


Joanne Mitchell
Assistant Provost for Academic Effectiveness and Accreditation
Accreditation Liaison Officer for WSCUC and NASAD

Otis College of Art and Design | 9045 Lincoln Blvd. Los Angeles, CA 90045 | MyOtis