This edition of Keeping Up With… was written by Candice Benjes-Small, Maura Seale, Alex R. Hodges, and Meg Meiman.
Candice Benjes-Small is the Head of Research Services at the William & Mary Libraries, email: CandiceBenjes-Small@wm.edu.
Maura Seale is the History Librarian at the University of Michigan, email: firstname.lastname@example.org.
Alex R. Hodges is the Faculty Director of the Monroe C. Gutman Library at the Harvard Graduate School of Education, email: email@example.com.
Meg Meiman is the Head of Teaching & Learning at Indiana University-Bloomington Libraries, email: firstname.lastname@example.org.
What is critical assessment?
In academic libraries, we assess people and programs in order to demonstrate how well we are achieving our goals. Often, we focus on the quality of user experiences, growth in student learning and success, and measures of institutional improvement. Critical assessment pushes us to consider the roles of power and privilege in the design of our learning measurement methods, and to give voice to the people involved with the assessment. Assessment is often motivated by a sense of accountability; libraries need to prove their worth to others. Critical assessment rejects such a perspective, as it can treat students, faculty, and staff as a means to an end. Instead, critical assessment perceives everyone involved with assessment--from the people designing and performing the assessment to those being assessed--as individuals affected by social, political, and economic forces, and seeks to account for those factors in pursuit of societal change. For example, job placement statistics among college graduates may be more closely linked to their socioeconomic status than to graduating from a particular college. Student use of the library might be similarly connected to economic class.
According to DeLuca Fernández, critical assessment:
● Exposes and addresses systems and structures of power and privilege;
● Considers thoughtfully the histories and contexts of everyone involved with the assessment;
● Makes explicit assumptions and intentions; and
● Eschews colorblind and ideologically neutral claims.
We like to think of assessment as being neutral and objective, but critical assessment helps us realize that this is not possible. Fallible humans create the assessments. The selected tools and methods reflect our positionality. As Ebony Magnus recently stated, a critical assessment framework “incorporates mindful practice in which power and positionality are at minimum laid bare if not actively questioned, and the agency and authority of participants is respected and held paramount. This means accepting that assessment is not neutral, nor are mechanisms of data creation and collection.” 
What can librarians do?
When designing and implementing assessment tools such as surveys or rubrics, we can include students and colleagues who share the identities of the population being studied. According to Ciji A. Heiser, Krista Prince, and Joseph D. Levy, this “helps practitioners challenge power dynamics, be more inclusive of diverse identities, address assumptions, disrupt ideological neutral claims, and acknowledge implicit biases throughout the assessment process.” 
When using survey tools developed by external entities, consider how they fit (or don’t fit) into critical assessment. As a profession, we depend on existing survey tools because many are tested for reliability and validity, allowing us to trust their design as well as their results. But who develops these surveys? The developers may be highly credentialed, but do their identities reflect those of the assessed population? Are the questions they create examined for implicit bias? What ideologies or belief systems inform the assessment instruments we currently use, and how might we critique them?
For example, Ebony Magnus, Jackie Belanger, and Maggie Faber argue that library surveys like LibQUAL+® follow the accountability narrative, emphasizing quantity (“how many” and “how often”) while avoiding the reasons why people answer the way they do. The focus on customer service also frames people who use the library as consumers who must be pleased, rather than active participants in shaping library services.
Finally, librarians involved with instruction may have other opportunities to pursue critical assessment. Lyda Fontes McCartin and Rachel Dineen advocate moving away from surveys and toward more authentic assessments such as rubrics, journals, and portfolios. While the authors admit that achieving the true mission of critical assessment--transformative social justice--may be beyond the library’s abilities, they nevertheless advocate for breaking down the barriers between teacher and student, between librarian and user, in order to make the library and the classroom more democratic spaces.
In a presentation at LILAC 2018, Kyle Feenstra asked, “How can the library make space for the voice of the learner, ensuring that it is visible and validated as a meaningful expression alongside the privileged voices of academics and dominant university discourses?” This thought-provoking question is relevant to how we assess our work. Critical assessment reminds us to intentionally involve others in our assessment practices, giving them equal weight to our own efforts; to critique existing survey tools; and to use more authentic forms of assessment. By doing so, we can uncover systemic inequities and start to effect real change in higher education.
Torche, F. 2011. "Is a College Degree Still the Great Equalizer? Intergenerational Mobility across Levels of Schooling in the United States." American Journal of Sociology 117 (3): 763-807.
DeLuca Fernández, Sonia. 2015. "Critical Assessment." Webinar delivered for the Student Affairs Assessment Leaders (SAAL) Structured Conversations series, December 9, 2015. Retrieved from http://studentaffairsassessment.org/files/documents/SAAL-SC-Critical-Assessment-sdf-9-dec-2015-FINAL.pdf.
Magnus, Ebony. 2019. "Critical Assessment Practices: A Discussion on When and How to Use Student Learning Data." Webinar sponsored by ACRL ISMLC. Retrieved from https://youtu.be/TCPKf_Kf9q0.
Heiser, Ciji A., Krista Prince, and Joseph D. Levy. 2017. "Examining Critical Theory as a Framework to Advance Equity Through Student Affairs Assessment." The Journal of Student Affairs Inquiry 3 (1). Retrieved from https://jsai.scholasticahq.com/article/1621-examining-critical-theory-as-a-framework-to-advance-equity-through-student-affairs-assessment.
Magnus, Ebony, Jackie Belanger, and Maggie Faber. 2018. "Towards a Critical Assessment Practice." In the Library with the Lead Pipe. Retrieved from http://www.inthelibrarywiththeleadpipe.org/2018/towards-critical-assessment-practice/.
McCartin, Lyda Fontes, and Rachel Dineen. 2018. Toward a Critical-Inclusive Assessment Practice for Library Instruction. Sacramento, CA: Library Juice Press.
Feenstra, Kyle. 2018. "The Process Is the Outcome: A Framework for Student 'Research as Praxis'." Presentation at LILAC 2018. Retrieved from https://www.slideshare.net/infolit_group/the-process-is-the-outcome-a-framework-for-student-research-as-praxis-feenstra.
Further reading
Badia, Giovanna. 2017. "Combining Critical Reflection and Action Research to Improve Pedagogy." portal: Libraries and the Academy 17 (4): 695-720. https://doi.org/10.1353/pla.2017.0042.
Graf, Anne Jumonville, and Benjamin R. Harris. 2016. "Reflective Assessment: Opportunities and Challenges." Reference Services Review 44 (1): 38-47. https://doi.org/10.1108/RSR-06-2015-0027.
Gregory, Lua, and Shana Higgins. 2017. "Reorienting an Information Literacy Program Toward Social Justice: Mapping the Core Values of Librarianship to the ACRL Framework." Communications in Information Literacy 11 (1): 42-54. https://doi.org/10.15760/comminfolit.2017.11.1.46.