Assessment

Evaluate programs and services using measurable criteria.

STATEMENT OF COMPETENCY


Assessment refers to a range of continual processes by which libraries aim to measure the effectiveness of their strategic plans, reference services, instructional initiatives, outreach and promotion efforts, and managerial practices, among other variables (Atkinson, 2017, pp. 422-423). By evaluating such variables against measurable criteria—whether through survey data, documented standards and guidelines, or other means—librarians can place their services and policies within valuable context, yielding informed strategies for improvement.

Subscription applications such as LibQUAL+, developed by the Association of Research Libraries (ARL), allow libraries to generate quantitative data regarding user perceptions of staff, services, collections, usability, and physical spaces (Megwalu, 2019, p. 34). As an added benefit, libraries subscribing to LibQUAL+ gain access to survey data from more than 1,200 other member libraries across a five-year period, allowing subscribers to benchmark their key institutional variables within an industry-wide context (Babalhavaeji et al., 2010; Megwalu, 2019). Atkinson and Walton (2017) add that by issuing LibQUAL+ surveys repeatedly, libraries can conduct continuous assessments over time as they progressively adjust their services according to user needs (p. 2). Because users rate each variable on a numerical scale within LibQUAL+ surveys, the resulting data are quantitative and easily measured.

Librarians can collect additional quantitative data for assessment purposes by recording the frequency with which articles and e-books are read, books are checked out, and reference transactions are initiated by users; the resulting data, and fluctuations in those data over time, can be consulted when revisiting collection development decisions or scheduling reference desk personnel, for example (Babalhavaeji et al., 2010).

By measuring services and policies against published guidelines and standards—such as the Association of College and Research Libraries’ (ACRL) Standards for Libraries in Higher Education and the Reference and User Services Association’s (RUSA) Guidelines for Behavioral Performance of Reference and Information Service Providers—libraries can engage in qualitative assessment practices. Such practices are by nature more open to interpretation and ambiguity than quantitative measures, but they carry the benefit of facilitating assessment on a richer, more comprehensive level.

The ACRL (2018) standards are articulated via nine principles, which include “institutional effectiveness,” “management/administration/leadership,” and “space.” Each principle contains subsections, referred to as performance indicators, which address the principles at a deeper level of granularity. For example, while “collections” is the fifth principle, performance indicator 5.2 aims to measure whether “[t]he library provides collections that incorporate resources in a variety of formats, accessible virtually and physically” (p. 12).

RUSA’s (2013) five core guidelines—comprising “visibility/approachability,” “interest,” “listening/inquiring,” “searching,” and “follow-up”—mirror the structure of a standard reference interview. Four of the five guidelines contain subsections denoting recommended practices within general, in-person, and remote settings; a few of those subsections contain further subsections, yielding up to four levels of granularity. For example, guideline 3.1.6 (“listening/inquiring” > “general”) aims to assess whether a librarian within a given reference transaction “[s]eeks to clarify confusing terminology and avoids jargon” (p. 4). Assessing reference practices against RUSA guidelines, and the entire scope of academic library practices against ACRL standards, yields qualitative data that librarians can analyze for patterns and themes, and subsequently utilize within improvement efforts.

SWOT analysis offers another qualitative assessment tool, which libraries typically initiate ahead of strategic planning initiatives; by interpreting and recording a library’s perceived internal strengths and weaknesses, alongside external opportunities and threats, librarians can form a contextual basis for strategic planning while indicating areas of improvement or decline since any previous SWOT analyses (Jordan-Makely, 2019, pp. 294-295).

Instructional librarians commonly establish their own measurable criteria ahead of instruction sessions and other educational contexts, in the form of learning outcomes. For example, a key learning outcome within the lesson plan for an instruction session might state that learners are to “demonstrate the ability to locate subscription databases within the library’s website.” By implementing assessment measures—such as focused discussions or quizzes—librarians can gauge learner comprehension of instructional material against the goals indicated within learning outcomes (Benjes-Small & Miller, 2017, p. 73). Oakleaf (2008) indicates that assessment protocols within instruction practices allow librarians to improve such services, while demonstrating the value of successful instructional practices within outreach efforts to professors and appeals to administrators for further funding (p. 249).

COMPETENCY DEVELOPMENT

My understanding of assessment practices within librarianship emerged during INFO 204, when I conducted a SWOT analysis of the Oakland Public Library’s central branch, and assessed the implications of these findings within the context of strategic planning. Within INFO 210 (Reference and Information Services), I was tasked with evaluating several reference transactions according to RUSA guidelines, which developed my understanding of qualitative assessment practices. INFO 256 (Archives and Manuscripts) exposed me to qualitative means for assessing potential archival acquisitions, while INFO 254 (Information Literacy and Learning) introduced me to relationships between learning outcomes and assessment methods, within the context of library instruction. INFO 294 (Professional Experience: Internships) offered me direct experience with assessment, as my competency throughout my internship was assessed based on learning outcomes written at the beginning of the term.

SELECTED ARTIFACTS

Through the following pieces of written work, I aim to demonstrate my understanding of assessment practices within library settings, and their benefits to the ongoing improvement of library services, collections, and policies.

INFO 204 – SWOT Analysis

After collecting demographic data for context, I conducted this SWOT analysis during a visit to Oakland Public Library’s central branch. The resulting document presents an assessment framework that can prove especially useful within strategic planning contexts, and when compared against previous SWOT analyses for measurement purposes.

INFO 254 – Observation Report and Analysis

This report contains observations gleaned from an instruction session I observed at Lewis & Clark College’s Watzek Library in Portland, OR. The session was conducted by the library’s fine arts liaison librarian and addressed an interdisciplinary undergraduate course titled “Music and Social Justice.” I met with the librarian prior to the session to discuss her pedagogical approaches and stated learning outcomes, which provided valuable context for the observation process. In my analysis, I evaluated the session’s structure and utility, as well as the librarian’s instructional strategies, against the ACRL’s Framework for Information Literacy for Higher Education; this exercise proved useful and exposed me to the value of using the Framework as an assessment tool within instructional contexts.

INFO 256 – Appraisal Assignment

I conducted this appraisal assignment from the perspective of a special collections librarian within the Bancroft Library at the University of California, Berkeley. Using an existing appraisal template from that library, I assessed the potential acquisition of a collection of manuscripts, financial records, photographs, ephemera, and other items from the Ladies’ Relief Society: a now-defunct Oakland, CA charity which provided aid to women and children experiencing homelessness and poverty. I evaluated the collection based upon its local significance, its institutional and monetary value, and its physical condition and size, in addition to estimated storage and digitization costs. This appraisal demonstrates my ability to assess incoming library materials against both qualitative and quantitative criteria.

INFO 294 – Portland State University Library Internship: Final Report

I wrote this report at the conclusion of my Summer 2022 internship at Portland State University Library. Because my supervisor and I began the internship by collaboratively writing learning outcomes, the report required me to assess my own performance against those established outcomes. This process further convinced me of the value of learning outcomes as measurable criteria by which to evaluate comprehension and performance.

INFO 210 – Discussion Two: Phone Reference Transaction

Within this discussion post, I evaluated a phone reference transaction, which I initiated by calling Oakland Public Library’s central branch. I utilized RUSA guidelines as measurable criteria by which to assess each step of the transaction, and determined that although my information needs were fulfilled, RUSA guidelines were not always met. This analysis points to the value of guidelines as nonbinding frameworks, which can inform services and their assessment without exercising unnecessary control.

CONCLUSION

Without the application of measurable criteria, assessment practices in library settings would inherently lack context, continuity, and perspective. By applying criteria such as user-driven survey data, raw quantitative data collected by librarians, professional organizations’ standards and guidelines, and learning outcomes, librarians can evaluate their institutions against others, measure internal progress over time, and identify gaps between institutional conditions and established goals.

REFERENCES

Association of College & Research Libraries. (2018). Standards for libraries in higher education. https://www.ala.org/acrl/standards/standardslibraries

Atkinson, J. (2017). Academic libraries and quality: An analysis and evaluation framework. New Review of Academic Librarianship, 23(4), 421-441. https://doi.org/10.1080/13614533.2017.1316749

Atkinson, J., & Walton, G. (2017). Establishing quality in university libraries: Role of external frameworks. New Review of Academic Librarianship, 23(1), 1-5. https://doi.org/10.1080/13614533.2016.1271238

Babalhavaeji, F., Isfandyari-Moghaddam, A., Aqili, S. V., & Shakooii, A. (2010). Quality assessment of academic libraries’ performance with a special reference to information technology-based services: Suggesting an evaluation checklist. The Electronic Library, 28(4), 592-621. https://doi.org/10.1108/02640471011065409

Benjes-Small, C., & Miller, R. K. (2017). The new instruction librarian: A handbook for trainers and learners. ALA Editions.

Jordan-Makely, C. (2019). Libraries as bureaucracies: A SWOT analysis. Library Management, 40(5), 294-304. https://doi.org/10.1108/LM-03-2018-0019

Megwalu, A. (2019). LibQUAL+ and LibQUAL Lite. The Charleston Advisor, 20(3), 34-39. https://doi.org/10.5260/chara.20.3.34

Oakleaf, M. (2008). Dangers and opportunities: A conceptual map of information literacy assessment approaches. portal: Libraries and the Academy, 8(3), 233-253. https://doi.org/10.1353/pla.0.0011

Reference & User Services Association. (2013). Guidelines for behavioral performance of reference and information service providers. https://www.ala.org/rusa/resources/guidelines/guidelinesbehavioral