12th Annual Reference Research Forum
2006 ALA Annual Conference, New Orleans, LA
"User Expectations: a Sense-Making Approach to Mental Models of Information Seeking"
Presented by Lynn Westbrook, PhD, Assistant Professor, School of Information, University of Texas at Austin
As individuals make sense of the complex and dynamic process of information seeking, they employ mental models (MM) of that process. Those models underpin their expectations and behavior patterns. MM explain why certain tools, search tactics, and services are expected to be available, productive, or problematic. Within the socio-cognitive framework of sense-making, the following research problems are posed: (1) How do university students envision or model the process of their own academic information seeking? (2) What are the implications of those models for academic reference service? This study utilizes two techniques to identify and analyze MM components of academic information seeking in two populations. In the first sub-study, 55 graduate students in two MA reference classes used narrative and visual means of describing their MM of the academic information-seeking process. In the second sub-study, 950 transcripts of chat-reference transactions during the 2004-05 academic year at a major university were examined as MM artifacts of real-life, contextualized situations. A meta-analysis will identify components and characteristics of users' mental models of the academic information-seeking process.
"Quick & Easy Reference Evaluation: gathering the users' and the providers' perspective"
Presented by Jonathan Miller, Head of Hillman Public Services, University Library System, University of Pittsburgh
We created an outcomes-based method of evaluating reference service quality that is, from both the users' and the providers' perspectives, quick and easy to complete. This method can be implemented in both in-person and online reference services. We identified the general desired outcomes of an academic library reference transaction and designed a two-part survey instrument based on those outcomes. We analyzed the results to determine how closely we met those outcomes and to identify differences between the perceptions of users and providers, both across all respondents and, where possible, among sub-populations within and between users, providers, and institutions.
"Measuring the Library's Presence in CMS"
Presented by Scott Collard, Librarian for Education, Psychology and Linguistics, New York University and Nadaleen Tempelman-Kluit, Instructional Design Librarian, Bobst Library, New York University
The recent push by libraries to add research resources to content management systems has provided users with another access point to library services. It has also provided librarians with the opportunity to highlight key resources that might be less obvious among the collection of links on library homepages. While the subset of links added to content management systems is presumed to be the most relevant and useful for users, librarians' mental models often differ from those of our constituents. It's important, therefore, to test our assumptions about the use and relevance of the links chosen for integration into CMS. At NYU, we have been doing this not only through quantitative measures such as hit counting, but also by adding a qualitative survey to the Library Links page in our CMS. This survey has helped us determine not only the relative success of the venture, but also whether some of the previously hidden resources on the library homepage have been of use to our students. The results have given us the opportunity to evaluate our assumptions about what is important for our users, and to redesign based on their needs rather than our assumptions of their needs.