Seattle Midwinter Meeting Minutes
January 21, 2007
10:30 am – 12:30 pm
Committee members present:
Gregory Crawford, Chair
Ann Marie Short
Greg reported that several sections on the Committee’s webpage have been updated but have not yet been posted. A subcommittee is working on an update to the “Guide for New Members” and once it’s completed, it will be sent out.
Minutes from yesterday’s meeting were distributed.
Other updates relevant to our Committee:
Greg reported on his meeting with the MARS/MERS (Management of Electronic Resources and Services) Committee, whose members indicated that they would like to collaborate with our Committee on future projects. They are experimenting with a collaborative Web site and soliciting proposals for a virtual poster session on the evaluation of virtual reference services, to be mounted on the MARS Web site prior to the Annual Meeting. A possible next step is a program on how libraries are evaluating virtual reference services.
Ruby Licona reported that the LAMA MAES (Measurement, Assessment and Evaluation Section) would also like to collaborate with us, and noted that there are at least 24 groups at the ALA, Division, and Section committee level involved in assessment with whom the ERUS Committee could collaborate. LAMA is planning a program at the annual conference with three speakers on data collection for managers. A LAMA MAES discussion group also met today to discuss survey fatigue.
A new book on assessment was just published: Revisiting Outcomes Assessment in Higher Education by Peter Hernon, Robert E. Dugan, and Candy Schwartz (Libraries Unlimited, January 30, 2006; paperback, 472 pages; ISBN-10: 1591582768).
Our Committee’s work over the last few years has centered on the development of the Guidelines for Measuring and Assessing Reference Services & Resources. Susan Ware and Greg Crawford left the meeting to present the Guidelines to the RUSA Standards and Guidelines Committee; the Guidelines will then be submitted to the RUSA Board for approval. Greg will send out an email to update us on the results.
Future Directions of the Committee
David Vidor next led a discussion on future directions for the Committee.
Options and discussion follow:
• The Committee could organize a program or discussion forum. With two meetings scheduled during each conference, one could be a regular meeting and one a discussion group.
• The Guidelines (whether approved or not) could be discussed in some type of venue, such as a discussion forum or a program.
• The Vancouver, British Columbia Public Library provides four-hour sessions addressing training and expectations. The Behavioral Guidelines are a good starting point. We might look at turning the evaluation of services into a discussion about student retention and the assessment of student learning; it is vital to demonstrate the library’s relevancy and worth as part of student retention. An exit interview was also used at the Vancouver library.
• Another option is to target courses in terms of evaluation or assessment to see if the library was used. Faculty could also be surveyed — not about courses already covered in BI, but about others that require reference help — or classes with and without instruction could be compared in terms of help sought at the reference desk.
• The library is generally not part of the institutional evaluation. What pieces of the institutional mission do libraries cover? We need to spotlight what we’re doing and how it fits in.
• A model already exists with Google for helping students find good electronic resources, but do students have a mental model for print materials? How do we make the best use of print resources? How does a needs assessment of upper-level students address the competencies of librarians at the desk? Is the quality of the work better in print or electronic form, and is it up to the librarian to make that judgment? This speaks to the core competencies of librarians — overviews versus more detailed resources. We need to assess both librarian competencies and student competencies, and the crossover between instruction and reference.
• What constitutes success? How do we measure success at the reference desk, and how does that relate to student learning? Is there some way to evaluate competency and engagement? There is tension between reference and instruction, and between satisfaction and success. Literature reviews show that a pleasant experience matters more to most users than receiving the right answer.
• Another assessment need is to measure library use or reference desk usage after instruction, to see whether usage peaks following an instruction session.
• A program on assessment and gathering statistics would be helpful.
• It would be interesting to show the dollar value of library resources, or a sort of shopping comparison between Google and library resources. Server logs, email, and chat reference might provide some information, but there isn’t a single instrument that does all of this to provide the big picture.
• What would be most immediately helpful? A call to see what instruments are being used to assess usage, with examples of what other libraries have done.
The meeting was adjourned.
Minutes submitted by Colleen Seale.