Interest Groups Report on Activities at ALA Midwinter in San Diego
The reports below are summaries of the activities that took place during meetings of ALCTS interest groups held during the 2011 Midwinter Meeting in San Diego. Included are groups whose reports were received by the editor as of February 15, 2011. For information on discussion groups not shown here, see the ALCTS Organization page on the ALCTS web site.
Automated Acquisitions/In Process
The 2011 ALA Midwinter program consisted of three presentations on Demand-Driven Acquisitions:
- PDA at Iowa: Learning by Doing. Presented by Mike Wright, Head of Acquisitions and Rapid Cataloging, University of Iowa Libraries
- EBL and PDA at OSU. Stefanie Buck, Instructional Design and eCampus Librarian, Oregon State University
- ASU-Patron Driven Acquisitions. Jeanne Richardson, Chief Officer, Collections and Scholarly Communication, Arizona State University
A lively discussion followed. Clare Appavoo, Coutts, was elected co-chair, replacing Michael Zeoli who deserves our sincere thanks for all of his fine work on behalf of the group.
Creative Ideas in Technical Services
Twenty-six librarians gathered at the San Diego Convention Center for the 2011 ALA Midwinter Meeting of the ALCTS Creative Ideas in Technical Services Interest Group. About three weeks prior to the meeting, Tony Fang, the IG chair, and vice-chair Libbie Crawford selected discussion topics and developed leading questions to facilitate the conversations. The topics were disseminated to several discussion lists and social networks in advance of the meeting.
Attendees devoted the first hour to roundtable group discussions and participants chose the topic they found most inviting or germane to their work. At the end of the discussion period, one member of each group presented a brief summary of the group’s discussion. Following is a summary of the discussions held by each group.
Subject Access in Online Catalogs
Facilitator/Recorder: Rocki Strader, Ohio State University
The participants at this table represented university libraries, a public library, and a small private law library.
Subject access is important to users and a value added service that libraries provide, although treatment varies by type of library. University libraries try to be as specific as possible, which can mean up to eight or ten subject headings in some cases. Public library participants and the private law library tend to use one subject heading (two at most), and are usually more general.
Some libraries remove non-LCSH controlled terms (i.e., fields not coded 650 _0), while one university library accepts records as-is, including non-LCSH headings and subject headings in other languages from non-English catalogs; these are indexed as keywords.
The group discussed the assignment of subject keywords in the MARC 653 field. Only one of the university libraries followed this practice, and only when the keywords are provided by the author. The items in question are typically theses and dissertations.
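The 650 _0 convention discussed above can be made concrete with a short sketch. This is a minimal illustration, not tied to any particular ILS: subject fields are modeled as bare (tag, second indicator) tuples rather than full MARC records, and the policy function is hypothetical.

```python
# Minimal sketch: classify MARC subject fields by vocabulary.
# A field is modeled as a (tag, second_indicator) tuple; real records
# would carry subfield data as well.

LCSH_INDICATOR = "0"  # second indicator 0 on 650 = Library of Congress Subject Headings

def keep_subject_field(tag, ind2):
    """Return True for fields a strict LCSH-only policy would retain.

    650 _0 fields (LCSH) are kept; other 650s (e.g., MeSH, local terms)
    and 653 uncontrolled index terms are dropped or demoted to keywords.
    """
    if tag == "650":
        return ind2 == LCSH_INDICATOR
    if tag == "653":
        return False  # uncontrolled index terms, indexed only as keywords
    return True  # other 6XX fields are left to separate policy decisions

fields = [("650", "0"), ("650", "2"), ("653", " "), ("651", "0")]
kept = [f for f in fields if keep_subject_field(*f)]
```

Under this sample policy, the list above retains the 650 _0 and 651 fields and drops the MeSH (650 _2) and 653 entries.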
The participants did not use different treatments for different types of materials. In the interest of time, and to get records into the catalog, subject analysis may be deferred and records updated as time permits, but this is usually done for large sets of like materials (such as electronic theses/dissertations, gifts in foreign languages).
Participants indicated that they used different treatments at different stages of processing. Most of them indicated that “brief” (author/title/publisher) records are sometimes used for acquisitions and later replaced with fuller records after the items are received.
For this group, it seemed to come naturally to ponder the future of LCSH. Most of the participants agreed that LCSH, or controlled vocabulary more broadly, will be around for a while. The need is to “push LCSH to the front of the OPAC,” as one participant put it. The main issue is educating users (including other librarians) about the usefulness of subject headings for collocating search results and for helping to direct searches, and about how to use them in the OPAC. Many OPACs do not show the full record in the main display of a title; most users do not know that a complete description is available, and reaching it often requires one or two extra clicks. Users may also not know that in some (if not all) OPACs, subject headings are hot links that lead to other materials with those same headings.
There was a consensus that keywords are not bad, but they do not provide enough information on their own. Keywords make a good entry into the catalog, while subject headings are good at collocating results and helping users focus or redirect their queries. User and librarian education are needed to demonstrate how to access those capabilities in the OPAC and take full advantage of the information in the catalog.
Workflow Efficiency: Good Practices and Great Outcomes
Facilitator/recorder: Sarah Quimby, Minnesota Historical Society
When considering workflow efficiency, one of the first things librarians cite is tools. The discussion opened with a mention of some technological tools, such as macros, scripts, and batch processing. Many at the table found MarcEdit extremely helpful, and those who had not heard of it got a detailed description of this free software’s editing and processing capabilities.
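The kind of change these tools automate can be sketched in a few lines. This is only an illustration of the general batch-edit technique, not MarcEdit itself: records are modeled as plain dictionaries of tag to field values, and the local tag 948 used for the processing note is an arbitrary example.

```python
# Sketch of a batch edit: append a local processing note to every
# record in a set -- the sort of one-line change MarcEdit or a short
# script applies across thousands of records at once.
# Records are plain dicts of tag -> list of field values; the 948
# tag for local notes is an arbitrary, hypothetical choice.

def add_local_note(records, note, tag="948"):
    """Append a local note field to each record in the batch."""
    for rec in records:
        rec.setdefault(tag, []).append(note)
    return records

batch = [{"245": ["Title one"]}, {"245": ["Title two"]}]
add_local_note(batch, "Batch loaded 2011-01; vendor set A")
```

The same loop structure handles deletions or field rewrites; the point is that one rule, written once, is applied uniformly to the whole set.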
The discussion then segued into the topic of subject analysis and experts. One participant mentioned that there was not an easy way to improve subject analysis, and only time and experience could speed things up. Another attendee mentioned that her library had a basic template with subject headings that they use for sets of materials. Another library partners with language and subject experts within their consortium for advice on less-familiar materials. An observation was made that the amount of original cataloging is increasing, with the aid of crosswalks and spreadsheets.
Other workflow refinements mentioned included having reference librarians label books, and no longer measuring books for dimensions or checking for duplicate call numbers.
The focus of the discussion shifted to how outsourcing affected workflow. One participant noted that her library uses WorldCat Cataloging Partners for Yankee and firm orders to receive shelf-ready materials and deliver them to patrons faster. They provide quality control through an exception check that their IT department set up. The bulk of the subsequent discussion was spent on quality control of outsourced records and e-book records. At one library, circulation staff performs quality control; at another, students do the checking. At another institution, catalogers are now performing a lot of the loading of records and rely on exception reports to identify quality control errors. Exceptions usually go to the copy catalogers, freeing the original catalogers to handle the unique items and archival materials. When one participant asked what to do if the bulk of vendor records are of poor quality, the answer was to complain loud, long, and often.
Participants next discussed how to handle different work styles and preferences. Discussion of this question focused on matching the right person with the work, and distinguishing between an allowable preference and a rule.
When the discussion turned to strategies for handling backlogs, one attendee responded that once his library switched to accepting vendor records, the backlog disappeared. Several people cited special collections as an area where backlogs can persist, often for generations. A participant mentioned finding a collection of VHS tapes from the 1980s and 1990s with notes attached saying “Checked OCLC. No copy”.
The remaining discussion time was spent on two topics: combining acquisitions and cataloging departments (one person said that there is a mental divide between the two, but they share the same workflow) and downsizing. Several people mentioned that they were required to work on the reference desk and that they had to do more with less.
Encroachment of Management Responsibilities on Technical Services Job Description
Facilitator/recorder: Barbara Louise Albee, Indiana University School of Library and Information Science
Participants at this discussion table came from various backgrounds: a serials librarian, catalogers, a head of technical services, and a SLIS faculty member. All of the participants agreed that technical services is very production driven. The quality of the catalog is still an issue of vital importance, and it affects other departments in the library. The full cataloging record is still very useful. However, professional librarians in technical services often find themselves pulled away from their actual work by other responsibilities such as committee meetings, tenure requirements, and meetings in general, many of which have nothing to do with the actual work of technical services.
It is very clear that technical services librarians need to set expectations, ideally when accepting a position. They often do a great deal of work that is outside their job descriptions and are pulled in to work in other areas. It is important to meet with management and keep communication open about what you are doing and accomplishing. Strategies include streamlining processes, routinely reviewing workflows, and trusting catalogers to work from home.
Participants discussed how to keep up with developments within the profession. Attendees all found ways to stay current such as attending conferences, joining discussion lists, watching webinars, talking with vendors, networking with colleagues, and reading research articles. The profession is constantly evolving, and it is a challenge to stay current. Suggestions included brown bag luncheons, and talking among library staff to tap into their expertise. Guest speakers may be invited to talk to staff. We need to be creative to educate ourselves. There is a lot of pressure on newer librarians to take on “new stuff,” such as RDA and new technologies.
Electronic Resources Interest Group
The group heard the following presentations which were part of the theme “Electronic Resources Management as a Public Service: Delivering quality content at the right time, in the right place.”
- The Long Road to ERM: Are We There Yet? Donna Scanlon, Electronic Resources Coordinator, Library of Congress
- EResource Access Support: Go Team! Athena Hoeppner, Electronic Resources Librarian, and Ying Zhang, Acquisitions Librarian, University of Central Florida Libraries
- Right Here, Right Now: Using a Discussion Forum to Resolve Electronic Resource Access Issues. Elizabeth Babbitt, Electronic Resources Librarian, Montana State University
- Business Analytics and Intelligence: Leveraging Data to Enhance User Experience. Andrew McLetchie, Senior Data Analyst, ITHAKA/JSTOR
FRBR Interest Group
The FRBR IG proudly sponsored two presentations during the 2011 ALA Midwinter Meeting: 1) Ying Zhang and Athena Salaba (Kent State University) presented "FRBRizing existing MARC records at expression and manifestation levels" and 2) Manuel Urrizola (University of California, Riverside) presented "Coping with RDA/FRBR anxiety." A total of ninety-eight conference attendees came to the program.
Heads of Technical Services of Medium-Sized Academic and Research Libraries
More than sixty people attended the Heads of Technical Services in Academic Libraries Interest Group meeting at the 2011 ALA Midwinter Meeting. There were five very lively tabletop discussions.
The group discussed what technical services faculty and staff are doing with digital initiatives. Respondents reported that they are doing actual digitization and metadata creation; they have various roles with the institutional repository; they are handling electronic theses and dissertations (ETDs); they are providing MARC and non-MARC metadata; they are addressing barriers that exist (or have been, or are being, broken down) between digital library departments and technical services; and they are coordinating with Special Collections.
Staffing is always an issue. Some staff resist learning a new metadata schema. The discussion covered appropriate ways to train staff, turf wars, and other impediments to a smooth transition to working with digital objects. Staffing issues included staff reluctance to learn new skills, retirements, and other change-related concerns. The question of traditional staff involvement in digital initiatives led to a discussion of which traditional tasks are no longer performed in order to free staff to work on digital projects. These included stopping serials check-in, outsourcing where possible, and prioritizing work so that staff have a clear hierarchy for their time and responsibilities. In the past, the people who worked in technical services created the rules; since the adoption of AACR2, the people drawn to technical services have been rule followers. We need to attract the creators once again. Position titles and department names were important at some universities in helping recognize the changes the departments are undergoing. This report is based on notes taken by Tami Morse McGill.
Streaming Audio and Video
Participants discovered that most libraries are considering streaming media, and several had implemented streaming in some fashion. Alexander Street Press was cited as a vendor used for streaming audio and video, and two libraries used home-grown programs for streaming. One library streamed video only for persons with a hearing disability. There was no consensus on discovery tools for streaming video; many pointed users to the vendor’s web site. One library noted a large jump in usage of streaming video after records were added to the catalog. The consensus was that separate records, either created via MarcEdit or supplied by the vendor, were the best practice.
Copyright was also discussed. Many libraries are awaiting the outcome of a pending copyright case at UCLA. UCLA has a detailed media policy they are willing to share with other libraries. There was discussion of the possibility of just one streaming video server and homogeneity of video collections over time. That threat could be ameliorated by streaming locally created video. The consensus is that users in the future will prefer streaming to any other audio/video access. Notes were provided by Rebecca Stroker.
Patron Driven Acquisitions
Participants at this table felt they represented libraries running the gamut: some were in the consideration phase while others had fully implemented patron-driven acquisitions (PDA). One library is using PDA for print; requests are automatically routed by ILL to acquisitions if the request meets the criteria. Most of the other libraries were using PDA for e-books. A pilot project was suggested as a way to get buy-in from reluctant faculty and staff. Most libraries that had instituted an e-book PDA were using EBL or Ebrary and related a positive experience. Many experienced a much quicker use of the PDA allocation than anticipated. The point of purchase needs to be negotiated with each vendor. Implementation is relatively easy but should include library-wide involvement. Separate bibliographic records were suggested in case the program had to be terminated. Participants were advised to maintain close contact with the vendor because initial plans often need renegotiation. Issues with consortial PDA included who pays for a purchase, whether access will be through individual catalogs or a shared catalog, and getting other libraries interested in a shared arrangement.
RDA Technical Group
Maritta Coppieters provided notes for this group, which held diverse opinions on planning for RDA. Some were waiting for the report from the three United States national libraries following the completion of RDA testing, expected in June 2011; others had begun thinking through the implications. One Ex Libris library noticed the 3XX fields (MARC 336, 337, and 338) appearing in their brief display and dropping to the bottom of the MARC display. Another library had to review load tables line by line to add missing data for both bibliographic and authority records; that library was not interested in display issues, since it uses WorldCat Local and OCLC will handle the display. Bibliographic and authority records created under both AACR2 and RDA will have to coexist in catalogs, and authority control vendors are offering different options, including the ability to send RDA or AACR2 records.
Some libraries wanted ILS vendors to FRBRize their displays. Since some vendors are not doing this, OCLC is willing to make displays appear to be FRBRized even if the records are not FRBR compliant. There was considerable frustration with ILS vendors not truly handling records for Works, Expressions, Manifestations, and Items; instead, vendors are implementing new displays that somewhat mimic a true FRBR display. The consensus was that once one vendor implemented FRBR, it would have a competitive advantage. Most libraries do not plan to change their existing bibliographic records, though conversion would be useful for audiovisual and media materials. Some libraries may never choose to implement RDA, and there is debate about whether patrons will really care which standard we use. Local decisions will have to be made and documented for consistency. No attendee had heard of a library that had decided not to implement RDA. RDA training was also discussed: few libraries have resources to train staff, and there is hope for some national training.
Training is available from Lyrasis, Minitex, and Amigos. Some felt that paraprofessionals did not want to make the decisions allowed by RDA and would prefer to be told what to do. There is a theory that people not trained in AACR2 will learn RDA more easily; it may take a generational change in technical services for full RDA implementation. Some libraries are doing individual training. Many libraries have outsourced cataloging and receive records from vendors. Will vendors implement RDA and provide those records? Will there be different pricing models for AACR2 and RDA records? If so, can libraries afford the change?
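One practical consequence of AACR2/RDA coexistence is the need to tell the two kinds of records apart. The sketch below is a rough heuristic, assuming records as dictionaries of tag to field strings; the two markers it checks (040 $e rda and the presence of the 336/337/338 content, media, and carrier fields) are standard RDA conventions, but the function itself is illustrative rather than any library's actual load routine.

```python
# Sketch: flag records that follow RDA conventions so AACR2 and RDA
# records can coexist (and be reported on) in the same catalog.

RDA_33X = {"336", "337", "338"}  # content, media, and carrier type fields

def looks_like_rda(record):
    """Heuristic RDA test on a dict of tag -> list of field strings."""
    # 040 $e rda is the explicit description-conventions marker.
    if any("$e rda" in f for f in record.get("040", [])):
        return True
    # RDA records normally carry all three 33X fields.
    return RDA_33X.issubset(record.keys())

aacr2 = {"040": ["$a DLC $c DLC"], "245": ["Some title"]}
rda = {"040": ["$a DLC $e rda"], "336": ["text"],
       "337": ["unmediated"], "338": ["volume"]}
```

A check like this is how a load table or exception report could route RDA records to different display or review handling than their AACR2 neighbors.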
Preparing for RDA
The eleven participants at the table were asked to share their experiences to date in preparing for RDA. Activities included contracting for and viewing webinars offered by ALCTS, AMIGOS, and other bodies; purchasing and making available the RDA Toolkit; engaging public services staff in discussions about catalog indexing and display issues arising from RDA; considering the impact of RDA when members of a consortium share the same catalog; considerations about how to implement RDA changes in headings which could result in split files; alerting library administrators to the need for thinking about the impact of RDA on cataloging in terms of dollar costs, training needs, etc.; writing and making available to copy cataloging staff procedures for implementing RDA and integrating it with the current catalog and cataloging procedures.
Participants also raised issues such as the impact on the use of MARC; what needs to be done to get the in-house system ready and to work with vendors; experience participating in the RDA test period; and exposing staff to FRBR concepts by holding workshops, showing them “manifestations” from the collection, and discussing FRBR relationships (manifestation to work, person to work, etc.). On the whole, table participants did not seem dismayed by the prospect of implementing RDA, possibly within a few months, but are alert to the many challenges and pitfalls for which we must plan. This report is taken from notes provided by Norm Medeiros.
MARC Formats (LITA/ALCTS)
The meeting of the MARC Formats Interest Group drew a crowd of approximately one hundred attendees to hear presentations on the group’s topic for the year, “Will RDA mean the death of MARC?” The topic was further described by a set of questions: “The end of the MARC formats has been predicted for years, but no serious alternative format has risen up to challenge MARC. Will the introduction of the new RDA code precipitate the demise of MARC? Will RDA require the description of content and functionality that cannot be accommodated by the MARC formats, or that can be more easily accommodated by alternative content formats? If so, what format(s) will replace MARC? And if MARC does continue to thrive, how will it have to change to accommodate the new content descriptions in RDA?”
Christopher Cronin, University of Chicago, delivered the first presentation, titled “The Need for Transformational Change to Our Metadata Infrastructures.” He argued that RDA itself will not kill MARC; rather, the inadequacies of the format will make it untenable in the future, and any change will come because we want it. He pointed out that we must be prepared for the fact that any wholesale move from MARC to another format will inevitably result in the loss of some data, and will be quite expensive. Cronin discussed the idea that the future of data management lies in URIs rather than literal values in records, and that MARC is incapable of handling these non-literal values. However, he was uncertain as to what format would ultimately succeed MARC as the format of choice for catalog data management. He argued that it is actually more important to find a format or formats that can be understood by other data management communities and that can function in the semantic web. He acknowledged that this is expensive, time-consuming, difficult work that will require large-scale commitment from libraries, system vendors, and other players.
Jacquie Samples, recently of Duke University, discussed the need to develop a successor format. In an apt metaphor, she likened MARC to a very old king who had done great things in his youth, but had been lingering on his deathbed for many years without a clear line of succession, and without a suitable contender for the crown available. Samples pointed out that much of the data in our current records is included for data management purposes, and not to assist patrons in resource discovery. However, many of the alternative formats that might replace MARC tend to simplify data to aid resource discovery, even though this reduces data management capabilities. Any format that replaces MARC wholesale will have to retain these data management capabilities.
The final speaker, Kelley McGrath, University of Oregon, explained how the more richly detailed data required by RDA will quickly hit up against the structural limitations of the MARC formats, and how, if we wish to take advantage of the possibilities offered by this new, richer data, with better defined relationships, we will need to find an alternative to MARC. She noted that the banking world was using jerry-rigged computer software in the late 1990s that was built on top of code originally written in the 1960s. The difficulty and expense involved with revamping and updating the code made it slow to change, but the threat of the Y2K scare finally pushed the banking world into updating their software. She argued that the library world needs its own Y2K event that outweighs the costs and inertia preventing the development of a successor to the MARC format.
The panel presentations were followed by questions from the audience.
Metadata Interest Group
Corey Harper, Metadata Librarian, New York University, presented “Linked Library Data: 2010-2011 Update,” covering new developments since the 2010 ALA Annual Conference. More national libraries are publishing data, with examples from Germany, Hungary, and Britain. There has been growth in the use of linked data, illustrated by a cloud graphic published by the Comprehensive Knowledge Archive Network. Best practices for libraries and linked data are still in development. There is real value in library data: rich stores of metadata combined with robust, controlled vocabularies. The W3C Library Linked Data incubator group (LLD XG), composed of researchers, new graduates, and librarians, has begun collecting, curating, and clustering over fifty use cases, and will issue a report in summer 2011. Developments in RDA include the registration of RDA elements, roles, and vocabularies, as well as IFLA FRBR and ISBD elements and vocabularies. Discussion is underway about who will maintain the registry of RDA elements and vocabularies.
Oliver Pesch, EBSCO Information Services, discussed the “Institutional Identifiers: NISO I2 Working Group,” established by NISO in 2008. It is composed of members from all sectors of the information supply chain and chaired by Grace Agnew (Rutgers University) and Pesch. Its charge is to develop a robust, scalable, interoperable standard that would uniquely identify institutions and describe relationships between them. Information on the I2 Working Group can be found on NISO’s web site. The standard should be lightweight to manage, reusable by business-sector registries, and interoperable with legacy applications. Asked why this standard is necessary, Pesch responded that an institution is a critical entity in any information model and often establishes the provenance of digital information. The identifier also has to be global, unambiguous, interoperable, and unique; it must integrate with existing workflows and should support smooth, seamless access to information.
The group is considering the idea of a central registry that would assign identifiers to new institution records, store core metadata about those institutions, and provide lookup services. A registration agency would thus be needed to manage information, including addresses and consortia/IP addresses. Draft metadata elements have been created. To prevent duplication of effort, the group looked at a few existing identifiers, including MARC codes, OCLC symbols, and the SAN (Standard Address Number), but has focused on exploring collaboration with ISNI (International Standard Name Identifier). The scope of this identifier is public identification of any entity involved in the creation, production, management, and distribution of content. It is an actual identifier, a sixteen-digit number. ISNI is investigating the possibility of including the identifier in the VIAF authority files. Detailed information on both presentations can be found on the MIG blog.
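For the curious, the sixteenth character of an ISNI is a check character computed with the ISO 7064 MOD 11-2 algorithm, the same scheme used for ORCID identifiers, which occupy a block within the ISNI namespace. A minimal sketch, with a hypothetical function name:

```python
# Sketch: ISO 7064 MOD 11-2 check character, as used for the sixteenth
# character of an ISNI (and of ORCID identifiers, a subset of ISNI).

def isni_check_char(base15):
    """Compute the check character for the first 15 digits of an ISNI."""
    total = 0
    for ch in base15:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

# ORCID 0000-0002-1825-0097 is a published example identifier; its
# final character is the MOD 11-2 check digit over the first 15 digits.
check = isni_check_char("000000021825009")
```

A remainder of 10 is written as "X", which is why some ISNIs end in a letter rather than a digit.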
Business Meeting Program Planning Update
by Amanda Harlan and Rhonda Marker
Harlan reported that MIG is working on a program for audio metadata and waiting to hear back from one speaker. Mike Casey will give an update on the Audio Engineering Society SC-03-06 Working Group on Digital Library and Archive Systems. MIG is seeking a third presenter who can provide practical experience of using the AES draft standard AES-X098B, Audio object structures for preservation and restoration.
The discussion then focused on the second proposed presentation about metadata creation tools. Marker reported that unfortunately, none of the suggestions have panned out. Libraries often invest resources to build these tools but they do not seem sustainable or widely available as open source. There is a market out there for these tools but very few providers. Discussion centered on whether it would be useful to look at lessons learned regarding these tools. Various ideas were discussed for the Sunday morning program, including: table talks discussing these metadata tools with a facilitator at each table or an “ask the expert” Q&A session.
Blog Update- Kristin Martin
Martin reported that ALA moved the blog and it works well. It has a cleaner update and there is not as much incoming spam. Efforts to make the blog active by encouraging more activity throughout the year did not yield many postings. Harlan noted that the Texas Digital Library Metadata Group tries to blog every week, and suggested that links to those posts could be posted to the MIG blog. Appreciation was expressed for Martin’s postings of Best Bets for Metadata Librarians.
Web Site Update- Shawn Averkamp and Kevin Clair
Averkamp reported their work to date:
How does MIG’s presence look on ALA Connect? All the necessary information about MIG needs to be on the ALA web site and in Connect.
What else besides programming should be on the MIG site? The web site should include links to resources and/or tools relevant to what metadata librarians are working with.
CC:DA Update- Nathan Putnam
Putnam reported that CC:DA had a short agenda and met only on Sunday; there was no Monday morning meeting. LC is waiting for the report that will follow the conclusion of RDA testing. The RDA Programming Task Force will hold two forums at the 2011 ALA Annual Conference.
LITA Update- Susan Cheney
Cheney reported on upcoming programs at the 2011 ALA Midwinter Meeting that might be of interest to attendees.
Music Library Association Update- Jenn Riley
There is currently an MLA task force that is working with LC on form thesauri. One of the problems encountered with music is that both forms and instruments are used in subject headings and genres. There is talk of creating an instrumentation vocabulary and the need to adjust MARC.
MLA is also involved with a standard for the description of rare music materials, which has been out in draft form for over a year.
The MLA Metadata Subcommittee is considering partnering with digital preservation groups at ALA.
The MLA Annual Meeting will be in Philadelphia from February 9-12, 2011.
New Members Interest Group
This meeting welcomed new members to ALCTS and discussed ways in which they could become involved with the organization. The Collection Management Section was highlighted, and Kathy Tezla and Harriet Lightman explained the section and its work.
There was also an open discussion with attendees who were asked how the IG could help them. Members responded positively and discussion topics included listservs and communication, conference buddies, and job searching advice. Additionally, members were reminded to complete the volunteer form during the spring for committee appointments starting at Annual. The meeting ended with the election of new Web Coordinators: Sarah Smith and Elizabeth Siler.
Newspaper Interest Group
The Newspaper Interest Group discussion session at the 2011 ALA Midwinter Meeting explored efforts to collect, archive and make accessible PDFs of contemporary newspapers. Sarah Quimby, Library Processing Manager at the Minnesota Historical Society (MHS), presented on a recent project at the MHS to start collecting newspaper PDFs. Vicky McCargar, Archivist at Mount St. Mary’s College, Los Angeles and digital preservation consultant, surveyed some ways publishers currently archive born-digital content. Brian Geiger, Director of the Center for Bibliographical Studies and Research at the University of California, Riverside, demonstrated software his center has started to use to allow California publishers to upload PDFs and convert them into METS/ALTO digital objects. A lengthy and lively discussion followed.
Publisher/Vendor/Library Relations (PVLR)
Monday, January 10, 2011, 8-10 a.m.
Eight attendees (more than at the 2010 ALA Annual Conference) discussed future topics, which included:
- ILL collaborations
- Why libraries purchase certain formats
- Why libraries purchase through certain vendors
Discussed the need for vice-chair/new co-chairs in 2012. Discussed use of ALA Connect. Group either had not used the site or found it difficult to use.
- Introduction to PVLR and chairs
- Gathered ideas for the 2011 ALA Annual Conference in New Orleans
- Solicited interest in participating as a co-chair
- Will need new chair(s) for ALA MW 2012
- Introduction to panel
- Panelist bios
- Panelist presentation
Seeking the New Normal: Periodicals Price Survey 2010
Moderators: Elizabeth Lorbeer, University of Alabama, and Kim Steinle, Duke University Press, PVLR co-chairs
Panelists: Stephen Bosch, University of Arizona Library, and Kittie S. Henderson, EBSCO Information Services
Stephen Bosch and Kittie Henderson, authors of the April 2010 Library Journal article “Seeking the New Normal: Periodicals Price Survey 2010,” shared results of the 2010 survey, explained how the data were compiled, and offered their thoughts on future trends in serials pricing. The panelists focused on their methodologies and how they differed from those of the survey’s previous authors. The survey’s overall findings were similar to those from the 2009 survey, but one new finding was that even though many publishers limited their increases for 2010, the savings were not enough to offset library budget cuts. The panelists also addressed questions from the audience, including why the survey currently looks only at print prices in a digital world (electronic pricing may involve packages, tiered pricing, or other models) and what libraries can expect for 2011 pricing (many publishers are inching prices back up). The Q&A portion of the session covered topics such as why database prices do not correlate with individual journal prices, the phenomenon of over-cancellation, smaller publishers, big deals, and alternative pricing models.
Role of the Professional in Academic Research Technical Services Departments
Co-chair Jack Hall welcomed participants, introduced co-chair Wanda Jazayeri, co-vice-chairs Erica Olivier and Shoko Tokoro, and immediate past co-chairs Robert Rendall and Sandra Macke, and then asked the 30 participants to introduce themselves.
Batch Processing in Technical Services
The meeting topic was introduced by Anne Mitchell and Annie Wu, University of Houston Libraries, authors of “Mass Management of e-Book Catalog Records: Approaches, Challenges, and Solutions” (LRTS 54:3 (2010): 164-174). The speakers presented procedures developed at their library for batch processing, particularly of e-book records, and the many challenges encountered. The challenges include:
- inconsistency in record sets due to different vendors;
- acquisition of vendors by other vendors;
- limitations placed on the institution by vendors concerning how the institution may use the records (e.g., batch uploading records to utilities such as OCLC);
- failure to remove outdated editions from sets;
- frequent changes in content;
- different modes of acquisition;
- varying quality of MARC records;
- lack of effective means to deal with multi-volume titles;
- lack of best practices for non-textual materials (sound, video);
- need to identify different sets in the local system;
- need, in the early stages of implementing batch processing, for work to be done by higher-level/professional staff before it can be delegated to other staff.
Procedures to consider include:
- developing and documenting full procedures;
- scheduling regular updates of sets and addition/deletion of titles;
- watching carefully for notifications of changes, additions, and deletions from vendors;
- developing a local system for identifying each set of records (e.g., adding prefixes to numbers in 001 fields);
- keeping a timetable, perhaps on the local intranet;
- keeping a sample of edited records for each set/vendor to document how they are handled;
- remaining aware of any limitations or requirements of the local ILS;
- knowing your account limits with vendors and what happens when those limits are exceeded;
- assigning as much as is feasible to one vendor to increase consistency;
- encouraging cataloging communities to work on cooperative best practices;
- keeping consortial aspects in mind;
- following resources such as the batch load discussion list hosted by Virginia Tech and the wiki hosted by the American Association of Law Libraries.
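One of the set-identification procedures mentioned, prefixing the numbers in 001 fields, can be sketched in a few lines. This is an illustrative sketch only: it works on a simplified record representation (a dict of tag-to-value pairs) rather than real MARC data, and the set code shown is hypothetical.

```python
# Illustrative sketch: tag every record in a vendor set by prefixing its
# 001 (record control number) so the set can be identified as a unit
# for later updates or deletions. Records are simplified to dicts of
# MARC tag -> value; a real workflow would use a MARC library.

def prefix_record_ids(records, set_prefix):
    """Return new records whose 001 values carry the vendor-set prefix."""
    tagged = []
    for rec in records:
        rec = dict(rec)  # copy so the caller's data is not mutated
        if "001" in rec and not rec["001"].startswith(set_prefix):
            rec["001"] = set_prefix + rec["001"]
        tagged.append(rec)
    return tagged

def find_set(records, set_prefix):
    """Pull back every record belonging to one vendor set, e.g. for deletion."""
    return [r for r in records if r.get("001", "").startswith(set_prefix)]

if __name__ == "__main__":
    ebooks = [{"001": "123456", "245": "Title A"},
              {"001": "123457", "245": "Title B"}]
    tagged = prefix_record_ids(ebooks, "EBV1")   # "EBV1" = hypothetical set code
    print([r["001"] for r in tagged])            # ['EBV1123456', 'EBV1123457']
```

Because the prefix survives in the local system's control number index, a whole vendor set can later be retrieved, refreshed, or withdrawn without touching records from other sets.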
Attendees made valuable contributions to the discussion, including OCLC's plans to help with batch processing procedures, e.g., via Collection Builder. In a business meeting after the discussion, the chairs and vice chairs discussed possible future topics (including patron-driven acquisitions) and the Interest Group's charge and possible changes to it.
The group organized a panel discussion on hybrid journals and the future of scholarly publishing at the 2011 ALA Midwinter Meeting in San Diego. The panelists were:
- Philip Bourne, Professor of Pharmacology, University of California San Diego
- Charles Eckman, University Librarian and Dean of Library Services, Simon Fraser University, Canada
- Patricia Hudson, Senior Marketing Manager, Oxford Journals, Oxford University Press
- Dan Morgan, Executive Publisher, Psychology and Cognitive Science, Elsevier
Panelists discussed the development, perceptions, and future of hybrid journals from different points of view. Judy Luther was the discussion moderator.
Charles Eckman delivered the first presentation and pointed out that there has been steady growth in library-based funding support for authors who wish to enable open access to their published research. As the former Director of Collections at the University of California, Berkeley, he provided an overview of the OA fund there and noted that researchers welcomed financial support for publishing in hybrid journals. Of the sixty OA articles funded in 2008, thirty were published in hybrid journals. He also discussed the OA fund at Simon Fraser University, where he is currently University Librarian and Dean of Library Services. The Senate Committee there reviewed and rejected funding for publishing in hybrid journals due to fiscal accountability questions and concerns over the journals’ “double-dipping” issue. Eckman argued that the stance of campus stakeholders matters in determining whether an OA fund should support publishing in hybrid journals. Moreover, researchers who publish in hybrid journals tend to opt for OA when there is financial support for it. In principle, hybrid journals could transition to full OA journals. However, the “subscription culture” and a lack of will to develop new practices are among the factors that create barriers to such a transition. He suggested that more data be collected from the different stakeholders of journal publishing for the study of this issue. He wrapped up the presentation with recommendations such as applying the OA fund to other types of scholarly publication. (Find Eckman’s presentation online at: http://bit.ly/giQw3p.)
Philip Bourne approached the topic from the researcher’s perspective. He argued that scientists and faculty members are usually more interested in publishing their research in the most prestigious journals than in ensuring unfettered online access to it. Bourne noted that funding agencies’ policies on access to funded research are a significant factor in determining what researchers will do in terms of supporting OA. For instance, the National Institutes of Health have not strictly enforced their public access requirement at this point. Researchers, therefore, stick to their habits when selecting journals for publication and give no thought to archiving their publications in OA repositories. They realize that journal access is in general not free, and they like the idea that OA will help boost readership of their works. However, most of them have not thought much about publishing in hybrid journals. Bourne stated that “hybrid journals are but a small step in the right direction” and that full access to published research and related data in a machine-usable way is crucial. He concluded by providing a brief description of a knowledge and data cycle, which epitomizes the future of scholarly communication. (The knowledge and data cycle is based on an article by Bourne, “Will a Biological Database Be Different from a Biological Journal?”: http://dx.doi.org/10.1371/journal.pcbi.0010034.) Find Bourne’s presentation online at: http://bit.ly/gZ3PkY.
Patricia Hudson then discussed hybrid journals from the perspective of a non-profit publisher. Oxford University Press currently offers the OA option in ninety-four of its subscription-based journals. Editorial decisions of these journals are entirely independent of whether the author plans to select the OA option. The publication fee is $3,000. Articles covered by the option have a Creative Commons license applied to them and are deposited to PubMed Central as needed. In 2009, the OA option uptake was highest in life sciences and lowest in humanities and social sciences. The Press informs the author of the OA option after the acceptance of the article, and notes that this option offers a possible solution to the compliance with research funders’ access policies. Hudson rounded out the presentation by alerting the audience to a number of questions that warrant attention. Among them are: How will gold OA uptake affect subscriptions to hybrid journals? How can OA articles be clearly identified within hybrid journals? Is gold OA feasible in humanities and social science publishing? Find Hudson’s presentation online at: http://bit.ly/eF7L8E.
Dan Morgan discussed the hybrid journals published by Elsevier. As of December 2010, more than 500 Elsevier subscription-based journals offered the OA option. The charge was $3,000, but $5,000 for Cell Press journals. Morgan pointed out that Elsevier does not charge subscribers for content covered by the OA option. In 2009, 515 articles were published with the option among the 260,000 articles published in Elsevier journals. Because uptake of the option had been very low since 2006, it has had no impact on journal pricing. Industry-wide, the low uptake rate (1-2 percent) presented a risk to sustainability. However, uptake is likely to increase as a result of funding support. Morgan maintained that Elsevier is open to mechanisms that have the potential to bring about sustainable universal access to published research. The company adopts a “test-and-learn approach” to “ensure that system-wide impact of such mechanisms are [sic] fully understood before scaling them up.” Meanwhile, there are questions for the different stakeholders involved in the hybrid model, addressing issues such as the sustainability of funding support for the OA option, the funding distribution among different disciplines, and the perpetual costs of hosting articles published with the OA option. Last but not least, Morgan briefly discussed five future directions for scholarly publishing:
- Close remaining access gaps;
- Provide access to non-journal outputs;
- Enrich and enhance articles;
- Develop tools to derive insights across articles; and
- Strengthen anti-plagiarism and ethics enforcement.
Find Morgan’s presentation online at: http://bit.ly/dI7gz3.
Technical Services Directors in Large Research Libraries
Two main topics formed the basis of an energetic discussion. The first topic was titled “It’s the End of the World As We Know It, And I Feel (Fine?).” Ninety minutes were devoted to a discussion that focused on what technical services might look like in 2015 in reaction to the current trends that support a much more digital and unmediated user environment.
The second discussion topic was “In Hathi We Trust: coordination and management of legacy collections.” Members discussed the current state of, and the need to support, our vast tangible collections in an age when much of the content is available in digital form and space to hold and store tangible collections is at a premium. Questions raised included: How many copies are needed in such a world? Who agrees to hold them? How do we coordinate strategic holdings and divestitures?
Technical Services in Public Libraries
The Technical Services in Public Libraries Interest Group met jointly with the OCLC-sponsored Dewey Update Breakfast, whose regularly scheduled meetings (Saturdays at both Midwinter and Annual) conflict with the IG’s time slots. This first joint meeting allowed attendees with shared interests to attend a single meeting with two agendas. The Dewey portion of the meeting covered a variety of topics, including the availability of WebDewey 2.0 and a detailed overview of the 23rd edition of Dewey, which is scheduled for a May 2011 release. The IG agenda was designed to elicit information about the challenges facing technical services staff in public libraries, including cuts in funding that affect services for patrons as well as support for attending conferences.
The need to do more with less is a shared issue. After discussing specific concerns, including outsourcing, changing internal workflows, RDA implementation, collection management, and how CollectionHQ might help, the group reached a consensus that there should be a panel at the 2011 ALA Annual Conference in New Orleans featuring librarians rather than vendors. The panelists will discuss their experiences, successes, and recommendations regarding the concerns raised at the Midwinter Meeting.
Technical Services Workflow Efficiency
Inspiration for the Midwinter session stemmed from the controversial decision to privatize the entire operation of the public library system in Santa Clarita, California. Though the city is relatively healthy financially, the decision was seen as a long-term strategy for cost savings; an article about it was featured in The New York Times several months ago. Most libraries fall nowhere near this end of the outsourcing spectrum, but the decision prompted us to consider the many scenarios in which technical services work is currently being outsourced. While not a new trend, the outsourcing of technical services work is an increasingly ubiquitous presence in library operations and the management of resources. Many libraries are contracting with vendors or external organizations as a solution to budget limitations, shrinking staff levels, and shifting priorities. Common areas of outsourcing include cataloging, digitization, and selection; today, librarians point to services such as shelf-ready processing, patron-driven acquisitions, and the Google Books project.
The program panelists were Judy Garrison, Head of Electronic Acquisitions and Serials Control, University of Texas at San Antonio; Ann Miller, Head of Metadata Services, Digital Projects, and Acquisitions, University of Oregon; and Lynette Schurdevin, Library Administrator, Thomas Branigan Public Library (New Mexico). The panel shared aspects of their libraries’ outsourcing profiles with about sixty-five attendees. Work currently being outsourced by their institutions included making materials shelf-ready, MARC records, local content digitization projects, and selection/acquisitions workflows. Motivations for outsourcing workflows included cost savings, improved efficiency, and staffing shortages. Some of the most notable benefits included faster turnaround between selection and the shelf, coverage for staffing shortages, and pilots for new initiatives. These ultimately produced tangible benefits for the library system, staff, and users. Assessment of outsourced services was conducted in a variety of ways: usage statistics for patron-driven acquisitions, turnaround rate comparisons, backlog reductions, and usage of local content.
One panelist stated that her library could be doing more outsourcing, but was not rushing into it. Another panelist commented that she foresees the continued use of a vendor for cataloging and selection because it has freed up staff to work more directly with users. None of the panelists viewed the outsourcing of services as a degradation of the professionalism of the field, but rather as a way to embrace new and developing philosophies and practices.
Acquisitions Managers and Vendors
The group met to discuss patron-driven acquisitions (also known as demand-driven acquisitions) models more broadly through a panel discussion with library representatives currently participating in patron-driven acquisitions programs and a publisher who provides electronic monographic content. The first goal of the session was to draw out discussion and obtain feedback on questions and topics for the ALA Annual preconference workshop hosted by the ALCTS Technology Committee. The day-long preconference workshop is titled Patron Driven Acquisitions in Academic Libraries: Maximizing Technology to Minimize Risk.
The second goal for the IG Midwinter meeting was to promote the workshop to attendees. The panel consisted of the following participants: Bill Kara, Head, E-Resources and Serials Management, and Boaz Theodor Nadav-Manes, Head, Acquisitions Services/Philosophy Selector, Cornell University Library; David Givens, Acquisitions Coordinator, Loyola University Chicago Libraries; Carl Merat, Head, Collection Management, Liberty University Library; and Lenny Allen, Director of Sales, Wholesale and Online, Oxford University Press. Audience participants included librarians who are currently using a patron-driven acquisitions model or who are considering a program, as well as e-book and materials vendors who actively support these models. The combined interests among attendees created a uniquely active discussion throughout the session. Topics of discussion included: high-level overviews of varying library workflows for patron-driven acquisitions, including users of the three major patron-driven platforms; experiences with spending controls; how to structure the funding of pilot projects and ongoing use of this model; how to integrate demand-driven models into traditional purchase models; and how a PDA model might positively or negatively affect scholarly publishing. The needs expressed by panelists and audience members to support patron-driven models included but were not limited to: more definitive tools for predicting expenditure; more data on the continued use of PDA materials after pilot completion; changes to WorldCat Local and how holdings are set in this environment; better understanding of users and discoverability of PDA materials; and good duplication control. Slides and discussion questions will be made available on the ALCTS Midwinter Wiki and ALA Connect.
Cataloging and Classification Section
Authority Control Interest Group
Chair Lynnette Fields (Southern Illinois University Edwardsville) welcomed the audience and introduced the speakers for the session. Complete minutes of the session and the business meeting, as well as the session PowerPoints, are available in ALA Connect.
Janis Young, Policy and Standards Division, Library of Congress
Announcements on Descriptive Matters
1. In August 2010, LC posted a request for comments from the library community about a proposal to change the abbreviation “Dept.” to the spelled-out word in headings, unless it reflected usage of the body named in the heading. The current practice is an exception to AACR2, and would be an exception to RDA as well. The few comments (perhaps twelve total) were generally favorable, but did not constitute the clear mandate that LC sought. The use of the abbreviation will continue for now, with that decision to be revisited after the national libraries determine whether they will adopt RDA.
2. Bob Hiatt retired from the PSD after forty-two years of service at LC. Most recently, he was the point person for catalog errors reported online, and also served as editor of the CSB.
Announcements on Subject Matters
1. The Annotated Card Program has been renamed the Children’s and Young Adults’ Cataloging Program. The new name is more descriptive of the program and its intended audience. The list of headings is now called “Children’s Subject Headings.” Young stressed that there is no change to the coding or application of the headings.
2. The change from “Cookery” to “Cooking” is largely complete. Response to the initial proposal for change was overwhelmingly positive. It is thought to be the largest single change to LCSH ever—788 authority records were revised, 40,000 bibliographic records were changed via global update, with another 60,000 being changed manually to add the genre/form heading “Cookbooks.” Instruction sheet H 1475 in the Subject Headings Manual has been revised and issued.
3. The creation of validation records (records for headings with free-floating subdivisions that are created to allow machine validation and control of such headings) continues. There are now over 80,000 such records. A first A-to-Z pass has been done through the LC catalog, with records created for heading strings that appear at least 20 times.
4. Young detailed a new project that may offer another means of headings validation. This involves adding 072 fields to authority records for subjects, and 073 fields to records for free-floating subdivisions. The fields contain instruction sheet number(s) from the Subject Headings Manual pertinent to the heading or subdivision. The hope is that a validation program parsing a heading-subdivision string can look for the presence of the same instruction number in each authority record. There is also hope that such a program could suggest subdivisions to the cataloger on the fly. There are issues—not every term in a subdivision list is appropriate for corresponding main headings (e.g. “Aerial operations” not likely a good fit with the heading for the Thirty Years’ War); there are exceptions that are embedded in footnotes; some heading strings that would pass the simple instruction-sheet matching algorithm are in fact cross-references to a phrase heading; there is a need to account for factual “impossibilities” that could be embodied in works of fiction; and multi-part instruction sheets present a challenge. The biggest issue, however, is how to get the project past the pilot stage. Janis stressed that SACO libraries should not include these fields in subject proposals.
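The matching idea described above can be sketched as a set-intersection check. In the sketch below, the 072 fields on a subject authority record are modeled as a set of Subject Headings Manual sheet numbers for the heading, and the 073 fields as the sheets under which a free-floating subdivision is established; all headings and sheet numbers shown are hypothetical examples, not actual SHM data.

```python
# Illustrative sketch of the proposed 072/073 validation: a
# heading--subdivision string passes this simple check when every
# subdivision shares at least one instruction-sheet number with the
# main heading. All data below are hypothetical.

HEADING_SHEETS = {            # heading -> sheet numbers from its 072 fields
    "Cooking": {"H 1100"},
    "Thirty Years' War, 1618-1648": {"H 1200"},
}
SUBDIVISION_SHEETS = {        # subdivision -> sheet numbers from its 073 fields
    "History": {"H 1100", "H 1200"},
    "Aerial operations": {"H 1200"},
}

def validate(heading, subdivisions):
    """True when every subdivision shares an instruction sheet with the heading."""
    head = HEADING_SHEETS.get(heading, set())
    return all(head & SUBDIVISION_SHEETS.get(sub, set()) for sub in subdivisions)

def suggest(heading):
    """On-the-fly hint: subdivisions whose sheets overlap the heading's."""
    head = HEADING_SHEETS.get(heading, set())
    return sorted(s for s, sheets in SUBDIVISION_SHEETS.items() if head & sheets)

# The check is deliberately naive: it would accept "Aerial operations"
# under the Thirty Years' War, illustrating the "factual impossibility"
# limitation noted in the discussion.
```

As the session noted, this kind of simple sheet-number matching cannot express footnoted exceptions, cross-references to phrase headings, or factual impossibilities, which is why the project remains at the pilot stage.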
5. The Authorities and Vocabularies Service continues to grow. In addition to RAMEAU, the French vocabulary, LC is considering the addition of other translations of LCSH (in Spanish from Spain and Chile, French-Canadian, and Arabic).
6. There is now a mechanism to allow the public to suggest changes and updates to LCSH, found at http://id.loc.gov; sixteen proposals have been received since June 4, 2010.
7. A new headings proposal system is in the works for SACO members and LC catalogers for LCSH, LCGFT, and Children’s headings. Various templates will be offered for the different sorts of records, and it will be similar to the system in place for proposing new and changed LC Classification numbers; in fact, a current user of that system can use the same password for this one.
Genre/Form Authorities Developments
1. Genre/form authority records (MARC tag 155) will undergo coding changes. The LCCN prefix for such records will be “gf.” About 700 subject authority records will be cancelled and simultaneously re-issued as genre/form records, no earlier than March 2011 (to allow OCLC to ensure proper functioning of the headings-control feature). The cancelled LCCN of the former 150 record will be retained in subfield $z of the new record’s 010 field. In bibliographic records, the tagging will change from 655 with second indicator “0” to 655 with second indicator “7” and a subfield $2 containing the code “lcgft.”
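The bibliographic retagging just described is mechanical enough to sketch. The snippet below is illustrative only: it operates on fields written out in a simple text "breaker" style (tag, two indicators, $-delimited subfields) rather than on real MARC data, and the sample heading is only an example.

```python
# Illustrative sketch of the 655 retagging described above: fields with
# second indicator "0" (LCSH) become second indicator "7" with the
# source code "lcgft" in subfield $2. Fields are plain text in the
# form "TTT<space>I1I2$a..."; a real conversion would use a MARC library.

def retag_655(field_line):
    """Convert one 655 _0 field line to _7 with $2 lcgft; pass others through."""
    tag, indicators, rest = field_line[:3], field_line[4:6], field_line[6:]
    if tag == "655" and indicators[1] == "0":
        return f"655 {indicators[0]}7{rest}$2lcgft"
    return field_line

before = "655  0$aCookbooks."
after = retag_655(before)
print(after)  # 655  7$aCookbooks.$2lcgft
```

Note that only 655 fields are touched; topical 650 fields with second indicator "0" are left alone, since they remain LCSH.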
2. Use of cartography form/genre headings began September 1, 2010. Some tweaking continues; SACO proposals are now being accepted. A SHM Instruction sheet is being drafted and will go out for review.
3. Eighty law form/genre terms were approved on November 3, 2010. LC worked with the American Association of Law Libraries (AALL), which created a thesaurus that served as the basis for the terms. Terms will be implemented in early 2011. A lacuna exists in terms for religious law—while some proposed headings for Jewish law have been received, Janis issued a call for volunteers to help out in all religions and denominations.
4. LC continues to work with the Music Library Association on a project to develop form/genre headings for music. This began with a vetting of existing LCSH headings, which in many cases are being deconstructed into their form/genre, medium-of-performance, and carrier components. Some 800 form/genre terms have been approved thus far. The implementation date is uncertain, in part because a suitable home in MARC bibliographic records has yet to be identified for the medium-of-performance terms.
5. A religion project has begun in collaboration with the American Theological Library Association (ATLA), which is in turn coordinating input from smaller societies and organizations.
6. The Subject Access Committee will sponsor a preconference on form/genre headings at the 2011 ALA Annual Conference in New Orleans.
The U.S. RDA Test: Status and Next Steps
Beacher Wiggins, Chief, Acquisitions and Bibliographic Access Directorate, Library of Congress
Wiggins’ charge was to discuss RDA and its impact on the authority file and on our own authority work. He laid out the particulars of the RDA Test (3 national libraries plus 24 other participants; LC’s role in preparing documentation and training materials; the test period from October 1 to December 30, 2010). He characterized the records that were produced—the twenty-five titles in the “common set,” cataloged by all participants once using AACR2 and once using RDA; the five made-up resources used to test copy cataloging; the “extra set” containing records for material chosen by the participants from their production stream, spanning a wide range of formats and subject areas. Wiggins addressed concerns expressed on cataloging lists about the procedural choices made for authority work in the test. He noted that participants created authority records according to their existing policies and workflows. The use of 7XX fields to record the complementary form of heading to that in the 1XX was done to minimize disruptions. A corollary activity was evaluating (in concert with the PCC) the 500-plus Library of Congress Rule Interpretations and paring them to about 200 instructions for use in the test, with the name “Library of Congress Policy Statements.” These steps reflected a conscious decision to test RDA in a real environment to afford real-life feedback, while recognizing the inconvenience to some.
Wiggins noted that some of the objections seemed based on a notion of maintaining a “pure AACR2” environment in the authority file, at least until the national libraries make an implementation decision. That is unrealistic; we live and will live in a “mixed” environment, regardless of the decision. He stated that while the U.S. national libraries ceased producing RDA records with the end of the test on December 30, 2010, at least five test participants have announced their intention to continue using RDA for production cataloging. As a consequence, both AACR2 and RDA need to be supported in policies and guidelines, at least in the short term.
The national libraries, OCLC, and the PCC have issued interim guidelines for what to do with newly encountered RDA records. The national libraries, OCLC, and the PCC will accept both AACR2 and RDA record creation, with the stipulation that when an AACR2 heading has been established in the authority file, it will be used in both AACR2 and RDA records. OCLC’s guidelines allow only one record for the same resource, which may be either AACR2 or RDA; once created, a master record must not be changed from RDA to AACR2 or from AACR2 to RDA. With the test period completed, the national libraries will analyze the data collected using the Matrix of Evaluative Factors, the records themselves, and survey responses. Contracting for assistance with the evaluation is being considered. A preliminary report with a recommendation is due to senior management at LC by March 31, 2011; the goal is to have the implementation decision announced at or before the 2011 ALA Annual Conference at the end of June 2011. There is a range of possible outcomes: no adoption, adoption postponed until changes deemed necessary have been made, adoption as is (with the accompanying choices among options and alternatives), or adoption with specific policy decisions counter to the instructions.
Wiggins concluded with a list of resources for more information which can be found in his PowerPoint presentation (on the ACIG portion of the ALA web site), or by writing mailto:firstname.lastname@example.org.
After thanking the speakers, the Chair announced vacancies for officers and members-at-large that will be filled at the 2011 ALA Annual Conference, and invited nominations and expressions of interest. The update program was followed by a business meeting. Approximately eighty-three people attended the update program, and eighteen attended the business meeting. Most of the business meeting was spent brainstorming topics for the 2011 Annual program.
Competencies and Education for a Career in Cataloging
Attendees were encouraged to visit http://connect.ala.org/node/112511 in ALA Connect to register to post messages to this interest group. Connect allows interest group participants to post and share information, facilitate discussions, communicate ideas, post meeting dates and agendas, and make plans for future meetings.
Managed Discussions vs. Formal Programs
Participants discussed the idea of having a managed discussion versus a formal program as the first initiative of this committee. They decided to begin with a managed discussion, which was considered more feasible in terms of timing and coordination, and a good alternative to planning an official program, which requires more planning and must go through an approval process. The group further discussed having a managed discussion at the 2012 ALA Midwinter Meeting, leaving time during the 2011 ALA Annual Conference to plan the logistics. A managed discussion will require a venue, possibly some IT support (such as a computer and projection screen), and speakers. There was also discussion regarding the number of speakers. Several topics were proposed for the managed discussion, including: 1) RDA/non-RDA cataloging: competencies needed to exist in a hybrid environment; 2) the next generation of the ILS: skill sets needed; and 3) cataloging curricula in library schools: what is currently being taught. Possible institutions to tap for speakers were also considered, including NLM, NAL, and LC for RDA, and Ex Libris and OCLC for the other topics.
Attendees focused on brainstorming a number of other topics, before a final decision is made at the next meeting as to how to proceed. The following topics were proposed for a future managed discussion:
- What is being taught in cataloging courses, now that RDA is being implemented by some libraries
- Required skill sets for cataloging faculty teaching both RDA and AACR
- Training for those who are novices to cataloging
- How cataloging jobs have changed and what kind of succession planning is being done at institutions to pass on knowledge
- Learning about other types of cataloging to improve expertise and be more marketable
- What practitioners want educators to teach in cataloging and the core competencies needed to qualify for a job in cataloging
- What does one need to teach in cataloging to improve resource discovery?
- Cultivating internships and practicum experiences
- Eliminating stereotypes about cataloging to attract a new generation to the industry
- Reinventing cataloging to deal with hidden or special collections
- FRBR, FRAD and FRAR: a guide for the perplexed.
Several speakers were proposed for these topics, including Barbara Tillett, Chief of the Policy & Standards Division at the Library of Congress, Karen Coyle, a consultant on copyright and standards development, Kurt Groetsch at Google Books, Marshall Breeding at Vanderbilt University Libraries in Nashville, Tennessee, and John Houser for a discussion on Open Source. There was a suggestion to coordinate a week of cataloging training, but this was considered to be an official program that would take ample planning.
Cartographic Cataloging (MAGERT ALCTS/CCS)
Twenty-five people attended the discussion, which covered three topics.
RDA: Experience with Testing; Updates Relating to Cartographic Materials
Several printouts of RDA records for cartographic materials were circulated so that the group could see actual examples. Some differences from current practice: the descriptive cataloging form (Leader/18) is coded “i”; the 040 field carries $e rda; words are not abbreviated; there are no square brackets around the scale statement even when it is approximate; and there are new MARC fields 336, 337, and 338 (content, media, and carrier type).
Two map catalogers from the Library of Congress who participated in the test summarized their experience (Min Zhang and Tammy Wong). Four clusters from the Geography and Map Cataloging Team participated in the test; feedback was positive and people felt at ease. They created RDA name authority records for personal names, but not corporate. They explained that once a cataloger learned how to use the RDA Toolkit, the time required to catalog a resource was not significantly different than before. Links on the LC web site provide access to files of records created by formal and informal participants in the U.S. National Library RDA Test: http://www.loc.gov/catdir/cpso/RDAtest/rdatestrecords.html.
A discussion followed about whether or not to include geographic coordinates in records when they do not appear on the piece. The information would be helpful for future GIS applications, but it takes extra time to determine what they are; the decision is generally up to local policy.
Cartographic Form/Genre Terms
With the implementation of form/genre terms, it is often necessary to include one topical subject heading and one form/genre heading in a record. For example, “Tourist maps” is no longer a subdivision; it is a form/genre term. Catalogers must include a subject heading for the place with the subdivision $v Maps, then include a form/genre heading for Tourist maps. A discussion ensued about which of these headings should be listed first in the bib record. Paige Andrew stated that his team puts the form/genre heading first, matching it to the classification number; however, this practice may need to be reconsidered and an LC recommendation publicized.
Zhang (LC) reported that Janis Young is updating scope notes for genre terms. Scope notes will also be added to topical and genre terms for early maps. For globes, probably only the term “globes” will be retained; it is also likely that only “cadastral maps” will be retained (not “plat maps”). Jay Weitz (OCLC) said that OCLC has not received the LC thesaurus of genre/form terms yet. It is likely that OCLC will convert the database to the new genre/form headings sometime after the thesaurus is received this summer. Also discussed were some implications of RDA for searching in OCLC. When the “/map” limit is applied to a keyword search, theoretically it would limit results to Type e, but it is more complex now. Jay said that he and some colleagues want to restructure Material Type so that it could facilitate some differentiations, but it would be a huge project. OCLC will not take the lead to support RDA; they will base their efforts on what users want.
GIS Mash-ups and Other Web 2.0 and 3.0 Technologies
How catalogers might digitize our paper archives and apply coordinates to make them searchable is a complex issue. Bounding box information is very important and useful, and decimal degrees are easier to convert for use by GIS search engines. The New York Public Library has tried crowdsourcing to obtain coordinates, but the data were often inaccurate. Zhang said that LC started putting coordinates into name authority records in May 2010, but it is very time consuming. J. Clark added that this practice may be more feasible for special projects or collections.
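Since maps typically record coordinates in degrees, minutes, and seconds, the conversion to the decimal degrees that GIS search engines prefer is simple arithmetic. A minimal sketch (the function name and signature are invented for illustration, not drawn from any presenter's work):

```python
def dms_to_decimal(degrees, minutes=0, seconds=0, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    Western and southern hemisphere values are returned as negative
    numbers, the convention most GIS bounding-box searches expect.
    """
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("W", "S") else value
```

For example, a map sheet whose western boundary is 122°30′ W would be recorded as -122.5.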
Catalog Form and Function
Participants broke into groups to discuss the topic “Anticipating RDA, Will It Improve Our Catalog?” Participants included representatives from RDA test sites who provided valuable input from their first hand experiences. Although there was consensus about the momentum to adopt RDA, these discussions provided participants with the opportunity to talk about their RDA anxieties, discoveries, potential solutions, and needed modifications. It was apparent that there are both general and specific concerns that should be addressed prior to RDA implementation. Among those concerns are:
- the economic impact of implementation, and catalogers’ understanding of FRBR (Functional Requirements for Bibliographic Records) as the conceptual model upon which RDA is based
- training the trainers
- onsite versus virtual training sessions
- differences and similarities to AACR2
- MARC as a metadata scheme that was not designed for RDA. There may be compatibility problems
- Controlled vocabularies
- Record creation requires more reliance on cataloger’s judgment vs. standardized records
- Spelling out obscure abbreviations will be a help to the public, but also creates greater chances for errors in records
- Vendor issues:
  - upgrades and timetables
  - making adjustments to load tables, indexes, and displays
  - Will ILS/discovery service vendors utilize the new 3XX fields to allow discovery by carrier, content type, and media type? Now most use just the coding from the Leader/06 record type, where only one code can be assigned
- Other than WorldCat Local and OLAC’s prototype “FRBR-inspired moving image discovery interface,” there don’t seem to be many examples of FRBR in discovery tools, so it’s difficult to see how some of these changes may benefit users
- Whether or not to do global updates to existing records to RDA’ify record elements if viewed as helpful for discovery or display
- With the likelihood of RDA being implemented, will it be adopted universally?
- a wait-and-see attitude toward making preparations in technical services departments.
Catalog Management
The ALCTS CCS Catalog Management Interest Group (CMIG) met at the ALA Midwinter Meeting at the San Diego Convention Center, Room 30C, on Saturday, January 8, 2011, from 1:30 to 3:30 p.m. About 50 people participated in the program, which featured three presentations.
Batch-conversion of Non-standard Multiscript Records
Lucas (Wing Kau) Mak, Metadata and Catalog Librarian, Michigan State University
Michigan State University Libraries (MSUL) switched to SkyRiver as its cataloging utility and has been using Z39.50 search as part of its copy cataloging workflow, including using this federated search function to acquire HathiTrust records. While bringing in multiscript catalog records for non-Roman script materials through Z39.50, MSUL catalogers realized that some records do not have original script data stored in 880 fields as required by MARC 21 specifications; instead, original script data are placed alongside their romanized counterparts. Bringing in these non-standard records makes the structure of multiscript records inconsistent within the ILS and creates confusion in searching. Since manually correcting and inserting the coding (i.e., MARC tag, indicators, and values in subfield 6) of MARC fields that contain original script data is extremely labor-intensive, the metadata librarian created an XSLT to automate the process. Lucas gave an overview of the issue and focused on the logic of the XSLT and the important factors that affected its design.
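The core of such a transformation is pairing each romanized field with its parallel original-script field and moving the latter into an 880 field with reciprocal subfield 6 links. The pairing logic might be sketched as follows (in Python rather than XSLT, purely for illustration; the field representation and the `script` flag are assumptions of this sketch, not MSUL's actual code):

```python
def link_to_880(fields):
    """Pair romanized fields with parallel original-script fields.

    `fields` is a list of dicts such as {"tag": "245", "subs": "$a ..."};
    a True "script" key marks fields holding original-script data.  A
    script field immediately following a same-tag romanized field is
    rewritten as an 880, and reciprocal $6 links are added.  (Real $6
    values also carry a script identification code, omitted here.)
    """
    out, occurrence, i = [], 0, 0
    while i < len(fields):
        field = fields[i]
        nxt = fields[i + 1] if i + 1 < len(fields) else None
        if (nxt and nxt["tag"] == field["tag"]
                and nxt.get("script") and not field.get("script")):
            occurrence += 1
            n = f"{occurrence:02d}"
            # Romanized field points at its 880 counterpart ...
            out.append({"tag": field["tag"],
                        "subs": f"$6 880-{n} " + field["subs"]})
            # ... and the 880 points back at the romanized field.
            out.append({"tag": "880",
                        "subs": f"$6 {field['tag']}-{n} " + nxt["subs"]})
            i += 2
        else:
            out.append(dict(field))
            i += 1
    return out
```

Running this over a record whose 245 is duplicated in the original script yields a standard romanized 245 carrying `$6 880-01` and an 880 field carrying `$6 245-01`.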
The Journey to Single BIB
Jane Anne Carey, Metadata Resource Management Librarian, University of Florida Libraries
The State University Library System of Florida consists of 11 state-funded institutions. The Florida Center for Library Automation (FCLA) provides automation services for those libraries including the operation of Aleph, a shared Integrated Library System (ILS) created by ExLibris.
Since early 2008, subcommittees of the Technical Services Planning Committee (TSPC) have discussed the possibility of creating a shared bibliographic record architecture and sent forward a report to that effect to the Council of State University Libraries (CSUL) in August of 2008. On the strength of that report, CSUL created the Single Bibliographic Record Task Force (SBTF) to investigate the steps needed to create an effective merged ILS. In the spring of 2010, TSPC formed the Statewide Standards for MARC Records Advisory Group to draft guidelines for cataloging in a shared bibliographic environment.
In May 2010, under Phase 1.1, FCLA created a test catalog of the merged records of the three largest libraries (Florida State University, University of Florida, and University of South Florida). At each library, catalogers familiar with serials and special collections materials looked at it to gauge the results. A second merge in November opened the field to more investigators.
Besides participating in the theoretical discussions, the University of Florida has done the nitty-gritty of maintaining necessary information while correcting the legacy of successive catalogs and evolving cataloging rules.
Since 2009, the University of Florida (UF) has been working with OCLC on a reclamation project to match UF records to holdings in OCLC’s WorldCat. OCLC processed over 3,392,000 records; UF rejoiced in the 89 percent that matched, then started work on the file of over 375,000 records that came back unresolved. This large group of records was massaged, and twelve minor groups (circulation created, provisional, suppressed, etc.) were set aside, leaving 167,000 records yet to be examined.
In January 2010, Florida State University, the University of Central Florida, and the University of Florida started a clean-up project for incorrect bibliographic tags. Each university received a list of all the tags used in its catalog and the total number of records with each tag. Of these, 855 were correct by current standards or local practice, and 700 tags were corrected using a variety of methods, leaving 245 tags for investigation.
All of the work put into this project so far has led to a greater spirit of cooperation among technical service departments in solving problems and maintaining a clearer catalog for our patrons.
Reclaiming your Catalog: Benefits of Batch Reclamation
Roman S. Panchyshyn, Catalog Librarian, Assistant Professor, Kent State University
Kent State University Libraries have operated using Innovative Millennium software (KentLINK) since the 1990s as one of the founding members of OhioLINK, Ohio’s statewide library consortium. Various cataloging policies and practices over the years have led to significant discrepancies between materials held locally and holdings posted to OCLC WorldCat. In 2009, KSU Libraries planned and undertook a batch reclamation project with OCLC to clean up and reset KSU library holdings on OCLC WorldCat. The planning process and the output from the project allowed accurate synchronization of library holdings with OCLC WorldCat and identified problem areas within the local bibliographic database that needed to be addressed. Within the past 18 months the major problem areas have been addressed with the assistance of library staff, student workers, and technology. KSU now has accurate data that is shared with both the OhioLINK central catalog and with OCLC for both cataloging and resource sharing.
Roman Panchyshyn’s presentation was geared toward libraries considering a batch reclamation project. It reviewed the entire cycle of KSU’s successful batch reclamation project, covering the planning process, project setup and output, database clean-up, and ongoing maintenance. Emphasis was placed on the importance for libraries to plan and document the project thoroughly from beginning to end, to extract and deliver controlled data for OCLC processing, to locate and identify problem areas in the collection based on project output, and to organize database clean-up projects for problem data. If a batch reclamation project is carefully managed, it will benefit all involved partners.
Cataloging and Classification Research
The Cataloging & Classification Research Interest Group hosted four presentations including a summary of the ALCTS 2010 Year of Cataloging Research and reports of current research in the areas of cataloging and classification:
Cataloging and Classification Literature Review for 2009-2010, Preparation for Library Resources and Technical Services Journal: Project Update
Sue Ann Gardner, Scholarly Communications Librarian, University of Nebraska-Lincoln Libraries.
Gardner received a Carnegie-Whitney grant to review the literature for 2009-2010 in the area of cataloging and classification, which she used to purchase supplies and hire an assistant to compile citations. Her work focused on English language sources and a variety of electronic and print resources including blogs. She discovered several broad topics in the field that she used to organize the citations, including RDA, FRBR/FRAD/FRSAD, cataloging and the semantic web, interoperability, non-MARC bibliographic metadata, theory of knowledge organization, cataloging and social media, ethics, history, personnel/education, and tools/reference. Her completion date for the report, which will be published in a future issue of LRTS, is June 2011.
Joint ALA/ALCTS Resolution to Name 2010 the Year of Cataloging Research
Jimmie Lundgren, Associate Chair & Contributed Cataloging Unit Head, University of Florida
After giving a brief history of events leading to the resolution, Ms. Lundgren discussed the goals expressed by the LC Working Group on the Future of Bibliographic Control, which were: “Work to develop a stronger, more rigorous culture of formal evaluation, critique and validation, and build a cumulative research agenda and evidence base. Encourage, highlight, reward and share best research practice and results.” Two goals remain unmet at the end of 2010: to build a cumulative research agenda and solid evidence base to support decisions for future cataloging. The group discussed briefly how this challenge might be addressed.
The Impact of Subject Headings on Electronic Thesis and Dissertation Downloads at Oregon State University
Richard Sapon-White, Head of Cataloging at OSU
Since Library of Congress subject heading assignment is time intensive, OSU decided to test whether LCSH headings increased access to electronic theses and dissertations (ETDs) in its institutional repository. Previous research on paper theses in the catalog indicated a 58 percent increase in circulation for records with LCSH compared to those without. Mr. Sapon-White used DSpace statistics by city and downloads per day to determine usage. Preliminary results indicate no difference in access to ETDs with or without subject headings in their records.
Are We Laying Bricks or Building Cathedrals? A Study of the Perception of Cataloging Quality Among Academic Catalogers
Karen Snow, Ph.D. Candidate, Dept. of Library & Information Sciences, University of North Texas
Snow presented preliminary results from a survey conducted to gather data for her dissertation. Her research included questions such as the top ten most important MARC fields and the top ten record quality attributes, as well as definitions of record quality. She discovered four major categories of quality: the technical details of the bibliographic record, adherence to standards, the process of cataloging and the staff involved, and the impact upon users or accessibility. There was a wide variance in perceptions of quality.
Questions were encouraged after each presentation and there was much audience participation and interest in the topics. The meeting adjourned at noon.
Cataloging Norms
The Cataloging Norms Interest Group is traditionally a gathering of speakers open to any interested audience. We had a successful program of four speakers, a larger audience (130 people) than last Midwinter, and used our full time, ending with the question-and-answer section. After the meeting, the co-chairs and vice co-chairs met to discuss the program and were pleased with the discussion and turnout. The combination of vendors and practitioners worked well. We also discussed possibilities for our annual program. One of our major concerns is encouraging more applications from prospective speakers, and that is something we will discuss further before we put out the call for speakers in the spring. We understand that people have less money for travel and may not be coming to Midwinter, but we need to work harder to attract speakers who will be willing to travel, especially for Midwinter.
VTLS's RDA Implementation Scenario One (FRBR)
John Espley, VTLS, Inc.
Insights and processes from VTLS's 8 years of experience with FRBR and RDA.
SkyRiver's Approach to the RDA Test Period
Lynne Branch Brown (Vice President of Operations, SkyRiver)
SkyRiver is taking a different approach to the RDA test period by allowing variant versions of bibliographic records in the catalog, so that libraries have access to the form they need in their local catalog when looking for records. We have built-in mechanisms to identify RDA records when they are contributed to our database, which allow them to remain separate from their AACR2 equivalents.
The Impact of RDA Authority Records on a Non-RDA Test Library
Miloche Kottman (Assistant Head of Cataloging, University of Kansas Libraries)
The University of Kansas Libraries has close to 10 million authority records comprised of the entire LC name and subject authority files, as well as locally created authority headings in both files. We are not an RDA test library but we are loading RDA authority and bib records into our Voyager system. Based on our weekly load statistics, I will provide information on how many RDA authority records have been created and how many AACR2 authority records have been modified to add the RDA form as a 7xx field since Oct. 1. I'll also share our authority maintenance experience during the test period and our plans to handle bib record changes in the event of the adoption or rejection of RDA.
RDA testing on Non-MARC Metadata Standards
Myung-Ja "MJ" Han, Metadata Librarian, Assistant Professor of Library Administration, University of Illinois at Urbana-Champaign; Melanie Wacker, Metadata Coordinator, Original Serial & Monograph Cataloging, Butler Library, Columbia University Libraries; and Judith Dartt, Digitization Manager, Special Collections Research Center, University of Chicago Library
The Columbia University Library and the University of Illinois at Urbana-Champaign Library participated in RDA testing on metadata standards other than MARC. Since the RDA test focused on MARC, the test workflows for the non-MARC standards were developed by each individual library. The presentation will share the non-MARC testing experience from both libraries, specifically working with MODS and Dublin Core, including the test set-ups and the issues encountered when applying RDA to MODS and the Dublin Core element set.
Copy Cataloging
The group met on January 8 at the San Diego Convention Center with 72 people in attendance. The meeting featured four presentations.
The first two presentations were reports from the Library of Congress.
Angela Kinney, Chief of the African, Latin American & Western European Division, delivered her semi-annual report on the status of copy cataloging at the Library of Congress. Her report included statistics on the number of copy cataloging records produced by staff of the Acquisitions & Bibliographic Access Directorate and the Collections & Services Directorate in 2010. She also discussed methods the Library of Congress used in 2010 to ensure that copyright receipts, which will impact copy cataloging statistics in fiscal year 2011, are more readily available to users.
Beacher Wiggins, Director for Acquisitions & Bibliographic Access at the Library of Congress, gave an update on the U.S. RDA test and its implications for copy cataloging. The U.S. RDA test concluded on December 31, 2010. Over 6,000 RDA records were created during the test. The records are being analyzed to help the Library of Congress, the National Agricultural Library, and the National Library of Medicine determine whether they will implement RDA to replace the current descriptive cataloging code, the Anglo-American Cataloguing Rules.
The third presentation was given by Anne Elguindi, Director of Information Delivery Services, and Alayne Mundt Sandler, Metadata Librarian, both of American University. The title of their presentation was “Managing the Myth of Shelf-Ready: Creating a Tiered Workflow for Bibliographic Records.” The reality of using shelf-ready vendors to process materials is that the process still requires management and quality control. Automating record selection relieves technical services personnel of searching for and loading a record for each book coming into the library, but the absence of any human interaction with the records can lead to a proliferation of unresolved errors within the catalog, and to books that are effectively “lost” from the time they enter the library due to title or call number issues.
American University has successfully developed a tiered workflow that keeps the human element in the process but reserves trained Cataloging Specialists for records that have potential problems. Books are divided into three categories:
- Books that do not pass the “Copy Cataloging” checklist and need to be routed to the Cataloging Department for additional work.
- Books that can bypass the Cataloging Department but need additional processing, such as new spine labels
- Books that can bypass the Cataloging Department and need no additional processing, which can go straight to the Circulation Department to be shelved
The fourth presentation was given by Nancy Chaffin Hunter, Metadata Librarian at Colorado State University. The title of her presentation was “We Can Do It!” Colorado State University Libraries has been creating metadata for digital collections for over a decade. This presentation shared how CSU Libraries structure their metadata; train their copy catalogers, archival students, and other staff to create metadata; and repurpose existing metadata from finding aids and MARC records.
Heads of Cataloging Departments
The Heads of Cataloging Interest Group hosted a presentation entitled “Test Driving RDA” on Monday, January 10, 2011, at the San Diego Convention Center. Linda Smith Griffin, Louisiana State University (Chair), and Christopher Cronin, University of Chicago (Vice-Chair/Chair-Elect), welcomed nearly 100 attendees.
Guest speakers: Kate Harcourt, Director, Original and Special Materials Cataloging, Columbia University Libraries; Robert Maxwell, Head, Special Collections and Formats Catalog Department, Brigham Young University; Sarah Quimby, Manager, Library Processing, Minnesota Historical Society; and Erin Stalberg, Head, Metadata and Cataloging, North Carolina State University. Linda Smith Griffin served as moderator. Christopher Cronin served as recorder.
Each presenter's institution was a participant in the official RDA test that ended on December 31, 2010. Each presenter was asked to share his or her library's experience by responding to:
- Challenges Implementing RDA
- Effects on Systems (hardware/software)
- Impact on Users
- Implications for Cataloging Managers
Sarah Quimby began with an overview of the RDA test. She did a quick poll of the audience—the vast majority had already been to at least one RDA session/presentation during Midwinter; there were a number of other formal and informal testers in the audience. Most attendees were from larger library organizations with more than five catalogers.
- Sarah’s institution used an “immersion approach” to the testing because they had only one month of training and practice prior to beginning testing. They budgeted two hours per week for training.
- Thanked her staff for their hard work.
Kate Harcourt stated that one of the biggest benefits of being a tester is having a head start on the transition. She found it energizing to work with colleagues from across the country, to be part of a national initiative, and to take the first leap into the Semantic Web and linked data. She thanked Columbia’s staff for their excellent work.
- Kate presented an overview of the decentralized cataloging environment at Columbia. Explained that they did not do a full-scale implementation; took the philosophy that it was a test. But there was a momentum that made other staff want to be a part of the test as well.
- Melanie Wacker at Columbia led the non-MARC (DC, EAD, MODS) portion of the test.
- The testing team met weekly and was largely self-taught. They used LC’s training videos, presentations from Adam Schiff, and webinars from Troy Linker on the Toolkit.
- They created many practice records, which helped them learn successfully.
- Didn’t write many local documents, except to bring together information on NACO requirements. Columbia is historically a strong PCC contributor; it took a while to understand why RDA was so much more complex than what they were used to with PCC core standards. But they also felt that knowledge would be successfully acquired over time.
- They felt challenged by non-MARC production largely because they had to make it all up for themselves without a lot of help or direction. They felt DC was too simple for RDA.
Erin Stalberg covered the demographics of the test participants and the centralized nature of NCSU's cataloging operations. NCSU is not a member of PCC, but generally follows PCC guidelines. She outlined the multiple systems and data standards in use and concluded that RDA was really just another standard to add to an already long, historical, and complex list.
- NCSU used the testing as an opportunity to revive a local training program.
- NCSU was interested in analyzing the cost and value of implementing RDA.
- Involved all staff and there was a lot of interest in staff who were in library school.
- They took the approach of “could we apply what was written in RDA,” so they also didn’t create a lot of local documentation. They are a heavily sci-tech institution, and do not usually catalog many of the categories of resources that seem to present complex issues related to RDA/FRBR.
Bob Maxwell stated that participation at BYU was voluntary but encouraged. Almost everyone involved in cataloging attended the training, and approximately two-thirds contributed to the test.
- They used institution records heavily to make RDA records from existing AACR2 copy.
- Thanked Judy Kuhagen at the Library of Congress for her efforts in communicating errors so they could identify training needs.
- Recommended that audience members who have not started RDA simply let their staff begin learning and practicing. BYU wanted to inculcate a culture in which staff do not have to do everything the same way and it is okay to make mistakes at first. This approach contributed to an unthreatening environment with a minimum of fear about what errors they might make.
Challenges with Implementing RDA
- Kate said that challenges included worrying about a slowdown in production levels; the need to think differently about creating linked data for the Semantic Web. Staff liked expressing relationships, and the use of controlled vocabularies in RDA.
- Sarah mentioned her staff’s anxiety about making “mistakes” and her attempts to quell them.
- Erin said that NCSU may need to revisit FRBR training. There were also problems with the Provider-Neutral standard not being compliant with RDA. They also debated whether they should be adding codes in 33X$b. Cataloger’s judgment is good in theory, but difficult to apply in practice. Staff longed for more RDA examples, of both fields and whole records in MARC.
- Bob reiterated the incompatibility of RDA with Provider-Neutral. The Toolkit was very slow, staff had trouble finding things, and they had firewall issues, which was a barrier especially during the pre-testing practice period.
Effects on Systems Hardware/Software
- Erin said that NCSU uses SirsiDynix. The lack of a GMD was seen as a barrier for catalogers looking at browse lists; they may use the 33X fields for this.
- Kate said that Columbia uses Voyager, and they had no trouble getting new RDA fields in the system. They are not yet displaying the 33Xs; they also use the GMD for some browsing.
- The Toolkit was a major problem and unfairly spilled over into people’s impressions of RDA.
- May need to revisit FRBR training in order to use the organization of RDA and the Toolkit itself.
- Somewhat concerned about potential overhead on keeping workflows up-to-date.
- Sarah explained that MHS uses a consortial catalog, so they are not importing RDA records right now.
- There were some issues with the Toolkit, and they ended up abandoning workflows in favor of local procedural documentation.
- Bob simply mentioned to the audience that even if they aren’t implementing/testing RDA actively, the sheer fact that RDA copy is being made available in OCLC necessitates that they ready their systems and staff on what to do with those records.
Impact on Users
- Sarah mentioned that her staff had positive reactions to the relationship designators and the spelled out abbreviations. They ultimately felt RDA records would be easier for users to read, but they did not like the 33X fields.
- Kate said that Columbia has an internal group that managed user testing. They provided that group with five record sets (each pairing one RDA record with one AACR2 record) for comparison.
- Reference librarians preferred RDA records over AACR2 records. They loved the spelled-out abbreviations; some liked not having GMDs while others missed them. They felt relator terms were confusing in the way they hung off access points.
- Erin said NCSU’s experience was similar. They have considered suppressing the display of GMD for all records in the catalog.
- Bob said that staff who looked at records felt the differences between RDA and AACR2 were largely cosmetic.
- On authority records, Bob stated that his staff felt a responsibility to add 7XX fields to demonstrate they had evaluated the RDA forms of access points. His staff liked the 37X fields, even if that work won’t be reflected in our catalogs right now. They felt the value added to authority records was worth more than the extra work.
Implications for Cataloging Managers
- Sarah offered the following advice:
- Read FRBR as introduction to RDA;
- “Just do it”—start practicing;
- Don’t panic—first few records will take a long time, and that’s okay;
- Read slowly—language of RDA can be confusing and it is easy to get tripped up;
- Don’t sweat the small stuff that doesn’t affect searching, like punctuation.
- Kate stated that the process of testing turned into a process of evaluating how to manage a mixed metadata environment, even though they thought it would be more about assessing cost and value.
- Reiterated the challenge of applying cataloger's judgment
- Stated that because they had a portion of staff involved in the testing, they are worried about other staff having to catch up; they may feel marginalized.
- Wondered what we are doing about relator codes/relationship designators: how can using them be effective when only a fraction of records in our databases will have them?
- Erin said she felt testing was almost more about testing the culture than the rules.
- Staff felt uncomfortable making local decision without knowing what LC will do.
- People like examples, and to know what expectations there are of them. They want to know that cataloging managers will have answers to their questions, while cataloging managers want LC to guide those answers.
- Catalogers need to know about and think about user tasks to approach their work successfully, and to contextualize their decisions.
- Every institution is responsible for influencing the future of metadata.
- Bob stated that during the implementation phase, staff needed to feel okay with making mistakes; it’s going to happen. BYU has always had a second pair of eyes look at NACO records, and may do the same for bibliographic records going forward. BYU will be re-evaluating local policies in light of RDA. They need to make some decisions on handling copy. They may need to renegotiate contracts with authorities vendors (they have not sent them RDA authorities yet). He wondered whether RDA will affect OCLC fees/credits, and whether shelf-ready prices will increase.
- Did testers also give their staff FRAD training?
- Bob: talked about concepts, but not in FRAD terms.
- Erin: NCSU looked at FRAD user tasks.
- What advice do you have for people getting started?
- Focus on your institution’s collection, don’t get overwhelmed, just start reading RDA, and help each other.
- Will you continue with RDA after the testing?
- Bob: yes.
- Erin: yes, though as a copy-heavy institution, they would likely not continue after June if LC decides it won’t.
- Kate: no, largely because they didn’t train everyone. But regardless of the national decision, they will still need to equip staff to deal with copy in OCLC.
- Sarah: same as Kate. Will follow what LC does.
- What advice do you have for library science educators on changing their curriculum?
- Bob: catalogers will probably need to know how to catalog in both AACR2 and RDA for a while.
- Erin: she teaches cataloging, and her students feel they need to know about RDA, largely because they expect marketplace pressure to know RDA after graduation. Some have said they expect to be “change agents” for the organizations that hire them out of library school.
- Kate: the mixed metadata environment will mean that new staff will be different and will need different training because they may not know AACR2 after the curriculum changes.
- What advice do you have for places where there is no enthusiasm to make the shift?
- Sarah: introduce a system of rewards for staff, and do not tie RDA accuracy to performance.
- Bob: mentioned he has felt a groundswell of support for RDA at the Midwinter Meeting; that momentum needs to continue in order to make RDA less of an unknown.
- Erin: ride the momentum of the testers. Focus on the benefits of adoption.
- Kate: it will be easier for non-testers, who can learn from the experiences of the testers.
Collection Management Section
Collection Development Issues for the Practitioner
We identified E-book publishing trends as a major issue and discussed purchasing options from a diverse range of publishers, vendors, and aggregators. Access options for E-books were also explored. A new Twitter account established by the chair, @CMDSIP, was introduced as a way to follow the discussion.
Collection Development Librarians of Academic Libraries
We reviewed the structural changes to the section, which will be renamed the Collection Management Section as of July 1, 2011. Our name change reflects the overall umbrella concept that management activities are what bring us together. This committee will become the Section Planning Committee.
One of our program speakers at Annual Conference, Dawn Peters, expressed an interest in doing a webinar for ALCTS. Her name was sent to Andrea Wirth, chair of the Education Committee.
We spent most of the meeting discussing the coordination of contacts and communication with the section’s interest groups. This involves Dale Swensen, Brigham Young U. Law Library, who is the ALCTS Interest Group coordinator. The challenge will be to work out an approach that neither duplicates Dale’s work nor taxes the IGs too much. The section currently has six IGs, and two new petitions to form IGs were subsequently submitted to the CMDS Executive Committee. The committee welcomed Li Ma, who is the section’s liaison to the ALCTS Planning Committee.
Collection Management in Public Libraries
A variety of topics were discussed, but most of the conversation centered on financial implications of eBooks and other downloadable media. This meeting is frequently described by participants as the most rewarding activity they attend at ALA. Several people told me that they stay until Monday evening so that they won't miss it.
Continuing Resources Section
Access to Continuing Resources
Three speakers addressed the topic Expanding and Understanding Access Options: From Open Access to Patron Driven to Article Rental. The presentations were followed by a brief Q&A session.
Bob Schatz (BioMed Central and SpringerOpen) spoke about the history of Open Access. The number of publishers in the space has grown and topics are moving beyond biomedical and life sciences. The Directory of Open Access Journals now lists more than 6000 titles with more than 500,000 articles. Libraries and university research offices are still finding their way in this new landscape.
Kari Paulson (EBL) detailed the numerous projects underway in Patron Driven Acquisitions (also called Demand Driven Acquisition), including the growing amount of data now available on patterns of demand. Such initiatives allow libraries to put their budgets toward titles that actually get used. Libraries are still involved in the selection process; the only change is the point at which the purchase takes place. Recently, libraries have begun incorporating print on demand into the process through use of the Espresso Book Machine.
William Park (DeepDyve) drew attendees’ attention to segments of users (unaffiliated users, knowledge workers, alumni) whose needs are not being met through journal subscriptions and who may well be best served by article rental models. Such models offer restricted access for good value. Reader surveys reveal patterns of research and reading habits with a definite price sensitivity. Cloud-based access has provided the opportunity for a wealth of behavioral data around article use.
Q&A discussion included questions about ownership versus leasing, interlibrary loan rights, and reference management in systems where users can only read articles.
College and Research Libraries
The College and Research Libraries Interest Group met on January 9 from 10:30 a.m. to noon in the Sapphire P Room of the Hilton San Diego Bayfront hotel. The first presentation, "Patron Knows Best: Why We Should Put the Patron in the Driver's Seat," was given by Rich Anderson, Associate Director for Scholarly Resources and Collections. He said that as librarians, we have always been very good at building high-quality collections, but much less good at accurately predicting which books patrons would actually want and use. The current information environment makes it possible to expose books to patrons before we buy them, and to buy them only if they are actually used. Is this the end of the traditional library collection? Do patrons really know best? And how do you control the speed and direction of the car if you let the patron drive?
The second presentation, "Assessing Return on Investment for E-Resources: A Cross-Institutional Analysis of Cost-Per-Use Data," was given by Patrick L. Carr, Head, Electronic & Continuing Resources Acquisitions, East Carolina University. Patrick spoke about how libraries often rely on cost-per-use (CPU) data to measure the return on investment for their e-resource subscriptions. By comparing CPU data supplied by several libraries, the presentation provided added context for CPU-based assessments. It explored what a cross-institutional CPU analysis reveals about libraries' varying returns on their subscriptions, and considered the potential such an analysis has to increase returns on investment.
The third presentation, "Research Databases in a Mobile Computing Environment," was given by Ya Wang, Electronic Collections Coordinator, San Francisco State University. Ya Wang talked about how smartphones are becoming more and more common on campus for learning and social networking. Libraries are building mobile web sites for their resources and services, and database vendors such as EBSCO, PubMed, and IEEE have also started to provide mobile search interfaces or applications for their patrons. The San Francisco State University Library went mobile with its research databases using Xerxes, an interface application that connects to Metalib, WorldCat, and a growing number of other discovery systems via an evolving plug-in architecture. The presentation gave an overview of research databases in a mobile computing environment, with live demos.
Preservation and Reformatting Section
Book and Paper
The meeting was run as an informal discussion about how to use web technology to promote book and paper interests within institutions and groups. An initial question to the audience revealed that only two of the thirty people present currently maintained a blog in their institutional settings.
- Reported difficulties working with blogs included: limited institutional support (blogs hidden away on main library pages); time-consuming to create post text to go with project photos; some instances of intrusive, negative comments on posts from fellow university staff; most comments on blogs are not posted to blogs, but come through direct email.
- Reported benefits from blogs included: increased PR for department, especially through photo requests from in-house PR or outside media; ability to show what’s happening in the lab to fellow colleagues.
- Suggestions for best ways to incorporate blogs into workflow include: have all staff members take turns writing posts, which can be stored for future release; involve PR or development staff from the start to discuss content format that can be easily included in their publications/media packets; have blog serve entire library audience, with preservation alternating posts with all other library departments to underscore integrity of library services as a whole; many existing PowerPoint presentations, such as concerning Preservation Week, could be repackaged as on-going blog posts.
Audience members were split on using Facebook and Twitter in conjunction with blogs, many seeing blogs as too time-consuming. Facebook was mentioned for posting announcements and photos with short captions, and Twitter for announcing “this day in history” quick updates. Concern was raised about whom to accept as a Facebook friend, and whether it was beneficial to have a preservation Facebook account separate from the library’s account. YouTube was mentioned in only one instance, documenting a large freezer installation. A desire was also expressed for a centralized preservation blogosphere that could link together all the disparate preservation blogs; the Preservation and Conservation Administration News blog (PCAN) is further developing its capacity to serve that purpose.
An informal presentation of the philanthropic model ‘Donors Choose’ was offered by Co-Chair Laura Bedford. ‘Donors Choose’ is a web-based platform for directed philanthropic giving to K-12 public schools, specifically to fund classroom projects. Posted projects include specific short-term goals, a description of the target audience that would directly benefit from the gift, and a line-item budget detailing all project expenses. Donors can choose to fund any portion of any project, with gifts as small as $1 or in-kind donations; a running tally indicates how many donors currently support any given project and how much remains to be raised for the project to run. The model provides transparency and choice for donors: patrons can see exactly what their donations will purchase and can select projects by the criteria that best fit their personal interests, such as class size, county poverty level, school subject, or re-usability of the materials to be purchased. The idea behind discussing the ‘Donors Choose’ concept with BPIG was to suggest using a web presence to highlight preservation projects that may not be “grant-worthy,” whether because of their small size or their area of interest, but that could use outside stimulus to be completed. It could also encourage a blog’s audience to move beyond engagement through commenting on projects to actively supporting or becoming involved in projects through donations or volunteerism.
The BPIG discussion continued with the question of whether institutions were being encouraged to do more outside conservation/preservation work for outlying communities. Responses included work undertaken for faculty members for good lab PR purposes, with payment going into endowments. But concerns were raised over potential insurance complications, state university mandates to serve state inhabitants exclusively, and already stressed, overloaded work schedules that could not accommodate pro bono work.
An announcement was made regarding the Future of the Book Conference at the University of Iowa in September 2011.
A call for topic ideas and speakers solicited several ideas, including books on demand, the economics of bookstores, self-publishing, the future of book publishers, and book arts and the art of the book. Ideas for the 2011 ALA Annual Conference were discussed, including revisiting past topics. The suggested topics were:
- Inviting Whitney Baker and Liz Dube to discuss their recent LRTS article Identifying Standard Practices in Research Library Book Conservation.
- Building and/or remodeling lab space—experience and results.
- Moving into new lab space.
- Health and Safety concerns—reviews once lab space built or any new developments.
- Sustainability of resource use.
The IG meeting consisted of two presentations and concluded with announcements and discussion of future program topics. Peter Murray, Assistant Director for Technology Services Development at Lyrasis, presented on digital preservation storage options. He shared information on storage of digital assets (on-line, near-line, and off-line storage) and on preservation services relating to assets already in storage. He specifically discussed the service options and costs of Amazon S3, Iron Mountain, LOCKSS private networks, Chronopolis, DuraCloud, OCLC Digital Archive, and DAITSS. For a full description of this presentation, including slides and references, see Peter’s blog post from January 9, 2011 at http://dltj.org/article/preservation-storage-options/
David Walls, preservation librarian at the U.S. Government Printing Office, provided an overview of the new Federal Digital System (FDsys), which was launched in December 2010. The system features an advanced search engine and serves as both a content management system and a preservation repository. Walls also spoke about the internal TRAC audit that was conducted and about plans for an external audit once funding is secured.
Announcements: Liz Bishoff announced the NEH Division of Preservation and Access grant opportunity for smaller institutions. The Preservation Assistance Grant helps small and mid-sized cultural heritage institutions, such as libraries, colleges, and universities, improve their ability to preserve and care for their humanities collections. These grants also support education and training in best practices for sustaining digital collections, standards for digital preservation, and the care and handling of collections during digitization. Institutions may request funds for a preservation assessment of digital collections. Grant awards are up to $6,000; see the NEH web site for more information. NEH does not fund digitization or the development of digital programs in this grant category.
Tom Clareson reported on the upcoming Lyrasis workshop, Staying on TRAC: Digital Preservation Implications and Solutions for Cultural Heritage Institutions. The workshop will consist of webinars and in-person components. See the Lyrasis site for more information.
Future Topics: Several topics were offered for future discussion, including archiving email (e.g., the National Historical Publications and Records Commission or North Carolina state government) and hearing more from storage-option vendors such as Amazon and the others mentioned in the presentations. It was also suggested that we share our IG topics with LITA and consider holding joint meetings.
Annual 2011 (New Orleans) Programs
The following programs are moving forward:
- “Planning for the Worst: Disaster Preparedness and Response in High-Density Storage Facilities”—Chair, Jennifer Hain Teper [Saturday, 6/25/11 in two parts: “High Density Storage” from 8-10am; “Local New Orleans Perspective”, 10-noon]
- “Preservation Film Festival”—Chair, Elizabeth Walters [Saturday, 6/25/11, 4-5:30pm]
- “MARC 583 Field for Conservation Actions”—Chair, Steven Riel [Sunday, 6/26/11, 1:30-3:30pm]
- PARS Forum: “Preservation of Modern Digitally Printed Materials”—presenter, Daniel Burge, IPI [Sunday, 6/26/2011 4-5:30pm]
Publications in Process
Two publications (most likely web publications) are in process: Planning and Construction of Book and Paper Conservation Laboratories and an updated version of the Preservation Education Directory. The group also reviewed continuing education offerings related to preservation including e-forums, webinars, and web courses; brainstormed ideas for programs for Annual 2012; and discussed how to develop a stronger community for digital preservation within ALA/ALCTS.