Interest Groups Report on Conference Activities in Washington, D.C.

The reports below are summaries of the activities that took place during meetings of ALCTS interest groups held during the 2010 Annual Conference in Washington, D.C. Included are groups whose reports were received by the editor as of August 1, 2010. For information on interest groups not shown here, see the ALCTS Committees page on the web site, or see the ALA Handbook of Organization.


Division Groups

Creative Ideas in Technical Services IG

The chair and co-chair welcomed attendees, who chose topic-based discussion tables as they entered the room. The topics were based on suggestions collected from evaluation forms submitted after the group's meeting at the 2010 ALA Midwinter Meeting. The chair provided questions to start each discussion. The topics that were discussed follow.

Free like a Beer? Or Free like a Puppy? Open Source Tools for Technical Services

Themes from the discussion included available tools and how libraries may be limited by the staff members’ skills, whether librarians and MLS candidates should be studying more technologies, and how to know what questions to ask when new tools are presented.

When I Was Your Age: Thriving in the Multigenerational Workplace

Themes from the discussion included the influences of age and personality on the work environment, the role of supervisors in modeling professional relationships, and, overwhelmingly, the benefits of working with individuals from other generations.

Make Room for Metadata: Moving beyond MARC for Digital Collections

Themes from the discussion included an attempt to define metadata, metadata education in library schools, and the need to share our data and standards with other data-producing communities.

I Am Technical Services: Succeeding in a One- (Wo)man Shop

Themes from the discussion included the role of catalogers in technical and public services, the importance of learning to prioritize, and the difficulty of keeping up with changes in the profession and associated technologies.

That's Not in My Job Description: Encouraging Flexibility in Times of Changing Technologies and Guidelines

Themes from the discussion included the effects of budget cuts, how to deal with increasingly electronic formats, and the importance of maintaining a positive attitude.

Volunteers at each table facilitated and recorded the discussions. Attendees were strongly encouraged to use the provided discussion questions as a starting point and to take the conversations in any direction that would be most useful to them.

The chair and vice-chair circulated among the tables, participated in the discussions, and gathered ideas for future topics. At the end of the session, recorders from each table gave a brief presentation to the full group, summarizing the table's conversation.

Guanxian (Tony) Fang, University of Minnesota Libraries, is the incoming chair. Libbie Crawford, Product Manager for Dewey Decimal Classification, was elected vice-chair/chair-elect.

Electronic Resources IG

Amira Aaron opened the meeting. Christine Turner ran the election for a new vice-chair and asked for topics for upcoming meetings.

Six panelists each gave an informative and practical overview of one aspect of the main theme, usage statistics. As with other e-resource management tasks, the retrieval, maintenance, and analysis of usage statistics are complex and not completely accurate. Even with standards in place, publishers and vendors do not necessarily follow them, and multiple interpretations and practices still abound. E-book statistics seem to be an especially problematic area.

The panel presentations were followed by an active question-and-answer period. The agenda for the meeting follows.

Down for the Count: Making the Case for E-Resource Usage Statistics

There Are So Many Numbers...

Nadia J. Lalla, Coordinator, Collections and Information Services, Taubman Health Sciences Library, University of Michigan

Most e-resource vendors supply usage statistics by request, on demand, or via an automated notification process. These data can provide information that is potentially useful when evaluating an e-resource. In practice, collection development professionals juggle too many data-filled spreadsheets from a variety of vendors. Minimally, these statistics are used in retention and cancellation decisions, e.g., how often a journal is browsed versus how often its articles are downloaded. With so much data available, it is convenient to focus on the easily understood statistics, such as the number of database searches. Mining the data would enable better-informed decisions. What is the useful data, and how is it extracted from the superfluous noise? Why are turnaways more than just the number of users who cannot access a resource? When is a search not really a search? Who is counting what, and why? Usage statistics are undeniably a valuable collection management assessment tool. Make them an effective tool that works for you.

Using E-Book Statistics to Inform the Acquisition and Weeding of Print and E-Books

Doralyn Rossmann, Collection Development Librarian and Team Leader, Montana State University Libraries

As the e-book market continues to grow in the number of titles offered, usage statistics play an increasingly important role in the disposition of monograph budgets. Rossmann explained why comparing e-book usage statistics and pricing models to their print counterparts is like comparing apples to oranges. Reporting these statistics in annual reports, surveys, and accreditation documents can be challenging (for example, buying 1,000 e-books for $10,000 versus buying 500 print books for $10,000 can really skew statistics when reporting "price paid per book"). Since many e-books provide options for simultaneous users, new data on demand by subject area and publisher can be gleaned.
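The "price paid per book" skew Rossmann mentions is simple arithmetic; a minimal sketch using the hypothetical figures from the example above (all numbers illustrative, not actual acquisition data):

```python
def price_per_book(total_cost: float, titles: int) -> float:
    """Average price paid per book for a single purchase."""
    return total_cost / titles

ebook_avg = price_per_book(10_000, 1_000)  # 1,000 e-books for $10,000
print_avg = price_per_book(10_000, 500)    # 500 print books for $10,000

# Reporting a single blended "price paid per book" hides the difference
# between the two formats entirely:
blended = price_per_book(10_000 + 10_000, 1_000 + 500)

print(ebook_avg, print_avg, round(blended, 2))  # 10.0 20.0 13.33
```

The blended figure reads as neither the e-book nor the print reality, which is why format-level breakdowns matter in annual reports.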

The Problems with Use Statistics for Electronic Books

Leslie Czechowski, Assistant Director, Collections and Technical Services, Health Sciences Library System, University of Pittsburgh

Although the COUNTER Code of Practice for Books and Reference Books was published in March 2006, there is little evidence that vendors who provide electronic books in the health sciences have adopted the standard. Vendors provide use statistics in Word documents, HTML pages, and occasionally in Excel spreadsheets. In addition, vendors count usage in different ways. It is therefore difficult to compare use between platforms to make valid analyses of use and cost-per-use. In our tight budgetary environment, we need to be able to compare use and cost-per-use, but are hampered by the lack of consistency in reporting use.

Counts within Context: A Tempered Approach to Use Statistics

Monica Metz-Wiseman, Coordinator of Electronic Collections, University of South Florida

With a $6 million budget for online resources representing 78 percent of the materials budget, the University of South Florida is attempting to control annual increases for online resources by acquiring less print. But a line has been drawn in the sand by library administration: if additional funding is not forthcoming from the Provost and President in 2011, the electronic resources budget will see significant reductions. Metz-Wiseman discussed the role use statistics play in the evaluation of online resources, in the context of cancellations, through the eyes of administrators.

Figuring Cost Per Use: Fiscal Year, Calendar Year, and What Falls between

Tansy Matthews, Associate Director, Virtual Library of Virginia

Cost per use generally needs to be reported on a fiscal or calendar year basis, but vendor contracts do not always run on the same schedule. The Virtual Library of Virginia (VIVA) has been working on calculating flexible cost per use data that can be reported for any time frame needed. The cost-per-use analysis is made possible by innovative means of data collection. This allows extensive data collection and analysis to be performed with minimal time and effort.
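VIVA's actual collection method is not detailed here, so the following is only one plausible way to compute a flexible cost per use: prorate a contract's cost into an arbitrary reporting window, then divide by the usage recorded in that window. All names and figures are illustrative.

```python
from datetime import date

def overlap_days(a_start, a_end, b_start, b_end):
    """Days in the intersection of two inclusive date ranges (0 if disjoint)."""
    start = max(a_start, b_start)
    end = min(a_end, b_end)
    return max((end - start).days + 1, 0)

def cost_per_use(contract_start, contract_end, contract_cost,
                 monthly_usage, report_start, report_end):
    """Prorate a contract's cost into an arbitrary reporting window and
    divide by the usage in that window.
    monthly_usage maps (year, month) -> use count."""
    contract_days = (contract_end - contract_start).days + 1
    shared = overlap_days(contract_start, contract_end, report_start, report_end)
    prorated_cost = contract_cost * shared / contract_days
    uses = sum(n for (y, m), n in monthly_usage.items()
               if report_start <= date(y, m, 1) <= report_end)
    return prorated_cost / uses if uses else float("inf")

# A July-June contract reported against the first half of calendar 2010:
cpu = cost_per_use(date(2009, 7, 1), date(2010, 6, 30), 3650.0,
                   {(2010, 1): 100, (2010, 3): 81},
                   date(2010, 1, 1), date(2010, 6, 30))  # -> 10.0
```

Because both cost and usage are sliced to the same window, the same data can answer fiscal-year, calendar-year, or ad hoc reporting requests.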


Bob McQuillan, Senior Product Manager, Innovative Interfaces and Member, NISO SUSHI Standing Committee

The SUSHI standard defines an automated request and response model for harvesting electronic resource usage data using a web services framework. It is intended to replace the time-consuming, user-mediated collection of usage data reports. The presentation provided an overview of the SUSHI standard, recent updates, and the current objectives of the NISO SUSHI Standing Committee.
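As a rough illustration of the request side of that model, the sketch below builds a minimal report request. The element names loosely follow the SUSHI schema (ReportRequest, Requestor, CustomerReference, ReportDefinition), but this is an assumption-laden simplification: a real client wraps the payload in a SOAP envelope and posts it to the provider's WSDL-defined endpoint.

```python
# Simplified sketch only -- omits the SOAP envelope, namespaces, and
# authentication a real SUSHI client would need.
import xml.etree.ElementTree as ET

def build_report_request(requestor_id, customer_id, report, release, begin, end):
    """Serialize a minimal SUSHI-style report request as XML."""
    req = ET.Element("ReportRequest")
    ET.SubElement(ET.SubElement(req, "Requestor"), "ID").text = requestor_id
    ET.SubElement(ET.SubElement(req, "CustomerReference"), "ID").text = customer_id
    rd = ET.SubElement(req, "ReportDefinition", Name=report, Release=release)
    rng = ET.SubElement(ET.SubElement(rd, "Filters"), "UsageDateRange")
    ET.SubElement(rng, "Begin").text = begin
    ET.SubElement(rng, "End").text = end
    return ET.tostring(req, encoding="unicode")

# Request a hypothetical COUNTER journal report for the first half of 2010:
request_xml = build_report_request("lib-001", "cust-42", "JR1", "4",
                                   "2010-01-01", "2010-06-30")
```

The point of the standard is exactly this: once the request is machine-generated, a scheduler can harvest every vendor's reports without staff logging into a dozen administrative sites.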


FRBR IG

The FRBR Interest Group brought together three speakers to discuss their ongoing research on the implementation of FRBR and FRAD.

Jenn Riley, Indiana University – Bloomington, discussed IU’s Variations/FRBR project and their approach to implementing FRBR and FRAD in XML. The project team has developed a set of XML schemas structured in three levels. The first level expresses the entities, attributes, and relationships described in the FRBR and FRAD reports, adhering as closely to the reports as possible. The second level extends the first with additional attributes that refine the basic elements, making this level more useful as production data. The third level refines the first two levels specifically for the music domain. Music-specific elements are added, and elements not needed in the description of music materials are removed. The schemas are available on the project web site.

Yin Zhang, Kent State University, presented some of the findings Athena Salaba and she have produced in their research on FRBR work-sets and user tasks. Running a modified version of the OCLC work-set algorithm on LC collection records, they conducted a detailed statistical analysis of the work-sets produced. They are currently developing an algorithm to group expressions and manifestations automatically, by mapping MARC codes to FRBR user tasks for each entity. More information about their project is available.
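OCLC's published work-set approach keys bibliographic records on a normalized author/title pair. The toy sketch below illustrates only that basic idea; the actual algorithm's matching rules, and Zhang and Salaba's MARC-to-FRBR mappings, are considerably more involved, and the records here are invented.

```python
import re
from collections import defaultdict

def work_key(author: str, title: str) -> str:
    """Collapse case and punctuation so variant headings share one key."""
    norm = lambda s: re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()
    return f"{norm(author)}/{norm(title)}"

def group_work_sets(records):
    """records: iterable of (record_id, author, title).
    Returns a mapping of work key -> list of record ids."""
    sets = defaultdict(list)
    for rec_id, author, title in records:
        sets[work_key(author, title)].append(rec_id)
    return dict(sets)

recs = [("001", "Twain, Mark", "Adventures of Huckleberry Finn"),
        ("002", "TWAIN, MARK.", "Adventures of Huckleberry Finn."),
        ("003", "Twain, Mark", "The Prince and the Pauper")]
groups = group_work_sets(recs)  # two work-sets: 001+002 cluster together
```

The hard part, which the research addresses, is everything this sketch skips: uniform titles, name authority variants, and deciding when near-matches belong to the same work.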

Martha Yee, UCLA Film and Television Archive, discussed her alternate set of cataloging rules, developed with an emphasis on indexing and display, which differ from RDA in significant ways. She is also working to model her rules in the Resource Description Framework (RDF) for use in the Semantic Web. She finds that RDF does not model hierarchy adequately, and may be “too binary” to model the complex web of relationships in bibliographic data well. Her cataloging rules and the RDF model are available on her web site.

A discussion session followed the presentations, with attendees directing questions to the speakers. Zhang was asked if the percentages of characteristics she detailed in her presentation also apply to the full OCLC database. Salaba and she have not run their analysis on that set, so she could not comment on it.

The speakers were asked if there were aspects of FRBR that created difficulties in applying it; Zhang confirmed Yee’s statement that a better representation of hierarchy is needed. Riley commented that the model as given in the FRBR and FRAD reports did not fit music applications well without adjustment.

The speakers were also asked about practical outcomes from their projects. Riley reported that the XML Schemas are now available, a search interface should be coming out in the next few weeks, a FRBRized data set in the next few months, and an API by the end of the grant in September 2011. Zhang and Salaba will be developing and testing a prototype FRBR-based catalog as part of their research. Yee is engaging in pure research and is not planning a practical application at this time.

Karen Anderson, Backstage Library Works, volunteered to serve as vice-chair/chair-elect. The group’s current chair is Judy Jeng.

MARC Formats IG

The discussion topic was the recent implementation of changes to the MARC 21 formats to accommodate Resource Description and Access (RDA) testing. Richard Greene, OCLC, discussed OCLC’s implementation of MARC 21 Format for Bibliographic Data, Updates 10 and 11, MARC 21 Format for Authority Data, Updates 10 and 11, and MARC 21 Format for Holdings Data, Update 10. Complete details of the OCLC update can be found in OCLC Technical Bulletin 258. Greene also discussed OCLC’s policy concerning contribution of bibliographic and authority records created according to RDA guidelines. Complete details of the OCLC policy are available.

Rebecca Guenther, Senior Networking and Standards Specialist, Library of Congress, discussed changes to MARC 21 formats from the MARBI perspective. She also discussed briefly how the Library of Congress is making changes to the MARC tag tables for their Voyager integrated library system. A brief summary of the changes to MARC to support RDA is available.

Metadata IG

The Metadata Interest Group’s Annual program went well. More than 60 people joined the meeting and 53 signed the attendance sheet.


Chair Brad Eden welcomed the crowd, introduced the interest group and asked for volunteers to serve as new officers for the IG. He then introduced the speakers.

The first speaker was Rebecca Guenther, Library of Congress, who discussed “Controlled Vocabularies as Linked Data on the Web.” She described a service that enables both humans and machines to programmatically access authority data at the Library of Congress. The service is influenced by, and implements, the Linked Data movement’s approach of exposing and interconnecting data on the web via dereferenceable URIs. In simpler terms, the site is a web service for vocabularies and a Semantic Web application use case.
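To make "dereferenceable URIs" concrete: the same URI that serves a human-readable page can return machine-readable RDF when a client asks for it via HTTP content negotiation. A minimal client-side sketch (the URI below is hypothetical, not an actual Library of Congress identifier):

```python
import urllib.request

def machine_readable_request(uri: str) -> urllib.request.Request:
    """Build a GET request asking the server for RDF/XML instead of HTML.
    A browser sending Accept: text/html at the same URI would get a web page."""
    return urllib.request.Request(uri,
                                  headers={"Accept": "application/rdf+xml"})

req = machine_readable_request(
    "http://example.org/authorities/subjects/sh12345")
```

This one-URI-many-representations pattern is what lets the identifier double as both a web page for catalogers and a data endpoint for software.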

Sara Russell Gonzalez, University of Florida, discussed VIVO, a research-focused discovery tool. Her co-author Medha Devare, one of the original creators of VIVO at Cornell University, could not attend the meeting. The VIVO project will pilot the development of a common but locally modifiable core ontology and integration with institutional and external information sources. The VIVO application is an open source Java application built on W3C Semantic Web standards, including RDF, OWL, and SPARQL. She called for other institutions to get involved by becoming an adopter, a data provider, or an application developer.
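At the core of the SPARQL queries VIVO relies on is triple-pattern matching over RDF data. The stdlib-only toy below mimics that idea; the subjects, predicates, and objects are invented for illustration and are not VIVO's actual ontology terms.

```python
# A tiny in-memory "triple store": (subject, predicate, object) tuples.
triples = {
    ("person:gonzalez", "rdf:type", "vivo:FacultyMember"),
    ("person:gonzalez", "vivo:affiliation", "org:uf-libraries"),
    ("person:devare", "rdf:type", "vivo:FacultyMember"),
    ("person:devare", "vivo:affiliation", "org:cornell"),
}

def match(pattern, store):
    """Return sorted subjects matching a triple pattern, with None as a
    wildcard -- roughly what a one-pattern SPARQL SELECT ?s does."""
    s, p, o = pattern
    return sorted(ts for (ts, tp, to) in store
                  if (s is None or s == ts)
                  and (p is None or p == tp)
                  and (o is None or o == to))

# "Find all faculty members" as a single triple pattern:
faculty = match((None, "rdf:type", "vivo:FacultyMember"), triples)
```

Real SPARQL joins many such patterns at once, but each pattern works exactly like this wildcard match against the set of triples.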

Business Meeting

The agenda was approved by the group. The 2010 Midwinter minutes were posted to ALA Connect prior to the meeting. The minutes were unanimously approved.

Program Planning

Kevin Clair and Rhonda J. Marker reported that the preconference was excellent and very well attended. The main program also went very well and included three great speakers: Danielle Plumer, Coordinator, Texas Heritage Online Project, Texas State Library and Archives Commission; Ching-Hsien Wang, Chief Information Officer, Smithsonian Institution; and Joyce Celeste Chapman, Library Fellow, North Carolina State University Libraries.

IG members made several suggestions for future programs:

  • The newly released audio metadata standards by the Music Library Association’s Metadata Subcommittee of the Bibliographic Control Committee, in cooperation with an ALA task force, and the new presentation and programming opportunities related to this document.
  • Metadata editing tools, such as a tool working with Fedora and accepting MODS and other metadata schemas. Several tools were discussed, including METSMaker (University of Florida), EADEditor (University of Virginia), and a METS editor (Columbia University). Both well-developed tools and those in the beginning stages of development could be considered. New developments in Jane Greenberg’s automatic metadata extraction work could also be considered.

Blog Update

Kristin Martin reported that the blog will be migrated to WordPress by ALA in the near future, which might attract more participation. Martin asked people to contribute entries, such as notes on metadata projects, and called for a “critical mass” to produce blog content. Besides new original postings, the blog could also point and link to other metadata resources. A potential blog topic is institutions seeking research help from the community. The IG program slides can be added to the group space at ALA Connect and linked from the blog. Several IG members noted that Martin’s “Best Bets for Metadata Librarians” for ALA Annual Conferences and Midwinter Meetings had become popular.

Publications Update

Joanna Burgess reported that the content and structure of the IG’s web site have been updated under the new system by filling out a form provided by ALCTS. Slides and news are posted to ALA Connect. The IG Secretary added that the minutes uploaded to ALA Connect prior to the meeting instantly appeared on the IG’s web site. Besides contributing to the IG discussion list and ALA Connect, Burgess suggested a potential publication project: updating the outdated situation reports on the IG’s web site.

CC:DA Update

Steven Miller reported on CC:DA activities. The group held two meetings at Annual, mainly discussing the published RDA. Questions regarding RDA’s status have been raised. The RDA Toolkit is now available, RDA testing will soon begin, and RDA training materials and programs will be solicited by a committee. More RDA-related programs, such as the RDA Update Forum, were available at Annual. After two years of service, Miller requested that someone else take over this position.


LITA

Susan Cheney reported on a series of programs organized by LITA at Annual, including the LITA President’s Program “Four or More: The New Demographic,” “Authorized Genre, Forms and Facets in RDA,” “Designing Digital Experiences for Library Web Sites,” and “MODS and MADS: Current Implementations and Future Directions.”

Music Library Association

Jenn Riley said that the Music Library Association’s Bibliographic Control Committee and its Metadata Subcommittee have released “Metadata Standards and Guidelines for Digital Audio” online. The audio metadata standards were worked on by the Metadata Subcommittee with the ALA Audio Metadata Task Force. They will provide programming and services online to follow up the release of this document.

Officer Election

Michael Dulock, University of Colorado at Boulder, volunteered to serve as vice-chair/chair-elect. Since there were no further nominations or self-nominations, he was unanimously elected.

Amanda Harlan, Baylor University, volunteered to serve as Program Chair. Since there were no further nominations or self-nominations, she was unanimously elected.

Secretary: Mary Aycock, University of Missouri, the outgoing IG Secretary, volunteered to serve a second term. Since there were no further nominations or self-nominations, she was unanimously elected.

Publication Co-Chairs: Past Program Co-Chair Kevin Clair expressed an interest in serving as a Publication Co-Chair in the new term. Shawn Averkamp, University of Alabama, volunteered to serve as his co-chair. Since there were no further nominations or self-nominations, both were unanimously elected.

CC:DA Liaison: Nathan Putnam, George Mason University, volunteered to serve as the CC:DA liaison. Since there were no further nominations or self-nominations, he was unanimously elected.

New Members (ANMIG)

Ed.’s Note: This group did not meet during Annual 2010 since it hosted ALCTS 101.

Role of the Professional in Academic Research Technical Services Departments IG

About 25 participants, ranging from paraprofessional catalogers to heads of technical services departments, attended the meeting. Special, public, and academic libraries were all represented. Three attendees volunteered to serve as vice-chair for 2010-2011. The chairs-elect will select new vice-chairs after the meeting.

Co-chairs Robert Rendall and Sandra Macke began the meeting by introducing themselves and briefly describing the work of the interest group. They asked attendees to introduce themselves and mention one or two top issues related to outsourcing or batch processing at their institution. This generated a successful wide-ranging discussion on the meeting’s theme, “The Changing Role of the Professional Librarian in an Age of Outsourcing.” Major areas of discussion included:

  1. The effect of outsourcing on technical services librarians: As the processing of commonly held resources is increasingly being handled by outsourcing to vendors, fewer technical services librarians are needed. At the same time, those who remain may need to have expert skills as they are being redirected to work with uncommon or unique materials. In addition, many libraries are also turning to outsourcing for expertise in languages that their staff do not possess.
  2. Quality control with outsourcing: Most of the work is done up front; however, libraries have to monitor it and watch for problems. A small number of errors is acceptable; it is better to make materials available to patrons efficiently than not at all. It is frequently difficult to determine whether problems are due to interactions between the outsourcer and one’s local system or are quirks of the local system itself. Changes to the vendor’s software can also produce unexpected results for libraries.
  3. What to do with staff when their job functions have been eliminated: This is a frequent problem due to technological improvements or outsourcing initiatives such as the implementation of shelf-ready materials or authorities outsourcing. One attendee noted that it is very difficult to find new responsibilities for paraprofessional staff whose positions were limited to low-level work. Although staff can often be trained to do other jobs, sometimes individuals lack the motivation or level of education needed to successfully manage the transition to other work. Staff members may also be afraid of change. It is often difficult for experienced staff to accept and adapt to the elimination of functions that had previously been considered essential. Even when existing staff members can be successfully retrained, the institution may need to be prepared to increase salaries, as the transition may involve upgrades in job classification. It can be challenging for institutions to fund salary increases in these tough economic times, and the process may be complicated by union contracts.
  4. The impact of retirements: Many institutions are experiencing a large number of staff retirements at all levels and are not able to recruit new employees to replace all (or any) of them. While some of the positions left vacant could possibly be eliminated due to technology changes or outsourcing, other positions cover key functions that are now being reassigned to remaining staff and managers with already full workloads.
  5. Increasing demands on managers who already have limited time: Managers are facing increasing demands on their time as they absorb functions from other units, perform tasks in their own units that were previously done by retiring staff members, examine existing workflows for efficiency as staff retire or are reassigned, and train existing personnel to assume new responsibilities. In addition, managers plan, test, and troubleshoot new outsourcing initiatives or technology implementations.
  6. Staffing for batch processing and data management: How does one determine the level of staff and training needed for successful batch processing and data management? This work is often limited to a small number of high level individuals. Some institutions have successfully trained support staff to do batch processing. Others have regretted making such a decision. In one case, thousands of records were incorrectly deleted from the catalog. It can be difficult to determine whether a person has the right qualifications to successfully complete this type of responsibility.

Scholarly Communications IG

Two guest speakers, Andrea Wirth, Geosciences and Environmental Sciences Librarian, Oregon State University Libraries, and Andrea Kosavic, Digital Initiatives Librarian, York University, discussed how library faculty at their respective institutions recently passed open access resolutions requiring library faculty to deposit their research in campus digital repositories. Each speaker detailed how their policies were developed, along with any problems or hurdles that they had to overcome, whether the policy has had an effect on other departments on their campuses, and what advice they might offer other institutions that are considering similar policies.

Wirth and Kosavic noted that both institutions reviewed open access policies passed at other institutions (MIT, Harvard, Stanford, and others) as models when drafting their initial policies. There were many similarities in the development of the policies at each institution, but one that stood out was that both decided to implement a policy instead of a mandate. Questions arose at Oregon State University (OSU) as to how a mandate would be implemented and enforced. York did not want the resolution to be too rigid. It was also important at both institutions to educate library staff about scholarly communication issues, including open access, author rights, and digital repositories. Both institutions have well-established institutional repositories, ScholarsArchive (OSU) and YorkSpace (York). In addition, OSU has a strong subject librarian program, and author rights are part of librarians’ job descriptions. At York, a retreat was held for library staff after returning from the 2007 ARL/ACRL Institute on Scholarly Communication to familiarize staff with scholarly communication issues.

Since both institutions had audiences that were relatively on board regarding open access, the library faculty unanimously approved both policies. The OSU policy emphasizes the importance of depositing the final published version, which is often difficult to obtain from publishers. To assist authors, they capture authors’ experiences with publishers on a wiki. Retrospective deposits are encouraged at York, and they recognize that library staff need to be prepared to assist faculty to archive their research.

Policies were passed for both institutions in 2009. Neither institution has seen widespread adoption of open access policies by other departments. OSU’s College of Oceanic and Atmospheric Sciences has a policy similar to the library’s, and the College of Forestry is in the process of drafting a policy. On a related front, the OSU News and Communication Service promotes research beyond the campus and, now that the research is deposited in the repository, users can link to the actual article. York University has a strong open access advocate in the Chief Librarian at the Law Library who believes that open access is a moral obligation. The Vice President of Research is developing a course about scholarly communication for the faculty. Concordia University passed an open access resolution in late April of 2010. Concordia has a strong sense of community and the backing of nine Canadian-funded mandates. Training and education on both campuses appeared to be a key element in getting the policies passed. Even though neither campus has passed a campus-wide policy, they are working to expand their education efforts across each campus. The speakers’ remarks were followed by a question and answer and general discussion session.

Technical Services Managers in Academic Libraries IG

The meeting theme was “Moving into the Future: Technical Services in Transformation.” Library technical services are undergoing significant changes as tremendous cultural and technological transformations reshape libraries and their work. We are moving into the future as we apply initiatives like user-initiated services and acquisitions, network-level data management, globalization of resource access, elimination of localization, socially networked internal communications, and organizational change. Technical services staff will need an expanded set of competencies and skills to respond to swift changes today and tomorrow. The discussion covered issues, challenges, and solutions in applying some of these initiatives.

The meeting was called to order by the chair, Annie Wu. Approximately 67 people attended. The discussion was organized around six tables, with a different topic and discussion leader for each table. Table topics were:

  1. The “Organizational Chart” and Location of Technical Services (Facilitator: Jack Hall; Recorder: Michelle Turvey-Welch)
  2. Collaboration of Technical Services with IT Services (Facilitator: JoAnne Deeken)
  3. Better Management of Technical Services Internal Communication (Facilitator: Linda Lomker)
  4. Bibliographic Data Creation and Management at Network Level (Facilitator/Recorder: Annie Wu)
  5. Evaluation of Technical Services Performance (Facilitator: Bruce Evans; Recorder: Mary Beth Weber)
  6. Technical Services Staff for the Future (Facilitator: Roberta Winjum; Recorder: Sharon Wiles-Young)

The table leaders were members of the interest group’s steering committee. Each table also had a volunteer recorder. Discussion at each table was reported to the audience at the end of the discussion period. The report of each table discussion below has deliberately been left long for the benefit of those unable to attend the meeting.

Table 1. The “Organizational Chart” and Location of Technical Services

Individuals from the following institutions were present at the table: Kansas State University, University of San Diego, Iowa State, University of Manitoba, the University of Texas at San Antonio, Southern Methodist University, University of Houston, University of Denver, Missouri State University, University of Colorado, and University of Wyoming.

The conversation began by having those present describe how their organizational charts look and how technical services functions are grouped in their organizations. The list of institutions above shows that they were all of fairly similar size; however, the participants described a great variety of organizational configurations. Traditional technical services functions such as acquisitions, cataloging, and serials control were represented in a division called “technical services.” Other library functions often included were collection development (interestingly called “content development” at one institution); metadata (obviously having some connection to cataloging, but sometimes more allied with archives or special collections); digital resources; systems; preservation; and access services (variously including circulation, interlibrary loan, shelving, and reserves).

Below is a list of topics raised and discussed by the participants as affecting, or being affected by, the organizational choices made by the institutions. A bullet may represent the structure of more than one library present at the table.

  • Pairing of cataloging and preservation functions
  • Separation of preservation from cataloging functions
  • Pairing of cataloging and archives
  • Pairing of archives and special collections with a metadata librarian working with archives to address their metadata needs
  • Archives as a stand-alone unit apart from the library
  • Separate special collections/archives cataloging team that handles collection level record creation, EAD support, etc.
  • Digital services unit consisting of a digital services unit head, a metadata librarian and a staff member to handle scanning
  • No clear home for electronic resources
  • Combining serials and electronic resources into one unit
  • Impact of smaller budgets on monograph purchases and binding budgets
  • Combining acquisitions and interlibrary loan units
  • Combining interlibrary loan with technical services operations
  • Consolidating separate technical services operations from branches into a single, centralized technical services unit
  • Combining acquisitions, cataloging and library systems unit under one dean
  • Co-location of cataloging and library systems operations
  • Using consultants to analyze technical services structure for optimization
  • Increase in purchase on demand (patron-driven acquisitions), including electronic book purchases on demand effected by patrons clicking in the catalog
  • Purchase on demand funds come from a separate pot of money not tied to a particular bibliographer
  • Value of physically locating IT services in the library
  • Moving away from the idea of bibliographers working shifts on a help desk
  • Collection development and public services being located under one dean, technical services and systems under one dean
  • Shift from a flat organizational structure to a more hierarchical one
  • Combining acquisitions and copy cataloging into one unit
  • Copy cataloging now handling batch loads
  • Issues with multiple libraries and multiple library web sites not under a single university library system, impact particularly problematic with regards to different ILS implementations
  • Sharing student assistants between units
  • Use of cross-departmental teams that meet regularly
  • Increase in foreign language materials received for cataloging due to creation of additional area studies programs
  • Increase in the need for original cataloging (U.N. documents, special collections, etc.)
  • Shifting towards a model where everyone in technical services does everything (less specialization); impact of this on training needs
  • Extra funding for materials with fewer staff (collection rich, people poor)
  • Beginning to look at moving large serial runs off-site
  • Deaccessioning

Table 2: Collaboration of Technical Services with IT Systems

Participants were asked how their technical services operations worked with library IT. The responses varied. One library had a Head of Technical Services with responsibility for the ILS and the web site, but Systems was not part of the library. At another institution, both departments reported to the same administrator, though each unit had its own head. In many cases, technical services (in addition to traditional TS responsibilities) had input not only into decisions related to the ILS but also into the library web site, while IT staff handled server maintenance and hardware issues. Since ILS upgrades involve both hardware and TS issues, any upgrade to the ILS was a joint effort.

In other libraries, IT is a separate department inside the library, with most also having an institution-wide IT department. In many cases, library IT handled all computer issues, with technical services having some voice in ILS upgrades. In one case, the (non-MARC) Metadata Specialist reported to library IT because that person’s responsibilities include loading large datasets. TS staff served on IT teams that cover their areas of expertise.

In another case, the campus IT took care of all hardware, including hardware that supports the library, and there was a library IT department dealing solely with library systems.

Another library was undergoing a reorganization which included analyzing the interaction of TS and IT.

In another case, the library did not have a separate IT department and relied on the campus IT unit. The library manages its web site, loads records, and handles system upgrades performed by the ILS vendor. The college has an IT person who reports to the same dean as the library. There is consultation between IT and the library about IT issues and interests. Blending the library and IT is not on the table.

At another library, cataloging and IT had been a combined unit that has since split, the division falling between a user level of IT (operational) and an administrative level (strategic).

The second question dealt with managing computers in TS. The responses ranged from TS having absolutely no control over their own desktops (including no permission to install programs) to TS having full control. Security concerns were the reason for decreasing TS control.

Troubleshooting problems in TS was discussed next. One library had a full ticketing system: each call was monitored, a ticket issued, and a resolution date recorded. At another library, IT had full access to come into TS and do work without notifying anyone. This led to problems when a computer was stolen and it was assumed that IT had taken it; a call to IT from TS revealed that IT did not have the missing computer, which delayed the filing of a police report.

Some libraries had a dedicated TS staff member for TS computer issues but lacked a backup or notification when this person was out. Others had a dedicated person with backup.

What is the dividing line between IT and the library? At one institution, IT and the library shared space. It was a learning experience for both groups. In most cases, the library has to reach out to IT. In one library, digital collections were handled by IT, and there was a strong effort on both sides to become purposefully interdepartmental when dealing with digital collections.

Courseware was the next topic of conversation. Responses varied widely. In at least one academic institution, the library had total ownership and control of the courseware software (i.e., Blackboard), and the library was included on each syllabus and on every page in the courseware. In other cases, libraries did not manage the software but worked closely with those who did. In still other cases, libraries were not involved with the software itself but worked with individual professors or colleges to include library information.

Only a few respondents had institutional repositories. Most, but not all, appeared to be managed in the library, but not necessarily in TS. One academic institution had a Head of Scholarly Communication who controlled the institutional repository.

Each library seemed to organize its Systems department differently. In one library, a librarian moved into Systems and became a technologist.

The last topic did not deal directly with TS and Systems; the conversation drifted into a discussion of TS and other departments in the libraries. There are always blurred lines between TS and collection management. Both departments seem to have “responsibility creep,” but the consensus was that it is better to have separate TS and collection management departments. One library was undergoing a “task-based” reorganization. In another library, acquisitions and cataloging staff were cross-trained, creating flexibility for shifting workflows. Whatever the organization, communication (both direct and indirect) was crucial. True efficiency can only be understood by analyzing and evaluating every task.

Table 3: Better Management of Technical Services Internal Communication

The following issues were raised and discussed.


  • Staffing changes require setting priorities and determining reasonable timeframes and what can realistically be accomplished, all of which needs to be communicated.
  • Technical services (TS) staff members need to know what is expected of them.
  • People outside TS need to understand what is possible with current staffing configurations and help to set priorities for projects when project time exists (e.g., freed up by outsourcing).


  • Meetings are necessary for information exchange, answering questions, and soliciting ideas, but they should not be too long or too frequent. People reported holding fewer meetings, being careful to call them only if the agenda was substantial enough to justify the time.
  • Depending on how scheduling works, one may need to set core hours when everyone is available for a meeting. This has been done in various libraries with flexible or shift scheduling.
  • Another problem is that the same people always participate. Others may be encouraged to contribute by distributing agendas prior to the meeting so they have time to consider what will be discussed, by encouraging participation through eye contact, and by requiring respectful responses to all ideas and comments.
  • If decisions have been made by administration, explain why. Sometimes, the decision to do something is made elsewhere and technical services discussions are about how to implement the decision.
  • Departments within TS need to communicate with each other for efficiency. Acquisitions and cataloging even may be part of different departments, yet need to coordinate their work and communicate on issues.
  • TS staff members serving on committees outside the department results in TS being more visible and more approachable, and having more information about other decisions that impact the department.
  • Policies and procedures:
  • Training is crucial when major changes occur.
  • Communicating within a consortium can be especially difficult, as members may have their own policies, resulting in others wondering why a record is the way it is. Phone calls, email, and meetings are needed to bring about understanding.
  • Many libraries use the web to communicate policies and procedures.
  • One problem is that people copy what they want from the web and do not check back for changes. Changes should be announced via email, at meetings, and/or indicated on the web to alert people.


  • Statistics can be used to communicate TS accomplishments, show trends in workflow, etc.
  • Statistics are important, but need to be regularly reviewed to determine whether what is collected is relevant and is being kept in the most efficient way possible. People reported that they are keeping fewer statistics now than in the past.
  • Keeping statistics manually takes time away from acquisitions and cataloging functions.
  • Various ILSs may be able to provide data so that it does not need to be collected manually.

Table 4: Bibliographic Data Creation and Management at Network Level

The discussion started with a clarification of bibliographic data creation and management at the network level, which refers to creating and managing bibliographic data at scale rather than relying on local tweaking. Examples include the momentum behind the OCLC Expert Community initiative, which allows participants to correct, improve, and upgrade all WorldCat master records, and the WorldCat Local initiative. It also refers to mechanisms for sharing and enhancing metadata as they are created by both publishing communities and bibliographic utilities.

The benefits of network-level cataloging include interoperability and shared metadata, efficiency, and the elimination of duplicate and local work. There are also disadvantages or concerns, such as the availability of records (e-records), the quality of records, and the capability (dependability) of systems.

The impact of network-level bibliographic data creation and management includes a reduced workload for catalogers, which may mean shrinking cataloging staffs, and a new bibliographic display and access model moving from the OPAC to bibliographic portals.

Network-level bibliographic data creation and management requires new skills and competencies from catalogers, such as data input and output, data mapping, crosswalks, coding, XML, metadata schemas, and an understanding of publisher standards.

Attendees were aware of other network-level bibliographic data creation and management activities that may benefit cataloging work, such as SkyRiver, a new bibliographic utility with twenty million records. In SkyRiver, catalogers do not control or initiate the replacement of existing records. Copy can be enhanced or changed from within the software before being exported to the local catalog. SkyRiver records are updated based on automated processes that detect changes to records in the local ILS. This simplifies cataloging, but it entails a paradigm shift away from a database of records that, in many ways, is directly maintained by its users.

The trend in cataloging is toward massive creation and management of data. The new model of data sharing is the automated capture, crosswalking, and enhancement of metadata from publishers, aggregators/wholesalers, retailers, bibliographic utilities such as OCLC, and libraries.

Table 5: Evaluation of Technical Services Performance

The group discussed methods libraries currently use to evaluate the performance of technical services. They also discussed the tools used for evaluation and how employees react.

The group first determined whether the question was about evaluating individuals or the department as a whole. It was decided that the evaluation was about the department, though individual evaluations factored into the process. Participants noted that their institutions required them to use a general form for individual evaluations and expressed dissatisfaction with those forms. One participant reports monthly to administration on her department’s productivity; this does not include individual statistics.

The group discussed how performance expectations, including goals, can be included in evaluations. One participant’s institution asks employees to log five days in fifteen minute increments to record what they have done. The group discussed why performance is evaluated. It may be used to gauge costs and determine what can be cut.

Are bibliographic records enhanced, for example, to increase access or to correct minor errors? Do we evaluate for efficiencies in workflow or to justify the importance of cataloging? In either case, the tools and methods will differ.

An individual employee’s work may need to be tracked. One participant’s institution tracked how long it took to do copy cataloging. Acquisitions and cataloging workflows were streamlined, and they outsourced some of their work to YBP. Since this work has been outsourced, the cataloger who was doing it feels he needs to do more work and is also working in public services. The participant noted that administration at her institution is not concerned with database management until they need to justify providing full level cataloging.

There is a need to understand our performance levels to justify the need to do various types of work and determine per transaction costs. This information is then used to leverage the need to provide certain services.

Collaboration is an important factor for one participant. Faculty at her institution provide subject headings in their specific disciplines for theses and dissertations. This is done weekly, and those subject headings are then used by catalogers to provide classification numbers.

The group discussed how the services technical services provides for the public are evaluated. The value of the cataloging department and ILS are measured via transaction logs of unsuccessful searches. The logs of non-hits are used to create cross-references to valid subject headings. One participant cited monthly meetings with public services librarians as her way of demonstrating the value added of cataloging. She shares statistics, search logs, etc. with them. She also shows them brief catalog records and how they are improved by cataloging (before and after views). Her meetings are used to demonstrate how cataloging and database maintenance improve discoverability. Other ideas included emphasizing the work that you do that is unique and special and collections your technical services department has cataloged that are not held by other libraries. Some participants count public services queries since they also provide reference assistance.

The question of how to convince administrators that value added requires funding was discussed. Answers included:

Staff keep reports of their activities, which are regularly reported to administrators. This is also helpful when it comes time for annual evaluations.

Positive interactions with users are recorded through letters, etc.

Employees provide self-assessments and demonstrate the value and impact of their work.

The need to evaluate costs such as salaries and overhead was discussed. This can be threatening to employees. Staff should be encouraged to apply their skill sets and expertise.

Challenges with measuring performance:

The group agreed that performance is difficult to measure. Time is a big factor, since technical services personnel frequently spend a lot of time problem solving, which in turn makes it difficult to plan for other things. One institution evaluates which purchased library resources have not been used in the past year. Selectors whose items have higher circulation statistics (this does not include e-books) are given more funds in their budgets for the next year. This is part of the institution’s regular assessment.

External costs often cannot be controlled. Licensing and contracts take a long time to process. There are costs in terms of access lost during the processing of licenses/contracts. How do we evaluate the cost to users and lack of access?

External expectations for tech services performance:

Administrators believe outsourcing will triple our output and results. They do not realize the time required to handle outsourced records and other upfront work, and fail to see that vendor-supplied cataloging still requires work.

One participant described technical services as being like a butler: when it is working well, no one notices. When there are problems, it is immediately noticed.

We need to be advocates for our work and educate administrators as to why cataloging and metadata are important.

Reorganization in order to enhance tech services’ effectiveness:

One participant’s institution evaluates the department on a regular basis. Staff there provide link checking for free materials, and other staff specifically resolve problems with vendors.

Another participant preferred to refer to reorganization as “re-invention.” She took one year to observe her department and its work, looking at how time was spent, and charged a group of five to examine this. The time study measured the percentage of time devoted to each function (music cataloging, serials cataloging, etc.). The result was a functional reorganization. She will meet individually with each staff member to ensure that goals are being met and staff are engaged. Across-the-board buy-in is necessary for the reorganization to succeed. She also spoke with external stakeholders regarding their expectations.

Another participant noted that every time someone leaves is an opportunity to reorganize. They follow the rule “Do unto yourself” before the administrators swoop in.

Think like an Associate University Librarian. They are always a few steps ahead. State that you are already there.

Trends in the evaluation of tech services performance:

What are we worth? Go to public services evaluations to learn where technical services fit in. For example, how do patrons use the catalog? We need to be part of evaluations to determine how to get people to use the library both on-site and virtually.

Think beyond bibliographic control of resources. How do we manage data and how does it impact efficiency? How do we get information from means other than the catalog?

Good metadata in one’s institutional repository ranks high in Google searches, which constantly drives home the point that good cataloging and metadata expose our collections.

What skill sets are desirable when people are hired?

Flexibility and the ability to learn were deemed important. Curiosity and interest in learning new things were also cited as desirable, plus comfort with change.

Table 6: Technical Services Staff for the Future

A list of possible table questions was distributed, and participants introduced themselves and shared concerns with the group. One of the table questions read: “Within Technical Services, have you recently eliminated positions? Created new positions? Assigned new tasks to existing staff? Redefined jobs completely? Reassigned staff within TS or to other areas? Changed your organizational structure? Made other changes?” Some participants answered “All of the above.” The following concerns were among those reported. Many were echoed around the table:

Several participants reported concerns about reorganizing staff and the changes and training involved, especially to cover electronic resource responsibilities. Staff need to develop new skills to handle electronic resources and help with the ERMS. There is an increase in technical services responsibilities: digital projects, e-books, and cataloging for special collections. In some libraries, technical services librarians have other responsibilities, such as teaching and liaison activities. There are challenges in finding staff who are willing to change; one strategy is to make small, incremental changes. For electronic resources, however, some staff have been quick to change and learn new skills. One library that had completely reorganized renamed its Head of Technical Services the Head of Discovery Systems. Meanwhile, this library was laying off half of its staff. Another library had centralized its technical services operations in response to staff layoffs.

Outsourcing of copy or other cataloging is also common. One attendee asked how to handle threats to cataloging based on return on investment (ROI) analysis of activities, and how to balance current print needs with electronic resources demands.

There is a need to build staff expertise. Retirement of long-term staff is a concern. A challenge is how to train and help staff with professional issues if they do not attend conferences. It is important to ensure staff are subscribed to electronic discussion lists to maintain awareness of current issues. Working with unions added a wrinkle to concerns about staff assignments and training. It is not always clear how to make distinctions between paraprofessional and professional staff, and it is sometimes difficult to determine how to classify staff who work with electronic resources.

As supervisors, many find themselves feeling fragmented. Technical services staff also feel fragmented with so many reassignments and new training.


Suggestions were made that staff can be trained, and that webinars and other opportunities exist. If support staff are interested in learning, train them to assume new responsibilities, and others may take note. Mentor staff and generate interest. At some institutions, staff are developing the skills necessary to solve access problems when they are reported, and are updating the electronic resource databases. The skills needed for electronic resource management are keeping link resolvers up to date, tracking problems, and resolving them based on subscription and licensing data in the ERM or database.

A question was raised: how do you differentiate, or what is the definition of, professional librarians versus paraprofessional staff? The consensus was that professional librarians are responsible for strategic planning and the big-picture overview, as opposed to day-to-day operational processes.

Another question was raised about how to provide more visibility and information about the skills necessary to successfully manage electronic resources, and to ensure that the administration is aware of these management skills. Providing a means for reporting errors from reference desks, public desks, and the ILL department, and tracking the reported problems and their solutions, was suggested. Helping to manage the link resolver also provides more visibility for technical services staff.

Staffing needs for the future were discussed. One idea was to create more alignment between acquisitions and interlibrary loan. If ILL requests are placed for current materials, should acquisitions order the title and move closer to the model of ordering books based on demand? ILL statistics should be analyzed for serials purchase and other collection development decisions. Other areas that can be more closely aligned with technical services are Special Collections and Archives.

Other skills needed by technical services staff include the transition to using metadata and authority data to organize and catalog institutional repository materials and podcasts.

A report to the larger group ended with this quote from General Eric Shinseki found by one of our participants: “If you don’t like change, you’re going to like irrelevance even less!”

Technical Services Directors in Large Research Libraries IG

Erin Stalberg, Chair, Task Force on Cost/Value Assessment of Bibliographic Control, presented the task force’s report (also available in ALA Connect). The task force found that more research is needed in the area of “value.” It presented seven operational definitions of value and suggested follow-up research in these areas: discovery success; use; display understanding; the ability of our data to operate on the open web and interoperate with vendors and suppliers in the bibliographic supply chain; the ability to support the FRBR user tasks; throughput/timeliness; and the ability to support the library’s administrative and management goals.

Other discussion topics:

  • Collaboration: Marilyn Wood and Catherine Tierney led a discussion of Collaboration. Topics included: how to assess collaborative efforts; what are the costs and measures of success; planning for collaborative print archives; looking at business models for collaborative partnerships.
  • Rankism/Civility: Lisa German discussed Penn State’s efforts to address issues raised in their climate surveys regarding civility and rankism/classism in the workplace.
  • Kuali OLE: Jim Mouw, Mechael Charbonneau, and Beth Camden gave an update on the Kuali OLE (Open Library Environment) project—a community source software package for academic and research libraries currently being developed.

Technical Services in Public Libraries

The PLTSIG hosted a presentation by Jeff Calcagno from Backstage Library Works, conducted a business meeting to elect a new vice-chair, and had a lively discussion of topics of interest.

This was the third meeting of the PLTSIG. Due to shared interests between this group and the attendees of the OCLC Dewey Breakfast, PLTSIG will meet immediately after the Dewey Breakfast and in the same room, sponsored by OCLC, at the upcoming 2011 ALA Midwinter Meeting and 2011 ALA Annual Conference. This should prove easier than trying to schedule the IG meeting in the same hotel as the breakfast, which is not always possible.

Calcagno’s presentation covered both catalog enrichment and authority control processing. He also discussed Backstage’s latest project, the integration of Bowker’s Syndetics data into the catalog. The overview was thorough, and also touched on Backstage’s work as an RDA test partner. There was a lot of new information on the types of enhancement of specific interest to public libraries, including tables of contents, summary information, author notes, fiction genre and award information, and detailed biographical profile information for biographies. The information that public libraries are used to seeing from Syndetics can be embedded in 9xx tags in their MARC data.

A general discussion followed. As has been the case with previous meetings, finances were a big topic. Materials budgets are flat or cut.

RDA is looming on the horizon. People talked more about outsourcing, and about having to learn to turn their heads away from the level of quality (or lack of it) they receive. Floating collections was a concern for some.

Centralized collection development, centralized weeding, and centralized collection management tools were also hot topics. Some librarians mentioned the expense of buying/cataloging single copies, and that they have stopped or curtailed that practice in their libraries as a cost saving measure. Most discussion topics came back to money in some way.

Donna Cranmer, Siouxland Libraries in Sioux Falls, South Dakota, was elected vice-chair/chair-elect for the coming year.

Technical Services Workflow Efficiency

The session, “Transforming Technical Services during the Library Renaissance,” was inspired by Susan Gibbons’ working paper, “Time Horizon 2020: Library Renaissance.” In it, she describes a period of unprecedented change for libraries over the next ten years. In particular, there is the expectation of a renaissance in technical services and collection development work. Focus will shift from the acquisition of content to its discovery. Technology will continue to influence user expectations and provide platforms to deliver content on demand, for both print and electronic resources. The 80/20 model will be supplanted by patron-driven acquisitions and locally cultivated content.

Panelists included:

  • Teresa Negrucci, Collection Assessment and Management Librarian, Colorado State University
  • Holly Tomren, Head of Monograph, Electronic Resources, & Metadata Cataloging, University of California, Irvine
  • Sadie Williams, Director of Customer Support, Blackwell/YBP Library Services

With about sixty members in attendance, panelists discussed changes manifesting in their organizations and innovations in the provision of services. One clear theme was the shift from collection development to collection assessment due to funding constraints, along with the shift from just-in-case to just-in-time acquisitions. Each panelist described their current involvement with patron-driven acquisitions. Several points were made regarding workflows that are becoming obsolete (serials check-in) and those that are emerging (local content cataloging). Space shortages were also noted as a major influence on retention policies. Shifting priorities and processes also revealed the importance and difficulty of syncing user needs and resources. In addition, the panelists shared what they envision their respective organizations will look like post-renaissance.

Dracine Hodges, Ohio State University, will continue as chair of the IG. Megan Dazey, University of Montana, is the new vice-chair/chair-elect.    

Acquisitions Section Groups

Acquisitions Managers and Vendors IG

The interest group hosted a panel to discuss the 2009 Ithaka S+R Faculty Survey and its theoretical and practical implications for libraries. Panelists included Susan Gibbons, Vice Provost, University of Rochester Libraries and Ross Housewright, Research Analyst, Ithaka S+R. Among other aspects of the report, Gibbons and Housewright discussed the library’s role in the discovery process, the practical implication of managing the transition to electronic formats, and Gibbons reflected on the ideas and predictions expressed in the paper “Time Horizon 2020: Library Renaissance,” which she delivered at the 2010 ALA Midwinter Meeting, against the context of the survey. Though not heavily attended, the session was fascinating. Both panelists were very thoughtful and feedback from participants was positive.

The group’s incoming co-chairs were also confirmed. They are Mandy Havert of Notre Dame University Libraries and Sadie Williams of Blackwell.

Gifts IG

The discussion focused on exchange programs. The Library of Congress continues to operate a program with many countries. This allows LC and their partners to get material that they may not be able to purchase. The LC (Silver Spring) librarian spoke of their techniques to obtain Chinese material (which includes having staff make acquisitions trips). Material sent from other countries to LC may still be irradiated: this damages electronic resources.

Eastern Michigan University’s English Department produces the Journal of Narrative Theory. The library receives other humanities journals from other universities in exchange for this journal.

The discussion finally circled back to the implications of electronic resources. Will this spell the end of exchange?    

Cataloging & Classification Section Groups

LITA/ALCTS Authority Control IG

Despite the remote location and conflicting time with other popular sessions, an audience of about 100 people heard three presentations. While the title of the program was “Authorized Genre, Forms and Facets in RDA,” the reference to RDA was more a placeholder for consideration of possibilities for more creative and useful ways of presenting and using genre/form terms.

All presentations are currently available on ALA Connect.

Library of Congress Update

Janis Young, Senior Cataloging Policy Specialist, Policy and Standards Division, and Form/Genre Projects Coordinator

Young began by reporting on the RDA testing process, a joint project of the three national libraries (LC, the National Library of Medicine, and the National Agricultural Library) and of twenty-plus testing partners of various types. This session report will not go into detail about the test, but will mention some of the implications of test policies for authority work that might be seen in the national authority file. One is that in RDA, fictitious characters can be represented by a name access point. Testers at LC will establish such access points as needed in the name authority file. If a record for the access point already exists in the subject authority file, testers will inform PSD. Headings for fictitious characters not needed as name access points should be established as subject headings, the current practice. RDA also provides for family names as descriptive access points; RDA testers will create name authority records for them, and will break with current practice by making such access points unique (e.g. using specific spellings for the surname, adding qualifiers as needed). Such name access points will not be eligible for use as subject headings in LCSH, however, and will have to go through the LCSH proposal process. Asked by an audience member why LC will continue treating family names used as subjects differently from those used as names, Young cited the difficulties of changing long-established practice. Young invited those with questions about the RDA test to write.

A project to add geographic coordinates in MARC field 034 for jurisdictions has been completed, with about 77,000 name authority records updated. There are discussions underway to do a similar project for subject authority records.

The Virtual International Authority File (VIAF) continues to grow; among the newest members are Library and Archives Canada (LAC), the Getty Research Institute, and the NUKAT Center (a Polish union catalog). Potential members include the National Institute of Informatics (Japan), and the national libraries of Hungary and Slovenia. The VIAF expanded its scope in 2010 to include corporate, conference, and geographic names; there are no plans to add subject headings.

Another growing project is Authorities and Vocabularies, a SKOS-based service that provides lists of codes, subject terms (including LCSH), and terminologies. There are currently links to terms in RAMEAU (the principal subject vocabulary list for French libraries); linking to translations of LCSH from French and Spanish sources (the Université Laval for Canadian French; the national libraries of Chile and Spain for Spanish) is in the exploration stage. Authorities and Vocabularies offers a mechanism for social tagging, allowing the public to offer suggestions for changes or additions to terminologies; this may be a conduit for non-SACO libraries to have more input in building vocabularies. The first suggestion was received on June 4, 2010.
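To make the SKOS idea concrete, here is a minimal sketch in Python of parsing a SKOS concept description of the kind such a service exposes; the concept URI and labels below are invented for illustration and do not reproduce an actual record.

```python
import xml.etree.ElementTree as ET

# Standard namespace URIs for RDF and SKOS.
NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "skos": "http://www.w3.org/2004/02/skos/core#",
}

# A hypothetical SKOS concept in RDF/XML; the URI and labels are
# illustrative only, not a real authority record.
skos_xml = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                       xmlns:skos="http://www.w3.org/2004/02/skos/core#">
  <rdf:Description rdf:about="http://example.org/authorities/sh000001">
    <skos:prefLabel>Cooking</skos:prefLabel>
    <skos:altLabel>Cookery</skos:altLabel>
  </rdf:Description>
</rdf:RDF>"""

root = ET.fromstring(skos_xml)
concept = root.find("rdf:Description", NS)
uri = concept.get(f"{{{NS['rdf']}}}about")          # namespaced attribute
pref = concept.find("skos:prefLabel", NS).text      # preferred term
alts = [e.text for e in concept.findall("skos:altLabel", NS)]  # variants
print(uri, pref, alts)
```

The preferred/alternate label distinction is what lets a catalog display the current term (Cooking) while still retrieving records or suggestions filed under a superseded one (Cookery).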

LC made a decision in 2007 to continue creating pre-coordinated subject strings in LCSH. The PSD recently issued a report on LC’s progress in implementing the action items and recommendations that accompanied that decision. It is available at the LC ABA web site.

Young reported on the consequences of the long-awaited change in LCSH from the topical term Cookery to Cooking. The change, made on June 2, prompted revision of some 1,300 subject authority records. New subject and form authority records were created for the term Cookbooks. The process of issuing updates is ongoing. Most LC Children’s Headings for cooking (of the pattern Cookery—[ingredient]) have been cancelled; with three exceptions, children’s headings will follow LCSH. A revision of SHM instruction sheet H 1475 will appear in the fall.

Library of Congress Genre/Form Terms: a Faceted System

Janis Young

Two projects—for genre/form terms for moving images and recorded sound—have moved from the development stage and are part of the SACO routine. A project for cartographic headings is in full swing; sixty-five genre/form terms were approved on May 19, 2010, and approval for removal of qualifiers from map form subdivisions is expected on August 19, 2010. All changes will be implemented on September 1, 2010. Asked whether libraries can implement earlier, the response seemed to be positive for new terms, but less so for existing terms.

LC will partner with the American Association of Law Libraries (AALL) to create records for genre/form terms in law. That is expected to be completed by the end of the year, with implementation in early 2011. A lacuna is terms for religious law; collaborators have been identified for producing terms for Jewish law; other faith traditions will follow as partners are found.

Young briefly reported on progress with genre/form terms for music (the next presentation provides greater detail). Work on religious terms began in mid-June in cooperation with the American Theological Library Association, which is in turn coordinating work with other religious library organizations.

An outcome of these projects has been the decision to formally remove genre/form terms from LCSH and create a thesaurus titled LC Genre/Form Terms for Library and Archival Materials (LCGFT). There will be a separate manual for the thesaurus, which will have the MARC code “lcgft.” Authority records for terms will have the LCCN prefix “gf.” Among the MARC format implications—coding 008/11 (subject heading system/thesaurus code) as z with an 040 $f lcgft in the authority format, and tagging as 655, 2nd indicator 7 with $2 lcgft in bibliographic records. The use of $2 rather than a new 2nd-indicator value is intended to ease problems with legacy data, both terms already in 655 and those that are currently tagged in 650. The second indicator value of 8 was considered, but is used by OCLC for Sears subject headings. Asked about second indicator value 9, Young replied that “9” has a long-assumed status of being a locally-defined indicator value. More information is available online.
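In practice, the coding Young described might look like the following sketch; the heading used here is purely illustrative, not an actual LCGFT record.

```
Authority record (hypothetical heading):
  008/11  z                     subject heading system/thesaurus: other
  040     ## $a DLC $b eng $f lcgft
  155     ## $a Western films

Bibliographic record:
  655     #7 $a Western films $2 lcgft
```

The $2 lcgft source code in the 655 field, rather than a new second-indicator value, is what allows legacy 655 fields coded with other indicators to coexist with the new terms.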

A Faceted System—How Will It Work?

What is faceting? Young offered her definition—coding single terms or phrases, representing individual concepts, separately in the bibliographic record. Limiting a field to one concept provides machine and user predictability—witness the problem with parsing meaning from the two-concept string “Animated Western film” versus the same three terms “Western animated film.”

A guiding principle in constructing a facet is to put significant words at the front of the string. Another is to avoid repeating data within the record—if a characteristic is present elsewhere in the record, let that stand in place of a genre/form term. Young offered the example “German films.” What does “German” denote? Language? Setting? Place of production? If explicit coding of those concepts is present (e.g., the MARC 257 field for place of production), such terms are not helpful. (A post-presentation question highlighted the complexities here: while the MARC 260 field is for place of publication and the 257 for place of production, the former can go down to the city level and may or may not be transcribed, whereas the latter is explicitly at the country level, using controlled terms. In FRBR terms, the 260 operates at the manifestation level, the 257 at the work level.) In other cases, such as “Comedy films,” the data might be present (in a MARC 520 summary note), but not machine-parsable. Genre/form terms are symbiotic and elaborate on one another, e.g. “World War, 1939-1945—Drama” (subject with genre/form subdivision) and “War films” and “Comedy films” (genre/form terms). Fictitious characters represent LCSH “crawling into” LCGFT, e.g. the subject heading “Batman (Fictitious character)” versus the possible genre/form term “Batman films.”
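The predictability argument can be shown in miniature: with a pre-coordinated string, a program can only guess which concept a word like “German” represents, but with one concept per field the question never arises. A small sketch in Python, with invented records and terms:

```python
# A pre-coordinated string: "German" is ambiguous -- language?
# setting? place of production? A program can only guess.
precoordinated = "German comedy films"

# Faceted: each concept is coded in its own field, so filtering is exact.
records = [
    {"title": "Film A", "genre": ["Comedy films"], "language": "German", "country": "Germany"},
    {"title": "Film B", "genre": ["Western films"], "language": "English", "country": "Germany"},
    {"title": "Film C", "genre": ["Comedy films"], "language": "English", "country": "United States"},
]

def by_facet(records, **criteria):
    """Return titles of records matching every facet given as a keyword."""
    out = []
    for r in records:
        ok = True
        for field, want in criteria.items():
            value = r[field]
            # Repeatable fields (like genre) are lists; test membership.
            ok = ok and (want in value if isinstance(value, list) else value == want)
        if ok:
            out.append(r["title"])
    return out

print(by_facet(records, genre="Comedy films"))                     # prints ['Film A', 'Film C']
print(by_facet(records, genre="Comedy films", language="German"))  # prints ['Film A']
```

Because each facet is independent, the same fields also support both narrowing (adding facets) and broadening (dropping them), which is the limit/expand behavior Young describes.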

Young expressed the hope that future systems would retain the ability to browse both LCSH and LCGFT. She reinforced the need to retain pre-coordinated headings in our current environment; change can be discussed when systems improve. We should look to genre/form headings as means to limit or expand searches. We are walking a tightrope between supplying usable data for today and creating data for tomorrow. Current systems, still based in card-catalog structures, can’t do what we want now, but will never do so without us producing the data.

Among other questions from the audience: What is the future of genre/form subdivisions (MARC $v)? Young said they are needed for now, though she recognizes the potential redundancy when such a subdivision replicates a genre/form term in MARC field 655.

The Music Genre Form Project: Issues and Some Solutions

Geraldine Ostrove, Senior Music Policy Specialist, Policy and Standards Division, Library of Congress

Ostrove sees the Music Genre/Form Project from two perspectives—as a discrete project, and as the prototype for retrospective migration of large amounts of data. All music vocabulary is already in LCSH; there are aspects unique to music, but much that will apply to other disciplines—principles of vocabulary formation, the interrelationships between LCSH and other vocabularies, the use of headings with language qualifiers. Future work will include concepts of place and time periods. The current project is limited to terms assigned to musical works, for which there are ca. 16,000 headings explicitly established in LCSH. Many of these terms actually denote genre or form, or function as both, e.g. Music.

Realizing the enormity of the task, LC approached the Music Library Association seeking collaboration. MLA’s Bibliographic Control Committee agreed, and formed a task force to work with LC’s Genre/Form Project Group. The groups began by taking the LCSH terms and creating two lists—one of genre/form terms embodying the single-concept principle, and another of terms denoting medium of performance. So far, over 1,000 terms have been agreed on for inclusion in the music component of LCGFT; the MLA task force is currently arranging the terms in hierarchies. Many current headings will be cancelled—those that designate only medium (e.g., Piano music), and compound headings such as Sonatas (Piano), that contain both genre/form and medium of performance (there are about 850 instances in LCSH). Format subdivisions, e.g., Vocal scores with piano, cannot remain subdivisions but must become headings in their own right.

No musical work lacks a medium of performance. It is arguably the most important attribute of music, and may be a user’s sole interest. Indeed, some musical works will require no genre/form terms. LCSH headings containing embedded medium of performance can be deconstructed; this is an opportunity. Terms for medium of performance need a new “home” in MARC records—possibilities are a revised 048 field, or the newly-defined 382 field. Libraries using RDA should not need to qualify terms by language because that data will be found elsewhere in the record. We can ultimately look forward to bringing this data in from an authority record for the musical work as needed.

As a controlled vocabulary, terms for medium of performance will likely remain in LCSH—it already contains terms for names of instruments, families of instruments, types of ensembles, vocal mediums, and objects used as musical instruments. But—the topical term for an instrument does not denote the same thing as a term for medium of performance, even though the same word may be used for both, e.g. Violin (denoting a physical object when used as a subject). Another complication is the 200-plus terms so far identified as genre/form terms that have also served as topical headings, e.g. Piano music—History and criticism. In some cases, these strings can be complex, and will likely be simplified.

What is to be done with terms denoting carrier? The Genre/Form Thesaurus terms are at the work and expression level, not manifestation or item. In addition, music carriers other than printed music require consultation with other communities. Carrier data has not generally been part of a subject heading (e.g., no “Notated salsas” or “Recorded sonatas”), but has been recorded in the general material designation, the extent area, other parts of the description, and form subdivisions. What to do is still an open question.

Other Issues

Alignment of vocabulary with RDA—this includes carrier and extent terms, and types of musical notation. The current sense is that terms for the last should be in LCGFT.

“Aboutness” of music—Some musical works are themselves about other things. This is often signaled by subdivisions such as –Songs and music or –Musical settings, or by adding (Music) to the topical term. Some of these can become MARC 655 genre/form terms readily; others cannot.

Same concept, different syntax—a current example would be the literary heading Ballads, American versus the musical string Ballads, English—United States. Deconstruction can remove these discrepancies. But what about Ballads—Texts for a collection of words to musical works? In literature, ballads are texts by definition. Would a pair of headings such as Ballads and Folk song texts be the solution?

Topical lookalikes—These look like genre/form terms, but are restricted to subject usage in LCSH. Examples include: Ear training, Harmonics (Music), Multiphonics, Rehearsals (Music), Triads (Music). Some of these could, by warrant, now serve as genre/form terms, and should be included in the thesaurus. Other headings, e.g., Absolute music, Avant-garde (Music), and various “ism” terms, will likely remain as topical headings.

Implementing (Parts of) FRAD in a FRBR-Based Discovery System

Jenn Riley, Metadata Librarian, Digital Library, Indiana University—Bloomington

Most catalogers are now familiar with FRBR (Functional Requirements for Bibliographic Records), but not as many are acquainted with FRAD (Functional Requirements for Authority Data), and even fewer have tested or played with it.

The V/FRBR Project (Variations/FRBR) is one of those experimenters. The Variations Project (and its successor, Variations2) has been in place at Indiana University since 1996—a digital library system to deliver digitized content, principally of printed music and recorded sound, and associated research activities. (Find more information and links to later iterations online.) When the LC Report on the Future of Bibliographic Control appeared in late 2007, its invitation for further testing of the FRBR model found takers. The Project received a National Leadership Grant from the Institute of Museum and Library Services to provide a real and concrete testbed for FRBR—large amounts of real data (for over 200,000 scores and sound recordings) in a production environment. This would include creating cataloging and searching interfaces to the FRBRized data, and producing documentation to be shared with the wider digital community. The central challenge—taking a human-readable conceptual document and creating from it a working data model.

As background, Riley discussed some of the decisions made—to interpret FRBR as literally as possible, and to represent the data in XML (version 1.0 of the XML Schema appeared in March 2010). Among the goals—reusability, couched in FRBR terminology, human-readability, and the capacity to go beyond “core FRBR” if needed.

FRAD adds new attributes to FRBR entities (place of origin of work, history of the work) and a new entity—Family. It expresses new relationships among Group 2 entities—real person versus attribution, personas. It extends the FRBR model to include the naming/assigning of identifiers to bibliographic entities as the basis for controlled access points representing those entities. By including the activities of rules and cataloging agencies in the model, FRAD goes beyond the data itself to model the creative process. In this, FRAD assumes an authority control process similar to what we do now.

What does this mean? For one thing, a change in definitions. In FRBR, one attribute of a work is its title; in FRAD, a similar attribute would be a work’s name. In providing a framework for routinely recording attributes of persons far beyond current practice (e.g. gender, place of birth and death, occupation, areas of activity), FRAD provides many new possibilities for catalogs. Among them: true internationalization by giving the user control over the language of display; offering supplementary information to provide context for the resources they seek; on-the-fly assembly and display of data for disambiguation; being an effective path to relevant resources not before known to the user; and serving as a research system and not merely a finding aid. Riley cited Open Library’s goal of creating a Web page for every author, concept, and work.
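One of the possibilities above, on-the-fly assembly of display data for disambiguation, can be sketched briefly; the person records and attribute names below are invented for illustration, not drawn from FRAD itself or any real authority file.

```python
# Hypothetical authority-style records carrying FRAD-like person
# attributes (dates, occupation, area of activity); all data invented.
persons = [
    {"name": "Smith, John", "dates": "1945-2001",
     "occupation": "historian", "field": "naval history"},
    {"name": "Smith, John", "dates": "1970-",
     "occupation": "novelist", "field": "science fiction"},
]

def disambiguate(name, persons):
    """Assemble display strings on the fly for persons sharing a name."""
    matches = [p for p in persons if p["name"] == name]
    return [f"{p['name']} ({p['dates']}; {p['occupation']}, {p['field']})"
            for p in matches]

for line in disambiguate("Smith, John", persons):
    print(line)
```

The point is that once such attributes are routinely recorded, the catalog can build contextual displays like these at search time instead of relying on whatever qualifier happened to be embedded in the heading string.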

Toward this goal, V/FRBR decided to: 1) add the FRAD attributes to the FRBR entities, and 2) add the Family entity. These were relatively easy. Also being done, but with more difficulty, is adding the new relationships FRAD affords. The Name, Identifier, Controlled Access Point, Rules, and Agency entities are not being added, since their principal value is supporting a multilingual environment, which is not where Variations is.

The Project developed three models of FRBR implementation:

  1. frbr, with the entities/attributes/relationships from the FRBR report, plus an @identifier and a wrapping structure;
  2. efrbr, which adds XML attributes, and groups publication elements; and
  3. vfrbr, which expands and restricts efrbr for description of music materials. This last level models how the basic model can be adapted to a specific community’s needs.
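As a rough illustration of the layered approach, a record at the base frbr layer might resemble the sketch below; the element names and structure are invented for illustration and are not the published V/FRBR XML Schema, which should be consulted directly.

```
<!-- Hypothetical, simplified FRBR-style record; element names are
     illustrative only, not the actual V/FRBR vocabulary. -->
<frbr:work identifier="w0001"
           xmlns:frbr="http://example.org/frbr-sketch">
  <frbr:titleOfTheWork>Don Quixote</frbr:titleOfTheWork>
  <frbr:formOfWork>novel</frbr:formOfWork>
  <frbr:isRealizedThrough ref="e0001"/>
</frbr:work>
```

The efrbr and vfrbr layers would then add XML attributes, grouped publication elements, and music-specific restrictions on top of a structure like this one.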

The early beta search and FRBRized data will soon be available. There are limitations—some of the data-mapping from MARC has posed problems, and data for newly-added attributes will not be present. This may be ameliorated in future by calling such data on the fly from associated authority records. In short, things are at a modeling stage.

For next steps from the community, Riley called for:

  1. IFLA to resolve the differences between FRBR and FRAD;
  2. real FRAD implementations to be developed;
  3. concrete benefits from the added data in FRAD to be demonstrated; and
  4. the community to show an ability and willingness to pull the extra data in from other sources, and to push it out as needed. This last point requires a level of trust that is still developing.

Cartographic Materials IG

Discussion centered on participants sharing what their institutions had been doing to prepare for the release of RDA. Some participants reported no action on RDA at their institutions. Others shared learning opportunities they had taken advantage of, already available on web sites prior to RDA’s release. Jay Weitz, OCLC, distributed a handout of OCLC instructions regarding procedures that OCLC member libraries are expected to observe prior to and during the National Libraries Test period of RDA in October.

Participants discussed the possibility of conducting Midwinter Meetings online, either during or prior to the actual ALA Midwinter Meeting. Some participants asked why the group could not both conduct such online meetings and attend the Midwinter Meeting. Participants voted to study the issue more thoroughly and consult ALCTS leadership on the feasibility of implementing such online meetings. The meeting concluded with a vote of confidence from the group to pursue the online meeting(s) idea and to firm up the particulars of implementation by Annual 2011.

Catalog Form and Function IG

The forum at the Annual Conference included four speakers discussing the topic of mobile catalog interfaces. Discussion centered on the integration of mobile catalog interfaces with platforms such as: Summon, VuFind, Endeca, AquaBrowser, and the local catalog. Presentations addressed the following topics:

  • Mobile catalog interface design strategies and challenges
  • Do mobile catalog interfaces meet end user needs?

About 60 people attended the session. Presentation slides and abstracts from the forum are available on the Group's wiki.

Richard Guajardo (University of Houston) served as Chair during the year, with Katherine Harvey (University of California, Irvine) as Chair-Elect. Charley Pennell (North Carolina State University) remained as Past-Chair. For 2010-2011, Katherine Harvey will serve as Chair, while Cheryl Gowing (University of Miami) serves as Chair-Elect. Richard Guajardo will become Past Chair.

Catalog Management IG

The Catalog Management Interest Group meeting featured two presentations and an announcement of a collaborative effort, with time for questions after each.

Sevim McCutcheon, Catalog Librarian, Kent State University, gave a presentation titled “Integrating Enhance and NACO Work into Pre-Professional Experiences: A Successful Strategy for All.” Library school students are trained and monitored as they learn to make contributions that not only assist the library but are shared at the network level. Work included ETDs and names in music and audiovisual records that were contributed to NACO.

Krista Clumpner, Head of Technical Services and Systems, Northern Michigan University, spoke on “Using Cataloging for Weeding and Retention.” Given a mandate to reduce print collections, the project identified zero-use items as well as items to retain, such as unique items and classics. An 852 note in the holdings record avoided bibliographic record overlay and recorded retention decisions. Retained items with low usage had bibliographic records updated with tables of contents or added subject access.

Philip Young, chair of the group and Catalog Librarian, Virginia Tech, announced a batch loading collaborative that began in January 2010. The purpose of the group is to improve batch MARC record quality through communication with vendors, bibliographic utilities, and other catalog librarians. The group has an electronic discussion list and a wiki and encourages new participants. Presentations are available online.

Cataloging Norms IG

The meeting was co-chaired by Rebecca Routh and Michael Kim, with introductions of the new incoming co-chairs and co-vice-chairs. The program that followed consisted of three presentations on a variety of current issues.

The first presentation, by Karen Miller (Monographic/Digital Projects Cataloger, Northwestern University Library) focused on “Authority control and the digital repository: what happens to controlled vocabulary once it's outside the ILS?” Miller addressed the problem of how to keep controlled names and vocabularies up to date once they have been exported outside the ILS.

The second presentation, by Sheila Bair (Metadata and Cataloging Librarian, Western Michigan University) and Myung-Ja Han (Metadata Librarian, University of Illinois at Urbana-Champaign), described their Best Practices for CONTENTdm Users Creating Shareable Metadata. The presentation included a discussion of some of the issues related to migrating metadata from CONTENTdm to WorldCat (from Dublin Core to MARC) using the Gateway, and various solutions developed by the OCLC CONTENTdm Metadata Working Group.

The third presentation, by Marliese Thomas (Database Enhancement Librarian) and Greg Schmidt (Special Collections and Archives Librarian) at Auburn University Libraries, was entitled “Discovery Layer or Monster Mash?” They shared their experiences working with VuFind and EBSCO Discovery Service to better understand the importance of metadata placement and consistency for maximum discovery.

The meeting concluded with a fifteen-minute question-and-answer period. Slides and handouts from all the presentations are posted on ALA Connect and the ALA conference wiki.

Competencies and Education for a Career in Cataloging IG

An agenda for the June 25, 2010 meeting at ALA Annual 2010 was distributed, along with minutes of the January 15, 2010 meeting at ALA Midwinter. Sylvia Hall-Ellis, chair of the interest group, reviewed the group’s function statement with attendees. Hall-Ellis stated that approval of the petition to establish the interest group and its function statement took most of the spring following the 2010 ALA Midwinter Meeting; formation of the group was approved by the ALCTS CCS Executive Committee and forwarded to the ALCTS Office. She thanked those who signed the petition and those who were present at the meeting. There are currently thirty-five people on the interest group’s mailing list. The group then reviewed documents in the handouts provided, including rules for establishing interest groups, taken from the ALCTS website.

Attendees voted to select a vice-chair. There were two nominations, one for Linda Smith Griffin and another for Angela Kinney. Kinney was elected as the vice-chair.

The Group planned the next steps as follows:

  • Hall-Ellis will ensure that interest group information and function are posted to the ALCTS web site by ALA Midwinter 2011.
  • Tamara Phalen will share information about ALA Connect with the group.
  • Angela Kinney will deliver the minutes from this meeting to Sylvia to submit to ALCTS.
  • Interest group members will have a managed discussion at ALA Midwinter 2011.
  • The interest group will plan a formal program for ALA Annual 2012 (once approved by ALCTS).
  • Once the formal program is approved, Phalen will create a page on ALA Connect for ideas to build the ALA Annual 2012 program.
  • The interest group will continue to meet on Fridays 4-5:15 pm.
  • The group will try to use Elluminate for a future meeting.
  • Hall-Ellis will work with ALCTS to reserve a location and meeting time for ALA Midwinter 2011.

Copy Cataloging IG

The group had a very interesting and varied program reflecting how the world of copy cataloging is changing: growing increasingly dynamic, blurring traditional lines and divisions of labor and roles, and becoming more of a cooperative enterprise. The presentations and discussion addressed four very current topics: an update on the Library of Congress’s commitment to increased copy cataloging; copy catalogers at one library as participants in the PCC program and cooperative cataloging; that library’s experience with Slavic vendor records; and a novel and engaging presentation of an approach to helping catalogers, original and copy, become familiar with the language of FRBR and FRAD as the groundwork for RDA training. More than ninety-six people attended the meeting.

Traditional LC Report

Angela Kinney, Chief, African, Latin American & Western European Division, Acquisitions and Bibliographic Access Directorate

“Traditional” because for many years the report from LC has been a part of every Copy Cataloging IG or DG meeting. A few years back Congress issued a directive that the library comply with the spirit of the Government Performance and Results Act of 1993 (GPRA). In 2008, the Library of Congress developed a strategic plan for 2008-2013 in the spirit of the GPRA to better manage the work of the library, set goals, account for moneys spent, and produce evidence of outcomes. As a result, LC set a goal of increasing its copy cataloging to 30 percent of all materials published in the United States. The percentage of copy available will increase for several years. All copy used is treated as original cataloging would be, in terms of validation and authority work when necessary. LC copy cataloging not only helps LC deal with its own backlogs, but also provides a valuable service to the national community.

Kinney explained that the decision to cease establishing series authority records was in part an outcome of the time-consuming training needed to accomplish this work. At the same time, like other libraries, LC is beginning to “grow paraprofessionals” to do some original cataloging, and is finding that the original catalogers are not ready to let go of their traditional jobs while learning to perform all parts of the acquisitions and cataloging process. It has been a question of hiring versus finding ways of getting the work done; the focus was placed on processing operations, which has led to reorganization to streamline cataloging and acquisitions, training of paraprofessionals, and merging of acquisitions and cataloging (in a word, behaving like other libraries nationally under the same pressures). Kinney also noted that LC is working more closely with vendors such as Casalini (for a few years now) and a vendor in Argentina to improve the quality of the records they provide and lower costs. LC’s overseas offices are being trained to produce copy cataloging now that they can input directly into LC’s ILS (Voyager); at least one is ready to be independent. This means that LC-vetted copy from abroad will also increase. She ended by noting that the RDA testing will include copy cataloging.

With Strings Attached: Training FRBR and FRAD via Simulation and Play

Peter J. Rolla, Assistant Professor, Catalog Librarian, University of Colorado at Boulder Libraries and James P. Ascher, Assistant Professor and Rare Book Cataloger, University of Colorado at Boulder Libraries

In a very lively presentation, the presenters discussed and demonstrated how they trained staff, including copy catalogers, in FRBR and FRAD as a foundation for RDA and other next-generation cataloging tools. They approached the problem from a sociological point of view: there is a strong fear factor in trying to get used to the concepts of FRBR, FRAD, and soon RDA. Their training is not meant to make everyone a specialist but to make staff comfortable using the terms and talking about them. They referred to it as a “socialization” process for new standards, and divided it into three stages: generating buzz about FRBR using blogs as a space for discussion; providing a reason to know via the process; and then a debriefing, sharing participants’ voices on how they understood the material. Questions related to the activities around the training: What if people do not know how to use a blog? What kind of reception did you have (they worked with a group of about ten original and copy catalogers at the University of Colorado)? How did you make the RDA connection?

The audience found it interesting to hear how, after an initial reading of FRBR (uploaded from a web site), the discussion was nurtured by the two librarians—active participation and engaged management of the process seemed an essential component. They began with emails, asked other people to solicit yet others to join the project, and, once discussion got going on the blog, actually sent false messages to solicit responses and see whether people were reading attentively. A group activity consisted simply of long multicolored strings labeled with FRBR terms; using a board, catalogers worked through the ideas of FRBR in an active way, tracing Don Quixote in all its formats through the different relationships established in FRBR, e.g., work-to-work, work-to-manifestation, etc. Some thought has to be given to planning the project—at least one person familiar with FRBR and one with various languages are essential. Result: learning and work styles emerged as each group progressed, natural leaders arose, and people learned how to move comfortably in the world of FRBR. “Debriefing” was learning what people thought about FRBR. One person in the audience noted that, besides being entertaining and instructive, this would be a good tool to use in her small library, which lacked access to online training opportunities.

Two Aspects of Cooperative Cataloging: PCC Training for Paraprofessionals and Vendor Records’ Impact on Processing

Irina Kandarasheva, Copy Cataloging Unit Librarian, Columbia University Libraries

After giving some background on Columbia’s participation in PCC, Kandarasheva described how, since 1997, Columbia has been training its Slavic copy catalogers to contribute name and uniform title authority headings to the NAF and cataloging records to PCC for Slavic belles lettres. There were not enough original catalogers to handle these materials, Columbia wanted to avoid a backlog, and copy catalogers were enthusiastic. The union presented no difficulties, and these new tasks allowed the department to upgrade copy catalogers to the highest level possible in the classification system. It seemed like a win-win situation; everyone benefited. The process involves the paraprofessional cataloging the item directly in OCLC, creating needed authority records in OCLC, and providing a classification number, all of which goes to a reviewer. The reviewer then inputs the authority record into the NAF. As at LC, NACO series training was found to be too ambitious because of the time involved (PCC allows series to be put in as 490 0). Furthermore, copy catalogers use a local 053 proposal form to propose an 053 LC number. This required further training, but the proposals are additional contributions to NACO records and the cooperative venture. Over the years, seven copy catalogers have created full-level PCC records in a variety of languages: recently, Greek, English, Spanish, and some African languages have been added to the Slavic and East European languages started years ago.

Kandarasheva put this into the current context of increased cooperative cataloging and the search for new ways to improve it, as well as potential opportunities for catalogers to employ their knowledge and skills. She cited David Banush’s recent article in Cataloging and Classification Quarterly, “Cooperative Cataloging at the Intersection of Tradition and Transformation: Possible Futures for the Program for Cooperative Cataloging,” which discusses the PCC and its NACO component and their potential to preserve the principles of authority control. She noted that copy catalogers well versed in PCC cataloging and the creation of name authority records could potentially be trained for name disambiguation projects and could meet other needs in the metadata context, expanding the pool of talent available to perform such important work. She put her finger on a situation many technical services departments currently find themselves in: shrinking staff, the changing role of the original cataloger, and the need to train paraprofessionals in new tasks in order to keep up with our responsibilities toward our collections and patrons in a changing technological reality.

Columbia’s experience with vendor records is a good illustration of how libraries are affected by the poor quality of many vendor records and how working directly with vendors at the institutional level can lead to good outcomes. Although it requires much individual attention from a variety of librarians, working with vendors is worth it. Columbia also provides excellent vendor guides prepared for e-resources. Kandarasheva concluded that it is important to continue the effort to work with vendors to improve the acquisitions and cataloging workflow.

Meg Mering, Principal Catalog and Metadata Librarian, University of Nebraska-Lincoln Libraries, is the new IG chair.

Heads of Cataloging IG

The Heads of Cataloging Interest Group hosted a presentation on Monday, June 28, 2010 at the Washington Convention Center. Robert Ellett, CCS Executive Liaison to the Heads of Cataloging Interest Group, and Linda Smith Griffin, vice-chair/chair-elect, welcomed approximately 140 attendees to the interest group program.

RDA: What Cataloging Managers Need to Know


  • Christopher Cronin, Director of Metadata and Cataloging Services, University of Chicago
  • Shawn Miksa, Associate Professor, Department of Library and Information Sciences, College of Information, University of North Texas

Both speakers presented timely information on what catalog managers can do to prepare their staff for RDA. Cronin shared that the University of Chicago will be a participant in the RDA testing conducted by the three national libraries beginning in July. Additionally, he discussed the University of Chicago’s experience as they oriented their staff to RDA and readied themselves for the testing. He provided a sample set of the 42 MARC 21 test records created by his institution using the new code.

Miksa shared some of the noticeable changes that staff will recognize immediately, as well as decisions institutions have yet to make. Attendees were presented with the RDA testing timeline, information on the Toolkit, and a number of web sites offering additional information, most notably those of the Library of Congress and the Joint Steering Committee (JSC) for Development of RDA. To complete the discussion, Ellett shared an interesting virtual presentation on RDA created by one of his library school students.

An election was held for the vice-chair/chair-elect position for 2010-11. There were two nominations, and Christopher Cronin, Director of Metadata & Cataloging Services, University of Chicago was elected to fill the position.    

Collection Management & Development Section Groups

Collection Management in Public Libraries IG

As always, this group engaged in lively and interesting discussion of trends and issues in public library collection development and management. Topics included centralized collection development, floating and balancing, budget issues, patron-driven acquisitions, working with vendors, and downloadables/e-media.    

Continuing Resources Section Groups

Access to Continuing Resources IG

Three speakers addressed the topic: Re-thinking Library Business Models for Licensed Digital Content under Mobile and Cloud Computing.

Melissa Blaney, Publications Division Lead Web Analyst, American Chemical Society, provided an overview of the development and release process for ACS’s iPhone app, introduced in early 2010. The mobile application provides access to ACS ASAP articles across the society’s thirty-eight titles, a newsfeed, saving of abstracts to an offline folder, quick search, and sharing of links and snippets. ACS is currently working on optimization for the iPad as well as offering alternative authentication methods.

Michael Porter, Communications Manager at WebJunction and blogger, detailed his concerns about the future of libraries. In his estimation, libraries have a problem in that their identity is tied to content, mainly in the form of the physical book. As e-readers become affordable (in the $30 range), electronic formats will become the preferred access method. Libraries must provide effective content access to survive and thrive, but efficient distribution is expensive. To date, publishers have had no motivation to work with libraries to provide access to more patrons. Porter is forming a non-profit to help solve these issues so that libraries can get back to doing what they love.

Filling in for a missing panelist, Heather Staines, Global eProduct Manager, Springer Science+Business Media, provided an overview of mobile technology in the STM landscape. While the adoption of Kindles and iPads leads industry news, mobile phones are the real drivers in the race for mobile content. Publishers must determine strategies for content delivery, for marketing and promotion, and for new business models. A number of publishers have created products and apps around their current content and platforms. Libraries, too, have been forced to make their websites and their content accessible to mobile users. Issues connected to these trends include an increased demand for technical skills in libraries, methods for tracking usage, bandwidth concerns, data privacy issues, and strains on existing business models.

College and Research Libraries IG

Attendees heard three twenty-minute presentations followed by a brief question and answer session.

Electronic Resources Evaluation Central: Homing in on a Permanent Site

Lenore England and Li Fu of University of Maryland University College

England and Fu demonstrated their LibGuides-powered evaluation database, a central location where they gather various pieces of data used to evaluate their subscriptions. They are required to review subscriptions every year, so the database has become a very valuable central storehouse for information such as cost, usage, and resource fact sheets.

Ithaka S+R 2009 Faculty Survey Report

Ross Housewright of Ithaka S+R

Housewright shared results from the Ithaka 2009 faculty survey. Scientists are more willing to cancel print versions of journals than humanists, although more humanists than before are willing to do this. It also seems that faculty think that some library should maintain print, but not necessarily their own library. In addition, faculty place higher value on wide circulation of their works amongst their scholarly peers than they place on free access to those same works.

Scholarly Video Publishing to Increase Productivity and Standardization in Life Sciences

Moshe Pritsker, Journal of Visualized Experiments (JoVE)

Pritsker indicated that the video journal is critical as a teaching tool, and it has the added benefit of rewarding the video contributors with a CV entry for each video article submitted. He shared high-usage figures.    

Preservation & Reformatting Section Groups

Book and Paper IG

The Library Binding Institute (LBI) Toolkit is complete. Debra Nolan brought a completed binder to show the group. It includes a guide to library binding, standards, articles, a bibliography, a glossary, samples, etc. The toolkit can only be ordered by using an order form. Training sessions on how to use the toolkit have begun.

There was further discussion, continuing from the previous day’s Preservation Administrators Interest Group (PAIG) meeting, regarding the future of training in preservation and conservation. Conversation focused on whether other institutions were going to develop a joint training venture, whether a library degree is necessary for conservators, the need to create an alumni list with job information for former University of Texas graduates, and ways to support continuing education for conservators and technicians.

Attendees shared cost-analysis ideas and creative ways of dealing with budget issues. Ideas included arranging training through bookbinding groups, creating new, stricter binding policies, reexamining materials for pamphlet binding, and using Web 2.0 tools to share information and promote oneself and one’s institution.

Laura Bedford from the Huntington Library is the new co-chair.

Digital Conversion IG

Speakers and Topics

Leslie Johnston, Manager of Technical Architecture Initiatives, National Digital Information Infrastructure and Preservation Program (NDIIPP), presented an overview of the program’s current initiatives, including distributed preservation, use of the cloud, and the documentation and validation of file formats.

Peter Alyea, Digital Conservation Specialist, Library of Congress, shared the library's latest research into 3-D imaging and the digitization of analog audio discs. The research revolves around a system called IRENE, which can transform damaged and old records into digital audio. IRENE has the potential to help preserve thousands of records that are currently unplayable with a conventional needle and turntable.

Other Business

Nominations for chair and vice-chair were solicited from the attendees, but there was no response. Two possible candidates were mentioned at the PARS All Committee Meeting: Cassandra Gallegos and Kim Peach.

Intellectual Access to Preservation Metadata IG

Janet Gertz, Chair of the PARS Audio Preservation Metadata Task Force, gave an update on a chart the task force prepared with the MLA BCC Metadata Subcommittee. The chart, titled “Metadata Standards and Guidelines Relevant to Digital Audio,” provides a quick overview of metadata standards that are currently being used to describe, manage, and preserve digital audio files. It can be found on the ALCTS web site.

Gertz was followed by a presentation from George Blood of Safe Sound Archive. The following is an excerpt of his presentation, “SIPs, DIPs, and Trips: How Will We Know If We’ve Collected Enough, or the Right, Metadata?”

Many institutions rely on the OAIS model for their preservation programs, yet few have built end-to-end solutions. Libraries digitize large volumes of resources, collect metadata, and create submission information packages (SIPs) for our digital repositories, trusted or otherwise. The use of metadata to access those collections is years behind the actual collection of the data. How did we choose our metadata without knowing who would need it or how it would be accessed? Perhaps the focus should have been “What metadata will users and managers need?” rather than “What metadata shall we collect?” Will we be able to access our metadata in meaningful ways to aid discovery, to manage collections, and to create the dissemination information packages (DIPs) users need?

Blood explored these questions and proposed that a standard set of brief information, such as file property information, title, creator, date, and keywords, might be all the end user would want from a DIP. Users interested in more information could choose to retrieve a fuller record, or even a complete record of all the metadata about a particular object or objects.
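The tiered approach Blood proposed can be sketched in a few lines of code. This is an illustrative sketch only: the field names, the sample record, and the `brief_view` helper are hypothetical and not from the presentation; what the sketch shows is the idea of deriving a brief, end-user-facing DIP view (file properties, title, creator, date, keywords) from a full metadata record, with the complete record still available on request.

```python
# Hypothetical full metadata record for one digitized audio object.
# Field names are invented for illustration, not drawn from any standard.
FULL_RECORD = {
    "title": "Inaugural address",
    "creator": "Unknown speaker",
    "date": "1933-03-04",
    "keywords": ["speech", "audio", "78 rpm"],
    "file_format": "WAV",
    "file_size_bytes": 52428800,
    "sample_rate_hz": 96000,
    "checksum_md5": "d41d8cd98f00b204e9800998ecf8427e",
    "qc_operator": "staff-042",
}

# The brief set Blood suggested would satisfy most end users:
# file properties plus title, creator, date, and keywords.
BRIEF_FIELDS = ["title", "creator", "date", "keywords",
                "file_format", "file_size_bytes"]

def brief_view(record):
    """Return only the brief, user-facing subset of a full record."""
    return {k: record[k] for k in BRIEF_FIELDS if k in record}

# The brief DIP view hides administrative fields such as the checksum
# and QC operator, which remain in the full record for those who ask.
print(brief_view(FULL_RECORD))
```

The design point is that the repository stores one complete record but serves different "packages" of it: the brief view by default, the full record on demand.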

At the end of his presentation, Blood called for the formation of a task force to draft a uniform standard for the kinds of metadata a DIP should include. He hopes the task force can draft something in time to present it at the 2011 ALA Midwinter Meeting. Several people in attendance volunteered to join the task force, and additional names were suggested.

Preservation Administrators IG

Preservation Week included webinars with over 700 participants from all over the world, and sixty events in twenty-one states were posted nationwide. Preservation Week sparked collaboration among ALA, the Society of American Archivists (SAA), and the American Institute for Conservation (AIC).

CE offerings for the year included preservation options and disaster preparedness. Preservation 101 had a test run and garnered a good audience turnout. It will be offered again in October 2010.

The document “Metadata Standards and Guidelines Relevant to Digital Audio” prepared by the PARS Task Force on Audio Preservation Metadata in cooperation with the MLA BCC Metadata Subcommittee is available on the new PARS website.

There was a 15 percent response rate to a recent needs survey. Preliminary findings will be released within a few weeks.

Holly Robertson is the new member-at-large on the PARS Executive Committee. David Lowe is secretary for the PARS Executive Committee. Anne Marie Willer is the chair-elect of PARS, and Tara Kennedy is the incoming PARS section chair.

Michele Cloonan was awarded the Banks/Harris Preservation Award. Karen Motylewski and Jeanne Drewes were awarded 2010 ALCTS Presidential Citations for their work on Preservation Week.

Conservation Education

Beth Doyle facilitated a discussion on the future of book and paper conservation education. A summary has been posted at Preservation and Conservation Administration News (PCAN).

Mellon has supported conservation education programs for museums and, in February 2010, met with museum education programs, heads of conservation programs, and representatives from NEH, Kress, and IMLS. The outcome was threefold: Mellon will consider how best to meet the demand for book and library conservators; the programs discussed how they could best meet the needs of the field; and program heads agreed to discuss at their annual meeting their plans for addressing the need for library and archives conservators. Each program has since put together a plan, based on its training philosophy, for how it can fold in library and archives conservation. This information has been submitted to Mellon, and the programs are working on individual proposals that Mellon may use to help fund changes.

Over the next five years there will be one or two conservator-position endowments per year to major academic libraries. The endowments are challenge grants that require matching funds. An announcement was made to institutions, but there is no news yet about which institutions have received funding.

The University of Texas-Austin had been meeting the educational market for conservation education, and more students were not needed. A meeting was held that prompted discussion of topics such as the required conservation skills and knowledge, how attendees might participate if a conservation program were established, and how Simmons could partner with museum conservation programs to provide the MLIS. North Bennet Street School, Simmons, NEDCC, Harvard, the Massachusetts Historical Society, the Boston Athenaeum, Wellesley, and Boston College are potential participants. A three-part evaluation was conducted at UT-Austin, consisting of an alumni survey; one-hour phone calls conducted by a consultant in the field with twenty-eight employer institutions; and a group of about ten people who served as an advisory panel and analyzed both surveys to determine future directions. The consensus was to move forward at Simmons with a new curriculum model and to work out how to create a sustainable conservation education program in the United States. A condensed synopsis has been posted to PCAN.


Preservation cannot manage environments sustainably without data that can be analyzed, a staff person assigned to the task who understands HVAC systems, and a process for working with building operators toward clearly understood goals (this works only when it is driven from preservation). Facilities departments and building operators lack adequate time and staff, usually welcome knowledgeable partners, often do not understand how their own systems actually work (paralleling the preservation manager who does not know what kind of environment his or her collections actually live in), and respond mainly to complaints and project requests. Too few people have a holistic understanding of these systems; their jobs are too fragmented, and those who do have the knowledge are handling large systems from a central location rather than on site. The work has been dumbed down, fragmented, and exploited by vendors catering to administrators interested in saving money. Those who manipulate the electronic controls may not even be in the same building, county, or state.

Research in Progress

The research in progress includes complete shutdowns in selected spaces, the use of aggressive setbacks, improvements in analysis software and data-gathering hardware, and a program of five two-day seminars on sustainable operations. The shutdowns, which took place during unoccupied hours, were part of an IMLS-funded initiative. Five libraries (Yale, UCLA, Birmingham Public, NYPL, and Cornell) are participating for a three-year period. The research premise: systems can be shut down without compromising preservation quality during much of the year, provided the risks to collections are measured and managed. A six-month testing period at LC demonstrated that turning the air handler off for eight hours at night changes the temperature by only 1.5 degrees.

During the project, stack spaces will be monitored to ensure that collections are not harmed. Participants will monitor HVAC performance and measure energy use. Aggressive setbacks are mainly used in laboratory work to explore the RH range and temperature equilibrium. There was a field trial of aggressive setbacks in the RIT library.

Five free regional seminars will be conducted by Peter Herzog during 2010 and 2011. They are designed to help facilities and preservation staff learn more about sustainable operations. The seminars will be held in Minneapolis, New Haven, Atlanta, Austin, and Los Angeles.

Preservation’s ability to support sustainability is very encouraging. HVAC systems turn on and off in normal operation, so how do the shutdowns affect the equipment? A normal shutdown involves variable-frequency drives rather than an entire air handler, and should not wear out equipment any faster. Collection responses to the changed conditions will be measured by embedding sensors in cabinets and bookcases. A question was raised about how this information will be publicized, to avoid a situation in which institutions whose environments cannot be managed appropriately end up worse off.

Promoting Preservation IG

A panel consisting of George Blood (chair), Tom Clareson, Charlie Kolb, Diane Vogt-O'Connor, Emily Gore, and Janet Gertz discussed "The Project Did Not Go Entirely as Planned".

Charlie Kolb, NEH, presented tips and suggestions for grant application writers. The other panelists discussed the things that can throw a grant off-course and some suggestions for avoiding these pitfalls. Once the panelists had completed their presentations, audience members were invited to tell their own stories and share what they had learned.