Reports from Annual 2009

Volunteer Reporters Cover ALCTS Forums and Events in Chicago

ALCTS members who attended the 2009 ALA Annual Conference in Chicago provided the following summary reports. We thank the volunteers who covered a program or event sponsored by ALCTS or one of its units. Their efforts enable the rest of us to benefit from their presentations. We regret that volunteers were not available to report on all the forums.

ALCTS Preconferences

Cataloging Digital Media: Back to the Future!

Marcia Barrett, The University of Alabama

This preconference was sponsored by the ALCTS Cataloging and Classification Section (CCS) and cosponsored by the Online Audiovisual Catalogers, Inc. (OLAC). It provided specific guidelines for cataloging a wide range of digital media formats and an overview of more global cataloging issues in a session on Functional Requirements for Bibliographic Records (FRBR) and Resource Description and Access (RDA). Following opening remarks by Julie Moore, Catalog Librarian, California State University, Fresno, fifty-four participants from across the United States and around the globe settled in to hear about the challenges involved in cataloging the following digital media formats: DVD-Video, Blu-ray Discs, DVD-Audio, DVD-ROM, DualDisc, Playaway, and Streaming Media. The workshop assumed knowledge of Machine-Readable Cataloging Formats (MARC 21), the Anglo-American Cataloguing Rules (AACR2), and basic cataloging knowledge of sound recordings, videorecordings, and electronic resources.

Jay Weitz, Senior Consulting Database Specialist, OCLC, covered many of these formats in his presentation, namely, DVD-Video, Blu-ray Discs, DVD-Audio, DualDisc, Playaway, and Streaming Media. He focused on three areas of the description: general material designation (GMD), physical description, and the system details note. Weitz also stressed the importance of coding the correct date and explained how to do so. One of the most important points he wanted participants to take away was that a catalog record should never have a publication date earlier than the date the format itself became commercially available. In summary, the earliest possible publication dates are 1997 for DVD-Video (or 1996 for Japanese DVD-Video), 1999 for Streaming Media, 2000 for DVD-Audio, 2004 for DualDisc, 2005 for Playaway, and 2006 for Blu-ray Discs. More time was spent on the DVD-Video format, as Weitz explained aspect ratios, color broadcast systems, and world region codes. For each of the formats, he covered coding the 007 field and emphasized how important it is to do this correctly because this information is used by OCLC as well as many integrated library systems. Weitz referred participants to the OLAC website, www.olacinc.org, for useful training materials for media cataloging. In 2008 alone, the Cataloging Policy Committee of OLAC published three cataloging guides (DVD, Playaway, and Streaming Media).

Anchalee (Joy) Panigabutra-Roberts, Metadata Services Librarian and Women and Gender Studies Faculty and Liaison, University of Nebraska-Lincoln, began her presentation on DVD-ROMs with cataloging basics, such as the fact that description drives coding in the catalog record and thus should be completed first. With the 2001 amendment of AACR2, the instructions in chapter 9 are less prescriptive, indicating that the cataloger should choose the source with the most complete information as the chief source of information. Panigabutra-Roberts’ presentation covered each area of the description and included examples. She emphasized that any information given on a DVD-ROM by the Entertainment Software Rating Board system is of great interest to users and should be included in the catalog record. Participants enjoyed the opportunity to put what they learned into practice with a hands-on exercise cataloging two DVD-ROMs.

The preconference concluded with a presentation on FRBR and RDA by Robert Ellett, Lecturer with the School of Library and Information Sciences, San Jose State University, and Catalog Librarian at the Joint Forces Staff College in Norfolk, Virginia. An understanding of FRBR is essential in order to understand RDA, and the most important thing to understand about FRBR is that it demonstrates the relationships between and among materials, creators of works, and subjects. FRBR will improve the user experience in locating information, cut the costs of description and access in our libraries, and position information providers to operate better in the Internet environment and beyond. These reasons and more formed the impetus for the creation of a new cataloging code, RDA. One major change with RDA for media cataloging will be the replacement of the GMD with three new elements: content type, media type, and carrier type. Ellett also discussed the new MARC tags that will be used to convey these new elements.

Attendees were eager to know more about RDA. Once RDA is released (expected in November 2009), the national libraries will test the content and online functionality, so it will be at least a year before RDA may be implemented. In the meantime, Ellett encouraged participants to talk with ILS vendors to determine what preparations are being made for the new cataloging code.

Attendees indicated in the evaluation forms that the preconference and the course materials were extremely valuable. The preconference received high ratings, overall.

Members of the preconference planning committee were Julie Moore, Anchalee (Joy) Panigabutra-Roberts, and Carolyn Walden, Head of Cataloging at the University of Alabama at Birmingham. Presentation slides and other program materials from the ALCTS preconference “Cataloging Digital Media: Back to the Future!” at ALA Annual (July 9, 2009) are available on the ALA Conference Materials Archive.

Manipulating Metadata: XSLT for Librarians

Lucas Mak, Michigan State University

Sponsored by the ALCTS Networked Resources and Metadata Interest Group and cosponsored by LITA, this preconference workshop was taught by Frances Knudson (Los Alamos National Laboratory) and Christine Ruotolo (University of Virginia Library). XSLT (Extensible Stylesheet Language Transformations) is a declarative language that allows one to define rules describing a desired transformation. In a library context, it is commonly used to convert data encoded in XML (Extensible Markup Language) from one metadata standard to another (e.g., from MARCXML to Dublin Core) or to transform an XML document into HTML for display.

This workshop was intended for librarians with some basic XML experience who are new to XSLT. Since XSLT style sheets are themselves XML documents, the workshop quickly reviewed some basic concepts of XML (e.g., basic structure, rules of well-formedness, nesting, and namespaces) as an introduction. Among these, namespaces are essential for metadata mash-ups and application profiles, since they differentiate elements and attributes that have the same name but different meanings and origins.
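
For instance, in the small hypothetical record below (the local namespace URI is a placeholder), namespace prefixes let a Dublin Core title and a locally defined title coexist without colliding:

    <record xmlns:dc="http://purl.org/dc/elements/1.1/"
            xmlns:local="http://example.org/local-terms/">
      <!-- Both elements are named "title," but their namespaces
           identify two different vocabularies. -->
      <dc:title>Annual Report 2009</dc:title>
      <local:title>Staff copy; do not circulate</local:title>
    </record>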

Christine Ruotolo next introduced participants to some basic elements of XSLT version 2.0. A fundamental component of XSLT is the “template,” which contains one or more XSLT elements (also called XSLT instructions) that describe how to select, sort, or further process nodes from the source XML document. Elements like <xsl:apply-templates>, <xsl:value-of>, <xsl:for-each>, and <xsl:output> were covered. XPath, another fundamental concept for using XSLT, was also discussed. XPath expressions instruct an XSLT processor how to navigate along various axes within an XML document, mark locations, and select sets of nodes for processing. XPath expressions are used in the “match,” “select,” and “test” attributes of XSLT elements.
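
As a minimal sketch (assuming a hypothetical source document whose root element library contains book elements; this is not from the workshop materials), a two-template transformation to HTML might look like this:

    <xsl:stylesheet version="2.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="html"/>
      <!-- Match the document root and build the HTML shell. -->
      <xsl:template match="/">
        <html>
          <body>
            <ul><xsl:apply-templates select="library/book"/></ul>
          </body>
        </html>
      </xsl:template>
      <!-- The XPath pattern in "match" selects each book node;
           xsl:value-of extracts its title for display. -->
      <xsl:template match="book">
        <li><xsl:value-of select="title"/></li>
      </xsl:template>
    </xsl:stylesheet>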

The workshop also covered some advanced concepts that allow users to manipulate and customize the output document. The first one introduced was <xsl:sort>, which sorts data, alphabetic or numeric, in ascending or descending order and is instrumental for customizing the order of data in an output document. The second group of advanced XSLT functions concerns string manipulation. In combination with the “select” attribute of other XSLT elements, functions like concat(string1, string2, string3, …) and substring-before(string, string) let users extract certain data from a text string. The third group, <xsl:if> and <xsl:choose>, is essential for conditional processing. These two are called conditionals since they allow users to test some condition of a node and specify different processing strategies accordingly. The last pair of XSLT elements covered was “variables” and “parameters.”
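
A short sketch, again assuming the same hypothetical book elements (with a made-up format attribute), of how several of these features combine:

    <!-- Sort books by title, then label each with its format. -->
    <xsl:template match="library">
      <!-- A variable holds a reusable value. -->
      <xsl:variable name="sep" select="' : '"/>
      <xsl:for-each select="book">
        <xsl:sort select="title" order="ascending"/>
        <!-- concat() joins the title to the separator string. -->
        <xsl:value-of select="concat(title, $sep)"/>
        <!-- Conditional processing: test the format attribute. -->
        <xsl:choose>
          <xsl:when test="@format = 'ebook'">
            <xsl:text>electronic</xsl:text>
          </xsl:when>
          <xsl:otherwise>
            <xsl:text>print</xsl:text>
          </xsl:otherwise>
        </xsl:choose>
      </xsl:for-each>
    </xsl:template>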

After an overview of the advanced XSLT elements, Knudson led the hands-on exercises in the second half of the afternoon session. There were five exercises that required workshop participants to apply various XSLT elements in transforming an XML document into an HTML document based on different specifications. Although there was not enough time to go over all five exercises in class, participants were encouraged to finish them later to get a sense of how the various XSLT elements operate.

In wrapping up the preconference workshop, Ruotolo recommended XSLT 2.0 Programmer’s Reference by Michael Kay as a ready reference, and Beginning XSLT 2.0 by Jeni Tennison as a step-by-step tutorial to read from start to finish.

RDA, FRBR, FRAD: Making the Connection

Julie Moore, California State University, Fresno

With the publication of Resource Description and Access (RDA) on the horizon in November 2009, all of the RDA sessions drew large audiences at the 2009 ALA Annual Conference. This preconference was no exception, with approximately 150 attendees. Presenters spoke in detail about the Functional Requirements for Bibliographic Records (FRBR), the Functional Requirements for Authority Data (FRAD), new terms and concepts, and how they relate to RDA, the new rules designed ultimately to replace the Anglo-American Cataloguing Rules, 2nd ed. (AACR2).

Barbara Tillett (Chief, Cataloging Policy and Support Office, Library of Congress) went into great detail explaining FRBR, including the history of how and why FRBR came to be. She explained that the user and the four main user tasks (to find, identify, select, and obtain) have been at the forefront of the developers’ minds in constructing this conceptual model. Tillett discussed the three groups of entities, the attributes/elements (“attribute” is the FRBR term; “element” is the RDA term), and the relationships. Group 1 entities are the products of intellectual and artistic endeavor (the bibliographic resources): the Work, Expression, Manifestation, and Item. The work and expression levels are the intellectual/artistic content. The initial, abstract idea that a person conceives is considered “the work.” The work is realized through “the expression” via text, sound, movement, or a gathering of resources. The expression is embodied in a “manifestation”; the manifestation is the point at which the intellectual/artistic content is actually recorded. This is where the intellectual content is added to what we call a “carrier,” which can be any format, such as a book, periodical, VHS tape, DVD, music CD, or streaming video. (When catalogers create bibliographic records, they are typically describing at the manifestation level.) The physical object in one’s hand is the “item.”

Tillett presented a particularly interesting slide showing the continuum of a family of works: from the “Equivalent Work” (original work, same expression; e.g., a microform reproduction or reprint), through a sea of “Derivatives” (same work, new expression; e.g., new editions and translations), past the cutoff point where a “Derivative” becomes a new work unto itself (e.g., summary, abstract, digest, genre change, adaptation), to the far end of the new-work spectrum, “Descriptive Works” (e.g., review, casebook, criticism, commentary).

Group 2 entities are the people or parties responsible for the intellectual/artistic content, the physical production, manufacture, and dissemination of manifestations, or the custodianship of bibliographic resources. This may be a Person, Corporate body, or Family.

Group 3 entities are the subjects of works. These include Group 1 and Group 2 entities plus Concepts (topics or subjects that we use to describe what works are about), Objects, Events, and Places. FRBR places a much greater emphasis on the relationships between the entities (and authority control) than current practice does. Tillett took the class through a quick history of cataloging, from Cutter’s Rules for a Printed Dictionary Catalogue (1876) to the “Paris Principles” (1961) associated with Seymour Lubetzky, upon which today’s cataloging rules are built. The FRBR entities are particularly helpful in collocating (or gathering) objects. FRBR is merely a conceptual model that can be used by any forthcoming codes or rules, such as RDA. It is expected that FRBR will help us to meet the user objectives of the catalog (to find, identify, select, and obtain) in new and better ways. Tillett laid the groundwork for the entire day with her lecture.

Ana Lupe Cristán (Library of Congress) provided a FRBR quiz (and later, a FRAD quiz) for a change of pace.

Robert L. Maxwell (Metadata and Special Collection Cataloging Department Chair, Brigham Young University) presented the Entity-Relationship Model. Maxwell (author of FRBR: A Guide for the Perplexed) described the entity-relationship model since the notion of relationship is so important to grasp in understanding FRBR (and thus, RDA). The entity-relationship model is much more robust than the flat-file (i.e., MARC) database structure currently used in our catalogs. Data are divided into entities. Entities are linked by relationships. Entities and relationships have certain attributes. Maxwell reviewed the three groups of FRBR entities and how they build on each other via these relationships. He provided many diagrams to show how the relationships work and led a hands-on exercise. Each participant received a notebook filled with forms representing a seed “database” and was instructed to populate it to experience how the entity-relationship model works.

Glenn Patton (Director, WorldCat Quality Management, OCLC) presented “Get to Know FRAD.” Patton edited a recently published book (June 2009) for IFLA, Functional Requirements for Authority Data: A Conceptual Model. The goal of this document was to extend the FRBR model to authority data. User tasks for authority data were defined as find, identify, contextualize, and justify. FRAD closely follows FRBR with the same or similar vocabulary. In FRAD, there is an entity-relationship model composed of entities, attributes of the entities, and relationships between the entities. As with FRBR, the developers of the FRAD conceptual model kept the user tasks at the forefront. Attributes and relationships were mapped to the user tasks. Patton explained all of the FRBR entities (which are also used in FRAD) along with the new FRAD entities, namely: Name, Identifier, Controlled access point, Rules, and Agency. Name includes the alternative linguistic form relationship, conventional name relationship, and other variant name relationships. Controlled access point includes the parallel language relationship, alternate script relationship, different rules relationship, controlled access point/corresponding subject term or classification number relationship, and controlled access point/identifier relationship.

Tom Delsey (RDA Editor, JSC) presented FRBR and FRAD as Implemented in RDA. Delsey is the mastermind behind this whole FRBR movement. At the end of his presentation, he pulled all of the elements together: FRBR, FRAD, the entity-relationship model, and how they tie into RDA. Delsey also discussed design objectives, especially emphasizing that RDA will provide a consistent, flexible, and extensible framework for describing all resources; that it will be compatible with internationally established standards, principles, and concepts; and that it will be usable by a wide range of resource description communities. Delsey discussed the possibility of publishers creating the initial data, with libraries then adding value and creating the relationship linkages necessary to make the records meaningful. He also discussed FRBR and FRAD attributes and relationships, how they fit into the RDA structure, and how these attributes have been mapped to user tasks. The RDA elements are in alignment with FRBR and FRAD attributes and relationships. (He had many slides listing the FRBR/FRAD attributes and how they relate to the RDA elements, the FRBR/FRAD relationships, and the RDA extensions to FRBR.)

Delsey briefly discussed the Resource Description Framework (RDF), a World Wide Web Consortium (W3C) data model for describing resources on the web; RDF data can be expressed in XML. RDA itself is maintained as an XML database and is intended to be used as an online tool. Catalogers can start at any level on the continuum from the work to the item, depending on their particular needs. One of the big positives for special formats catalogers is that RDA is being designed to relieve the “content versus carrier” tension. Rather than using the general material designation, there will be a categorization of resources that puts all of this data on common ground. The following will be described: Media Type (e.g., audio, computer, microform, video); Carrier Type (e.g., audio disc, audio reel, audiotape); Content Type (e.g., notated music, text, spoken word, still image); and Mode of Issuance (e.g., multipart monograph, serial, integrating resource).
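
By way of illustration (this sketch is ours, not Delsey’s; it assumes the MARC 336, 337, and 338 fields being defined at the time for content, media, and carrier type), a DVD-Video might carry these categories in MARCXML as:

    <record xmlns="http://www.loc.gov/MARC21/slim">
      <!-- Content type: what the resource expresses -->
      <datafield tag="336" ind1=" " ind2=" ">
        <subfield code="a">two-dimensional moving image</subfield>
        <subfield code="2">rdacontent</subfield>
      </datafield>
      <!-- Media type: the device needed to use it -->
      <datafield tag="337" ind1=" " ind2=" ">
        <subfield code="a">video</subfield>
        <subfield code="2">rdamedia</subfield>
      </datafield>
      <!-- Carrier type: the physical format -->
      <datafield tag="338" ind1=" " ind2=" ">
        <subfield code="a">videodisc</subfield>
        <subfield code="2">rdacarrier</subfield>
      </datafield>
    </record>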

This preconference was sponsored by the ALCTS Cataloging and Classification Section. Shawne D. Miksa (Associate Professor, Department of Library and Information Sciences, University of North Texas) was the preconference chair. June Abbas (Associate Professor, School of Library and Information Studies, The University of Oklahoma) and Barbara Schultz-Jones (Assistant Professor, Department of Library and Information Sciences, University of North Texas) were on the program planning committee and were Masters of Ceremonies.

Presentations from the preconference have been posted to the ALA Conference Materials Archive.

ALCTS Forums and Programs

ALCTS 101

Debbie Ryszka, University of Delaware

“ALCTS 101: A Primer: Who We Are, What We Do, and How You Fit” was the topic for this year’s ALCTS 101 program, held on Friday, July 10, 2009, at the American Library Association Headquarters in Chicago. About seventy-five new and veteran ALCTS members attended the fun-filled evening, organized and sponsored by ALCTS and the ALCTS Membership Committee.

Dina Giambi, ALCTS President, and Mary Case, ALCTS President-Elect, opened the evening’s program by welcoming attendees. They briefly described the work, goals, and history of ALCTS, and provided some insight into the work of the sections within ALCTS. Additionally, they highlighted some of the meeting and program offerings that would be happening during this year’s annual conference. Dina Giambi reminded the audience of her President’s Program, Monday, July 13, featuring James Cuno, president and director of the Art Institute of Chicago, who would discuss “Who Owns Antiquity? Museums and the Battle over Our Ancient Heritage.”

Both leaders spoke about the recent ALCTS e-Forum on “Creating the Future of ALCTS” and encouraged members to voice their opinions about the future direction and structure of ALCTS. An open forum entitled “Creating Our Future” would take place on Monday, July 13, where members could share their ideas and opinions about what type of professional organization ALCTS should be down the road.

After remarks by Giambi and Case, attendees heard brief reports from individuals representing the various sections within ALCTS. Karen Darling, University of Missouri, spoke about the Acquisitions Section and its work. Valentine Muyumba, Indiana State University, discussed the Cataloging and Classification Section and why she was drawn to membership in that section when she first became a librarian. She also reminded members that RDA, the new cataloging code, is coming and that there would be several important meetings taking place at the conference about these new cataloging rules.

Lynda Wright, Randolph-Macon College, acquainted members with all of the committees in the Continuing Resources Section and the important work that many of them are undertaking. Marlene Harris, Alachua County Library District, described the Council for Regional Groups, and the types of services, assistance, and information they provide to regional and local professional library organizations. She also announced the formation of the Public Libraries Technical Services Interest Group, a new interest group within ALCTS. The group, led by Marlene Harris and Cynthia Whitacre, was scheduled to hold its first business meeting during the conference in Chicago. Gina Minks, Amigos, represented the Preservation and Reformatting Section (PARS) and discussed the recent PARS reorganization and the formation of a New Members Interest Group within the section.

The highlight of the evening was the “topical table talks,” where six teams of “table talkers” spoke on specific topics to those gathered around tables. The talks were timed at ten minutes each; when the ten minutes were up, the “table talkers” moved to another table and began their “table talk” again. The “table talkers” covered a wide variety of professional topics related to ALCTS and involvement in ALCTS. Mary Case, University of Illinois at Chicago, and Pamela Bluh, University of Maryland School of Law, were the designated timekeepers for the evening and used noisemakers to signal the end of every ten-minute session.

Cindy Hepfer, Susan Davis, and Daisy Waters, the University at Buffalo, formed a team of “table talkers” and led informative discussions on attending and organizing programs within ALCTS. Gina Minks, Amigos, gave attendees information about ALCTS committees and how to become involved in them. Becky Ryder, University of Kentucky, talked about interest groups within ALCTS and how to form one.

Charles McElroy, Florida State University, spoke to attendees about technology and Web 2.0 and sought opinions on how ALCTS should use these technologies to communicate and get its message out.

Dina Giambi and Deborah Ryszka, University of Delaware, let their groups know about the formal and informal mentoring opportunities within ALCTS and publicized the formation of the New Members Interest Group within ALCTS.

Mary Beth Weber, Rutgers University, and Peggy Johnson, University of Minnesota, teamed up to provide information about ALCTS publications and offered helpful advice on how to ready manuscripts for submission to ALCTS publications.

After the event, many commented that they enjoyed the “speed dating” format of this year’s ALCTS 101 event. It was a fun and exciting way to interact with like-minded colleagues and to hear information about ALCTS and its mission.

The Membership Committee is working on plans for ALCTS 101 events at Midwinter in Boston and Annual in Washington, D.C. in 2010. The committee will post announcements as soon as arrangements for the events are made.

ALCTS President’s Program: Who Owns Antiquity? Museums and the Battle Over Our Ancient Heritage

Joshua P. Barton, Michigan State University

The ALCTS President’s Program featured James Cuno, President and Director of the Art Institute of Chicago, who discussed his book Who Owns Antiquity? Museums and the Battle Over Our Ancient Heritage. This was an excellent program that stimulated lively discussion over controversial issues.

In recent years, many governments have demanded that ancient artifacts originating from their jurisdictions be returned by the foreign museums that hold them. Cuno presented sophisticated arguments for why artifacts should remain where they are. He pointed to the Enlightenment mission of institutions such as the British Museum, where collections are brought together to illuminate the relationships of cultures and information across time and geography. Cuno contends that culture does not know national borders; such borders are artificial boundaries made by man to define jurisdiction. When a government voices the desires of the Chinese, Greek, Italian, or whichever people, it attempts to make a cultural claim that is in fact political. By doing so, it imposes borders on antiquity that do not cohere. Cuno believes that in many cases the lack of a direct connection between the ancient culture of a particular geographic region and the current government controlling that region makes illegitimate any of that government’s claims over antiquity. In particular, he cited modern Greece’s demand for the repatriation of the Elgin Marbles.

Cuno advocated for the traditional archaeological practice of partage, in which local and foreign excavators would divide the artifacts recovered. Disseminating collections in this way prevents the risk to the artifacts from being concentrated in a single location, where a calamity could lead to devastating loss, as it has in Baghdad, Kabul, and other places. Partage ceased with the rise of nationalism. Cuno acknowledged the good that nationalism has done in liberating people, but the resulting borders are not tidy and can actually disenfranchise cultures, e.g., the Kurds in Turkey and Iraq. Cuno feels that retaining artifacts locally presents the false view of culture as national rather than historical.

During the question period, Cuno’s ideas were challenged for their removal of artifacts from the context in which they could best be understood: alongside the other artifacts they were discovered with. Cuno responded that this does not put items out of context, just in a different one. There is no one preferable context.

A question was raised about similarities between this debate and the one surrounding Native American artifacts in North America. Cuno sees this issue as distinct: Native American artifacts can be returned to an identifiable sovereign people with a living tradition, whereas antiquities have no such living tradition or identifiable people to be repatriated to. For this reason, and because of the historical circumstances under which the artifacts were acquired, Cuno believes that such items should be returned to their Native American communities.

Collection Development 2.0: The Changing Administration of Collection Development

Lisa Gardinier, Cochise College

Presenters:

  • Rick Anderson, Associate Director for Scholarly Resources and Collections, University of Utah
  • Steven Harris (moderator), Director of Collections and Acquisitions, University of New Mexico; author of Collections 2.0 blog
  • Jonathan Nabe, Collection Development Librarian for Science and Technology, and Coordinator of OpenSIUC, Southern Illinois University – Carbondale
  • Martha White, Director of Library Experiences, Lexington (Kentucky) Public Library.

In his opening remarks, Steven Harris spoke of moving beyond the trend of 2.0 to the enduring changes “2.0” will create in libraries. He explained that 2.0’s endurance will lie in connecting users to resources, whether technology is the sole pathway to the resources or merely assists in the connection; in changing collections and collection development and creating new collections; and in engaging users and learning from them to improve existing services and to develop new ones.

Jonathan Nabe’s presentation, “Fewer Cooks at SIU-Carbondale,” focused on the restructuring of collection development at Southern Illinois University – Carbondale. SIUC previously had a system of subject librarians, responsible for collection development, instruction, and department liaison relations within their subject areas, under the supervision of a collection development librarian. SIUC restructured to a system of subject liaison librarians, responsible for instruction and department relations within their subjects, and three dedicated collection development librarians, responsible for selection, analysis, and collection policies. Liaisons now function only in an advisory role when it comes to collection development. Nabe explained that the switch has created more efficiency in collection development and better balanced workloads for all. The team of dedicated collection development librarians can conduct more evaluation and analysis of the collections while the subject liaisons have been able to work on new projects and activities. The positive outcomes have been a new outlook on and increased attention to collection development. However, liaisons feel they have “lost entrée” with faculty and the workload is still quite heavy for collection development librarians.

Martha White’s presentation approached collection development and management in a more “holistic” manner, from selection to weeding and all facets of collection management in between. White is an advocate of the “give ‘em what they want” philosophy of collection development in public libraries and strives to keep library collections fresh and circulating. Illustrating her points with examples, she discussed best practices, such as “power walls,” face-outs, displays, and signage, as well as the employee training needed to keep displays and shelves looking full, tidy, and inviting, ultimately creating a welcoming and convenient environment for patrons. A significant portion of White’s talk was spent discussing centralized selection for public libraries, including the rationale and methods for keeping librarians who are no longer responsible for selection connected to collection development, and collection development librarians in touch with patrons. White recommends scheduling collection development staff for “public service visits” on reference and circulation desks at least monthly. Branch librarians still receive Publishers Weekly and Library Journal, run analyses and reports, and participate in weeding. Finally, White shared a stock turnover formula, included in her slides, to assist in selection and weeding.

Rick Anderson’s presentation focused on the more difficult questions facing academic collection development in the digital age. Beginning with a comparison of reasons for collecting before digital access to reasons for collecting now, he explained that collecting in print is largely a guessing game and our odds of guessing well are getting worse as patrons increasingly prefer digital materials. Anderson presented statistics from a range of Association of Research Library members showing decreased circulation and reshelving of print materials, upwards of a 58 percent decrease in circulation, paired with statistics showing that 40 to 57 percent of books are not checked out within their first ten years on the shelves. Anderson suggested shifting budgets toward e-on-demand, print-on-demand, interlibrary loan as collection development, and cooperative collection development for more efficient and wiser spending.

Slides are available from the ALA Conference Materials Archive.

Electronic Resource Management Systems: The Promise and Disappointment

Ellen Symons, Queen’s University, Kingston, Ontario, Canada

The ALCTS CRS Acquisitions Committee sponsored a thought-provoking session on electronic resources management systems (ERMS). The emergence of commercial ERMS has held out the promise of solving multiple problems for electronic resource managers, particularly in acquisitions and licensing. However, libraries adopting these systems have achieved varying degrees of success in their implementation and maintenance.

Jeannie Downey, Electronic Resources Coordinator, University of Houston, began the session by discussing some of the joys—and the problems—she has encountered with the Serials Solutions 360 suite of products. She was pleased with the explicit documentation, client service with a 24-hour response time, and online community support provided by Serials Solutions. Good features of the 360 suite include full integration, which lets users find and view information such as licenses, contacts, and alerts easily; alerts that send an email when a trial is about to expire or a license needs to be renewed; and a contacts section with a good selection of statuses for vendor representative information. Downey’s wish list of features she would like to see in the 360 Resource Manager, drawn from the problems they have encountered, is:

  • the ability to have “All titles” search all titles, databases as well as e-journals and e-books
  • the ability to add an unlimited number of customizable fields to the ERM to give more granularity and continuity
  • the ability to add an alert when adding notes to a resource

Betsy Friesen, Technical Services Analyst, University of Minnesota, reported on their experience using the Ex Libris ERM, Verde. Friesen discussed the “promises” that Ex Libris had made for Verde, such as being a complete solution for all electronic resource management requirements; facilitating budget management and collection development; extensive consortium support; and integration with existing library applications. The technical services department liked the fact that Verde offers comprehensive licensing that is compliant with ERMI guidelines, that it tracks use statistics from e-resource vendors, and that it integrates well with SFX. However, Verde has turned out to be an incredibly complex system to use, and it is not flexible enough. The documentation is not very good, and although the customer service from Ex Libris is “decent,” help is required too often. Integration with Aleph and SFX is not as seamless as expected. The overall functionality is troubled by too many bugs with, for example, reporting and data loading. Since there have been so many problems with Verde, Ex Libris has ceased development of the product. They will fix the functional bugs, but there will no longer be any enhancements.

Bowling Green State University (BGSU) began implementation of the Innovative Millennium ERM in 2004. Jeanne Langendorfer, Coordinator of Serials, spoke about the need for an ERM at BGSU: information about e-resources was stored in many different places (order or item records, spreadsheets, and email messages), and only some people had access to it. They also needed an A-Z list, which subject librarians had been maintaining in a spreadsheet. Langendorfer’s presentation focused on the details of resource records and license records. Most of the disappointments they encountered stem from decisions made at implementation, such as not using the Coverage Load or Usage Statistics modules. They would also use more ticklers, or email alerts, in resource records to improve workflow.

Apryl Price, Electronic Resource Librarian, Texas A&M University, spoke about the Gold Rush ERM, which started off with promise but ended up with lots of disappointment. Gold Rush is a project of the Colorado Alliance of Research Libraries; it is a web-based product that provides subscription management as well as a link resolver and an A-Z list. Gold Rush is simple to use, inexpensive (the whole product is under $5,000), and quickly implemented. It has good documentation, training, webinars, and personal customer service. They have encountered some problems, such as not being able to upload e-book packages, because Gold Rush is meant for e-journals and would not accept ISBNs. Price recommends Gold Rush for small libraries; although she is happy with the product, the Texas A&M University consortium is too complex for it, and they will be migrating to a larger ERM in the future.

The Future of MARC

Joshua P. Barton, Michigan State University

This forum, organized by the ALCTS Cataloging and Classification Section Executive Committee, was moderated by Karen Coyle and featured the following speakers:

  • Rebecca Guenther, Library of Congress, Senior Networking and Standards Specialist in the Network Development and MARC Standards Office
  • Ted Fons, Director, OCLC WorldCat Global Metadata Network
  • Amy Eklund, Catalog Librarian in JCLRC Technical Services, Georgia Perimeter College
  • Diane Hillmann, Director of Metadata Initiatives, Information Institute of Syracuse, and Partner, Metadata Management Associates

Presentation materials are archived in ALA Connect. An audio recording (mp3) is also available online.

Coyle observed that there have been several forums on the death of MARC since 2000, and yet the debate on MARC’s future continues. Rebecca Guenther’s presentation, “Evolving MARC 21 for the Future,” set a useful context for the rest of the forum. Guenther began by distinguishing between MARC as a syntax and MARC as a carrier, a distinction the debate ought to acknowledge. She outlined the MARC environment: the sharing it has enabled and the wide ILS support it enjoys. MARC has succeeded in its ability to carry data originating from different rules and conventions and in its support of multifaceted retrieval, but it also has problems: only a small community understands its syntax; it is limited by numeric and alphabetic fields and subfields; and it carries redundant data and has linking limitations, among other things. MARC has seen progress recently in MARCXML, linking capability, exploration of URIs, and the use of MODS for repackaging. To streamline MARC for future use, Guenther advocated further use of XML. Guenther briefly discussed the MODS and METS schemas. She also pointed out the Library of Congress’ experimentation with linked data through the availability of LCSH in SKOS format at http://id.loc.gov.
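
As a rough sketch of what that linked data looks like (the heading and identifiers below are hypothetical, for illustration only; real LCSH URIs resolve at id.loc.gov), an LCSH concept in SKOS, serialized as RDF/XML, might be:

    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:skos="http://www.w3.org/2004/02/skos/core#">
      <!-- Hypothetical heading; prefLabel is the authorized form,
           altLabel a variant, broader a link to a wider concept. -->
      <skos:Concept rdf:about="http://id.loc.gov/authorities/sh0000000">
        <skos:prefLabel xml:lang="en">Example heading</skos:prefLabel>
        <skos:altLabel xml:lang="en">Sample heading</skos:altLabel>
        <skos:broader rdf:resource="http://id.loc.gov/authorities/sh0000001"/>
      </skos:Concept>
    </rdf:RDF>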

Ted Fons presented “Beyond the Record: OCLC and the Future of MARC.” Fons gave an overview of the Crosswalk Web Service at OCLC Research, a translation service for moving between metadata standards, including MARC. The core of the translation is OCLC’s own Common Data Format (CDF), which sits in the middle of all standards translations: data is pulled from one format into CDF and output into another format. Fons presented this as one way that OCLC is “moving beyond” MARC. He also gave instances of how OCLC is using data originating in MARC to do many different things, such as the detail displays on worldcat.org, WorldCat Identities, and data mining.

Amy Eklund gave an overview of the MARC Content Designation Utilization project (MCDU), which originated at the University of North Texas. MCDU analyzed usage of MARC fields and subfields in Library of Congress and OCLC records. They found commonly used fields and subfields and compared these with those prescribed by national and cooperative standards. Discrepancies were discovered between actual usage and standard prescriptions, i.e. some prescribed fields were used little and some unprescribed fields were used commonly. Eklund suggested that projects like MCDU could be drawn upon to streamline decisions among standards-making bodies and that the future of MARC could be better informed by empirical data. There seemed to be agreement among the audience that such empirical data could be useful, but there was division on the appropriateness of using frequency of field or subfield usage to inform standards decisions. Further information about MCDU is available at http://www.mcdu.unt.edu/ and in forthcoming publications.

Diane Hillmann pondered the question “Does MARC Have a Future?” Some will say “yes” because we are in a transition and still need to communicate in MARC until other tools are available. Hillmann contended that some who say “yes” do so because MARC is simply “how we’re wired” to think. Others will say “no” because keeping up with MARC raises the cost of change later; because we need to move on with FRBR, and MARC is built for flat files, not FRBRized environments; and because RDA cannot be used well with MARC. Still others will say “maybe” to the future of MARC if that future is the next five to eight years: since RDA is influenced by MARC, it will look like MARC, so there is a future for it in that sense. Hillmann suggested that we replace MARC with something more easily used and understood by other communities and that we look to those communities for ideas on how to do so. In particular, Hillmann mentioned using RDA with XML.

Going Global! Finding Non-English Language Specialists

Magda El-Sherbini, Ohio State University

The ALCTS Non-English Access Steering Committee's program, “Going Global! Finding Non-English Language Specialists,” took place on Monday, July 13. The program was sponsored by ALCTS/ACRL/LLAMA. Support was generously provided by Casalini Libri.

The program was not very well attended; there were about thirty participants and an additional ten who signed in but did not stay until the end. Beth Picknally Camden, Goldstein Director of Information Processing, University of Pennsylvania Libraries, was the program moderator.

Magda El-Sherbini, Chair of the Non-English Steering Committee, opened the program by welcoming the audience, introducing the moderator and thanking the Steering Committee for their work on this program.

Camden introduced the speakers and gave a brief background on the original Non-English Task Force and the Steering Committee.

The first speaker was Amy Hart, Minuteman Library Network, Natick, Massachusetts, who gave a presentation on “Providing Access to Chinese and Russian Language Materials in the Minuteman Library Network.” She began by sketching the background of the network’s early efforts to support non-English language access by hiring a temporary cataloger to process Chinese and Russian materials. She also addressed the issues involved in selecting and acquiring materials in foreign languages and how her institution used volunteers from the community to assist the library with collection development. In addition, she discussed how the need for permanent staffing in this area led to hiring the first part-time Chinese language materials cataloger, and she addressed the challenges of managing the workflow and how they worked to solve these issues.

The second speaker was Patricia Stelter, Vice President, Bibliographic Services at Backstage Library Works, who briefly discussed how to identify and recruit the right non-English language specialists. She described ways to find specialists by contacting universities and posting both free and paid advertisements. She also addressed how difficult it is to work with overseas staff with language skills and to check the quality of non-English cataloging work done overseas.

Sherab Chen, the third speaker, from Ohio State University, gave a presentation on “Empowerment, Innovation and Collaboration in the Area of Non-English Language Cataloging at OSUL's Cataloging Department,” in which he discussed how to find non-English specialists. He talked about how the Ohio State University Libraries has been using student language specialists and other non-traditional types of employees to assist with its non-Latin cataloging workflow. Taking advantage of today's globalization and the Web 2.0 environment, the Cataloging Department has initiated a series of projects to step to the frontier of non-English language cataloging. In the presentation, he demonstrated some innovative training methods that can empower student assistants and shared experiences in using Web 2.0 tools to enhance the management of a cataloging section with a great diversity of staffing. He also discussed the question of quality control in non-Latin cataloging and how collaboration within the department can help in this important aspect.

The fourth presentation was given by Alena Aissing and Hikaru Nakano, who discussed “Adding Vernacular Cyrillic to Catalog Records Despite Too Little Money and Too Few Staff.” They discussed how difficult it was to find language-proficient staff to assist in adding Cyrillic characters to many of their Slavic records. They also shared their success in finding a specialist and have now added Cyrillic characters to 10,000 records in both OCLC and their local system.

Keeping the Best in Challenging Economic Times: Evaluating and Assessing Collections for Cancellation Decisions

Susan Thomas, Indiana University South Bend

A three-member panel discussed different methods for evaluating serial and database collections to facilitate the difficult decision of what to cancel in lean economic times. Greg Raschke, Associate Director for Collections and Scholarly Communication at North Carolina State University Libraries, initiated the discussion by noting that by 2004 e-journals had become the dominant format, creating cognitive dissonance among users when told that titles need to be cancelled. His serial review involved examining usage statistics, publication and bibliographic information, and feedback from users. He also compared expenditure data with institutional data. Raschke indicated that user-based collection models, such as purchase on demand, are growing; that many e-journals still do not provide usage data (which we should all start requesting); and that it is important to conduct serial reviews often. He found that usage and feedback data were strongly correlated, and that the Eigenfactor is a good predictor of a title’s usage and popularity. He stressed the importance of involving users in the review process.

Gerri Foudy, Manager of Collections and Scholarly Communications, University of Maryland at College Park, discussed a database review and cancellation project. She advocated evaluating and prioritizing databases for possible cancellation to avoid depleting limited funds needed for the purchase of other resources. For example, her library has never adjusted the book budget to cover serial expenditures. Her review used a criteria-based decision grid (from Ingrid Bens’s Facilitation at a Glance) with a goal of identifying 25 percent of the database collection for possible cancellation. An initial discussion revealed a core list of databases that equated to 75 percent of the collection. Evaluative criteria consisted of rank from subject teams, access, cost-effectiveness determined by cost per user, coverage/audience, and uniqueness of the title. Titles identified for possible cancellation were ranked with a tiered system in which a rank of 3 indicated that cancellation would cause the least loss for users.

Paul Metz, Director of Collection Management, Virginia Tech University, discussed a method of serial review that involved examining table-of-contents alerts, click-throughs in Serials Solutions, title information such as whether a title is refereed, Katz listings, and abstracting and indexing coverage, along with the number of libraries holding a subscription. Metz also factored user feedback and online comments into the serials review. His analysis revealed that sometimes it is more cost-effective to stay with a big serials package deal than to subscribe to individual titles.

Leadership Development in Transition: Steering the Ship from Helm and Deck

Rebecca Schroeder, Brigham Young University

This program featured three panelists who addressed the challenges faced by librarians at all levels in the effort to reshape the library as an institution and foster a new generation of leadership.

Jill Canono, a leadership consultant from the State Library and Archives of Florida, gave an excellent overview of the complexities of shared leadership. She used questions and audience participation to offer suggestions, draw conclusions, and share solutions for meeting leadership challenges. She specifically pointed out that difficulties arise when librarians do not work to revise outdated rules and regulations or do not encourage and empower front-line staff to experiment and take calculated risks. To help promote effective leadership, she suggested librarians seek new answers to their questions and different ways to communicate effectively. She also encouraged them to use all their resources and to volunteer for different duties. She concluded by advising librarians to nurture their “imaginal” cells, a concept she likened to the cellular transformation of caterpillars into butterflies; a similarly transforming change can happen to librarians if they use their “imaginal” cells as they mature in their work responsibilities.

In response to Canono’s presentation, Olivia Madison, Dean of the Library, Iowa State University, spoke about leadership in an organization. She pointed out that a common way for everyone to participate in leadership is to serve on committees. Using her library as an example, she described the benefits of being heavily invested in committee work. Nearly forty different committees or task groups, each utilizing five to ten people, have contributed to her library establishing a great leadership base. With this in mind, she suggests that the way to grow leadership is to volunteer for, contribute to, and participate fully in committee service. It does not matter whether you are a member of the group or the chair of the committee. What matters is whether you are engaged, involved, and abreast of the issues central to the committee’s efforts. Committee members should provide constructive criticism, refrain from ridicule, spread their enthusiasm and humor, and learn good project management skills. In her closing remarks, Madison reminded library administrators that they have a commitment to and a responsibility for mentoring leadership in their libraries.

Nanette Donohue, Technical Services Manager, Champaign Public Library, also responded to Canono. She gave advice for both new and veteran librarians in their different leadership roles. She described a culture of entitlement often observed between different generations of librarians. Many younger librarians may think the older ones will not change anything and the older librarians think the younger ones want to change everything. Donohue says that leadership requires negotiations from both sides. Veteran librarians should create an environment where new librarians can blossom. Such an environment would foster a feeling of safety for people to raise questions and offer criticism. Librarians should view any such criticisms as opportunities to make changes. New librarians should be respectful, cultivate connections, and work to develop a network of colleagues. They should not be afraid to ask questions and should have the courage to take risks.

As a summary of the program, the panelists gave participants a handout with eight take-away points for leadership development in transition:

  1. Approach volunteer calls and new duties as learning and networking opportunities.
  2. Support leadership and enhance the effectiveness of the work group, whatever your role.
  3. See strong project management skills and processes as everybody’s job.
  4. Mentor and be willing to be mentored to gain new knowledge, skills, and abilities.
  5. Take calculated risks with win-win results.
  6. Seek new answers to existing questions, situations, challenges, problems…
  7. Acknowledge and use all resources.
  8. Build and expand strategic alliances.

Managing Preservation without a Preservation Librarian

Lisa Gardinier, Cochise College

Presenters:

  • Oliver Cutshaw, Librarian, the Chicago School of Professional Psychology at Southern California
  • Roberta Pilette, Head, Preservation Department, Yale University
  • Michele Stricker, Consultant, Library Development Bureau, New Jersey State Library

Roberta Pilette’s presentation focused on Paul Banks’ Ten Laws of Conservation, which she rearranged to serve as an introduction, as follows:

  • No one can have access to a document that no longer exists.
  • Multiplication and dispersal increase chances for survival of information.
  • The physical medium of a book or document contains information.
  • No reproduction can contain all the information contained in the original.
  • Conservation treatment is interpretation.
  • Authenticity cannot be restored.
  • No treatment is reversible.
  • Use causes wear.
  • Books and documents deteriorate all the time.
  • Deterioration is irreversible.

Michele Stricker, currently with the New Jersey State Library, served as the director of the Library Company of Burlington, the oldest continuously operating library in New Jersey, founded in 1757. The Library is home to many old and rare books, as well as documents important to local history, all housed in a building constructed in 1864. Stricker’s tenure as director served as a case study in managing all aspects of preservation, from planning to implementation, and from individual items to the library building. Stricker’s activities included assessing the environment of her library as well as evaluating existing preservation measures and adjusting priorities where needed. She recommends the following six steps:

  • Take responsibility.
  • Gather support.
  • Evaluate collections and environment.
  • Create policies and planning.
  • Train and assign tasks.
  • Stick to the plan.

Oliver Cutshaw is a solo librarian at the Chicago School’s Los Angeles campus. He emphasized focusing on the essentials to serve patrons’ basic research needs and recognizing that print collections are still a valuable asset for patrons. The “small library challenge” is continuous learning: teaching and training patrons to handle print collections and to use library resources, while also staying up to date as a professional librarian and taking some time to read the professional literature.

New Selectors and Selecting in New Subjects: Meeting the Challenges

Ginger Williams, Wichita State University

Panel moderator Harriet Lightman noted that new selectors need to learn both traditional collection development skills and new skills to ensure library collections remain relevant in a rapidly changing information universe. The first panelist, Linda L. Phillips, examined the changing nature of collection development. Library collections include all the materials that libraries make available, whether purchased, locally created, available through social networking tools, or freely accessible in digital collections around the world. New selectors should quickly master the traditional collection development tasks, then act independently and be willing to take risks to meet the needs of their clientele. Selectors should curate resources in the aggregate, considering all sources of information instead of focusing on purchased materials. To remain relevant, they should promote new methods of scholarly communication, customize digital resources for local needs, and help faculty create and disseminate digital content.

Arro Smith discussed some of the resources ALCTS offers to assist new selectors in learning collection development tasks. The online course series (Fundamentals of Collection Development and Management, Fundamentals of Acquisitions, Fundamentals of Electronic Resources Acquisitions, and the forthcoming Fundamentals of Collection Assessment) provides an introduction or refresher for librarians and paraprofessionals. ALCTS began a webinar series in 2009; most webinars have been two-hour live sessions, but ALCTS is recording them so they can be viewed on demand later. ALCTS publishes a variety of print resources, such as the Sudden Selector series, which focuses on collecting in specific subject areas. Finally, the ALCTS website includes a variety of documents in the professional resources section, ranging from recommended syllabi for graduate courses to ethics statements.

Jeff Kosokoff proposed a vision of collection development that considers the library’s role both locally and globally. Kosokoff pointed out that what libraries acquire individually is becoming less important as services such as full-text e-resources, document delivery, rapid interlibrary loan, and online finding tools make information from elsewhere easily available. Collection development today needs to consider both local needs and contributions to the global collection. Selectors today should worry less about getting all the best stuff and more about getting enough copies of each item into the global collection and providing access to unique materials through shared metadata and openly accessible digital copies.

PVLR Forum: Print on Demand

Katharine Farrell, Princeton University

The PVLR Forum was a panel presentation with David Taylor, President of Lightning Source; Maria Bonn, Scholarly Publishing Office at University of Michigan; Tony Sanfilippo, Marketing Director, Penn State University Press; and Mitchell Davis, founder of Book Surge.

Taylor began by defining the difference between print on demand (POD) and short-run digital printing. Short-run printing is based on the intent to hold stock, whereas print on demand is not about printing so much as book production. Print on demand is the reason for the explosion of publishing activity. Books are being kept “alive,” offering publishers the opportunity to generate revenue from older titles. Taylor noted that Taylor and Francis derives 25 percent of its revenue from print on demand. Books are also being published in a new way that facilitates both self-publishing and micropublishing. Taylor identified the key trend in publishing as a move away from inventory, driven by both financial and environmental considerations. Book distribution is changing as inventory models change. He also pointed to the potential for increased availability of large print titles at no additional cost to the publisher.

Maria Bonn of the University of Michigan described the demand for print copies of public domain titles generated by the digitization project Making of America. In a service-oriented effort to satisfy this demand, the library explored several possible options for production, settling eventually on Book Surge (subsequently acquired by Amazon) to provide print on demand copies of requested titles. In 2003, the library sold 300 copies of digitized titles; in 2008, it sold 18,000 copies. This collection is now available as print on demand through Amazon. Bonn noted that the university sees this as expanding access to print. She also said it was interesting to see what was selling, commenting that the Art of Perfumery was a “best seller.” The University of Michigan Press is now experimenting with making titles available free online with print on demand options. Bonn stated that 20 percent of the titles Google has scanned from the Michigan collection are public domain, and Michigan intends to make these available for print on demand sale once the issue of image quality for non-text material has been resolved. They are working with Hewlett Packard to make these images “reprint ready” and expect to offer a button from their ILS that would allow users to request purchase of a print copy. She mentioned the library’s Espresso machine, indicating that it has been used primarily for printing dissertations and for faculty projects. However, they are exploring the potential for printing textbooks and review copies of University of Michigan Press titles.

Tony Sanfilippo discussed print on demand from a marketing perspective. Penn State University Press has approximately 700 titles that are now out of print. He pointed out two main issues, copyright (including third-party rights) and fair use, saying these need to be “fixed,” particularly with regard to art reproductions. He stated that offset production is still best for art publications, but the capital expenditure is significant and there are warehouse expenses. He sees print on demand as a possible solution. Penn State Press is converting its backlist to print on demand. It has reissued the Romance studies series free online. New monographs are being produced as short-run digital printings (200 hardcover copies), with paperbacks available only as print on demand.

Mitchell Davis, founder of Book Surge, gave an interesting quick history of print on demand, starting with the CARL Uncover project in 1989. He mentioned Project IBID, established in 1993 to supply copies of books between Cornell and the College of Charleston; the 1995 launch of Amazon; the explosion of self-publishing beginning in 1996, driven by changes in technology; and the founding of Lightning Source in 1997 and of Book Surge in 2000. By 2005, the advent of on demand printing and self-publishing had the attention of equipment manufacturers as well. Print on demand puts the end product where the sales are. Davis has now founded BiblioLife, a long tail content distributor. In phase one, Project Gutenberg is being mined for content. They are also targeting niche markets: a series of Conrad Anker mountaineering books will be distributed through North Face, the outdoor equipment retailer. Davis noted that the presence of rich metadata is critical to making backlist and long out-of-print titles marketable.

The ensuing question and answer session highlighted the need for standards in e-book publishing; the public reaction to the Espresso machine; and concerns about piracy.

RDA Implementation Task Force

Julie Moore, California State University, Fresno

Resource Description and Access (RDA) was the hot topic for catalogers at the 2009 ALA Annual Conference. The need for more RDA information was demonstrated by the crowds attending all of the RDA sessions. This four-hour program, titled "Look Before You Leap: Taking RDA for a Test Drive," drew hundreds of attendees (with estimates varying from 300 to 500). The program gave attendees an idea of the look and feel of RDA. Time was built in between speakers for questions and answers; audience members were encouraged to write their questions on cards, which the speakers answered between presentations.

Tom Delsey (RDA Editor, JSC) compared RDA with AACR2. He discussed the RDA design objectives: a consistent, flexible, and extensible framework for describing all materials; compatibility with internationally accepted principles; and adaptability to the needs of the wider resource description communities. He explained that they are trying to move cataloging beyond the walls we normally work within, especially with regard to the semantic web. Throughout the presentation, Delsey provided a detailed, side-by-side comparison of the two sets of rules across various areas, including scope, structure (FRBR is the underlying conceptual model of RDA; FRBR and FRAD attributes and relationships were explained), level of description, changes requiring a new description, sources of information, transcription, categorization of resources, and the “Rule of Three.” Special formats catalogers should welcome the movement from the general material designations in AACR2 to RDA’s Media Type, Carrier Type, and Content Type, which will help reduce the content-versus-carrier issues of today. The “Rule of Three” becomes a thing of the past in RDA: entry will be under the first-named person, family, or corporate body with principal responsibility in most cases. In addition to all the new vocabulary and many of the detail changes, transcription will take some getting used to (in this reporter’s opinion). For example, in RDA, inaccuracies will be recorded as they appear on the preferred source of information, and RDA only allows abbreviations in transcribed elements if they are printed as abbreviations on the source.

Nanette Naught (Vice President, Strategy and Implementation, Information Management Team (IMT), Inc.) presented an overview of the RDA online product. She got the audience’s attention by introducing her talk, presented in conjunction with Christine Oliver’s, with the analogy of building a new home: “Step 1: Visualize the dream home of FRBR and FRAD Objects.” She explained how RDA is laid out, walking through the Browse view, Schema Dictionary, ERD (Entity Relationship Diagram), Group 1 Entities (Work, Expression, Manifestation, Item), Group 2 Entities, Group 3 Entities, FRAD Entities, Relationship Entities, Registry, Workflows, and Mappings, and showed complete examples. There is an indexed way to search the product. She showed the actual online RDA tool, which is in “alpha” at this point. (This was the first glimpse many attendees had of the online RDA tool.)

Christine Oliver (Head of Technical Services, McGill University Libraries) presented RDA in LIS education. Oliver began her talk with “Step 2: Assess the Existing House of AACR2 in MARC” and “Step 3: Sketch the Remodeled Structure,” keeping the foundation footprint but changing the flow. In the house analogy, she explained that RDA represents a remodeling: AACR2 represents the stones in the walls of the new house, and the stones are being kept, although some of them need to be reshaped. In the online product, AACR2 is included in the resources. She showed that one can search a rule number in AACR2 and see the corresponding RDA instruction, and that it is possible to toggle between AACR2 and RDA. She demonstrated how one can hide text, create bookmarks, and write annotations (which can be named and later searched in an index). She showed the Schema Dictionary, Group 1 Entities, Group 2 Entities, Group 3 Entities, and FRAD Entities. Oliver discussed integrating the RDA product with one’s daily work. She explained that there are wizards for examples and workflows, and suggested that expert communities can create workflows for specific types of materials (e.g., digital cartographic materials) that can then be shared with other libraries. The product will be highly customizable.

Sally H. McCallum (Library of Congress) provided the presentation “RDA in MARC.” McCallum gave a thorough overview of changes beginning to happen in MARC because of RDA. She stated that RDA was designed to be compatible with MARC. There is a Process Working Group, and changes to MARC have already been approved. Chief among them are the replacements for the general material designation: fields 336 (Content Type), 337 (Media Type), and 338 (Carrier Type). The documentation may be found online.
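
As an illustration of these replacement fields (not drawn from the presentation), the sketch below prints the 336, 337, and 338 fields as they might appear for an ordinary printed book. The tags and the rdacontent, rdamedia, and rdacarrier vocabulary codes are standard MARC/RDA; the sample record and the helper format_field are hypothetical.

```python
# Illustrative only: the three MARC fields that replace the AACR2 general
# material designation under RDA, shown for a hypothetical printed book.
rda_fields = [
    ("336", [("a", "text"), ("b", "txt"), ("2", "rdacontent")]),    # Content Type
    ("337", [("a", "unmediated"), ("b", "n"), ("2", "rdamedia")]),  # Media Type
    ("338", [("a", "volume"), ("b", "nc"), ("2", "rdacarrier")]),   # Carrier Type
]

def format_field(tag, subfields):
    """Render one variable field in the usual human-readable MARC display."""
    body = " ".join(f"${code} {value}" for code, value in subfields)
    return f"{tag}    {body}"

for tag, subfields in rda_fields:
    print(format_field(tag, subfields))
```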

John Espley (Chief Librarian, VTLS) provided a presentation entitled “RDA and an ILS.” VTLS is well positioned in the ILS world, having jumped out in front of the pack by experimenting with implementing FRBR. Espley demonstrated examples of the VTLS Work Record, Expression Record, Manifestation Record, and Item Record and showed how they fit together, adding different expression records and item records to show the relationships. Espley also pointed out decisions that need to be made for the future.

Barbara Bushman (Assistant Head, Cataloging Section, National Library of Medicine) presented “Testing Resource Description and Access.” On May 1, 2008, the three U.S. national libraries (Library of Congress, National Agricultural Library, and National Library of Medicine) issued a joint statement in response to concerns raised by the LC Working Group on the Future of Bibliographic Control. They made a commitment to help with the completion of RDA; however, the three libraries agreed to make a joint decision on whether or not to implement RDA, based on the results of testing. Testing partners include OCLC, ILS vendors, system developers, and a group of twenty-three diverse institutions chosen from over ninety applications. The test will cover all types of materials, communications formats, and rules. RDA is to be initially published in November 2009. The current timeline ends with the formal assessment in September 2010, and the final report will be shared with the U.S. library community in October 2010. See the LC web site.

Some presentations from this program have been posted to the ALA Conference Materials Archive.

Resuscitating the Catalog: Next-Generation Strategies for Keeping the Catalog Relevant

Tricia Jauquet, Purdue North Central Library

This program, cosponsored by ALCTS, RUSA, and PLA, aimed to show how to bridge what librarians are doing now with OPACs and what users have come to expect from their experiences with Google-like search engines and interactive social sites. The handouts and PowerPoint presentations for this session can be found in the Conference Materials Archive.

Renee Register (register@oclc.org), Senior Project Manager of OCLC’s Cataloging and Metadata Services, was the first presenter. Her presentation focused on the next generation of metadata. She maintains that current methods of creating, sharing, and maintaining metadata are too costly to be effective: from the time information is created to the point where it is shared, there are usually large gaps in the process that create misinformation, and there are no automatic ways of correcting wrong data once it is out there. She envisions an environment where metadata is exchanged seamlessly, so that all participants, including end users, can take part in the change and exchange of information as data changes and morphs during the process. This would encourage interoperability and reduce information redundancy. Register said that OCLC is currently running a pilot of a cooperative program between publishers, vendors, libraries, and other involved partners toward this end. She referenced a video on the OCLC web site that goes into depth on this project: “From ONIX to MARC and Back Again.”

Beth Jefferson (beth@bibliocommons.com), President of BiblioCommons, was the next speaker. Her main focus was public library catalogs and how it is not a matter of resuscitating the catalog but of rethinking the possibilities of what it can become. She said that library catalogs have missed a generation of development, so now is the perfect time to “leap-frog” to create something new and better. “Less is the new more” should be the main focus of this development; users want “true discovery,” not an overload of information. How can catalogs recreate the ways in which users really search for things and discover new material? Some examples included a “Recently Reviewed Items” option to mimic users browsing the return carts, and a way to browse the library that mimics how the books are shelved and includes book jacket pictures. Jefferson also stressed that the catalog should be a space, not a database, for its users. She suggested adding more social networking functions, such as letting users create a user name in place of a patron barcode, follow links related to their interest in an item (such as user discussions of that item and community events related to it), add their own reviews, and follow another patron’s reviews.

David Flaxbart’s (flaxbart@uts.cc.texas.edu) presentation was titled “Lipstick on a Dinosaur? Keeping an Old-Gen OPAC ‘With It’ in a Next-Gen World.” He is a chemistry librarian at the University of Texas-Austin. He spoke of dressing up old technology by adding Web 2.0 features to old ILS catalog principles. Flaxbart said that the academic world has a cultural resistance to change, so libraries are bringing up the rear in this technology. He also spoke of the difficulties in customizing an OPAC from its “out of the box” settings when academic libraries serve two very different customer bases, students and professors, who have very different needs and expectations of the catalog. Neither group, however, sees the catalog as a destination, so libraries have to find ways to pull them there from other sources, such as the university home page and WorldCat. His assumptions for librarians going forward with OPAC modification include that the technology is limited and in perpetual beta, that technology is always changing, and that libraries will always be behind the user curve, so the current system must always be viewed as a bridge to the next, since no system is forever. Most important to remember is that while all change will be resisted in the academic community, no change will be remembered once implementation occurs.

The final speaker, Ellen Safley (safely@utdallas.edu), Senior Associate Director at the University of Texas-Dallas, discussed adding a “discovery layer” to the OPAC to make it more user-friendly for students. Her main point is that users do not want to have to learn how to use the catalog; she has heard complaints that the OPAC is too difficult to learn even from PhD-holding faculty members. Users expect a catalog that, like Google, Amazon, or shopping sites, needs no instruction. Safley’s methods include losing library jargon such as “holdings” and acronyms, making searching simple (i.e., no Boolean operators or search limits), making options big and colorful, adding a “spellchecker without attitude,” and ranking search results by relevancy, like a Google search. Some of the problems she has encountered with the discovery layer are that the search display can sometimes be complicated (it cannot display print and electronic books together) and that some things can be changed while others cannot (display flexibility is lost while real-time status is gained). Also, since discovery layers rely on item records, and most electronic materials and many journals do not have item records, libraries may have to rethink how they process those items.

Rethinking Staff Resources in the E-Serials Environment

Mike Wright, University of Iowa

Moderator Lori Kappmeyer (Iowa State University) opened by noting that all types of libraries, not just academics, are looking at what to give up in an environment where print is fast becoming the exception and where e-serials have added complexity to workloads. Kappmeyer introduced speakers Rick Anderson (University of Utah), Gloria Guzi (Cleveland Public Library), and Carol Ann Borchert (University of South Florida). They were asked to address specific questions about ceasing print-related serials processes.

Rick Anderson spoke first, with a talk entitled “To Print: Drop Dead.” Noting that he would take liberties with the topic, Anderson said he wants to give up print itself. Utah’s journal collection is 95 percent electronic, and e-books are increasing. Among the tasks discontinued are journal check-in, journal binding, claiming, and in-house MARC cataloging.

Anderson is convinced that journal check-in is a waste of time, that binding is a waste of money (high-quality protection is provided for content with low use), and that claiming’s mediocre success rate provides a poor return on investment. There must be, he said, a mindset shift as libraries move from print to electronic: the focus of our energy must be on unique collections, not on processing materials that are widely available. The bottom line? Libraries must return top value for every dollar of funding.

In a different vein, Gloria Guzi indicated that the Cleveland Public Library has stopped few print processes, largely due to print’s popularity with users, but there have been a number of workflow and staff assignment changes with print and electronic serials. Cleveland’s process review started with the move to a new ILS which did not accept their non-standard serials holdings data. This, in turn, led to a gradual realization that serials processes needed an overhaul. Perhaps the biggest change, appreciated almost immediately, was moving some work outside of technical services: branch library staff were allowed to manage their own subscriptions and claim serials directly, rather than having it done centrally. This has been a success; centralized claiming was often done too late, and subscription extensions were more common than routine fulfillment.

Increasing numbers of e-serials and their different demands underscored the need for organizational changes as well. Guzi noted that a team approach was seen as necessary to get all the background work done, and such an approach would have to include subject librarians and public services staff. The serials acquisitions librarian still has responsibility for licensing and negotiations with vendors, as well as activating items in Serials Solutions. Other tasks, such as determining availability of archival access and pricing structure, were dispersed to others. She noted that a key to their success in the e-serials environment was letting go of “sacred” past practices.

Carol Ann Borchert noted that in 2003 the University of South Florida had a new director for technical services and was migrating from NOTIS to Aleph. This created an opportunity to examine operations: which processes were unnecessary, which could be done better, and what work was not being done that should be? At the time, USF had no cohesive workflows for e-serials, and though SFX was online there was difficulty getting all the serial titles into the system. The coordinator of collections had too much work and no staff, so a coordinator of serials position was created. To begin, staff mapped out all serials processes and workflows. Later, an Access database was created as an in-house e-resource management system, along with a tracking system for e-journal problems. These steps and others helped organize and simplify e-serials workflows immensely, and improving the SFX setup to use subtargets improved the user experience. USF briefly did away with receiving and binding of paper serials, but for a variety of reasons these were reinstated; they aim to try selective binding in 2009. Judging by the number of questions and audience comments, it is clear that the print-to-electronic transition is causing libraries to reconsider many longstanding practices.

Swingin' With the Pendulum: Facing Cancellations in the Age of E-Journal Packages

Gracemary Smulewitz, Rutgers University

The discussion focused on the issues that libraries face due to budget cuts.

Panel participants introduced by Eleanor Cook were:

  • Beth Bernhardt, Electronic Resource Librarian, UNC Greensboro
  • Bob Boissy, Director of Network Sales, Springer
  • Rebecca Day, Manager, E-Resource Services, EBSCO
  • Rick Moul, Director, PASCAL

Bob Boissy suggested that libraries use the word negotiation rather than cancellation. He described what he saw as the state of STM publishing. Publishers have been unable to dispense with current subscription models entirely and are trying new models such as comprehensive consortium deals, usage-based models, pay-per-view, author-pays (e.g., Open Choice), open access, access-only, tiers, and others. However, publishers are still constrained by the essential nature of scholarly publishing: the limited audience. Boissy also felt that while the economy is a catalyst for change, successful Internet models should not be thrown out.

From their experiences during the heaviest period of Internet development, libraries and publishers have gained quite a bit:

  • Over a decade of warming up our negotiation skills
  • License evolutions towards more liberal policies
  • Golden age of journal content access for end users
  • Same goal of maximizing user access to all content
  • Shared willingness to use technology to achieve our goal
  • Shared sense of duty to preserve the scholarly record

The key in this economic environment is negotiation; everyone should negotiate. Publishers are not opposed to open access; they just favor fewer labels, less rhetoric, and more scientific facts.

Beth Bernhardt noted that UNC at Greensboro is a medium-sized school with 17,000 FTE and an emphasis on doctoral programs. UNC Greensboro was once a women's college, and the landscape has changed in the last few years. For the next two years, the library anticipates a 34 percent permanent cut in collections (57 percent from books, 23 percent from serials, and 18 percent from databases).

UNC Greensboro has developed a strategy for evaluating databases using the following tools and concepts (a minimal cost-per-use sketch follows the list):

  • Calculate cost per use.
  • Perform overlap analysis.
  • Eliminate duplication with print.
  • Perform format analysis: change platforms where possible, since content may be available somewhere else for less.
  • Reduce concurrent users where possible, down to one in many cases.
  • Negotiate for no inflation rate increases.
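
Cost per use, the first tool on the list, is simple arithmetic: annual cost divided by annual uses. The sketch below is illustrative only; the resource names, figures, and threshold are hypothetical, not UNC Greensboro data.

```python
# A minimal cost-per-use calculation for a database review.
resources = [
    ("Database A", 12000.00, 4800),  # (name, annual cost in dollars, annual uses)
    ("Database B", 9500.00, 310),
    ("Database C", 4200.00, 2100),
]

THRESHOLD = 10.00  # flag anything costing more than $10 per use (a local policy choice)

for name, cost, uses in resources:
    cost_per_use = cost / uses if uses else float("inf")
    verdict = "review for cancellation" if cost_per_use > THRESHOLD else "keep"
    print(f"{name}: ${cost_per_use:.2f} per use ({verdict})")
```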

Strategies were also developed to evaluate serial subscriptions. The library first worked with departments to determine the core mission of the university, and then proceeded as follows.

Titles that were not in package deals were identified; of the 1,700 such titles at Greensboro, 600 had an online component. They evaluated usage statistics from publishers and from their serials management system and made format adjustments. They also compared print pricing to document delivery and online pricing; if print was cheaper with deep discounts, they changed to print.

A nuisance analysis was also performed: for example, if a username and password were needed, the resource required more management time and in many cases was canceled. They also negotiated for no inflation increases and sent letters to publishers. Pay-per-view options were evaluated for some resources, and they relied on ILL and document delivery when possible.

UNC at Greensboro used the following package deal strategies:

  • Remember that everyone, both the library and the publisher, has a lot invested.
  • Negotiate for a lower inflation increase, or none.
  • Put cards on the table with the publisher: for example, if the package costs $40,000 and Greensboro uses $25,000 worth, ask what can be done and whether pay-per-view makes sense for some titles.
  • Take the cuts allowed by contract.
  • Check cancellation clauses carefully.
  • Look at cost per use for package deals.
  • Go with a smaller subject collection where possible.
  • Consider the impact on other members of your consortium.

Strategies for consortia participation included sending letters to publishers explaining financial conditions. They referred to a letter from the International Coalition of Library Consortia (ICOLC) regarding the impact of the current economic environment on consortial purchasing. UNC at Greensboro negotiated for no inflation increases. They also negotiated that if any school drops out of the deal, there will be no change or impact for the remaining participants. A cheat sheet was prepared for negotiations and included the following questions:

  • What is the expected fee increase?
  • Can we cancel individual titles and stay in this deal?
  • What is the impact to my school if we drop out of the deal?
  • What is the impact to the Carolina consortium?

Rick Moul is the Executive Director of PASCAL, a statewide consortium of fifty-seven public and independent academic libraries in South Carolina. The consortium is part of the state government's virtual academic library. PASCAL has an operating budget of $3.1 million, consisting of state funds and other funds.

In June 2008, the state funded the following electronic resources:

  • Access Science
  • Academic Search Premier
  • CINAHL Plus with full text
  • Business Source Premier
  • Gale Literature Resource Online

PASCAL worked with the state to develop a funding strategy. In FY 2008-2009, anticipating budget reductions, a contingency fund was built, since money could be carried over from the previous fiscal year. In January 2009, state funding went into steep decline, amounting to a 90 percent cut. PASCAL met with its Board of Directors and, drawing on carry-over money, tried to stabilize the program. PASCAL cut LexisNexis, Access Science, Nature Science, ProQuest Nursing and Allied Health Source, and Lippincott Nursing Premier, and there was a 40 percent delivery service reduction for PASCAL Delivers. Aggregated databases were retained.

PASCAL developed a campaign and revenue-generation plan for FY 2009-10. There was great support from students and faculty, and strides were made with key legislators. With no chance of any restoration of funds, the focus switched to academic administrators. The seminal moment was a group meeting with Chief Academic Officers: key provosts understood the value and developed a proposition to establish an ad hoc committee of Chief Academic Officers. This led to a revenue plan for 2008-10. As a result, there is breathing room and the core is still intact. Everyone knows PASCAL, and only one school opted out.

During the first year, a retail price and price index for core electronic resources and delivery service were calculated. No library pays more than retail for the core databases and PASCAL Delivers. There are significant discounts for many libraries: thirty-three libraries pay between 20 percent and 40 percent of the retail price. It is an easy sell.

In summation, PASCAL:

  • Reverse-engineered its collection.
  • Maintained its role in canceling subscriptions.
  • Kept licenses intact.
  • Worked on a lower-cost alternative for one product.
  • Renegotiated core licenses.

During the second year, they “stopped the bleeding.” During the third year, they continued to focus on revenue generation and on the durability of the model used this year. They re-evaluated resources and looked for synergies based on institutional spending. There was discussion of enhanced document delivery for non-returnable items. Recovery will be a multi-year process.

Rebecca Day, Manager, E-resource Development for EBSCO, discussed “the way we were and are.” In 1999, Ideal was the best-known package, package subscriptions represented 1 percent of EBSCO sales, and a typical four-year master’s institution had approximately 2,000 print subscriptions.

In 2009, EBSCO has 300 packages cataloged in the EBSCO title databases, package subscriptions represent 55 percent of EBSCO’s e-sales, and four-year master’s institutions have access to 50,000 titles.

In 2004, subscription purchases showed individual title sales of $180,000 against e-package sales of $120,000; in 2009 the proportions are reversed. This is an opportune moment for EBSCO to see where it stands.

Recommendations:

  • Look at new models and old models, not just the whole big deal.
  • Look at subject collections, pay-per-view, and ILL options.
  • Negotiate, negotiate, negotiate.

Day recommended going to the people who provide the money and explaining the value of what you provide. Look for efficiencies in:

  • Cost of staff time
  • Options for outsourcing
  • Cost of maintaining management tools
  • Options for streamlining

Look at vendor-provided services that can help you manage e-resources. Consolidate to one resource management tool instead of many, weighing the consequences if content or details are lost. Decide and act, since doing so avoids interruptions of service and allows staff to make any necessary changes and to communicate those decisions to all parties involved, such as the publisher, intermediary, consortium, and patrons.

The program closed with an extensive question-and-answer period.

Workflow Tools for Automating Metadata Creation and Maintenance

Teressa M. Keenan, The University of Montana

Digital projects are becoming less peripheral and more integral to library operations, and institutions must begin to address the implications of this change. With the increasing amount of digital content libraries are expected to create and maintain, data curation has emerged as a key objective. This program was intended for librarians involved with the development and management of metadata. The session provided examples of current work and opportunities to discuss collaborative development of tools among institutions. Presentation slides and handouts are available at the ALA Conference Materials Archive.

Ann Caldwell, Coordinator of Digital Production Services, Brown University, began the session with a discussion aptly titled “Herding Cats.” She indicated that spending a great deal of time working with faculty, digital objects, and metadata felt similar to herding cats. She described a recent endeavor with the university’s Engineering Division to assist with its reaccreditation process. Digital objects included syllabi, websites, lab reports, homework, exams, etc. Throughout this process, a set of tools was developed that allows faculty to easily contribute digital objects to the university’s repository. The set of tools includes a file uploading system, a Metadata Object Description Schema (MODS) editor, and a file tracking system.

Caldwell explained the workflows involved in adding and maintaining digital objects in the repository and showed how the behind-the-scenes technology supported the user-friendly front-end processes. Two central problems encountered by the division were keeping track of an assortment of digital materials and ensuring the creation of usable metadata by a variety of people with differing skills and interests. Two tools that addressed these issues directly were described in the presentation. The “Project Manager” was developed to track projects, equipment, software, users, and processes. The “MODS Editor” was developed as a user-friendly metadata editing interface. The editor keeps the extensible markup language (XML) encoding behind the scenes while still allowing access to it, shows a list of required fields, and incorporates authority control through the use of dropdown menus. The project has been a success so far, and other departments are interested in trying out the system after seeing how well it worked for the Engineering Division.
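
As a rough illustration of keeping the XML behind the scenes while users fill in plain fields (this is not Brown’s actual editor code, and it assumes the lxml library), the sketch below builds a minimal MODS record from form-style values. The element names come from the MODS schema; the sample values and the helper make_mods are hypothetical.

```python
# A minimal sketch of generating MODS XML from simple form-style input.
from lxml import etree

MODS_NS = "http://www.loc.gov/mods/v3"
M = "{%s}" % MODS_NS

def make_mods(title, creator, type_of_resource="text"):
    """Build a minimal MODS record; the user never sees the XML."""
    mods = etree.Element(M + "mods", nsmap={None: MODS_NS})
    title_info = etree.SubElement(mods, M + "titleInfo")
    etree.SubElement(title_info, M + "title").text = title
    name = etree.SubElement(mods, M + "name")
    etree.SubElement(name, M + "namePart").text = creator
    etree.SubElement(mods, M + "typeOfResource").text = type_of_resource
    return mods

record = make_mods("Sample Lab Report", "Doe, Jane")
print(etree.tostring(record, pretty_print=True).decode())
```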

The second speaker was Jenn Riley, Metadata Librarian, Indiana University Digital Library Program. Her presentation was entitled “Using Schematron for Analyzing Conformance to Best Practices for EAD, TEI, and MODS (and some other thoughts on workflow tools).” She began by suggesting that one of the greatest challenges in quality metadata is consistency, and that consistency is easier to obtain with data-centric XML than with document-centric XML. Examples of data-centric XML are the Metadata Object Description Schema, Dublin Core, etc., while examples of document-centric XML are the Encoded Archival Description (EAD) and the Text Encoding Initiative (TEI). With data-centric XML, the fields are provided; with document-centric XML, text is provided and the encoding must be added. A great deal of work is done with TEI and EAD encoding at Indiana University; rather than use the Archivists’ Toolkit, they work with the XML directly. Using ideas from the RLG EAD report card, they developed a Schematron-based tool that helps achieve consistency by defining EAD guidelines in a machine-readable way, validating the XML documents, and reporting any problems.

The main portion of the presentation described how the Schematron technology works. It is a Java plug-in added to the Oxygen XML editor to check files against local guidelines. Riley provided examples of the types of error messages and warnings produced by the validator; correct expressions are copied and pasted into the original XML file. Schematron is an XML assertion language: it declares how an XML document should look, based on organization, patterns, and rules. Users can further define rules and tests using the XPath language. The tests generate error reports in XML or HTML that the user can draw on to make corrections to the original XML file. The software is extensible stylesheet language transformation (XSLT) 1.0 and 2.0 compliant and can be downloaded from the Schematron web site.
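
As a rough illustration of the same assertion-language idea outside the Oxygen plug-in described above, the sketch below uses Python’s lxml library (an assumption on this reporter’s part, not the Indiana tool) to validate a tiny document against a single Schematron rule. The rule, element names, and sample document are hypothetical.

```python
# Validate a small XML document against one ISO Schematron assertion.
from lxml import etree, isoschematron

# A Schematron schema asserting that every <unittitle> has text content.
sch_doc = etree.XML(b"""
<schema xmlns="http://purl.oclc.org/dsdl/schematron">
  <pattern>
    <rule context="unittitle">
      <assert test="normalize-space(.) != ''">unittitle must not be empty</assert>
    </rule>
  </pattern>
</schema>
""")

schematron = isoschematron.Schematron(sch_doc, store_report=True)
doc = etree.XML(b"<ead><unittitle></unittitle></ead>")  # fails the assertion

if not schematron.validate(doc):
    # The error log lists each failed assertion so the file can be corrected.
    for entry in schematron.error_log:
        print(entry.message)
```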

To wrap up her presentation Riley discussed some general issues related to metadata tools and workflows. She emphasized the importance of automation, streamlining and validation. Tools should be modular, configurable and sharable. In addition, the user interface should be taken into consideration when designing cataloging tools.

The third speaker was Rhonda Marker from Rutgers University Libraries. Unfortunately, she was unable to present; however, her presentation slides on “Open WMS: Workflow Management System for Digital Objects” are available on the ALA Conference Materials Archive.

Other Programs of Interest from the 2009 ALA Annual Conference

SERU (Shared E-Resources Understanding)

Sue Anderson, Eastern Washington University

Speaker: Clinton K. Chamberlain, Coordinator for Electronic Resource Acquisitions, University of Texas at Austin

When libraries moved from print to electronic journals, librarians and publishers each believed that the other required licenses. In the print world, there were contracts with publishers; now we have license agreements, and copyright law cannot override contract law. In many signed agreements, librarians give away rights they should keep, and libraries face a backlog of licenses to negotiate with publishers so that terms are fair to both parties.

In 2005-2006, a discussion ensued about libraries’ reliance on contract versus copyright law: could an alternative to license agreements be found? In 2007, librarians and publishers met in a working group to find out. During discussions, the group wanted to avoid legal language, use the language of practice, and address common situations. Out of those discussions, the working group at NISO created a draft of SERU. In 2008, SERU was recommended as a statement of common understanding that could be used as an alternative to electronic resource licenses. The document established a framework and provided FAQs as well as specific examples. It was created for use in the United States.

SERU offers libraries and publishers the option to reference a set of common understandings as an alternative to negotiating a signed license agreement. It was designed to streamline the acquisitions and sales process. SERU cannot eliminate all license agreements, be used as a standard license, or be customized. However, SERU can reduce costs for librarians and publishers, streamline the sales and acquisitions process, and let librarians and publishers show they employ best practices.

To get involved, libraries can join the SERU discussion list, talk to university or library legal counsel before adopting SERU, and then join the SERU registry.

Find more information online.

OCLC Symposium: Leadership Beyond the Recession

Susan Thomas, Indiana University South Bend

This Friday afternoon OCLC Symposium presented insightful information on keeping libraries relevant in lean times. Cathy De Rosa began the symposium with a brief introduction and later facilitated a panel discussion that included Steven Bell, Associate University Librarian at Temple University; Charles Brown, Director of Libraries, Public Library of Charlotte and Mecklenburg County; and Ed Rivenburgh, Director of College Libraries at the State University of New York at Geneseo.

Joseph Michelli, author of The Starbucks Experience: 5 Principles for Turning Ordinary into Extraordinary, was the keynote speaker. He discussed the importance of driving change to stay relevant instead of taking a hunker-down, avoidance approach. Since libraries are not profit driven, they should focus on creating a user experience that will generate and motivate public support. He cited the OCLC report From Awareness to Funding, which covers research by Cathy De Rosa on patrons’ sentiment regarding libraries. Michelli noted that listening is the key: listening to users, listening for opportunities, and listening for ideas. He provided engaging and at times humorous examples of business practices (Ritz Carlton, Starbucks, Pike Place Fish Market in Seattle) that are successful because they focus on the user experience. He stressed the importance of creating a positive experience by focusing on the service, not the product, and of doing what you do so well that users will want to experience it repeatedly and refer friends.

A panel discussion moderated by Cathy De Rosa, with Michelli participating, followed the presentation. The panelists took turns answering questions posed by De Rosa, discussing transformative tactics they use to stay relevant to constituents. Transformative activities included branding, generating positive “wow” experiences, fostering positive relationships with users, and obtaining feedback from patrons. The importance of remembering to positively encourage, recognize, and ask for feedback from staff was also stressed. Steven Bell noted that there are many great ideas that can be incorporated in any library, but that what works well in one library may not be appropriate in another. In addition to recommending his own books, Michelli recommended a book titled Dream Manager. In response to a question, Michelli ended with a final comment that good business is personal, and Amazon cannot beat the personal library experience.

Redesigning Technical Services Workflows

Rebecca Schroeder, Brigham Young University

Four panelists presented their ideas for redesigning technical services workflows in this lively and informative discussion. Ideas for streamlining the book metadata workflow were the first point of discussion. Todd Carpenter from the National Information Standards Organization (NISO) and Renee Register from the Online Computer Library Center (OCLC) reported on the results of a joint NISO/OCLC research report. The report examined how the exchange of metadata between the many varied metadata stakeholders occurs and how to best facilitate the back-and-forth nature of metadata exchange. Carpenter began by explaining that the basis of the report came from the collective comments and insights of about thirty interviewees. The interviewees answered questions about the nature and volume of the metadata managed, the changes in the metadata lifecycle, current issues, challenges, and ideas for improvement in sharing metadata.

Register continued the discussion by describing the different metadata stakeholders and the challenges they face in exchanging metadata. The first group of stakeholders is metadata providers, who are faced with the task of aggregating data from multiple sources in various formats and adding standardization and enhancement. They play a leading role in the development of metadata standards, identifiers, and best practices while providing metadata, acquisitions assistance, and fulfillment services to retailers and libraries. Booksellers are a second group of stakeholders: they use good metadata for sales and business intelligence, and must maintain staff to look after both bibliographic and buy/sell metadata. A third group of stakeholders is the Library of Congress and other national libraries, which play a key role in standards related to data exchange and provide record creation and record distribution services for libraries. The fourth group is local libraries, which are challenged with fewer staff, a higher volume of records, new formats of materials, and evolving workflows. In addition to these core stakeholders, Google’s digitization projects and the book rights registry create direct involvement for Google in metadata issues.

Carpenter concluded this part of the program by outlining three ways stakeholders can facilitate the back-and-forth exchange of metadata:

  1. Identify best practices to enable optimal re-use and re-purposing of metadata, and identify areas for collaborative work across communities;
  2. Optimize identifiers such as author, individual works, related works, series;
  3. Optimize subjects used by publishers, Book Industry Communications (BIC) and Book Industry Standards and Communications (BISAC) subjects, Library of Congress Subject Headings (LCSH), Medical Subject Headings (MeSH), Sears, and user tags.

The full NISO/OCLC report is available online.

Arlene Klair from the University of Maryland Libraries shared her experience in changing technical services workflows. Processing materials is very time consuming and complex at her institution: the University of Maryland Libraries does centralized technical services for a remote campus as well as for seven on-campus libraries, and also shares a database with the University System of Maryland and Affiliated Institutions (USMAI), which consists of 126 libraries at thirteen institutions. In this shared environment, technical services spends a considerable amount of time and energy providing services to their users. The original workflow involved acquisitions receiving new materials and sending them to the Adaptive Cataloging Team (AC) for cataloging; with this process, it took three weeks to six months for books to make it to the shelves. Implementation of a new workflow in technical services involved three phases. During phase one, technical services improved the turnaround time for their purchase plan by accepting all PromptCat full records and sending only books purchased for branch libraries and those needing corrections to AC. This allowed acquisitions to send 60 percent of the materials directly to labeling. In phase two, technical services further streamlined the process by implementing shelf-ready processing and assigning acquisitions to do quality control of the shelf-ready program. In phase three, the libraries’ vendor reprofiled the purchase plan by call number; the call number was used to assign item location and branch mapping to the label, which significantly reduced the number of re-labels needed. In a future phase, fully cataloged records will be applied to firm orders and attention will turn to smaller approval plans.

Rick Anderson from the Marriott Library, University of Utah, addressed four areas of technical services where he believes retooling is needed. First, acquisitions processes should be reworked to consolidate the workflows of monographs, serials, and e-resources, and simplified whenever possible by conversion to shelf-ready. Second, serials processing should be retooled: in addition to combining with acquisitions, serials processes must be simplified by eliminating check-in, binding, and claiming. Third, cataloging units should outsource bibliographic maintenance to both vendors and OCLC. Cataloging should simplify its processes and remember that completeness and accuracy are not the end goal or the primary focus of the bibliographic record; patron connection is the primary focus, and catalogers should treat the bibliographic record with this goal in mind. The last area that requires retooling is collection development. Since University of Utah circulation statistics show that roughly 50 percent of librarian-selected titles never circulate, Anderson made a case for relying more on the patron for collection development decisions, by purchasing materials requested through interlibrary loan and by using print-on-demand and buy-on-demand models. In sum, Anderson asserted that technical services processes need to reflect a new reality in patron behavior and expectations: one that is almost completely online, radically faster, far more responsive, and provides immediate access to materials. (This program was recorded and will be posted on the OCLC website.)

Rough Waters: Navigating Hard Times in the Scholarly Communication Marketplace

Adrian Ho, University of Western Ontario

“Rough Waters: Navigating Hard Times in the Scholarly Communication Marketplace” was the topic of the SPARC-ACRL Forum that took place on July 1 at the 2009 ALA Annual Conference in Chicago. It was facilitated by Kimberly Douglas, University Librarian, California Institute of Technology.

The first presenter was Charles Lowry, Executive Director, Association of Research Libraries. He reported the findings of an ARL survey (The Current Fiscal Landscape of Research Libraries). Of the ninety-nine respondent libraries, 55 percent indicated that they had already experienced base budget reductions or take-backs in fiscal year 2008-2009. Both the mean and the median of the reductions were 3 percent. Staffing was the hardest hit area in comparison to operations and acquisitions. That translated into measures such as hiring freezes, elimination of vacant positions, layoffs, early retirement incentives, etc. For fiscal year 2009-2010, 69 percent of the libraries expected further reductions, 11 percent foresaw a flat budget, another 11 percent anticipated a budget increase, while 9 percent could not say for certain.

The second presenter was Ivy Anderson, Director of Collection Development and Management, California Digital Library (CDL). As the coordinating body for electronic resources licensing for the ten campuses of the University of California (UC) system, CDL had issued an open letter in May to inform licensed content providers of the need for collaboration to tackle the current economic crisis. CDL was considering journal cancellations using a value-based pricing approach. At the same time, CDL made an effort to enhance access to journal content. For instance, it supported the CERN-based initiative SCOAP3 to help convert subscription-based core High Energy Physics journals to open access publications. It also worked with the journal publisher Springer to create a pilot project in which “all UC-authored articles are published with full and immediate open access via Springer’s Open Choice program.” Meanwhile, UC Berkeley implemented the Berkeley Research Impact Initiative to provide authors there with open access publication funds if they wish to publish articles in open access journals but lack funding to cover the related publication fees. Finally, Anderson discussed CDL’s publishing services, which comprise eScholarship’s digital publishing and repository services as well as UCPubS (a combination of “open access digital publishing services provided by eScholarship with distribution, sales, and marketing services offered by UC Press”).

Emma Hill, Executive Editor of The Journal of Cell Biology at Rockefeller University Press (RUP), was the next presenter. She described how RUP had broken away from the traditional journal publishing model by introducing innovative policies with regard to accessibility, affordability, archiving, and article ownership. The reasons for adopting those policies were that online publishing had changed the landscape of scholarly communication and that the new policies would benefit the published content. Hill pointed out that commercial publishers’ focus on profit-making was at odds with scholars’ needs for information access. Quoting a study conducted at the Rockefeller University Library, she said that four “megapublishers consumed 69 percent of the total 2009 serials budget” there. Hill asserted that librarians, authors, and readers should “make demands of the publishers” to initiate changes in the journal publishing system, saying squarely that “librarians and authors hold all the power, and they should not be afraid to wield it.” Meanwhile, she advised that publishers be forward-thinking, listen to demands, and evolve.

The last presenter was Jim Neal, Vice President for Information Services and University Librarian, Columbia University Libraries. He argued that the future of scholarly publishing depends on a competitive market, easy distribution and reuse of content, innovative applications of technology, quality assurance, and permanent archiving. He also discussed twelve scholarly communication issues as viewed from the stakeholders’ perspectives, including new modes of communication, the university’s role in disseminating scholarship, monograph publishing, assessment and accountability of scholarly communication, and collaboration for quality, productivity, and innovation.

Videos, podcasts, and slides of this program have been made available online by SPARC.