Reports from Midwinter 2010
Volunteer Reporters Cover ALCTS Forums and Events in Boston
ALCTS members who attended the 2010 ALA Midwinter Meeting in Boston provided the following summary reports. We thank the volunteers who covered a program or event sponsored by ALCTS or one of its units. Their efforts enable the rest of us to benefit from the presentations. We regret that volunteers were not available to report on all the forums.
Living Digital: The Future of Information and the Role of the Library
Betsy Simpson, University of Florida
What is the library's role in a knowledge economy driven by an increasingly pervasive digital infrastructure where a digital mindset is the norm? Is there time to transition to this digital world, or is a more fundamental and accelerated transformation needed? “Living Digital: The Future of Information and the Role of the Library,” an ALCTS 2010 Midwinter Symposium, faced these issues head-on through provocative speakers and brainstorming sessions. A common thread among the presentations was the notion that society is experiencing a swift and significant shift away from “business as usual” as a result of the growing predominance of digital technologies. To thrive in this new environment, organizations must create a culture of innovation focused on broad-based collaborative initiatives involving myriad stakeholders. Margaret Ashida, Project Director, New York State STEM (Science, Technology, Engineering, and Mathematics) Education Initiative, highlighted the need to seek workers with a breadth of understanding across a field coupled with deep knowledge in a specialized area (IDEO co-founder Bill Moggridge's “T-shaped people”), to utilize social networking recruitment strategies, and to spearhead innovative workplace practices, such as IBM's Jam Events. Elaborating on this theme, Kevin Guthrie, president of ITHAKA, demonstrated the importance of scalability and diversification by comparing Blockbuster to Netflix. While Blockbuster must contend with a bricks-and-mortar legacy, Netflix has developed a robust service layer that positions it to capitalize on the trend toward video-on-demand technology. In an era of digital expansion, Guthrie posited, libraries must think strategically about market forces and engage users with relevant and scalable value-added services.
Increasingly, digital natives are forcing a new paradigm. According to John Palfrey, Henry N. Ess III Professor of Law and Vice Dean for Library and Information Resources at Harvard Law School and co-author of Born Digital: Understanding the First Generation of Digital Natives, digital natives do not distinguish between their online and offline identities. Generally speaking, they are digital multi-taskers who expect information to be delivered digitally; in fact, many have been on the forefront of digital technology development. Their views on information delivery, privacy, and intellectual property tend to be contextual, assumptive, and based on social norms. Libraries are challenged to meet the needs of digital natives within a complex information environment impacted by copyright restrictions and commercial interests. What steps should libraries take to address these issues? John Wilkin, Associate University Librarian for Library Information Technology, University of Michigan and Executive Director of HathiTrust, set forth his view that libraries have to recognize that the problems exist in the aggregate and require a collective response. Digital libraries have a history of largely operating in isolation from one another and with an emphasis on technical architecture rather than shared purpose. Wilkin advised that focused collaboration will foster a more coherent and user-centered approach.
Observations from a distinguished group of panelists brought additional insights. John Yemma, editor of The Christian Science Monitor, described the newspaper's move to a Web-first platform in March 2009. This major change stemmed from a perception that the Web represents the future for journalism. It required a significant realignment of staff and workflows, and, although only a few months out, there are positive indicators from the subscription data that make Yemma optimistic about the new direction. A different but equally dramatic shift at the Fisher-Watkins Library at Cushing Academy to a primary reliance on digital, rather than print, resources thrust Tom Corbett, the library's Executive Director, into the spotlight. In spite of concerns on the part of some that student learning and love of reading might be adversely affected, Corbett said the opposite appears to be the case. He was quick to point out that Cushing Academy's decision, made prior to his arrival, was based on its feasibility locally.
Ann Wolpert, Director of MIT Libraries and MIT Press, outlined the role of librarians as educators and service providers who offer support and context in the use of digital technologies. An information commons gives librarians a technology-oriented setting to reach out to the digital native generation whose members come to academe with varied levels of expertise in digital technologies. Creative services and gaming activities can bridge generational and cultural gaps at all types of libraries, noted Jenny Levine, Internet Development Specialist and Strategy Guide at the American Library Association and author of The Shifted Librarian blog. Games have been used for many years in libraries as educational and recreational tools. Today's wide array of digital gaming options afford libraries exciting possibilities for connecting with users.
With more than eighty participants, the symposium was well attended and a great opportunity to discuss key issues facing the profession. The planning committee of Pamela Bluh, Cindy Hepfer, and Lisa German deserves much credit for developing such a lively and interesting event. ALCTS thanks Sun Microsystems for its generous support of the symposium.
And Now for Something Completely Different: Our Future from Outside the Box
Keri Cascio, St. Charles City-County Library District
Several cutting-edge thinkers prepared short opinion pieces on future trends that are likely to impact research, instruction, and scholarly communication. These essays served as the foundation for group discussions on emerging roles for libraries and librarians, particularly collections and technical services librarians. The full papers from the symposium are available online.
The essays were divided into three themes for discussion. Each discussion group reported back to the larger group on themes and issues they found in the essays. Highlights for each set of essays included:
New Content, New Roles (Gemmell and Bell; Levy; Rhind-Tutt; Weinberger)
- A hyperlinked world with new browsing options allows for a different kind of serendipity in research.
- Library services are broad, and we need to help our users focus and filter content; the time of vetting collections is over.
- Consider the move from the role of gatekeeper to guide; our mantra is shifting from access to connection.
Facilitating Collaborative Research and Scholarship (Farkas; Gibbons; Leonard; Salo)
- How are we spending our money on digital content when our users are used to getting online materials for free?
- How can we drive the economic model with publishers and software vendors?
- We can get closer to the researcher and insert ourselves in the research and publishing process.
- How can we get the money to fund collaborative collection development and work on the network level?
- Library staff needs to be reallocated for a future without paper collections.
Library Structure and Infrastructure (Brantley; Greenstein; O’Brien)
- Technology is not always a solution to barriers to access.
- Collection development for the “just in case” scenario will go away; we will only collect material at the point of discovery.
- We need to overcome privacy issues relative to access and delivery.
After a day of discussion, attendees were asked to consider the implications of these themes for their own institution, ALCTS and/or ALA as an association, and the profession as a whole. Comments included:
- Stop being reactionary; live in the “beta,” explore and take risks
- Move from transition to transformation
- Create strategic partnerships and work towards more collaboration with other institutions, publishers, and vendors
- Push processes and collections out to the network level
- Have a long-term plan for services and collections
- Be able to experiment and move on
- Better communication for trends and issues; an emphasis on continuing education
- Research the impact on practices
- Program planning to address skills sets
- Blending different library types and specialties; break down the splintering of the Association
- Collaborate with other associations and professions to bring in a different level of expertise, and to offer our expertise to others
- More emphasis on soft skills and change management; learning and teaching business skills such as contract negotiation and running a meeting
- Students have to be ready to think and adapt to change when they enter the profession
- Librarians need to be technologically savvy; be flexible and explore what is out there without needing to be an expert
- Communication, collaboration, and advocacy will bring us forward.
Mix and Match: Mashups of Bibliographic Data
Linda Lomker, University of Minnesota
The forum, moderated by David Miller, provided interesting insights into three efforts to use metadata from a variety of sources including MARC records: ONIX Enrichment at OCLC, the Open Library Project, and Google Book Search metadata. These efforts have their challenges.
Renee Register, Global Product Manager for OCLC Cataloging and Metadata Services, was the first speaker. With a complex diagram to illustrate her point, she lamented the fact that library data is created apart from the flow of information from and to the publishers. Reasons include different data formats (ONIX and MARC) and different timing (publishers from the start of the publishing cycle to the end and libraries at the point of actual publication and upon receipt). She then showed how OCLC is planning to bring the ONIX data into the ordering/cataloging workflow and the MARC data into the publisher's information set.
Karen Coyle, project consultant, was the next speaker; she discussed the Open Library project, whose goal is to provide "one web page for every book ever published." Her role is to provide assistance in understanding library data. The project also uses data from Amazon, publishers, users, and other sources. It follows semantic web concepts using field types, data properties, and values; everything (types, properties, values) gets a URI. The data structures are open enough that anyone can develop new types and properties, edit any value, or develop displays. One realization she reached is that users no longer care much about alphabetical order, so standard library practices such as inverted name order and the omission of initial articles from titles are not a consideration for the project.
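The open data model Coyle described, in which every type, property, and value gets a URI and anyone can add new properties, can be sketched roughly as follows. The namespace, field names, and record are invented for illustration and are not Open Library's actual vocabulary.

```python
# Minimal sketch of a semantic-web-style open data model: every type,
# property, and value is identified by a URI, and new properties can be
# attached to a record without changing any fixed schema.
# All URIs and names below are hypothetical.

BASE = "https://example.org/"  # hypothetical namespace

def uri(kind, key):
    """Build a URI for a type, property, or value."""
    return f"{BASE}{kind}/{key}"

# A record is just a type URI plus (property URI -> value) pairs.
book = {
    "type": uri("type", "edition"),
    # Values are stored as they appear: no inverted name order,
    # no dropped initial articles.
    uri("property", "title"): "The Whale",
    uri("property", "author"): uri("value", "herman_melville"),
}

# Because properties are open-ended, anyone can attach a new one:
book[uri("property", "pagination")] = "xv, 635 p."

print(book["type"])
print(book[uri("property", "title")])
```

Giving every element a URI is what lets independent editors extend the data and build new displays without coordinating on a shared schema first.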
The final presenter was Kurt Groetsch, Technical Collections Specialist at Google, who discussed Google's metadata procedures for Google Book Search. Google has pulled data from more than one hundred sources, including libraries, publishers, retailers, and aggregators, and processed all of it to come up with one record. Google parses the data and uses identifiers such as ISBNs, LCCNs, and OCLC numbers to pull the data together into clusters, falling back on text matching if an identifier is not available. In a field-by-field process based on the trustworthiness of the sources, a "best-of" record for the cluster is created. One difficulty is identifiers, such as ISBNs, that are not unique; another is the enumeration of multivolume works; both were easily recognized as problematic by members of the audience. Of course, there are also problems with poor-quality records. Google is working to refine its process and resolve known data problems.
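The cluster-then-merge process Groetsch described can be sketched as follows. The source names, trust ranking, fields, and sample records are invented for the example; the real pipeline also handles non-unique identifiers, text-matching fallback, and far messier data.

```python
# Rough sketch of identifier-based clustering and field-by-field
# "best-of" record building. Source trust ranking and field names
# are hypothetical.

from collections import defaultdict

# Lower number = more trusted source (invented ranking).
TRUST = {"library": 0, "publisher": 1, "retailer": 2}

def cluster_by_identifier(records):
    """Group records that share an identifier (ISBN, LCCN, OCLC number)."""
    clusters = defaultdict(list)
    for rec in records:
        key = rec.get("isbn") or rec.get("lccn") or rec.get("oclc")
        clusters[key].append(rec)  # records with no identifier would
    return list(clusters.values())  # need text matching instead

def best_of(cluster):
    """For each field, take the value from the most trusted source that has it."""
    merged = {}
    fields = {f for rec in cluster for f in rec if f != "source"}
    for field in fields:
        candidates = [r for r in cluster if field in r]
        best = min(candidates, key=lambda r: TRUST[r["source"]])
        merged[field] = best[field]
    return merged

records = [
    {"source": "retailer", "isbn": "9780142437247",
     "title": "Moby Dick", "pages": 672},
    {"source": "library", "isbn": "9780142437247",
     "title": "Moby-Dick; or, The whale"},
]
merged = best_of(cluster_by_identifier(records)[0])
print(merged["title"])  # the library title wins; the retailer supplies pages
```

The point of the field-by-field approach is that the merged record can be better than any single source: a trusted title from one record, a page count only another record has.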
For more information, see the blogs written about the forum by Peter Murray (dltj blog), Eric Hellman (go to hellman) and William Denton (FRBR blog).
Year of Cataloging Research
Anne Sleeman, Community College of Baltimore County
To begin the Year of Cataloging Research, CCS sponsored this forum that attracted an audience of about 100. Qiang Jin, Chair of the ALCTS Cataloging and Classification Section, moderated. Michele Cloonan, Dean of the Simmons Graduate School of Library and Information Science, opened the forum by reiterating the need for cataloging research in her presentation “Problems and Opportunities in Cataloging Research and Pedagogy.” She noted that each generation has its own challenges. Currently, “evidence-based” is the buzzword in academia; research provides the evidence.
Sandy Roe, Editor of Cataloging & Classification Quarterly, then spoke about how to do and publish research. She recommended that those interested in publishing research refer to the LIS journals website (www.lis-editors.org) for further information. She spoke about the need to identify types of research (e.g., experimental, theoretical) and to create a better way to review and share cataloging research. Cloonan and Roe both advocated cooperative and multidisciplinary projects.
The second half of the forum was devoted to presentations of current cataloging research by Martha Yee, Cataloging Supervisor at the UCLA Film and Television Archive, and by Daniel Joudrey, Arlene Taylor, and Tina Gross. All noted that working catalogers are ideally placed to address research questions that appeal beyond their institutions.
Yee has a vision of the web as a shared database rather than a shared document store. She is doing experimental research on RDF, specifically subject relationships. (Her PowerPoint presentation and underlying paper are available online.) During the course of her presentation, she posed many possible research questions. She suggested we “reverse engineer” by deciding which displays and indexes we need and then figuring out how to create them. We must be proactive in the process.
Joudrey, Taylor, and Gross reviewed their research on “The Effect of Controlled Vocabulary on Keyword Searching,” a follow-up to their 2005 study on the same subject. Since that time, catalogs have been enriched. They were curious whether this would change the effect. The question the group addressed is: “What proportion of records retrieved by a keyword search has a keyword only in a subject heading field, and thus would not be retrieved if there were no subject headings, even in a catalog enriched with TOC and summary notes?” So far (the research is about 70 percent completed), the answer is 24.98 percent.
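The measurement behind the study's question can be expressed as a simple computation over catalog records. The field groupings and sample records below are invented for illustration; the actual study's methodology is more involved.

```python
# Illustrative sketch of the study's measurement: of the records a
# keyword search retrieves, what share match ONLY in a subject heading
# field? Field tags are simplified; sample data is hypothetical.

SUBJECT_FIELDS = {"650", "651"}        # subject heading fields
OTHER_FIELDS = {"245", "505", "520"}   # title, contents (TOC), summary

def matches(record, keyword, fields):
    kw = keyword.lower()
    return any(kw in record.get(f, "").lower() for f in fields)

def subject_only_proportion(records, keyword):
    retrieved = [r for r in records
                 if matches(r, keyword, SUBJECT_FIELDS | OTHER_FIELDS)]
    subject_only = [r for r in retrieved
                    if matches(r, keyword, SUBJECT_FIELDS)
                    and not matches(r, keyword, OTHER_FIELDS)]
    return len(subject_only) / len(retrieved)

sample = [
    {"245": "Cooking for beginners", "650": "Cooking"},  # matches title too
    {"245": "The joy of food", "650": "Cooking"},        # subject-only hit
    {"245": "Gardening basics", "650": "Horticulture"},  # not retrieved
]
print(subject_only_proportion(sample, "cooking"))  # 0.5
```

Records in the "subject-only" set are exactly those that a keyword search would lose if the catalog had no subject headings, which is what makes the proportion a measure of controlled vocabulary's continued value.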
Future of CMDS: Reorganization
Reeta Sinha, YBP Library Services
Moderated by Kathy Tezla, chair of Collection Management and Development Section (CMDS), this forum presented members with an opportunity to hear about the section’s plans to reorganize its current structure. The proposed new structure is intended to mirror shifts in the profession experienced by those managing collections.
The session began with a brief summary of the reorganization project. Starting in early 2009, the CMDS Reorganization Task Force evaluated the section’s current structure, its mission, committees and charges of each. The Executive Committee reviewed the Task Force’s recommendations through the fall of 2009. A final draft of the recommendations was ready before the 2010 ALA Midwinter Meeting. Prior to the conference, ALCTS issued a call for members to email their feedback about the draft plan to the CMDS Chair.
In addition to a handful of Executive Committee members, a few CMDS members attended the forum. One member commented that the Task Force had undertaken a “thoughtful” process and that the reorganization proposal represented “fresh thinking.” There was discussion of the proposed name change to “Collection Management Section.” While the new name is shorter, some of the interest groups use “collection development” in their names, and collection development practitioners may not identify with “collection management.” Problems related to recruiting leaders for CMDS interest groups were also mentioned. One proposed solution was to assign a coordinator to oversee these groups.
Tezla wrapped up the forum by thanking members for attending and noting that attendees could look forward to finalizing the reorganization plan and scheduling the membership vote.
Find more information about the CMDS reorganization plan online:
- CMDS web page
- CMDS reorganization background information
- CMDS draft reorganization structure diagram
CRS Cataloging Committee
Shana L. McDanold, University of Pennsylvania
Steve Shadle, co-chair of the Continuing Resources Cataloging Committee (CRCC), opened the session with some brief introductions and a review of the agenda. The first report was from Les Hawkins and covered CONSER activities. Hawkins began by congratulating Shadle for being awarded this year's Ulrich's Serials Librarianship Award. He discussed the training that he, Hien Nguyen, and Valerie Bross conducted with the new ATLA CONSER Funnel group, and the establishment of a liaison to the CRS Holdings Committee. Hawkins then reviewed what was discussed at the CONSER at-large meeting held on January 17, 2010 during the 2010 ALA Midwinter Meeting. The agenda included an ongoing discussion of establishing a cooperative effort within CONSER to authenticate open-access electronic journals such as those listed in the Directory of Open Access Journals (DOAJ). This effort would increase the coverage of full and authenticated records in the knowledge bases used by libraries to maintain their electronic journal collections.
There is an ongoing discussion of the LCRI footnote that designates the change from a CD-ROM to a DVD-ROM as a major change requiring a new record. The LCRI will be rewritten to indicate that a new record is no longer warranted if it is just a format change, including a change to the 300 field to make the SMD non-format-specific. Other questions included how to address uniform titles that have format qualifiers and titles that contain the words CD-ROM. For now, the changing of a uniform title will be left to cataloger's judgment. For changes to the title, it was clarified that if CD-ROM is simply dropped from the title and not replaced, the change will be considered major; a change from CD-ROM to DVD-ROM in the title, however, will be considered minor (it falls under the type-of-resource category).
Hawkins also summarized the MARBI proposals that affect continuing resources cataloging. MARBI Proposal 2010-01 was passed with amendments, splitting the fixed-field form code into multiple options: "q" for direct-access electronic resources and "o" for online electronic resources. The current code "s" for electronic will not be made obsolete, but will be redefined as optional. Also passed was Proposal 2010-02, adding subfield 5 (Institution to which field applies) to the 80X-830 Series Added Entry fields.
Regina Reynolds reported next on news from the Library of Congress (LC), reviewing some of the ongoing changes from the reorganization and physical moving of staff. The United States General and United States Regional publisher liaisons are currently operating as a combined division in the interim; LC is looking at proposals and evaluating restructuring options. LC is participating in a copyright "surge" to eliminate a copyright backlog. The project goal is to process 130,000 items from the backlog, working with ten additional staff from the combined publisher liaison division. The electronic deposit of copyright materials project will begin soon; LC will be allowed to demand electronic-only submissions via the Copyright Office. The project will start with about ninety titles for cataloging and processing, with patron access to be established in the future. The subject heading Cookery is being replaced with Cooking in LCSH and in current bibliographic catalog records. The Policy and Standards Division will provide more information, with a targeted implementation date of May/June 2010.
Reynolds then reported on United States ISSN Center news. The ISSN automatic register was successfully implemented in August 2009. The first ISSN in the print register was issued to the print journal Publishers Weekly (0000-0019); fittingly, the first ISSN issued using the automatic register went to the online version of the same journal. The benefits of the automatic register are that it is online, so an ISSN can be assigned from any location, including by teleworking staff (there is no central paper registry to physically update); that publishers receive an automatic email notification, which has been very positively received; and that the system can produce various reports about the requests received.
The ISSN news continued with an update on the ISSN-L. OCLC has processed the necessary MARC update and changes to accommodate the ISSN-L as of August 2009. OCLC will populate WorldCat with the ISSN-L utilizing the tables from the ISSN.org homepage. The tables, which are free, are updated quarterly. According to Robert Bremer, who will be monitoring the process, it may take up to six months to fully populate WorldCat. They are currently working out some logistics with the international ISSN centers regarding updated records being sent to them via the automatic reporting function. At the 2010 ALA Annual Conference, the United States ISSN Center is considering holding a symposium/gathering for publishers, abstracting and indexing services, vendors, etc. that have an interest in the ISSN-L to discuss implementation and use of the ISSN-L.
Other ISSN news includes a decision on the provisional scope defining which types of integrating resources are eligible for an ISSN. Currently included are databases, directories, and resources with print counterparts that have already been assigned an ISSN. Excluded are web sites (such as institutional, personal, or corporate pages) and blogs. Reynolds reported that some international ISSN centers are interested in serving as informal testers of RDA, contingent on access to the RDA tool. Finally, the LCRI change that makes a switch from CD-ROM to DVD-ROM a minor change is in sync with the ISSN guidelines: those guidelines base the issuance of a new record and ISSN on a change in the 007 format code, and since a change from CD-ROM to DVD-ROM does not alter that code, no new ISSN or record is issued. Currently, they anticipate that this treatment will remain the same under RDA.
Jennifer Young and Steve Shadle next reported on CC:DA. Barbara Tillett provided a report on LC to CC:DA covering the serials and publications division project to convert the paper file of newspapers and comic book issues, the LC project to archive online news web sites, and the National Digital Newspaper Program's development of a searchable database of all American newspapers, available via the Chronicling America web site.
John Attig gave a JSC report. Publication of RDA is slated for June 2010, and the JSC expects to continue to make changes and correct errors even after the product is released. A few issues have been deferred until after the release of RDA, several of which were sent back to their respective constituencies for review. Deferred serials issues include the use of square brackets, the handling of inaccuracies in the title, the preference for the full title in place of an acronym or initialism, and Appendix J, which covers relationship designators.
Attig's report at CC:DA was followed by a brief demo of the RDA web site and the assurance that complimentary access to the product will be available until August 31, 2010 to give everyone the opportunity to become familiar with it. Other CC:DA topics included a presentation by Diane Hillmann on application profiles, Karen Coyle's paper on the semantic web, a review of the MARBI proposals, an overview by the Task Force on RDA Training, a report from Peter Fletcher on the PCC non-Latin characters task force's activities, and recent CC:DA web site issues, including problems transferring files to and from the ALA servers.
The majority of the forum was a two-part presentation from Regina Reynolds titled "Testing RDA: Some Impacts on Continuing Resources." Reynolds used Flickr images (with permission) from Julie Miller, and some slide content was courtesy of Barbara Tillett or Judy Kuhagen. Part one of the presentation covered the testing of RDA. The methodology is based on the testing done for the CONSER Standard Record, which emphasized a data-driven testing model. Reynolds began with an overview of the process, the participants, and the current timeline; information can be found on the LC web page dedicated to RDA testing. The purpose of the testing is to gather data to answer concerns about the usability, cost, and utility of implementation; the cost-benefit analysis will cover technical, operational, and financial aspects. Results will be both quantitative (numerical) and qualitative (descriptive). Assumptions about the process include the following: the test results (both the data and the records created) will be publicly shared; the tool used will be the first release of the RDA product; existing library systems will be used for testing, though system developers are invited to explore the use of RDA in both existing and future systems; and informal testers are encouraged to follow the same methodology and to share and compare their results.
Reynolds then reviewed the specifics of the methodology. Formal testing is slated to begin on October 1, 2010, provided the RDA product is released as expected. A common set of titles will be cataloged using both RDA and AACR2, following the usual LCRIs and practices. The common set consists of twenty-five English-language titles: five serials, ten textual monographs, five A/V monographs, and five integrating resources. The list of titles used will be publicly available. In addition to the twenty-five common titles, formal testers will catalog other titles that are part of their normal workflow. Statisticians from both LC and NLM indicated that 800 records will be a statistically valid sample for the test. For each title, an online survey will be completed, in addition to an overview survey covering the process as a whole. Surveys will be gathered using Survey Monkey. No database searching will be performed, and no subject analysis is to be done. Authority records will be created as usual by the institutions that currently do so. In addition, all records created during the testing using RDA will have "rda" in subfield e of the 040 field. LC will host all the test records and is working with OCLC to determine how those records will be transferred and made available to the entire community. Arrangements are being made with OCLC to prevent duplication in the database, particularly in the case of authority records and the common set. Glenn Patton has suggested that using the current institution record model is a possibility, with the addition of a note indicating the record is part of the test set.
It is recognized that RDA provides lots of different options, and as part of the testing process, the institutions and individual catalogers (exercising their judgment) will make decisions. How options are applied will be part of the analysis, as well as the impact on copy cataloging and users. Records will be reviewed for conformity and any patterns of errors. At the end of the testing period, all the data will be compiled and a report will be issued. Decisions about implementation will be made after the report is distributed. A member of the audience suggested that user testing of the records should be blind; for example, a user is shown two records without any indication which record followed which rules, and asked to indicate which one is preferred or more useful.
The second part of the presentation from Reynolds covered "RDA and Continuing Resources." Reynolds emphasized that the presentation was not intended as training, but rather as an overview of some of the changes. She also emphasized that RDA is a set of instructions, not rules, and that cataloger's judgment is very important when using RDA. In RDA, “continuing resource” does not appear as a concept or a term; instead, serial and integrating resource are each considered a mode of issuance. Reynolds began by reviewing RDA 1.6, when to make a new description. RDA includes new explicit reasons, such as a change in mode of issuance, a change in media type, a change in edition statement, or the re-basing of an integrating resource. Again, cataloger's judgment plays a much larger role in making the final decision. RDA notes that the "preferred title" is similar to the uniform title. It may have an impact on the key title, as the title of the first resource (such as an issue) received is the preferred source of the title. The title proper is addressed in RDA 2.2.1, with a principle of representation. Per RDA, catalogers will not correct errors in the title when transcribing information for monographs; instead they will add a 246 for the corrected version. The opposite is true for serials and integrating resources: catalogers will correct errors in the title to ensure a stable title and record, and transcribe the error in a MARC 246 for title variations.
The general guidelines for numbering of serials can be found in RDA 1.6.1, and more information for first or only numbering sequence in 2.6.2 through 2.6.5. Sources for numbering are specified. The option to give the numbering as an unformatted or as a formatted note is included. Additionally, communities will have the ability to add instructions (possibly under workflows) in the RDA online product, giving groups such as CONSER a space to include their guidelines. RDA 184.108.40.206 covers dates, and if a date is not known, catalogers can supply an approximate date. Edition statements should be transcribed exactly as they appear. The ISSN for series and subseries can be repeated in the transcription, and should have no impact on collocation and display. The place of publication should be transcribed as it appears with no abbreviations or additions. If unknown, use "place of publication not identified" in place of the Latin abbreviations. Likewise, transcribe the name of the publisher and distributor in separate elements exactly as they appear. Unknown publishers should be listed as "publisher not identified."
When including the key title and ISSN in a record, RDA 2.3.9 stipulates that the first preferred source is the ISSN register, and that the key title should be recorded if it appears on the resource or is readily available. There are instructions for transcribing the manifestation ISSN (RDA 2.15) as well as an incorrect ISSN (RDA 220.127.116.11). RDA does assume the use of separate records, but will allow for a single record if preferred. RDA emphasizes relationships for all modes of issuance, and in RDA the ISSN can be used as an identifier in relationship elements. More relationship possibilities will emerge as RDA is used. Reynolds listed a few other changes, including the treatment of other title information, the instruction to no longer use abbreviations in subfield b of the 300 field, and the instruction to always include carrier description characteristics.
RDA has impacted MARC21 in several ways. The GMD is now treated as three elements, all of which are repeatable: 336 for content, 337 for media, and 338 for carrier. There are new codes in the 007 and 008 fields. Other changes in MARC21 to accommodate RDA will be forthcoming.
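The replacement of the single GMD with three repeatable element fields can be illustrated with a small sketch. The content/media/carrier terms and codes below (text/txt, computer/c, online resource/cr) are standard RDA vocabulary values; the record structure itself is a simplified stand-in for real MARC21, not an actual MARC serialization.

```python
# Sketch of the three repeatable RDA fields that replace the GMD,
# shown as plain data rather than real MARC21: 336 (content type),
# 337 (media type), 338 (carrier type).

def rda_type_fields(content, media, carrier):
    """Build 336/337/338 fields from (term, code) pairs."""
    return [
        {"tag": "336", "a": content[0], "b": content[1], "2": "rdacontent"},
        {"tag": "337", "a": media[0], "b": media[1], "2": "rdamedia"},
        {"tag": "338", "a": carrier[0], "b": carrier[1], "2": "rdacarrier"},
    ]

# An online journal: textual content, computer-mediated, online carrier.
online_serial = rda_type_fields(("text", "txt"),
                                ("computer", "c"),
                                ("online resource", "cr"))
for f in online_serial:
    print(f["tag"], f["a"], f["b"], f["2"])
```

Splitting the old single GMD into three separately coded, repeatable elements is what lets one record carry multiple content or carrier types, something the GMD could not express.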
Reynolds then reviewed the provisional timeline for RDA testing, emphasizing that it is dependent on the release date. The testing is expected to take approximately nine months: three months of training, three months of formal testing, and three months for review and analysis. Reynolds then opened the floor for questions. Of concern was the display of the new MARC21 fields and elements. Reynolds explained that display will be determined by vendors and institutions; however, for planning purposes the core elements will be posted so display issues can be addressed in advance. An attendee inquired what users will look at during the user testing. Each library will determine what its user testing involves and how it is conducted, but the survey will include a question about how the records were viewed to ensure that context is taken into account during evaluation. It was then suggested that testers provide a screen shot with their reports. The last question concerned which elements of the CONSER standard record are not in RDA, and vice versa. Reynolds acknowledged that this is something CONSER will need to analyze, such as the issue surrounding the treatment of the statement of responsibility in RDA versus the CONSER guidelines. Reynolds concluded by emphasizing once again that RDA is a set of instructions, not rules, so the impact of following or not following the instructions will have to be assessed prior to making a decision.
Shadle closed the forum with a discussion about possibly doing informal testing that would be coordinated by the Continuing Resources Cataloging Committee. The informal testing would allow for testing an increased number of serial titles, and would ensure that all types of continuing resources are included and well represented. Interestingly, the committee was initially formed to test AACR2, so informal testing of RDA would be appropriate and supported by the committee's history. Maggie Horn reported that the Executive Committee of the Continuing Resources Section is supportive, and that there has been a suggestion to ask ALA Publishing to allow an ALA committee to have access for the expanded testing. It was also made clear that at LC, as well as the National Agricultural Library, all serials cataloged during the testing period, in addition to the five core titles, will be cataloged using RDA. In general, the attendees at the forum were very interested in and supportive of the idea of expanded informal testing. A committee member compiled a list of the questions to be answered (such as access to the RDA online product). The committee will produce a plan, with a goal of having its own set of test titles and a plan in place before the 2010 ALA Annual Conference in June.
CRS Committee on Holdings Information Forum
Holding Information for E-Books
Sion Romaine, University of Washington
This forum, composed of a three member panel, discussed issues in managing electronic books (e-books), including holdings standards, the role of holdings information, and the similarities and differences between managing holdings information for e-journals and e-books.
Matt Goldner, OCLC Product and Technology Advocate, discussed the sustainability of managing e-books from the viewpoint of accurate discovery and easy access. Not only are e-books being produced in two different formats (text and audio), they are also accessible via many different devices and platforms. Maintaining individual records with access points in the library catalog does not appear to be sustainable in the long run. Instead, libraries will need to rely on publishers and distributors to provide good title level bibliographic data, an access point (e.g., URL, downloadable console) and an indication of the formats held (e.g., PDF, html, proprietary, audio). Having individual e-book records tied to a knowledge base may be the most practical solution, and is a logical outcome of vendor neutral records.
Goldner noted that on December 25, 2009, Amazon sold more e-books than physical books. The bottom line is: Libraries need to get in front of the wave and work with e-book suppliers and discovery and delivery system suppliers to develop sustainable resource management models.
Maria Stanton, Serials Solutions Director of E-Content, discussed the similarities between the growth of e-journals and the growth of e-books, and the benefits of the provider neutral record.
E-books are behaving much like e-journals did in the late 1990s. E-book collections are offered through multiple business models on multiple platforms, identifiers (ISBNs) are proving to be inconsistent, and metadata varies from platform to platform. As with e-journals, we need a sustainable model to successfully connect users and content. Stanton pointed to the growth of the Serials Solutions Knowledge Base (300,000 e-books in 2008; 2 million+ in 2010) as an example of how quickly the e-book market is growing.
Serials Solutions supports the MARC Record Guide for Monograph Aggregator Vendors and the provider neutral record, which allows a single bibliographic record to cover all equivalent manifestations of an online monograph. The e-book provider and publisher information is considered local holdings information, extraneous to the bibliographic record. The benefit of the one record approach is less confusion, as the focus shifts from the e-book provider to the e-book title, and maintenance becomes easier. Essentially, provider-neutral MARC records are making it easier for users to access high quality resources.
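A toy sketch of the provider-neutral idea described above: provider-specific copies of the same e-book collapse into a single bibliographic record, with the provider and URL carried as local holdings information rather than as separate records. The titles, providers, and URLs below are invented for illustration and are not from the forum.

```python
from collections import defaultdict

# Each tuple is one provider-specific copy of an e-book.
# Under the provider-neutral model, copies of the same title
# share one record, and provider/URL pairs become holdings data.
copies = [
    ("Example Monograph", "ProviderA", "https://a.example/123"),
    ("Example Monograph", "ProviderB", "https://b.example/xyz"),
    ("Another Title",     "ProviderA", "https://a.example/456"),
]

records = defaultdict(list)  # one bibliographic record per title
for title, provider, url in copies:
    # provider details attach as local holdings, not new records
    records[title].append({"provider": provider, "url": url})

print(len(records))                        # 2 records instead of 3
print(len(records["Example Monograph"]))   # 2 holdings on one record
```

The point of the sketch is the shift of focus Stanton described: the title is the unit of description, and provider information moves down to the holdings level, where it is easier to maintain.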
Kathy Klemperer, Project Manager, Harrassowitz/EDItEUR, discussed the development of ONIX holdings standards. In the print world, data flow was simple: a straightforward relationship between the publisher, an intermediary (agent), the library’s integrated library system, and, possibly, a shared cataloging system. Book holdings were exchanged through loading of MARC records, while serials holdings were handled by the check-in process. With the advent of electronic resources, however, new players joined the scene: content hosting systems, resource manager knowledge bases, and link resolvers.
The recently developed ONIX Serials Online Holdings (SOH) standard allows for transmission of subscription information from the publisher, agent or content-hosting system to a knowledge base and from the knowledge base to the catalog. However, the SOH standard needs to be expanded to cover e-books. Unlike serials, e-books do not have enumeration or chronology, so they do not fit with the SOH message format. While many e-book packages are subscription products, they are not serials. We need to redefine holdings as access rather than as serials or subscriptions.
EDItEUR is currently working to adapt the SOH standard for e-books. However, as with e-journals, suppliers will need to commit to using these standards to transmit holdings information about e-books. To sum up: what you have access to is now more important than what you subscribe to.
Questions from the audience focused on the quality of vendor-supplied MARC records, citation linking for e-books, and having knowledge base managers like Serials Solutions share e-books holdings with OCLC so institutions can set holdings.
CRS Standards Update Forum
Marla Chesler, Library of Congress - FLICC/FEDLINK
Hana Levay, Information Resources Librarian, University of Washington Libraries, discussed updates and applications of SUSHI (Standardized Usage Statistics Harvesting Initiative) and Project COUNTER (Counting Online Usage of Networked Electronic Resources). Project COUNTER is an international initiative serving librarians, publishers and intermediaries by setting standards that facilitate the recording and reporting of online usage statistics in a consistent, credible and compatible way. SUSHI is a model for automation of statistics harvesting. The most recent release of SUSHI is 1.6, and COUNTER is at version 3.0. Version 3.0 of COUNTER requires SUSHI compatibility. Some of the useful additions to the SUSHI page are the server registry, which has information about SUSHI-compliant vendors; the SUSHI toolkits to encourage development of new applications; and the FAQ/Getting Started pages.
Using the Scholarly Stats feed, Levay has configured the Innovative Millennium ERM (electronic resource management) to download the usage statistics from the COUNTER Compliant vendors and then to pull the total cost into the statistics to automatically calculate the total cost per use. The conclusions of the presentation were that the SUSHI standard was active and usable, that ERM systems can work with SUSHI to provide effortless cost per use data, and that the updated SUSHI webpage has made it easier to identify current vendors and the information libraries need to implement SUSHI with those vendors.
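The cost-per-use calculation Levay described reduces to simple arithmetic once COUNTER-style usage totals and subscription costs are in one place. A minimal sketch, assuming invented titles and figures (the forum did not publish any numbers):

```python
# Hypothetical cost-per-use calculation over COUNTER-style data.
# Titles, usage counts, and costs below are illustrative only.

usage = {  # full-text requests, as in a COUNTER journal report
    "Journal of Examples": 1250,
    "Annals of Illustration": 40,
}
cost = {  # annual subscription cost per title
    "Journal of Examples": 2500.00,
    "Annals of Illustration": 1800.00,
}

def cost_per_use(title):
    """Divide annual cost by total uses; None when there is no usage."""
    uses = usage.get(title, 0)
    return cost[title] / uses if uses else None

for title in sorted(cost):
    cpu = cost_per_use(title)
    print(f"{title}: ${cpu:.2f} per use")
```

In an ERM like the one Levay described, the usage side of this arithmetic arrives automatically via SUSHI harvesting, which is what makes the figure "effortless" to produce.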
Rick Burke, Executive Director of SCELC (Statewide California Electronic Library Consortium), discussed ONIX-PL (Online Information Exchange for Publications Licenses) and NISO CORE (Cost of Resource Exchange). Working with Springer, OUP (Oxford University Press), Nature Publishing Group, and Elsevier to encode their licenses into ONIX-PL, SCELC hopes to take advantage of the ONIX-PL encoding standard to express licenses in a machine-readable format so that they can be loaded seamlessly into electronic resource management systems. ONIX-PL can simplify the process of license negotiations since it would standardize the way key license terms display, making it easier to identify terms that need to be negotiated. The free open source software OPLE (ONIX-PL Editing Tool) would make it easier to enter the XML license data. Library services vendors will add ONIX-PL support to their ERM systems when and if the standard is adopted. Impediments to ONIX-PL adoption include the time-consuming nature of encoding a license, publishers’ lack of staff and/or technical skill to provide ONIX-PL versions of their licenses, and the reluctance of third-party vendors to encode licenses, fearing legal repercussions due to potential misinterpretations.
NISO CORE facilitates the exchange of cost, fund, vendor and invoice information between integrated library systems, business systems, ERMs, and other third parties, such as subscription agents. The working group drafted the standard in 2008-2009, and CORE trials are now underway. The goals of the draft standard are to develop and refine the list of data elements holding acquisitions metadata, expressed in XML, and to create a transport protocol to move these data elements from one system to another. The standard could be the missing link that resolves the current need to enter redundant data in multiple systems.
In summary, Burke stated that ONIX-PL provides the best possibility for simplifying a cumbersome licensing process and improving ERM functionality, and NISO CORE has the potential to effectively and seamlessly share e-resource acquisitions data.
Publisher Vendor Library Relations Forum
Katharine Farrell, Princeton University
The Publisher-Vendor-Library Relations (PVLR) Forum topic “Innovation by Necessity” was addressed by four panelists: Beth Bernhardt, University of North Carolina at Greensboro; Lindsey Schell, University of Texas at Austin; Judy Luther, Informed Strategies; and Hester Campbell, YBP. The economic downturn, coupled with upheaval throughout the library materials supply chain, is forcing organizations to reassess routine workflows and balance demands for new services against shrinking resources.
Bernhardt outlined UNC Greensboro’s foray into patron driven acquisitions, a response to changing user expectations for speed of delivery and changing research needs as programs expand and mutate. She described the pilot program with MyiLibrary for patron selected e-books, in which the first use incurs no cost to the library. She noted that a similar pilot was planned with Rittenhouse in the field of nursing which will allow both library and publisher to explore models. The library will monitor both pilots this year and analyze results. She discussed budget management strategies and noted that these pilots were occurring alongside traditional selection models.
Schell described UT Austin’s patron driven e-book service through EBL. They are using a pay-per-view model in which the fourth use triggers purchase. They have made 70,000 titles available and have purchased 4,000 since 2007. UT Austin is beginning to consider the logistics of providing a patron driven print purchase model, which would provide pre-selected sets of records from which patrons might choose. They are analyzing patron purchasing patterns and trying to resolve the delivery issues. Schell also noted that in technical services they are discussing other options for efficiency, including eliminating approval review; moving standing orders for monographic series into approval plans; and eliminating serials check-in.
Luther discussed efforts to develop a common platform for the delivery of university press e-books. She compared the trajectory of e-books to that of e-journals moving from aggregator back to publisher. She noted that “university press” is a brand. The project, funded by Mellon, involves four mid-sized university presses (New York, University of Pennsylvania, Temple, and Rutgers) and has a goal of determining library expectations, reviewing platform functionality, and developing models and tools for presses to use. Informed Strategies surveyed prominent librarians in the e-book arena to identify questions. They are now testing conclusions and gathering more data. They are evaluating pros and cons of purchase versus subscription models, and assessing platforms and possible business models. Luther noted the need to be able to track royalties, not an issue in the e-journal market. The report is due in February.
Campbell pointed out that vendors are always innovating; they must in order to remain competitive. She noted the perception that “e” should be easy and cheap. She reviewed the supply chain, reminding the group about the services vendors offer both publishers and libraries (consolidation, marketing, approval services, customer service), and described the rocky beginnings of offering e-books through vendors: Were they books or serials? The vendor never sees the e-book. There are no standards. Should the vendor develop a delivery platform or pass through to a third-party platform? She noted that there are now more e-content choices in vendor databases, and that librarians are demanding that publishers work with vendors to deliver e-content and reduce the delay in producing the electronic version.
A brief discussion with the audience followed, centering on the inadequacy of the current generation of e-book readers and the increasing challenges for interlibrary loan services in an e-book environment.
Pass It On: Preservation Week
Tara Kennedy, Yale University
This ninety-minute forum shared experiences from programs designed to inspire participation in Preservation Week 2010 (May 9-15) and in future years. Speakers showed how the widespread interest in family and local history and in personal collections (photographs, diaries, letters, scrapbooks, film and video, quilts, costumes, artworks, and more) can help build preservation and library awareness, reach new audiences, and strengthen community ties.
- Jill Rawnsley, Consultant, Philadelphia, Pennsylvania
- Michele Stricker, Consultant, Library Development Bureau, New Jersey State Library, Trenton, New Jersey
- Donia Conn, Workshop Program and Reference Coordinator, Northeast Document Conservation Center, Andover, Massachusetts.
The co-chairs of the Preservation Week Working Group were not able to attend the 2010 ALA Midwinter Meeting, so Karen Brown, Chair of the Preservation and Reformatting Section (PARS), opened the program on their behalf.
Preservation Week, sponsored by the Institute for Museum and Library Services (IMLS), the American Institute for Conservation (AIC) and the Library of Congress, is the first national preservation week in the United States focusing on collections. These organizations are encouraging people to do at least one thing in their communities to recognize Preservation Week, in hopes of raising community awareness and promoting the care of collections. The Preservation Week web site is available and full of ideas and further information.
Michele Stricker spoke first. She handed out a “Shocking Statistics Sheet” consisting of abstracts from the Heritage Health Index (HHI), a survey of the “health” of cultural heritage in the United States conducted by Heritage Preservation. The HHI statistics were a driving force in instituting Preservation Week; the week is an official means of highlighting what cultural property professionals and the general public can do to save both shared collections and their own.
Ms. Stricker went on to describe how the state of New Jersey participates in preservation activities. Some of these action items were:
- Regional workshops for New Jersey librarians in preservation, focusing on collections assessment, fundraising and disaster preparedness.
- A state-created web page for preservation reference for New Jersey librarians.
- A statewide planning grant from the Institute for Museum and Library Services (IMLS) Connecting to Collections grant to conduct a preservation needs survey across the state of New Jersey.
Ms. Stricker encouraged the audience to start with a small program, noting that one does not have to start from scratch. A resource she mentioned as a helpful tool is Capitalizing on Collections Care, a book available from IMLS.
Jill Rawnsley spoke next. As a preservation consultant, she is developing a workshop for the state of New Jersey as part of a statewide program called “Supporting Library Staff Skills for Fundraising and Outreach.” The workshop will focus on how to:
- Assist the public in caring for their treasures,
- Develop outreach programs, and
- Help institutions to care for their own treasures.
The workshop will focus on different types of materials that may exist in people’s personal collections.
Ms. Rawnsley stressed the importance of preservation and of getting the information out to the general public, which is easier than ever given the resources now available, such as web sites and publications. She also spoke about preservation as a theme cultural institutions could use to draw in new audiences.
Rawnsley also emphasized that it is okay to start small and that activities do not all need to be during Preservation Week. Some areas of focus that she thought might be good starting points are:
- Talking to people about simple preservation strategies that can help slow the deterioration of collections.
- Focusing on handling, display, storage, etc. as topics.
Donia Conn was the final speaker. She introduced the Preservation Week web site and pointed out some of its highlights:
- The Preservation Tool Kit which provides resources and information on a variety of preservation topics. It includes a section for teaching preservation to children.
- The Events Toolkit which includes flyers and other materials to help plan a Preservation Week event at one’s institution or in one’s community. There is a “Speakers Bureau” where people can find a speaker for their event. Ms. Conn encouraged preservation professionals in the audience to sign up for the Bureau.
- There is a poster and bookmark available through ALA publications to promote the event.
There was a question and answer period following the speakers’ presentations.
RDA Update Forum
Emily Prather-Rodgers, North Central College (Naperville, Illinois)
Just over four years after the first draft of RDA: Resource Description and Access was released to the library community for review, this forum provided firm answers to the big question that many catalogers and library administrators have been asking: how much will RDA cost? Speakers also answered other important questions about the launch and implementation of the online-only product.
Sally McCallum, Chief of the Network Development and MARC Standards Office at the Library of Congress, opened the forum with an overview of the major changes to the MARC bibliographic and authority formats that will be required in order to fully implement RDA. For bibliographic records, these include the treatment of Content, Carrier, and Media Types with the introduction of the 336, 337, and 338 fields, and changes to Leader/06 and the 007 field. More dramatic changes will be required in the authority format in order to clarify attributes of names and to allow for the expression of relationships. McCallum acknowledged that other, smaller changes will be required once RDA enters the formal testing phase.
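For readers who have not yet seen the new fields, a minimal sketch of how the 336/337/338 trio might look for an ordinary printed book. The subfield values follow the published RDA content, media, and carrier vocabularies; the record itself and the rendering helper are invented for illustration.

```python
# Hypothetical rendering of the three RDA element fields for a
# printed book, as plain tag/subfield strings. The $a terms, $b
# codes, and $2 sources come from the RDA vocabularies.

rda_fields = [
    ("336", {"a": "text", "b": "txt", "2": "rdacontent"}),   # content type
    ("337", {"a": "unmediated", "b": "n", "2": "rdamedia"}), # media type
    ("338", {"a": "volume", "b": "nc", "2": "rdacarrier"}),  # carrier type
]

def render(tag, subfields):
    """Render a field in a conventional $-delimited display form."""
    return tag + "  " + " ".join(f"${code}{value}"
                                 for code, value in subfields.items())

for tag, subfields in rda_fields:
    print(render(tag, subfields))
# e.g. 336  $atext $btxt $2rdacontent
```

Because each of the three fields is repeatable, a resource with mixed content (say, text plus recorded music) would simply carry additional 336 fields rather than a compound GMD.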
Don Chatham, Associate Executive Director of ALA Publishing, spoke briefly about the current state of ALA Publishing and its goal to combine traditional publishing with emerging electronic resources. He then introduced Troy Linker, Director of Publishing Technology at ALA. Linker described the RDA Toolkit as a browser-based interface that allows catalogers to interact with a variety of documents, including RDA; the RDA element set (formerly known as the schema set dictionary); AACR2; and, eventually, specialized community documentation, encoding documentation, and policy statements. The RDA Toolkit will launch in June 2010, and it is currently in user testing to ensure that the interface is intuitive. Libraries that subscribe to the Toolkit will be able to choose from three access models: a traditional username/password model, IP range authentication, or access via a referring URL. Users will find that the content is logically arranged under tabs rather than in a traditional A-Z index format. The publisher considers this an electronic substitute for the bookmarks and dog ears readers often use in the print world. In addition, the Toolkit will include a table of contents for RDA as its own document and in tree-browse format in a left-hand frame. Linker pointed out some of the most useful features of the Toolkit, which include showing or hiding examples, adding bookmarks and local notes to the text, printing the full text as a PDF, and space to create local workflows and mappings. It is anticipated that licensing fees will support continual development of the Toolkit and allow for multiple releases with improved functionality.
Annual licenses for the RDA Toolkit must be purchased directly from one of the co-publishers. In the United States, the fee has been set at $325 for a single concurrent user. Additional concurrent users are priced as follows: two to nine users for $55 each, ten to nineteen users for $50 each, and twenty or more users for $45 each. Library schools will be offered special pricing models and, in general, short term additional concurrent users will be granted to licensees for training purposes. Potential subscribers are encouraged to visit the site or to e-mail for additional information about pricing and the product. The RDA Toolkit is scheduled to launch in June 2010, and ALA Publishing will offer completely open access until August 31, 2010 so that libraries may evaluate the product before signing a license agreement. According to Linker, “We’re very excited to launch it….It looks like the basics are really there, and it’s a matter of fine tuning at this point.”
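One plausible reading of the tiered pricing quoted above is that each additional concurrent user is priced by the tier that user falls into: the first user costs $325, users two through nine cost $55 each, users ten through nineteen cost $50 each, and the twentieth user onward cost $45 each. The sketch below only illustrates that arithmetic; the actual license terms from the co-publishers would govern.

```python
# Hypothetical reading of the RDA Toolkit U.S. pricing tiers:
# $325 for the first concurrent user, then each additional user
# priced by the tier that user falls into. Illustration only.

def toolkit_fee(users: int) -> int:
    """Annual U.S. fee for a given number of concurrent users."""
    if users < 1:
        return 0
    fee = 325  # first concurrent user
    for n in range(2, users + 1):
        if n <= 9:
            fee += 55   # second through ninth user
        elif n <= 19:
            fee += 50   # tenth through nineteenth user
        else:
            fee += 45   # twentieth user and beyond
    return fee

print(toolkit_fee(1))   # 325
print(toolkit_fee(5))   # 325 + 4 * 55 = 545
print(toolkit_fee(12))  # 325 + 8 * 55 + 3 * 50 = 915
```

A library weighing, say, five against twelve concurrent users could compare the marginal cost per seat this way before contacting the co-publishers for a formal quote.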
After Linker answered questions about the RDA Toolkit, Beacher Wiggins, Director of Acquisitions and Bibliographic Access at the Library of Congress, discussed the United States coordinated testing of RDA. The twenty-four test partners convened at Midwinter to finalize the testing timeline, determining that the official test period will comprise nine months. The first three months (July through September) will serve as a learning curve period, allowing librarians to learn the product and ask any questions that may arise. During the formal test period (October through December), participating libraries will catalog a common set of twenty-five titles, using both AACR2 and RDA rules. These twenty-five titles will all be in English, in a variety of formats, and will be chosen in an attempt to capture all of the differences between the two standards. Once an institution has cataloged the common set, it will catalog all regular materials that flow through the institution. The records created during the formal test period will be made accessible to vendors, researchers, data specialists, and other interested parties. The final three months of the testing period (January through March 2011) will be used for assessment of the data. Wiggins stated, “We think we are on a roll now. We will, at ALA Annual in Washington, have a session for the participants in the testing and, assuming there are updates,” another RDA Update Forum will be held. “We hope that you will be deluged by all the information that you need.”
The final speaker on the program, Glen Patton, Director of WorldCat Quality Management, OCLC, spoke about his company’s plans for working with the three national libraries (the Library of Congress, the National Library of Medicine, and the National Agricultural Library) to support the testing institutions. OCLC hopes to provide testers with the option of creating RDA/MARC 21 records in the Connexion browser and client and to implement batch loading processes for records in that format. MARC format changes should be finalized and installed in May, including implementation of a 040 subfield e of “rda.” Catalogers at libraries that are not participating as formal testers should be aware that they will begin seeing RDA records in Connexion by September, as the current plan is for all test records to be added to OCLC’s live database. How this will affect non-testing libraries will be made clearer as documentation is developed over the next several months. Training plans include webinars for test participants and other member libraries, the standard technical bulletin of MARC changes, and other documentation that will be determined along the way.
The many years of work the members of the Joint Steering Committee put into the creation of RDA should come to fruition late this summer, and plans for testing are in place. The only major hurdle that remains is getting the vendor community on board. Wiggins, Linker, and others are reaching out in various ways, but it appears at this point that everyone is waiting to see who will be first.