Volunteer Reporters Cover ALCTS Forums and Events in San Diego
Volunteers who attended the 2011 ALA Midwinter Meeting in San Diego provided the following summary reports. We thank the volunteers who covered these events; their efforts enable the rest of us to benefit from the presentations. We regret that volunteers were not available to report on all the events.
The Administrator, RDA, and the Future Catalog: Issues, Viewpoints, Alternatives
Jane Rosario, University of California, Berkeley
The first speaker, Tim Strawn, Director of Information Resources at Cal Poly, San Luis Obispo, gave a broad overview and introduction to implementation issues. He entitled his presentation “Regret Rien or Tomorrow Never Knows.” Strawn believes that RDA will be adopted, if not by all, then by some part of the industry; AACR2 and RDA records will reside simultaneously and interact in a web-based environment. RDA represents a philosophical shift from older database models to that of a relational database based on FRBR. Under the new paradigm, data can be exploded, deconstructed, reconstructed, and better parsed by machines than in the past. Professional education for this new model is much more accessible than before, with ALCTS webinars, the RDA Toolkit, and the like readily available on the Web. RDA promises to be more responsive to user needs; it is an international standard (unlike AACR2) and will be able to liberate data from isolating silos. RDA can be used with MARC and outside of MARC; it frees MARC metadata for wider use. It has more precisely defined bibliographic elements that can be used in other systems, increasing the visibility and viability of the library. To make the transition to a FRBR-based environment a success, catalogers must be trained, and librarians must work with vendors and integrated library systems to display new fields. Another issue is to what degree we work for our respective communities, and to what degree we should work on shared content; this is an extremely important managerial decision. “Regret Rien” is a reference to a restaurant in “Life is Sweet” (a Mike Leigh film from 1991) that served dishes with unusual, incompatible, and sometimes unpalatable ingredients. Our job is to make our mish-mash of standards palatable.
Olivia Madison, Dean of the Library, Iowa State University, spoke about RDA from a director’s point of view. She presented a vision of the future catalog and explored how RDA could support management and provide better access to users. RDA would promote collaborative, decentralized information sharing, be Web-based, and be implemented outside the United States. The implementation of RDA will affect all aspects of our daily work and that of vendors. Madison was deeply impressed at how RDA testing brought the community together and helped us focus on who we are and what we are doing. RDA creates a future for catalogers, and there are incredibly rich training opportunities through the Web. We must ask ourselves what we are willing to achieve, and clearly articulate our goals to administrators. Implementation will take time; a whole community must make this happen together. There was widespread anxiety in the past about changes to rules and implementations, and the profession survived. We must look to the broader context of our bibliographic systems. RDA is not tearing the catalog down, but building it up. Analytical content often cannot be found within a catalog, but now we can bring many elements together for our users. RDA is a framework for building relationships, not a marginalization of the catalog. We need to put our energies toward building an optimal catalog that takes advantage of all of our resources. Madison sees three types of content that are imperative to integrate into the catalog of the future: (1) journal literature, (2) special collections, and (3) institutional repositories. Researchers are frustrated that they cannot find journal literature in our current catalogs. Bring the content in. Stop calling catalogs “databases”; they are indexes. Use terminology that makes sense to the community. Special collections should be brought into the fold; librarians have so much expertise and assistance to give them.
It would be good to see an end to “hidden collections.” Institutional repositories hold published research, some from publicly funded grants. The catalog of the future will have a simple search-box interface that mines the great richness of all the collections. Libraries are in great shape. We have a pathway to relevance.
Linda Barnhart, Head, Metadata Services Department, University of California San Diego, presented RDA from a department head’s view in a presentation entitled “RDA: A View from the Middle.” A department head focuses on three main issues: (1) how difficult will RDA be to implement, (2) how expensive will it be, and (3) is RDA the right direction, worth the cost and effort? For a department head, RDA is only one of many concerns. RDA testing has yielded some anecdotal information regarding implementation from participating institutions: it is not that difficult to implement, libraries should start preparing for RDA now, and wide involvement helps implementation. How expensive will RDA be to implement? Staff time will be expensive, but Barnhart argued that libraries cannot afford not to adopt RDA. There is a historic trend in libraries toward cooperation and cost sharing. RDA’s underlying technical data model allows precise description at each data point. We must rethink how we build, maintain, and share data, and we must share with other communities; this benefits all. Linked data is the emerging technology. RDA is our future; it is not perfect, but it is our best hope. Much more difficult than the implementation of RDA will be reworking library systems and tools in a post-MARC era; RDA implementation pales in comparison.
Molly Tamarkin, Associate University Librarian for Information Technology, Duke University, spoke about connecting RDA to the catalog. Her presentation was entitled “May You Live in Interesting Times,” a phrase purported to be a Chinese curse. Tamarkin spent ten years outside of libraries in the information technology sector, so her perspective is unusual in our industry. Libraries have in fact been using hosted services (“cloud computing”) for years. Currently, much information in libraries resides in silos. The AACR2 and MARC standards predate relational databases; they are a means of describing a physical item in a condensed, digital format. MARC is a structure of necessity; our catalogs are limited by our integrated library systems. We must ask who the catalog serves: us or the community?
Tamarkin discussed structured searches, recalling the time when online search tools such as AltaVista, Dogpile, and Yahoo used structured topics. They have been replaced by the single search box of Google, which requires no search strategy. Has searching gotten better or worse? Perhaps that is not the right question. Users find that getting “good enough” results immediately is better than digging for “perfect” results through more refined searching; in the current environment, immediacy is privileged. Tamarkin displayed the cover of an issue of a computer magazine that was completely devoted to online searching, and noted that no librarians participated in writing the issue. Searching is a lucrative business outside of libraries; we should look to what is happening with description outside of libraries. RDF (a W3C standard) seems to be the best data format for linked data. So what is unique to libraries? Physical materials. Print content is decreasing, digital content is becoming unbundled, and in the future printed acquisitions are likely to be “special.” Digital content will be less owned and more leased. The effects of this content shift on the catalog will be more silos, more discovery tools, less control, more “good enough,” and less value in the catalog. The future role of the catalog will be for special collections and internal inventories. Content will be discoverable via Web standards; descriptive standards will apply only to a minority of the objects available. We must try to understand what is possible and build processes around change. Innovation happens continually. Cloud services expand our ability to model data in different ways. Library administration must determine what kind of organization it is, what kind of organizational culture it has, and where the organization should be versus where it actually is. We must seek what is best for our community, current and future.
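Tamarkin’s point about RDF and linked data can be made concrete with a small sketch. The snippet below is purely illustrative: the URIs, prefixes, and helper function are invented for this example, not drawn from any presentation. It shows how a bibliographic statement decomposes into subject-predicate-object triples, the shape of data that Web applications outside the library can consume directly.

```python
# Illustrative sketch only: bibliographic data expressed as RDF-style
# (subject, predicate, object) triples. The URIs and prefixes below are
# invented for this example; real linked data would draw on published
# vocabularies such as Dublin Core.

triples = [
    ("ex:book123", "dc:title", '"Moby-Dick"'),
    ("ex:book123", "dc:creator", "ex:melville"),
    ("ex:melville", "foaf:name", '"Herman Melville"'),
]

def to_turtle(triples):
    """Serialize (subject, predicate, object) tuples as simple
    Turtle-like statements, one per line."""
    return "\n".join(f"{s} {p} {o} ." for s, p, o in triples)

print(to_turtle(triples))
```

Because each statement stands alone, the author triple can be linked from any other dataset that knows the same identifier, which is what frees the data from a single catalog’s silo.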
Tim Bucknall, Assistant Dean of Libraries and Head of Electronic Resources and Information Technologies, University of North Carolina at Greensboro, gave a presentation entitled “The Future of the Library Catalog.” He began with a historical look at the library catalog: first an inventory control tool, then publicly searched on-site, then searched off-site, followed by the emergence of e-content. He compared the library catalog with the “Web-based competition” (Google, for the sake of argument). The Web is perceived as easier to use and provides satisfactory results. The catalog is still an inventory, but it is no longer the one place people search. Bucknall predicted that the library catalog will return to its original purpose: inventory control. The searchers of the future will pull data from several sources simultaneously and more efficiently through the Web. Libraries will still have catalogs, but they will not be central. Integrated library systems will disappear, data will be shared in the cloud, and with fewer local records, costs will fall. There is a disconnect between library staff and users. Google Books is coming our way; why would anyone want to search MARC records if the content itself is available? The trend toward disintermediation continues. The catalog of the future will be local, small, and searched alongside other systems: a “Frankenstein” composed of many sources. The traditional, local integrated library system will die, and more quickly than slowly in the recent poor economy.
Christopher Cole, Associate Director for Technical Services, National Agricultural Library, led the breakout discussion session. Groups were formed to discuss and report back on the following questions:
- Is implementing RDA the right direction for libraries?
- What can libraries do to move beyond MARC?
- Authority control: can AACR2 and RDA records co-exist? What about integrating other schemes and countries’ records?
- If our catalogs continue to be primarily metadata, will they survive?
- Is there value in subject controlled vocabulary? If so, can we harmonize systems, and will some current ones (e.g., LCSH) die or evolve?
- How do future data silos/catalogs manage multiple content carriers and levels of granularity?
- Is the future catalog linked to a specific search application, or is data agnostically accessible to multiple personalized search applications?
Beams & Bytes: Constructing the Future Library—Architectural and Digital Considerations
Keri Cascio, St. Charles City-County Library District
The symposium was built on the idea that changing user expectations and the shift to the digital medium are rapidly influencing library structures and services. Seven presenters related how these issues affect their libraries and their work, and symposium participants were given time to create model library spaces of their own. Here are some highlights from the presentations:
Michael Miller, Cal Poly State University
- We must reduce duplication and favor online access to resources. Space is expensive, and the cost of keeping books is not justifiable in the digital age.
- Move towards aggressive weeding and the highlighting of our unique collections, rely on ILL arrangements and partnerships with “sister” institutions.
- Think about ways to refurnish your spaces without a large renovation, using movable walls and additional electric outlets.
- Find the “common good” of the library, its importance in connecting communities and offering innovation in service.
Jeffrey Hoover, Tappe Associates and Denelle Wrightson, PSA-Dewberry
- It is difficult to create spaces for emerging services. Create flexible footprints for future, unseen needs.
- Rethink the need for shelving and introduce more areas for customer seating. Change from being collection-centered to reader-centered.
- Allow for collaboration, encourage customers to “get curious together” with an atmosphere of discovery and exploration.
- Focus on mobility and portability of library furnishings, mix the seating types. Some customers might like to be “alone in public” while using the library; others will want to create social spaces.
John Blyberg, Darien Library
- The User Experience (UX) Department looks at how users experience the library in the context of the community, staff, services, and building.
- Reference desks are now “lightweight” with a smaller desk and a monitor on a swivel arm for collaboration with customers. Reference staff use mobile devices in the library, currently an iPod Touch.
- No paper signage is in the building; if there is a problem, staff find a different way to solve it. General digital signage is provided on flat-panel screens with scrolling displays.
- Use of social media is important; the library implements Vimeo for a weekly online newsletter, uses Facebook to interact with teens. If you become the Mayor of the library on FourSquare, you get a tote bag. Twitter reaches out to the local community for news and information.
Robert Kieft, Occidental College
- The library has created an Academic Commons to make the college’s “academic commitments” visible to the community.
- Materials are becoming increasingly digital, and the library is more dependent on the cloud and networks.
- Make a choice between being in the “library business” or the “education business.”
- They are looking to reduce their space for the collection by half to accommodate other programming commitments.
- They will continue to purchase print materials, but other mechanisms such as ILL will allow the collection to shrink to a “core collection” of consumables instead of building a long-term investment.
Patricia Hswe, Pennsylvania State University (Michael Giarlo contributed to the presentation but was unable to attend)
- They are looking for flexibility in their digital stewardship and want to be able to use interchangeable tools and services.
- Projects include scientific data management, publishing and access to digital images, and institutional repositories.
- During the first year in their positions, they did a platform review to find the gaps in management of e-records and research data using the Purdue software rubric at http://bit.ly/adbdMW.
- They are focusing on a digital curation program, instead of attaching themselves to a specific university department.
- They have studied the benefits and costs of sustainability and are investigating the scalability of the program, including content, applications, microservices, and storage.
Lizabeth Wilson, University of Washington
- We have not yet reached our library without walls or our paperless society.
- The library is seen as the crossroads of the community, a trusted service-driven organization that transcends community boundaries.
- The library opened its Research Commons in October 2010 in a flexible space. The space was created by consolidating some of the branch libraries and moving collections to other areas.
- Everything is on wheels, and whiteboards are all around, even on the tables. Electricity comes from cords in the ceiling.
- Buildings must be technology enabled, but not defined by the technology currently in use. Spaces must be flexible to accommodate future needs. Thinking about building and furnishing for the “long haul” is no longer feasible.
- Building planning horizons need to be recalibrated with continual renewal and refurbishment.
CCS Executive Committee Forum
Christina Hennessey, Loyola Marymount University
True cooperative cataloging: single record sources vs. open record sources
Cataloging institutions are moving away from a world of standalone, in-house catalogs and toward shared catalogs between institutions that are hosted remotely from our buildings. All the presentations for this session can be found on ALA Connect at http://connect.ala.org/node/128093.
Joan Chapa, MARCIVE, discussed “Legal issues involving cataloging provided by outside vendors that come with restrictions.” MARCIVE is a vendor that sells bibliographic services to libraries of all types and sizes. It obtains records from many different sources and also creates many in-house. With many libraries moving their cataloging records into the “cloud” instead of keeping them locally, MARCIVE has had to examine the rules and legalities behind where it retrieves its cataloging records and what rights its customers have to redistribute those records. Chapa reviewed some of the restricted record types and noted that some MARCIVE record types that used to be proprietary can now be shared. This would be a good time for all of us to take a closer look at our catalogs, understand the source of our non-locally created records, and note which records can be legally shared.
Nancy Fleck, Michigan State University (MSU), gave a presentation on “SkyRiver versus OCLC: in a cooperative environment, can there be ownership?” MSU is the first (and, as of January 2011, only) ARL library system to be using SkyRiver. Fleck shared MSU’s timeline and experience with transitioning to SkyRiver in the summer and fall of 2009. The move to SkyRiver exposed how much MSU’s cataloging practices rested on the assumption that OCLC was being used for cataloging; many cataloging operations were centered on the OCLC number. She raised the question of how one defines “original cataloging”: if it is creating a record that is not in OCLC, is everything considered “original cataloging” once records are no longer obtained from OCLC? MSU re-examined all its cataloging codes and procedures and the reasons for them: were they cataloging this way due to OCLC practices, or due to national cataloging standards? MSU is still sharing a catalog with a few other OCLC institutions and has retained OCLC numbers in its catalog records as match points. MSU is pleased with the move to SkyRiver and has found it liberating. Cataloging statistics were the same after the move, and there was no backlog increase.
Becky Culbertson, University of California San Diego, discussed “Standards in a cooperative environment: who gets to set them?” There are many standards cataloging institutions can choose to use (or not use): rules (AACR2 and RDA) and guidelines and best practices (BSRs [BIBCO Standard Records], CSRs [CONSER Standard Records], provider-neutral guidelines, and OLAC streaming media guidelines). An institution can use full standards or floor standards. Culbertson described her institution’s process of developing across-the-board cataloging standards: an in-house committee decided to adopt the BSRs, made additions and subtractions to each, and then published the UCSD version of the BSRs.
RDA Update Forum
Dale Swensen, Brigham Young University
News from the Front: Briefings from RDA Test Participants
The forum drew a crowd of over two hundred Midwinter attendees, filling the room to capacity and leaving many standing at the back or sitting on the floor along the sides. Beacher Wiggins, LC, opened the meeting with a brief overview of the RDA test procedures and timeline and introduced the panel.
Christopher Cronin, University of Chicago, began the lineup with his presentation entitled “It’ll be all right … really, it will.” He described the multitude of activities leading up to the actual production phase: new fields in the ILS, test records, new displays, local policy decisions, a library-wide staff presentation, and, of course, training. Although only seventeen people participated in the test, over forty were trained in RDA, including catalogers, paraprofessionals, and library school students. Test records covered the gamut of library materials—monographs, serials, maps, and digital collections in a wide range of subjects and languages. At the conclusion, total production in RDA came to 1,283 bibliographic records and over 1,800 authority records. Post-test analysis revealed the participating staff members’ personal feelings about RDA: dismay at having to change already established headings; distaste regarding the method for recording the copyright date in the 260 field; the unrealized utility in the new 33X fields; and the poor navigation capabilities and lack of indexing in the RDA Toolkit. On the other hand, they liked the new authority record structures, the ability to express relationships between entities, and elimination of the rule of three. One thing is certain: RDA is here to stay and catalogers voiced unanimous support for moving ahead with it.
Penny Baker, Clark Art Institute Library, Williamstown, Massachusetts, titled her presentation “Clark RDA: From Here to Venus.” With only three catalogers, Clark had the smallest team of all the test participants. The Clark group found the language and terminology of RDA difficult to grasp. The common set posed a challenge too, since it contained materials they did not normally handle. Much of Baker’s presentation consisted of slides illustrating the unique items Clark included in the test: paintings, photographs, graphic arts, an art installation, and a collection of archival materials of Robert Sterling Clark. Work on the archival collection resulted in the creation of a mapping from DACS to RDA, which will be contributed to the RDA Toolkit.
Ric Hasenyager, North East Independent School District, San Antonio, Texas, was the next speaker. With two catalogers and seven copy catalogers, North East operates much like a public library and processes around 70,000 new items per year. Three people participated in the test. Training was accomplished through webinars and materials provided by the Library of Congress. The new 33X fields posed the most challenging problem: terms presented across the various vocabularies seemed redundant, and the three testers could not always agree on meanings. The North East group agreed that a list of terms geared specifically to school libraries would be helpful, and they would like vendors to create macros to facilitate rapid insertion of the new fields into their bibliographic records. Overall feelings about RDA were mixed: they were pleased with the training and generally comfortable with the standard, but recognized that future implementation at North East would largely depend on what their system and book vendors decide to do. For the time being, however, they will continue to subscribe to and use the Toolkit.
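The 336/337/338 fields the North East group wrestled with each pair an RDA term with a code and a source vocabulary. The vendor macro they wished for could be sketched roughly like this; the function and display format below are hypothetical, not any vendor’s actual product, though the term/code pairs shown for an ordinary printed book are standard RDA values.

```python
# Hypothetical sketch of a "macro" that generates the RDA 33X fields,
# of the kind the North East group hoped vendors would provide. The
# term/code pairs for a printed book are standard RDA values; the
# function and line-based display format are invented for illustration.

RDA_33X_PRINT_BOOK = [
    ("336", "text", "txt", "rdacontent"),    # content type
    ("337", "unmediated", "n", "rdamedia"),  # media type
    ("338", "volume", "nc", "rdacarrier"),   # carrier type
]

def make_33x_fields(profile=RDA_33X_PRINT_BOOK):
    """Render 33X fields in a simple line-based display format."""
    return [f"{tag}    $a {term} $b {code} $2 {source}"
            for tag, term, code, source in profile]

for field in make_33x_fields():
    print(field)
```

A real macro would add profiles for other formats (DVDs, e-books, audiobooks) so copy catalogers could insert a whole set of fields with one keystroke.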
Hasenyager was followed by Kathryn La Barre, Graduate School of Library and Information Science, University of Illinois. Sixteen people from Illinois took part in testing, including LIS students, instructors, and library faculty. The strongest focus was on the common set. In the end, they produced sixty-four records, which included eight Dublin Core records in the extra set. Response to the Toolkit was somewhat ambivalent. All agreed it was a good learning experience. The examples for the test were well chosen and the workflows helpful. Some thought the Toolkit was an improvement over AACR2. Others, however, thought that RDA was MARC-centric and incompatible with Dublin Core, and that instructions were non-linear, overly complex, and not always clear. The Toolkit, they said, was slow, confusing, and “not ready for primetime.” Advice for trainers and educators: teach FRBR first and get a solid foundation in AACR2 before tackling RDA. Aim for a balance between theory and practice. The concept of applying “RDA in the wild”—in a system designed specifically for RDA and FRBR—was an idea that intrigued them.
Shawne Miksa, University of North Texas, offered additional insight from the library educator perspective. As an unofficial tester, Miksa had her students practice with RDA. She said they found it much easier to use for electronic resources than AACR2. The documentation, though, was daunting, and they felt there was a lot to keep up with. Nevertheless, for the most part it was a positive experience.
Maritta Coppieters, Backstage Library Works, represented the vendor’s view. Backstage was one of two vendors that participated in the test. They provide services in cataloging and other technical operations and have offices in Provo, Utah, and Bethlehem, Pennsylvania. Five of their thirty-seven staff members participated in the test, including one manager, two novice catalogers, and two experienced catalogers. In addition to the common set, they produced 102 extra-set records, which included many e-books. Aside from the test records, one of their catalogers created records in both RDA and AACR2 for a set of items they chose, so that they could make their own comparisons. There were concerns that RDA would make Backstage’s work more expensive: with so many options, every client might want to make unique local policy decisions, which would be difficult for Backstage to manage. Relaxed restrictions might mean more metadata, and some clients might want their existing AACR2 records upgraded to RDA.
The panel presentations were followed by questions from the audience.
CRS Committee on Holdings Update Forum
Adrian Ho, University of Western Ontario
It is now common for subscription-based journals to provide an open access (OA) publishing option in exchange for a publication fee: an article published under this option is freely available online with minimal copyright and licensing restrictions. This is known as the hybrid business model of journal publishing because costs are covered either by subscriptions or by publication fees. While the model has become common, some aspects of it are not widely discussed. For example, what factors determine the amount of the publication fee? How is the model faring? Will hybrid journals turn into full OA titles over time?
The ALCTS Scholarly Communications Interest Group organized a panel discussion on hybrid journals and the future of scholarly publishing at the 2011 ALA Midwinter Meeting in San Diego. The panelists were:
- Philip Bourne, Professor of Pharmacology, University of California San Diego
- Charles Eckman, University Librarian and Dean of Library Services, Simon Fraser University, Canada
- Patricia Hudson, Senior Marketing Manager, Oxford Journals, Oxford University Press
- Dan Morgan, Executive Publisher, Psychology and Cognitive Science, Elsevier
The panelists discussed the development, perceptions, and future of hybrid journals from different points of view. Judy Luther was the discussion moderator.
Charles Eckman delivered the first presentation and pointed out that there has been steady growth in library-based funding support for authors who wish to enable OA to their published research. As former Director of Collections at the University of California, Berkeley Library, he provided an overview of the OA fund there and noted that researchers welcomed financial support for publishing in hybrid journals; of the sixty OA articles funded in 2008, thirty were published in hybrid journals. He also discussed the OA fund at Simon Fraser University, where he is currently University Librarian and Dean of Library Services. The Senate Committee there reviewed and rejected funding for publishing in hybrid journals due to fiscal accountability questions and concerns over the journals’ “double-dipping” issue. Eckman argued that the stance of campus stakeholders matters in determining whether an OA fund should support publishing in hybrid journals. Moreover, researchers who publish in hybrid journals tend to opt for OA when there is financial support for it. In principle, hybrid journals could transition to full OA journals; however, the “subscription culture” and a lack of will to develop new practices are among the barriers to such a transition. He suggested that more data be collected from the different stakeholders of journal publishing for the study of this issue. He wrapped up the presentation with recommendations such as applying the OA fund to other types of scholarly publication. (http://bit.ly/giQw3p)
Philip Bourne approached the topic from the researcher’s perspective. He argued that scientists and faculty members are usually more interested in publishing their research in the most prestigious journals than in ensuring unfettered online access to it. He noted that funding agencies’ policies on access to funded research are a significant factor in determining what researchers will do to support OA. For instance, the National Institutes of Health have not strictly enforced their public access requirement at this point. Researchers, therefore, stick to their habits when selecting journals for publication and give no thought to archiving their publications in OA repositories. They realize that journal access is in general not free and like the idea that OA will help boost readership of their works. However, most of them have not thought much about publishing in hybrid journals. Bourne stated that “hybrid journals are but a small step in the right direction” and that full access to published research and related data in a machine-usable way is crucial. He concluded by providing a brief description of a knowledge and data cycle, which epitomizes the future of scholarly communication. (The knowledge and data cycle is based on an article by Bourne, “Will a Biological Database Be Different from a Biological Journal?,” freely accessible at http://dx.doi.org/10.1371/journal.pcbi.0010034. Bourne’s presentation is available online at: http://bit.ly/gZ3PkY)
Patricia Hudson then discussed hybrid journals from the perspective of a non-profit publisher. Oxford University Press currently offers the OA option in ninety-four of its subscription-based journals. The editorial decisions of these journals are entirely independent of whether the author plans to select the OA option. The publication fee is $3,000. Articles covered by the option have a Creative Commons license applied to them and are deposited to PubMed Central as needed. In 2009, the OA option uptake was highest in Life Sciences and lowest in Humanities and Social Sciences. The Press informs the author of the OA option after acceptance of the article, and notes that this option offers a possible solution to the compliance with research funders’ access policies. Hudson alerted the audience to a number of questions that warrant attention. Among them are: How will gold OA uptake affect subscriptions to hybrid journals? How can OA articles be clearly identified within hybrid journals? Is gold OA feasible in Humanities and Social Science publishing? (Hudson’s presentation is available online at: http://bit.ly/eF7L8E)
Dan Morgan followed by speaking on the hybrid journals published by Elsevier. As of December 2010, more than 500 Elsevier subscription-based journals offered the OA option. The charge was $3,000, or $5,000 for Cell Press journals. Morgan pointed out that Elsevier does not charge subscribers for content covered by the OA option. In 2009, 515 articles were published with the option among the 260,000 articles published in Elsevier journals. Because uptake of the option had been very low since 2006, it had no impact on journal pricing. Industry-wide, the low uptake rate (1-2 percent) presented a risk to sustainability. However, it is likely that uptake will increase as a result of funding support. Morgan maintained that Elsevier is open to mechanisms that have the potential to bring about sustainable universal access to published research, but the company adopts a “test-and-learn approach” to “ensure that system-wide impact of such mechanisms are [sic] fully understood before scaling them up.” Meanwhile, there are questions for the different stakeholders in the hybrid model, addressing issues such as the sustainability of funding support for the OA option, the distribution of funding among different disciplines, and the perpetual costs of hosting articles published with the OA option. Last but not least, Morgan briefly discussed five future directions for scholarly publishing: (1) close remaining access gaps; (2) provide access to non-journal outputs; (3) enrich and enhance articles; (4) develop tools to derive insights across articles; and (5) strengthen anti-plagiarism and ethics enforcement. (Morgan’s presentation is available online at: http://bit.ly/dI7gz3)
Collection Management and Development Section (CMDS) Forum
Kristin W. Andrews, University of California-Irvine
The ALCTS Collection Management and Development Section (CMDS) forum at the 2011 ALA Midwinter Meeting featured a panel of four speakers who discussed the changing role of selectors and selection in collection management.
The first speaker was Rick Anderson, Associate Director for Scholarly Resources and Collections, University of Utah, who argued that "Selection is not Dead, but the Selector Is." He reminded the audience that while it seems that collection management has changed suddenly over the last year or two, it has actually been evolving since the 19th century. We have shifted from an environment where materials were expensive and were carefully selected by librarians to a nearly wall-free situation in which documents are readily available from multiple sources and materials are not so much selected as provided in bulk packages. Twenty-first-century patrons can no longer tell where materials come from or whether they are "library resources" or not. The key value is that materials are available. Anderson believes that the game-changers for the next five years will be the economy and models such as Google Books, HathiTrust, and Patron Driven Acquisition (PDA). The previous ideal had been to create comprehensive research collections, which Anderson dubs "Monument to Western Civilization" collections. Some libraries should still strive for this, but most should aim for a new (“unattainable”) ideal in which "every patron gets everything s/he needs, with zero effort, in the moment s/he needs it, in the format s/he prefers". In the meantime, our options are to share collections and financing, PDA (with some filtering), and “by-the-drink” purchasing of journal articles (“patrons don’t need journals; they need articles”).
Stephen Bosch, Materials Budget, Procurement and Licensing Librarian, University of Arizona, spoke next. Like Anderson, he emphasized the changing economic and budgetary environment. Costs for materials have gone up, but funding has not. He also emphasized that users do not tend to start with the library web site when they do research; they use other tools first. Users want broad access without impediments. Information is now discovered and accessed via web-scale tools such as Google, Facebook, and Wikipedia. Library search tools are the least used. The traditional library collection model is no longer sustainable in a web-scale world. With PDA and cooperative collection management, we are moving toward a model that focuses less on selection and more on central funds and web-scale discovery and delivery. Bosch believes that resource management is still essential; it just requires new skills to manage processes, metadata, information delivery, and more.
The third speaker was Nancy Gibbs, Head of Acquisitions, Duke University Libraries. In accordance with the common theme of the speakers, she stated that “selection is not dead but it has morphed”. There is an increased emphasis on electronic resources, PDA, Google Books, approval plans, and consortia. There are fewer changes in international studies and special collections since they are unique. Most foreign materials are not yet available electronically. While Patron Driven Acquisition is reducing the burden on selectors, some selection is still required. Most libraries using PDA download a pre-filtered set of records, rather than dumping everything into the catalog. Gibbs stated that these changes mean that statistics will become increasingly important for collection assessment. Good record keeping is also vital to keep track of which titles are available from which of the multiple packages a library subscribes to. Collection management is an increasingly collaborative task. Acquisitions, collection development, cataloging, and access services must now work together to make collections discoverable and accessible. Librarians who previously spent most of their time on selection tasks are now marketing library services to students and faculty, teaching critical thinking and information literacy skills, writing grant proposals, working on digital projects, troubleshooting e-resource access issues and providing reference services in multiple forms.
The final panelist was Reeta Sinha, Senior Collection Development Manager, YBP. Unlike the other speakers, she took a more positive view of the continued role of the selector. Selection is not yet dead: in her view, Patron Driven Acquisition solves the problem of unused books by providing more accurate selection and reducing the selector’s workload. Even with PDA, the library still controls the plan by providing parameters to filter the materials made available. She cautioned against cutting materials and services too precipitously in response to budget concerns, warning of the ramifications such choices could have over the next five to ten years.
A lively question-and-answer session focused on practical and philosophical issues:
- Do we default to print or e-only? Most panelists said that they had e-book preferred policies.
- What is the purpose of a collection? It is not to simply be a collection; it is to meet the needs of the patron.
- PDA eliminates inaccurate guesswork as to what patrons need. Patrons know what they need better than we do.
- PDA can be useful for smaller libraries because they do not have space for everything.
- How do we know a book that has not been used will be needed in fifteen years? Bosch asserted that we can have archival/comprehensive collections for these but it is not sustainable for every library to buy everything.
- How do we provide ILL services? Many vendor licenses do not allow ILL. We need to work with vendors to resolve this. One library in Ohio is only purchasing e-books if ILL is allowed. This increases pressure on publishers.
ALCTS Forum
Brian Falato, University of South Florida
This year’s ALCTS Forum at the 2011 ALA Midwinter Meeting was devoted to archives and special collections. Bradley Westbrook, University of California, San Diego (UCSD), was the first to speak, on the planned merger of Archivists’ Toolkit and Archon. These online resources for archival management were developed separately, but both debuted in 2004. The idea behind the merger of the two is to take the best qualities of both products and create a unified tool with lower maintenance costs, interoperability, and the capability of multi-tenant hosting. The merged product will be a Web application called ArchivesSpace. UCSD, the University of Illinois at Urbana-Champaign, and New York University are partnering in its development, with funding from the Andrew W. Mellon Foundation.
Jackie Dooley spoke on the October 2010 report that she and Katherine Luce prepared for OCLC Research, titled “Taking Our Pulse: The OCLC Research Survey of Special Collections and Archives,” which was also used as the program title for the ALCTS Forum. This report is available online at http://www.oclc.org/research/publications/library/2010/2010-11.pdf. The report updated and expanded a 1998 Association of Research Libraries (ARL) survey on special collections. The new survey was sent to ARL members, plus members of Canadian Academic and Research Libraries, the Independent Research Libraries Association, liberal arts colleges in the Oberlin Group, and U.S. and Canadian research institutions in the RLG Partnership. There was a 61 percent response rate.
The survey found that use of all types of special collections materials has increased across the board since the 1998 survey. Many backlogs have decreased, but almost as many continue to grow. The three “most challenging issues” for survey respondents are space, born-digital materials, and digitization.
Recommended action items listed include: facilitating access to and interlibrary loan of rare and unique materials; replicable, sustainable methodologies to stop the growth of backlogs; conversion of legacy finding aids; models for large-scale digitization; development of basic steps, use cases, and cost models for managing born-digital materials; and development of education and training opportunities for those areas identified as high-priority needs.
Continuing Resources Cataloging Forum
Sandy Roe, Illinois State University
Have you ever used an OpenURL link, only to have it not work as you expected? Susan Marcin, Licensed Electronic Resources Librarian at Columbia University Libraries, described an initiative to improve the user experience with OpenURLs. The NISO OpenURL Quality Metrics Working Group, known as IOTA (Improving OpenURL Through Analytics), was created in January 2010 and is a two-year project focused on testing and validating metrics used to determine OpenURL quality. Current work includes a vendor completeness index and analysis of element presence/absence, weighting, and format. Reporting software is in production, with over 9 million URLs analyzed to date; it is available at http://openurlquality.niso.org/. The desired outcomes of this work include a qualitative report that will help OpenURL providers compare their OpenURL quality to that of their peers, and recommendations that will give source vendors information on how to improve their data so that the maximum number of OpenURL requests resolve to a correct record. More information is available at the NISO IOTA Working Group’s official blog, http://openurlquality.blogspot.com/.
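To make the idea of element-level quality scoring concrete, the sketch below computes a simple completeness score for an OpenURL query string. The list of "core" elements and the scoring rule are illustrative assumptions for this example, not IOTA's actual metric definitions, and the resolver hostname is invented.

```python
from urllib.parse import parse_qs, urlparse

# Core OpenURL 1.0 (Z39.88-2004) elements an article-level link resolver
# typically needs. (Illustrative list, not IOTA's actual index.)
CORE_ELEMENTS = ["rft.jtitle", "rft.atitle", "rft.volume",
                 "rft.issue", "rft.spage", "rft.date", "rft.issn"]

def completeness(openurl: str) -> float:
    """Return the fraction of core elements present and non-empty."""
    params = parse_qs(urlparse(openurl).query)
    present = sum(1 for e in CORE_ELEMENTS if params.get(e, [""])[0].strip())
    return present / len(CORE_ELEMENTS)

url = ("http://resolver.example.edu/openurl?url_ver=Z39.88-2004"
       "&rft.jtitle=Serials+Review&rft.atitle=Report&rft.volume=37"
       "&rft.spage=12&rft.date=2011&rft.issn=0098-7913")
# rft.issue is missing, so 6 of the 7 core elements are present.
print(round(completeness(url), 2))  # → 0.86
```

A score like this, aggregated over millions of requests per vendor, is the kind of comparative signal the working group's reports aim to provide.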
At a time when researchers increasingly use online resources and rarely visit the physical library, how can librarians stay involved with the research process—both for collection building and to design new services? Sara Russell Gonzalez, Assistant University Librarian at the Marston Science Library, University of Florida, described VIVO as one solution to this question. VIVO (http://VIVOweb.org/) is an open source research discovery platform for hosting information about faculty, their research interests, and their accomplishments, for the purpose of enabling collaboration between scientists across disciplines at a particular institution. The application that supports VIVO was developed at Cornell University and implemented there in 2004 (http://vivo.cornell.edu/). It is being expanded through a stimulus grant from the National Center for Research Resources of the National Institutes of Health. Currently the VIVO project has seven partner institutions, including the University of Florida. Data is stored in RDF triples (subject, predicate, object), and may be input directly or harvested from verified sources. Individuals may also edit and customize their profiles to suit their professional needs. All data in VIVO is public. Its ontology may be extended to support institution-specific needs. VIVO enables authoritative data about researchers to join the linked data cloud (semantic web). Name disambiguation—both for persons and corporate bodies—continues to be a major challenge, although VIVO’s ontology supports the use of unique author identifiers such as ORCID. Future versions of VIVO are expected to be able to generate CVs and biographical sketches for faculty reporting or grant proposals, and to display visualizations of complex research networks and relationships. In closing, Gonzalez encouraged the audience to get involved by becoming an adopter, by linking an existing application or building a new VIVO-compatible one, or by becoming a data provider to the national network.
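For readers unfamiliar with RDF, the following minimal sketch shows how subject-predicate-object triples of the kind VIVO stores can be pattern-matched to find connections between researchers. The URIs and predicate names are invented for the example; they are not drawn from VIVO's actual ontology.

```python
# A tiny in-memory triple store: each entry is (subject, predicate, object).
triples = {
    ("ex:JaneSmith", "rdf:type",        "ex:FacultyMember"),
    ("ex:JaneSmith", "ex:researchArea", "ex:Bioinformatics"),
    ("ex:BobJones",  "ex:researchArea", "ex:Bioinformatics"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Who works in the same research area? This is the kind of cross-researcher
# link a discovery platform can surface for collaboration.
researchers = sorted(ts for ts, _, _ in
                     query(p="ex:researchArea", o="ex:Bioinformatics"))
print(researchers)  # → ['ex:BobJones', 'ex:JaneSmith']
```

Because every statement has the same three-part shape, new predicates (an extended ontology, in VIVO's terms) can be added without changing the storage model.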
The third speaker was unable to attend and was replaced by a colleague, Laurie Taylor, from the Digital Library Center at the Smathers Library, University of Florida. Taylor described how they have been mining Encoded Archival Description (EAD) finding aids to build historical researcher profiles, in order to make them available in something like VIVO. For additional information about Encoded Archival Context—Corporate bodies, Persons, and Families (EAC-CPF), see http://eac.staatsbibliothek-berlin.de/. A round of questions and answers concluded the session, which was ably moderated by Naomi Young.
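The mining step Taylor described can be sketched as extracting personal names from EAD finding aids. The fragment below is a toy example: real EAD files are namespaced and far richer, and while the element names (`controlaccess`, `persname`) follow EAD conventions, the data and the overall simplification are this example's assumptions, not the Digital Library Center's actual workflow.

```python
import xml.etree.ElementTree as ET

# A toy EAD-like fragment with invented names.
ead = """
<ead>
  <archdesc>
    <controlaccess>
      <persname>Doe, John, 1900-1975</persname>
      <persname>Roe, Mary</persname>
    </controlaccess>
  </archdesc>
</ead>
"""

root = ET.fromstring(ead)
# Collect personal names that could seed researcher profiles.
names = [p.text for p in root.iter("persname")]
print(names)  # → ['Doe, John, 1900-1975', 'Roe, Mary']
```

Names harvested this way would still need disambiguation and mapping to an ontology (such as EAC-CPF entities) before they could populate a VIVO-style profile.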