Designing for Experts: How Scholars Approach an Academic Library Web Site

Thura Mack, Maribeth Manoff, Tamara J. Miller, and Anthony D. Smith


This study examines the use of an academic library Web site by experienced researchers and active scholars. It is part of a larger effort to understand how experienced users approach online information resources and how fully the library’s Web site meets their needs. Subjects were asked to complete eight online tasks, beginning each task at the library’s home page. Data were gathered by screen- and audio-capture software and by human observers as study participants worked through the sets of tasks. Results were analyzed in terms of the experience and expertise of the participants, the success rate, and the first click indicating the chosen path to the requested information. Subjects had high success rates for most tasks. Searching for information about journals and locating journal articles proved to be the most difficult tasks to complete successfully. Analysis of session recordings revealed some traits of expert users that can be used to improve Web site design, and indicated a correlation between search success and the double-expertise of subject knowledge combined with frequent use of the library’s Web site.

This research stems from an interest in the online information-seeking behaviors of experienced researchers and the extent to which a customizable portal might improve their ability to locate and manage information sources. By analyzing the information-seeking habits of active scholars, the authors hope to identify online tools that would improve the efficiency of scholars’ information retrieval. A broad range of solutions is under consideration, from the design elements of a traditional Web site to the feasibility of employing advanced technologies to create a personalized information environment. One example of such an advanced technology would be an expert system, powered by artificial intelligence and responsive to natural language, that could assist researchers in discovering and using library resources. The ultimate aim is to explore the usefulness of a scholar’s portal that not only mirrors a research environment, but also establishes a customizable virtual workspace allowing the user to organize both resources and ideas.

The authors chose to begin their study with a formal process of usability testing, employing the results in an ongoing Web-design process. In June 2001, a team of four librarians and a graduate assistant collaborated to develop a test of the University of Tennessee (UT) Libraries’ Web site as it was being redesigned. One member of the team has ongoing responsibility for the library Web site design. The team’s first objectives were to determine how to design and implement a usability test, to gather feedback on the new site, and to plan, create, and document a user-centered design process for all future Web sites.

The study comprised two parts: (1) a set of Web-usability tasks was developed to test the ease of information discovery and retrieval; and (2) a structured, open-ended interview was then conducted to determine user preferences, expectations, and current online searching practices.

This article will discuss the Web-usability portion of the study. The interview data will be reported separately.

Data Collection Methods

The study employed thirteen subjects who participated in individual Web-usability sessions lasting between forty and ninety minutes. Volunteers were recruited by contacting the graduate student association and library faculty representatives, giving an explanation of the study and what participation would involve. The research team offered two incentives: that the subjects would be assisting the library in better serving the needs of the research community and that participants would be entered in a drawing for a gift certificate for a dinner for two.

The team sought a purposeful sample, rather than the randomly drawn sample typical of quantitative research. Purposeful sampling allows for the selection of a limited number of information-rich cases for in-depth observation. 1 Within purposeful sampling, the team used an intensity sample: cases that manifest the characteristics of interest intensely, but not extremely. 2 The team wanted to observe and identify the Web-use efficiency and effectiveness of faculty and graduate students actively engaged in research. An attempt was made to recruit mainly from the science and humanities disciplines, given a working hypothesis that those two disciplines would have different information needs and would use the Web site in substantially different ways. However, the team eventually decided that broad participation from all three of the traditional discipline areas would be beneficial. All but one of the participants were conducting a current research project (see table 1).

Table 1. Participants

                        %
Discipline
  Science               62
  Humanities            23
  Social Science        15
Status
  Faculty               46
  Graduate Student      54
Current research
  Yes                   92
  No                     8

Sample-size determination for qualitative research is not driven by the statistical guidelines familiar in quantitative work. The size of the sample is less important than the rationale for including participants and the scope of the study. 3 Rather than trying to reduce statistical sampling error, the team made an effort to avoid discovery failure: the failure to discover an attitude, perception, or behavior within the scope of the study. Usability sessions continued until the information gathered became repetitive. Thirteen subjects appeared to be an appropriate study size based on previous observations of usability-study effectiveness with similar subjects. 4

All but one of the users in this study can be classified as expert in their respective discipline areas or knowledge domains. They are active researchers at an advanced level. More than three-fourths of the subjects reported experience with the older version of the library Web site, with 60 percent of subjects using the site for an average of three or more hours per week. Thus, a majority of subjects could be categorized as double-experts with both high levels of domain knowledge and substantial experience with Web-based library content and services (see table 2).

Table 2. Use of the library Web site

                           %
Use library Web site
  Yes                      77
  No                       23
How many hours per week?
  3 or more hours          60
  1–2 hours                30
  Less than 1 hour         10
Favorite online uses
  Databases                50
  Journal articles         30
  Other                    20

For this study, the usability sessions were scheduled to coincide with the launch of a new library Web site design. Although many of the participants indicated heavy use of the library Web site, none had experience with the new Web site design. The main page and several of the first-level pages were completely redesigned. Much of the original functionality and many lower-level pages remained unchanged after the launch of the new Web site. Frequent users of the library Web pages were likely to encounter familiar pages and resources within a few clicks from the front page.

All of the usability sessions were conducted in one of the library’s bibliographic instruction rooms, normally used for classroom teaching. Instruction rooms functioned very well for the study as they both provided sanctuary from the busy areas of the library and housed the needed computer equipment. The rooms are equipped with an instructor’s workstation and a projector that displays the instructor’s monitor on a wall screen. Each subject was seated at the instructor’s computer, allowing the research group to closely observe the participant’s searches on the wall screen without the need to hover around the participant. It is believed that this contributed to a more relaxed environment for participants.

The workstations used for bibliographic instruction already had been configured to provide access to all of the library’s information resources. This permitted the study team to organize the room and conduct the testing with minimal workstation preparation. It was necessary to add one piece of software and a microphone to capture each test session. In order to gather as much information as possible, a decision was made to record the screen activities and the comments of the participant and moderator. TechSmith’s Camtasia software was used to record each session. The software proved to be economical and provided a permanent record of the screen activities, synchronized with the audio, for all of the sessions. Camtasia is relatively easy to install and requires a minimal hardware configuration to operate effectively. An Intel-based, Windows NT workstation was used to conduct the study, and this worked superbly. Figure 1 gives the recommended system requirements as well as the actual system used in this study. Each session was captured as a compressed Audio Video Interleave (AVI) file and then archived to CD-ROM.

TechSmith system requirements                                    System used

Microsoft Windows 95 OSR2, 98, Me, NT 4.0, 2000,                 Microsoft Windows NT 4.0
XP, or later version

90 MHz processor (400 MHz recommended)                           450 MHz processor

16 MB RAM (64 MB recommended)                                    384 MB RAM

Windows-compatible sound card and microphone                     Windows-compatible sound card and microphone

12 MB of hard-disk space for program installation                30 GB hard-disk space

Figure 1. Camtasia software information

In addition to the participant, each session included a moderator and an observer. The moderator was charged with leading the participant through the required steps. The observer’s role was to note incidental information about the performance of each task that later could be compared with the recording. The observer also contributed by asking questions and interjecting ideas during the discussion portion of the session.

Because the participants would be recorded, it was necessary to obtain written consent from each participant as part of the requirements for using human subjects at UT. The participants were also asked to complete a brief questionnaire designed to gather basic demographic data, level of online experience, and discipline areas of research activity. The authors recognized that the test environment might create anxiety among the participants, and therefore a conscious effort was made to make the subjects feel relaxed. The moderator and observer spent a few minutes at the beginning of each session introducing themselves and learning a little about the participants and their academic interests. Bottled water was provided for each participant. A written script guided the entire session to ensure consistency across sessions. The moderator’s script included reassurance that the session was a test of the library’s Web site and not a test of the participant. These extra efforts were intended to make the subjects feel more comfortable about the questioning and thereby provide a more realistic reflection of the information-seeking process.

The part of the study reported here consisted of eight tasks meant to discover the difficulty level of locating and retrieving information resources from the library’s Web site. The tasks were designed to include the major information resources and services typically found in an academic library. Similar instruments used at Appalachian State University, Massachusetts Institute of Technology, the University of Arizona, and others were reviewed before developing the usability tasks. 5 An attempt was made to carefully construct tasks that would reflect a real-world situation in which the information seeker may not have a full citation. For example, the first question asks the participant if the library owns a copy of The Color Purple by Alice Walker. It does not state that the item is a monograph or that most reference librarians would first check the online catalog to answer this type of question. Constructing the tasks in this fashion allowed the authors to observe how scholars might approach a typical, modern academic library with its diverse set of information resources.

The tasks were pretested by members of the library staff and revised to improve the wording and remove ambiguity. The entire test session was then pretested with a graduate student to give the team experience with the script and the timing of a session, and to test the recording routine.

For each task, the moderator read the question aloud and then placed a printed copy of the question near the keyboard so the participant could refer to it while attempting the task. A time limit of three minutes was placed on each task as a way to measure the effective completion of that task. After three minutes, the moderator stopped the participant and moved on to the next task.

Analysis

The study team began the task analysis by collating data. Using the recordings, along with the notes taken by observers, the team calculated the success rate for each task: the percentage of participants who were able to find the information within the three-minute time frame. The authors noted the path each user took in searching for the information requested in each task. The path information (a record of each page the participant visited and the terms typed into any search box) was transferred to a worksheet, arranged by participant. Arranging the data in this way, by task and by participant, made it easier to detect patterns at a glance.
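The collation described above can be sketched in a few lines of code. This is a hypothetical illustration only: the team worked from recordings and a paper worksheet, and the record structure, field names, and sample data below are invented for the sketch.

```python
# Hypothetical sketch of the collation step: tally per-task success rates
# and first-click categories from session records. The record structure
# and sample data are illustrative, not the study's actual data.

def summarize(sessions, task_count):
    """Return per-task success and first-click percentages."""
    summary = []
    for task in range(task_count):
        attempts = [s["tasks"][task] for s in sessions]
        n = len(attempts)
        success = sum(1 for a in attempts if a["succeeded"])
        best = sum(1 for a in attempts if a["first_click"] == "best")
        direct = sum(1 for a in attempts if a["first_click"] == "direct")
        summary.append({
            "success_pct": round(100 * success / n),
            "best_first_click_pct": round(100 * best / n),
            "direct_first_click_pct": round(100 * direct / n),
        })
    return summary

# Two invented participants, one task each, for illustration.
sessions = [
    {"tasks": [{"succeeded": True, "first_click": "best"}]},
    {"tasks": [{"succeeded": False, "first_click": "other"}]},
]
print(summarize(sessions, 1))
# → [{'success_pct': 50, 'best_first_click_pct': 50, 'direct_first_click_pct': 0}]
```

A spreadsheet serves the same purpose at this scale; the point is simply that each attempt reduces to a success flag plus a first-click category.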

To better assess information-seeking behavior, the team took a closer look at the success rate and first clicks (the first page each participant visited in their information quest). The team previously had determined the best, most direct route for successful completion of each task, starting from the library’s home page. For each actual path taken, the team asked the following questions (see table 3): Was the user’s first click on this best route? If not, was the first click on any path that would lead directly to the information, or would the participant have to retrace steps in order to get where they needed to go?

Table 3. Summary of task analysis

Task                                                       % Success   % First click      % First click on
                                                           rate        on "best" route    any direct route

1. Is The Color Purple by Alice Walker available           85          62                 NA
   at the UT Libraries?

2. Is Understanding Physics by Isaac Asimov available      69          15                 38
   at another library in the region?

3. Where could you request an item on interlibrary         92          62                 15
   loan?

4. In what formats does the UT Libraries receive the       54          62                 NA
   magazine Scientific American?

5. Search for information on medieval universities         92          NA                 NA
   or human cloning.

6. Find K. Wesley Berry's article "The Lay of the Land     62          38                 NA
   in Cormac McCarthy's The Orchard Keeper and Child
   of God" or Peter M. Cox's article "Acceleration of
   Global Warming Due to Carbon-Cycle Feedbacks in a
   Coupled Climate Model."

7. Where can you find a definition for the word            92           8                 54
   "gaff"?

8. Where can you submit a reference question online?       92           8                 54


The first task was to discover the availability of a book in the library’s collections. The most direct route to this information is from the home page to the catalog, where a number of different searches would yield the correct results. Eight of the thirteen participants went directly to the catalog and three more eventually got to the OPAC, for a success rate of 85 percent.

The second task asked the participant to determine whether a book was available in another library in the region. Again, there is a direct link on the library’s home page to a union library catalog called Kudzu that contains the holdings of fifteen southeastern academic libraries. Only two of the participants selected the preferred route, directly linking to Kudzu from the home page. Five others first clicked on the Other Catalogs link. This page has a description of Kudzu along with a link so that participants were able to find the information without backtracking. In addition to these seven, two more subjects eventually arrived at the Kudzu catalog and found the book in question.

The third task was to locate the pages for submitting an interlibrary loan request. There is a direct link from the home page to the online interlibrary loan form, and eight subjects used this link to reach the form on their first click. Two more subjects linked to Forms, where they could choose Interlibrary Loan from the list. Two more participants were successful by a more indirect route, for an overall success rate of 92 percent.

The fourth task was to find the formats available for a specific journal title. The library catalog contains three records describing various formats of the journal. The best route is to choose the catalog, then conduct a browse-type search using the periodical title index. Eight users went directly to the catalog but chose keyword searches, which retrieved lengthy lists of volume-holdings statements. The results were confusing to users, as the information needed did not appear on the first screen. Only seven participants, or 54 percent, were able to finally locate the journal-title formats.

For the fifth task, participants were asked to find any information on a specific subject. There was no predetermined best path for this task. Interestingly, the Subject Guides link from the home page, which four people chose, was not a particularly fruitful first click, as the subject areas covered on these pages are too broad for the task given. Five others went first to Databases, two went to Library Catalogs, and two more went to Internet Search Engines. Only one participant became lost and was not able to find any information on one of the topics given.

The sixth task, which instructed the participants to locate a specific journal article, was one of the most complicated. Users were given the author and title of the article, but not the journal in which it appeared. This task required people to link to Databases, locate an appropriate database, and search within the database to locate the citation. Five participants went to Databases on their first click, and three more eventually discovered the correct path to an article, for a success rate of 62 percent.

The seventh task was to find a word definition online. In this case, the Subject Guides page was considered the best first click, as it leads to a list of links that includes a number of online dictionaries. Only one of the participants went first to Subject Guides, and seven went to Internet Search Engines. Several of the latter went to Google.com, where they either located an online dictionary or found the definition directly. The relatively high success rate of those who sought an answer from outside the library Web site and the inability of users to locate the library dictionaries from the home page indicate both a failure of the library Web design and a growing tendency to seek simple answers using Internet search engines.

The last question was where to submit a reference question online. A new service called AskUs.Now had been launched at about the same time as the new Web site and was available on the library’s home page. Choosing AskUs.Now (which one user did on the first click) brings up a page offering chat reference during certain hours, along with links to the e-mail reference service, which was the target page. Several other links from the home page are less direct paths to the e-mail reference service, and seven users chose one of these alternate links as a first click. In the end, twelve of the thirteen participants found the online reference form.

Discussion

The participants had relatively high success rates regardless of discipline or status. The three participants who completed every task successfully within the three-minute time limit were all native English-speaking graduate students who were actively engaged in research projects and who reported making regular use of the library Web pages—qualifying them as double-experts in terms of both their subject area and online library use. Successful searching has been characterized as requiring this double-expertise in general Internet searching studies. 6

There is further evidence of a strong correlation between frequent access to the library Web site and success in locating information and services online. Of the two participants who had the least success, one had never used the library online and the other reported usage at the lower end of the scale. Even so, some faculty members with equally low library Web page experience had higher success rates, perhaps because of their strong discipline knowledge. One computer science faculty member, who reported no regular use of the library Web page, completed all but one of the tasks successfully. His research area is indexing and information retrieval, which he felt allowed him to use unfamiliar online information systems with ease. The relationship between successful online searching and (1) online experience; (2) familiarity with resources within a discipline; or (3) deep knowledge within a domain warrants further examination (see table 4).

Table 4. Success rates and expertise

Status            Department              Use library   Hours      Native English   Success
                                          Web pages?    per week   speaker          rate %

Graduate student  Sociology               Yes           10.0       Yes              100
Graduate student  English                 Yes            5.0       Yes              100
Graduate student  Industrial engineering  Yes            5.0       Yes              100
Faculty           Music                   Yes            5.0       Yes               87
Graduate student  Sociology               Yes            3.0       Yes               87
Faculty           Botany                  Yes            1.0       No                87
Faculty           Computer science        No             0.0       Yes               87
Faculty           Environmental biology   Yes            3.0       No                75
Graduate student  Physics                 Yes            2.0       No                75
Faculty           Geological science      Yes            0.25      Yes               75
Faculty           Industrial engineering  No             0.0       Yes               62
Graduate student  Information science     Yes            1.0       Yes               50
Graduate student  Mechanical engineering  No             0.0       No                50

Although each task started at the library home page, participants were not required to stay within the library site as they searched. They sometimes chose to use Internet search engines, primarily for locating general subject information and to obtain the word definition. Seven participants went to Google to find a dictionary or a definition and one went directly to dictionary.com, which she said she used frequently. Four participants left the library Web site and used either Google or Yahoo! to gather subject information for task five. These participants indicated that they preferred to use an Internet search engine to gather broad information on a topic before beginning a more narrow and specific search for information, particularly when a topic was new to them.

One of the most challenging tasks given to the participants was locating a specific article or its citation with the author and title given, but without knowing the journal title. The key to success was selecting the correct database from more than one hundred offered on the Web site. When participants could comfortably stay within their area of subject expertise, this task was accomplished with ease. However, when they were not familiar with the literature and tools in a field, there was little success in guessing at an appropriate database in which to search. The need to offer better subject assistance in searching the journal literature was evident here. One possibility for achieving this goal within a scholar’s portal would be a federated searching function, which would allow the user to search across a number of databases within an unfamiliar subject area.

University faculty and graduate student users—with their attendant levels of experience and expertise—expect to find the information they are seeking. If the desired information, or a clear path to it, does not appear near the top of a display, they are unlikely to read the full page, much less scroll down through search results to find it. In several instances, participants reached the right screen, or made a proper search, but did not read far enough down to realize they had succeeded. Their assumption was that the needed information should appear near the top, unless they had made an error in search protocol or in database selection. Rather than scroll through a list, they would simply initiate a new search in hopes of more specific results. This evidence provides a challenge to further refinements in the design of the Web site. How much information, and how many paths to further information, can appear on one page and still be effective?

The authors discovered that most participants were not confused by terms used on the library Web pages, with a few notable exceptions. The cleverly named Kudzu, a virtual catalog of southeastern academic libraries, was not easily recognized at first glance. Users did not select it from the front page, but did select it from lower-level screens where the link appears under the description Other Catalogs. A similar pattern was apparent with AskUs.Now and other services or resources that had been given names or titles outside the usual library jargon. AskUs.Now was selected less often from the home page than from a lower-level page that contained the clues needed to allow the user to identify it as a reference service. It seems that regular users have learned to identify with catch phrases; however, new users will need to be provided with additional context.

Most participants found the redesigned library page an improvement over the older design. There are more direct links on the first page than previously. A common navigation bar appears at the bottom of all screens with quick links to the main university page, branch libraries, library hours, help information, a site-search link, and a link to a site index. Two participants attempted to use the site-search tool to locate journals, books, or articles. They each tried this strategy more than once, seeming to mistake this link for a general search engine of all library resources. It could be that a more hierarchical structure to the site, with a corresponding site map, would help to minimize the confusion of different search engines. No one used the site index provided for any of the tasks, perhaps because the title “A to Z” does not accurately describe this navigational tool.

The subjects in this study were persistent as they attempted to complete each task. The somewhat artificial nature of the test may have motivated them to continue searching longer than they normally might. The less-experienced searchers had a tendency to continue further down a wrong path, doggedly looking for clues. More-experienced searchers were likely to refine their searches, backtrack when lost, or try multiple searches to complete the task.

There were a number of ways in which the participants recovered when they had strayed from the best path or any direct path. Moving backward through the search with the browser Back button was a common way to make sure they had not missed a relevant selection, or to try an alternative choice. Some experienced users became aware of their mistake when they encountered a familiar lower-level page and could then reorient themselves within the Web site. Almost all the participants benefited from the presence of multiple links to commonly used tools and resources at different levels within the site. The provision of a variety of paths allowed the participants to successfully locate information and services even when they had overlooked a direct link from the main page.

Implications

The research team intends to conduct further testing of user-centered Web design, applying lessons learned from this study. The screen- and audio-capture software provided an unobtrusive method for creating a permanent record of each session at a level of detail that a human observer could not match, and it allowed the team to observe the participants from a comfortable distance. The development of the session script was critical to ensuring consistency across all the sessions. The team tried to create a comfortable setting for each participant, and the portion of the script that stressed that the test was of the Web site, not the participant’s skill, set a nonjudgmental tone.

The study led the team to consider a number of design options to make the library’s Web site a more useful information-seeking tool. For example, would a visual path and site map help in navigation? A visual path that is always visible would help orient the user within the Web site. The site map provides another method for orienting a user within the Web site. Are there ways to provide better context for terms such as Kudzu and AskUs.Now or to better describe the content of the Subject Guide pages? One solution for dealing with nonintuitive terminology for links and services would be to provide a description of the link that appears on MouseOver. Web designers must determine how to provide just enough information on a page, given the tendency not to scroll past the first screen.

Many users had difficulty locating journal articles in an unfamiliar subject area. In the current arrangement, abstracting and indexing databases are arranged as alphabetical lists by general subject area. The participants in the study were able to locate the list of databases that they needed for the journal article task, but then had difficulty with where to go from there. A portal product that incorporated a federated search feature would go a long way towards solving this problem. For example, in the Ex Libris portal product, MetaLib, users who choose a general subject area are presented with a list of databases. Instead of having to choose from among them, they can enter a term into a single search box that will be applied to all the databases.
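The federated-search idea described above can be illustrated with a minimal sketch. The database names and search functions here are invented for the example; a real product such as MetaLib dispatches the query to vendor systems rather than local functions.

```python
# Minimal sketch of federated searching: one query term is applied to
# every database registered under a subject area, and the results are
# merged into a single list. All names here are hypothetical.

def search_inspec(term):
    return [f"INSPEC hit for {term}"]

def search_compendex(term):
    return [f"Compendex hit for {term}"]

# Each general subject area maps to the databases it covers.
SUBJECT_DATABASES = {
    "engineering": [search_inspec, search_compendex],
}

def federated_search(subject, term):
    """Apply a single search term across all databases for a subject."""
    results = []
    for search in SUBJECT_DATABASES.get(subject, []):
        results.extend(search(term))
    return results

print(federated_search("engineering", "carbon-cycle feedbacks"))
# → ['INSPEC hit for carbon-cycle feedbacks',
#    'Compendex hit for carbon-cycle feedbacks']
```

The design point is that the user never has to guess which database is appropriate: choosing the subject area is enough, and the system fans the query out to every candidate source.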

The second part of the study, consisting of structured interviews, is revealing some fascinating information about how scholars might benefit from a customizable portal. That portion of the study was more open-ended, generating rich data that require detailed analysis. The team will report additional findings in the near future.

References

   1. Philip Schatz, Sampling in Research. Accessed Jan. 14, 2003, http://schatz.sju.edu/methods/sampling/sampling.html.

   2. M. Patton, Qualitative Evaluation and Research Methods, 2d ed. (Newbury Park, Calif.: Sage, 1990).

   3. Philip Schatz, Sampling in Research.

   4. Jakob Nielsen, Usability Engineering (San Francisco, Calif.: Morgan Kaufmann Pub., 1994).

   5. Lynn Lysiak, “A Web Usability Study of Freshmen at Appalachian State University, University of Arizona, and Georgia Southern University,” presented at the University of Tennessee Libraries, Jan. 2001; MIT Libraries, Web Site Usability Test, Apr. 2, 1999. Accessed Jan. 7, 2002, http://macfadden.mit.edu:9500/webgroup/usability/results; Elaina Norlin, “Usability Tests and the User-Friendly Web Site: The University of Arizona Experience,” The University of Arizona Library, 1998. Accessed Jan. 14, 2003, http://dizzy.library.arizona.edu/library/teams/access9798/asu/index.htm.

   6. Christoph Hölscher and Gerhard Strube, “Web Search Behavior of Internet Experts and Newbies,” Ninth International World Wide Web Conference, Amsterdam, May 15–19, 2000. Accessed Jan. 14, 2003, http://www9.org/w9cdrom/81/81.html.


   Thura Mack (tmack@utk.edu) is Reference and Training Librarian, Maribeth Manoff (mmanoff@utk.edu) is Coordinator for Networked Services Integration, and Anthony D. Smith (adsmith1@utk.edu) is Digital Initiatives Coordinator at the University of Tennessee Libraries, Knoxville. Tamara Miller (tamaramiller@montana.edu) is Associate Dean of Libraries at Montana State University, Bozeman.